- cross-posted to:
- [email protected]
- [email protected]
I am zero surprised.
No way… you’re telling me a free AI is profiting off my data?
Always run AI locally!
Is that feasible for someone with an office PC with integrated graphics? Asking for a friend.
If you have a lot of RAM, you can run small models slowly on the CPU. Your integrated graphics probably won't fit anything useful in its VRAM, so if you really want to run something locally, getting some extra sticks of RAM is probably your cheapest option.
I have 64 GB and I run 8–14B models. 32B is pushing it (it's just really slow)
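Rough math for why 32B is pushing it on 64 GB. This is a back-of-envelope sketch, assuming ~4-bit quantization and a guessed ~20% overhead for KV cache and runtime buffers; real usage varies with context length and runtime:

```python
# Rough RAM estimate for running a quantized LLM on CPU.
# Assumptions (not exact figures): bits_per_weight/8 bytes per parameter,
# plus ~20% overhead for KV cache and runtime buffers.

def model_ram_gb(params_billion, bits_per_weight=4, overhead=1.2):
    bytes_needed = params_billion * 1e9 * bits_per_weight / 8 * overhead
    return bytes_needed / 1e9

for size in (8, 14, 32):
    print(f"{size}B @ {4}-bit: ~{model_ram_gb(size):.1f} GB")
```

By this estimate an 8B model needs around 5 GB and a 32B model around 19 GB, which fits in 64 GB of system RAM, but CPU memory bandwidth is what makes the bigger models painfully slow.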
Don’t iGPUs use the RAM as VRAM directly? You’d only need to configure how much in the BIOS (e.g. by default it uses 1.5 GB of 8 GB or something, and you can set it to 6/8 GB)