
Home calculators are as cheap as they've ever been, but this era of computing is out of reach for the majority of people.

The analogous PC for this era requires a large amount of high-speed memory and specialized inference hardware.





What regular home workload are you thinking of that the computer I described is incapable of?

You can call a computer a calculator, but that doesn’t make it a calculator.

Can they run SOTA LLMs? No. Can they run smaller, yet still capable LLMs? Yes.

However, I don’t think that the ability to run SOTA LLMs is a reasonable expectation for “a computer in every home” just a few years into that software category even existing.
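For concreteness, here is a minimal sketch of what running one of those smaller models locally can look like, assuming Ollama and its Python client are installed and a small model such as llama3.2 has already been pulled; the model name and prompt are only illustrative:

    # Minimal local-inference sketch using the Ollama Python client
    # (assumes `ollama pull llama3.2` has been run beforehand).
    import ollama

    response = ollama.chat(
        model="llama3.2",  # small (~3B-parameter) model, a few GB of RAM when quantized
        messages=[{"role": "user", "content": "Explain why local LLMs can be useful."}],
    )
    print(response["message"]["content"])

A model in that size class runs acceptably on ordinary consumer hardware, which is the point: "capable" does not have to mean SOTA.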


It's kind of funny to see "a computer in every home" invoked when we're talking about the equivalent of ~$100 buying a non-trivial percentage of all computational power in existence at the time of the quote. By the standards of that time, we don't just have a computer in every home, we have a supercomputer in every pocket.

You can have access to a supercomputer for pennies, internet access for very little money, and even an M4 Mac mini for $500. You can get a Raspberry Pi for even less, and a monitor for a couple hundred dollars.

I feel like you’re moving the goalposts to make your point that access to AI has to mean local compute. Why does it need to be local?

Update: I take it back. You can get access to AI for free.


No, it doesn't. The majority of people aren't trying to run Ollama on their personal computers.


