
I assume this is a generic LLM + RAG setup, so you can't prevent hallucinations, and there's a chance (which grows as the subject gets more complex) that the study materials will be wrong.

And anyway, creating flash cards is a huge study booster by itself. It doesn't make sense to automate that step away.



It makes a ton of sense. I've been doing my own custom version of this for the past year and it's a HUGE shift in studying efficiency.

- LLMs are getting better. That's a reality LLM haters need to contend with. The versions that didn't solve your one issue perfectly two years ago simply aren't in the rotation any longer; a major release with significant improvements is a nearly monthly occurrence.

- I don't know how to convince anyone of this, but the hallucinations have literally zero impact. Not only are they rare and getting rarer, but because you can always spot a hallucination, you learn to ignore them.

- As someone who's a huge fan of flashcards, you're wrong that this is worse than making your own cards. Instead of spending hours and hours making cards, those hours are spent actively engaging with and absorbing more material.

I can't speak to OP's product, but with my system it's the closest thing I'm aware of to downloading information directly into the brain.
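For anyone curious what the mechanical half of a setup like this looks like: here is a minimal sketch that turns an LLM's Q:/A:-formatted output into a TSV that Anki's File > Import dialog accepts (tab-separated front/back fields). The prompt format and function names are my own assumptions for illustration, not OP's or the parent's actual pipeline.

```python
def parse_cards(llm_response: str) -> list[tuple[str, str]]:
    """Extract (question, answer) pairs from 'Q: ...' / 'A: ...' lines."""
    cards = []
    question = None
    for line in llm_response.splitlines():
        line = line.strip()
        if line.startswith("Q:"):
            question = line[2:].strip()
        elif line.startswith("A:") and question is not None:
            cards.append((question, line[2:].strip()))
            question = None
    return cards

def to_anki_tsv(cards: list[tuple[str, str]]) -> str:
    """One card per line, front and back separated by a tab."""
    return "\n".join(f"{front}\t{back}" for front, back in cards)

# Stand-in for a real model response (you'd get this from an API call):
response = """\
Q: What does RAG stand for?
A: Retrieval-Augmented Generation
Q: What is the spacing effect?
A: Reviews spread over time are retained better than massed reviews."""

print(to_anki_tsv(parse_cards(response)))
```

The point of keeping the parsing this dumb is that you review every card as you skim the TSV, which is also where you'd catch any hallucinated answers before they enter your deck.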


How would you know if you couldn't spot a hallucination?


Can you share more details about your setup? I'm really interested in this kind of idea for some topics I'm studying right now.


Some Anki flashcard sets sell for thousands of dollars, so clearly there is demand.


Whoa, really? I'm a big Anki user (I always make my own decks), and I didn't even know there was a marketplace.


These are probably flashcards that older students sell covering what was on the teacher's test. High-stakes cheating.



