This is very cool, but I think it makes much more sense to put this on mobile, which you can mostly do today (Medical Wikipedia app). And you can run local inference. The missing piece is local RAG. Of course, you can also install this whole stack locally, since Android supports running local Linux.
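For the local-RAG piece, here's a minimal sketch of what on-device retrieval could look like. The embedding model, corpus, and overall shape are my assumptions for illustration, not anything project-nomad actually ships:

```python
# Minimal local RAG retrieval sketch: embed a small offline corpus once,
# then find relevant passages by cosine similarity. Model choice and
# corpus contents are illustrative assumptions.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # small enough for mobile-class hardware

corpus = [
    "Treat minor burns by cooling under running water for 20 minutes.",
    "Purify questionable water by boiling it for at least one minute.",
    "A tourniquet should be placed 5-8 cm above the wound, never on a joint.",
]
doc_vecs = model.encode(corpus, normalize_embeddings=True)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k corpus passages most similar to the query."""
    q = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q  # cosine similarity, since vectors are normalized
    return [corpus[i] for i in np.argsort(scores)[::-1][:k]]

print(retrieve("how do I make water safe to drink?"))
```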
ODELL
a self-contained, offline survival computer packed with critical tools, knowledge, and ai https://github.com/Crosstalk-Solutions/project-nomad

Replies (3)

Love seeing the progression of stuff like this, but I think hard copies of knowledge are better. If I'm in a scenario where I need local AI for an extended period of time, then I probably have more pressing concerns than talking to an LLM
That's why it uses RAG. It gives you an answer based on your local copy of the data and references it, with a one-click reveal. Instead of full-text search, you ask it a question and can then immediately jump to the source material. It makes search much faster.
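To make the one-click reveal concrete, here's a rough sketch of the pattern: each retrieved chunk keeps a pointer back into its source document, so the answer can link straight to the passage. `local_llm` is a placeholder for whatever on-device model you run, and the data shapes are my assumptions:

```python
# Sketch of RAG-with-citation: each chunk carries its source document
# and character offset, so the UI can jump to the exact passage.
from dataclasses import dataclass

@dataclass
class Chunk:
    doc: str      # source document path/title
    offset: int   # character offset, for "jump to source"
    text: str

def local_llm(prompt: str) -> str:
    # Placeholder: swap in llama.cpp, MLC, or any other local runtime.
    return "Boil water for at least one minute before drinking."

def answer_with_source(question: str, chunks: list[Chunk]) -> tuple[str, Chunk]:
    # Naive keyword-overlap retrieval; a real setup would use embeddings.
    def score(c: Chunk) -> int:
        return len(set(question.lower().split()) & set(c.text.lower().split()))
    best = max(chunks, key=score)
    prompt = f"Answer using only this excerpt:\n{best.text}\n\nQuestion: {question}"
    return local_llm(prompt), best  # answer plus the chunk to reveal on click

chunks = [Chunk("wikipedia/water_purification", 1042,
                "Purify questionable water by boiling it for at least one minute.")]
answer, source = answer_with_source("how do I make water safe to drink?", chunks)
print(answer)
print(f"source: {source.doc} @ char {source.offset}")
```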
Yeah, I get the use case. I just think if you're in a situation where you have to rely only on a local source, the cost and risk of an electronic system are probably too high. Investing in physical knowledge in a format/binding that will last years is likely a better investment. I love the setup though, particularly for sovereignty