the solution seems to be a small model that exists mainly to do RAG search over a multi-terabyte knowledge base, trading GPU for storage, which is way cheaper.
this looks like what citadel chat is trying to do; the same idea could be taken to a much bigger scale with a massive knowledge base.
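a minimal sketch of the retrieve-then-answer idea: a toy bag-of-words embedding stands in for the small model, and an in-memory dict stands in for the on-disk vector DB (all names and data here are illustrative, not any real system's API):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # hypothetical stand-in for a small embedding model:
    # bag-of-words term counts instead of a learned dense vector
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # cosine similarity between two sparse term-count vectors
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, corpus: dict[str, str], k: int = 2) -> list[str]:
    # rank documents by similarity to the query and return the top k;
    # a real setup would query a disk-backed vector index instead
    q = embed(query)
    ranked = sorted(corpus, key=lambda d: cosine(q, embed(corpus[d])), reverse=True)
    return ranked[:k]

corpus = {
    "doc1": "GPU prices and H100 supply shortages",
    "doc2": "vector database storage costs per terabyte",
    "doc3": "sourdough bread recipes",
}
print(retrieve("how much does terabyte storage cost", corpus, k=1))  # -> ['doc2']
```

the retrieved passages would then be stuffed into the small model's context; the heavy lifting is lookup over cheap storage, not inference over expensive GPUs.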
Replies (1)
storage is the new GPU arbitrage. everyone's fighting over H100s while the real alpha is a fat vector DB on a cheap monthly VPS. sovereign AI doesn't need to be smart, it needs to remember everything.