When someone figures out how to bootstrap higher-level reasoning without massive pre-training, local models are really going to take off. Reasoning + RAG is all you need.

Replies (2)

I guess we call this "deep learning" now. Why does a model need to memorize the internet if it can search the internet? We conflate massive pretraining with reasoning because somewhere along the way models happened to develop the capability.