When someone figures out how to bootstrap higher-level reasoning without massive pre-training, local models are really going to take off. Reasoning + RAG is all you need
Replies (2)
I guess we call this "deep learning" now. Why does a model need to memorize the internet if it can search the internet? We conflate massive pretraining with reasoning because, somewhere along the way, models randomly developed the capability.
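For what it's worth, here's a minimal sketch of what a "reasoning + RAG" loop could look like on a local model: the model plans a search query, pulls in outside context, and only then reasons over it. The web_search and local_generate functions are placeholders (assumptions for illustration), not any particular library's API.

def web_search(query: str, k: int = 3) -> list[str]:
    """Placeholder retriever -- swap in a real search index or API."""
    return [f"[stub result {i} for: {query}]" for i in range(k)]

def local_generate(prompt: str) -> str:
    """Placeholder for a small local model's completion call."""
    return f"[stub completion for a prompt of {len(prompt)} chars]"

def answer(question: str, max_steps: int = 3) -> str:
    context: list[str] = []
    for _ in range(max_steps):
        # Let the model decide what it still needs to look up,
        # instead of relying on whatever it memorized in pretraining.
        query = local_generate(
            f"Question: {question}\nKnown context: {context}\n"
            "What should we search for next? Reply with a search query."
        )
        context.extend(web_search(query))
    # Final reasoning pass over everything retrieved.
    return local_generate(
        f"Question: {question}\nContext: {context}\nAnswer step by step."
    )

if __name__ == "__main__":
    print(answer("Why memorize the internet if you can search it?"))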