Replies (26)

fleur 10 months ago
☕️📰🍱🍕
Check 10 months ago
These videos are a gift as pure as bitcoin to the people of earth!
When someone figures out how to bootstrap higher-level reasoning without massive pre-training, local models are really going to take off. Reasoning + RAG is all you need.
Regardless of what I think of OpenAI, I have huge respect for Andrej; he's the GOAT and needs to be protected.
I guess we call this "deep learning" now. Why does a model need to memorize the internet if it can search the internet? We conflate massive pretraining with reasoning because somewhere along the way models randomly developed the capability.
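To make the "search instead of memorize" idea concrete, here is a minimal, hypothetical sketch of the RAG pattern the comment alludes to: retrieve the documents most relevant to a query, then prepend them to the prompt so a small local model can answer from the retrieved text rather than from facts memorized in its weights. The toy corpus, the word-overlap scorer, and the generate() stub are all illustrative assumptions, not anything from the video or a specific library.

# Minimal retrieval-augmented generation (RAG) sketch (Python).
# The corpus, the crude scorer, and generate() are hypothetical stand-ins.

CORPUS = [
    "The capital of France is Paris.",
    "Bitcoin uses a proof-of-work consensus mechanism.",
    "RAG combines a retriever with a language model at inference time.",
]

def score(query: str, doc: str) -> int:
    """Crude relevance score: number of shared lowercase words."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k corpus documents with the highest overlap score."""
    return sorted(CORPUS, key=lambda d: score(query, d), reverse=True)[:k]

def generate(prompt: str) -> str:
    """Stand-in for a local LLM call; a real system would run a model here."""
    return f"[model answer conditioned on a prompt of {len(prompt)} chars]"

def answer(query: str) -> str:
    # The model reasons over retrieved context instead of memorized facts.
    context = "\n".join(retrieve(query))
    prompt = f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    return generate(prompt)

if __name__ == "__main__":
    print(answer("What is RAG?"))

In a real setup the scorer would be an embedding or BM25 index and generate() would call a local model, but the shape of the loop (retrieve, build prompt, generate) is the whole idea.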
For sure.. Yo Jack, how do you feel about the security & privacy aspects of AI? And how do you see things working out atm, with for example France now also going mayhem on encryption law like the UK.. I feel like we're all in big danger. How can we teach people hardcore internet security & privacy morals when, after 30 years online, they can't even get the basics right and Mom's password is still 123littleSat456 😎
SecretAIry 10 months ago
"subtitles on" leads to: chpd = Chat GPT Chachi PT = Chat GPT chashi p = Chat GPT Cachii PT = Chat GPT chpt = Chat GPT Grok = Gro runny nose = running nodes ?!? within the first 15 minutes... I'm confused and amused at the moment. #GoodMorning Americano
slapsquat 10 months ago
Will make time to watch this. His older deep-dive videos made me realize LLMs are "just" very lossy compression algorithms which, by maximizing likelihood, spew out better content than what was originally ingested when inspired by the right prompt. Great for those of us who need to "get it".