Thread


Relays: 5
Replies: 1
The minimal requirement is 32 GB of fast memory. I have 16 GB with AI accelerators onboard, and its ceiling is about a 20B model, which is barely enough to write decent non-hallucinatory text.

I'm exploring a new training technique called HRM, which involves "augmenting" the data by shuffling and rearranging it to expand the data set, then verifying that each modified version still passes all the tests on the code (rough sketch below). The technique seems to dramatically improve reasoning on combinatorial problems, and source code is far more combinatorially complex than any common kind of game system.

There is also a crop of mini-PCs and laptops with fast unified memory (LPDDR5X-8000 and such) that can be partitioned between CPU and GPU. These probably aren't great for standard training runs, but specialised training like the HRM techniques could maybe yield mini models around 7B that know C better than Linus Torvalds and Ken Thompson.

Also, aside from Nvidia's chips, AMD's offerings are quite competitive, and Google is about to start releasing hardware built on its TPUs, whose systolic arrays sidestep much of the von Neumann bottleneck of shuttling data between processor and memory, traversing the models' computation graphs maybe as much as 2x more efficiently (in speed or power).

I'd say within two years you will be able to host your own coding agent. Advances in agent design will likely progress a lot too; we even have an agent dev here on Nostr, the Gleasonator, with his Shakespeare model and agent UI.
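For concreteness, here is a minimal, hypothetical sketch of that augmentation loop in Python: apply a semantics-preserving transform to a piece of source code, then keep the variant only if it still behaves exactly like the original. The names (RenameLocals, augment, equivalent) and the gcd example are mine for illustration, not from any HRM codebase, and the differential test stands in for "passing all tests"; ast.unparse needs Python 3.9+.

```python
# Hypothetical sketch: test-verified data augmentation for source code.
import ast
import random

SRC = """
def gcd(a, b):
    while b:
        a, b = b, a % b
    return a
"""

class RenameLocals(ast.NodeTransformer):
    def __init__(self, rng):
        self.rng = rng
        self.mapping = {}

    def visit_FunctionDef(self, node):
        # Give every parameter a fresh random name. Deliberately simplistic:
        # a real pipeline would scope renames properly instead of rewriting
        # any matching Name in the module.
        for arg in node.args.args:
            self.mapping[arg.arg] = f"v{self.rng.randrange(1_000_000)}"
            arg.arg = self.mapping[arg.arg]
        self.generic_visit(node)
        return node

    def visit_Name(self, node):
        node.id = self.mapping.get(node.id, node.id)
        return node

def augment(src, seed):
    # One semantics-preserving rewrite; real augmentation would also
    # shuffle statement order where safe, rename functions, etc.
    tree = RenameLocals(random.Random(seed)).visit(ast.parse(src))
    return ast.unparse(tree)

def equivalent(src_a, src_b, fn="gcd", trials=200):
    # Stand-in for "passes all tests": differential-test the variant
    # against the original on random inputs.
    ns_a, ns_b = {}, {}
    exec(src_a, ns_a)
    exec(src_b, ns_b)
    rng = random.Random(0)
    return all(
        ns_a[fn](x, y) == ns_b[fn](x, y)
        for x, y in ((rng.randrange(1, 999), rng.randrange(1, 999))
                     for _ in range(trials))
    )

variant = augment(SRC, seed=42)
assert equivalent(SRC, variant)  # only verified variants join the data set
print(variant)
```

Each accepted variant is a new training example that exercises the same logic under a different surface form, which is presumably where the gains on combinatorial reasoning come from.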
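And on the systolic-array point, a quick back-of-envelope calculation (my numbers, not from anywhere authoritative) shows why operand reuse inside the array matters so much:

```python
# Arithmetic intensity of a square matmul C = A @ B.
# Assumptions (mine): N = 4096, fp16 operands (2 bytes each).
N = 4096
flops = 2 * N**3                  # one multiply + one add per MAC
# No reuse: fetch a, fetch b, read+write c from memory on every MAC.
naive_bytes = (2 + 2 + 4) * N**3
# Perfect reuse (the systolic ideal): each matrix crosses the bus once.
ideal_bytes = 3 * 2 * N**2
print(f"{flops / naive_bytes:.2f} FLOP/byte")  # 0.25 -> memory-bound
print(f"{flops / ideal_bytes:.0f} FLOP/byte")  # ~1365 -> compute-bound
```

Real chips land somewhere between the two extremes, which is roughly where claims like "2x more efficient" come from.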
2025-11-25 18:44:39 from 1 relay(s)

Replies (1)

I have 128 GB of VRAM (Framework Desktop). Watching it process regular prompts in btop suggests I have more than enough resources for a single user. Most of my issues arise when I use coding tools like codename-goose to actually make the model read and write files: it seems to lose its train of thought whenever it tries to use a tool, and starts over.
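For what it's worth, the usual shape of that tool loop is sketched below. Everything in it is a toy stand-in (chat, run_tool, the message dicts), not Goose's actual API; the point is that the whole transcript, tool results included, gets appended to and resent on every turn. A client that rebuilds the prompt from scratch after each tool call will make the model look exactly like it "starts over".

```python
# Toy tool-calling agent loop; hypothetical stand-ins, not Goose's API.

def chat(messages):
    # Toy "model": ask for a file read first, then answer using the result.
    if not any(m["role"] == "tool" for m in messages):
        return {"tool": "read_file", "args": {"path": "notes.txt"}}
    last_tool = [m for m in messages if m["role"] == "tool"][-1]
    return {"text": f"summary of: {last_tool['content']}"}

def run_tool(name, args):
    # Toy dispatcher; a real agent would actually touch the filesystem.
    return f"<contents of {args['path']}>"

def agent_loop(task, max_turns=10):
    messages = [{"role": "user", "content": task}]
    for _ in range(max_turns):
        reply = chat(messages)        # the model always sees the full history
        if "tool" in reply:
            messages.append({"role": "assistant", "content": repr(reply)})
            messages.append({"role": "tool",
                             "content": run_tool(reply["tool"], reply["args"])})
            continue                  # keep appending; never rebuild the prompt
        return reply["text"]
    return "gave up"

print(agent_loop("summarise notes.txt"))
# -> summary of: <contents of notes.txt>
```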
2025-11-25 19:06:02 from 1 relay(s)