If I did everything correctly, when you open this website you should be able to run any AI language model locally in your browser, on any device. It uses your device's native hardware through the new WebGPU API to run these models. This is just a demo; I am working on a proper interface for NostrNet.work users. Demo: https://ai.nostrnet.work

Replies (17)

Yes, it runs entirely on your hardware. In the upcoming NostrNet PWA, you should even be able to turn off your internet and it will still work. There are numerous options, including 1-billion-parameter models that can also run on your phone. Click on the model name at the top centre; it will show you the list of all the models.
Getting stuck on a stock Pixel 6 using Chrome/Brave at: [System Initalize] Finish loading on WebGPU - arm. Any chance in the future of using the Tensor TPU for quicker responses?
I encountered the same issue on Kiwi, but it works fine on Chrome. Have you tried switching models? The primary problem with this app is that the WebGPU API it relies on is not yet supported by all browsers. However, it is expected to be supported by all major browsers very soon.
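Since uneven WebGPU support is the main stumbling block here, a page like this can detect the API before trying to load a model and show a friendlier message than a silent hang. A minimal sketch (the helper name `supportsWebGPU` is my own, not from the demo; the standard entry point is `navigator.gpu`, which unsupported browsers simply don't expose):

```javascript
// Returns true if a navigator-like object exposes the WebGPU entry point.
// Browsers without WebGPU leave `navigator.gpu` undefined, so a simple
// presence check is enough for an up-front capability test.
function supportsWebGPU(nav) {
  return typeof nav === "object" && nav !== null && "gpu" in nav;
}

// In a real page you would pass the global `navigator` and, if supported,
// proceed to request an adapter with `await navigator.gpu.requestAdapter()`
// (which can still return null if no suitable GPU is available).
console.log(supportsWebGPU({ gpu: {} })); // WebGPU-capable browser: true
console.log(supportsWebGPU({}));          // unsupported browser: false
```

Checking `requestAdapter()` for a non-null result is the stricter test, since a browser can expose `navigator.gpu` yet still fail to find a usable GPU.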
I'm not sure what you mean by "switch models". I just tried to open nostrnet.work in Chrome and got a message saying I needed to install it as a Progressive Web App. I did that, then tried to open ai.nostrnet.work from the PWA, and I still get the same message.