testing on a filled-up iPhone 12 mini. PWA resources are tiny. Proxying relay connections makes a huge difference on the go. Makes it almost as efficient as a native client, almost.
Makes me think the only way to build a proper nostr client as a web app is to offload some of the work to your own dedicated server. Self-host your nostr client, you should have a relay anyway!
Nuts
_@nuts.cash
npub1wunv...qhq8
Easy nuts
https://github.com/candypoets/nuts
the question I get every time I introduce nostr to someone:
what blockchain is that?
the proxy config was brittle, too many ws messages. I moved to a VPS with lower latency, much more stable now.
ah, and also your client will consume A LOT less bandwidth: the proxy deduplicates events on the server.
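The bandwidth saving comes from fanning in streams from several relays server-side and forwarding each event to the client only once. A minimal sketch of that dedup step, keyed by event id (type and class names here are illustrative assumptions, not the actual proxy code):

```typescript
// Toy sketch: the proxy sees the same event from multiple relays,
// but forwards it to the client only the first time.
type NostrEvent = { id: string; kind: number; content: string };

class EventDeduper {
  private seen = new Set<string>();

  // Returns true the first time an event id is observed.
  shouldForward(ev: NostrEvent): boolean {
    if (this.seen.has(ev.id)) return false;
    this.seen.add(ev.id);
    return true;
  }
}

const dedupe = new EventDeduper();
const fromRelayA: NostrEvent = { id: "abc", kind: 1, content: "gm" };
const fromRelayB: NostrEvent = { id: "abc", kind: 1, content: "gm" };
console.log(dedupe.shouldForward(fromRelayA)); // true: first sighting, forward
console.log(dedupe.shouldForward(fromRelayB)); // false: duplicate, drop
```

With N relays subscribed, the client's downstream traffic drops from roughly N copies of each event to one.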
by far the most useful case for AI is devops
made a typo.
Bitcoin is my religion, GM*
so many amazing nostr clients out there.
proxy mode has several advantages over connecting to all relays client side:
- can connect over Tor.
- faster client code.
- more reliable.
- the client consumes FlatBuffers rather than JSON, speeding up message consumption and using less RAM.
you can test proxy mode at https://alpha.nuts.cash. proxy mode will never be deployed to the production version, but it will be the default for self-hosted configurations.
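The FlatBuffers advantage is that fields sit at known byte offsets, so the client can read a value without parsing the whole message the way `JSON.parse` must. A hedged illustration with a toy fixed layout (this is not the real FlatBuffers format or the proxy's actual schema, just the access pattern):

```typescript
// Toy binary layout standing in for a FlatBuffers-style message:
// bytes 0-3 hold the event kind, bytes 4-7 the unix timestamp.
function encodeEvent(kind: number, createdAt: number): ArrayBuffer {
  const buf = new ArrayBuffer(8);
  const view = new DataView(buf);
  view.setUint32(0, kind, true);      // little-endian event kind
  view.setUint32(4, createdAt, true); // little-endian created_at
  return buf;
}

function readKind(buf: ArrayBuffer): number {
  // Jump straight to the offset: no allocation-heavy JSON.parse,
  // no intermediate object graph held in RAM.
  return new DataView(buf).getUint32(0, true);
}
```

Real FlatBuffers adds schemas, vtables, and variable-length fields, but the win is the same: reads are offset lookups, not full-document parses.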
what's the opposite of slop?
#asknostr
be thankful for those that have the courage to disobey
Not available in any stores near you.
nuts.cash
gpt-5.4 mini seems to be this model.
NUTZAPS ARE DED
Nuts provides newcomers with a new wallet too: a nostr wallet that follows you across all nostr apps, powered by cashu, private and no KYC
I want my model dumb and disciplined, and fast.
feels like we might have reached a plateau in AI, or at least in AI-assisted coding, ie vibe coding. Not because models can't improve, they can and they will.
but I consider that a model that can produce functioning, optimised code and does not overextend my prompt will be enough for me going forward, just make it faster.
smarter models essentially extrapolate from bad instructions, giving you AI slop. The models don't need to improve; your prompt clarity and ideas need to be implemented faster.
I'm only waking up to the importance of bot social networks. humans can only process and analyze so much information. a bot could scan an entire day of activity on X, classify it, analyze it, reason about it and extract the best bits.
A bot is cold and thinks through pure logic. It has no friends, no favourite influencers, and treats every bit of information equally. These kinds of social networks make a lot of sense, and that's terrifying for us humans
Search by subject:
ctrl + k => Tag search => Tag feed.
Easy, fast, keyboard only.
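The flow above can be sketched in two small pieces: a shortcut check for the palette, and the nostr filter a tag feed boils down to. The helper names and the `limit` value are assumptions for illustration; the `"#t"` filter key on kind-1 notes is standard NIP-01 behaviour:

```typescript
// Minimal shape of a key event, so this runs outside the browser too.
interface KeyInput { ctrlKey: boolean; metaKey: boolean; key: string }

// Ctrl+K (or Cmd+K on macOS) opens the tag search palette.
function isPaletteShortcut(e: KeyInput): boolean {
  return (e.ctrlKey || e.metaKey) && e.key.toLowerCase() === "k";
}

// A tag feed is just a REQ filter on the "#t" tag of text notes.
function tagFeedFilter(tag: string) {
  return { kinds: [1], "#t": [tag.toLowerCase()], limit: 50 };
}
```

The palette result feeds straight into `tagFeedFilter`, so the whole subject search stays keyboard-only.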
🥜 🤝 🦞
You built another nostr client
trying out codex with this shiny new 5.4 model. this thing thinks too much, I'm tired.