That was less than a second. How is that even possible?
Let me do it again.
Replies (10)
Wow. nostr:@semisol you weren't kidding. My connection is slow as heck, but that was nearly instantaneous publishing.
WTF dark magic is this?
Or was it you, @ChipTuner ?
You did something with HTTP2, right?
But, wow. That's the fastest publishing I've ever seen on Nostr. Amazing.
Aargh, @Nusa I programmed in a "nostr:nostr:" bug, sorry.
Also sends it off to the event cannon before returning an OK.
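If I follow, that's a "queue it, then ack" pattern: the relay hands the event to the fan-out queue (the "event cannon") and sends the OK without waiting for the broadcast to finish. A minimal sketch of the idea, with all names hypothetical:

```typescript
// Hypothetical sketch of ack-before-fan-out. The client's publish call
// unblocks as soon as the OK is returned; broadcasting happens later.
type NostrEvent = { id: string; content: string };

class RelaySketch {
  // The "event cannon": events queued here are fanned out asynchronously.
  fanoutQueue: NostrEvent[] = [];

  // Queue the event first, then return the Nostr-style OK frame.
  handleEvent(ev: NostrEvent): [string, string, boolean] {
    this.fanoutQueue.push(ev); // queued, not yet broadcast
    return ["OK", ev.id, true]; // client sees this immediately
  }
}
```

That would explain the sub-second publish: the round trip only covers validation and queueing, not delivery.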
Geez.
Coming soon is a new optimized query executor, and cursors. Cursors can double throughput and reduce latency on paginated queries.
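For anyone wondering why cursors help: instead of re-scanning from the top with an offset on every page, the query resumes from the last key it returned. A toy sketch (not the actual executor, and it assumes unique timestamps):

```typescript
// Toy cursor-based pagination over events sorted newest-first.
// The cursor is just the created_at of the last item returned.
type NoteEvent = { created_at: number; id: string };

function page(
  events: NoteEvent[],
  limit: number,
  cursor?: number,
): { items: NoteEvent[]; next?: number } {
  const sorted = [...events].sort((a, b) => b.created_at - a.created_at);
  // Resume strictly below the cursor rather than skipping N rows.
  const rest = cursor === undefined ? sorted : sorted.filter(e => e.created_at < cursor);
  const items = rest.slice(0, limit);
  const next = items.length === limit ? items[items.length - 1].created_at : undefined;
  return { items, next };
}
```

A real executor would seek into an index instead of filtering an array, but the shape of the win is the same: each page is O(limit), not O(offset + limit).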
@MichaelJ and @ChipTuner have been tweaking things, moving some logic server-side, and rawdogging the websockets, so that's probably why there's so little friction. I couldn't tell that it had gotten faster anywhere else, since the clients are so slow, but that one spot is like a peek into the near future. Seamless.
@MichaelJ wanted to talk to you about all that. I don't really understand it, but he's revamping all of the connections and paging and... Le Search Bar.
*wiggles eyebrows*
Turns out if you trim down to just one or two WebSocket connections, rather than the half-dozen a lot of apps open under typical user configs, the relays are natively really fast.
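The trick behind using one connection for many queries is that Nostr subscriptions carry their own IDs, so a client can multiplex every REQ over a single socket and route incoming EVENT frames back by ID. A rough sketch, with the transport injected so it stays illustrative:

```typescript
// Sketch: many subscriptions over one WebSocket-like transport, instead
// of one socket per query. `send` is injected; a real client would pass
// something like ws.send.bind(ws).
class MultiplexedRelay {
  private subs = new Map<string, (ev: unknown) => void>();
  constructor(private send: (frame: string) => void) {}

  subscribe(id: string, filter: object, onEvent: (ev: unknown) => void) {
    this.subs.set(id, onEvent);
    this.send(JSON.stringify(["REQ", id, filter])); // Nostr REQ frame
  }

  // Route an incoming ["EVENT", subId, event] frame to its subscriber.
  receive(frame: string) {
    const [kind, id, ev] = JSON.parse(frame);
    if (kind === "EVENT") this.subs.get(id)?.(ev);
  }
}
```

One socket means one TLS handshake and one congestion window shared by every query, which is where most of the per-connection overhead disappears.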
Yeah, that was the setup, but it only helps initial loads without a cache. Deno supports server push, so nginx is supposed to accelerate those, but I haven't confirmed it's working.