Replies (48)

Something like Squoosh for images or ezgif for gifs? I use Squoosh to optimize website speed for my clients all the time. Drops the file size by like 97%, even with lossless.
deterministic scaling functions that use fixed-point numbers, and multi-party verification of the relation between the original and the scaled versions, mainly. i could easily write a sinc function using fixed-point integers, like, as a pair of 64-bit limbs, one being the integer part and the other the fractional part.
also, it would probably require a lossless compression format for the small versions, like png, since the bitmap decoded from a jpeg is also fixed. that's what keeps the math deterministic and lets the hashes be related.
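to make that concrete, here's a minimal python sketch. caveats: it packs Q32.32 into one python int rather than two 64-bit limbs, and it substitutes a box filter for the sinc kernel (a proper sinc needs a fixed-point sin approximation, which would triple the example). none of this is a spec, just the shape of the idea:

```python
# sketch: Q32.32 fixed-point arithmetic plus a deterministic box-filter
# downscale of an 8-bit grayscale bitmap. integer-only math means every
# party gets byte-identical output, hence identical hashes.
import hashlib

FRAC_BITS = 32
ONE = 1 << FRAC_BITS              # 1.0 in Q32.32

def fx_mul(a: int, b: int) -> int:
    """Multiply two Q32.32 values; truncation is deterministic everywhere."""
    return (a * b) >> FRAC_BITS

def downscale_gray(pixels: list[list[int]], factor: int) -> list[list[int]]:
    h, w = len(pixels), len(pixels[0])
    weight = ONE // (factor * factor)     # equal box-filter weight in Q32.32
    out = []
    for y in range(0, h - h % factor, factor):
        row = []
        for x in range(0, w - w % factor, factor):
            acc = 0
            for dy in range(factor):
                for dx in range(factor):
                    acc += fx_mul(pixels[y + dy][x + dx] * ONE, weight)
            row.append(acc >> FRAC_BITS)  # back to an 8-bit integer
        out.append(row)
    return out

def bitmap_hash(pixels: list[list[int]]) -> str:
    return hashlib.sha256(bytes(v for row in pixels for v in row)).hexdigest()
```

the point is only that there are no floats anywhere, so the scaled bitmap and its hash come out the same on every machine.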
1. clients tell the user their image is stupid big, and give them some direction on how to fix it
2. stop loading images larger than n
Clients implementing a limit when changing the image in kind 0 metadata. You really can't solve it if someone puts a huge file there, though, besides just swapping in a robohash if the size is greater than x.
There are just two ways. Have a proxy in front of your client, but that's for self-hosting/advanced users. Or normalize Blossom servers and have them stream chunks of content rather than the full video. Blossom servers should also be able to monetise any chunk served, with Cashu channels, or whatever they are called.
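Something like this, roughly, with standard HTTP Range headers doing the work. The Blossom URL and the Cashu remark are hypothetical:

```python
# sketch: fetch a blob in chunks via HTTP Range requests. a paying client
# could in theory attach a cashu token per chunk in a header, but that part
# is pure speculation here.
import urllib.request

BLOB_URL = "https://blossom.example.com/b1674abcd"  # hypothetical server/hash
CHUNK = 1 << 20                                     # 1 MiB per request

def fetch_chunk(url: str, start: int, size: int) -> bytes:
    req = urllib.request.Request(url)
    req.add_header("Range", f"bytes={start}-{start + size - 1}")
    # note: a server that ignores Range just returns the whole blob (200)
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# stream the first few chunks; stop whenever the user scrolls away
offset = 0
while offset < 5 * CHUNK:
    data = fetch_chunk(BLOB_URL, offset, CHUNK)
    if not data:
        break
    offset += len(data)
```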
Forced compression? If they don't want it, they can pay for a service. As for loading by other viewers, could you show a data-usage notice and a manual load button for those with data concerns?
Of course, it just feels very over-engineered and centralizing for something that can simply be refused using an HTTP HEAD check. How would you distribute the resized image? Where does the resizing take place? How do clients know about this?
Kind 1984 reports against the kind 0 event id (not the naddr), generated and consumed automatically by clients?
all it is, is this: a deterministic, repeatable, content-verifiable association between an original and its scaled-down version, which can be logged as a binding between one hash (the original) and another hash (the scaled copy). seriously, how under-utilized is your brain? we aren't talking about distributing GGUF files to share an LLM around lol
seriously though. did i say anything about the network protocol part of this? nope. i was just saying you can make a scale operation that can be repeated by any other holder of the original, attested, and the small version distributed with relay hints (blossom hints?) to where you can get the original... IF YOU MUST. the only "complicated" part is that the algorithm should be deterministic, so that anyone with the original can create the same scaled-down version, and so that anyone who knows the origin of the original (e.g. the artist/photographer) can see that bob, charlie and dave all saw this same image with this same hash, performed this cheap scaling operation once, and signed a document saying so. it doesn't take many of these to be strongly confirmed. i'm not sure what you were expecting; that's really simple, and only marginally difficult to do right now if you just use a few image libraries.
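to pin it down, a toy sketch of the binding document. the kind number and tag names are invented for illustration, and signing is left to whatever nostr library you already use:

```python
# toy sketch: bind sha256(original) to sha256(scaled) plus the parameters of
# the deterministic scale. kind 30999 and the tag names are invented; any
# holder of the original can recompute both hashes and compare.
import hashlib
import time

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def scale_attestation(original: bytes, scaled: bytes, params: str) -> dict:
    return {
        "kind": 30999,                      # hypothetical event kind
        "created_at": int(time.time()),
        "tags": [
            ["x", sha256_hex(original)],    # hash of the original
            ["scaled", sha256_hex(scaled)], # hash of the small version
            ["algo", params],               # e.g. "box/4x" -- must be exact
        ],
        "content": "",
    }

# bob, charlie and dave each run the same deterministic scale, produce the
# same scaled bytes, and therefore publish attestations with identical hash
# pairs -- agreement across independent signers is the confirmation.
```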
hodlbod could explain Postel's Law for this case. it's really nbd tho, seriously: deterministic rescaling plus distributed attestations would be enough to let just a few parties cache the original while everyone else stores and shares the scaled-down version that refers back to it.
Additionally, who determines what size is OK and what is not? If you have plenty of bandwidth, 50MB might even be acceptable, whereas in some cases 50KB might already be pushing the limits, so any solution can only be on the receiving end, I think. If you do not want to waste large amounts of bandwidth, the receiver needs controls for that. Either block it or do something like @ᴛʜᴇ ᴅᴇᴀᴛʜ ᴏꜰ ᴍʟᴇᴋᴜ is suggesting. I would settle for simply not showing it and using a placeholder or something.
Test the bandwidth and send the smaller jpeg or png version.
🐈 1 month ago
This doesn’t work, because by the time you check the size you’ve already downloaded it 🤣 Some services will pass a bit of info, but that’s also not reliable.
Don Rickles 1 month ago
Limit file size and hurt some feelings. If your gif is bigger than your attention span, rejected.
it's possible to specify in the kind 0 spec that the picture value cannot point to an image larger than a specific size. it's also possible to change the spec to specify two images: small and full size.
although in reality you load a profile picture once; you can save whatever quality variant of it in local storage. if someone wants to waste your bandwidth, they can make a single post with 100x 500 MB gifs anyway. you could add soft recommendations for picture sizes to the spec for all images, as well as a bandwidth-saving setting which prevents automatic loading of images past a specified size. it could also still be beneficial to have two image qualities for all image posts: often it's not necessary to load an 8K image in the feed, but a photographer may still want to publish the original high-quality image.
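as a sketch, something like this. "picture" is the standard kind 0 field; "picture_small" is invented here purely to show the shape:

```python
# sketch of a kind 0 content payload with two picture qualities.
# "picture" is standard metadata; "picture_small" is a made-up field.
import json

metadata = {
    "name": "alice",
    "picture": "https://cdn.example.com/alice-full.jpg",       # original
    "picture_small": "https://cdn.example.com/alice-128.jpg",  # thumbnail
}
event_content = json.dumps(metadata)
```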
🐈 1 month ago
It's not feasible to do any of this in a single client, because another client can just ruin your day. You can implement compression and resizing just fine, but people can simply share a link and it renders, or upload through another client. You also can't really block large images, because by the time you see one it has already loaded. In some cases HTTP headers will solve this for you, but in many instances you won't have that info at all. You are right that you can cache things, and then it's not an issue, but this is a problem particularly on mobile connections, and especially on slow ones. For slow connections we can disable automatic loading of all media (as some clients have done).
that would require changing the specs of kinds 0, 1, 20, etc.: "images should only be loaded from HTTP servers that return a Content-Length header, or that specify the size in imeta metadata"
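for what it's worth, if i read NIP-92/NIP-94 right, imeta tags can already carry a size field, so a client could skip oversized media without any request at all. a sketch with made-up values (and remember the declared size is unverified; a hostile uploader can lie, so it's a hint, not enforcement):

```python
# sketch: read the size field out of a NIP-92-style imeta tag and skip the
# download when it exceeds the user's limit. all values are made up.
MAX_BYTES = 5 * 1024 * 1024   # user setting

imeta = ["imeta",
         "url https://cdn.example.com/cat.gif",
         "m image/gif",
         "size 52428800"]     # 50 MB, declared by the uploader, unverified

fields = dict(entry.split(" ", 1) for entry in imeta[1:])
if int(fields.get("size", 0)) > MAX_BYTES:
    print("skipping, declared size over limit:", fields["url"])
```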
You can HTTP HEAD it, and if that's not supported, download up to a specific maximum size and then abort if it's bigger. I would not want to rely on complicated setups scaling images for me, and I don't think these things belong anywhere other than in clients.
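Roughly like this, as a sketch in plain Python with urllib; the 5 MB cap and the helper name are arbitrary:

```python
# sketch: try HEAD first; if there is no usable Content-Length, stream the
# GET and abort once the byte count passes the cap. the 5 MB cap is arbitrary.
import urllib.request

MAX_BYTES = 5 * 1024 * 1024

def fetch_capped(url: str) -> bytes | None:
    head = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(head) as resp:
            length = resp.headers.get("Content-Length")
            if length is not None and int(length) > MAX_BYTES:
                return None                   # refused without downloading
    except Exception:
        pass                                  # HEAD unsupported; fall through
    buf = b""
    with urllib.request.urlopen(url) as resp:
        while len(buf) <= MAX_BYTES:
            chunk = resp.read(64 * 1024)
            if not chunk:
                return buf                    # finished under the cap
            buf += chunk
    return None                               # over the cap, abort mid-body
```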
This does not belong in specs, because it's totally dependent on the end user. It belongs in a client. If you can't determine the content length, the client should offer you the option not to download it. There is no one-size-fits-all possible here. On my home connection I wouldn't even be able to tell the difference between 50MB and 50KB. If you are somewhere with near-zero bandwidth, you should reconsider your priorities and maybe download zero external links and settle for notes only.
🐈 1 month ago
You can’t enforce this and there’s no such thing for a random image link … it just wouldn’t make much of a difference I’m afraid
🐈 1 month ago
Yeah, slow connection or text-only mode.