how do we solve this issue even
Replies (48)
Hopefully not with a centralized proxy. Clients can just block the download of larger files.
Something like squoosh for images or exgif for gifs?
I use Squoosh to optimize website speed for my clients all the time. Drops the file size like 97% even with lossless
Client-side? Ask the webserver for the size of the picture and ignore it if it is more than a specific size. Could even make that configurable.
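A minimal sketch of that check, assuming the host answers HEAD requests and sends a Content-Length header (many don't):

```typescript
// Ask the server how big the image is before committing to the download.
// Returns null when the server does not expose Content-Length.
async function imageSizeBytes(url: string): Promise<number | null> {
  const res = await fetch(url, { method: "HEAD" });
  const len = res.headers.get("content-length");
  return len === null ? null : Number(len);
}

// Example policy: skip anything over 1 MB (the threshold is arbitrary here).
const MAX_BYTES = 1_000_000;

async function shouldLoad(url: string): Promise<boolean> {
  const size = await imageSizeBytes(url);
  return size !== null && size <= MAX_BYTES;
}
```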
deterministic scaling functions that use fixed point numbers and multi-party verification of the relation between the original and the scaled versions, mainly.
i could easily write a sinc function using fixed point integers, like, as a pair of 64 bit limbs one being the integer part, and the other being fractional.
actually, for images, probably could just be fixing the point at byte 4's first bit
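a rough sketch of the idea, collapsing the two 64-bit limbs into one Q64.64 BigInt for brevity (the layout and names here are assumptions, not a spec):

```typescript
// Q64.64 fixed point: the high 64 bits are the integer part,
// the low 64 bits are the fractional part, packed into one BigInt.
const FRAC_BITS = 64n;

// Convert a small rational to fixed point without going through floats,
// so every client computes bit-identical filter coefficients.
function fromRatio(num: bigint, den: bigint): bigint {
  return (num << FRAC_BITS) / den;
}

// Multiply two fixed-point values (deterministically truncated).
function mul(a: bigint, b: bigint): bigint {
  return (a * b) >> FRAC_BITS;
}

// e.g. a box-filter weight of 1/3 applied to a pixel value of 200:
const w = fromRatio(1n, 3n);
const sample = fromRatio(200n, 1n);
const contribution = mul(w, sample);
console.log(contribution >> FRAC_BITS); // -> 66n, the same on every machine
```

the point being that the scaled pixel bytes, and therefore their hash, come out identical everywhere.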
in a decentralized network, client side.
Clients optimizing images on upload and not showing images above a certain size.
also, it would probably require a lossless compression format like png for the small versions, since the bitmap derived from a jpeg is also fixed, in order to maintain the deterministic math that allows the hashes to be related.
Education
But wouldn't you need the original image first? At which point you are merely adding compute to downscale it after you already obtained it?
fixed point scaling algorithms. the end.
the original has to exist somewhere, but it doesn't have to be everywhere
1. clients tell the user their image is stupid big. give them some direction on how to fix it
2. stop loading images larger than n
Clients implementing a limit when changing the image in kind 0 metadata. You really can't solve it if someone puts a huge file there though, besides just swapping in a robohash if the size is greater than x.
there are just 2 ways.
Have a proxy in front of your client, but that's for self hosting/advanced users.
normalize blossom servers and have them stream chunks of content rather than the full video (see the sketch below).
blossom servers should also be able to monetise any chunk served, with cashu channels, or whatever they are called.
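For the chunk-serving part, plain HTTP range requests would already cover it; a small sketch (the blossom URL below is a placeholder):

```typescript
// Fetch only a byte range of a blob instead of the whole file.
// Assumes the server honours Range; a 200 instead of 206 means it doesn't.
async function fetchChunk(url: string, start: number, end: number): Promise<ArrayBuffer> {
  const res = await fetch(url, { headers: { Range: `bytes=${start}-${end}` } });
  if (res.status !== 206) throw new Error("server ignored the Range header");
  return res.arrayBuffer();
}

// First 512 KiB of a (hypothetical) blossom blob:
// await fetchChunk("https://blossom.example/abc123", 0, 512 * 1024 - 1);
```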
Forced compression? If they don't want it, pay for a service. As for the loading by other viewers, could you put a data notification and manual load button for those with data concerns?
What's the issue? Why do you want to "censor" my 500 MB gifs? You should download them for free and relay them for free for me.
/sarc
Of course, it just feels very over-engineered and centralizing for something you can simply refuse to download after an HTTP HEAD.
How would you distribute the resized image? Where does the resizing take place? How do clients know about this?
Forced compression. There's literally no reason for multi-MB profile images.
Shitcoiner data-storage-maxi hypocrisy is palpable.
you can't force anything here
every client should do that but that's not a solution that works. there are too many clients that don't care.
Why do we even use gifs in 2025?? Why don’t we use videos or something? Gifs have been fucking retarded for 40 years
Kind 1984 reports on the kind 0 event id (not naddr), generated and consumed automatically by clients?
What a shocker, a prehistoric image format is inefficient for video 😭
overengineered?
you do realise that fixed point integer math done with CPUs is how all games did shit before GPUs, before FPUs
how would you distribute a file. let me see. oh yes! HTTP!
all it is, is this: a deterministic, repeatable, content-verifiable association of an original with its scaled-down version, which can be logged as a binding between one hash (the original) and another (the scaled version).
seriously, how under-utilized is your brain?
we aren't talking about distributing GGUF files to share an LLM around lol
seriously though. did i say anything about the network protocol part of this?
nope.
i was just saying, you can make a scale operation that can be repeated by any other holder of the original, attested to, and the small version distributed with relay hints (blossom hints?) to where you can get the original... IF YOU MUST. the only "complicated" part is that the algorithm should be deterministic, so that anyone with the original can create the same scaled down version, and so that anyone who knows the origin of the original (eg, the artist/photographer) can see that bob, charlie and dave all saw this same image with this same hash, that they performed this cheap scaling operation once, and then signed a document saying so. it doesn't take many of these to be strongly confirmed.
i'm not sure what you were expecting, that's really simple, and only marginally difficult to do right now if you just use a few image libraries.
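a rough sketch of the attestation half, with the deterministic scaler left to the caller; the kind number and tag names are invented purely for illustration, not an existing NIP:

```typescript
import { createHash } from "node:crypto";

const sha256hex = (data: Uint8Array): string =>
  createHash("sha256").update(data).digest("hex");

// Build an (unsigned) attestation binding the original's hash to the hash of
// its deterministically scaled version. `scaled` must be produced by the same
// fixed-point scaler on every machine so the hashes match.
function scaleAttestation(original: Uint8Array, scaled: Uint8Array, maxEdge: number) {
  return {
    kind: 30000, // hypothetical kind, chosen only for this example
    created_at: Math.floor(Date.now() / 1000),
    tags: [
      ["x", sha256hex(original)],                     // hash of the original
      ["scaled", sha256hex(scaled), String(maxEdge)], // hash of the small version
    ],
    content: "",
  };
}
// anyone else holding the original reruns the same scaler, gets the same
// scaled hash, and publishes a matching attestation; a handful of matching
// signatures is enough to consider the binding confirmed.
```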
hodlbod could explain Postel's Law for this case
it's really nbd tho, seriously. deterministic rescaling and distributed attestations would be enough to allow just a few to cache the original while others just store and share the scaled-down version, which refers back to it.
Additionally, who determines what size is ok and what is not? If you have plenty of bandwidth 50MB might even be acceptable whereas in some cases 50KB might be pushing the limits already, so any solution can only be on the receiving end i think.
If you do not want to waste large amounts of bandwidth the receiver needs controls for that. Either block it or do something like @ᴛʜᴇ ᴅᴇᴀᴛʜ ᴏꜰ ᴍʟᴇᴋᴜ is suggesting. I would settle for simply not showing and use a placeholder or something.
True, but you can have opinionated clients.
Test the bandwidth and send the smaller jpeg or png version.
This doesn’t work because by the time you check the size you already downloaded it 🤣 some services will pass a bit of info but that’s also not reliable
often you don't know how large a file is before you download it
I see you came here to make friends
Limit file size and hurt some feelings.
If your gif is bigger than your attention span, rejected.
Nah, just to un-f*ck the money.
it's possible to specify in the kind 0 spec that the picture value cannot point to an image larger than a specific size. it's also possible to change the spec to specify two images: small and full size.
although in reality, you load a profile picture once and can save whatever quality variant of it in local storage.
if someone wants to waste your bandwidth, they can post a single post with 100x 500 MB gifs anyway.
you could add soft recommendations for picture sizes in the spec for all images, as well as a bandwidth-saving setting which prevents automatic loading of images past a specified size.
also it could still be beneficial to have two image qualities for all image posts. often it's not necessary to load an 8K image in the feed, but a photographer may still want to publish the original high quality image.
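if the spec went that way, the kind 0 content could carry both variants; the small-picture field below is purely hypothetical and not part of any current NIP:

```typescript
// Hypothetical kind 0 content carrying both a full-size and a feed-sized
// variant. "picture_small" is invented here only to illustrate the idea.
const kind0Content = JSON.stringify({
  name: "alice",
  about: "example profile",
  picture: "https://files.example/alice-original.jpg",   // full quality
  picture_small: "https://files.example/alice-256.webp", // small variant for feeds
});
```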
It's not feasible to do any of this on a single client because another client can just ruin your day. You can implement compression and resizing just fine, but people can just share a link and it renders, or upload through another client.
You also can't really block large images, because by the time you see one it has already loaded. In some cases HTTP headers will solve this for you, but in many instances you won't have that info at all.
You are right that you can cache things, and it's not an issue then, but this is an issue particularly on mobile connections and particularly on slow ones.
For slow connections though we can disable automatic loading of all media (as some clients have done).
make Content-Length mandatory in the spec. make clients ignore images that do not follow the spec.
There’s no spec for a link …
that would require changing specs of kinds 0, 1, 20 etc.
"images should be only loaded from HTTP servers that return Content-Length HTTP header or specify Content-Length in imeta metadata"
We don't; this is not an issue.
You can HTTP HEAD it, and if that's not supported, download up to a specific maximum size and then abort if it's bigger.
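A sketch of that fallback: check Content-Length when available, otherwise stream the body and abort once a cap is exceeded (the cap itself is up to the client):

```typescript
// Try HEAD first; if Content-Length is missing, stream the body and bail out
// as soon as it exceeds maxBytes. Returns null when the file is too big.
async function fetchImageCapped(url: string, maxBytes: number): Promise<Uint8Array | null> {
  const head = await fetch(url, { method: "HEAD" }).catch(() => null);
  const len = head?.headers.get("content-length");
  if (len != null && Number(len) > maxBytes) return null;

  const controller = new AbortController();
  const res = await fetch(url, { signal: controller.signal });
  const reader = res.body!.getReader();
  const chunks: Uint8Array[] = [];
  let total = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    total += value.length;
    if (total > maxBytes) {
      controller.abort(); // stop wasting bandwidth on an oversized file
      return null;
    }
    chunks.push(value);
  }
  // Stitch the streamed chunks back into one buffer.
  const out = new Uint8Array(total);
  let offset = 0;
  for (const chunk of chunks) {
    out.set(chunk, offset);
    offset += chunk.length;
  }
  return out;
}
```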
I would not want to rely on complicated setups scaling it for me, and I don't think these things belong anywhere other than in clients.
This does not belong in specs because it's totally dependent on the end user. It belongs in a client. If you can't determine content length the client should offer you the option to not download it.
There is no one-size-fits-all possible here. On my home connection I wouldn't even be able to tell the difference between 50 MB and 50 KB. If you are somewhere with near-zero bandwidth you should reconsider your priorities and maybe download zero external links and settle for notes only.
You can’t enforce this and there’s no such thing for a random image link … it just wouldn’t make much of a difference I’m afraid
Yeah slow connection or text only mode