Introducing EXIT

You broke up with your ex; it wasn't treating you well. Maybe it was shadow banning you or your friends; it was manipulating you into becoming your worst possible self. But over the years you accrued a bunch of quality shitposts, and perhaps some nuggets of wisdom.

Introducing: https://exit.pub: the last bridge you'll need to port your data over into the new world of decentralized freedom-tech.

1. Download your Twitter archive
2. If Elon agrees, you'll get a zip file; uncompress it and just use exit.pub to import your data into nostr

✅ original dates are used; whatever you posted in 2009 will show up as posted in nostr in 2009
✅ granular control of which tweets to import (threads, non-replies, replies)
✅ V4V, you choose how much your shitposts are worth
✅ it *should* preserve embedded images
✅ granular control of which relays you want to publish to

Try it out: https://exit.pub
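The date-preservation point above can be sketched in a few lines. This is an illustrative sketch only, not exit.pub's actual code: it assumes the standard Twitter-archive date format and the NIP-01 event shape, and `tweet_to_unsigned_event` is a hypothetical name.

```python
from datetime import datetime

def tweet_to_unsigned_event(tweet: dict) -> dict:
    """Map one tweet from tweets.js to an unsigned nostr kind-1 event,
    keeping the original posting time (illustrative sketch only)."""
    # Twitter archives store dates like "Wed Oct 10 20:19:24 +0000 2018"
    created = datetime.strptime(tweet["created_at"], "%a %b %d %H:%M:%S %z %Y")
    return {
        "kind": 1,                               # NIP-01 text note
        "created_at": int(created.timestamp()),  # original date, so 2009 stays 2009
        "tags": [],
        "content": tweet["full_text"],
    }
```

Signing and the `id` field are omitted; a real importer would compute those per NIP-01 before publishing.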

Replies (75)

Some people were wondering what importing your tweets looks like. As you'll see, you get total control over what gets imported and which relays it is sent to. Here's a quick screencap:
k3tan 2 years ago
Does this work for suspended accounts? Asking for a friend.
Any chance you could add something to copy images to another host and replace the links in the new events? I imagine that X will delete media if people start deleting their accounts after using this tool. No need to lose years of quality memes if that happens.
I'm trying to import mine now, but my tweets.js file is 100MB and it seems to be timing out. 😟 I have clicked the 'wait for page to finish (or kill page)' button over a dozen times now. The tweets.js finally loaded and now I'm waiting for the events to generate. I also have a second, 60MB tweets-part1.js file to upload after this.
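For what it's worth, tweets.js in a Twitter archive is a JavaScript assignment (`window.YTD.tweets.part0 = [...]`) rather than plain JSON, so one workaround for very large files is to strip that prefix offline and split the array into smaller batches before feeding them to the browser. A minimal sketch, assuming only the archive's prefix format (`load_tweets_js` and `split_chunks` are hypothetical helpers, not part of exit.pub):

```python
import json

def load_tweets_js(text: str) -> list:
    """Strip the 'window.YTD.tweets.partN = ' prefix and parse the JSON array."""
    _, _, payload = text.partition("=")  # everything after the first '='
    return json.loads(payload)

def split_chunks(tweets: list, size: int = 1000) -> list:
    """Split the tweet list into smaller batches to import one at a time."""
    return [tweets[i:i + size] for i in range(0, len(tweets), size)]
```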
nobody 2 years ago
exit.pub is exited. blank page.. flooded by nostriches....
The event is only in Pablo's personal relay. Your client should have been able to fetch it using the hints that exist in nevent1qqsvmd3nzgspqwqkthfe2rh6xjfd98p5jn66nkjs6273ape7nmkkdlspzfmhxue69uhhyetvv9ujue3h0ghxjmczyrafsj7hmweg9ur7zmn6apajdg48hxuskujx53rhrux0ttjcqx84y9szxl5 Try `nak decode nevent1qqsvmd3nzgspqwqkthfe2rh6xjfd98p5jn66nkjs6273ape7nmkkdlspzfmhxue69uhhyetvv9ujue3h0ghxjmczyrafsj7hmweg9ur7zmn6apajdg48hxuskujx53rhrux0ttjcqx84y9szxl5` then `nak req -i cdb63312201038165dd3950efa3492d29c3494f5a9da50d2bd1e873e9eed66fe wss://relay.f7z.io | jq`
To be honest, I didn't get the metaphor about breaking up with your ex and accruing a bunch of shitposts.
Can you make a relay that charges per byte per month, stores events indefinitely on disk or S3 or somewhere cheap, and offers very limited querying capabilities?
I think you can list S3 objects by key prefix, so just save each event as a separate file with name = pubkey:created_at and be done? Or save each event under its id and keep local on-disk indexes.
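The pubkey:created_at layout described above could look like this. This is a toy in-memory model of an object store whose only query is listing keys by prefix (the `PrefixStore` name and its methods are hypothetical, not any real relay's API):

```python
class PrefixStore:
    """Toy model of an S3-style store that only supports prefix listing,
    as the comment suggests for a cheap archival relay (illustrative only)."""
    def __init__(self):
        self.objects = {}

    def put_event(self, event: dict) -> str:
        # Key layout from the comment: "<pubkey>:<created_at>".
        # Zero-padding the timestamp keeps lexicographic order chronological.
        key = f"{event['pubkey']}:{event['created_at']:010d}"
        self.objects[key] = event
        return key

    def by_author(self, pubkey: str) -> list:
        """The only supported 'query': one author's events, oldest first."""
        prefix = pubkey + ":"
        return [self.objects[k] for k in sorted(self.objects) if k.startswith(prefix)]
```

Against real S3 the prefix scan would be a ListObjectsV2 call with the `Prefix` parameter; anything richer (tag filters, full-text search) would indeed need the local indexes the comment mentions.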
Zombie Node 2 years ago
I wish I had this a couple months ago when I deleted that account. Great idea though!
lol wow that’s weird; it’s the same rendering engine! 😅 This would definitely not work on mobile though; I should put a warning or something
That makes sense, but the implications are also kind of scary 😂 I wonder if there's a way to leverage OpenTimestamps or similar to prove the ~blockheight of when something was posted. Maybe a service that aggregates and Merkleizes a bunch of notes, and puts some piece of data on chain.
The implication is that you could post anything as though it were posted in the past, and point to it as evidence, etc
True. But for certain types of content, I could see value in hashing some data into a block that proves it was posted at some relative (blockheight) point in time
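The aggregate-and-Merkleize idea above could work roughly like this: fold the note ids into a single root and timestamp only that root on chain, so one commitment proves every included note existed by that blockheight. A minimal sketch of the standard Merkle construction (not a specific OpenTimestamps API):

```python
import hashlib

def merkle_root(note_ids: list) -> str:
    """Fold a list of note ids into one 32-byte root; committing just this
    root on chain covers all of the notes at once (illustrative sketch)."""
    level = [hashlib.sha256(i.encode()).digest() for i in note_ids]
    while len(level) > 1:
        if len(level) % 2:  # duplicate the last hash on odd-sized levels
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0].hex()
```

Each note's inclusion can then be proven with a logarithmic-size path of sibling hashes, which is what makes aggregating many notes into one on-chain commitment cheap.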
Not only is this a great idea, it could also be helpful as a way to spread nostr as a brand. I'd quite like to see nostr mentioned in the same breath as when content creators recommend following their backup accounts, e.g. YouTubers saying 'follow me on Rumble in case I get banned off YouTube, and for my spicier takes'. I've yet to see this happen (outside of bitcoin circles), but it's very possible, and this is the sort of tool that might help. ;)
Hey @PABLOF7z Do you think you could add a way to restart an import if it fails for some reason? I am trying to import to my relay and I'm not sure why, but it fails at some point. I end up starting it again, but then it fails again at a different spot. It would be nice to be able to restart where it left off once an error occurred, instead of redoing the same events and failing at roughly the same point.
Should this thing still work? I've got a few thousand tweets I'm trying to import and the app keeps crashing and hanging at various stages, no matter how I try mixing and matching the various settings and relay configurations.
Close to 10,000. And almost every image it tries to fetch from Twitter fails, so that seems to slow things down quite a bit too. I think I have 32GB of RAM, but maybe my browser is choked in other ways. I'll try again later. Thanks!