Hello, fellow Nostriches. We at #GitCitadel have spent this week pricing out remote servers and brainstorming, considering the most cost-effective, but still performant, way to implement a public #gitserver based upon Nostr. To help us front the rental fees and lower the price for individual npubs, we have started a new funding goal.

The resulting server will host multiple #gnostr implementations and tools. We are planning to begin with:

* the #GitRepublic API, which will make it possible to log in to git with your Nostr signer and will replace e-mail communication with Nostr notes,
* an HTTP-based git server (the first thing to go online, so that everyone can move or mirror their repos),
* a paid AUTH relay https://gitcitadel.nostr1.com (currently 5000 sats/month, to cover all the running costs, but we hope to bring that down dramatically thanks to your donations),
* an #Alexandria instance focused on technical documentation and styled for dev users,
* and an instance that displays the repos on the server.

It's difficult to compete with a "free" public service like GitHub or Codeberg, but we're hoping that tighter integration with Nostr will add value for the developers using it and for the public interested in browsing the PoW stored on it. Through the integration of Nostr and Lightning, we are leaning into comfortable, Nostrized, friction-free logins and will facilitate V4V payments for Nostr's hard-working devs.

Many thanks in advance, and may you have a good morning.

Replies (29)

Is there a need for a custom API? Could you use repository state events to authorise changes instead?
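For what that suggestion might look like in practice, here is a minimal sketch assuming NIP-34-style repository announcement/state events that carry a `maintainers` tag listing pubkeys allowed to push. The kind number, tag layout, and pubkey values below are illustrative and should be checked against the NIP, not taken as the actual GitRepublic design:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// event is a minimal view of a Nostr event; only the fields needed here.
type event struct {
	Kind   int        `json:"kind"`
	PubKey string     `json:"pubkey"`
	Tags   [][]string `json:"tags"`
}

// canPush reports whether pusher is the repo owner or is listed in a
// "maintainers" tag of the latest repository announcement event.
func canPush(repoAnnouncement event, pusher string) bool {
	if repoAnnouncement.PubKey == pusher {
		return true
	}
	for _, tag := range repoAnnouncement.Tags {
		if len(tag) > 1 && tag[0] == "maintainers" {
			for _, pk := range tag[1:] {
				if pk == pusher {
					return true
				}
			}
		}
	}
	return false
}

func main() {
	// Hypothetical announcement event: owner plus two maintainer pubkeys.
	raw := `{"kind":30617,"pubkey":"ownerpk","tags":[["d","gitrepublic"],["maintainers","devpk1","devpk2"]]}`
	var ev event
	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
		panic(err)
	}
	fmt.Println(canPush(ev, "devpk2")) // true
	fmt.Println(canPush(ev, "strangerpk")) // false
}
```

The appeal of this approach is that authorisation lives in signed events the server merely verifies, rather than in a separate account database.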
Good luck! I can see that lots of folks are excited about this (or at least I got tagged in more than one giant thread about it). I’d certainly like to self-host GitServer in the future if it can run on a cheap VPS. I know that to attract any sort of code contributions these days, you have to be on GitHub. A lot of devs won’t even touch GitLab or Codeberg, let alone self-hosted Git servers. Maybe Nostr can change the landscape a bit? Otherwise, even if it means mirroring my stuff to GitHub, it would be great to have Nostr-oriented infrastructure that Nostr devs can rely on.
I forgot to hit send on this earlier today, but speaking of David vs. Goliath, if you haven’t done it already, make sure to tarpit Alexandria, TheForest1, GitCitadel, and anything else you’re deploying that is expected to hold a lot of data. Otherwise, scrapers will come for the goods, overload the infrastructure and cost you quite a few sats.
this stuff annoys the hell out of me too. i run my test #realy fairly infrequently, but within minutes some bot is trying to scrape it for WoT-relevant events, and i wish there were an effective way to slow that down. i tried adding one kind of rate limiter, but that didn't really work out so well. in the past, what i've seen work best is where the relay just stops answering and drops everything that comes in... if that included pings, the other side would probably disconnect automatically. i've now added a plain HTML interface, and i think that needs something too, but maybe simpler: if it gets a query more than once every 5 seconds for 5 such periods in a row, it steadily adds more and more delay to processing. the difference from sockets is that to do it over HTTP it has to associate the limit with an IP address
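A minimal sketch of that escalating-delay idea, assuming a plain net/http handler and in-memory per-IP state; the `ipState`/`tarpit` names and the 500 ms step are made up for illustration, not taken from realy:

```go
package main

import (
	"net"
	"net/http"
	"sync"
	"time"
)

// ipState tracks how often a client breaches the "one query per 5 seconds" budget.
type ipState struct {
	lastSeen time.Time
	strikes  int // consecutive requests arriving inside the 5-second window
}

// tarpit wraps an http.Handler and delays clients that keep querying too fast.
func tarpit(next http.Handler) http.Handler {
	var (
		mu     sync.Mutex
		states = make(map[string]*ipState)
	)
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		host, _, err := net.SplitHostPort(r.RemoteAddr)
		if err != nil {
			host = r.RemoteAddr
		}

		mu.Lock()
		st, ok := states[host]
		if !ok {
			st = &ipState{}
			states[host] = st
		}
		now := time.Now()
		if now.Sub(st.lastSeen) < 5*time.Second {
			st.strikes++ // another request inside the 5-second window
		} else if st.strikes > 0 {
			st.strikes-- // well-behaved again, let the penalty decay
		}
		st.lastSeen = now
		delay := time.Duration(st.strikes) * 500 * time.Millisecond // grows with every strike
		mu.Unlock()

		if delay > 0 {
			time.Sleep(delay) // the "steadily adds more and more delay" part
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("ok\n"))
	})
	http.ListenAndServe(":8080", tarpit(mux))
}
```

Well-behaved clients never accumulate strikes and see no delay, while a scraper hammering the endpoint waits longer on every request without the server having to hold a connection open or return errors it might retry.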
yeah, they are only reading, not writing, so all that's really required is to slow them down. i could probably also split the directly-followed vs follows-of-follows into two tiers, so the second level of depth gets lower service quality. maybe i should work on it today, because this problem with scrapers is fairly annoying. it's only nostr WoT spiders at this point. i wish i could just suggest to the spider people that they open a subscription for these events and catch all the new stuff live; that would be more efficient for them and less expensive for relay operators. it's fairly early in the game for that stuff, so maybe some education of WoT devs might help... cc: @Pip the WoT guy. there is no point in doing sliding-window spidering on events once you have the history; from then on, just open a subscription. this would also be friendlier for relays if the AI spiders had a way to just catch new updates, but unless they pay me, i'm just gonna tarpit them
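A sketch of the "just open a subscription" approach: a live REQ with `since` set to now, so a spider that already has the history only receives new events as they are published. The relay URL is a placeholder, the gorilla/websocket dependency is an assumption, and kind 3 follow lists stand in here for "WoT-relevant events":

```go
package main

import (
	"encoding/json"
	"log"
	"time"

	"github.com/gorilla/websocket"
)

// filter is a minimal Nostr REQ filter: only events newer than Since.
type filter struct {
	Kinds []int `json:"kinds"`
	Since int64 `json:"since"`
}

func main() {
	// Placeholder relay URL; swap in the relay you actually want to follow.
	conn, _, err := websocket.DefaultDialer.Dial("wss://relay.example.com", nil)
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	// REQ with since = now: no sliding-window backfill, just new events live.
	req := []interface{}{"REQ", "wot-live", filter{
		Kinds: []int{3}, // kind 3 follow lists, the events a WoT spider cares about
		Since: time.Now().Unix(),
	}}
	if err := conn.WriteJSON(req); err != nil {
		log.Fatal(err)
	}

	for {
		_, msg, err := conn.ReadMessage()
		if err != nil {
			log.Fatal(err)
		}
		var envelope []json.RawMessage
		if err := json.Unmarshal(msg, &envelope); err != nil {
			continue // ignore anything that isn't a JSON array envelope
		}
		log.Printf("got %s", msg) // handle ["EVENT","wot-live",{...}] messages here
	}
}
```

One long-lived subscription like this costs the relay a single connection instead of a stream of repeated range queries over data it has already served.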
Yeah, I don't get the point of spidering data you can just stream. Like, they spider Nostr websites and it's like... just subscribe to the relay. *roll eyes*
ok, so it's not you! no problem, it seems like you put a lot of thought into doing that right. yeah, the behaviour i have observed from this one particular spider is just dumb. you're the only person i know who is doing WoT analytics, so whoever it is, shame on them.
I'm going to give you a ticket for making the interface more reactive. That's fun and makes it easier to use on a reduced screen size. We can fight about it in the ticket. 😁
haha, yes, in the documentation they talk about how you can load other UIs as well as use different CSS and whatnot. it's not really my thing, but it doesn't look nice on a 1280-wide display like my side monitor (a 2k monitor in portrait). if you were to do a little homework to scope out other options for RESTful UIs, i most likely could switch it; this is just the default for Huma
No stress. Just pull the ticket when you're in the mood to work on it. Always good to have some tickets lined up. Need to define them when you see them, and then pull them according to prioritization (and mood 😂).
Yeah, right now I just have a user-agent list, and that does most of the work; the bots hitting me mostly have "~bot" in the user-agent field, so it's been staving them off for now. I have to hop in and update the list every few days, but it helps. I was interested in rDNS lookups for blocking, since most of the IPs I've dealt with come from rDNS bot sources like ByteDance, OpenAI and so on.
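A minimal sketch of the rDNS idea: resolve the client IP to hostnames and block on suffix. The blocklist entries below are made up from the providers mentioned and should be replaced with whatever actually shows up in your logs; a follow-up forward lookup (FCrDNS) would make the PTR record harder to spoof:

```go
package main

import (
	"fmt"
	"net"
	"strings"
)

// Hypothetical blocklist of reverse-DNS suffixes; adjust to match your logs.
var blockedSuffixes = []string{
	".bytedance.com.",
	".openai.com.",
}

// isBlockedByRDNS resolves the client IP back to hostnames and checks them
// against the blocklist. A missing PTR record is treated as "not blocked".
func isBlockedByRDNS(ip string) bool {
	names, err := net.LookupAddr(ip)
	if err != nil {
		return false
	}
	for _, name := range names {
		for _, suffix := range blockedSuffixes {
			if strings.HasSuffix(strings.ToLower(name), suffix) {
				return true
			}
		}
	}
	return false
}

func main() {
	// Documentation-range IP, just for illustration.
	fmt.Println(isBlockedByRDNS("203.0.113.10"))
}
```

Since DNS lookups are slow relative to serving a request, in practice you would cache the verdict per IP rather than resolving on every hit.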