Tim Bouma's avatar
Tim Bouma
trbouma@getsafebox.app
npub1q6mc...x7d5
| Independent Self | Pug Lover | Published Author | #SovEng Alum | #Cashu OG | #OpenSats Grantee x2 | #Nosfabrica Prize Winner
Tim Bouma 3 weeks ago
Wait til the bots find out there is only 21M BTC.
Tim Bouma 3 weeks ago
Fascinating how warfare plays out in the digital-satellite realm.
Tim Bouma 3 weeks ago
In the end, people are looking for simple things: exact appearance and exact meaning. Combine Adobe PDFs with Nostr signed events (PRFs, Portable Record Formats) and you get both. This is how we are going to win.
Tim Bouma 3 weeks ago
Barring teenagers from social media will do more harm than good
The Economist (North America), Feb 14, 2026

PEOPLE DON'T agree on much these days. But one thing brings them together, whatever their politics: the idea that, because social media harm children and teenagers, they should be banned from using them. In December Australia stopped under-16s from having accounts on platforms including Facebook, Instagram and TikTok. A dozen countries, including Britain and Spain, are now toying with the idea; so are legislators in many American states. More than 70% of Britons support kicking under-16s off such sites, as do two-thirds of Americans. The problem is, bans will do more harm than good.

The proposals arise from an understandable desire to keep youngsters safe and healthy. Parents have been shaken by tragedies in which social media have played a role, of children being tricked into sharing explicit pictures of themselves, or taking their own lives after algorithms shovelled them content about self-harm. Along with those shocking cases comes a second, more general worry: that social media might be damaging children as a group, making them reserved, lonely and anxious. People are desperate to understand why today's youngsters seem unhappier than the generations that came before. Even parents who are confident they can keep their children from serious harm fret that their offspring are wasting hours scrolling through mindless memes. They would like to stop them, but fear that their kids will suffer if they are the only members of their class not on the apps. Blanket bans appear to offer an easy answer—and politicians are only too happy to seize on a measure that, for once, pleases voters from all parties.

Yet policymakers should reconsider. The question of whether social media are causing mass harm is far from settled. Growing evidence suggests they are bad for at least some children. But, as we explain this week, the claim that social media cause great damage to the mental health of young people as a whole has only limited evidence (see Briefing).

And even if you wanted to ban social media as a precaution pending conclusive findings, such measures threaten to be counterproductive. One problem is that enforcing bans is hard; teenagers in Australia are finding ingenious ways to get round them, by scrunching up their faces to look older. Defining social media is hard, too. Australia has not banned young teenagers from messaging apps such as WhatsApp, or from multiplayer online games, because that would have seemed too draconian; scourges such as cyberbullying will doubtless continue on these. Kids barred from mainstream sites could flock to obscure ones, and fall victim to predators there. Children who evade the blocks may be less likely to tell adults if they find something horrid, for fear of being told off. Higher age limits may just delay problems until youngsters are 16, when they will suddenly gain full access to social sites that they do not have much experience of using. And all the while, higher age limits for social media may provide a false sense of security. It is for all these reasons that bans are often opposed by child-protection groups.

Moreover, the proponents of bans ignore how they would deprive children of the benefits of social media. They are a blessing to children who feel isolated: perhaps because of their location, their sexuality, or because their brains work differently from those of others. Social media can broaden young minds, giving children from all backgrounds a window onto fresh places and people. Like it or not, social sites are now one of the main ways children obtain information (as well as misinformation) about current affairs. It used to be easy for youngsters to pick up their parents' printed newspapers. They sat through news bulletins aired before or after their favourite shows. Those days are no more.

Teenagers who are turfed off TikTok will not instantly begin climbing trees or poring over books. Many will slump for longer in front of games consoles and streaming services. One reason they spend so many hours online is that parents long ago stopped letting them hang around outside with friends. Having chased them indoors, adults should now think twice about placing further prohibitions on their free time.

What to do? Rather than raise age limits, regulators should redouble efforts to make social sites more suitable for teens. Ideally they would force web firms to cough up more data on how teenagers use their products—the better to help researchers measure harms, and come up with ways to prevent them. They should tell tech giants to rethink features that are keeping kids online longer than is healthy, such as interfaces that permit endless scrolling and videos that play without prompting. They should demand sterner moderation of the content being served up to young users. This may require greater efforts to verify the age of social-media users, in order to work out which ones must surf with guardrails and which are adults who may go without.

Some observers find these ideas laughable. One reason people demand higher age limits is that they believe social apps cannot be made safer. That ignores the direction of travel. America is gearing up for a series of blockbuster trials—years in the making—that will finally give people who say they were harmed by sites as children a chance to make their case in court. The European Union has just issued a preliminary ruling that features of TikTok's design are "addictive" and threatened fines if it doesn't change. Lately, many of the big social apps have been cajoled into creating "teen" accounts that come with additional safeguards. These things will not solve every single worry. But all of them are progress of a sort.

Teens, screens and memes

Politicians say their social-media bans are the only responsible option. In fact, they look like a way of ducking the care children deserve. If regulators cannot find ways to tame social media—now over two decades old—what hope is there to let children use novel tools such as artificial intelligence? Youngsters have a right to share in new technologies. Adults must seek to make their time online as safe and as rewarding as possible. ■

Shared via PressReader
Tim Bouma 3 weeks ago
It took me years and countless hours of development to crunch it down to this simplicity. Adobe PDFs solved the exact appearance problem. Device independence. Nostr Events solved the exact meaning problem. Platform independence. This is what I am aiming for with #nostr #safebox - a global capability that is independent of device and platform.
Tim Bouma 3 weeks ago
Adobe solved the exact appearance problem. Before PDF, a document could shift fonts, margins and pagination; meaning changed because layout changed. Adobe Inc. gave us the Portable Document Format: a file that renders identically across machines. The page became stable. Appearance became portable.

Nostr solved the exact meaning problem. Before Nostr, a statement's meaning depended on platforms, databases, and institutional context. Nostr binds content to a key and a hash. The author signs exact bytes. If anything changes, the ID changes. Meaning becomes portable.

PDF fixed how a document looks. Nostr fixes what a record is. Adobe gave us the Portable Document Format. Nostr gives us the Portable Record Format. Appearance became stable. Meaning became verifiable. From portable pages to portable truth.
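The "signs exact bytes" idea can be sketched in a few lines of Python. This is a minimal illustration of the NIP-01 event-id rule (the id is the SHA-256 of a canonical JSON serialization of the event fields), not Safebox code; the pubkey and content values are made-up placeholders, and the string-escaping corner cases of the spec are glossed over.

```python
import hashlib
import json

def nostr_event_id(pubkey: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    # NIP-01: the event id is the SHA-256 of the UTF-8 JSON
    # serialization of [0, pubkey, created_at, kind, tags, content],
    # with no extra whitespace.
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# The author signs exact bytes: change anything and the id changes.
original = nostr_event_id("ab" * 32, 1700000000, 1, [], "exact meaning")
altered  = nostr_event_id("ab" * 32, 1700000000, 1, [], "exact meaning.")
assert original != altered
```

Signing that id with the author's key (via a Schnorr/secp256k1 library) is what seals the record; any relay or client can recompute the hash and check the signature without trusting whoever carried the bytes.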
Tim Bouma 3 weeks ago
Nostr is a standardized shipping container for context and meaning — cryptographically sealed by its originator and verifiable anywhere without unpacking.
Tim Bouma 3 weeks ago
Nostr standardizes the transport of cryptographically sealed context the way shipping containers standardized the transport of goods.
Tim Bouma 3 weeks ago
A signed nostr event is like a standardized shipping container. It is a cryptographic context container.
Tim Bouma 3 weeks ago
Kremlin Bars WhatsApp To Boost Own Platform
By Matthew Luxmoore, The Wall Street Journal, Feb 13, 2026

Russia is restricting access to WhatsApp as it accelerates its campaign to shift Russians away from Western-based messaging platforms outside its control and onto homegrown equivalents approved by the government. The Kremlin on Thursday confirmed it was blocking WhatsApp. It also is throttling Telegram, now based in the United Arab Emirates, which is widely used by Russian soldiers. It has said companies that own platforms operating inside Russia, including Telegram and WhatsApp, are violating Russian law by refusing to comply with restrictions.

WhatsApp, which is owned by Meta, on Thursday said it would do whatever it could to keep 100 million users inside Russia connected. "Today the Russian government attempted to fully block WhatsApp in an effort to drive people to a state-owned surveillance app," it said on X. Pavel Durov, the Russian-born founder of Telegram, on Tuesday said Russia was forcing its citizens to switch to "a state-controlled app built for surveillance and political censorship."

The app in question is Max, a messenger that has features similar to WhatsApp and Telegram but is state-controlled and offers no encryption, tech experts said. Russia is rolling out new features for Max and hopes it will become a homegrown super-app comparable to China's WeChat, with a banking element and a portal to government services.

Moscow has for years tried to restrict Russians' access to Western apps and internet platforms, but some previous efforts have failed. When it first tried to ban Telegram in 2018, people took to the streets. The government later abandoned the effort. The push to create a censored internet similar to China's accelerated after Moscow's full-scale invasion of Ukraine in 2022. Russia shut off access to Meta's Facebook and Instagram, X, and Google's Google News service. Among major non-Russian social platforms that offer broad access to independent news, only Telegram and WhatsApp avoided being shut off.

Millions of Russians downloaded virtual private networks to bypass the restrictions, and Moscow began to identify VPN services in a bid to also block them. The widespread use of VPNs by Russians, triggered by earlier internet restrictions, makes it far harder for Russia to enforce restrictions today. Still, Max is gaining traction with Russian users, boosted by a push by officials and on state media to portray it as a safer alternative to WhatsApp and Telegram. Government institutions, including schools, are being told to use Max as a basis for communicating with staff and students.

Beyond the public statements from Meta and Telegram this week, the restrictions also have prompted criticism inside Russia. When news emerged that Moscow was throttling Telegram speeds this week, Russian war bloggers typically supportive of the Kremlin said the move risked impeding the Kremlin's war effort in Ukraine. "Switching off Telegram will first and foremost affect us," one war blogger with more than one million Telegram followers posted. "For our servicemen this is a life buoy, because it ensures constant and fast communication."
Tim Bouma 3 weeks ago
Nostr Safebox gives communities the ability to keep their own records and remit value directly, using shared cryptographic rules instead of centralized control.
Tim Bouma 3 weeks ago
For #nostr #safebox, I need someone to combine #nostr relays, #blossom servers, and #cashu mints into a single server implementation. Thank you for your attention to this matter.