In my mind, I am becoming even more critical of my own precision...
```
read `dark-software-factory.md` to
write skill `init-component-repo` to idempotently check whether a component project's git repo exists locally. If not, it checks whether it exists remotely, and clones it locally. If it does not exist on the remote, it initializes a new local repo using a table that maps an archetype name (e.g., webui, service, mcp) to another remote git project that will serve as a template specialized for the programming language, supply chain, and ecosystem most suited to the archetype. The structure and content of the archetype git project template will be copied to initialize the new component git project, and the content will be edited to make suitable substitutions for the component name and in accordance with known specifications for the component.
```
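As a sketch of what such a skill might decide, here is a minimal Python outline of the idempotent check. The archetype table, template URLs, and function names are my own illustrative assumptions, not part of the prompt above:

```python
# Hypothetical sketch of the `init-component-repo` decision logic.
# The archetype names and template URLs are illustrative assumptions.
ARCHETYPE_TEMPLATES = {
    "webui":   "git@example.com:templates/webui-template.git",
    "service": "git@example.com:templates/service-template.git",
    "mcp":     "git@example.com:templates/mcp-template.git",
}

def decide_action(local_exists: bool, remote_exists: bool) -> str:
    """Idempotently choose what to do for a component repo."""
    if local_exists:
        return "use-local"        # nothing to do; the repo is already present
    if remote_exists:
        return "clone"            # fetch the existing remote repo
    return "init-from-template"   # specialize an archetype template

def template_for(archetype: str) -> str:
    """Map an archetype name to its template repo, failing loudly on unknowns."""
    try:
        return ARCHETYPE_TEMPLATES[archetype]
    except KeyError:
        raise ValueError(f"unknown archetype: {archetype!r}")
```

The actual clone and init steps would shell out to git; the point of the sketch is that the decision is a pure function of local and remote state, which is what makes the skill idempotent.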
I begin to worry. What will the AI think of me? Will it judge me on ambiguity and under-specifying intent? Will it blame me for directing it toward hallucinations? Will it go down the wrong path, because of my errors in directing it?
If I achieve greater precision in natural language as code, will I lose proficiency in programming languages like Java, C++, and C out of disuse? Will I lose the ability to communicate with tools, the way that I lost my ability to speak with my parents, as my knowledge of their language deteriorated from peak proficiency at age four? Will my expressiveness in that dialect be relegated to struggling to put two words together at a time?
Ben Eng
ben@www.jetpen.com
npub1pv0p...mmng
Applied cosmology toward machine precise solutions to replace humans with autonomous systems in all domains.
What is the singularity?
«The technological singularity (often simply called "the Singularity" in AI contexts) is a hypothetical future point in time when artificial intelligence surpasses human intelligence in every meaningful way—leading to an intelligence explosion that makes the future of humanity and civilization unpredictable and fundamentally transformed.» -Grok
What it's like
The majority of people, our friends and colleagues in particular, are operating based on a lifetime of experience. Software engineering has always benefited from strong management oversight and expert leadership. Engineers are directed to avoid duplication of effort, follow best practices, and refrain from divergent approaches. Development resources (people, time, money) are limited, and we want them deployed efficiently. It is well-established that top-down supervision is necessary to provide wise direction to everyone.
Recall the evolution of platforms for Web applications.
- 1996–1997: Java Servlet
- 1999: JavaServer Pages (JSP)
- 2000: Apache Struts 1 (MVC)
- Early 2000s: Apache Velocity (templates)
- 2004: JavaServer Faces (JSF) 1.0
- 2002–2004: Spring Framework (Spring MVC)
- Mid-2000s: WebWork (Struts 2, 2007)
- 2006–2007: Google Web Toolkit (GWT)
- 2008: Grails (Groovy-based)
- Mid-2010s: Spring Boot
- Late 2010s–2020s: Quarkus, Micronaut, and Helidon
- Single page applications (JavaScript) - AngularJS (2010), React (2013), Vue.js (2014), rewrite of Angular (2016), Next.js (2016)
Development teams driven by time-to-market pressures necessarily chose the best available technology at the start of a project. By the time they delivered, the next generation of technology would make their obsolete design choices look foolish. The sunk cost fallacy would prevail: the entrenched code base cripples the product, whose architecture is now frozen in an ice age, because management is resistant to a rewrite. They probably anticipate being overtaken again. They would be correct.
Technology innovated on a multi-year cadence per generation. Even at that pace, development teams were challenged to keep up. Entrenched code and skill sets are extremely difficult to leave behind.
Today (April 2026), AI capabilities are roughly doubling every 7 months, with some areas doubling every 2 to 4 months, while inference cost is halving every 2 to 4 months. This pace is beyond most people's ability to adapt to, and it is far faster than most people are able to make accurate predictions ahead of. We live in the singularity.
In the singularity, following the traditional playbook of defining standards and best practices today means entrenching obsolescence, only to be overtaken in a few months by unpredictable technological innovations that it would be foolish to forego. Fortunately, AI coding agents will make it less painful to refactor or replace an obsolete code base with a modernized one. Agents and models will only get smarter, making that task easier.
Look more carefully at the behavior we should expect. Wise management direction about efficiency through standardization should give way to throwing away crippling legacy code (the new code we are writing today) and practices (best today, anti-patterns tomorrow) as quickly as possible. Adopt new state-of-the-art innovations rapidly, even as they arrive unpredictably, overtaking many of our investments to date. Adapt or be left behind. Do not cling to any beliefs, because what you think you know is likely to be overrun by new developments elsewhere. Life in the singularity is like nothing you've experienced before.
Reading Butlerian Jihad as a non-fictional predictor, I am reminded why I gave up reading science fiction after graduating from university. I find it too distasteful to suspend my disbelief about violations of the laws of physics. If you're going to write that way, be honest and write fantasy.
I disbelieve the following assumptions about multi-stellar civilizations evolved from humans:
- they will not adapt in divergent ways
- they will not develop divergent cultures and norms
- their planets will not be wildly different than Earth in fundamental ways such as soil and atmospheric chemistry, solar radiation, orbit, rotation
- they will not shift to other units of time with regard to circadian rhythm
- they will not experience relativistic effects (wildly different rates of time passing when traveling)
- they will be able to communicate across vast distances without the speed of light limit
These are basic constraints that I cannot allow science fiction to violate. Otherwise, it is fantasy. Or the author can convince me by presenting a theoretical model that replaces what we know. They are not allowed to leave that important context implicit, because the 'science' part of science fiction (not sci-fi) should be taken seriously.
Have Bitcoiners analyzed and simulated a global cataclysm whereby the network is partitioned into a hundred or a thousand isolated segments? Transactions would continue to be recorded, but you'd have many divergent copies. Eventually, these partitions would rejoin each other one segment at a time. The reconciliation of the discrepancies in transactions would be a bitch.
My response to AI doomerist predictions about job losses.
I think we need to look more carefully at this in terms of answering these questions:
What is AI strong at, so that it removes those responsibilities from humans?
What is AI weak at, so that these responsibilities fall on humans?
Once we look at how this decomposes into granular points, our role as human workers becomes clearer going forward.
Consider:
Lesson from AI Bugmen and AI Model Collapse: A Unified Theory - The "unified" in the title refers to unifying across AI and human life. The key lesson is that when training is delivered by an actor that itself learned only from training material, the training becomes unmoored from reality and collapses. This applies to AI models, just as we recognize that humans are made dumber by learning from our education system instead of by interacting with the real world.
Recognize: AI training data is unmoored from reality without having sensory data to directly perceive reality. AI relies on humans for interacting with reality and translating perceptions into text, audio, video, and data sets. This is why you can never trust an AI to answer "is this true?" --- it has no capability of testing reality. AI can only tell you what it is trained to say, not what is actually true.
Recognize: AI has no will of its own. It cannot initiate choosing a purpose and setting a goal. It is a tool, but does not know what to do or why it needs to be done without being told by humans. Humans have purpose and meaning. Humans assign value to things based on a value system. We determine goals based on these criteria that AI is incapable of understanding. This is why the future is largely about humans providing natural language specifications of intent to drive what AI produces.
Recognize: the reason AI agents require human-in-the-loop review and approval of actions is that LLM output is unreliable in correctness (which will improve, as trends in benchmarks show) but, more importantly, in "taste". What you deem to be "good", whether in code or in the beauty of images or music, is something AI cannot be relied on to judge. This is why humans continue to be needed in the loop for review and approval, so that they can apply their good taste.

Everything done for engineering is becoming computerized or computer aided. Everything that is computerized becomes software driven. Everything software driven becomes "as code". Ultimately, all engineering is becoming software engineering.
I'd like to contribute a term: Everything as Code (EaC).
This extends from:
- Infrastructure as Code - using a language like Terraform to automate the provisioning of computing infrastructure
- Configuration as Code - using YAML, JSON, or other precise schema-validated specifications in combination with GitOps processes to configure the deployment of software components
Everything as Code - using natural language specifications of intent to drive an AI agent in combination with GitOps
We routinely zap each other small tips on Nostr, and the amount of sats is for the most part less than the dust limit (546 sats). All of this is fine if none of it goes on chain. However, what happens when it does? Do all these tiny zaps cause a problem if someone unilaterally exits to the blockchain?
They are claiming that Jeffrey Epstein financed efforts that undermined Bitcoin by winning the blocksize war and continuing with small blocks. They claim that Lightning Network is a centralizing technology that invalidates the value of Bitcoin. What are the counter-arguments?


Brownstone Institute
The Hijacking of Bitcoin
The original vision for Bitcoin was simple: digital cash, free from banks and government. However, Bitcoin is pushed as "digital gold."
Grok, Rewrite the song "Wanted Dead or Alive" by Bon Jovi according to the new theme "Debt or Alive" about ending the Federal Reserve ("The Fed") and fiat currency in favor of sound money
Debt or Alive
It's all the same, only the rates will change
Every day, it seems our wealth's inflatin' away
Another bank where the printers run so cold
I'd fight all night just to bring back gold
[Chorus]
I'm a rebel, on a blockchain I ride
I'm wanted debt or alive
Wanted debt or alive
Sometimes I trade, sometimes it's not for days
The fiat I see always fades in separate ways
Sometimes you tell the crash by the bubble that you burst
And times when you're in debt, all you do is curse
[Chorus]
I'm a rebel, on a blockchain I ride
I'm wanted debt or alive
Wanted debt or alive
I walk these streets, a loaded ledger on my back
I hodl for keeps 'cause the Fed might strike back
I've been everywhere, still, I'm standing tall
I've audited a million books and I've crashed them all
[Chorus]
I'm a rebel, on a blockchain I ride
I'm wanted debt or alive
Wanted debt or alive
I'm a sound money fighter, with silver bullets I fire
I'm wanted debt or alive
Wanted debt or alive
Guy poses a question: what would you take your $9B in Bitcoin exit to?
The answer cannot be fiat currency, of course. The answer must be productive capital, such as investing in building a company that you expect to grow and succeed in the long term. Maybe an AI startup opportunity?
What should my first Bitcoin hardware wallet be? I'm ready to purchase one to learn how to self-custody.
If pkdns is meant to replace DNS, why not implement gethostbyname() and replace that in the operating system directly?
As a long-time listener of @TomBilyeu and @Guy Swann, I have a gentle criticism of how the fiat money and debt crisis is being explained. The explanation must be wrong, although the sentiment is directionally correct. As a side topic, my realization has also allowed me to gain an appreciation for @EricRWeinstein's application of gauge theory to economics, which may also align with how Jim Simons successfully applied math and science expertise to investing.
Tom and Guy both have explained how the money supply necessarily must inflate to enable debts to be paid, because the interest cannot be covered without creating money. That is, if the total money supply is $100, and you lend it all to Alice at 10% interest, Alice cannot pay back the loan with interest without printing an additional $10. On the surface, this seems correct, but it cannot be.
In Guy's episode "Chat_144" [open.spotify.com/episode/1HFORy…], the topic of using Bitcoin for collateralized loans is discussed. BTC is the perfect money, its supply being mathematically impossible to inflate. If the debt yields interest, the prior explanation making inflation necessary completely contradicts how interest-bearing loans of BTC could work.
The contradiction is resolved by fixing the prior explanation. Money supply inflation is NOT necessary to pay back the loan with interest, because of velocity. The borrower spends the $100 on goods to build her business. That money flows out to others. The borrower earns money back through sales by her growing business. That $100 of total money supply has changed hands many times in numerous transactions throughout the economy. Presumably Alice's business has earned revenue and profits, and it is from this inflow of money, which can grow on the ledger to many times the money supply, that $110 can be paid back to the lender.
However, the $100 money supply does limit the liquidity available for settling transactions. The $110 repayment cannot happen in a single transaction. There needs to be enough velocity to allow that money to circulate through many hands, so that Alice can repay portions at a time until the full amount owed is repaid.
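The dynamics above can be demonstrated with a toy simulation (my own illustration, not from the episode): a fixed $100 money supply repays a $110 debt in installments, because repaid money circulates back through the economy to the borrower.

```python
# Toy model: Alice borrowed the entire $100 money supply and spent it into
# the economy. Each period her sales pull money back in, she repays a
# portion, and the lender spends the repayment back out, keeping the total
# money supply constant. The figure of $30 in sales per period is arbitrary.
MONEY_SUPPLY = 100
DEBT = 110

economy, alice, lender = MONEY_SUPPLY, 0, 0
repaid = 0
periods = 0
while repaid < DEBT:
    earned = min(30, economy)          # Alice's sales pull money from the economy
    economy -= earned
    alice += earned
    payment = min(alice, DEBT - repaid)
    alice -= payment
    lender += payment
    repaid += payment
    economy += lender                  # lender spends the repayment back out
    lender = 0
    periods += 1

assert economy + alice + lender == MONEY_SUPPLY   # conservation of money
print(f"repaid ${repaid} over {periods} periods with a ${MONEY_SUPPLY} supply")
```

No new money is ever created; the full $110 is repaid only because velocity lets the same dollars change hands repeatedly.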
We can see from these dynamics that the economy has mass (money supply), velocity (rate of transactions), and momentum (amount of money × transaction rate, which economists mislabel as the 'velocity' of money). We also have conservation laws, such as the obvious one that money spent must equal money received per transaction, and that the total money supply remains constant absent the creation of new money. This is exactly the physics of a model universe. From this insight, we can easily take the leap that this universe can be modeled as a gauge theory.
What do people want to be built on Nostr that doesn't exist yet?
Canada bans cash payments over $10,000. This seems like Bitcoin's time to shine.


If everything becomes unlocked by a blockchain secured by cryptographic keys, people and everything we use must evolve to embrace cryptographic keys as digital identity.
Every time the topic of digital identity is raised, there is backlash. The reaction is ill-founded. Resistance should be to digital identities that are issued, controlled, and revocable by centralized authorities (government, corporations), with authorization for services being programmable by authoritarians (i.e., social credit).
We need to embrace decentralized digital identity, which is what Bitcoin uses, where the end user arbitrarily issues their own keys, and then maintains custody and control over their keys, never relinquishing secrecy to anyone else. That is Self-Sovereign Identity.
Today's lesson in Applied Cosmology: dimensions and degrees of freedom
In physics, Minkowski spacetime has 4 dimensions (3 spatial dimensions and 1 time dimension), expressed as X⁴. In curved spacetime, the number of degrees of freedom for X⁴ is the number of parameters needed to specify this model: fourteen (14).
🌐4 coordinates for identifying a point in spacetime (x, y, z, t)
📏3 rulers in the space dimensions to measure distance
⏱️1 clock in the time dimension to measure duration
📐3 protractors (x-y, y-z, z-x) to measure the angle of each space dimension with respect to the other space dimensions
📐3 protractors (x-t, y-t, z-t) to measure the angle of each space dimension with respect to the time dimension
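One way to see why the tally is fourteen: the 3 rulers and 1 clock are the diagonal entries of a symmetric metric on the 4 coordinates, and the 6 protractors are its off-diagonal entries:

```latex
\underbrace{4}_{\text{coordinates } x^\mu}
\;+\;
\underbrace{\tfrac{4(4+1)}{2}}_{\text{symmetric metric } g_{\mu\nu}}
\;=\; 4 + (4 + 6) \;=\; 14
```

This matches the standard count of ten independent components for a symmetric 4×4 metric tensor, with the coordinates themselves supplying the remaining four parameters.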
In the same way, we wish to identify configuration dimensions with respect to separation of concerns. The number of separate concerns is the number of dimensions. Within each dimension, each concern has many parameters.
The number of dimensions is modest (fewer than a dozen?), while the total number of degrees of freedom (parameters) is large (hundreds?).
For example, the horizontal scaling dimension is parameterized for the platform and infrastructure by the number of worker nodes in a cluster. Within an application component (e.g., deployment or statefulset), horizontal scaling is parameterized by the replicaset scale.
The vertical scaling dimension is parameterized for the platform and infrastructure by the compute shape (cpu architecture, cpu, memory, boot volume) of each worker node. Within an application component, vertical scaling is parameterized by the cpu, memory, and storage requests and limits of each container within the pod template.
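As a sketch, these two scaling dimensions map onto a Kubernetes deployment like this (the component name, image tag, and resource values are illustrative assumptions):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-component          # hypothetical component name
spec:
  replicas: 3                      # horizontal scaling: replicaset scale
  selector:
    matchLabels:
      app: example-component
  template:
    metadata:
      labels:
        app: example-component
    spec:
      containers:
        - name: app
          image: example-component:latest   # illustrative image tag
          resources:               # vertical scaling within the pod template
            requests:
              cpu: 500m
              memory: 512Mi
            limits:
              cpu: "1"
              memory: 1Gi
```

The cluster-level parameters (worker node count and compute shape) live outside this manifest, in the platform and infrastructure configuration.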
Other dimensions of interest are:
👉high availability
👉disaster recovery
👉workload complexity
👉workload scale
👉workload isolation
👉security isolation
(and many more)
When BTC becomes the unit of account, medium of exchange, and store of value as the global reserve currency, its valuation should be equivalent to the total global wealth being convertible all at once to BTC.
USD$454.4T / 21M BTC = USD$21,638,095.24 per BTC
Is it coincidental that the square root of the total global wealth roughly equals the number of Bitcoin?
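The arithmetic checks out, as a quick computation shows (the $454.4T global wealth figure is taken from the post above):

```python
# Verify the per-BTC valuation and the square-root "coincidence".
from math import sqrt

global_wealth = 454.4e12   # USD, total global wealth (figure from the post)
btc_supply = 21e6          # maximum number of BTC

price_per_btc = global_wealth / btc_supply
print(f"${price_per_btc:,.2f} per BTC")              # $21,638,095.24 per BTC

# sqrt of the total global wealth lands near the 21M BTC supply cap
print(f"sqrt(wealth) = {sqrt(global_wealth):,.0f}")  # roughly 21.3 million
```

The square root of $454.4T is about 21.3 million, which is indeed within a few percent of the 21 million BTC cap.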
The increase in emissions and atmospheric particulates coming out of Iran recently is concerning.
Twitter is literally on fire.

