This counterpoint against tail emissions demonstrates yet another reason why Peter Todd is an arrogant schmuck. His usual spiel is "The only reason tail emissions aren't happening is because everyone besides me is stupid, and it would be better if they all just stopped pretending they can think, because I know they can't" or something like that (I'm paraphrasing). Zero respect for the reality that is human action. I also worked out, long before I even fully understood Bitcoin, the point he makes about a tail emission being a useful way to secure regular mining fees and about it converging toward offsetting lost coins. But he never goes further than that, nor considers the ethics. Pathetic.
Kevin's Bacon
Anyone worried about mining subsidies becoming unstable enough in the future to cause massively bad chain reorgs on a regular basis is forgetting that people are rational beings with incentives. Companies like Strategy will pay a subsidy of sorts to keep the timechain secure, by shifting their funds around and paying onchain fees on a regular basis whenever there is a threat of instability. It will be pocket change to them. And there will be many competing companies like this, even if Strategy is the largest. Each of them has an incentive to preserve the security and legitimacy of their chief asset. If any one of them starts using that money to censor instead, any number of the other companies and individuals can simply bypass this by paying a fee, or by stacking and mining on behalf of those trying to bypass it, earning bitcoin for being true to the Bitcoin ethos.

Replies (40)

tail emissions would make no difference if they didn't change the total supply. i am sad that so many bitcoiners who oppose the idea have such poor skill at simple mathematics. if set up correctly, both schedules reach a point where the reward rounds to zero sats, and hopefully long before that the precision of the denomination would be expanded. maybe i should draw some pictures sometime to explain why it's utter nonsense that tail emissions change anything. what is more concerning is that introducing one would invite some people to try to change the supply. that is a solid reason to oppose it. everything else is literal ignorance.
literally, the only difference might be that before the final halving down to 1 sat slightly fewer sats would be issued, and that after the stair-step had rounded to zero a tail emission would have kept paying 1 sat for a while longer. if you have any grasp of how quantization works you'd get it.
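a minimal sketch of that rounding point, assuming the current 50 BTC start, the 210,000-block interval and today's 1-sat precision (the loop below is mine, not anything from bitcoin core):
------------
#include <cstdint>
#include <cstdio>

int main() {
    const int64_t COIN = 100000000;        // sats per BTC
    const int64_t BLOCKS = 210000;         // blocks per halving epoch
    int64_t subsidy = 50 * COIN;           // starting reward, in sats
    int64_t total = 0;                     // cumulative issuance, in sats

    for (int halving = 0; subsidy > 0; ++halving) {
        printf("halving %2d: %11lld sats per block\n", halving, (long long)subsidy);
        total += subsidy * BLOCKS;
        subsidy >>= 1;                     // the same right shift GetBlockSubsidy does
    }
    // the per-block reward reaches 1 sat at halving 32 and rounds to 0 at halving 33
    printf("total issuance: %.8f BTC\n", total / (double)COIN);
    return 0;
}
------------
run it and the reward bottoms out at 1 sat around halving 32 and rounds to zero at 33, with total issuance just under 21 million BTC; a smoothed curve pinned to the same cap runs out of sats in the same place, which is the whole quantization point above.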
you understand that the current schedule is a stair-step that drops by half every ~4 years right? there's a technique in signal processing called interpolation, you see it every fucking day on images on the internet when stuff is upscaled. that's all it is. smoothing the stairstep into a curve.
i honestly don't get why they dress it up in this stupid "tail emissions" bullshit. even if you smooth the curve, once the precision runs out it is going to go to zero anyway, but that is like 100 years away. the main argument for tail emissions, which i'm just gonna call "interpolation", is that it smooths out the 4 year cycle of up and down. peter todd is a dick, but stairstep reduction of the emission rate is clunky and cheap. it would not substantially burden nodes to change from the halving algorithm to an exponential decay that follows the same overall path, just smoothly instead of in steps.
the entire denomination scheme of bitcoin is just fixed point integers. it's just a matter of changing the function so it doesn't halve every ~4 years but instead decays the reward smoothly, still as a fixed point number. fixed point simply means the precision before and after the point is fixed: X bits hold the whole number part and Y bits hold the fraction.
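in case "fixed point" sounds exotic, here's a tiny sketch (Q32.32 chosen arbitrarily for illustration): the position of the point is just a convention, and multiplication only needs a wide intermediate plus a shift to put the point back.
------------
#include <cstdint>
#include <cstdio>

// Q32.32 fixed point: top 32 bits are the whole part, bottom 32 bits the fraction
typedef uint64_t q32_32;

q32_32 from_int(uint32_t x) { return (q32_32)x << 32; }
double to_double(q32_32 x)  { return x / 4294967296.0; }     // divide by 2^32

// multiply two Q32.32 values: widen to 128 bits, then shift the point back down
q32_32 mul(q32_32 a, q32_32 b) {
    return (q32_32)(((unsigned __int128)a * b) >> 32);       // GCC/Clang __int128
}

int main() {
    q32_32 reward = from_int(50);            // 50.0
    q32_32 half   = from_int(1) >> 1;        // 0.5
    printf("%f\n", to_double(mul(reward, half)));            // prints 25.000000
    return 0;
}
------------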
i thought that the stairstep was a feature, not a flaw, in that it may be responsible for the 4 year price run-up, which drives news coverage, which drives greed, speculation, crash, death (of the gambler) and resurrection as a bitcoiner. by simply changing the stairstep function to a curve, could we save future humans the 3 years of pain after going all-in at the last all-time high?
so what would that logic look like with a power law? it turns out my dad was actually interested in bitcoin and wanted to know the other day what the logic for the halving was. i just plugged it into AI and got back the following function. the key line is the "bitwise right shift" -- nSubsidy >>= halvings; -- a new CS/math term for me. Perhaps also the source of the Bitwise ETF branding?
------------
int64_t GetBlockSubsidy(int nHeight, const Consensus::Params& consensusParams)
{
    int halvings = nHeight / consensusParams.nSubsidyHalvingInterval;
    // Force the reward to zero once the shift would be undefined (64+ halvings)
    if (halvings >= 64)
        return 0;

    // Initial subsidy is 50 BTC (in satoshis: 50 * 100,000,000)
    int64_t nSubsidy = 50 * COIN;

    // Right-shift the subsidy by the number of halvings (i.e. divide by 2^halvings)
    nSubsidy >>= halvings;
    return nSubsidy;
}
----------------
in a universe that had a power law subsidy shift, maybe bitcoin could have attracted CD-ladder investors instead of gamblers and men and women of loose morals and questionable habits. kinda like jesus going back in time and spending more time among the religious elite Sanhedrin and less among the fishermen and gadabouts.
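for what it's worth, the "power law" being asked about here is really a smooth exponential (geometric) decay, and a hedged, integer-only sketch of it could look like the following. everything in it is illustrative: the decay constant is only an approximation of 2^(-1/210000), the Q-format is an arbitrary choice, and none of the names come from bitcoin core. it walks forward block by block, which also hints at why a closed-form version would drag in exactly the kind of transcendental fixed-point math discussed below.
------------
#include <cstdint>
#include <cstdio>

static const int64_t COIN     = 100000000;   // sats per BTC
static const int     INTERVAL = 210000;      // blocks per halving period

// per-block decay factor, roughly 2^(-1/210000), stored in Q0.64 fixed point.
// approximate on purpose: a real consensus rule would have to pin this exactly.
static const uint64_t DECAY_Q64 = 18446683187000000000ULL;

int main() {
    // carry the subsidy in Q33.31 fixed point (sats << 31) so it fits in a uint64_t
    uint64_t sub_q31 = (uint64_t)(50 * COIN) << 31;

    for (int height = 0; height <= 4 * INTERVAL; ++height) {
        if (height % INTERVAL == 0)
            printf("height %7d: smooth %10llu sats   stair-step %10lld sats\n",
                   height,
                   (unsigned long long)(sub_q31 >> 31),
                   (long long)((50 * COIN) >> (height / INTERVAL)));
        // one block's worth of decay: 128-bit intermediate keeps the precision
        sub_q31 = (uint64_t)(((unsigned __int128)sub_q31 * DECAY_Q64) >> 64);
    }
    return 0;
}
------------
at each 210,000-block boundary the smooth value tracks the stair-step closely; the small drift is entirely down to the approximate constant, which is exactly the kind of detail a consensus change would have to nail down.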
the thing that i think means the halving series will stick indefinitely is its simplicity. you can see in that consensus function that the reward rule is extremely simple, even a non-programmer has a decent chance of understanding it, and that also means that when new clients are built for the protocol, consensus errors are unlikely. integer math that approximates transcendental functions is easy to get wrong, compared to a simple bitshift.

on a hardware level, a right shift is just division by a power of two (x >> 1 is x/2), and dividers exploit exactly that: when the divisor is a power of two (which can be identified by the fact that only one bit is set) they take the shortcut instead of grinding through the full algorithm. general division is the most time-expensive arithmetic operation a processor does, because there are very few shortcuts to what is basically an iterative process. i once created a proof of work that was optimized for CPUs by using division on very large numbers (kilobyte-sized numbers). there would be no point building an ASIC for such a proof, because the circuit that already has this operation most optimized is a CPU's arithmetic divider, which takes up a surprisingly large chunk of the execution hardware on a typical core - that's how expensive the operation is.

hash functions, by contrast, mostly work by mixing additions, rotations and XORs (and in some designs multiplications) inside a fixed word size, so carry and overflow bits get thrown away. those lost bits create the "trap door" that makes them easy to compute but practically impossible to reverse. asymmetric cryptography (signatures) relies on a similar one-way property, and supposedly this is the big deal about quantum computers - that shor's algorithm can recover the secret key from a public key in far fewer steps than brute force, by exploiting the structure of the discrete log problem.

which brings me back to the power-law/exponential decay pattern for smoothing the block reward. it is a more complex and expensive computation, involving multiplications, divisions and rounding decisions. complex algorithms are more likely to hide bugs that don't manifest except in rare cases - specific numbers or factors that cause a problem and then lead to chain splits. even with fixed point numbers the operation could be badly coded and mostly work, until it doesn't. so, yeah, i agree. the halving cycle is probably better, because it achieves a form of exponential decay (in powers of 2) with an algorithm that is practically impossible to get wrong.
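just to make the x/2 point concrete, a tiny check that the shift in GetBlockSubsidy really is exact division by a power of two for these values, which is also part of why compilers and dividers can shortcut it:
------------
#include <cassert>
#include <cstdint>

int main() {
    const int64_t COIN  = 100000000;
    const int64_t start = 50 * COIN;
    // for non-negative integers, a right shift by k is exactly division by 2^k,
    // so the halving rule never needs a divider at all
    for (int k = 0; k < 34; ++k)
        assert((start >> k) == start / (1LL << k));
    return 0;
}
------------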
the hash function trapdoor concept is bringing the thing into better focus in my mind. actually, i've been taking your notes and querying grok about the different paragraphs. the question i couldn't get past is: what sets the random number in the first place, and could it be distributed to a favored miner? "The trapdoor property ensures that hash functions are secure for proof-of-work systems. In Bitcoin, miners must find a nonce (a random number) that, when hashed with the block’s data, produces a hash below a target value. Because of the trapdoor, there’s no shortcut to finding this nonce—you have to try many inputs (brute force), which is what makes mining computationally intensive."
yeah, what makes a trapdoor function has a lot to do with the arithmetic system you use. hash functions use clock (modular) arithmetic over fixed-size words, meaning results wrap around and the overflowed information is simply lost. if you let the number of bits grow without limit instead, the operations could be reversed just fine. the bits a hash function throws away cannot be recovered without trying many (squillions of) candidates to see if you can figure out what got hashed. and they are called a "hash" because that means a mess in english. i'm not sure if that is where the term was coined, but i suspect yes. a hash is specifically the kind of mess that might be fixable, but probably you will just throw it out and start again.
oh, the random number? it's not possible to pick one favorably without doing the work of mining. hashes below the bitcoin difficulty target are very rare. the hash functions used, like sha256 or blake2/3, are called cryptographic hashes because there is essentially no usable correlation between inputs and outputs, and finding two inputs with the same output is infeasible ("collision resistance"). by contrast, the hash functions used in machine learning and LLM systems are the opposite: they are deliberately collision prone, and are generally called locality-sensitive or proximity hash functions because they let you evaluate a distance between related items. the concept of distance between codes was invented/discovered by Hamming, and is the basis of error correction systems.

hash functions, collisions, preimages and hamming distances are all related to how bitcoin's security works. they are involved in how asymmetric cryptography (signatures) functions as well as hash functions, and they are used in PoW, with a difficulty adjustment, to prevent forks in the record and keep it linear. so yeah, you can't shortcut finding a block that satisfies a given target. maybe quantum computers could lower the time required to find these solutions, but their energy cost to operate would likely exceed the cost of simply playing the game fairly, the way it was designed to be played.
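to put a rough number on that avalanche/collision-resistance idea, here's a quick sketch using OpenSSL's one-shot SHA256() (link with -lcrypto); the message bytes are arbitrary. flip a single input bit and a cryptographic hash should move roughly half of its 256 output bits, i.e. a hamming distance around 128, whereas a locality-sensitive hash would deliberately keep nearby inputs nearby.
------------
#include <openssl/sha.h>
#include <cstdio>

// hamming distance between two 32-byte digests: count the bits that differ
static int hamming256(const unsigned char* a, const unsigned char* b) {
    int d = 0;
    for (int i = 0; i < SHA256_DIGEST_LENGTH; ++i)
        d += __builtin_popcount((unsigned)(a[i] ^ b[i]));    // GCC/Clang builtin
    return d;
}

int main() {
    unsigned char msg1[] = "some arbitrary message bytes";
    unsigned char msg2[] = "some arbitrary message bytes";
    msg2[0] ^= 1;                                            // flip one input bit

    unsigned char h1[SHA256_DIGEST_LENGTH], h2[SHA256_DIGEST_LENGTH];
    SHA256(msg1, sizeof(msg1) - 1, h1);                      // one-shot SHA-256
    SHA256(msg2, sizeof(msg2) - 1, h2);

    printf("input distance:  1 bit\n");
    printf("output distance: %d of 256 bits\n", hamming256(h1, h2));
    return 0;
}
------------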
lol the consistency of gravity and the known spherical geometry of its effects (like all ballistics, everything is about spheres) completely discounts the idea of the earth being flat in the strict planar mathematical sense. the earth only looks flat if you restrict the precision and scale of your measurements, the same way a short enough arc of a circle is indistinguishable from a straight line even though the circle is finite in length and the line is not. compared to that, what is the flat earth supposed to even be, a disc? a square? a rectangle? a regular or irregular polygon? only someone with no grasp of the mathematics could swallow the obvious fallacy of flat earth, or someone gullible enough to fall for a psyop designed to ruin their ability to think.
Jesus spending time with the sinners and weirdos and plebs was a feature, not a bug. It's the same with Bitcoin. What is the purpose of giving it perceived legitimacy in the eyes of fiat credentialists?
i'm fine with the clunky reward schedule, because it is reliable and simple. right now, after renewed consideration, i'm not sure a precise exponential decay regime would even deliver any real benefit. the stair-step has an ironic upside in that it creates a resonant cycle of hype and hope, and i think over the 4+ year window that actually has a psychological benefit. not saying it isn't inelegant, but very often elegance is not better than brute simplicity.
Right. The only way to get it wrong is to not increase the precision in a hundred years, and that itself wouldn't be a tragedy unless the purchasing power of a single sat at that time is as much as the purchasing power of several bitcoin today.
But the precision will almost certainly be increased, to allow relatively small virtual UTXOs to be realized into regular UTXOs with reasonable fees and amounts (for self-custody exits from federations, etc.), and to allow that little bit of extra profit for miners.
yeah, increasing precision is starting to loom on the horizon, but i expect it won't become pressing until next cycle at minimum. when you get to 10 sats per dollar there start to be issues with pricing common small items. i'd also say millisats mostly push that another 8 years or so further into the future. it's even possible that the lightning protocol defers the need to increase precision by decades, since deferring settlement is the whole point of it.
ok, so it's not like the game "I'm thinking of a random number between 1 and 10" where the other person has to guess. in this case, the miner just needs to find some nonce that produces a hash below the target, i.e. one with a lot of leading "0"s. then i wondered what happens when 2 miners find a valid nonce at the exact same time. research says the chain forks temporarily, and nodes resolve to the chain with the most accumulated work (usually described as the longest chain). i assume this is in the bitcoin core code that the nodes run.
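a toy version of that search, again leaning on OpenSSL's SHA256 (double-hashed, the way bitcoin hashes headers). the "header" here is just 80 arbitrary bytes with the nonce written into the last four, and the "target" is simply requiring the first two digest bytes to be zero; the real rule interprets the hash as a 256-bit number and compares it against the target decoded from nBits. the point is the shape of it: there is no shortcut, you just keep trying nonces.
------------
// toy nonce grinder - compile with: g++ grind.cpp -lcrypto
#include <openssl/sha.h>
#include <cstdint>
#include <cstdio>
#include <cstring>

// double SHA-256, the hash bitcoin applies to the 80-byte block header
static void hash256(const unsigned char* data, size_t len,
                    unsigned char out[SHA256_DIGEST_LENGTH]) {
    unsigned char first[SHA256_DIGEST_LENGTH];
    SHA256(data, len, first);
    SHA256(first, sizeof(first), out);
}

int main() {
    // stand-in for a block header: 80 arbitrary bytes, nonce in the last 4
    unsigned char header[80] = {0};
    const char label[] = "prev hash + merkle root + time + bits";
    memcpy(header, label, sizeof(label));

    // toy target: first two digest bytes must be zero (~1 hash in 65,536 qualifies)
    const int zero_bytes = 2;

    for (uint32_t nonce = 0; nonce < 100000000u; ++nonce) {
        memcpy(header + 76, &nonce, sizeof(nonce));          // update the nonce field
        unsigned char h[SHA256_DIGEST_LENGTH];
        hash256(header, sizeof(header), h);

        bool below_target = true;
        for (int i = 0; i < zero_bytes; ++i)
            if (h[i] != 0) { below_target = false; break; }

        if (below_target) {                                  // "found a block"
            printf("nonce %u works, after %u candidates\n", nonce, nonce + 1);
            return 0;
        }
    }
    printf("no nonce in range (a real miner would change the extranonce and retry)\n");
    return 0;
}
------------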