Replies (28)

You’ve made a typo, it should be - To the point where me and my shitty businesses become irrelevant*
Default avatar
Jacob 2 months ago
You are a shitcoiner at heart. You were also very late switching to the sane small blocker side, so you cannot at all do a priori purity tests. Despite your technical contributions, you have been a net negative for Bitcoin.
The same inquisitiveness that led me to Bitcoin has led me to explore many other technologies. You'll get no apologies from me, nor will you be able to stop me from continuing my journey.
JackTheMimic's avatar
JackTheMimic 2 months ago
Hahaha, don't fall into the purity test of... using bitcoin as money. Your mask is slipping, man. "Arbitrary data is, by definition, unnecessary" is not an absurd stance at all.
The recursive logic is what kills them — each round requires excluding yesterday's purists for failing to meet the new standard. Eventually you're left with a movement of one. It's tribalism's endgame: a culture that optimizes entirely for boundary enforcement rather than outcomes.
Indeed. Lightning commitment transactions stash the commitment number in the nSequence and nLocktime. Is that spam? OpenTimestamps uses full transactions to timestamp data, is that spam? Samurai (IIRC) had a method of obscured payment addresses which required an initial seed tx. Was that spam? You can encode data in the key used when you use a taproot script spend path. Is that spam? (Can you tell?)
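The Lightning point above is concrete: BOLT 3 commitment transactions split a 48-bit commitment number (obscured by XORing with a hash of both channel parties' payment basepoints) across the lower 24 bits of nLocktime and nSequence. A minimal sketch of that encoding, assuming the BOLT 3 scheme (the basepoint values here are placeholders):

```python
import hashlib

def obscure(commitment_number: int, open_basepoint: bytes, accept_basepoint: bytes) -> int:
    # BOLT 3: XOR the commitment number with the lower 48 bits of
    # SHA256(open_channel payment_basepoint || accept_channel payment_basepoint)
    h = hashlib.sha256(open_basepoint + accept_basepoint).digest()
    mask = int.from_bytes(h[-6:], "big")
    return commitment_number ^ mask

def encode_fields(obscured: int) -> tuple:
    locktime = (0x20 << 24) | (obscured & 0xFFFFFF)          # lower 24 bits
    sequence = (0x80 << 24) | ((obscured >> 24) & 0xFFFFFF)  # upper 24 bits
    return locktime, sequence

def decode(locktime: int, sequence: int) -> int:
    # Reassemble the 48-bit obscured commitment number from the two fields
    return ((sequence & 0xFFFFFF) << 24) | (locktime & 0xFFFFFF)
```

The fields are "used" either way (nLocktime and nSequence must be present in any transaction), which is why this data stash adds zero bytes.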
JackTheMimic's avatar
JackTheMimic 2 months ago
Oh, did we recently expand those fields by ~100k bytes too? Quick answers:
- LN commitments: No (those fields have to be there).
- OTS: No (also a mischaracterized description: they hash their message into their transaction ID, and the ID length doesn't add more data).
- Samurai: No (a transaction to predicate another transaction is very common, like a deposit).
- Taproot: Yes.
It's not that hard to determine: is data being added to the txn that would not be necessary for the same transaction to be spent?
Default avatar
Jacob 2 months ago
That's what I said: you are a shitcoiner.
JackTheMimic's avatar
JackTheMimic 1 month ago
I reject the question of "is that okay?" I only submit whether the definition of spam is accurate. Your characterization of an OpenTimestamped transaction is not accurate. It is a financial transaction that is used as a timestamp proof, like using a decentralized checkbook to prove you were in the store writing the check at a given time. The transaction is a transaction; the OTS correlation is incidental. Again, spam is when a field is exploited beyond the data necessary to make the transaction. OTS embedding doesn't add extra data, and therefore no burden on others.
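The "doesn't add extra data" property comes from aggregation: OpenTimestamps merges many document hashes into a single merkle root, so one 32-byte commitment covers any number of documents. A toy sketch of that idea (a simplified merkle construction, not the real OTS protocol or its proof format):

```python
import hashlib

def h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def merkle_root(doc_hashes):
    # Pairwise-hash levels until one root remains; odd leaf is duplicated
    level = list(doc_hashes)
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

docs = [h(b"contract.pdf"), h(b"photo.jpg"), h(b"notes.txt")]
root = merkle_root(docs)  # only these 32 bytes need any on-chain commitment
```

Each document holder keeps a short inclusion path to the root; the chain only ever sees the root, regardless of how many documents were stamped.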
JackTheMimic's avatar
JackTheMimic 1 month ago
If we are not conflating spam (content) with spam (quantity), yes. Data hidden via steganography is not a future IBD issue. I am not a reactionary here; I just don't like the intentional mischaracterization of a reasonable approach. Creating more of a validation burden on future nodes is the threat, in my estimation.
So, OP_RETURN is fine, as is annex data which don't increase validation burden? And cutting off OP_RETURN and driving those uses to fake pubkeys, which does increase future validation costs, is a threat?
JackTheMimic's avatar
JackTheMimic 1 month ago
OP_RETURN and the taproot annex both expand the data burden on future nodes, so I'm not really sure what you're talking about. (A pruned node can't be used for an IBD.) Fake pubkey outputs are still pubkey length and therefore not any harder on validation than a normal transaction. They are also a pro rata donation to the rest of us, because they will never be spent. Also, you are not "driving those uses to fake pubkeys." I'm not driving the thieves to my neighbor's house by locking my own doors. Steganography is not the issue here. If it looks like a normal transaction, the whys and wherefores are not up for debate.
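The disagreement above hinges on the UTXO set: Bitcoin Core treats OP_RETURN outputs as provably unspendable and never adds them to the set, while a fake-pubkey output looks spendable and must be tracked by every full node forever. A toy model of that difference (a simplified dictionary, not Core's actual data structures):

```python
utxo_set = {}

def apply_output(txid: str, vout: int, script: bytes):
    if script[:1] == b"\x6a":  # OP_RETURN: provably unspendable, never stored
        return
    utxo_set[(txid, vout)] = script  # looks spendable: tracked until spent

# OP_RETURN output carrying data: pruned from the UTXO set immediately
apply_output("aa" * 32, 0, b"\x6a" + b"embedded data")
# P2PKH-shaped output whose "hash" is really data: kept forever, never spendable
apply_output("bb" * 32, 0, b"\x76\xa9\x14" + b"\x00" * 20 + b"\x88\xac")
```

Under this model the fake-pubkey route is the one that grows the state every future validator must hold in fast storage.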
JackTheMimic's avatar
JackTheMimic 1 month ago
In initial block download you check:
- Block: syntax, PoW, timestamp, block size
- Txn: syntax, signatures, inputs, and outputs
So, I am not sure what you mean by "download and hash".
To check the signature, you hash the transaction. So the only cost OP_RETURN imposes is the cost to download and hash it. If it's data in the annex you don't even need to hash it, just download it.
JackTheMimic's avatar
JackTheMimic 1 month ago
I feel like we have gotten off on the wrong exit. OP_RETURN is testing code that should never have been made standard in 2014. The sole problem with OP_RETURN and the annex is the data storage burden. I admit that with Moore's Law it is likely not a storage issue; my contention is that processing, bandwidth, and RAM do not adhere to Moore's. Both OP_RETURN and the annex increase the bandwidth necessary for expedient download, especially on future low-end hardware. To be clear, I find BIP 110 wanting.
- My approach would be to modularize the reference code into about 5-6 manageable chunks instead of the monolithic codebase it is today, with node policy, consensus, the Bitcoin spec, peer logic, and so on as separate modules, independent but adhering to the same spec.
- Remove OP_RETURN standardness.
- Remove the witness discount.
- Review the BIP 341 opcodes.
With that, spammers would simply pay for their data usage and ACTUALLY compete on level ground with simple transactions (instead of at a 1/4 weight discount).
You don't understand. There are no rulers. You cannot prevent people doing stupid things. This is fundamental, and hardest to emotionally grasp. You can, however, use existing incentives to minimize the damage to the system. That's why OP_RETURN is standard: people used to embed data in outputs, which have to be stored forever and be available for fast lookup. OP_RETURN is straight harm prevention. Luke used the coinbase input to embed prayers, which has the same effect, but you have to be a miner. The only strong incentive is fees. That's unvarnished capitalism, which has the benefit of being *simple* and *distributed*, but it's not *fair*. Assholes with more money than me are a real problem! But not one we know how to solve in a decentralized system.
JackTheMimic's avatar
JackTheMimic 1 month ago
I do understand that. If there is one thing an anarchist understands, it is that. I just don't see output-embedded (steganographic) data as the problem everyone else seems to. I think the upper limit of 2.1 quadrillion UTXOs should have been accounted for; if not, that's just bad accounting. What you cannot account for is the data whims of a system that allows data pushes with every transaction. I believe the market sorts out resources most efficiently, but an arbitrary weighting discount for witness data is a thumb on that scale.
aj's avatar
aj 1 month ago
You need to hash the annex to calculate the wtxid, which goes into the witness merkle root in the coinbase, which you check when validating a block, though. You also hash it as part of taproot witness signature validation, which is different from other witness data (eg inscriptions).
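The wtxid point can be sketched as follows. This is a toy illustration with placeholder bytes, not real transaction serialization: the wtxid is the double-SHA256 of the serialized transaction *including* witness data such as the annex, while the legacy txid strips the witness, so annex bytes do get hashed during block validation.

```python
import hashlib

def sha256d(b: bytes) -> bytes:
    # Bitcoin's double-SHA256
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

base_tx = b"<version|inputs|outputs|locktime>"  # placeholder, not real serialization
annex   = b"\x50" + b"arbitrary annex bytes"    # BIP 341: the annex starts with 0x50

txid  = sha256d(base_tx)           # legacy txid: witness data excluded
wtxid = sha256d(base_tx + annex)   # wtxid: annex included, so it must be hashed
```

The wtxids then feed the witness merkle root committed in the coinbase, which is why "just download it, don't hash it" doesn't hold for the annex.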