I could also point to the Nyquist sampling theorem; the same thing applies. If you are not sampling at more than twice the highest frequency in a signal, you lose information, or worse, you get aliasing: multiple, contradictory readings that you can't distil into a single measurement. It's also long been my opinion that quantum theory in general is an attempt to find that elusive atomicity of the phenomenon of physical reality.

Concurrency is much more important for understanding the *process* by which a system moves from one state to another. You have to understand the process in order to control it, otherwise you are just throwing dice and hoping you don't get snake eyes.

UTXOs are the *reason* for running bitcoin, because of their scarcity, but bitcoin itself is a series of blocks. If you read up on how a bitcoin node handles all of the data that comes from its peers, the block is the centre of the system: the block height is the scalar of the state, and the block hash (really, the hash of the header, which includes the merkle root of the transactions) is the digest of the system state at that height.

I also didn't mention that the history of the chain is not immediately fixed when a new block is minted. It can happen that two miners concurrently emit a valid solution at the same height, so there has to be a tie-breaking rule: nodes generally build on the first valid block they see, and the fork is resolved when the next block extends one side, since the chain with the most accumulated work wins. Until that next block is found, the head of the chain is only hypothetically the best block, and it is only once about 6 blocks have stacked up on top of a block that the probability of it being forked off the main chain becomes so low that you can treat it as immutable.
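To make the aliasing point above concrete, here is a minimal Python sketch (mine, not from the original comment): two sine waves of different frequencies produce identical samples when the sampling rate is below the Nyquist rate for the faster one, so the samples alone cannot tell you which signal was actually measured.

```python
import math

fs = 8.0                    # sampling rate in Hz
f_low, f_high = 1.0, 9.0    # 9 Hz aliases onto 1 Hz because 9 = 1 + fs

# Sample one period's worth of points from each signal at the same instants.
samples_low  = [math.sin(2 * math.pi * f_low  * n / fs) for n in range(8)]
samples_high = [math.sin(2 * math.pi * f_high * n / fs) for n in range(8)]

# The two sample sequences are numerically indistinguishable: the 9 Hz signal
# masquerades as a 1 Hz signal, which is exactly the "contradictory readings"
# problem when you sample below twice the highest frequency present.
print(all(abs(a - b) < 1e-9 for a, b in zip(samples_low, samples_high)))  # True
```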
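And to ground the block-as-digest point, a sketch of how the block hash is computed: double SHA-256 over the 80-byte header, which commits to the previous block's hash and the merkle root of the transactions. The field values used here are the well-known public genesis block values, so the output should be the familiar genesis block hash; the function name is just mine for illustration.

```python
import hashlib
import struct

def block_hash(version, prev_hash_hex, merkle_root_hex, timestamp, bits, nonce):
    # Serialize the 80-byte header the way bitcoin does: integers little-endian,
    # the two 32-byte hashes in internal (byte-reversed) order.
    header = (
        struct.pack("<I", version)
        + bytes.fromhex(prev_hash_hex)[::-1]
        + bytes.fromhex(merkle_root_hex)[::-1]
        + struct.pack("<I", timestamp)
        + struct.pack("<I", bits)
        + struct.pack("<I", nonce)
    )
    # The block hash is double SHA-256 of the header; it is displayed
    # byte-reversed, which is why it appears with leading zeros.
    digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
    return digest[::-1].hex()

# Genesis block header fields (public, well-known values).
print(block_hash(
    version=1,
    prev_hash_hex="00" * 32,
    merkle_root_hex="4a5e1e4baab89f3a32518a88c31bc87f618f76673e2cc77ab2127b7afdeda33b",
    timestamp=1231006505,
    bits=0x1d00ffff,
    nonce=2083236893,
))
# -> 000000000019d6689c085ae165831e934ff763ae46a2a6c172b3f1b60a8ce26f
```

Because the merkle root is inside the header, this one 32-byte digest commits to every transaction in the block and, through the previous-hash field, to the entire history behind it.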

Replies (1)

Agreed, longest chain of work. I understand orphaned blocks; I was just trying to speak in simple terms here. Does bitcoin disprove the Feynman path integral, given that only one state is actually committed?