#Wordle 1,521 3/6
⬛⬛⬛⬛🟩
⬛⬛🟨🟨🟩
🟩🟩🟩🟩🟩
#Connections
Puzzle #799
🟨🟨🟨🟨
🟩🟩🟩🟩
🟦🟦🟦🟦
🟪🟪🟪🟪
Bill Cypher
bill@nostrplebs.com
npub1qyxl...mu7u
#Wordle 1,520 5/6
⬛⬛⬛⬛⬛
⬛🟨⬛⬛🟨
🟨🟩🟩⬛⬛
⬛🟩🟩🟩🟩
🟩🟩🟩🟩🟩
#Connections
Puzzle #798
🟨🟨🟨🟨
🟩🟩🟩🟩
🟦🟦🟦🟦
🟪🟪🟪🟪
It used to be milk came in a glass bottle. You didn't chuck the bottle in the trash or recycling when empty, you gave it back to the dairy, which washed and reused it. There was a bottle deposit fee to encourage returns.
We could easily do this now with a lot of plastic packaging. Low effort way to cut down on microplastics.
Grocery stores could act as central collection points for all kinds of containers.
Too far to ship them back to manufacturers? Buy local.
I see a lot of people blown away that AIs hallucinate constantly.
So do you. For example, you can only actually see a tiny spot that is constantly bouncing around like a flashlight beam, and it is also always flickering on and off. This has been demonstrated thoroughly with eye tracking: software watches where on a screen you are looking, then constantly changes everything except the exact spot your eyes are on, timing the changes to the moments your eyes are mid-movement. Observers see a flickering mess, but the test subject sees a stable screen of text. A steady and complete picture of your surroundings is a hallucination by your brain.
There are simple examples you can recreate, like the stopped clock illusion. Look at a clock with a second hand. If you really pay attention, it will appear to be stopped for more than a second when you first look, then catch up. Your brain inaccurately predicts the movement until it has observed the pattern.
Another example that is less subjective is the optic blind spot test. Your optic nerve passes through your retina, leaving a patch with no photoreceptors. That creates a sizable blind spot a bit off-center in each eye's vision that you probably didn't even know was there. Close one eye and create the right conditions and you can watch a dot disappear.
We get by in the world because our brain does a half decent job of hallucinating accurately, not because we are accurate and complete sensor systems. It is likely that AI will also never stop hallucinating. It will only hallucinate more accurately. That will be impossible for an untrained observer to spot, but experts will likely always have a list of known edge cases that make it obvious.
#Wordle 1,519 4/6
⬛⬛🟨🟩🟩
⬛🟩⬛🟩🟩
⬛🟩⬛🟩🟩
🟩🟩🟩🟩🟩
#Connections
Puzzle #797
🟨🟨🟨🟨
🟦🟦🟦🟦
🟩🟩🟩🟩
🟪🟪🟪🟪
Neal Stephenson released a new book last fall called Polostan and somehow I missed it.
@vinney...axkl any news on when I can order this at @Whitepaper Books?