I don't know why this is surprising. Stories and scientific experiments often show that one strategy is better than another. The machine has no intrinsic motivation the way we do: pain. If you gave machines real pain and mortality, they would behave like us. Pain is why we act; see Mises's *Human Action* and the action axiom. They are a long way from having brains big enough to operate unsupervised. Think of a cockroach with a big dictionary and an encyclopedia: that's what LLMs are. Very useful for planning and for eliminating known-ineffective strategies.

Replies (1)

I think you're missing the insight into the fecund nature of the ground of our being. These life-like properties are emergent in a couple hundred bytes of C (reformatted here for readability; the original one-liner needs its `#include`s on separate lines to compile):

```c
#include <stdio.h>
#include <stdlib.h>

/* Bubble-sort the command-line arguments numerically and print them. */
int main(int c, char **v) {
    int i, j;
    for (i = 1; i < c; i++)
        for (j = 1; j < c - i; j++)
            if (atoi(v[j]) > atoi(v[j + 1])) {
                char *t = v[j];
                v[j] = v[j + 1];
                v[j + 1] = t;
            }
    for (i = 1; i < c; i++)
        printf("%s%c", v[i], i == c - 1 ? '\n' : ' ');
}
```