they would if they knew how to prompt it ... and catch where it hallucinates ...
They don't even have to use an LLM for that? Or do you mean human hallucinations?
Well, technically both use similar data structures (neural nets). Humans have real fuzzy reasoning ... LLMs are a pure fuzzy roll of dice ... so both have the same inefficiencies, but humans run on about 20 watts of power.
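To be concrete about the "roll of dice" part: an LLM turns its logits into a probability distribution and literally samples the next token from it. Here's a minimal sketch in Python (toy logits, not any real model's API; the temperature parameter is just the standard softmax scaling):

```python
import math
import random

def sample_next_token(logits, temperature=1.0):
    """Pick a token index by 'rolling dice' over the softmax distribution."""
    # Scale logits by temperature: lower T -> sharper, less random output.
    scaled = [l / temperature for l in logits]
    # Numerically stable softmax.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # The actual dice roll: a weighted random choice over the vocabulary.
    return random.choices(range(len(probs)), weights=probs, k=1)[0]

# Toy 4-token vocabulary with made-up logits, purely for illustration.
logits = [2.0, 1.0, 0.5, -1.0]
print(sample_next_token(logits, temperature=0.8))
```

Run it a few times and you get different tokens from the same input ... that's the whole "fuzzy" part.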