@JonhernandezIA
📁 Yann LeCun explains that LLMs work well when problems are symbolic, like math, code, or chess, where searching through known sequences of tokens is enough. But the real world does not work that way. Physical action, planning, and understanding what is possible require continuous intuition, which LLMs lack. Manipulating symbols is not the same as understanding reality.