@JonhernandezIA
Yann LeCun says language models do extract meaning, but only at a superficial level. Unlike humans, these models are not grounded in physical reality or common sense. They answer many questions well, yet break down when faced with novel situations, because they do not truly understand the world they describe.