Do stochastic parrots understand what they recite?

posted in: reading
Never before in the history of Artificial Intelligence (AI) has the field been this democratized and celebrated. With the advent of deep learning, big data, and high-performance computing, things that seemed impossible just a decade ago can now be achieved with a few lines of code. The greatest leap has come in Natural Language Processing (NLP). This surge began with the Transformer architecture, proposed in the paper “Attention Is All You Need” by Vaswani et al. This article discusses some of the strengths and limitations of transformer-based language models.

The innovation and excitement LLMs have brought to the AI world are, beyond doubt, exceptional. I do not intend to undermine their contribution, but we NLP researchers need to be careful not to miss the forest for the trees. Language is much more than mere data.

Article by Bender and Koller discussing the octopus test: “Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data”
Ryan Watkins