There has never been a time in the history of Artificial Intelligence (AI) when the field was this democratized and celebrated. With the advent of deep learning, big data, and high-performance computing, things that seemed impossible just a decade ago can now be achieved with a few lines of code. The area where the leap has been greatest is Natural Language Processing (NLP). This surge in NLP began with the arrival of Transformers, proposed in the paper “Attention Is All You Need” by Vaswani et al. This article discusses some of the strengths and limitations of transformer-based language models.
The innovation and excitement LLMs have brought to the AI world are, beyond doubt, exceptional. I do not intend to undermine their contribution, but we NLP researchers need to be careful not to miss the forest for the trees. Language is much more than mere data.
For the octopus test discussion, see the article by Bender and Koller: “Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data.”