Looking into the mathematics and the data reveals that transformers are both overused and underused.

Transformers are best known for their applications in natural language processing. They were originally designed for translating between languages,[1] and are now most famous for their use in large language models like ChatGPT (generative pretrained transformer).
But since their introduction, transformers have been applied to ever more tasks, with great results. These include image recognition,[2] reinforcement learning,[3] and even weather prediction.[4]
Even the seemingly specific task of language generation with transformers holds numerous surprises, as we’ve already seen. Large language models have emergent properties that feel more intelligent than mere next-word prediction. For example, they may know many facts about the world, or capture nuances of a person’s style of speech.
The success of transformers has led some people to ask whether transformers can do everything. If transformers generalize to so many tasks, is there any reason not to use a transformer?
Clearly, there is still a case for other machine learning models and, as is often forgotten these days, non-machine-learning models and the human brain. But transformers do have a number of unique properties, and have shown incredible results so far. There is also a considerable mathematical and empirical basis…