That is the number of parameters in WuDao 2.0, the Natural Language Processing (NLP) AI model developed by the Beijing Academy of Artificial Intelligence. It even exceeds Google's Switch Transformer, unveiled in January, which has 1.6 trillion parameters. In general, more parameters allow a machine learning model to capture more sophisticated patterns.
The Snippets Journal
By Shreesha
