That is the number of parameters in WuDao 2.0, a natural language processing (NLP) AI model developed by the Beijing Academy of Artificial Intelligence. It surpasses even Google's Switch Transformer, unveiled in January, which has 1.6 trillion parameters. More parameters generally correspond to a more sophisticated machine learning model.
Shreesha writes about Business, Finance and Tech for The Snippets Journal. He is also the Founder and Head of Content Development.