At Google I/O 2018, Google launched TPU 3.0, an AI chip with computing performance of up to 100 PFlops. It was designed from the outset for Google Cloud, making it the first Cloud TPU Google brought to market. Following Google, Amazon released Inferentia, a machine learning chip that powers AWS products, in late 2018. Microsoft and Facebook also stepped up recruitment of cloud AI chip talent in 2018.
It is not difficult to see that the American internet giants are at different stages of progress on cloud AI chips: Google continues to lead, followed by Amazon, while Microsoft and Facebook are racing to catch up.

03 BAT's independent research and development continues to break through

The Chinese internet giants have performed much like their American counterparts. Baidu, also a search engine company, was the first to develop a cloud AI chip. In July 2018, Baidu released China's first cloud-based full-featured AI chip, "Kunlun".
One of the chip's most important missions is to work in synergy with Baidu Brain and drive explosive growth in its computing power, a role similar to that of Google's TPU 3.0. Later, at the Yunqi Conference in 2019, Alibaba released its first cloud AI chip, "Hanguang 800", which it claimed was the most powerful AI chip in the world at the time. What sets this chip apart is its narrower focus: it is aimed mainly at cloud vision processing scenarios, making it a specialized AI chip deeply customized for cloud workloads.