| Category | Rating |
| --- | --- |
| Personal Brand Presence | 6 / 10 |
| Authoritativeness | 8 / 10 |
| Expertise | 9 / 10 |
| Influence | 6 / 10 |
| Overall Rating | 7 / 10 |
Kai-Fu Lee is a Taiwan-born entrepreneur, computer scientist, investor, and writer, currently based in Beijing, China. For his doctoral dissertation at Carnegie Mellon University, Lee developed a speaker-independent continuous speech recognition system. He went on to hold senior positions at Apple, SGI, Microsoft, and Google. When he was promoted to corporate vice president of interactive services at Microsoft in 2000, he signed a one-year non-compete agreement, which became the subject of a 2005 legal dispute between Google and Microsoft, his former employer.
His career has centered on the Chinese internet industry. From 1998 to 2000, he was the founding director of Microsoft Research Asia, and from July 2005 until September 4, 2009, he served as president of Google China. After leaving that role, he founded the venture capital firm Sinovation Ventures. In addition to writing “10 Letters to Chinese College Students,” he launched the website Wǒxuéwǎng (Chinese: 我学网; lit. “I-Learn Web”) to help young Chinese people in their academic and professional development. He is also a prominent microblogger, with more than 50 million followers on Sina Weibo.
Lee established 01.AI at the end of March 2023 with the goal of building a large language model in-house for the Chinese market. He is competing with other well-known Chinese tech figures, such as Wang Xiaochuan, the founder of Sogou, who has been rapidly amassing talent and venture funding to build China’s answer to OpenAI.
The pace of 01.AI’s progress reflects the tremendous advances being made in generative AI: seven months after it was founded, the firm delivered its first model, the open-source Yi-34B. Lee explained that the choice to launch an open LLM as the company’s first product was an attempt to “give back” to the community. “We’ve provided a compelling alternative,” he added, for those who have regarded LLaMA as a “godsend” for them.
As of this writing, Yi-34B is the top pre-trained LLM according to a ranking by Hugging Face. It is a bilingual (English and Chinese) base model with 34 billion parameters, far smaller than other open models such as Falcon-180B and Meta’s Llama 2 70B.
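Because Yi-34B is distributed as an open model on Hugging Face, it can in principle be loaded with the standard transformers library. The sketch below is illustrative only: the repository id, precision, and generation settings are assumptions rather than details confirmed in this article, and a 34-billion-parameter model requires substantial GPU memory (or quantization) to run.

```python
# Minimal sketch (assumptions noted): loading the open Yi-34B base model
# with Hugging Face transformers. "01-ai/Yi-34B" is the assumed repository
# id; check the model card for the recommended setup and license terms.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "01-ai/Yi-34B"  # assumed Hugging Face repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to reduce memory use
    device_map="auto",           # spread layers across available GPUs (needs accelerate)
)

# Yi-34B is a base model, not a chat-tuned one, so it is used for plain
# text completion rather than instruction following.
prompt = "Large language models are"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```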