Contextual AI Partners with Google Cloud to Deliver Generative AI for Enterprises
In Brief
Contextual AI has partnered with Google Cloud to scale and train its Large Language Models (LLMs) for the enterprise.
Using Google Cloud’s infrastructure, Contextual Language Models (CLMs) will produce responses trained on the data and institutional knowledge of enterprises.
Contextual AI will use Google Cloud’s GPU VMs to build and train its LLMs.
Contextual AI, a company building large language models (LLMs) for enterprises, today announced a partnership with Google Cloud, naming it the company's preferred cloud provider. The agreement covers business expansion, operational needs, scaling, and the training of its LLMs.
Under this partnership, Contextual AI will take advantage of Google Cloud’s GPU Virtual Machines (VMs) for constructing and training its models. The cloud provider offers A3 VMs and A2 VMs powered by the NVIDIA H100 and A100 Tensor Core GPUs, respectively.
Launched out of stealth following a $20 million seed raise in June, the company also plans to leverage Google Cloud’s specialized AI accelerators, Tensor Processing Units (TPUs), to build its next generation of LLMs.
“Building a large language model to solve some of the most challenging enterprise use cases requires advanced performance and global infrastructure,” said Douwe Kiela, chief executive officer at Contextual AI. “As an AI-first company, Google has unparalleled experience operating AI-optimized infrastructure at high performance and at global scale which they are able to pass along to us as a Cloud customer.”
The company announced its intention to construct contextual language models (CLMs) on the Google Cloud platform. These models will be customized to produce responses aligned with the distinct data and institutional knowledge of each enterprise.
Contextual AI claims that this approach not only bolsters the precision and effectiveness of AI-powered interactions but also empowers users to trace answers back to their source documents.
For instance, customer service representatives could use Contextual AI’s CLMs to deliver precise responses to user inquiries, drawing solely from authorized data sources such as the user’s account history, company policies, and prior tickets concerning similar questions.
Likewise, financial advisors will be able to automate reporting procedures and furnish personalized recommendations based on a client’s portfolio and history. The company said this will encompass proprietary market insights and other confidential data assets.
The Race to Deliver Generative AI for Enterprises
As AI companies race to develop generative AI to help organizations streamline business processes, cloud providers are also competing to provide infrastructure for these companies to build and train their models on.
Just last week, IBM announced a partnership with Microsoft aimed at accelerating the deployment of generative AI solutions to their shared enterprise clientele. In June, Oracle, renowned for its cloud applications and platform, joined forces with enterprise AI platform Cohere to offer organizations worldwide access to generative AI services.
Recognizing growing interest from organizations in utilizing generative AI for business purposes, Amazon Web Services (AWS) also unveiled its plans to launch the AWS Generative AI Innovation Center. This center is designed to assist customers in constructing and launching generative AI services.
As AI innovation converges with cloud capabilities, these initiatives symbolize a leap forward in enterprise AI. Not only do they demonstrate the potential of AI-driven solutions, but they also pave the way towards enhanced business efficiencies.
Disclaimer
In line with the Trust Project guidelines, please note that the information provided on this page is not intended to be and should not be interpreted as legal, tax, investment, financial, or any other form of advice. It is important to only invest what you can afford to lose and to seek independent financial advice if you have any doubts. For further information, we suggest referring to the terms and conditions as well as the help and support pages provided by the issuer or advertiser. MetaversePost is committed to accurate, unbiased reporting, but market conditions are subject to change without notice.
About The Author
Cindy is a journalist at Metaverse Post, covering topics related to web3, NFT, metaverse and AI, with a focus on interviews with Web3 industry players. She has spoken to over 30 C-level execs and counting, bringing their valuable insights to readers. Originally from Singapore, Cindy is now based in Tbilisi, Georgia. She holds a Bachelor's degree in Communications & Media Studies from the University of South Australia and has a decade of experience in journalism and writing. Get in touch with her via cindy@mpost.io with press pitches, announcements and interview opportunities.