Although some big names in the tech world are worried about a potential existential threat posed by artificial intelligence (AI), Matt Wood, vice president of product at AWS, isn’t one of them.
Wood has long been a standard-bearer for machine learning (ML) at AWS and an integral part of company events. For the past 13 years, he's been one of AWS' leading voices on AI/ML, speaking about the technology and Amazon's advancements at nearly every AWS re:Invent conference.
AWS was working on AI long before the current cycle of generative AI hype, with its suite of SageMaker products leading the charge for the past six years. But make no mistake: AWS has joined the era of generative AI like everyone else. On April 13, AWS announced Amazon Bedrock, a set of generative AI tools that can help organizations create, train, refine and deploy large language models (LLMs).
There is no doubt that there is great power behind generative AI; it can be a disruptive force for business and society. That power has led some experts to warn that AI represents an "existential threat" to humanity. But in an interview with VentureBeat, Wood easily dismissed those fears, succinctly explaining how AI actually works and what AWS does with it.
"What we have here is a mathematical parlor trick, capable of presenting, generating and synthesizing information in ways that help humans make better decisions and function more efficiently," Wood said.
The transformative power of generative AI
Rather than posing an existential threat, Wood pointed to AI’s powerful potential to help businesses of all sizes. It’s a power confirmed by the large number of AWS customers already using the company’s AI/ML services.
"Today we have over 100,000 customers using AWS for their ML efforts and many of them have standardized on SageMaker to build, train and deploy their own models," said Wood.
Generative AI takes AI/ML to a different level and has generated a lot of excitement and interest among the AWS user base. With the advent of transformer models, Wood said, it is now possible to take very complicated natural language inputs and map them to complicated outputs for a variety of tasks such as text generation, summarization and image creation.
“I haven’t seen this level of customer engagement and enthusiasm, probably since the very early days of cloud computing,” Wood said.
Beyond the ability to generate text and images, Wood sees many enterprise use cases for generative AI. At the foundation of all LLMs are numerical vector embeddings. He explained that embeddings allow an organization to use numerical representations of information to improve experiences across a number of use cases, including search and personalization.
"You can use these numerical representations to do things like semantic search and ranking," Wood said. "So if you have a search engine or any sort of internal method that needs to collect and rank a bunch of stuff, LLMs can really make a difference in terms of how something is summarized or personalized."
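The semantic ranking Wood describes boils down to comparing vectors: documents and queries are embedded as lists of numbers, and results are ordered by similarity to the query. The sketch below illustrates the idea with tiny hand-made vectors and cosine similarity; in practice the embeddings would come from an LLM embedding model (the vectors, document titles and dimensionality here are purely illustrative, not real model outputs).

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy precomputed document embeddings (hypothetical 3-dimensional vectors;
# a real embedding model would produce hundreds or thousands of dimensions).
doc_embeddings = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.2],
    "account login": [0.0, 0.2, 0.9],
}

# Toy embedding of the query "how do I get my money back".
query_embedding = [0.8, 0.2, 0.1]

# Rank documents by semantic similarity to the query.
ranked = sorted(
    doc_embeddings.items(),
    key=lambda kv: cosine_similarity(query_embedding, kv[1]),
    reverse=True,
)
for title, _ in ranked:
    print(title)
```

Because similarity is computed on meaning-bearing vectors rather than keyword overlap, the refund document ranks first even though the query never contains the word "refund."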
Bedrock is the AWS foundation for generative AI
The Amazon Bedrock service is an attempt to make it easier for AWS users to harness the power of multiple LLMs.
Rather than just providing an LLM from a single vendor, Bedrock offers a set of options from AI21 Labs, Anthropic and Stability AI, as well as the new set of Amazon Titan models.
"We don't believe there will be one model to rule them all," Wood said. "So we wanted to be able to offer a selection of models."
Beyond just selecting models, Amazon Bedrock can also be used with LangChain, which allows organizations to use multiple LLMs at the same time. Wood said that with LangChain, users have the ability to chain and sequence prompts across several different models. For example, an organization may want to use Titan for one task, Anthropic for another and AI21 for yet another. In addition, organizations can also use their own optimized models based on specialized data.
“We certainly see (users) breaking large tasks down into smaller ones and then routing those smaller tasks to specialized models and that seems like a very successful way to build more complex systems,” Wood said.
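The decomposition-and-routing pattern Wood describes can be sketched in a few lines: break a large task into subtasks, then send each subtask to the model best suited for it. The model-calling functions below are hypothetical stand-ins (not real AWS or LangChain SDK calls), included only to show the shape of the routing logic.

```python
# Placeholder model clients -- in a real system these would call an LLM API.
def call_titan(prompt):
    """Hypothetical stand-in for an Amazon Titan invocation."""
    return f"[titan] {prompt}"

def call_anthropic(prompt):
    """Hypothetical stand-in for an Anthropic Claude invocation."""
    return f"[anthropic] {prompt}"

# Route each subtask type to the model chosen for it (assignments are illustrative).
ROUTES = {
    "summarize": call_titan,
    "draft_reply": call_anthropic,
}

def run_pipeline(subtasks):
    """Execute (task_type, prompt) pairs in sequence, routing each to its model."""
    results = []
    for task_type, prompt in subtasks:
        model = ROUTES[task_type]
        results.append(model(prompt))
    return results

outputs = run_pipeline([
    ("summarize", "Summarize this support ticket"),
    ("draft_reply", "Draft a reply based on the summary"),
])
```

In a real chain, each step's output would typically feed into the next step's prompt; the point here is simply that specialized models handle the subtasks they are best at, which is the "very successful way to build more complex systems" Wood refers to.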
As organizations adopt generative AI, Wood noted that one of the main challenges is ensuring that companies approach the technology in a way that allows them to truly innovate.
“Any significant change is 50% technology and 50% culture, so I really encourage clients to really think about both a technical element that there’s a lot of attention on at the moment, but also a lot of cultural elements about how you drive invention using technology,” he said.