- Amazon Bedrock and Amazon Titan models offer an easy, reliable, scalable, secure, and customizable way to build generative AI applications using foundation models, with features built for the enterprise
- AWS also announces the general availability of Amazon EC2 Inf2 instances powered by AWS Inferentia2, and new Trn1n instances powered by AWS Trainium, making generative AI more cost-efficient
- Amazon CodeWhisperer is now free for individual developers, democratizing access to generative AI-powered coding assistance
Amazon Web Services, Inc. (AWS), an Amazon.com, Inc. company (NASDAQ: AMZN), today announced four innovations across its machine learning portfolio to make generative AI more accessible to customers. As an AI pioneer, AWS has helped more than 100,000 customers of all sizes and across industries innovate using ML and AI with industry-leading capabilities. These new solutions will drive the next wave of innovation by making generative AI easy, practical, and cost-effective for customers.
These announcements include:
- Amazon Bedrock: Easily build generative AI applications – Amazon Bedrock is a new service for building and scaling generative AI applications, which are applications that can generate text, images, audio, and synthetic data in response to prompts. Amazon Bedrock gives customers easy access to foundation models (FMs), the ultra-large ML models that generative AI relies on, as well as exclusive access to the Titan family of FMs developed by AWS. Amazon Bedrock opens up an array of foundation models from leading providers, so AWS customers have the flexibility and choice to use the best models for their specific needs (see the illustrative API sketch after this list).
- General availability of Amazon EC2 Inf2 instances powered by AWS Inferentia2 chips: Lowering the cost and energy consumption of running generative AI workloads – Ultra-large ML models require massive compute to run. AWS Inferentia chips offer the best energy efficiency and the lowest cost for running demanding generative AI inference workloads (such as serving models and responding to queries in production) at scale on AWS (see the launch example after this list).
- New Trn1n instances, powered by AWS Trainium chips: Custom silicon to train models faster – Generative AI models need to be trained so they produce the right answer, image, insight, or other output the model is designed to deliver. New Trn1n instances (the server resources where the compute happens, in this case on AWS's custom Trainium chips) offer massive networking capability, which is key to training these models quickly and cost-efficiently.
- Free access to Amazon CodeWhisperer for individual developers: Real-time coding assistance – Imagine being a software developer with an AI-powered coding companion that makes your coding faster and easier. Amazon CodeWhisperer does just that. It uses generative AI under the hood to provide code suggestions in real time, based on a user's comments and their prior code (see the illustration after this list). Individual developers can access Amazon CodeWhisperer for free, without any usage limits. Amazon CodeWhisperer now supports 10 more programming languages, and participants who used CodeWhisperer completed tasks 57% faster than those who did not, and completed them successfully 27% more often.
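To make the Amazon Bedrock item above concrete, here is a minimal sketch of what invoking a Titan text model through Bedrock could look like with the AWS SDK for Python (boto3). The `bedrock-runtime` client name, the `amazon.titan-text-express-v1` model identifier, and the request/response shapes are assumptions for illustration, not details taken from this announcement.

```python
import json
import boto3

# Assumed client name and region; Bedrock exposes foundation models
# behind a single API so applications do not manage any model hosting.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Request body for a Titan text model (shape assumed for illustration).
body = json.dumps({
    "inputText": "Write a one-paragraph product description for a hiking boot.",
    "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.5},
})

response = client.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed Titan model identifier
    contentType="application/json",
    accept="application/json",
    body=body,
)

# The response body is JSON containing the generated text.
result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```

The same pattern would apply to other FMs available through Bedrock; only the model identifier and the provider-specific request body change.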
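The Inf2 and Trn1n items describe where inference and training run. The sketch below shows how a team might request an Inf2 instance with boto3; the AMI ID is a placeholder, and Trn1n training instances (for example, `trn1n.32xlarge`) would be requested the same way.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch a single Inf2 instance for inference. The ImageId below is a
# placeholder; in practice you would choose an AMI with the AWS Neuron
# SDK installed so models can target the Inferentia2 accelerators.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="inf2.xlarge",
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```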
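Finally, to illustrate the CodeWhisperer experience described above: a developer types a natural-language comment in the IDE and accepts a suggested completion. The function shown is a hypothetical example of the kind of code the service surfaces, not captured output.

```python
import csv

# Comment a developer might type in the IDE:
# "parse a CSV file and return its rows as a list of dictionaries"

def parse_csv(path):
    # A multi-line completion along these lines is the sort of suggestion
    # CodeWhisperer generates from the comment and the surrounding code.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))
```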
Swami Sivasubramanian, VP of Databases, Analytics, and Machine Learning at AWS, said: “AWS cuts through the noise to make generative AI easy, practical, and cost-effective for customers by providing solutions at every layer of the ML stack, including infrastructure, ML tools, and purpose-built AI services. Bedrock is the easiest way for customers to build and scale generative AI applications using foundation models and Amazon CodeWhisperer is the best AI coding companion for coding securely and using AI responsibly.”
AWS is on a mission to make it possible for developers of all skill levels and organizations of all sizes to innovate using generative AI. Through these solutions, AWS is providing access to powerful tools and technologies that enable developers to create intelligent applications without requiring extensive expertise in machine learning. This democratization of AI will empower organizations to leverage the benefits of generative AI for a wide range of use cases, unlock insights, and power new possibilities.