
AWS partners with Cerebras Systems for AI inference solution deployment on Amazon Bedrock in AWS data centers

Partnership | Data Center | Announced Mar 13, 2026
Stage: Deployment

AWS will deploy Cerebras Systems' AI inference solution on Amazon Bedrock under a multi-year, $750 million agreement, significantly enhancing AWS's specialized AI compute offerings for enterprise clients.


Deal Analysis

Amazon Web Services will deploy Cerebras Systems' AI inference solution within its AWS data centers under a multi-year, $750 million agreement announced on March 13, 2026. The partnership directly enhances AWS's specialized AI compute offerings, providing advanced capabilities for enterprise clients using Amazon Bedrock. The $750 million commitment over multiple years represents a substantial investment by AWS in dedicated AI hardware infrastructure, allowing it to integrate Cerebras Systems' hardware, designed to accelerate AI workloads, directly into its cloud services. The commercial logic centers on AWS's need to meet escalating enterprise demand for high-performance AI inference by leveraging Cerebras Systems' focused technology.

Cerebras Systems, which specializes in computer systems and hardware solutions for accelerating AI workloads, gains a significant deployment channel through Amazon Bedrock. The agreement gives Cerebras Systems a direct pathway to AWS's extensive enterprise client base and validates its focused hardware development strategy.

Amazon is also investing in renewable energy to power its operations, through direct acquisitions, power purchase agreements, and an expanding portfolio of solar-plus-storage projects supporting its growing infrastructure footprint. That broader energy strategy underpins the power-intensive demands of deploying advanced AI inference solutions within AWS data centers. The deal positions Amazon to offer more competitive, specialized AI compute, while Cerebras Systems secures a substantial, multi-year revenue stream.
  • AWS commits $750 million over a multi-year period for Cerebras Systems' AI inference solution.
  • Deployment will occur within AWS data centers, integrated with Amazon Bedrock.
  • Cerebras Systems specializes in hardware solutions designed to accelerate AI workloads.
  • Amazon is actively expanding its renewable energy portfolio with solar-plus-storage to power operations.
  • The partnership was announced on March 13, 2026.

Source Intelligence

KEY DETAILS

The solution will combine AWS Trainium-powered servers with Cerebras' wafer-scale CS-3 systems and Elastic Fabric Adapter (EFA) networking. AWS will also offer open-source LLMs and Amazon Nova, its own foundation models, on Cerebras hardware “later this year.” The combined Trainium/CS-3 solution will enable “inference disaggregation,” splitting AI inference into a compute-intensive prompt processing (‘prefill’) stage and a memory-bandwidth-intensive output generation (‘decode’) stage. Trainium chips will be optimized for prefill, while Cerebras CS-3 hardware will be optimized for decode.
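The disaggregation scheme described above can be illustrated with a minimal routing sketch. This is a hypothetical simplification, not AWS's or Cerebras' implementation: the pool names, request fields, and routing function are all assumptions made for illustration. The idea it shows is that prefill (compute-bound prompt processing) and decode (memory-bandwidth-bound token generation) are handled by different hardware pools, with decode consuming the state produced by prefill.

```python
# Illustrative sketch of inference disaggregation. All names here
# (pool identifiers, Request fields) are hypothetical, chosen only to
# show the prefill/decode split described in the article.

from dataclasses import dataclass


@dataclass
class Request:
    prompt_tokens: int     # tokens processed in the prefill stage
    max_new_tokens: int    # tokens generated in the decode stage


def route(stage: str) -> str:
    # Prefill is compute-intensive, so it goes to a Trainium-style pool;
    # decode is memory-bandwidth-intensive, so it goes to a CS-3-style pool.
    if stage == "prefill":
        return "trainium_pool"
    if stage == "decode":
        return "cs3_pool"
    raise ValueError(f"unknown stage: {stage}")


def run_inference(req: Request) -> dict:
    # Stage 1: process the whole prompt in parallel on the prefill pool,
    # producing attention state (the KV cache) for the decoder.
    prefill_pool = route("prefill")
    # Stage 2: generate output tokens one at a time on the decode pool,
    # repeatedly reading the cached state (hence memory-bandwidth-bound).
    decode_pool = route("decode")
    return {
        "prefill_pool": prefill_pool,
        "decode_pool": decode_pool,
        "prompt_tokens": req.prompt_tokens,
        "tokens_generated": req.max_new_tokens,
    }


result = run_inference(Request(prompt_tokens=2048, max_new_tokens=128))
print(result)
```

The design point the article describes is that matching each stage to hardware optimized for its bottleneck (compute throughput for prefill, memory bandwidth for decode) uses both pools more efficiently than running the full request on either one alone.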

Deal Size
The financial terms of the agreement have not been disclosed.
Location
Set to be deployed on Amazon Bedrock in AWS data centers.
Financials
The financial terms of the agreement have not been disclosed.
Announcement
March 13, 2026 By Charlotte Trueman
PARTIES MENTIONED IN SOURCE
Amazon Web Services (AWS) partner

"Amazon Web Services (AWS) has partnered with Cerebras Systems"

Cerebras Systems partner

"Amazon Web Services (AWS) has partnered with Cerebras Systems"


Timeline

Announced
Mar 13, 2026
