Reduce energy consumption of your machine learning workloads by up to 90% with AWS purpose-built accelerators
JUNE 20, 2023
For reference, GPT-3, an earlier-generation LLM, has 175 billion parameters and requires months of non-stop training on a cluster of thousands of accelerated processors. The Carbontracker study estimates that training GPT-3 from scratch on such a cluster of specialized hardware accelerators may emit up to 85 metric tons of CO2 equivalent.
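To get a feel for where an estimate like this comes from, here is a minimal back-of-envelope sketch: emissions scale roughly with accelerator power draw × cluster size × training hours × data-center PUE × grid carbon intensity. Every number below is an illustrative assumption, not a figure from the Carbontracker study or from AWS.

```python
# Back-of-envelope estimate of LLM training emissions.
# All inputs are illustrative assumptions, not values from the
# Carbontracker study or from AWS.

NUM_ACCELERATORS = 1_000       # assumed cluster size
POWER_PER_ACCEL_KW = 0.3       # assumed average draw per accelerator (kW)
TRAINING_HOURS = 2_000         # assumed ~3 months of near-continuous training
PUE = 1.2                      # assumed data-center power usage effectiveness
GRID_KG_CO2E_PER_KWH = 0.1     # assumed grid carbon intensity (kg CO2e/kWh)


def training_emissions_tons() -> float:
    """Estimated CO2-equivalent emissions of one training run, in metric tons."""
    energy_kwh = NUM_ACCELERATORS * POWER_PER_ACCEL_KW * TRAINING_HOURS * PUE
    return energy_kwh * GRID_KG_CO2E_PER_KWH / 1_000.0  # kg -> metric tons


if __name__ == "__main__":
    print(f"Estimated emissions: {training_emissions_tons():.0f} t CO2e")
```

With these assumptions the sketch lands in the tens of metric tons, the same order of magnitude as the cited estimate. Changing the cluster size, accelerator efficiency, or grid carbon intensity shifts the result by an order of magnitude either way, which is why published estimates vary widely and why more energy-efficient accelerators reduce emissions directly.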