Sakuu Blog
AI, Data Centers, and Interesting Implications for Kavian
AI continues to capture public attention and investment dollars, and its rapid evolution is reshaping business in remarkable ways, some of which carry interesting implications for the Kavian platform.
Sakuu is a pioneer in dry-electrode production technology that transforms the battery manufacturing process for reduced CAPEX, localized supply chains, lower carbon footprints, and speedier innovation. Our Kavian platform eliminates toxic solvents, ovens, and solvent recovery systems by using dry materials to print electrodes at speed. The technology works with all major chemistries (NCM, LFP, NCA, graphite, silicon-graphite, solid-state materials, etc.), enables new chemistries such as aluminum-ion and sodium-ion, and is applicable to everything from producing better EV batteries to safer grid storage solutions.
As it turns out, Kavian may also play a key role in powering AI.
The Data Center Demand Crisis
AI’s energy use already represents as much as 20 percent of global data-center power demand. By 2035, Deloitte estimates “that power demand from AI data centers in the United States could grow more than thirtyfold, reaching 123 gigawatts, up from 4 gigawatts in 2024.” Over the summer, Meta, Microsoft, Amazon, and Alphabet reported year-to-date capital expenditures that cumulatively total well over $100 billion, including “gargantuan investments in physical infrastructure, namely data centers.”
Here in Silicon Valley, the city of Santa Clara is investing “in expanded electrical infrastructure that will largely serve data centers.” The city is currently home to 55 data centers, with three more already in the pipeline. The nearby City of San Jose, where Sakuu is headquartered, recently struck a deal with utility giant PG&E for major grid improvements, with Mayor Matt Mahan noting the economic imperative driving the effort: “the demand for data centers is significant because cutting-edge, low-latency computing capacity could help draw R&D labs that want to be as close to those data centers as possible. He added that they could help the city maintain its competitive edge in the global artificial intelligence race.”
At issue is the difference between traditional data centers built for enterprise storage and processing, and the enormous additional computational power demanded by AI workloads and models processing trillions of parameters. AI data centers require thousands of high-performance GPUs or TPUs to perform those computations, and these specialized processors consume a lot more power than traditional CPUs. That power consumption can strain the grid. Google, for example, has already “signed agreements with two U.S. electric utilities to reduce its AI data center power consumption during times of surging demand on the grid…as energy-intensive AI use outpaces power supplies.”
The very nature of AI workloads causes spikes in energy demand. As noted in IEEE Spectrum, “When you have all of those GPU clusters, and they’re all linked together in the same workload [e.g., model training tasks], they’ll turn on and turn off at the same time.” That leads to massive energy spikes and distortions, and can make it difficult to load-balance an electrical grid to prevent outages or infrastructure damage. It also increases the amount of costly power-supply capacity that must be installed unless those peaks can be shaved off. Dedicated backup battery energy storage systems (BESS) can help a data center bridge power outages for a time, but batteries can’t consistently deliver the high power needed for the peaks typical of AI computation. A better way to mitigate those spikes, limit overall investment, and safeguard the grid is to install banks of supercapacitors alongside all those GPUs in AI data centers.
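To make the peak-shaving idea concrete, here is a minimal Python sketch. This is not Sakuu code, and every number in it (loads, grid limit, storage capacity, interval length) is a hypothetical illustration. It shows how an on-site supercapacitor bank could cover synchronized training-workload spikes so the grid connection never sees demand above a contracted limit:

```python
# Illustrative peak-shaving sketch with hypothetical numbers (not Sakuu code).
# A supercapacitor bank discharges to cover demand above the grid limit and
# recharges from spare headroom during lulls.

GRID_LIMIT_MW = 100.0  # contracted grid draw the facility stays under

# Hypothetical facility load over ten 36-second intervals: baseline load
# plus synchronized AI-training bursts that overshoot the grid limit.
load_mw = [80, 85, 140, 145, 90, 135, 150, 95, 88, 82]

def shave_peaks(load, limit, cap_energy_mwh, interval_h=0.01):
    """Supply demand above `limit` from the bank; recharge during lulls."""
    stored = cap_energy_mwh  # bank starts full
    grid_draw = []
    for demand in load:
        if demand > limit:
            # Discharge the bank to cover the spike (while energy remains).
            needed = (demand - limit) * interval_h
            supplied = min(needed, stored)
            stored -= supplied
            grid_draw.append(demand - supplied / interval_h)
        else:
            # Recharge from unused headroom, never exceeding the limit.
            recharge = min((limit - demand) * interval_h,
                           cap_energy_mwh - stored)
            stored += recharge
            grid_draw.append(demand + recharge / interval_h)
    return grid_draw

smoothed = shave_peaks(load_mw, GRID_LIMIT_MW, cap_energy_mwh=2.0)
print(max(load_mw), round(max(smoothed), 1))  # → 150 100.0
```

In this toy model, facility demand spikes to 150 MW, but the grid never sees more than 100 MW; the bank refills from unused headroom between bursts. A real sizing exercise would account for charge-discharge efficiency and the actual spike statistics of the training workload.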
Supercapacitors
Supercapacitors are similar to batteries, but they store energy electrostatically rather than chemically. They can charge and discharge quickly, providing high power for brief periods, and tend to have long lifespans. Their high charge-discharge efficiency makes them a reliable, cost-effective, low-maintenance solution for smoothing the short-term power-demand spikes of AI workloads in data centers without drawing extra power from the grid.
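For a rough sense of scale, the energy stored in a capacitor is E = ½CV², and because a supercapacitor can release that energy in seconds, even modest stored energy translates into a high power output. The cell values below are hypothetical but representative of large commercial supercapacitor cells, not Sakuu data:

```python
# Back-of-the-envelope calculation with hypothetical cell numbers.
# Energy stored in a capacitor: E = 1/2 * C * V^2.

capacitance_f = 3000.0  # farads, representative of a large supercapacitor cell
voltage_v = 2.7         # rated cell voltage

energy_j = 0.5 * capacitance_f * voltage_v ** 2  # joules stored at full charge
discharge_s = 5.0                                # hypothetical spike duration

# Average power if the cell fully discharges over the spike.
power_w = energy_j / discharge_s

print(round(energy_j), round(power_w))  # → 10935 2187
```

A single cell holding roughly 11 kJ can deliver on the order of 2 kW for a five-second burst; banks of thousands of such cells scale that to the megawatt-level transients of AI clusters, which is why high-volume electrode production matters.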
So naturally, some of our customers have started exploring Kavian’s benefits for lithium-ion supercapacitor electrode manufacturing to meet the expected rise in demand. We have begun testing dry-process performance and expect even greater cost reductions when using Kavian for supercapacitor electrodes than for battery electrodes. Additionally, lithium-ion supercapacitors require a lithium insertion step in manufacturing, another process step that Kavian already supports through its ability to print lithium-metal layers.
The end goal is to make the grid more stable and enable AI data centers to continue fueling innovation as efficiently and as sustainably as possible. An enormous number of supercapacitors need to be produced to meet that challenge.
Kavian has already proven its revolutionary printing capabilities for efficiently manufacturing electrodes of various materials within the required range of specifications. But the platform can also help power AI’s evolution by supercharging supercapacitor manufacturing with the speed and cost-effectiveness of print production. — Arwed Niestroj