All digital computation requires energy, and one of its by-products is heat. With the rise of cloud computing (and on-premises/cloud hybrid solutions), the consolidation of computing resources concentrates this heat geographically. How can this heat be put to good use? Carbon-neutral AI inferencing.
Take the waste from one process and use it to power another; that’s the idea behind DeepGreen. This week we are beginning to roll out Graphics Processing Units (GPUs) housed in DeepGreen digital boilers as part of our Civo-powered AI inferencing cluster. And we’re really excited that this will bring our new Serverless AI inferencing solution very close to carbon neutral.
Last week, as mentioned in our Fermyon Serverless AI blog post, we launched an on-demand AI infrastructure service that makes it super easy to use Large Language Models (LLMs) from within your Spin applications. We are already working on the next step: bringing this service as close to carbon neutral as we can.
The Use of GPUs in Serverless AI
Unlike Central Processing Units (CPUs), GPUs handle many tasks simultaneously across thousands of cores. This is perfect for AI inferencing, which is why AI makes such heavy use of GPUs. But GPU workloads generate large amounts of heat, and dissipating that heat expends even more resources. What if we could take the main waste product of GPUs - the heat - and make it serve an intended purpose? This is the central premise behind DeepGreen. Through a heat-recapture process, they take the byproduct of GPU computing and put it to good use. For example, DeepGreen already runs a data center that heats the pool at the Exmouth Leisure Centre in the UK.
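As a rough CPU-side analogy (this is an illustrative sketch, not DeepGreen's or Fermyon's actual stack), the snippet below fans a batch of independent "inference" tasks out across worker threads - the same data-parallel pattern a GPU applies across thousands of cores at once:

```python
from concurrent.futures import ThreadPoolExecutor

def fake_inference(prompt: str) -> str:
    # Stand-in for a model call; a real GPU would run thousands of
    # these compute-heavy tasks in parallel (and generate heat doing it).
    return prompt.upper()

prompts = [f"prompt {i}" for i in range(8)]

# Fan the independent tasks out across workers, GPU-style.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(fake_inference, prompts))

print(results[0])  # -> "PROMPT 0"
```

The key property is that each task is independent, so throughput scales with the number of workers - which is exactly why inferencing maps so well onto a GPU's many cores.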
From Waste to Warmth
Instead of using fans or liquid coolants to cool a system in isolation, DeepGreen’s hardware is designed to shift the heat. They submerge the system in oil to capture the heat, which can then be transferred to another system, such as a water heater. When I talked to DeepGreen’s CEO, Mark Bjornsgaard, last week at Civo Navigate, he described several other systems where GPU heat can provide enough warmth to power a specific process. And what’s exciting is not just that we’re reducing waste on one end of the process (e.g. saving energy on cooling on the GPU side); we’re also providing value on the receiving end. For example, a DeepGreen unit can displace a swimming pool’s gas-heating requirements by more than 60%. Essentially, DeepGreen turns unwanted waste heat from one system into desired warmth for another.
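To put that 60% figure in context, here is a back-of-the-envelope calculation. The annual heating demand below is a hypothetical number chosen for illustration, not a DeepGreen or Exmouth figure; only the displacement ratio comes from the claim above:

```python
# Hypothetical annual gas heating demand for a leisure-centre pool (kWh).
pool_heating_demand_kwh = 300_000  # assumed figure, for illustration only

# The post states a DeepGreen unit can displace more than 60%
# of a pool's gas-heating requirements.
displacement_ratio = 0.60

displaced_kwh = pool_heating_demand_kwh * displacement_ratio
remaining_gas_kwh = pool_heating_demand_kwh - displaced_kwh

print(f"Heat supplied by GPU waste heat: {displaced_kwh:,.0f} kWh")
print(f"Gas heating still required:      {remaining_gas_kwh:,.0f} kWh")
```

Under these assumed numbers, GPU waste heat would cover 180,000 kWh of the pool's demand - heat that would otherwise have been vented away at extra cost on the data-center side.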
Civo
When we launched Serverless AI last week at Civo Navigate, we announced that our GPUs are powered by the cloud native service provider Civo. Civo, which emphasizes eco-sustainability, distributes DeepGreen’s clean GPUs and is putting them to use. We are super excited to be using this service right away and providing a cleaner cloud solution for the eco-conscious tech community.
Your Efficiency Benefits
Fermyon’s Serverless AI uses NVIDIA A100 GPUs, among the most powerful GPUs available today. That’s one of the reasons why AI inferencing on Fermyon Cloud is so fast. Another reason, though, is that we can effectively time-slice GPU access and serve the needs of hundreds of applications per GPU. This eye on efficiency is nothing new; it’s the core of what we envision for the next wave of cloud computing. But we certainly feel privileged to be working with Civo and DeepGreen to move beyond efficiency into innovative clean computing.
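The scheduling details of our platform aren't covered here, but as a minimal sketch (an assumption-laden illustration, not Fermyon's actual scheduler), round-robin time-slicing of a single GPU across many applications might look like this:

```python
from collections import deque

def time_slice(requests: dict, slice_units: int = 1) -> list:
    """Round-robin one GPU across many apps.

    `requests` maps app name -> units of GPU work remaining.
    Returns the order in which apps received a slice.
    """
    queue = deque(requests.items())
    schedule = []
    while queue:
        app, remaining = queue.popleft()
        schedule.append(app)                # the app gets one GPU slice
        remaining -= slice_units
        if remaining > 0:
            queue.append((app, remaining))  # unfinished work rejoins the line
    return schedule

order = time_slice({"app-a": 2, "app-b": 1, "app-c": 3})
print(order)  # -> ['app-a', 'app-b', 'app-c', 'app-a', 'app-c', 'app-c']
```

The point of the sketch is the efficiency claim: no app monopolizes the GPU, so one expensive accelerator can serve many tenants instead of idling between requests.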
Your Serverless AI Applications
Ready to write your first Serverless AI application? Sign up and we’ll add you to the Serverless AI private beta.
Below is a video that we live-streamed from the recent Civo Navigate conference in London. It covers how your Serverless AI applications can work in conjunction with Fermyon Cloud’s SQLite Database, Key Value Store, and more.
Head on over to the documentation to get started. If you need some ideas, we’ve got plenty to browse and download via the Spin Up Hub. As always, if you want to chat or ask questions, don’t hesitate to join us on Discord.