ai.PULSE: My First Cloud Vendor Event Focused Entirely On AI

AI is a leading theme among cloud vendors globally, and European vendors are following suit. In a cloudy Paris on November 7th, the cloud vendor Scaleway hosted its third ai.PULSE event, with about 1,000 attendees and 40 press delegates and analysts. The roster of guests was remarkable, from CEOs of global companies such as Michael Dell (founder of Dell Technologies), to government representatives such as Clara Chappaz (French secretary of state for AI and digital affairs), to startup founders like Charles Kantor (CEO and founder of AI startup “H”). The event revolved around cloud and AI, and specifically around sustainability in AI and cloud deployments and around digital and cloud sovereignty.

Fundamentally a cloud provider, Scaleway is now focusing massively on AI. The vendor has significantly boosted its AI computing power, making over 5,000 NVIDIA H100 GPUs available to its clients. These back the “GPU Cluster On Demand” service, which lets customers reserve computing clusters of varying sizes for flexible durations. Additionally, Scaleway has partnered with stealth startup “H” to provide a large training cluster of NVIDIA H100 GPUs. This pivot to AI is driving discussion in related areas such as AI and cloud sustainability, as well as digital and cloud sovereignty.

Cloud And AI Sustainability

Due to the increasing number of GPUs in cloud providers’ data centers and their associated power consumption, cloud and AI vendors have begun to treat environmental sustainability as a pivotal issue in both cloud and AI deployments. If AI adoption continues at scale, no power plant will be able to sustain the energy needs of AI workloads: over time, the increase in AI utilization will more than offset the efficiency gains delivered with each new GPU generation. Cloud decision-makers should keep an eye on these trends:

  • Colocation of data centers and power plants. Given data centers’ increasing power consumption, it may become necessary to build power plants dedicated exclusively to meeting cloud vendors’ demand. This would avoid overprovisioning (diesel-powered) backup generators to ensure continuity in case of power grid overload or failure. It will also require colocating power plants and data centers to minimize transmission losses.
  • Increase in nuclear power usage. Cloud vendors are realizing that they need nuclear energy to meet their power demand. While solar and wind power align better with cloud vendors’ sustainability goals, nuclear is going to be the main energy source in countries like France, where it is readily available.
  • GPU mutualization. Prompted by my question on GPU utilization and efficient usage, Scaleway’s CEO Damien Lucas illustrated an interesting concept: GPU mutualization. With GPU mutualization, cloud vendors become GPU brokers, providing GPUs on demand to end-user organizations and driving utilization of the processing units in their data centers toward 100%, with related power efficiency gains.
  • Improvement in the efficiency of cloud infrastructures and of AI and GenAI workloads. A single request to an AI chatbot consumes about 2.9 watt-hours, roughly ten times the power consumption of a regular Google search query. This has an obvious impact in terms of power consumption and related emissions for AI workloads hosted in public cloud environments. According to Renee James, founder of Ampere Computing, interviewed in one of the keynotes, AI has to be affordable and environmentally efficient. Given the current trend of increasing power consumption, AI must become more efficient, not just more affordable, to be sustainable from a cloud vendor’s emissions perspective.
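The per-query figures above translate into large numbers at fleet scale. A minimal back-of-envelope sketch, assuming the cited 2.9 Wh per chatbot request and one tenth of that per search query, and a hypothetical volume of 100 million queries per day (the volume is an illustrative assumption, not a figure from the event):

```python
# Rough estimates from the text, not measurements:
CHATBOT_WH_PER_QUERY = 2.9                        # ~2.9 Wh per AI chatbot request
SEARCH_WH_PER_QUERY = CHATBOT_WH_PER_QUERY / 10   # a search query uses ~10x less

def annual_energy_mwh(queries_per_day: float, wh_per_query: float) -> float:
    """Yearly energy in megawatt-hours for a given daily query volume."""
    return queries_per_day * 365 * wh_per_query / 1_000_000

# Hypothetical volume: 100 million queries per day.
chatbot_mwh = annual_energy_mwh(100e6, CHATBOT_WH_PER_QUERY)
search_mwh = annual_energy_mwh(100e6, SEARCH_WH_PER_QUERY)
print(f"Chatbot: {chatbot_mwh:,.0f} MWh/yr vs. search: {search_mwh:,.0f} MWh/yr")
```

At that assumed volume, chatbot queries would draw on the order of 100 GWh per year, an order of magnitude more than the same number of searches, which is why efficiency gains per query matter so much at data-center scale.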

Digital And Cloud Sovereignty

Digital and cloud sovereignty play a major role in France, the country of the SecNumCloud security qualification. This has consequences for AI as well, in terms of data residency and workload location. Presenters and keynote speakers made a few interesting points:

  • Bring AI to the data, not data to the AI, for inference. One sovereignty-related issue with AI is that some organizations move data to wherever the AI engine is running. Moving forward, AI will increasingly run at the edge, be it on mobile phones, laptops, or other edge devices; a lot of inference, for example, can be done on more efficient CPUs with special-purpose transistors. This will make it possible to keep the data used for inference local, overcoming some of the sovereignty-related concerns.
  • Select data for model training. Organizations do not, and should not, need to feed models all the available data. Selecting which data to provide for model training in the cloud is one way to stay in control and to avoid moving data to the public cloud when it need not leave the premises.
  • Foster a startup culture in Europe. Europe is not short of ideas or capital for startups in the cloud and AI space. However, some startups leave the continent seeking higher funding, lighter regulation, and better ecosystems in places such as Silicon Valley. If Europe wants to play a leading role in cloud and AI, regulators and operators need to build a culture that keeps talent and top ideas in Europe. This will also help tackle sovereignty constraints, as more technology options will be available locally.

European organizations should be aware that they can pursue AI opportunities within the continent’s boundaries, both to abide by digital sovereignty requirements and to shrink the list of cloud vendors whose CO2 emissions they have to keep under control.

Set up an inquiry or guidance session with me to learn more.