
Data centers are at the heart of everything we do online. An estimated 8 million data centers worldwide process our online activities. Data centers (DCs) provide the computing and storage for incoming and outgoing data from IT network traffic and applications. The rise of mobile and related technologies such as cloud computing has accelerated the growth of DCs to support online activities, from internet browsing to various autonomous operations and more.

There are broadly four categories of DCs:

  1. Colocation Data Centers are facilities that are rented with the servers managed by clients. The facility owner is responsible for power, cooling, resiliency, security and other environmental support.
  2. Managed Service Data Centers are similar to cloud providers but also give users access at the physical server level.
  3. Enterprise Data Centers are built and maintained by a single company using its servers. These range from smaller on-premise facilities to multi-level hyperscale centers.
  4. Cloud Data Centers are operated by hyperscalers such as Amazon, Microsoft and Google. Clients use them for storage and computing but do not have access to the physical servers. This article focuses mostly on this last category.

There is no fixed shape or size for DCs – they may be a tiny single room serving a small organization or massive warehouses processing data for internet giants such as AWS, Google and Facebook. As data processing machines, they consume huge amounts of energy and generate immense amounts of heat, both from running the equipment and from cooling it. For perspective, DCs consume 10 to 50 times the energy per unit of floor space of a typical commercial office building.

And the processing (and subsequent energy usage) is growing. The global data center market was valued at $187.35 billion in 2020 and is projected to reach $517.17 billion by 2030, at a CAGR of 10.5% from 2021 to 2030, according to market research. Combined, global data centers process and manage roughly 2.5 quintillion bytes of data created each day by individuals and businesses. DC operations run around the clock and must function as efficiently as possible, ideally with minimal or no outages.
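As a quick sanity check on the market figures above, the compound annual growth rate implied by the start and end valuations can be computed directly (a minimal sketch; the dollar values are the ones quoted above):

```python
# Implied CAGR from the market figures quoted above (values in $bn).
start, end, years = 187.35, 517.17, 10  # 2020 -> 2030

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~10.7%, in line with the ~10.5% cited
```

The small gap between the implied ~10.7% and the cited 10.5% is expected, since the research firm measures growth from 2021 rather than 2020.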

The energy consumption of a typical data center breaks down into roughly 50% for IT equipment, 35% for cooling and HVAC, 10% for electrical infrastructure and support, and 5% for lighting. The electrical demand of a data center varies from just a few kilowatts up to many megawatts, depending on its size and location. Data centers contribute around 0.3% of overall carbon emissions, while the entire ICT ecosystem accounts for more than 2%. With the global push to achieve net-zero carbon emissions by 2050, data centers are finding ways to decarbonize their operations.
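The breakdown above also implies a value for Power Usage Effectiveness (PUE), the standard industry metric defined as total facility power divided by IT power. A minimal sketch, applying the article's percentages to a hypothetical 1 MW facility (the 1 MW load is an illustrative assumption, not from the article):

```python
# Energy split for a hypothetical 1 MW (1000 kW) data center,
# using the percentage breakdown given in the article.
total_kw = 1000.0
shares = {"IT equipment": 0.50, "Cooling/HVAC": 0.35,
          "Electrical infrastructure": 0.10, "Lighting": 0.05}

for component, share in shares.items():
    print(f"{component}: {share * total_kw:.0f} kW")

# PUE = total facility power / IT power; lower is better, 1.0 is ideal.
pue = total_kw / (shares["IT equipment"] * total_kw)
print(f"Implied PUE: {pue:.1f}")  # 2.0 for this split
```

A PUE of 2.0 means the facility spends as much energy on overhead (chiefly cooling) as on computing itself, which is exactly the overhead that natural cooling strategies aim to shrink.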

Hence, DCs are looking to cut cooling costs by natural means, starting with location selection. Cooling potential is fast joining other key siting factors such as security, network proximity, tax incentives and access to renewable energy. Hyperscale data centers have been sited in Arctic regions for effective natural cooling, and countries with surplus renewable electricity, such as Canada, Finland and Sweden, are also seen as suitable locations for building data centers.

Why Go Underwater?

A start-up cloud provider, Subsea Cloud, proposes placing servers 3,000m deep in the ocean "to make physical security breaches extremely difficult." As CEO Maxie Reynolds, a marine engineer and computer scientist turned ethical hacker, explained in an industry podcast: "You can’t do it with divers. You’re going to need some very disruptive equipment. You can’t do it with a submarine, they don’t go deep enough. So you’re going to need a remote operated vehicle (ROV) and those are very trackable. It takes care of a lot of the physical side of security, and what I’m finding is that a lot of military industries want to use these for their physical security." Subsea Cloud has stated that its data centers will support healthcare, finance and the military, and will not use water or electrical cooling.

Other underwater data center projects include China's Highlander and Microsoft's Natick, which use gas-filled vessels in shallow coastal waters at depths of around 120m (400ft).

The concept of an underwater data center began as a way to provide quick cloud services to coastal communities and save overall energy use. Microsoft’s Natick project, for example, deployed its Northern Isles vessel off the coast of Scotland’s Orkney Islands, where the relatively cool waters were ideal for the experiment. The power grids supplying electricity to such centers can draw on solar and wind power, aligning with energy-saving objectives, while the surrounding water consistently cools the immersed equipment without risk of overheating. Moreover, as has been observed at “lights-out” or unmanned data centers (fully automated facilities that can operate in the dark without onsite staff), the absence of staff can boost reliability: making these centers free from human interference eliminates human error from data center maintenance and operations.

Successful underwater data center experiments could pave the way for carbon-negative data center operations in the future. Cooled by ocean waters and thus running on less energy, and powered by renewable sources, underwater data centers could combine fast data processing and heightened physical security with optimal sustainability.

