In 2015, a team of Microsoft employees tested an underwater datacenter in the Pacific Ocean.
The datacenter, which had computing power equivalent to about 300 desktop PCs, was deployed inside a container.
The experiment, named Project Natick, could use renewable energy sources to power the storage, processing and distribution of massive amounts of information.
Building the vessel that housed the experimental datacenter took 90 days.
Once the vessel was submerged last August, the researchers monitored the container from their offices on Microsoft’s Redmond campus.
Using cameras and other sensors, they recorded data such as temperature, humidity, the system's power consumption and even the speed of the ocean current.
“Going under water could solve several problems by introducing a new power source, greatly reducing cooling costs, closing the distance to connected populations and making it easier and faster to set up datacenters,” the company said.
Ben Cutler, the project manager, said: “We take a big whack at big problems, on a short-term basis. We take a look at something from a new angle, a different perspective, with a willingness to challenge conventional wisdom.”
The team is currently planning the project’s next phase, which could include a vessel four times the size of the current container with as much as 20 times the compute power. The team is also evaluating test sites for the vessel, which could be in the water for at least a year, deployed with a renewable ocean energy source.
Meanwhile, the initial vessel is now back on land, sitting in the lot of one of Microsoft’s buildings.
“We’re learning how to reconfigure firmware and drivers for disk drives, to get longer life out of them. We’re managing power, learning more about using less. These lessons will translate to better ways to operate our datacenters. Even if we never do this on a bigger scale, we’re learning so many lessons,” said Peter Lee, corporate vice president of Microsoft Research NExT.