

Most powerful supercomputer dedicated to geosciences is now live


Tibi Puiu
October 17, 2012 @ 12:32 pm



Some of the Yellowstone supercomputer’s racks. A mosaic of Yellowstone National Park was put in place as a tribute. (c) Carlye Calvin / NCAR

While climate change may be a subject of intense debate, with equally enthusiastic supporters on both sides of the fence, one thing no one, whatever their side, should argue against is allocating resources for its study. Just recently, one of the most powerful tools for studying the planet’s climate in great detail has been powered up: the “Yellowstone” 1.5-petaflop supercomputer, which already ranks among the top 20 supercomputers in the world.

The system went live at the NCAR-Wyoming Supercomputing Center in Cheyenne, Wyoming, where it was met with great enthusiasm by the meteorologists and geoscientists stationed there, and indeed by researchers around the world. Yellowstone promises to help scientists run complex climate models, enabling studies of anything from hurricanes, tornadoes, geomagnetic storms, tsunamis, and wildfires to locating resources such as oil deposits miles beneath the Earth’s surface.

People “want to know what [climate change] is going to do to precipitation in Spain or in Kansas,” said Rich Loft, the director of technology development at the center.

The supercomputer can perform computations at 1.5 petaflops, which translates to a staggering 1,500 teraflops, or 1.5 quadrillion calculations per second. To get an idea of both the upgrade Yellowstone represents and the pace of technological advancement over the past few years, consider that NCAR’s previous supercomputer, Bluefire, commissioned in 2008, peaked at 76 teraflops, yet was still one of the most powerful supercomputers of its day.
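For the curious, here is a quick back-of-envelope sketch in Python, using only the figures quoted above, of how the petaflop number converts and how it stacks up against Bluefire:

```python
# Back-of-envelope comparison of Yellowstone vs. Bluefire peak performance.
# Figures are taken from the article; "flops" = floating-point operations per second.
yellowstone_pflops = 1.5    # petaflops
bluefire_tflops = 76        # teraflops (2008 system)

yellowstone_tflops = yellowstone_pflops * 1_000   # 1 petaflop = 1,000 teraflops
yellowstone_flops = yellowstone_pflops * 1e15     # 1.5 quadrillion calculations per second

print(f"Yellowstone: {yellowstone_tflops:,.0f} teraflops "
      f"({yellowstone_flops:.2e} calculations per second)")
print(f"Speed-up over Bluefire: ~{yellowstone_tflops / bluefire_tflops:.0f}x")
```

Run as is, this works out to roughly a twenty-fold jump in peak performance over Bluefire in just four years.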

The $70 million data center comprises 100 racks housing 72,288 compute cores built from Intel Sandy Bridge processors, a massive 144.6-terabyte storage farm, and a system for visualizing all of its data.
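As a rough illustration only, if the 1.5-petaflop peak were spread evenly across all 72,288 cores, each core would account for about 20 gigaflops; the snippet below is just that division, not a measured per-core benchmark:

```python
# Hypothetical back-of-envelope estimate: average peak throughput per core,
# assuming the quoted 1.5 petaflops is split evenly across all compute cores.
peak_flops = 1.5e15   # 1.5 petaflops
cores = 72_288

gflops_per_core = peak_flops / cores / 1e9
print(f"~{gflops_per_core:.1f} gigaflops per core at theoretical peak")
```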

A powerful tool for predicting our planet’s climate

All these numbers might not mean much on their own, but put Yellowstone’s tasks into context and they become impressive. For instance, a short-term weather forecast that would typically take Bluefire a few hours to complete can be rendered by Yellowstone in mere minutes. But it’s not raw speed where Yellowstone shines most; it’s the complexity of the tasks it can undertake. Scientists typically model the climate of a region on a grid of cells roughly 100 km wide, yet Yellowstone can refine that resolution to as fine as 10 km. This significant improvement allows for a much more detailed and accurate assessment of climate change, one closer to reality.
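To see why finer grids demand so much more computing power, consider a toy example: going from 100 km to 10 km cells means ten times as many cells along each side of a region, or a hundred times as many cells overall, before even counting the shorter time steps finer grids typically require. The sketch below uses a hypothetical 1,000 km square region purely for illustration, not NCAR’s actual model configuration:

```python
# Rough illustration (not NCAR's actual model setup): how refining a climate grid
# from 100 km to 10 km cells multiplies the work needed for the same region.
region_km = 1_000   # hypothetical square region, 1,000 km on a side

def cell_count(resolution_km, size_km=region_km):
    """Number of grid cells covering the region at a given horizontal resolution."""
    per_side = size_km // resolution_km
    return per_side * per_side

coarse = cell_count(100)   # 10 x 10   = 100 cells
fine = cell_count(10)      # 100 x 100 = 10,000 cells
print(f"Cells at 100 km: {coarse}, at 10 km: {fine} ({fine // coarse}x more)")
```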

“Scientists will be able to simulate these small but dangerous systems in remarkable detail, zooming in on the movement of winds, raindrops, and other features at different points and times within an individual storm. By learning more about the structure and evolution of severe weather, researchers will be able to help forecasters deliver more accurate and specific predictions, such as which locations within a county are most likely to experience a tornado within the next hour,” according to an NCAR statement.

Eleven research projects have already been planned to make use of Yellowstone “to try to do some breakthrough science straight away and try to shake the machine,” according to NCAR officials.
