
OpenAI will use Reddit and a new supercomputer to teach artificial intelligence how to speak

There's no "forum" in "AI" but Musk thinks there should be.

Alexandru Micu
August 19, 2016 @ 4:31 pm

OpenAI, Elon Musk’s artificial intelligence research company, just became the proud owner of the first ever DGX-1 supercomputer. Made by NVIDIA, the rig boasts a whopping 170 teraflops of computing power, equivalent to 250 conventional servers — and OpenAI is gonna use it all to read Reddit comments.

OpenAI’s researchers gather around the first AI supercomputer in a box, NVIDIA DGX-1.
Image credits NVIDIA.

OpenAI is a non-profit AI research company whose purpose is to “advance digital intelligence in the way that is most likely to benefit humanity as a whole, unconstrained by a need to generate financial return.” And now, NVIDIA CEO Jen-Hsun Huang just delivered the most powerful tool the company has ever had at its disposal — a supercomputer NVIDIA says is the product of some US$2 billion in research and development.

This “AI supercomputer in a box” isn’t much bigger than a large-ish desktop PC, but it packs a huge punch. Its 170 teraflops of computing power make it roughly equivalent to 250 conventional servers working together, and all that oomph is being put to good use for a worthy cause.

“The world’s leading non-profit artificial intelligence research team needs the world’s fastest AI system,” NVIDIA said in a statement.

“I thought it was incredibly appropriate that the world’s first supercomputer dedicated to artificial intelligence would go to the laboratory that was dedicated to open artificial intelligence,” Huang added.

But an AI needs to do more than just process things very fast. It needs to learn, and Musk, unlike my parents, believes that the best place to do so is on the Internet — specifically, on Reddit. The forum’s huge size makes it an ideal training ground for DGX-1, which will spend the next few months processing nearly two billion comments to learn how to better chat with human beings.

“Deep learning is a very special class of models because as you scale up, they always work better,” says OpenAI researcher Andrej Karpathy.

The $129,000 supercomputer relies on eight NVIDIA Tesla P100 GPUs (graphics processing units), 7 terabytes of SSD storage, and two Xeon processors — on top of the aforementioned 170 teraflops of performance — to go through this data and make sense of it all.

“You can take a large amount of data that would help people talk to each other on the internet, and you can train, basically, a chatbot, but you can do it in a way that the computer learns how language works and how people interact,” Karpathy added.
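The idea Karpathy describes — letting a computer pick up how language works just by counting patterns in real conversations — can be shown at toy scale. This is a minimal sketch, not OpenAI’s actual code or anything resembling a deep network: it builds a simple bigram model that learns which word tends to follow which from a handful of made-up comment strings.

```python
from collections import Counter, defaultdict

def train_bigram_model(comments):
    """Count, for every word, how often each other word follows it."""
    model = defaultdict(Counter)
    for comment in comments:
        words = comment.lower().split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1
    return model

def predict_next(model, word):
    """Return the most frequently observed follower of `word`, or None."""
    if word not in model:
        return None
    return model[word].most_common(1)[0][0]

# A stand-in corpus; the real system would ingest ~2 billion Reddit comments.
comments = [
    "deep learning models scale well",
    "deep learning needs lots of data",
    "reddit comments are lots of data",
]
model = train_bigram_model(comments)
print(predict_next(model, "deep"))  # prints "learning"
```

A real chatbot replaces the word-pair counts with a neural network, but the principle is the same: more data means better statistics about how people actually talk.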

Even better, the new supercomputer is designed to function with OpenAI’s existing software. All that’s needed is to scale it up.

“We won’t need to write any new code, we’ll take our existing code and we’ll just increase the size of the model,” says OpenAI scientist Ilya Sutskever. “And we’ll get much better results than we have right now.”
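Sutskever’s point — same code, bigger model — is easy to picture with a hypothetical stand-in. In this sketch, `build_model` and its parameters are invented for illustration; the only thing that changes between runs is a size setting, while the code itself stays untouched.

```python
def build_model(hidden_size, num_layers):
    """Toy stand-in for a neural network: parameter count grows
    quadratically with width and linearly with depth."""
    params = num_layers * hidden_size * hidden_size
    return {"hidden_size": hidden_size, "num_layers": num_layers, "params": params}

# The same function, called with a larger configuration — no new code.
small = build_model(hidden_size=512, num_layers=4)
large = build_model(hidden_size=2048, num_layers=4)
print(large["params"] // small["params"])  # widening 4x multiplies parameters 16x
```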

NVIDIA CEO Jen-Hsun Huang slides open the DGX-1’s GPU tray at OpenAI’s headquarters in San Francisco.
Image credits NVIDIA
