

OpenAI will use Reddit and a new supercomputer to teach artificial intelligence how to speak

There's no "forum" in "AI," but Musk thinks there should be.

Alexandru Micu
August 19, 2016 @ 4:31 pm


OpenAI, Elon Musk’s artificial intelligence research company, just became the proud owner of the first-ever DGX-1 supercomputer. Made by NVIDIA, the rig boasts a whopping 170 teraflops of computing power, equivalent to 250 conventional servers, and OpenAI is gonna use it all to read Reddit comments.

OpenAI’s researchers gather around the first AI supercomputer in a box, NVIDIA DGX-1.
Image credits NVIDIA.

OpenAI is a non-profit AI research company whose purpose is to “advance digital intelligence in the way that is most likely to benefit humanity as a whole, unconstrained by a need to generate financial return.” And now, NVIDIA CEO Jen-Hsun Huang just delivered the most powerful tool the company has ever had at its disposal: a supercomputer that NVIDIA says represents some US$2 billion worth of R&D.

This “AI supercomputer in a box” isn’t much bigger than a large-ish desktop PC, but it packs a huge punch. Its 170 teraflops of computing power make it roughly equivalent to 250 conventional servers working together, and all that oomph is being put to good use for a worthy cause.

“The world’s leading non-profit artificial intelligence research team needs the world’s fastest AI system,” NVIDIA said in a statement.

“I thought it was incredibly appropriate that the world’s first supercomputer dedicated to artificial intelligence would go to the laboratory that was dedicated to open artificial intelligence,” Huang added.

But an AI needs to do more than just process things very fast. It needs to learn, and Musk, unlike my parents, believes that the best place to do so is on the Internet. Specifically, on Reddit. The forum’s huge size makes it an ideal training ground for DGX-1, which will be spending the next few months processing nearly two billion comments to learn how to better chat with human beings.

“Deep learning is a very special class of models because as you scale up, they always work better,” says OpenAI researcher Andrej Karpathy.

The $129,000 supercomputer pairs eight NVIDIA Tesla P100 GPUs (graphics processing units) with 7 terabytes of SSD storage and two Xeon processors to deliver those 170 teraflops and chew through all that data.
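For the curious, the headline figure checks out if you assume NVIDIA is quoting half-precision (FP16) performance, the number format deep learning workloads typically lean on: each Tesla P100 peaks at roughly 21.2 FP16 teraflops, and eight of them land right around 170. Here’s the back-of-the-envelope version:

```python
# Back-of-the-envelope check on the DGX-1's headline figure.
# Assumption: the 170-teraflop number refers to half-precision (FP16)
# throughput, with each Tesla P100 rated at roughly 21.2 FP16 teraflops.

P100_FP16_TFLOPS = 21.2   # per-GPU peak, half precision
NUM_GPUS = 8
EQUIVALENT_SERVERS = 250  # NVIDIA's comparison point

dgx1_tflops = P100_FP16_TFLOPS * NUM_GPUS
print(f"DGX-1 peak: ~{dgx1_tflops:.0f} TFLOPS")  # ~170 TFLOPS
print(f"Implied per server: ~{dgx1_tflops / EQUIVALENT_SERVERS:.2f} TFLOPS")  # ~0.68
```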

“You can take a large amount of data that would help people talk to each other on the internet, and you can train, basically, a chatbot, but you can do it in a way that the computer learns how language works and how people interact,” Karpathy added.
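What Karpathy is describing is language modeling: learn, from mountains of real conversations, which words tend to follow which, then use those statistics to generate replies. Here’s a deliberately tiny sketch of the idea as a bigram model in Python. To be clear, this is an illustration, not OpenAI’s code, and the sample “comments” are invented; their models are deep neural networks trained on vastly more data.

```python
# Toy version of "learn how language works from conversation data":
# count which word follows which, then sample new text from those counts.
import random
from collections import defaultdict, Counter

comments = [
    "the moon landing was real",
    "the moon is made of rock",
    "i think the landing footage is real",
]  # stand-in for ~2 billion Reddit comments

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for comment in comments:
    words = comment.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1

def generate(start, length=8):
    """Sample a plausible continuation, one word at a time."""
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break
        choices, counts = zip(*options.items())
        out.append(random.choices(choices, weights=counts)[0])
    return " ".join(out)

print(generate("the"))
```

The toy version picks each next word by looking only one word back; deep learning models do the same basic job with far longer context and far richer statistics, which is exactly why scale helps so much.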

Even better, the new supercomputer is designed to function with OpenAI’s existing software. All that’s needed is to scale it up.

“We won’t need to write any new code, we’ll take our existing code and we’ll just increase the size of the model,” says OpenAI scientist Ilya Sutskever. “And we’ll get much better results than we have right now.”
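That’s less mysterious than it sounds. In deep learning code, a model’s size usually lives in a handful of configuration numbers (layer width, depth, vocabulary size), so “increasing the size of the model” really is just turning those numbers up. A hypothetical sketch, with all values invented for illustration:

```python
# What "just increase the size of the model" means in practice:
# the same code, fed bigger hyperparameters. Numbers are made up.

def count_params(vocab_size, hidden_size, num_layers):
    """Rough parameter count for a simple recurrent language model:
    an embedding table, stacked recurrent layers, an output layer."""
    embedding = vocab_size * hidden_size
    recurrent = num_layers * 2 * hidden_size * hidden_size  # input + hidden weights
    output = hidden_size * vocab_size
    return embedding + recurrent + output

small = count_params(vocab_size=50_000, hidden_size=512, num_layers=2)
large = count_params(vocab_size=50_000, hidden_size=2048, num_layers=4)

print(f"small model: {small / 1e6:.0f}M parameters")
print(f"large model: {large / 1e6:.0f}M parameters")  # same code, ~4-5x the model
```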

NVIDIA CEO Jen-Hsun Huang slides open the DGX-1’s GPU tray at OpenAI’s headquarters in San Francisco.
Image credits NVIDIA.

