
Facebook’s open-sourcing of AI hardware is the start of the deep-learning revolution

posted on December 16, 2015
by l33tdawg

A few days ago, Facebook open-sourced its artificial intelligence (AI) hardware computing design. Most people don’t realise that large companies such as Facebook, Google, and Amazon don’t buy hardware from the usual large computer suppliers like Dell, HP, and IBM; instead, they design their own hardware based on commodity components. The Facebook website and all its myriad apps and subsystems run on a cloud infrastructure built from tens of thousands of computers designed from scratch by Facebook’s own hardware engineers.

Open-sourcing Facebook’s AI hardware means that deep learning has graduated from the Facebook Artificial Intelligence Research (FAIR) lab into Facebook’s mainstream production systems, where it will run apps created by its product development teams. For Facebook’s software developers to build deep-learning systems for users, the company needs a standard hardware module optimised for fast deep-learning execution, one that fits into and scales with Facebook’s data centres and can be designed, competitively procured, and deployed. The module, called Big Sur, looks like any rack-mounted commodity hardware unit found in a large cloud data centre.


Tags: Facebook, Hardware, Technology
