
Microsoft Outlines Hardware Architecture for Deep Learning on Intel FPGAs

This week, Microsoft Azure CTO Mark Russinovich disclosed major advances in Microsoft’s hyperscale deployment of Intel® field programmable gate arrays (FPGAs). These advances have produced the industry’s fastest public cloud network and new technology for accelerating Deep Neural Networks (DNNs), which replicate “thinking” in a manner conceptually similar to that of the human brain.

The advances deliver performance, flexibility, and scale, using ultra-low-latency networking built on the world’s largest cloud investment in FPGAs. The resulting gains in networking speed will help businesses, governments, healthcare organizations, and universities process Big Data workloads more effectively. Azure’s FPGA-based Accelerated Networking reduces inter-virtual-machine latency by up to 10x while freeing Intel® Xeon® processors for other tasks. A minimal sketch of how a customer might request the feature follows below.
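The sketch below is not taken from Microsoft’s announcement; it shows one way to request accelerated networking when creating a network interface with the Azure Python SDK (assuming the azure-identity and azure-mgmt-network packages are installed). The subscription ID, resource group, NIC name, region, and subnet ID are placeholders.

    # Minimal sketch (not from the announcement): enabling Azure Accelerated
    # Networking on a newly created NIC via the azure-mgmt-network SDK.
    # SUBSCRIPTION_ID, SUBNET_ID, resource group, NIC name, and region are placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.network import NetworkManagementClient

    SUBSCRIPTION_ID = "<subscription-id>"
    SUBNET_ID = "<subnet-resource-id>"

    client = NetworkManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    poller = client.network_interfaces.begin_create_or_update(
        "my-resource-group",
        "my-accelerated-nic",
        {
            "location": "eastus",
            # FPGA-backed accelerated data path; requires a supported VM size.
            "enable_accelerated_networking": True,
            "ip_configurations": [
                {"name": "ipconfig1", "subnet": {"id": SUBNET_ID}}
            ],
        },
    )
    nic = poller.result()
    print(nic.enable_accelerated_networking)

Accelerated networking only takes effect on VM sizes that support it; checking the flag on the returned NIC object confirms the request was applied.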

Russinovich also outlined a new cloud acceleration framework that Microsoft calls Hardware Microservices. The infrastructure used to deliver this acceleration is built on Intel® FPGAs. The technology will enable accelerated computing services, such as Deep Neural Networks, to run in the cloud with no software required, resulting in large gains in speed and efficiency. A purely illustrative sketch of the idea follows below.
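To make the Hardware Microservices idea concrete, the following purely illustrative Python sketch (assuming NumPy is available) models a DNN layer being evaluated by a network-attached accelerator service rather than by software on the host CPU. The class names and the CPU stand-in are hypothetical and are not Microsoft’s API.

    # Purely illustrative sketch (not Microsoft's API): a "hardware microservice"
    # evaluates a DNN layer on behalf of the host instead of host software.
    import numpy as np


    class AcceleratorService:
        """Hypothetical interface a host would use to call a network-attached accelerator."""

        def dense_layer(self, x: np.ndarray, weights: np.ndarray) -> np.ndarray:
            raise NotImplementedError


    class FpgaStub(AcceleratorService):
        """Stand-in that computes on the CPU; the real service would run on FPGAs."""

        def dense_layer(self, x: np.ndarray, weights: np.ndarray) -> np.ndarray:
            # ReLU(x . W): the kind of DNN building block that would be offloaded.
            return np.maximum(x @ weights, 0.0)


    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        x = rng.standard_normal((1, 256))    # one input activation vector
        w = rng.standard_normal((256, 128))  # layer weights held by the service
        y = FpgaStub().dense_layer(x, w)
        print(y.shape)                       # (1, 128)

In this framing, the host only issues requests; the heavy numerical work lives behind the service boundary, which is what allows it to be mapped onto FPGA hardware in the cloud.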

“From our early work accelerating Bing search using FPGAs added to the Intel Xeon processor-based servers, to this new Hardware Microservices model that underlies the Deep Neural Networks (DNNs) infrastructure that Mark discussed yesterday afternoon, Microsoft is continuing to invest in novel hardware acceleration infrastructure using Intel® FPGAs,” said Doug Burger, one of Microsoft’s Distinguished Engineers.

“Application and server acceleration requires more processing power today to handle large and diverse workloads, as well as a careful blending of low power and high performance—or performance per Watt, which FPGAs are known for,” said Dan McNamara, corporate vice president and general manager, Programmable Solutions Group, Intel. “Whether used to solve an important business problem, or decode a genomics sequence to help cure a disease, this kind of computing in the cloud, enabled by Microsoft with help from Intel FPGAs, provides a large benefit.”

