Intel Accelerates Development of Artificial Intelligence Solutions with Open Neural Network Exchange Support

October 11, 2017

Today, Intel announced that it has joined the Open Neural Network Exchange (ONNX) to give developers enhanced framework interoperability that boosts efficiency and speeds the creation of artificial intelligence (AI) and deep learning models. AI and deep learning are transforming how people engage with the world and how businesses make smarter decisions.

Press Kit: Artificial Intelligence

The ONNX format was first announced last month by Microsoft* and Facebook* to give users more choice among AI frameworks, as every modeling project has its own set of requirements that often call for different tools at different stages. Intel, along with others, is participating in the project to give the developer community greater flexibility: access to the most suitable tools for each unique AI project and the ability to switch easily between frameworks and tools.

Intel's addition to the open ecosystem for AI will broaden the toolset available to developers through neon and the Intel® Nervana™ Graph as well as deployment through the Intel® Deep Learning Deployment Toolkit. neon will be compatible with other deep learning frameworks through the Intel Nervana Graph and ONNX, providing customers with more choices for frameworks and compatibility with the right hardware platform to fit their needs.
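
To illustrate what this interoperability looks like in practice, the short sketch below shows the framework-neutral step of loading and validating a serialized ONNX model before handing it to an ONNX-compatible backend. The file name model.onnx is a placeholder, and the Intel Nervana Graph / Deep Learning Deployment Toolkit import call itself is not shown because the release does not describe those APIs.

# Minimal sketch: inspect and validate an ONNX model file before passing it
# to an ONNX-compatible backend. "model.onnx" is a placeholder file name.
import onnx

model = onnx.load("model.onnx")           # parse the serialized protobuf graph
onnx.checker.check_model(model)           # verify the graph and operator usage are well formed

graph = model.graph
print("Producer:", model.producer_name)   # which framework exported the model
print("Inputs:  ", [i.name for i in graph.input])
print("Outputs: ", [o.name for o in graph.output])
print("Nodes:   ", len(graph.node), "operators in the graph")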

Currently, the ONNX format is supported by Microsoft Cognitive Toolkit*, Caffe2* and PyTorch*, with capabilities expanding over time. Through the increased interoperability and vast hardware and software ecosystem fostered by ONNX and Intel, developers can construct and train models at an accelerated pace to deliver new AI solutions.
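
For example, a model defined in PyTorch can be serialized to the ONNX format with the framework's built-in exporter, after which any ONNX-compatible toolchain could consume the resulting file. This is a minimal sketch; the two-layer network, input shape and output file name are illustrative, not taken from the release.

# Minimal sketch, assuming PyTorch with ONNX export support is installed.
# The network, input size and file name are illustrative placeholders.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 128),   # small fully connected network as a stand-in model
    nn.ReLU(),
    nn.Linear(128, 10),
)
model.eval()

dummy_input = torch.randn(1, 784)                     # example input used to trace the graph
torch.onnx.export(model, dummy_input, "model.onnx")   # write the model in ONNX format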

Project Brainwave, Microsoft's FPGA-based deep learning platform for accelerating real-time AI, will also support ONNX to help customers accelerate models from a variety of frameworks. Project Brainwave leverages Intel® Stratix® 10 FPGAs to enable the acceleration of deep neural networks (DNNs) that replicate "thinking" in a manner conceptually similar to that of the human brain. Microsoft was the first major cloud service provider to deploy FPGAs in its public cloud infrastructure, and it continues to demonstrate technology advancements with Intel Stratix 10 FPGAs.

To learn more about how Intel and ONNX are making AI more accessible across industries, visit this Intel Nervana blog post.


