MIT’s new chip could bring neural nets to battery-powered gadgets

MIT researchers have developed a chip designed to speed up the hard work of running neural networks while dramatically reducing the power consumed in doing so – by up to 95 percent, in fact. The basic idea is to simplify the chip design so that data no longer has to shuttle back and forth between a chip’s processors and memory.
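For context, the operation such an accelerator speeds up is the dot product – multiply-and-accumulate over an input vector and a weight vector – which dominates neural-network inference. The published description of the MIT work computes these dot products inside memory, with weights constrained to +1 or −1 so the multiply reduces to a sign flip. The sketch below is purely illustrative software arithmetic, not the chip’s analog circuitry; the function names are hypothetical.

```python
# Illustrative only: the arithmetic a neural-net accelerator performs,
# not the MIT chip's actual in-memory analog implementation.

def dot_product(activations, weights):
    """Multiply-and-accumulate: the workhorse of neural-net inference."""
    return sum(a * w for a, w in zip(activations, weights))

def binary_dot_product(activations, signs):
    """With weights restricted to +1/-1 (binary-weight networks), each
    multiply collapses to a sign flip followed by accumulation -- cheap
    enough to realize with simple switches and analog summation."""
    return sum(a if s > 0 else -a for a, s in zip(activations, signs))

acts = [0.5, -1.0, 2.0, 0.25]
ws = [1, -1, 1, 1]
print(dot_product(acts, ws))         # 3.75
print(binary_dot_product(acts, ws))  # 3.75
```

Restricting weights to two values costs some accuracy, but it is what makes the in-memory analog computation tractable and is a large part of the reported power savings.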

The big advantage of the new method, developed by a team led by MIT graduate student Avishek Biswas, is that it could make it practical to run neural networks on smartphones, household devices and other portable gadgets, rather than on servers drawing constant power from the grid.

Why is that important? Because phones of the future using this chip could handle tasks like advanced speech and face recognition with neural nets and deep learning locally, rather than relying on cruder, rule-based algorithms or routing data to the cloud and back for interpretation.

Computing ‘at the edge,’ as it’s called – processing data at or near the sensors that gather it – is something companies are increasingly pursuing and implementing, so this new chip design could have a big impact on that growing opportunity should it be commercialized.

Featured Image: Zapp2Photo/Getty Images
