Chinese AI chipmaker Horizon endeavours to raise $700M to rival NVIDIA
AI News, Tue, 22 Dec 2020
https://news.deepgeniusai.com/2020/12/22/chinese-ai-chipmaker-horizon-raise-700m-rival-nvidia/

AI chipmaker Horizon Robotics is seeking to raise $700 million in a new funding round.

Horizon is often seen as a potential Chinese equivalent of NVIDIA. The company was founded by Dr Kai Yu, a prominent industry figure with impressive credentials.

Yu led Baidu’s AI Research lab for three years, founded the Baidu Institute of Deep Learning, and launched the company’s autonomous driving business unit.

Furthermore, Yu has taught at Stanford University, published over 60 papers, and won first place in the ImageNet challenge, which evaluates algorithms for object detection and image classification.

China has yet to produce a chipmaker that can match the capabilities of its Western equivalents.

With increasing US sanctions making it more difficult for Chinese firms to access American semiconductors, a number of homegrown companies are emerging and gaining attention from investors.

Horizon is just five years old and specialises in making AI chips for robots and autonomous vehicles. The company has already attracted significant funding.

Around two years ago, Horizon completed a $600 million funding round at a $3 billion valuation. The company has secured $150 million so far as part of this latest round.

While it’s likely the incoming Biden administration in the US will take a less strict approach to trade with China, it seems Beijing wants to build more homegrown alternatives which can match or surpass Western counterparts.

Chinese tech giants like Huawei are investing significant resources in their chip manufacturing capabilities to ensure the country has the tech it needs to power groundbreaking advancements like self-driving cars.

Researchers achieve 94% power reduction for on-device AI tasks
AI News, Thu, 17 Sep 2020
https://news.deepgeniusai.com/2020/09/17/researchers-achieve-power-reduction-on-device-ai-tasks/

Researchers from Applied Brain Research (ABR) have achieved significantly reduced power consumption for a range of AI-powered devices.

ABR designed a new neural network called the Legendre Memory Unit (LMU). With the LMU, on-device AI tasks – such as those on speech-enabled devices like wearables, smartphones, and smart speakers – can consume up to 94 percent less power.

The reduction in power consumption achieved through the LMU will be particularly beneficial to smaller form-factor devices, such as smartwatches, which have to make do with small batteries. IoT devices which carry out AI tasks – but may have to last months, if not years, before they're replaced – should also benefit.

The LMU is described as a recurrent neural network (RNN) that enables lower-power and more accurate processing of time-varying signals.

ABR says the LMU can be used to build AI networks for all time-varying tasks—such as speech processing, video analysis, sensor monitoring, and control systems.
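
To make the idea concrete, below is a minimal NumPy sketch of the Legendre delay memory that sits at the core of the LMU, following the (A, B) definitions in the researchers' published paper (linked at the end of this article). It is an illustration only: the sizes are arbitrary, the forward-Euler update is a simplification of the zero-order-hold discretisation used in the paper, and none of it reflects ABR's production implementation.

```python
import numpy as np

def lmu_memory_matrices(order, theta):
    """Continuous-time (A, B) of the Legendre delay memory.

    Follows the definitions published by the LMU's authors; this is an
    illustrative sketch, not ABR's implementation.
    """
    q = np.arange(order)
    scale = (2.0 * q + 1.0)[:, None] / theta  # (2i + 1) / theta
    i, j = np.meshgrid(q, q, indexing="ij")
    A = scale * np.where(i < j, -1.0, (-1.0) ** (i - j + 1))
    B = scale * ((-1.0) ** q)[:, None]
    return A, B

def lmu_memory_step(m, u, A, B, dt=1.0):
    """One forward-Euler update of the memory state m for a scalar input u.

    The paper discretises with zero-order hold; Euler keeps the sketch short
    and is adequate here because dt is much smaller than theta.
    """
    return m + dt * (A @ m + B * u)

# Feed a sine wave through an 8-dimensional memory spanning ~100 time-steps.
order, theta = 8, 100.0
A, B = lmu_memory_matrices(order, theta)
m = np.zeros((order, 1))
for u in np.sin(np.linspace(0.0, 8.0 * np.pi, 500)):
    m = lmu_memory_step(m, float(u), A, B)
print(m.ravel())  # Legendre coefficients summarising the recent input window
```

The learned, nonlinear parts of a full LMU cell are omitted here; the point is simply that a small, fixed linear system can summarise a long window of input history.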

The AI industry's current go-to model is the Long Short-Term Memory (LSTM) network. The LSTM was first proposed back in 1995 and is used in most popular speech recognition and translation services today, like those from Google, Amazon, Facebook, and Microsoft.

Last year, researchers from the University of Waterloo debuted LMU as an alternative RNN to LSTM. Those researchers went on to form ABR, which now consists of 20 employees.

Peter Suma, co-CEO of Applied Brain Research, said in an email:

“We are a University of Waterloo spinout from the Theoretical Neuroscience Lab at UW. We looked at how the brain processes signals in time and created an algorithm based on how “time-cells” in your brain work.

We called the new AI, a Legendre-Memory-Unit (LMU) after a mathematical tool we used to model the time cells. The LMU is mathematically proven to be optimal at processing signals. You cannot do any better. Over the coming years, this will make all forms of temporal AI better.”

ABR presented a paper at the NeurIPS conference in late 2019 which demonstrated that the LMU is 1,000,000x more accurate than the LSTM while encoding 100x more time-steps.

The LMU model is also smaller, using 500 parameters versus the LSTM's 41,000 (a 98 percent reduction in network size).
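
As a rough illustration of where such a gap can come from, the parameter count of a single standard LSTM layer grows with the square of its hidden size, as sketched below. The input and hidden dimensions are hypothetical; the article does not state the configurations behind the 500 and 41,000 figures.

```python
def lstm_param_count(input_dim, hidden_dim):
    """Parameters in one standard LSTM layer: each of the four gates has an
    input weight matrix, a recurrent weight matrix, and a bias vector."""
    return 4 * (hidden_dim * input_dim + hidden_dim * hidden_dim + hidden_dim)

# Hypothetical sizes for illustration only; the article does not give the
# configurations behind the 41,000-parameter LSTM or the 500-parameter LMU.
print(lstm_param_count(input_dim=40, hidden_dim=50))    # 18,200
print(lstm_param_count(input_dim=40, hidden_dim=100))   # 56,400
```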

“We implemented our speech recognition with the LMU and it lowered the power used for command word processing to ~8 millionths of a watt, which is 94 percent less power than the best on the market today,” says Suma. “For full speech, we got the power down to 4 milli-watts, which is about 70 percent smaller than the best out there.”

Suma says the next step for ABR is to work on video, sensor and drone control AI processing—to also make them smaller and better.

A full whitepaper detailing the LMU and its benefits can be found on the preprint repository arXiv.
