ai chip – AI News https://news.deepgeniusai.com

Huawei unveils high-end AI chip for servers alongside MindSpore framework
https://news.deepgeniusai.com/2019/08/23/huawei-ai-chip-mindspore-framework/
Fri, 23 Aug 2019 14:24:32 +0000

Huawei has unveiled a high-end artificial intelligence chip for servers along with an AI computing framework called MindSpore.

The Huawei Ascend 910 is the “world’s most powerful AI processor,” according to a press release on Friday. The chip’s specs were first announced during last year’s Huawei Connect event in Shanghai.

Eric Xu, Rotating Chairman of Huawei, said:

“We have been making steady progress since we announced our AI strategy in October last year. Everything is moving forward according to plan, from R&D to product launch.

We promised a full-stack, all-scenario AI portfolio and today we delivered, with the release of Ascend 910 and MindSpore. This also marks a new stage in Huawei’s AI strategy.”

Huawei claims the final version of the Ascend 910 not only performs as promised but does so with much lower power consumption than originally planned.

For half-precision floating point (FP16) operations, Ascend 910 delivers 256 TeraFLOPS performance. For integer precision calculations (INT8), it delivers 512 TeraOPS.

Huawei initially expected the Ascend 910’s max power consumption to be 350W, but the company has managed to deliver the promised performance with a max consumption of just 310W.
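As a back-of-the-envelope illustration (our own arithmetic, not figures Huawei has published), the quoted throughput and power numbers imply roughly the following efficiency:

```python
# Rough efficiency arithmetic from the figures quoted above.
# These are peak/marketing numbers, so treat the results as upper bounds.

fp16_tflops = 256   # claimed FP16 throughput, teraFLOPS
int8_tops = 512     # claimed INT8 throughput, teraOPS
power_w = 310       # delivered max power consumption, watts
planned_w = 350     # originally expected max power, watts

fp16_eff = fp16_tflops / power_w   # ≈ 0.83 TFLOPS per watt
int8_eff = int8_tops / power_w     # ≈ 1.65 TOPS per watt
savings = planned_w - power_w      # 40 W below the original target

print(f"FP16: {fp16_eff:.2f} TFLOPS/W, INT8: {int8_eff:.2f} TOPS/W, saved {savings} W")
```

Note that the INT8 figure is exactly double the FP16 figure, a common ratio for chips whose vector units split each FP16 lane into two INT8 lanes.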

“Ascend 910 performs much better than we expected,” said Xu. “Without a doubt, it has more computing power than any other AI processor in the world.”

Alongside the Ascend 910, Huawei has launched an AI computing framework called MindSpore.

Last year, Huawei announced three goals for MindSpore:

  • Easy development: Reduce training time and costs.
  • Efficient execution: Use the least amount of resources with the highest possible OPS/W.
  • Adaptable to all scenarios: Including device, edge, and cloud applications.

Huawei claims that MindSpore requires 20 percent fewer lines of code than other leading frameworks when used for a typical neural network for natural language processing.

“MindSpore will go open source in the first quarter of 2020,” said Xu. “We want to drive broader AI adoption and help developers do what they do best.”

The Chinese tech behemoth continues to expand its presence despite battling a US trade ban. The US has been pressuring its allies to ban Huawei over concerns it poses a national security threat.

While security must always be prioritised, few can dispute the innovation Huawei brings across its business. Today’s announcements show the kind of advances US companies may miss out on if a deal cannot be reached, putting them at a disadvantage to Chinese rivals.


The post Huawei unveils high-end AI chip for servers alongside MindSpore framework appeared first on AI News.

Intel unwraps its first chip for AI and calls it Spring Hill
https://news.deepgeniusai.com/2019/08/21/intel-ai-powered-chip-spring-hill/
Wed, 21 Aug 2019 10:17:07 +0000

Intel has unwrapped its first processor that is designed for artificial intelligence and is planned for use in data centres.

The new Nervana Neural Network Processor for Inference (NNP-I) processor has a more approachable codename of Spring Hill.

Spring Hill is a modified 10nm Ice Lake processor which sits on a PCB and slots into an M.2 port typically used for storage.

According to Intel, the use of a modified Ice Lake processor allows Spring Hill to handle large workloads and consume minimal power. Two compute cores and the graphics engine have been removed from the standard Ice Lake design to accommodate 12 Inference Compute Engines (ICE).

In a summary, Intel detailed six main benefits it expects from Spring Hill:

  1. Best in class perf/power efficiency for major data inference workloads.
  2. Scalable performance at wide power range.
  3. High degree of programmability without compromising perf/power efficiency.
  4. Data centre at scale.
  5. Spring Hill solution – Silicon and SW stack – sampling with definitional partners/customers on multiple real-life topologies.
  6. Next two generations in planning/design.

Intel’s first chip for AI comes after the company invested in several Israeli artificial intelligence startups including Habana Labs and NeuroBlade. The investments formed part of Intel’s ‘AI Everywhere’ strategy, which aims to increase the firm’s presence in the market.

Naveen Rao, Intel vice president and general manager, Artificial Intelligence Products Group, said:

“To get to a future state of ‘AI everywhere,’ we’ll need to address the crush of data being generated and ensure enterprises are empowered to make efficient use of their data, processing it where it’s collected when it makes sense and making smarter use of their upstream resources.

Data centers and the cloud need to have access to performant and scalable general purpose computing and specialized acceleration for complex AI applications. In this future vision of AI everywhere, a holistic approach is needed—from hardware to software to applications.”

Facebook has said it will be using Intel’s new Spring Hill processor. Intel already has two more generations of the NNP-I in development.


The post Intel unwraps its first chip for AI and calls it Spring Hill appeared first on AI News.

Nvidia CEO is ‘happy to help’ if Tesla’s AI chip ambitions fail
https://news.deepgeniusai.com/2018/08/17/nvidia-ceo-help-tesla-ai-chip/
Fri, 17 Aug 2018 14:46:20 +0000

Nvidia CEO Jensen Huang has teased his company is ‘happy to help’ if Tesla fails its goal to launch a competitor AI chip.

Tesla currently uses Nvidia’s silicon for its vehicles. The company’s CEO, Elon Musk, said earlier this month that he’s a “big fan” of Nvidia but that an in-house AI chip would be able to outperform those of the leading processor manufacturer.

During a conference call on Thursday, Huang said Nvidia’s customers are “super excited” about its Xavier technology for autonomous machines. He also noted that Xavier is already in production, whereas Tesla’s rival chip is yet to be seen.

Here’s what Huang had to say during the call:

“With respect to the next generation, it is the case that when we first started working on autonomous vehicles, they needed our help. We used the 3-year-old Pascal GPU for the current generation of Autopilot computers.

It’s very clear now that in order to have a safe Autopilot system, we need a lot more computing horsepower. In order to have safe computing, in order to have safe driving, the algorithms have to be rich. It has to be able to handle corner conditions in a lot of diverse situations.

Every time there are more and more corner conditions or more subtle things that you have to do, or you have to drive more smoothly or be able to take turns more quickly, all of those requirements require greater computing capability. And that’s exactly the reason why we built Xavier. Xavier is in production now. We’re seeing great success and customers are super excited about Xavier.

That’s exactly the reason why we built it. It’s super hard to build Xavier and all the software stack on top of it. If it doesn’t turn out for whatever reason for them [Tesla] you can give me a call and I’d be more than happy to help.”

The conference call followed the release of Nvidia’s fiscal earnings report, in which the company reported better-than-expected earnings.

“Growth across every platform – AI, Gaming, Professional Visualization, self-driving cars – drove another great quarter,” said Huang. “Fueling our growth is the widening gap between demand for computing across every industry and the limits reached by traditional computing.”

However, due to lower-than-expected revenue guidance, Nvidia stock fell by six percent on Thursday following the earnings report.

What are your thoughts on Huang’s comments? Let us know below.

 

The post Nvidia CEO is ‘happy to help’ if Tesla’s AI chip ambitions fail appeared first on AI News.

Intel’s AI chip business is now worth $1bn per year, $10bn by 2022
https://news.deepgeniusai.com/2018/08/09/intel-ai-business-worth/
Thu, 09 Aug 2018 16:00:38 +0000

The size of Intel’s AI chip business today is huge, but it’s nothing compared to where it expects to be in just four years’ time.

Speaking during the company’s Innovation Summit in Santa Clara, Intel Executive VP Navin Shenoy revealed a new focus on AI development.

The company’s AI-focused Xeon processors generated $1 billion in revenues during 2017. By 2022, it expects to be generating around $10 billion per year.
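For context, growing from $1 billion in 2017 to $10 billion in 2022 implies a compound annual growth rate of roughly 58 percent — our own arithmetic, not a figure Intel has stated:

```python
# Implied compound annual growth rate (CAGR) for Intel's stated targets.
start_revenue = 1e9     # 2017 AI-focused Xeon revenue, USD
target_revenue = 10e9   # 2022 annual target, USD
years = 2022 - 2017     # five-year horizon

cagr = (target_revenue / start_revenue) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ≈ 58.5% per year
```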

AI is set to be implemented in many areas of our lives in the coming years, across a variety of devices.

Shenoy claims recent breakthroughs have increased the company’s AI performance by 200x since 2014. He teases further improvements are on their way in upcoming releases.

The company will be launching its ‘Cascade Lake’ Xeon processor later this year with 11 times better performance for AI image recognition.

Arriving in 2019 will be ‘Cooper Lake’, which uses 14-nanometer manufacturing and will feature even better performance. In 2020, the company is targeting ‘Ice Lake’, built on 10-nanometer manufacturing technology.

“After 50 years, this is the biggest opportunity for the company,” says Shenoy. “We have 20 percent of this market today.”

The admission that it holds just 20 percent of this market today is candid, and shows the company is confident about significantly increasing that share in the coming years. It faces significant competition from Nvidia in particular.

Five years ago, data-centric products accounted for around a third of Intel’s revenues; today, they represent around half of its business.

Shenoy’s comments today show how seriously Intel is taking its AI business and the firm’s confidence it will be a major player.

What are your thoughts on Intel’s AI business?

 

The post Intel’s AI chip business is now worth $1bn per year, $10bn by 2022 appeared first on AI News.

Facebook is helping Intel with AI for the first Neural Network Processor
https://news.deepgeniusai.com/2017/10/18/facebook-helping-intel-ai-first-neural-network-processor/
Wed, 18 Oct 2017 11:48:41 +0000

The CEO of Intel has revealed Facebook is providing its AI knowledge ahead of the launch of the world’s first Neural Network Processor.

Brian Krzanich made the comment during an on-stage interview at the WSJD Live conference in Laguna Beach, California. The news Intel is working on its own AI chips is no surprise, but the choice of partner may be.

“This is the first piece of silicon,” Krzanich said. “We have a whole family planned for this, (Facebook) is helping us, along with others, as to where this is going.”

Many consumers are wary of Facebook because, like Google, the company relies on collecting vast amounts of data about users. While it’s unlikely Intel would allow Facebook to perform any data collection on its users, there will doubtless be some concerns.

Facebook was the only named company, but Intel is also collaborating with others on its AI chips. The extent of the partnerships, or what benefits the partners receive for providing their resources, is currently unknown. We’ve reached out to Facebook and Intel for clarification.

Intel is aiming to build the first Neural Network Processor (NNP) before the end of this year. The company is calling this effort Nervana, after the company of the same name it acquired in August last year, and promises it will “revolutionise AI computing” across a myriad of industries.

In a blog post, Krzanich provided the following examples:

  • Healthcare: AI will allow for earlier diagnosis and greater accuracy, helping make the impossible possible by advancing research on cancer, Parkinson’s disease, and other brain disorders.
  • Social media: Providers will be able to deliver a more personalized experience to their customers and offer more targeted reach to their advertisers.
  • Automotive: The accelerated learning delivered in this new platform brings us another step closer to putting autonomous vehicles on the road.
  • Weather: Consider the immense data required to understand the movement, wind speeds, water temperatures and other factors that decide a hurricane’s path. Having a processor that takes better advantage of data inputs could improve predictions on how subtle climate shifts may increase hurricanes in different geographies.

Krzanich says multiple generations of Nervana products are in the pipeline. Last year, the company set the goal of achieving 100 times greater AI performance by 2020. Intel believes these NNPs will help them achieve this lofty goal.
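To put that goal in perspective — assuming the target was set in 2016 (the goal was announced “last year” relative to this October 2017 article) with a 2020 deadline — a 100x improvement works out to roughly 3.2x per year. This is our own arithmetic, not an Intel figure:

```python
# Implied per-year speedup for Intel's "100x AI performance by 2020" goal.
# Assumes a 2016 baseline and a 2020 target, i.e. a four-year window.
total_speedup = 100
years = 2020 - 2016

per_year = total_speedup ** (1 / years)
print(f"Required speedup per year: {per_year:.2f}x")  # ≈ 3.16x
```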

Nervana, even prior to its acquisition by Intel, had been working on neuromorphic chips for years and even developed its own, called ‘Lake Crest’, after finding traditional GPUs unsuitable for neural networking. These chips are designed to mimic the human brain, making decisions based on patterns and associations. Intel announced its own self-learning neuromorphic chip, ‘Loihi’, back in September.

According to Naveen Rao, co-founder of Nervana, the first member of the NNP family will begin shipping “soon”. We’ll keep you informed of all developments.

What are your thoughts on the NNPs being developed by Intel and partners?

 

The post Facebook is helping Intel with AI for the first Neural Network Processor appeared first on AI News.
