kubernetes – AI News
https://news.deepgeniusai.com

Amazon makes three major AI announcements during re:Invent 2019
Tue, 03 Dec 2019
Amazon has kicked off its annual re:Invent conference in Las Vegas and made three major AI announcements.

During a midnight keynote, Amazon unveiled Transcribe Medical, SageMaker Operators for Kubernetes, and DeepComposer.

Transcribe Medical

The first announcement is the one likely to have the most immediate impact on people’s lives.

Transcribe Medical is designed to transcribe medical speech for primary care. The service understands medical terminology in addition to standard conversational speech.

Amazon says Transcribe Medical can be deployed across “thousands” of healthcare facilities to provide clinicians with secure note-taking abilities.

Transcribe Medical offers an API and works with most microphone-equipped smart devices. The service is fully managed and returns a stream of text in real time.
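The article describes the real-time streaming interface, which requires an HTTP/2 client; as a simpler hedged sketch of calling the same service, here is the batch side through boto3's `start_medical_transcription_job` operation. The job name, audio file, and bucket names below are hypothetical placeholders.

```python
def build_medical_job_params(job_name: str, audio_uri: str, output_bucket: str) -> dict:
    """Build the request for Transcribe Medical's batch transcription call."""
    return {
        "MedicalTranscriptionJobName": job_name,
        "LanguageCode": "en-US",            # Transcribe Medical supports US English
        "Media": {"MediaFileUri": audio_uri},
        "OutputBucketName": output_bucket,  # transcripts land in your own S3 bucket
        "Specialty": "PRIMARYCARE",         # matches the primary-care focus above
        "Type": "CONVERSATION",             # clinician–patient dialogue; "DICTATION" also valid
    }

def submit_job(params: dict) -> None:
    """Submit the job. Requires AWS credentials; not executed here."""
    import boto3
    boto3.client("transcribe").start_medical_transcription_job(**params)
```

A caller would build the parameters once and pass them to `submit_job`, then poll `get_medical_transcription_job` for completion.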

Furthermore, and most importantly, Transcribe Medical is covered under AWS’ HIPAA eligibility and business associate addendum (BAA). This means that any customer that enters into a BAA with AWS can use Transcribe Medical to process and store personal health information legally.

SoundLines and Amgen are two partners that Amazon says are already using Transcribe Medical.

Vadim Khazan, president of technology at SoundLines, said in a statement:

“For the 3,500 health care partners relying on our care team optimisation strategies for the past 15 years, we’ve significantly decreased the time and effort required to get to insightful data.”

SageMaker Operators for Kubernetes

The next announcement is Amazon SageMaker Operators for Kubernetes.

Amazon’s SageMaker is a machine learning development platform and this new feature lets data scientists using Kubernetes train, tune, and deploy AI models.

SageMaker Operators can be installed on Kubernetes clusters and jobs can be created using Amazon’s machine learning platform through the Kubernetes API and command line tools.
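As a sketch of what this looks like in practice: with the operators installed, a SageMaker training job becomes an ordinary Kubernetes custom resource applied with `kubectl`. Field names below follow the operator's `TrainingJob` CRD as documented at launch; the role ARN, account IDs, container image, and bucket paths are placeholders.

```yaml
apiVersion: sagemaker.aws.amazon.com/v1
kind: TrainingJob
metadata:
  name: xgboost-mnist
spec:
  roleArn: arn:aws:iam::123456789012:role/SageMakerExecutionRole   # placeholder
  region: us-east-1
  algorithmSpecification:
    trainingImage: 123456789012.dkr.ecr.us-east-1.amazonaws.com/xgboost:latest
    trainingInputMode: File
  outputDataConfig:
    s3OutputPath: s3://example-bucket/output
  resourceConfig:
    instanceCount: 1
    instanceType: ml.m4.xlarge
    volumeSizeInGB: 5
  stoppingCondition:
    maxRuntimeInSeconds: 86400
  inputDataConfig:
    - channelName: train
      dataSource:
        s3DataSource:
          s3DataType: S3Prefix
          s3Uri: s3://example-bucket/train
```

Once applied, the job's status can be watched with standard tooling, e.g. `kubectl get trainingjob xgboost-mnist`, while SageMaker provisions and tears down the underlying compute.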

In a blog post, AWS deep learning senior product manager Aditya Bindal wrote:

“Customers are now spared all the heavy lifting of integrating their Amazon SageMaker and Kubernetes workflows. Starting today, customers using Kubernetes can make a simple call to Amazon SageMaker, a modular and fully-managed service that makes it easier to build, train, and deploy machine learning (ML) models at scale.”

Amazon says that compute resources are pre-configured and optimised, only provisioned when requested, scaled as needed, and shut down automatically when jobs complete.

SageMaker Operators for Kubernetes is generally available in AWS regions including US East (Ohio), US East (N. Virginia), US West (Oregon), and EU (Ireland).

DeepComposer

Finally, we have DeepComposer. This one is a bit more fun for those who enjoy playing with hardware toys.

Amazon calls DeepComposer the “world’s first” machine learning-enabled musical keyboard. The keyboard features 32 keys spanning two octaves, and is designed for developers to experiment with pretrained or custom AI models.

In a blog post, AWS AI and machine learning evangelist Julien Simon explains how DeepComposer taps a Generative Adversarial Network (GAN) to fill in gaps in songs.

After recording a short tune, the composer selects a pretrained model for their favourite genre, sets the model’s hyperparameters, and chooses a validation sample.

Once this process is complete, DeepComposer then generates a composition which can be played in the AWS console or even shared to SoundCloud (then it’s really just a waiting game for a call from Jay-Z).
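Amazon hasn't published DeepComposer's model internals beyond the GAN framing, so as a toy illustration of the adversarial idea itself (not Amazon's architecture), here is a minimal NumPy GAN in which a one-parameter generator learns to mimic a one-dimensional "real" distribution, with gradients written out by hand:

```python
import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

def train_toy_gan(steps=500, lr=0.05, seed=0):
    """Generator g(z) = a*z + b vs. logistic discriminator D(x) = sigmoid(w*x + c).

    "Real" data is drawn from N(4, 1); the generator starts at N(0, 1) and is
    pushed toward the real distribution by the discriminator's gradient.
    """
    rng = np.random.default_rng(seed)
    a, b = 1.0, 0.0   # generator parameters
    w, c = 0.1, 0.0   # discriminator parameters
    for _ in range(steps):
        x_real = rng.normal(4.0, 1.0, 32)
        x_fake = a * rng.normal(0.0, 1.0, 32) + b
        # Discriminator step: minimise -log D(real) - log(1 - D(fake))
        grad_s_r = sigmoid(w * x_real + c) - 1.0   # d(-log D)/ds on real
        grad_s_f = sigmoid(w * x_fake + c)          # d(-log(1-D))/ds on fake
        w -= lr * np.mean(grad_s_r * x_real + grad_s_f * x_fake)
        c -= lr * np.mean(grad_s_r + grad_s_f)
        # Generator step: non-saturating loss -log D(fake)
        z = rng.normal(0.0, 1.0, 32)
        grad_s = sigmoid(w * (a * z + b) + c) - 1.0
        a -= lr * np.mean(grad_s * w * z)
        b -= lr * np.mean(grad_s * w)
    return a, b
```

In DeepComposer's setting the generator emits accompaniment tracks rather than scalars and the networks are far larger, but the adversarial loop is the same shape.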

Developers itching to get started with DeepComposer can apply for a physical keyboard ahead of general availability, or get started now with a virtual keyboard in the AWS console.

Google Cloud AI Platform updates make it ‘faster and more flexible’
Tue, 29 Oct 2019
Google has issued several updates for its Cloud AI Platform that aim to make it ‘faster and more flexible’ for running machine learning workloads.

Cloud AI Platform is Google’s machine learning platform-as-a-service (ML PaaS) designed for AI developers, engineers, and data scientists. The platform is end-to-end and supports the full development cycle from preparing data, to training, all the way to building and deploying machine learning models.

Among the most noteworthy additions to the platform is support for Nvidia GPUs. As Google explains, “ML models are so complex that they only run with acceptable latency on machines with many CPUs, or with accelerators like NVIDIA GPUs. This is especially true of models processing unstructured data like images, video, or text.”

Previously, Cloud AI Platform supported only one vCPU and 2GB of RAM. You can now add GPUs, like the inference-optimised, low-latency NVIDIA T4, for AI Platform Prediction. The basic tier adds support for up to four vCPUs.
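In the REST API, attaching a T4 comes down to the model version's `machineType` and `acceleratorConfig` fields. A hedged sketch of building that request body (field names follow the AI Platform `projects.models.versions` resource; the version name, model URI, and runtime version are placeholders):

```python
def build_version_body(name: str, model_uri: str) -> dict:
    """Request body for projects.models.versions.create with one T4 attached."""
    return {
        "name": name,
        "deploymentUri": model_uri,        # GCS path to the exported SavedModel
        "runtimeVersion": "1.14",          # placeholder runtime version
        "framework": "TENSORFLOW",
        "pythonVersion": "3.5",
        "machineType": "n1-standard-4",    # GPU-capable machine type
        "acceleratorConfig": {"count": 1, "type": "NVIDIA_TESLA_T4"},
    }
```

The body would be posted to the versions endpoint with an authenticated client (or the equivalent `gcloud ai-platform versions create` invocation).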

AI Platform Prediction is being used by Conservation International, a Washington-based organisation with the mission “to responsibly and sustainably care for nature, our global biodiversity, for the wellbeing of humanity,” for a collaborative project called Wildlife Insights.

“Wildlife Insights will turn millions of wildlife images into critical data points that help us better understand, protect and save wildlife populations around the world,” explains Eric H. Fegraus, Senior Director, Conservation Technology.

“Google Cloud’s AI Platform helps us reliably serve machine learning models and easily integrate their predictions with our application. Fast predictions, in a responsive and scalable GPU hardware environment, are critical for our user experience.”

Support for running custom containers in which to train models has also become generally available. Users can supply their own Docker images with an ML framework preinstalled to run on AI Platform. Developers can test container images locally before they’re deployed to the cloud.
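A custom training container is an ordinary Docker image with the framework and training code baked in. A minimal hypothetical example, assuming the training code lives in a local `trainer` package with a `task` entry module:

```dockerfile
# Hypothetical custom training container for AI Platform.
FROM pytorch/pytorch:1.2-cuda10.0-cudnn7-runtime
WORKDIR /root

# Copy the training package into the image.
COPY trainer/ /root/trainer/

# Dependencies the trainer needs at runtime (e.g. reading/writing GCS).
RUN pip install --no-cache-dir google-cloud-storage

# AI Platform runs the container's entrypoint as the training process.
ENTRYPOINT ["python", "-m", "trainer.task"]
```

After testing locally with `docker run`, the image is pushed to Container Registry and referenced when submitting the training job.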

Customers aiming to use the platform for inference – hosting a trained model that responds with predictions – can now do so. Machine learning models can be hosted on Google Cloud AI Platform, and AI Platform Prediction can be used to infer target values for new data.
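The request format for online prediction is a JSON object carrying an `instances` list, where each element matches the model's input signature. A minimal helper, assuming a hypothetical model that takes numeric feature vectors:

```python
import json

def build_predict_request(instances: list) -> str:
    """Serialise inputs into the JSON body AI Platform's predict method expects."""
    return json.dumps({"instances": instances})

# Two example instances for a model expecting three-feature vectors.
body = build_predict_request([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
```

The same shape works from the command line via `gcloud ai-platform predict` with a file of newline-delimited instances.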

Oh, and AI Platform Prediction is now built on Kubernetes, which enabled Google to “build a reliable and fast serving system with all the flexibility that machine learning demands.”

