
In the Zone

AI requires GPUs? Not with today’s CPUs


by Peter Krass on 07/31/2018
Blog Category: advanced-technologies

Getting your customers started with artificial intelligence requires powerful GPUs, right?

Not anymore. Today’s advanced CPUs are increasingly up to the job. They can power many AI workloads all by themselves.

This means your customers can now get started with AI using the same server architectures they already know. That includes the Intel Xeon Scalable processors.

When your customers run AI applications on CPU-based servers, they can get more from their existing on-premises infrastructure. They can also take advantage of low-cost public clouds. And they can run AI training during off-hours.

AI on CPU-based servers is quickly becoming a mainstream practice. At Facebook, for example, the top 3 services now use Intel Xeon processors for 100% of their inference workloads.

Big benefits

Running AI workloads on a CPU-based system can deliver other benefits to your customers, as well. These include:

> Utilization to the max: A GPU that isn’t running its intended workload sits idle. By contrast, a general-purpose CPU can be shifted to other workloads, or even rented out as an IaaS server, giving it far more flexibility.

> Scalability: Both machine learning and deep learning require highly scalable architectures. Intel Xeon Scalable processors support very large models as well as memory-intensive ones.

> More memory: While today’s GPUs typically max out at 32GB, CPU-based servers can draw on hundreds of gigabytes across an entire platform. For applications that involve large images and unstructured data, such as voice and text, this larger capacity can make a big difference.

For example, the pharmaceutical company Novartis now runs deep neural networks for early drug discovery, a process that involves large volumes of highly detailed images, on just eight CPU-based servers.

High performance

Intel Xeon Scalable processors have been specifically designed to run high-performance AI workloads alongside the data-center and cloud workloads they already handle. For deep learning, performance has improved by more than 240x for training and nearly 280x for inference, according to Intel.

[Image] Intel Xeon Scalable CPUs: ready for AI

Intel is also working on new, optimized instruction sets, along with additional in-generation framework optimizations. The company has likewise been optimizing the software frameworks and libraries that customers use to get started with AI. Most of this optimization work is done in open-source code, meaning it’s free and available to everyone.
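Why do these framework and library optimizations matter so much? Because deep-learning inference is, at its core, dense matrix arithmetic, exactly the kind of work that a modern CPU’s vector units and optimized math libraries handle well. As a rough illustration (this is not Intel’s optimized code; the layer sizes and plain-NumPy implementation are purely illustrative), a forward pass through a small fully connected network looks like this:

```python
import numpy as np

def relu(x):
    # Element-wise activation: negative values clamped to zero
    return np.maximum(x, 0.0)

def infer(x, weights):
    """CPU forward pass of a small fully connected network.

    Each layer is just a matrix multiply plus bias -- the kind of
    operation that optimized BLAS libraries accelerate on CPUs.
    """
    for W, b in weights[:-1]:
        x = relu(x @ W + b)
    W, b = weights[-1]
    logits = x @ W + b
    # Softmax turns the final-layer outputs into class probabilities
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
sizes = [784, 256, 10]  # illustrative layer widths, e.g. image in, 10 classes out
weights = [(rng.standard_normal((m, n)) * 0.01, np.zeros(n))
           for m, n in zip(sizes, sizes[1:])]

batch = rng.standard_normal((32, 784))  # a batch of 32 inputs
probs = infer(batch, weights)
print(probs.shape)  # (32, 10): one probability vector per input
```

When NumPy is built against an optimized math library, the matrix multiplies above are automatically vectorized and multi-threaded across the CPU’s cores, which is exactly the style of optimization Intel contributes to the mainstream AI frameworks.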

AI in a box

To give your customers another path to CPU-driven AI, Intel has partnered with C3 IoT, an innovative software platform provider.

The result of this partnership is a hardware-software solution called the C3 IoT AI Appliance. It’s designed for organizations seeking to deploy AI applications while meeting stringent data governance, compliance and security requirements.

[Image] The C3 IoT AI Appliance: CPU-driven

The C3 IoT AI Appliance is powered by an Intel Xeon Platinum 8160 processor with 24 cores, 48 threads and a maximum Turbo frequency of 3.7 GHz.

The appliance’s software, the C3 IoT Platform, is a platform-as-a-service offering for rapidly developing, operating and scaling AI and IoT. It comes pre-installed on the C3 IoT AI Appliance, along with Microsoft’s Azure Stack.

If you haven’t heard of C3 IoT, you’ve probably heard of its CEO: Tom Siebel. He’s the former CEO of Siebel Systems, which was acquired by Oracle in 2006.

This past May, when the C3 IoT AI Appliance was introduced, Siebel said, “C3 IoT and Intel are enabling our joint customers to attain unprecedented levels of operational efficiency, productivity and competitive advantage.”

Get intelligent

So if you have customers who are looking to get started with AI but think they need GPUs, tell them otherwise. For the right AI workloads, both training and inference, a CPU-based system can meet their needs.

And it can do so with an architecture your customers already know, and may already own.

Saving money and time for AI? There's nothing artificial about that.

Take smart action:

> Watch our social-chat replay: Driving the AI revolution with Intel

> Get training: Invest, grow & expand your business with AI

> Explore a website: What will you build with AI on Intel?

> Get builders’ resources: Intel AI Solutions Library

