Learn how to use the Intel Math Kernel Library (MKL) with the open source TensorFlow* framework. Optimize your models with Intel MKL libraries to achieve the best performance on Intel processors.
Intel MKL is a library of optimized math routines for science, engineering, and financial applications. TensorFlow is an open-source machine learning platform. Together, they can provide significant performance gains over using TensorFlow with the default math library.
This tutorial will show you how to install and use Intel MKL with TensorFlow on Windows*, Linux*, or macOS*.
What is Intel MKL?
Intel MKL is a math library that is optimized for use with Intel processors. It can be used with a variety of programming languages, including Python. MKL provides vectorized and threaded implementations of core routines (BLAS, LAPACK, FFTs, and vector math) that can improve the performance of scientific and numerical code.
What is TensorFlow?
TensorFlow is an open source platform for machine learning. It was developed by the Google Brain team and released under the Apache 2.0 open source license in 2015. TensorFlow is used by major companies all over the world, including Airbnb, eBay, Dropbox, Snapchat, Twitter, and more.
How to Use Intel MKL with TensorFlow
Intel MKL is a library for mathematical and statistical functions. It can be used with TensorFlow to accelerate computations.
To use Intel MKL with TensorFlow, you need an MKL-enabled TensorFlow build. The simplest route is a prebuilt package (for example, "pip install intel-tensorflow"); alternatively, you can install the Intel MKL library and build TensorFlow against it from source, as described below.
Once MKL-enabled TensorFlow is installed, you can test your installation from Python. The exact check varies by TensorFlow version; in many releases the following works:
python -c "from tensorflow.python.framework import test_util; print(test_util.IsMklEnabled())"
If TensorFlow was built with MKL support, this command prints "True".
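The check can also be scripted so it degrades gracefully. This is a sketch, assuming the test_util.IsMklEnabled() helper present in many TensorFlow releases; it returns None rather than crashing when TensorFlow (or that helper) is unavailable:

```python
def mkl_enabled():
    """Return True/False if TensorFlow reports an MKL build, or None if
    TensorFlow (or this particular internal helper) is unavailable."""
    try:
        # Internal helper; present in many TF 1.x/2.x releases.
        from tensorflow.python.framework import test_util
        return bool(test_util.IsMklEnabled())
    except (ImportError, AttributeError):
        return None

status = mkl_enabled()
print({True: "MKL build", False: "non-MKL build", None: "could not determine"}[status])
```

Because the helper lives in a private module, treat this as a version-dependent convenience rather than a stable API.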
Setting up the Environment
Many users find it helpful to use Intel MKL with TensorFlow in order to get better performance on their CPU-based systems. This guide will show you how to set up your environment so that you can use Intel MKL with TensorFlow.
First, you will need to install the following packages:
-Python and the TensorFlow build dependencies
-Intel MKL
-GPU support for TensorFlow (optional)
Next, you will need to configure your environment so that TensorFlow can find Intel MKL at runtime. The MKL distribution ships a script (mklvars.sh, or setvars.sh in recent oneAPI releases) that does this for you; sourcing it sets, among others:
-MKLROOT: the path to the Intel MKL installation directory
-LD_LIBRARY_PATH: extended with the directory containing the MKL shared libraries, such as libmklml.so (required for Linux users)
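Threading configuration also affects MKL performance. A minimal sketch, assuming the common OpenMP/KMP tuning variables (the specific values here are illustrative, and they must be set before TensorFlow is first imported):

```python
import os

# These variables tune MKL's OpenMP threading; set them before
# TensorFlow (and hence MKL) is first imported in the process.
os.environ["OMP_NUM_THREADS"] = "4"      # threads used per MKL operation
os.environ["KMP_BLOCKTIME"] = "1"        # ms a thread spins before sleeping
os.environ["KMP_AFFINITY"] = "granularity=fine,compact,1,0"  # pin threads to cores

# import tensorflow as tf  # import only after the environment is configured
print(os.environ["OMP_NUM_THREADS"])
```

Good values depend on your core count and workload, so benchmark rather than copying these numbers verbatim.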
Building TensorFlow with Intel MKL
Building TensorFlow with Intel MKL gives you the option to use Intel MKL as your math backend instead of the default Eigen math backend. This can provide significant performance gains on some models and architectures. In this article, we’ll show you how to build TensorFlow with Intel MKL support.
Before you begin, you’ll need to install the following dependencies:
-Bazel 0.4.5 or newer
-Intel MKL 2017 Update 2 or newer
-A supported compiler (see below)
Once you have the dependencies installed, you can clone the TensorFlow repository and check out the r1.3 branch (or any other branch that supports MKL):
git clone https://github.com/tensorflow/tensorflow.git
cd tensorflow
git checkout r1.3
Next, run the configure script and answer its prompts, then build with Bazel using the MKL configuration:
./configure
bazel build -c opt --config=mkl --copt=-march=native --copt=-DEIGEN_USE_VML --cxxopt=-D_GLIBCXX_USE_CXX11_ABI=0 //tensorflow/tools/pip_package:build_pip_package
# If Intel MKL is installed somewhere other than the default (e.g. /opt/intel/mkl),
# make sure your environment points at your MKL installation path.
Running TensorFlow with Intel MKL
This section covers installing the Intel Math Kernel Library (MKL) itself and running TensorFlow with it on an Intel-based system.
The Intel MKL is a library of optimized math routines for scientific and engineering applications. The library provides linear algebra, Fast Fourier Transforms (FFTs), and vectorized math functions. TensorFlow is a machine learning platform that includes many different components, including a math library.
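The same routine families (BLAS, FFTs, vector math) are also exposed through MKL-backed builds of NumPy, such as those in Intel's Python distribution. A hedged sketch of the kinds of calls MKL accelerates; whether they actually dispatch to MKL depends on how your NumPy was built, which is an assumption here, not something this article sets up:

```python
import numpy as np

# Linear algebra: on an MKL-backed NumPy build, matmul dispatches to MKL BLAS.
a = np.random.rand(256, 256)
b = np.random.rand(256, 256)
c = a @ b

# FFT: MKL-backed builds can route this through MKL's DFT interface.
signal = np.random.rand(1024)
spectrum = np.fft.fft(signal)

# Vectorized math: elementwise transcendental functions over whole arrays.
x = np.linspace(0.0, 1.0, 10_000)
y = np.exp(x) * np.sin(x)

print(c.shape, spectrum.shape, y.shape)
```

The code is identical either way; the math library underneath is what changes, which is exactly the appeal of linking MKL behind TensorFlow as well.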
Installing the Intel MKL is simple. The most recent version can be downloaded from the official website. Once you have downloaded the installer, run it and follow the prompts.
You will need to specify the installation directory for the MKL. By default, it will install to "/opt/intel/mkl". If you want to install it to a different location, you can specify that here.
Once the installation is complete, you will need to set the "MKL_HOME" environment variable to point to the installation directory. You can do this by running the following command:
export MKL_HOME=/opt/intel/mkl
With the Intel MKL installed and the environment variable set, you can now run TensorFlow with MKL support. To do this, you will need to compile TensorFlow from source with MKL support enabled (the "--config=mkl" Bazel option).
First, clone the TensorFlow repository from GitHub:
git clone https://github.com/tensorflow/tensorflow.git
cd tensorflow                # change into the repository you just cloned
./configure                  # configure the build, enabling MKL support
bazel build -c opt --config=mkl //tensorflow/tools/pip_package:build_pip_package
# Then run your favorite TensorFlow program
TensorFlow is a popular open-source platform for machine learning. TensorFlow has gained support for the Intel Math Kernel Library (Intel MKL), which can result in significant performance gains. Here we'll benchmark the performance of TensorFlow with and without Intel MKL support.
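The methodology of such a benchmark can be sketched with the standard library alone. The workload below is a stand-in (a naive pure-Python matrix multiply); in a real comparison you would substitute a TensorFlow op and run the script once under an MKL build and once under the default build:

```python
import time
import random

def workload(n=60):
    # Stand-in workload: naive matrix multiply. In a real benchmark this
    # would be a TensorFlow op (e.g. a large matmul or a training step).
    a = [[random.random() for _ in range(n)] for _ in range(n)]
    b = [[random.random() for _ in range(n)] for _ in range(n)]
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def benchmark(fn, warmup=2, runs=5):
    for _ in range(warmup):           # warm-up runs are excluded from timing
        fn()
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        times.append(time.perf_counter() - start)
    times.sort()
    return times[len(times) // 2]     # median is robust to outliers

median_s = benchmark(workload)
print(f"median time: {median_s * 1000:.1f} ms")
```

Warm-up iterations matter particularly for MKL comparisons, since thread pools and JIT-style dispatch are initialized on first use and would otherwise skew the first measurement.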
In this article, we have shown you how to use Intel MKL with TensorFlow. We have also shown you how to change the default MKL settings to improve performance. Finally, we have provided a brief performance comparison between TensorFlow with and without MKL.
If you want to learn more about using Intel MKL with TensorFlow, check out the following resources:
-The official TensorFlow documentation on using Intel MKL: https://www.tensorflow.org/performance/intel_mkl
-A tutorial from Intel on using MKL with TensorFlow: https://software.intel.com/en-us/articles/using-intel-mkl-with-tensorflow
-The TensorFlow CPU performance guide (covering the 'mkl' settings): https://www.tensorflow.org/performance/cpu