Posted 21 Jan 2019
Run an Inference Engine API in Python on IEI Tank AIoT Developer Kit

This paper provides introductory information, links and resources for operating an IEI Tank with the Intel® Distribution of OpenVINO™ toolkit for Linux with FPGA support.

Editorial Note

This article is in the Product Showcase section for our sponsors at CodeProject. These articles are intended to provide you with information on products and services that we consider useful and of value to developers.

Introduction

The IEI Tank* AIoT Developer Kit is a ruggedized embedded computer system for performing deep learning inference at the edge. This computer platform supports multiple devices for heterogeneous workflows, including CPU, GPU and FPGA.

This paper provides introductory information, links and resources for operating an IEI Tank with the Intel® Distribution of OpenVINO™ toolkit for Linux* with FPGA support. The steps for running an inference engine API sample in Python* targeting the FPGA are also described below.

Hardware and Software Components

The IEI New Product Launch video provides an overview of the Mustang-F100-A10 acceleration card, along with information on how the card is installed in an IEI Tank. Figure 1 shows the IEI Tank AIoT Developer Kit with a Mustang-F100-A10 acceleration card installed and operational.

Figure 1. Installed Mustang-F100-A10 acceleration card.

The hardware and software components for the system shown in Figure 1 are listed below.

Computer

Software

Note: The IEI Tank AIoT Developer Kit comes with Ubuntu* 16.04 LTS, the Intel® Distribution of OpenVINO™ toolkit, Intel® Media SDK, Intel® System Studio and Arduino Create* pre-installed.

FPGA

Software Installation

The online document "Install the Intel® Distribution of OpenVINO™ toolkit for Linux with FPGA Support" provides detailed steps for installing and configuring the required software components. This comprehensive procedure demonstrates how to:

  • Install the core components and external software dependencies
  • Configure the Model Optimizer
  • Set up the Intel® Arria® 10 FPGA
  • Program an FPGA bitstream
  • Verify the installation by running a C++ classification sample with the -d parameter option to target the FPGA

Note: All of these steps must be completed before attempting to run the Python classification sample presented in the next section.
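The -d parameter used in the verification step above accepts a HETERO:&lt;primary&gt;,&lt;fallback&gt; string, which asks the inference engine's heterogeneous plugin to run any layer the primary device cannot execute (for example, an operation the FPGA bitstream does not implement) on the fallback device instead. The small helper below is purely illustrative — it is not part of the toolkit — and simply shows how such a device string is composed:

```python
def device_string(primary="FPGA", fallback="CPU"):
    """Build the -d argument for an inference engine sample.

    HETERO:<primary>,<fallback> runs layers unsupported by the primary
    device (e.g. the FPGA) on the fallback device (e.g. the CPU).
    """
    if fallback:
        return "HETERO:{},{}".format(primary, fallback)
    return primary
```

Passing -d HETERO:FPGA,CPU rather than plain FPGA avoids failures on layers that cannot be offloaded to the FPGA.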

Run a Python* Classification Sample Targeting FPGA

The steps required to run the Python classification sample included in the Intel® Distribution of OpenVINO™ toolkit are shown below. This workflow is similar to the steps in the installation guide under the section titled "Run a Sample Application," except that here the classification sample is written in Python.

  1. Open a terminal and enter the following command to go to the Python samples directory:
    cd /opt/intel/computer_vision_sdk/deployment_tools/inference_engine/samples/python_samples
  2. Run a Python classification sample application targeting only the CPU, and use the -ni parameter to set the number of iterations to 100:
    python3 classification_sample.py -m ~/openvino_models/ir/squeezenet1.1/FP32/squeezenet1.1.xml -i /opt/intel/computer_vision_sdk/deployment_tools/demo/car.png -ni 100

    The output of this program, with the -ni 100 parameter included, is shown in Figure 2.

    Figure 2. Classification results using CPU only.
  3. Next, run the command again using the -d option to target the FPGA:
    python3 classification_sample.py -m ~/openvino_models/ir/squeezenet1.1/FP32/squeezenet1.1.xml -i /opt/intel/computer_vision_sdk/deployment_tools/demo/car.png -d HETERO:FPGA,CPU -ni 100

    The output of this program, with the -d HETERO:FPGA,CPU -ni 100 parameters included, is shown in Figure 3.

    Figure 3. Classification results using FPGA.
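Internally, classification_sample.py drives the Inference Engine Python API. The condensed sketch below illustrates that flow, assuming the 2018-era IEPlugin/IENetwork API bundled with this toolkit release (class names differ in later releases). The weights_for and top_n helpers are illustrative, not toolkit functions; the openvino import is deferred so the helpers work even without the toolkit installed:

```python
import os
import numpy as np

def weights_for(model_xml):
    """The IR .bin weights file sits next to the .xml topology file."""
    return os.path.splitext(model_xml)[0] + ".bin"

def top_n(probs, n=10):
    """Class indices of the n largest scores, best first
    (the sample prints a table like this by default)."""
    return np.argsort(probs)[::-1][:n].tolist()

def classify(model_xml, image, device="HETERO:FPGA,CPU", iterations=100):
    """Sketch of the sample's inference loop for one preprocessed image."""
    # Deferred import: requires the Intel Distribution of OpenVINO toolkit.
    from openvino.inference_engine import IENetwork, IEPlugin
    plugin = IEPlugin(device=device)
    net = IENetwork(model=model_xml, weights=weights_for(model_xml))
    input_blob = next(iter(net.inputs))
    out_blob = next(iter(net.outputs))
    exec_net = plugin.load(network=net)
    for _ in range(iterations):          # what the -ni option repeats
        res = exec_net.infer(inputs={input_blob: image})
    return top_n(res[out_blob].flatten())
```

With device="CPU" this corresponds to step 2 above; the default HETERO:FPGA,CPU string corresponds to step 3.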

Summary

For more information about the IEI Tank* AIoT Developer Kit and the Intel® Distribution of OpenVINO™ toolkit for Linux with FPGA support, take a look at the resources provided below.

Product information

Articles and tutorials

Videos

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


About the Author

Intel Corporation
United States
You may know us for our processors. But we do so much more. Intel invents at the boundaries of technology to make amazing experiences possible for business and society, and for every person on Earth.

Harnessing the capability of the cloud, the ubiquity of the Internet of Things, the latest advances in memory and programmable solutions, and the promise of always-on 5G connectivity, Intel is disrupting industries and solving global challenges. Leading on policy, diversity, inclusion, education and sustainability, we create value for our stockholders, customers and society.
Article Copyright 2019 by Intel Corporation