An Embedded Inference Framework for Convolutional Neural Network Applications
by: Sheng Bi, Yingjie Zhang, Min Dong, Huaqing Min
| Format: | Article |
|---|---|
| Published: | IEEE 2019-01-01 |
Description
With the rapid development of deep convolutional neural networks, more and more computer vision tasks have been effectively solved. These convolutional neural network solutions, however, rely heavily on hardware performance. Due to privacy concerns or network instability, convolutional neural networks often need to run on embedded platforms, where limited hardware resources raise critical challenges. In this paper, we design and implement an embedded inference framework that accelerates convolutional neural network inference on embedded platforms. We first analyze the time-consuming layers in the network's inference process and then design optimization methods for these layers. We also design a memory pool specifically for neural networks. Our experimental results show that our embedded inference framework runs the classification model MobileNet in 80 ms and the detection model MobileNet-SSD in 155 ms on the Firefly-RK3399 development board.