'DLB (Deep Learning Blocks)', a part of the DPU (Deep Learning Processing Unit), is a collection of synthesizable Verilog modules for deep learning inference networks.
All contents are provided as-is WITHOUT ANY WARRANTY, and NO TECHNICAL SUPPORT will be provided for problems that might arise.
- Overview
  - 1.1 Prerequisites
  - 1.2 Preparing environment
- Verification
- Deep Learning Blocks
  - 3.1 Concatenation
  - 3.2 Convolution
  - 3.3 Linear
  - 3.4 Pooling
  - 3.5 Residual
- Deep Learning Processing Unit
- Projects
  - 5.1 Project: LeNet-5
  - 5.2 Project: Tiny Yolo V2
  - 5.3 Project: U-Net
- Other things
  - 6.1 Acknowledgment
  - 6.2 Authors and contributors
  - 6.3 License
  - 6.4 Revision history
The DLB collection contains the following blocks (some are under development and more will be added):
- Convolution layer (straightforward 2D convolution)
- Pooling layer (2D max pooling)
- Fully connected layer (1D linear vector-to-matrix multiplication)
- Concat layer
- Residual layer (point-to-point adder)
- Move, Fill, and so on
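To make the behavior of the listed blocks concrete, here is a minimal pure-Python sketch of their reference functionality (illustrative only; the actual DLB blocks are parameterized Verilog, and these helper names are assumptions, not part of the package):

```python
# Pure-Python reference sketches of the DLB layer behaviors listed above.

def conv2d(x, w):
    """Straightforward 2D convolution (no padding, stride 1)."""
    H, W, K = len(x), len(x[0]), len(w)
    out = []
    for i in range(H - K + 1):
        row = []
        for j in range(W - K + 1):
            acc = 0.0
            for ki in range(K):
                for kj in range(K):
                    acc += x[i + ki][j + kj] * w[ki][kj]
            row.append(acc)
        out.append(row)
    return out

def maxpool2d(x, k=2):
    """2D max pooling with a k x k window and stride k."""
    return [[max(x[i + di][j + dj] for di in range(k) for dj in range(k))
             for j in range(0, len(x[0]) - k + 1, k)]
            for i in range(0, len(x) - k + 1, k)]

def linear(v, m):
    """Fully connected layer: vector times matrix, m given as weight columns."""
    return [sum(vi * mij for vi, mij in zip(v, col)) for col in m]

def residual(a, b):
    """Residual layer: point-to-point adder."""
    return [ai + bi for ai, bi in zip(a, b)]
```

These software models mirror what a straightforward hardware datapath computes, which is why a "not optimized for computation" note applies equally here.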
The blocks have the following highlights:
- Fully synthesizable Verilog code
- Parameterized to suit a wide range of uses
- Test-benches for Verilog, C/C++, and PyTorch
- FPGA verified using Future Design Systems’ CON-FMC
This program requires the following:
- GNU GCC: C++ compiler
- PyTorch / Conda / Python3
- HDL simulator: Xilinx Xsim (Xilinx Vivado 2019.1 or later)
- Xilinx Vivado 2017 or earlier versions do not support the SystemVerilog features used in this project.
- Xilinx Vivado WebPACK, which does not require a license, will be fine.
- FPGA implementation: Xilinx Vivado
- HW-SW Co-Simulation Library for AMBA AXI BFM using DPI/VPI
- DLR: Deep Learning Routines (v1.4 for this project)
Refer to "How to Prepare DPU Development Environment".
The picture below shows how the DLB blocks are verified.
| DPU on board | DPU on FPGA | DPU HW/SW co-simulation |
|:---:|:---:|:---:|
As shown in the picture below, the DLB blocks are verified using HDL (Hardware Description Language), C/C++, and Python.
| DLB with HDL | DLB with C | DLR with Python |
|:---:|:---:|:---:|
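Cross-checking the HDL, C/C++, and Python flows ultimately comes down to comparing their output vectors for the same stimulus. A minimal sketch of that comparison step, assuming each flow dumps its result as a flat list of floats (the function name and tolerance are assumptions, not part of the package):

```python
# Compare a golden-model output against a DUT (design-under-test) output
# element by element, within a small tolerance for floating-point noise.
def match(ref, dut, tol=1e-6):
    return len(ref) == len(dut) and all(abs(r - d) <= tol
                                        for r, d in zip(ref, dut))
```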
It should be noted that the blocks are not optimized for higher performance, since they target hardware implementation rather than software computation. In addition, the blocks support inference only, not training. For an optimized version, contact Future Design Systems.
To be added.
To be added.
To be added.
To be added.
To be added.
To be added.
More details will be added.
Deep-learning processing unit
LeNet-5 is a popular convolutional neural network architecture for handwritten and machine-printed character recognition.
The following picture summarizes how to use the DPU for the LeNet-5 inference network; the design flow consists of three major phases.
- Training phase: Training on PyTorch
- It uses the PyTorch framework for modeling and training LeNet-5 to obtain trained weights.
- Verification phase: HW/SW co-simulation
- It uses the co-simulation framework to verify functionality after preparing the DPU RTL and SW.
- Inferencing phase: Running on FPGA
- It uses the CON-FMC framework to run the model on the FPGA along with the SW through USB.
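As a quick sanity check on the network the phases above train and deploy, the classic LeNet-5 geometry (32x32 input, stride-1 'valid' 5x5 convolutions, non-overlapping 2x2 pooling) can be computed in a few lines; the function names here are illustrative only:

```python
# Feature-map size bookkeeping for the classic LeNet-5 topology.

def conv_out(size, k=5):
    return size - k + 1          # stride-1 'valid' convolution

def pool_out(size, k=2):
    return size // k             # non-overlapping 2x2 pooling

def lenet5_fc_inputs(inp=32):
    s = pool_out(conv_out(inp))  # C1 -> S2: 28 -> 14
    s = pool_out(conv_out(s))    # C3 -> S4: 10 -> 5
    return 16 * s * s            # 16 maps of 5x5 = 400 FC inputs
```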
This project uses the 32-bit floating-point data type, so there is no need to manipulate trained weights. This project does not include a 'network compiler' that could generate C/C++/Python SW code to control the DPU hardware.
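Because the DPU datapath uses 32-bit floating point, trained weights can be stored bit-exactly with no quantization or conversion step. A minimal sketch of such a round trip, using only the standard library (the file layout and function names are assumptions, not the project's actual format):

```python
import struct

def save_f32(values, path):
    """Write a flat list of weights as raw little-endian float32."""
    with open(path, "wb") as f:
        f.write(struct.pack("<%df" % len(values), *values))

def load_f32(path):
    """Read raw little-endian float32 weights back into a list."""
    with open(path, "rb") as f:
        data = f.read()
    return list(struct.unpack("<%df" % (len(data) // 4), data))
```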
More details will be added.
More details will be added.
The authors thank all who contributed to this package.
- [Ando Ki] - Initial work - Future Design Systems
DLR (Deep Learning Routines) and its associated materials are licensed with the 2-clause BSD license to make the program and library useful in open and closed source projects independent of their licensing scheme. Each contributor holds copyright over their respective contribution.
The 2-Clause BSD License
Copyright 2020-2021 Future Design Systems (http://www.future-ds.com)

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
- Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
- 2021.08.10: Started by Ando Ki (adki(at)future-ds.com)