Abstract:
This paper describes our implementation of several neural networks built on a field-programmable gate array (FPGA) and used to recognize a handwritten digit dataset, the Modified National Institute of Standards and Technology (MNIST) database. We also propose a novel hardware-friendly activation function, the dynamic Rectified Linear Unit (D-ReLU), which achieves higher performance than traditional activation functions at no cost to accuracy. We built a 2-layer online-training multilayer perceptron (MLP) neural network on an FPGA with varying data widths. Reducing the data width from 8 to 4 bits reduces prediction accuracy by only 11%, while the FPGA area decreases by 41%. Compared to networks that use the sigmoid function, our proposed D-ReLU function uses 24%-41% less area with no loss in prediction accuracy. Further reducing the data width of the 3-layer networks from 8 to 4 bits decreases prediction accuracy by only 3%-5%, while area is reduced by 9%-28%. Moreover, the FPGA solutions achieve 29× faster execution despite running at a 60× lower clock rate. Thus, FPGA implementations of neural networks offer a high-performance, low-power alternative to traditional software methods, and our novel D-ReLU activation function offers additional performance and power savings.
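The abstract's exact D-ReLU formulation is not given here, but the data-width trade-off it reports can be illustrated in isolation. The sketch below is a minimal assumption-laden illustration (the `quantize` helper and its rounding scheme are hypothetical, not the paper's method): it rounds values to signed fixed-point grids of different total widths and applies a standard ReLU, showing how narrowing the data width from 8 to 4 bits coarsens the representable values.

```python
import numpy as np

def quantize(x, total_bits, frac_bits):
    """Round to the nearest representable signed fixed-point value with
    `total_bits` total bits, of which `frac_bits` are fractional.
    (Hypothetical helper for illustration; not the paper's scheme.)"""
    scale = 1 << frac_bits
    lo = -(1 << (total_bits - 1))          # most negative integer code
    hi = (1 << (total_bits - 1)) - 1       # most positive integer code
    q = np.clip(np.round(x * scale), lo, hi)
    return q / scale

def relu(x):
    """Standard ReLU; a single comparator/mux in hardware."""
    return np.maximum(x, 0.0)

w = np.array([0.8, -0.3, 0.55, -1.2])
# 8-bit fixed point (4 fractional bits): step size 1/16, finer grid
print(relu(quantize(w, 8, 4)))
# 4-bit fixed point (2 fractional bits): step size 1/4, coarser grid
print(relu(quantize(w, 4, 2)))
```

The coarser 4-bit grid rounds 0.8 to 0.75 and 0.55 to 0.5, which is the kind of representation error behind the modest accuracy drop (and large area saving) the paper reports.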
Document information
Title: Neural Networks on an FPGA and Hardware-Friendly Activation Functions
Journal: 电脑和通信(英文) (Journal of Computer and Communications)
Discipline: Engineering
Keywords: Deep Learning; D-ReLU; Dynamic ReLU; FPGA; Hardware Acceleration; Activation Function
Year, Volume (Issue): 2020, (12)
Pages: 251-277 (27 pages)
Classification code: TN9
Journal information
电脑和通信(英文) (Journal of Computer and Communications), monthly, ISSN 2327-5219
Address: No. 38 Tangxun Lake North Road, Jiangxia District, Wuhan (Optics Valley Headquarters Space)
Published articles: 783