An Energy-Efficient Stochastic Computational Deep Belief Network
Title | An Energy-Efficient Stochastic Computational Deep Belief Network
Publication Type | Conference Paper
Year of Publication | 2018
Authors | Liu, Y., Wang, Y., Lombardi, F., Han, J. |
Conference Name | 2018 Design, Automation & Test in Europe Conference & Exhibition (DATE)
Keywords | approximate SC activation unit, belief networks, Biological neural networks, Cognitive Computing, Collaboration, composability, computation speed, Correlation, deep belief network, deep neural networks, DNNs, effective machine learning models, Electronic mail, energy consumption, energy-efficient deep belief network, energy-efficient stochastic computational deep belief network, fixed point arithmetic, fixed-point implementation, floating point arithmetic, floating-point design, Hardware, high energy consumption, Human Behavior, learning (artificial intelligence), Metrics, neural nets, Neurons, nonlinearly separable patterns, pattern classification, policy-based governance, pubcrawl, random number generation, random number generators, rectifier linear unit, resilience, Resiliency, RNGs, SC-DBN design, Scalability, Stochastic computing, Stochastic processes |
Abstract | Deep neural networks (DNNs) are effective machine learning models for solving a large class of recognition problems, including the classification of nonlinearly separable patterns. The applications of DNNs are, however, limited by the large size and high energy consumption of the networks. Recently, stochastic computation (SC) has been considered for implementing DNNs to reduce the hardware cost. However, it requires a large number of random number generators (RNGs), which lowers the energy efficiency of the network. To overcome these limitations, we propose the design of an energy-efficient deep belief network (DBN) based on stochastic computation. An approximate SC activation unit (A-SCAU) is designed to implement different types of activation functions in the neurons. The A-SCAU is immune to signal correlations, so the RNGs can be shared among all neurons in the same layer with no accuracy loss. The area and energy of the proposed design are 5.27% and 3.31% (or 26.55% and 29.89%) of a 32-bit floating-point (or an 8-bit fixed-point) implementation. The proposed SC-DBN design achieves a higher classification accuracy than the fixed-point implementation, and its accuracy is only 0.12% lower than that of the floating-point design at a similar computation speed, with significantly lower energy consumption.
URL | https://ieeexplore.ieee.org/document/8342191 |
DOI | 10.23919/DATE.2018.8342191 |
Citation Key | liu_energy-efficient_2018 |
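As background for the correlation argument in the abstract above, the following is a minimal Python sketch (an illustration, not code from the paper) of unipolar stochastic computing. A value in [0, 1] is encoded as a random bitstream whose fraction of 1s approximates the value, and multiplication reduces to a bitwise AND of two independent streams. Sharing a single RNG correlates the streams, so the AND degenerates to a minimum instead of a product; a correlation-immune activation unit such as the paper's A-SCAU is what makes RNG sharing across a layer safe.

```python
import random

def encode(value, n_bits, rng):
    """Encode a probability in [0, 1] as a unipolar stochastic bitstream:
    each bit is 1 with probability `value`, so the mean of the stream
    approximates the encoded number."""
    return [1 if rng.random() < value else 0 for _ in range(n_bits)]

def decode(stream):
    """Recover the encoded value as the fraction of 1s in the stream."""
    return sum(stream) / len(stream)

def multiply(a, b):
    """Multiply two stochastic numbers with a bitwise AND; this is correct
    only when the two streams are statistically independent."""
    return [x & y for x, y in zip(a, b)]

N = 10_000

# Independent RNGs: AND-gate multiplication works (0.8 * 0.5 = 0.4).
rng_a, rng_b = random.Random(1), random.Random(2)
a = encode(0.8, N, rng_a)
b = encode(0.5, N, rng_b)
print(decode(multiply(a, b)))  # ~0.4

# One shared RNG: both streams see the same random sequence, so the AND
# computes min(0.8, 0.5) instead of the product -- the accuracy loss that
# naive RNG sharing causes in conventional SC designs.
shared = random.Random(3)
a_c = encode(0.8, N, shared)
shared.seed(3)  # replay the same random sequence for the second stream
b_c = encode(0.5, N, shared)
print(decode(multiply(a_c, b_c)))  # ~0.5, not 0.4
```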