NeurIPS 2019
Sun Dec 8th through Sat the 14th, 2019 at Vancouver Convention Center
Paper ID: 4567
Title: The Synthesis of XNOR Recurrent Neural Networks with Stochastic Logic


This paper introduces an interesting stochastic finite-state-machine-based method to approximate nonlinear activation functions, including the hyperbolic tangent and sigmoid. A fully binary LSTM model (both weights and hidden states are binary) is constructed, in which XNOR operations perform all the multiplications in the gate and state computations. Empirical results show that the proposed binary LSTM can dramatically reduce computational cost without sacrificing latency or accuracy compared with existing methods. In the rebuttal, the reviewers' concerns were carefully addressed, e.g., by adding an FPGA-based implementation. However, some points still lack sufficient detail and discussion, in particular the cost of stochastic computing and the cost of memory movement. In addition, the paper only considers a 1-layer LSTM, and it is unclear how the proposed method generalizes to other architectures. We would be happy to see these concerns addressed in the revised version of this paper.
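To illustrate why XNOR operations can replace multiplications when both operands are binary (this is a generic sketch of the standard XNOR-network trick, not the authors' implementation): for values encoded in {-1, +1}, the product of two entries is +1 exactly when their signs match, so a dot product reduces to an XNOR followed by a popcount on the {0, 1} bit encoding.

```python
import numpy as np

def binarize(x):
    """Map real values to {-1, +1} by sign (zeros map to +1)."""
    return np.where(x >= 0, 1, -1)

def xnor_dot(a_bits, b_bits):
    """Dot product of two {-1, +1} vectors via XNOR + popcount.

    Encode -1 -> 0 and +1 -> 1; XNOR then marks positions where the
    signs agree. With m matching positions out of n, the dot product
    is m - (n - m) = 2*m - n.
    """
    a = (a_bits > 0).astype(np.uint8)
    b = (b_bits > 0).astype(np.uint8)
    matches = int(np.sum(~(a ^ b) & 1))  # XNOR, then popcount
    n = len(a_bits)
    return 2 * matches - n

# Sanity check: the XNOR form agrees with an ordinary dot product.
rng = np.random.default_rng(0)
w = binarize(rng.standard_normal(8))  # stand-ins for binary weights
h = binarize(rng.standard_normal(8))  # and a binary hidden state
assert xnor_dot(w, h) == int(np.dot(w, h))
```

In hardware, the XNOR and popcount operate on packed bit vectors, which is what makes a fully binary LSTM attractive for FPGA implementations such as the one added in the rebuttal.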