Author Search Result

[Author] Yihan ZHU (1 hit)

1-1 hit
  • Area-Efficient Binarized Neural Network Inference Accelerator Based on Time-Multiplexed XNOR Multiplier Using Loadless 4T SRAM Open Access

    Yihan ZHU  Takashi OHSAWA  

     
    PAPER-Integrated Electronics

  Publicized:
    2024/05/08
      Vol:
    E107-C No:12
      Page(s):
    545-556

A binarized neural network (BNN) inference accelerator is designed in which weights are stored in loadless four-transistor static random access memory (4T SRAM) cells. A time-multiplexed exclusive NOR (XNOR) multiplier with switched capacitors is proposed, which prevents the loadless 4T SRAM cell from being destroyed during operation. An accumulator with a current-sensing scheme is also proposed to make the multiply-accumulate (MAC) operation completely linear and read-disturb free. The BNN inference accelerator is applied to the MNIST dataset recognition problem with an accuracy of 96.2% for 500 data, and the throughput, energy efficiency, and area efficiency are confirmed by HSPICE simulation in 32 nm technology to be 15.50 TOPS, 72.17 TOPS/W, and 50.13 TOPS/mm², respectively. Compared with conventional SRAM-cell-based BNN inference accelerators scaled to 32 nm technology, the synapse cell size is reduced to less than 16% (0.235 μm²), and the cell efficiency (synapse array area / (synapse array plus peripheral circuits)) is 73.27%, which is equivalent to the state of the art among SRAM-cell-based BNN accelerators.
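As context for the abstract, the MAC operation in a BNN reduces to an XNOR between binarized activations and weights followed by a popcount; the paper realizes this in analog form with switched capacitors and current sensing, whereas the sketch below only illustrates the underlying arithmetic, not the authors' circuit. The function name and vector size are illustrative assumptions.

```python
import numpy as np

def xnor_mac(activations, weights):
    """Logical equivalent of a binarized MAC: multiplying two {-1, +1}
    values is an XNOR on their {0, 1} encodings, and the accumulation
    reduces to a popcount."""
    # Encode {-1, +1} values as bits: +1 -> 1, -1 -> 0
    a_bits = (activations > 0).astype(np.uint8)
    w_bits = (weights > 0).astype(np.uint8)

    # XNOR is 1 exactly when the bits agree (i.e., the +/-1 product is +1)
    agree = np.logical_not(np.logical_xor(a_bits, w_bits))

    # Recover the signed dot product: sum = 2 * popcount - N
    n = activations.size
    return 2 * int(agree.sum()) - n

# Usage: the result matches the ordinary dot product of the +/-1 vectors
rng = np.random.default_rng(0)
a = rng.choice([-1, 1], size=128)
w = rng.choice([-1, 1], size=128)
assert xnor_mac(a, w) == int(a @ w)
```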
