A 2.31 uJ/Inference Ultra-Low Power Always-on Event-Driven AI-IoT SoC With Switchable nvSRAM Compute-in-Memory Macro

Internet-of-Things (IoT) drives the demand for artificial intelligence (AI) system-on-chips (SoCs) for a vast range of always-on ultra-low power applications, such as human action recognition (HAR) for surveillance systems and face detection (FD) and recognition (FR) for home security. Previous AI-IoT SoCs still suffer limited system efficiency caused by the high leakage power of SRAMs, heavy external memory access (EMA), and frequent on-chip data transfer. The proposed ultra-low power RISC-V embedded AI-IoT SoC is composed of 1) a novel bit-line (BL) segmented coupled nvSRAM macro with switchable working modes - SRAM, non-volatile memory (NVM), and NVM computing-in-memory (CIM) - enabling pre-charge reuse, power gating, and local data swapping; 2) a hot-silent encoded (HSE) uDMA cluster with a 1MB multi-bank eMRAM to reduce on-chip transmission power and eliminate EMA power; 3) an event-driven wake-up unit (EDWU) for skipping unnecessary inference; and 4) a RISC-V core with a dedicated ISA extension for the switchable working modes. The proposed SoC achieves an energy efficiency of 20.3-35.5 TOPS/W on ResNet-20 (fixed-point 8-bit, FXP8) inference, a 2.82x - 3.69x efficiency improvement over previous state-of-the-art (SOTA) AI-IoT SoCs.
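The abstract's headline figures can be cross-checked with simple arithmetic: dividing the reported 20.3-35.5 TOPS/W range by the claimed 2.82x - 3.69x improvement yields the efficiency implied for the prior state-of-the-art baselines. This sketch uses only the numbers quoted above; the pairing of range endpoints with speedup endpoints is an assumption.

```python
# Back-of-the-envelope check of the reported efficiency gain, using only
# the numbers quoted in the abstract (endpoint pairing is assumed).
proposed_tops_w = (20.3, 35.5)   # reported efficiency range, ResNet-20 FXP8
improvement = (2.82, 3.69)       # reported gain over prior SOTA SoCs

# Implied efficiency of the prior state-of-the-art AI-IoT SoCs:
baseline = tuple(p / s for p, s in zip(proposed_tops_w, improvement))
print(f"Implied SOTA baseline: {baseline[0]:.2f} - {baseline[1]:.2f} TOPS/W")
# -> roughly 7.2 - 9.6 TOPS/W
```

The two ratios land in a consistent ~7-10 TOPS/W band, which is what makes the quoted improvement factors internally coherent with the quoted efficiency range.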
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Issue Date
2024-05
Language
English
Article Type
Article
Citation

IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, v.71, no.5, pp.2534 - 2538

ISSN
1549-7747
DOI
10.1109/TCSII.2024.3374885
URI
http://hdl.handle.net/10203/322723
Appears in Collection
EE-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
