Deep learning processors for on-device intelligence

Abstract
Deep learning now influences not only technology itself but also everyday life. Formerly, most AI functionality was centralized in datacenters, but the primary platform for AI has recently shifted to on-device processing. With growing demand for edge, mobile, and IoT AI, conventional hardware solutions struggle because of their low energy efficiency on such power-hungry workloads. Over the past few years, dedicated DNN inference accelerators have therefore been in the spotlight. However, with the rising emphasis on privacy, personalization, and local optimization, the ability to learn is becoming the second hurdle for "on-device AI." In addition, recent hardware research has achieved faster DNN processing with low power consumption, enabling numerous applications on edge and mobile devices that were previously infeasible on such platforms. Applications with humanistic intelligence, which can take users' emotions into account, have been demonstrated, along with GANs and deep reinforcement learning (DRL), as well as AI models that use 3-dimensional data processing for higher accuracy.
Publisher
Association for Computing Machinery
Issue Date
2020-09
Language
English
Citation

30th Great Lakes Symposium on VLSI, GLSVLSI 2020, pp.1 - 8

DOI
10.1145/3386263.3409103
URI
http://hdl.handle.net/10203/307281
Appears in Collection
EE-Conference Papers (conference papers)
