AI-to-Human Actuation: Boosting Unmodified AI's Robustness by Proactively Inducing Favorable Human Sensing Conditions

DC Field | Value | Language
dc.contributor.author | Cho, Sungjae | ko
dc.contributor.author | Kim, Yoonsu | ko
dc.contributor.author | Jang, Jaewoong | ko
dc.contributor.author | Hwang, Inseok | ko
dc.date.accessioned | 2023-05-03T01:01:20Z | -
dc.date.available | 2023-05-03T01:01:20Z | -
dc.date.created | 2023-05-02 | -
dc.date.issued | 2023-03 | -
dc.identifier.citation | PROCEEDINGS OF THE ACM ON INTERACTIVE MOBILE WEARABLE AND UBIQUITOUS TECHNOLOGIES-IMWUT, v.7, no.1 | -
dc.identifier.issn | 2474-9567 | -
dc.identifier.uri | http://hdl.handle.net/10203/306446 | -
dc.description.abstract | Imagine a near-future smart home. Home-embedded visual AI sensors continuously monitor the resident, inferring her activities and internal states that enable higher-level services. Here, as home-embedded sensors passively monitor a free person, good inferences happen randomly. The inferences' confidence highly depends on how congruent her momentary conditions are to the conditions favored by the AI models, e.g., front-facing or unobstructed. We envision new strategies of AI-to-Human Actuation (AHA) that empower the sensory AIs with proactive actuation so that they induce the person's conditions to be more favorable to the AIs. In this light, we explore the initial feasibility and efficacy of AHA in the context of home-embedded visual AIs. We build a taxonomy of actuations that could be issued to home residents to benefit visual AIs. We deploy AHA in an actual home rich in sensors and interactive devices. With 20 participants, we comprehensively study their experiences with proactive actuation blended with their usual home routines. We also demonstrate the substantially improved inferences of the actuation-empowered AIs over the passive sensing baseline. This paper sets forth an initial step towards interweaving human-targeted AIs and proactive actuation to yield more chances for high-confidence inferences without sophisticating the model, in order to improve robustness against unfavorable conditions. | -
dc.language | English | -
dc.publisher | ASSOC COMPUTING MACHINERY | -
dc.title | AI-to-Human Actuation: Boosting Unmodified AI's Robustness by Proactively Inducing Favorable Human Sensing Conditions | -
dc.type | Article | -
dc.identifier.scopusid | 2-s2.0-85152484478 | -
dc.type.rims | ART | -
dc.citation.volume | 7 | -
dc.citation.issue | 1 | -
dc.citation.publicationname | PROCEEDINGS OF THE ACM ON INTERACTIVE MOBILE WEARABLE AND UBIQUITOUS TECHNOLOGIES-IMWUT | -
dc.identifier.doi | 10.1145/3580812 | -
dc.contributor.localauthor | Kim, Yoonsu | -
dc.contributor.nonIdAuthor | Cho, Sungjae | -
dc.contributor.nonIdAuthor | Jang, Jaewoong | -
dc.contributor.nonIdAuthor | Hwang, Inseok | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | Human-AI Interaction | -
dc.subject.keywordAuthor | IoT | -
dc.subject.keywordAuthor | Actuation | -
dc.subject.keywordPlus | CHANGE DEAFNESS | -
dc.subject.keywordPlus | DETECT CHANGES | -
Appears in Collection: RIMS Journal Papers