AI-to-Human Actuation: Boosting Unmodified AI's Robustness by Proactively Inducing Favorable Human Sensing Conditions

Abstract
Imagine a near-future smart home. Home-embedded visual AI sensors continuously monitor the resident, inferring her activities and internal states to enable higher-level services. Because these embedded sensors passively observe a freely moving person, high-confidence inferences occur only by chance: an inference's confidence depends heavily on how closely the person's momentary conditions match those favored by the AI models, e.g., front-facing or unobstructed views. We envision a new strategy, AI-to-Human Actuation (AHA), which empowers sensory AIs with proactive actuation so that they can induce the person's conditions to be more favorable to the AIs. In this light, we explore the initial feasibility and efficacy of AHA in the context of home-embedded visual AIs. We build a taxonomy of actuations that could be issued to home residents to benefit visual AIs, and we deploy AHA in an actual home rich in sensors and interactive devices. With 20 participants, we comprehensively study their experiences with proactive actuation blended into their usual home routines. We also demonstrate that actuation-empowered AIs yield substantially improved inferences over a passive-sensing baseline. This paper takes an initial step towards interweaving human-targeted AIs with proactive actuation to create more opportunities for high-confidence inferences without making the model more sophisticated, thereby improving robustness against unfavorable conditions.
Publisher
Association for Computing Machinery
Issue Date
2023-03
Language
English
Article Type
Article
Citation

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), vol. 7, no. 1

ISSN
2474-9567
DOI
10.1145/3580812
URI
http://hdl.handle.net/10203/306446
Appears in Collection
RIMS Journal Papers
