Semi-Supervised Gait Generation With Two Microfluidic Soft Sensors

Cited 21 times in Web of Science · Cited 14 times in Scopus
DC Field | Value | Language
dc.contributor.author | Kim, Dooyoung | ko
dc.contributor.author | Kim, Min | ko
dc.contributor.author | Kwon, Junghan | ko
dc.contributor.author | Park, Yong-Lae | ko
dc.contributor.author | Jo, Sungho | ko
dc.date.accessioned | 2019-05-10T02:10:02Z | -
dc.date.available | 2019-05-10T02:10:02Z | -
dc.date.created | 2019-05-10 | -
dc.date.issued | 2019-07 | -
dc.identifier.citation | IEEE Robotics and Automation Letters, v.4, no.3, pp.2501 - 2507 | -
dc.identifier.issn | 2377-3766 | -
dc.identifier.uri | http://hdl.handle.net/10203/261819 | -
dc.description.abstract | Nowadays, the use of deep learning for the calibration of soft wearable sensors has addressed typical drawbacks of microfluidic soft sensors, such as hysteresis and nonlinearity. However, previous studies have not yet resolved certain design constraints: the sensors need to be attached to the joints, and many sensors are required to track human motion. Moreover, previous methods demand an excessive amount of data for sensor calibration, which makes the system impractical. In this letter, we present a gait motion generation method that uses only two microfluidic sensors. We select appropriate sensor positions by considering the deformation patterns of the lower-limb skin and mutual interference with soft actuators. Furthermore, a semi-supervised deep learning model is proposed to reduce the size of the calibration data. We evaluated the performance of the proposed model at various walking speeds. In the experiments, the proposed method achieved higher performance with a smaller calibration dataset compared with other methods based on supervised deep learning. | -
dc.language | English | -
dc.publisher | Institute of Electrical and Electronics Engineers Inc. | -
dc.title | Semi-Supervised Gait Generation With Two Microfluidic Soft Sensors | -
dc.type | Article | -
dc.identifier.wosid | 000464913800002 | -
dc.identifier.scopusid | 2-s2.0-85064550853 | -
dc.type.rims | ART | -
dc.citation.volume | 4 | -
dc.citation.issue | 3 | -
dc.citation.beginningpage | 2501 | -
dc.citation.endingpage | 2507 | -
dc.citation.publicationname | IEEE Robotics and Automation Letters | -
dc.identifier.doi | 10.1109/LRA.2019.2907431 | -
dc.contributor.localauthor | Jo, Sungho | -
dc.contributor.nonIdAuthor | Kim, Min | -
dc.contributor.nonIdAuthor | Kwon, Junghan | -
dc.contributor.nonIdAuthor | Park, Yong-Lae | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | Wearable robots | -
dc.subject.keywordAuthor | soft robot applications | -
dc.subject.keywordAuthor | deep learning in robotics and automation | -
dc.subject.keywordPlus | DEFORMATION | -
dc.subject.keywordPlus | BODY | -
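
The abstract above describes a semi-supervised deep-learning model that maps the signals of two microfluidic soft sensors to gait motion while keeping the labeled calibration set small. The letter itself does not include code; the sketch below is only a hypothetical illustration of one common semi-supervised pattern (unsupervised autoencoder pretraining on unlabeled sensor windows, followed by supervised fine-tuning on a small labeled set). The window length, layer sizes, joint-angle output dimension, and training schedule are all assumptions for illustration, not the authors' design.

```python
# Hypothetical sketch of semi-supervised calibration for two soft-sensor channels.
# NOT the authors' architecture: network sizes, window length, and the
# autoencoder-pretraining strategy are illustrative assumptions only.
import torch
import torch.nn as nn

WINDOW = 50      # time steps per input window (assumed)
N_SENSORS = 2    # two microfluidic soft-sensor channels
N_JOINTS = 4     # e.g., hip and knee angles of both legs (assumed)

class Encoder(nn.Module):
    def __init__(self, latent_dim=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                                   # (B, WINDOW, 2) -> (B, WINDOW*2)
            nn.Linear(WINDOW * N_SENSORS, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self, latent_dim=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, WINDOW * N_SENSORS),
        )
    def forward(self, z):
        return self.net(z).view(-1, WINDOW, N_SENSORS)

encoder, decoder = Encoder(), Decoder()
regressor = nn.Linear(16, N_JOINTS)      # maps latent code -> joint angles

# Synthetic stand-ins for real data: a large unlabeled pool of walking windows
# and a small labeled calibration set with motion-capture joint angles.
unlabeled = torch.randn(2048, WINDOW, N_SENSORS)
labeled_x = torch.randn(128, WINDOW, N_SENSORS)
labeled_y = torch.randn(128, N_JOINTS)

mse = nn.MSELoss()

# Stage 1: unsupervised pretraining - learn sensor dynamics from unlabeled data.
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
for epoch in range(5):
    recon = decoder(encoder(unlabeled))
    loss = mse(recon, unlabeled)
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 2: supervised fine-tuning - the small labeled set maps latent codes to gait angles.
opt = torch.optim.Adam(list(encoder.parameters()) + list(regressor.parameters()), lr=1e-3)
for epoch in range(20):
    pred = regressor(encoder(labeled_x))
    loss = mse(pred, labeled_y)
    opt.zero_grad(); loss.backward(); opt.step()

print("calibration MSE:", loss.item())
```

The design intent mirrors the abstract's claim rather than its implementation: the unlabeled stage absorbs most of the sensor variability so that only a small calibration set is needed for the supervised mapping.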
Appears in Collection
CS-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.