DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kim, Chang-Hyun | ko |
dc.contributor.author | Lee, Joon-Yong | ko |
dc.contributor.author | Lee, Ju-Jang | ko |
dc.date.accessioned | 2009-01-30T02:55:39Z | - |
dc.date.available | 2009-01-30T02:55:39Z | - |
dc.date.created | 2012-02-06 | - |
dc.date.issued | 2003 | - |
dc.identifier.citation | ARTIFICIAL LIFE AND ROBOTICS, v.7, no.3, pp.86 - 90 | - |
dc.identifier.issn | 1433-5298 | - |
dc.identifier.uri | http://hdl.handle.net/10203/8364 | - |
dc.description | This work was presented in part at the 7th International Symposium on Artificial Life and Robotics, Oita, Japan, January 16–18, 2002 | en |
dc.description.abstract | Many map-building algorithms using ultrasonic sensors have been developed for mobile robot applications. In indoor environments, the ultrasonic sensor system produces some uncertain data. To compensate for this effect, a new feature extraction method using neural networks is proposed. A new, effective representation of the target is defined, and the reflection wave data patterns are learnt using neural networks. As a consequence, the targets are classified as planes, corners, or edges, all of which frequently occur in indoor environments. We constructed our own robot system for the experiments, which were carried out to demonstrate the performance of the method. | - |
dc.language | English | - |
dc.language.iso | en_US | en |
dc.publisher | Springer-Verlag | - |
dc.title | Feature extraction method for robot map using neural networks | - |
dc.type | Article | - |
dc.type.rims | ART | - |
dc.citation.volume | 7 | - |
dc.citation.issue | 3 | - |
dc.citation.beginningpage | 86 | - |
dc.citation.endingpage | 90 | - |
dc.citation.publicationname | ARTIFICIAL LIFE AND ROBOTICS | - |
dc.embargo.liftdate | 9999-12-31 | - |
dc.embargo.terms | 9999-12-31 | - |
dc.contributor.localauthor | Lee, Ju-Jang | - |
dc.contributor.nonIdAuthor | Kim, Chang-Hyun | - |
dc.contributor.nonIdAuthor | Lee, Joon-Yong | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.