An important and challenging research problem in mobile robot navigation is simultaneous localization and map building (SLAM), in which a mobile robot estimates its pose in the environment without external information while simultaneously building a map of that environment. Various methods, such as odometry measurement with landmark matching, laser range image matching, and scale-invariant feature transform (SIFT) based algorithms, have been proposed to solve this problem, but each suffers from inherent drawbacks. For example, range image matching may fall into local minima and thus sometimes fails to produce a good pose estimate, while SIFT-based algorithms cannot work when few SIFT intensity features exist in the environment. Furthermore, the map built by laser range image matching is not well suited for localization when the robot revisits previously mapped areas, and the map built from SIFT features cannot be used for obstacle avoidance or next-view generation. To address these problems, in this paper we first propose a method that uses a laser structured light sensor for mobile robot SLAM, based on an improved trimmed iterative closest point (ICP) algorithm. We then propose a sensor fusion method that exploits both laser range information and SIFT feature information for SLAM. The proposed methods are tested and evaluated through a series of experiments.
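To make the trimmed-ICP idea concrete, the following is a minimal sketch of one common formulation for 2-D laser scans: at each iteration, every source point is matched to its nearest target point, only the best fraction of correspondences (the overlap ratio) is kept, and an SVD-based rigid transform is fitted to the surviving pairs. The overlap ratio `trim_ratio`, the brute-force nearest-neighbor search, and the fixed iteration count are illustrative assumptions, not details taken from the paper's improved algorithm.

```python
# Hypothetical sketch of trimmed ICP for 2-D point sets; the trim ratio,
# brute-force matching, and fixed iteration count are assumptions.
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src -> dst (SVD method)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def trimmed_icp(src, dst, trim_ratio=0.8, iters=30):
    """Align `src` to `dst`, keeping only the best trim_ratio of correspondences."""
    cur = src.copy()
    R_total, t_total = np.eye(2), np.zeros(2)
    n_keep = max(3, int(trim_ratio * len(src)))
    for _ in range(iters):
        # brute-force nearest neighbor in dst for each current source point
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        nn = d2.argmin(axis=1)
        dmin = d2[np.arange(len(cur)), nn]
        keep = np.argsort(dmin)[:n_keep]   # "trim": discard the worst matches
        R, t = best_rigid_transform(cur[keep], dst[nn[keep]])
        cur = cur @ R.T + t
        # compose the incremental transform into the running total
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

The trimming step is what gives the method robustness to partial overlap between scans: outlier correspondences (points visible in only one scan) tend to have the largest nearest-neighbor distances and are dropped before the transform is fitted.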