We have developed a 3D vision-based semiautonomous assistive robot arm control method, called AROMA-V, to provide intelligent robotic manipulation assistance to individuals with impaired motor control. A working AROMA-V prototype was built on a JACO robotic manipulator combined with a low-cost, short-range 3D depth-sensing camera. To perform a manipulation task with AROMA-V, the user starts operating the robot arm using an available manual control method (e.g., joystick, touch pad, or voice recognition). During operation, when an object is detected within a set range, AROMA-V automatically stops the robotic arm and, based on the identified object characteristics, presents the user with possible manipulation options through audible text output. It then waits until the user selects one by voice command. Once the user's selection is received, AROMA-V drives the robotic arm autonomously until the chosen command is completed. In lab trials conducted with five able-bodied subjects, AROMA-V demonstrated the potential to enable users who have difficulty using a conventional control interface to perform manipulation tasks. For relatively simple tasks (e.g., manipulating a door handle, operating a light switch, and pushing an elevator button) that do not require switching between command modes, AROMA-V was slower than manual control. However, for relatively complex tasks (e.g., knob turning, ball picking, and bottle grasping) that require fine motion control, AROMA-V was significantly faster than manual control.
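The interaction flow described above (manual drive, detection-triggered stop, voice selection, autonomous completion) can be sketched as a small state machine. This is a minimal illustrative sketch, not the AROMA-V implementation: the class name, the `0.30` m detection threshold, and the object-to-options table are all assumptions for demonstration.

```python
from enum import Enum, auto

class Mode(Enum):
    MANUAL = auto()        # user drives the arm via joystick, touch pad, or voice
    AWAIT_CHOICE = auto()  # object detected: arm stopped, options announced
    AUTONOMOUS = auto()    # robot executes the chosen manipulation on its own

class AromaVController:
    """Hypothetical sketch of the AROMA-V shared-control loop."""

    DETECT_RANGE_M = 0.30  # assumed detection range; the paper does not give a value

    def __init__(self):
        self.mode = Mode.MANUAL
        self.options = []
        self.selected = None

    def on_depth_frame(self, nearest_object):
        """Handle one 3D camera frame; nearest_object is (label, distance_m) or None."""
        if self.mode is Mode.MANUAL and nearest_object is not None:
            label, dist = nearest_object
            if dist <= self.DETECT_RANGE_M:
                # Stop the arm and announce options based on object characteristics
                self.mode = Mode.AWAIT_CHOICE
                self.options = self.lookup_options(label)
                return f"Stopped. Options for {label}: {', '.join(self.options)}"
        return None

    def lookup_options(self, label):
        """Map an identified object to manipulation options (illustrative table)."""
        table = {"door_handle": ["turn", "push"],
                 "bottle": ["grasp", "pour"]}
        return table.get(label, ["approach"])

    def on_voice_command(self, command):
        """Accept the user's spoken selection; robot then takes over."""
        if self.mode is Mode.AWAIT_CHOICE and command in self.options:
            self.selected = command
            self.mode = Mode.AUTONOMOUS
            return True
        return False

    def on_task_complete(self):
        """Autonomous motion finished; return control to the user."""
        self.mode = Mode.MANUAL
        self.selected = None
```

The key design point this sketch captures is that autonomy is gated on an explicit user confirmation: the controller never enters `AUTONOMOUS` mode until a valid voice command is received in the `AWAIT_CHOICE` state.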