In "Itchy Nose" we proposed a sensing technique for detecting finger movements on the nose to support subtle and discreet interaction. It uses electrooculography (EOG) sensors embedded in the frame of a pair of eyeglasses to gather data and a machine-learning technique to classify different gestures. Here we further propose an automated training and visualization tool for its classifier. The tool guides the user to perform each gesture with the proper timing and records the sensor data. It automatically extracts the ground-truth segments and trains a machine-learning classifier on them. With this tool, we can quickly create a trained classifier that is personalized for the user and test various gestures.
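The pipeline the abstract describes (cue the user, record the signal, cut ground-truth segments around the cue timestamps, train a per-user classifier) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the window size, the mean/std/peak-to-peak features, and the nearest-centroid classifier are all assumptions standing in for whatever the actual tool uses.

```python
import numpy as np


def segment_around_cues(signal, cue_indices, half_window):
    """Cut fixed-length windows of sensor samples centred on each cue.

    Because the tool cues the user when to gesture, the gesture is
    assumed to lie near each cue timestamp, so the surrounding window
    can be taken automatically as ground truth (an assumption here).
    """
    return np.stack([signal[i - half_window:i + half_window]
                     for i in cue_indices])


def features(windows):
    """Illustrative per-window features: mean, std, peak-to-peak."""
    return np.stack([windows.mean(axis=1),
                     windows.std(axis=1),
                     np.ptp(windows, axis=1)], axis=1)


class CentroidClassifier:
    """Tiny stand-in for the trained gesture classifier: each class is
    represented by its mean feature vector; prediction picks the
    nearest centroid."""

    def fit(self, X, y):
        self.labels_ = np.unique(y)
        self.centroids_ = np.stack([X[y == c].mean(axis=0)
                                    for c in self.labels_])
        return self

    def predict(self, X):
        # Euclidean distance from each sample to each class centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None], axis=2)
        return self.labels_[d.argmin(axis=1)]
```

A usage sketch with synthetic "EOG-like" data (two fake gesture shapes, since real recordings are not available here):

```python
rng = np.random.default_rng(0)
signal = rng.normal(0.0, 0.1, 1000)          # baseline sensor noise
for i in (100, 300):                          # hypothetical "flick": sharp spike
    signal[i - 5:i + 5] += 5.0
for i in (600, 800):                          # hypothetical "rub": oscillation
    signal[i - 20:i + 20] += np.sin(np.linspace(0, 8 * np.pi, 40))

cues = [100, 300, 600, 800]                   # timestamps at which the tool cued the user
y = np.array(["flick", "flick", "rub", "rub"])
X = features(segment_around_cues(signal, cues, half_window=50))
clf = CentroidClassifier().fit(X, y)          # personalized model from one short session
```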