We present an event reconstruction method for the muon-to-electron conversion search in the COMET Phase-I experiment and estimate its single event sensitivity. The signal of the experiment is a 104.97 MeV electron from muon-to-electron conversion, which would be evidence of charged lepton flavor violation. The dominant background is an electron from muon decay in orbit (DIO), in which two neutrinos are also emitted and carry kinetic energy away. To reject background electrons whose energy is close to the signal energy, the electron momentum must be measured precisely by reconstructing the track. While propagating through the magnetic field, both signal and background electrons can make multiple helix turns, so each hit must be assigned to its helix turn segment. For this helix turn classification, (1) we prepare a set of seeds by scanning the possible ranges of the initial position and momentum for a given helix turn segment and find the optimal seed using a minimization model, and (2) we assign hits to the segment if their distance of closest approach (DCA) to the extrapolated track is less than a cutoff value. This method works for electrons in the 90–104.97 MeV range with an energy resolution of 200 keV, while achieving a tracking efficiency of about 60% for signal electrons. Because the reconstruction is computationally expensive, we parallelized the algorithms on a GPU, accelerating the event reconstruction by a factor of more than 100 compared to a single CPU core. The single event sensitivity of the muon-to-electron conversion for 150 days of data acquisition was estimated to be 4.4×10⁻¹⁵.
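The DCA-based hit assignment in step (2) can be illustrated with a minimal sketch. This is not the experiment's actual implementation: it assumes a simple circular track model in the transverse plane, and the function and parameter names are hypothetical.

```python
import math

def classify_hits(hits, center, radius, cutoff):
    """Assign hits to one helix turn segment by their DCA to a circular track model.

    hits:    list of (x, y) hit positions in the transverse plane (hypothetical input)
    center:  (x, y) of the fitted circle for the extrapolated track
    radius:  fitted circle radius
    cutoff:  DCA threshold below which a hit is assigned to the segment
    """
    assigned = []
    for x, y in hits:
        # For a circular track, the DCA of a point is |distance to center - radius|.
        dca = abs(math.hypot(x - center[0], y - center[1]) - radius)
        if dca < cutoff:
            assigned.append((x, y))
    return assigned

# Toy example: a track circle of radius 5 centered at the origin, cutoff 0.2.
hits = [(5.0, 0.0), (0.0, 5.1), (3.0, 3.0), (-4.9, 0.0)]
print(classify_hits(hits, (0.0, 0.0), 5.0, 0.2))
# The hit at (3.0, 3.0) lies well inside the circle and is rejected.
```

In the full reconstruction this test is applied per helix turn segment against the track extrapolated from the optimal seed, which is where the GPU parallelization over seeds and hits pays off.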