Automated Machine Learning (AutoML) has become an inevitable trend, providing automatic and suitable solutions to AI tasks without requiring extensive effort from experts. Neural Architecture Search (NAS), a subfield of AutoML, has produced automatically designed models that solve fundamental problems in computer vision such as image recognition and object detection. NAS with differentiable search strategies has significantly reduced the GPU time spent on the search computation. In this paper, we present an effective algorithm that expands the search space by selecting operation candidates from the initial set in different ways and exploring the resulting spaces concurrently. The extended search space gives NAS more opportunities to find good architectures simultaneously, since the group of search spaces runs in overlapping time periods instead of sequentially. Our approach, called Accelerated NAS, shortens search time by 1.8× compared with previous works. In addition, Accelerated NAS generates promising neural architectures with comparable performance and low inference time.
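
The core idea, deriving several search spaces from one initial operation set and searching them in overlapping time periods, can be pictured with a minimal sketch. Everything below is a hypothetical illustration, not the paper's implementation: the operation list, the helpers derive_subspaces and search_one_space, and the random score standing in for a real differentiable NAS run are all assumptions made for this example.

```python
# Hypothetical sketch of searching multiple derived spaces concurrently.
import itertools
import random
from concurrent.futures import ProcessPoolExecutor

# Assumed initial operation set (DARTS-style names, for illustration only).
INITIAL_OPS = ["skip", "sep_conv_3x3", "sep_conv_5x5",
               "dil_conv_3x3", "max_pool_3x3", "avg_pool_3x3"]

def derive_subspaces(ops, size):
    """Build candidate search spaces by choosing operation subsets
    from the initial set in different ways (one per combination)."""
    return [list(c) for c in itertools.combinations(ops, size)]

def search_one_space(subspace, trials=50):
    """Stand-in for a real NAS run on one derived space: sample
    architectures and keep the best placeholder score."""
    rng = random.Random(len(subspace) * hash(tuple(subspace)) % (2**32))
    best = None
    for _ in range(trials):
        arch = tuple(rng.choice(subspace) for _ in range(4))  # 4 edges
        score = rng.random()  # placeholder for validation accuracy
        if best is None or score > best[1]:
            best = (arch, score)
    return best

if __name__ == "__main__":
    spaces = derive_subspaces(INITIAL_OPS, size=4)
    # Overlapping execution: each derived search space gets its own
    # worker, instead of being searched one after another.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(search_one_space, spaces))
    arch, score = max(results, key=lambda r: r[1])
    print(f"best architecture {arch} with mock score {score:.3f}")
```

In this toy version the speed-up comes purely from running the derived spaces in parallel; in the paper's setting each worker would run a differentiable search, and the best architecture across all spaces would be returned.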