In practical classification tasks, mislabeled instances pose significant challenges, often degrading the performance of conventional machine learning algorithms such as AdaBoost. Existing methods either discard potentially clean samples through data filtering or ignore the heterogeneity of noise across instances, limiting their effectiveness. To address this issue, we propose a novel boosting algorithm, named Noise-Aware AdaBoost (NA2daBoost), which integrates an instance-wise noise level into the weight-updating mechanism of the AdaBoost framework. Specifically, our algorithm assigns and updates instance weights according to each instance's likelihood of being mislabeled: instances identified as more likely to be noisy receive relatively smaller weight increments upon misclassification and larger weight decrements upon correct classification. Conversely, instances identified as having low noise levels receive larger weight increments when misclassified and smaller decrements when correctly classified. We define and theoretically analyze the error upper bound of the resulting strong classifier, demonstrating that our noise-aware approach iteratively reduces this bound. Experimental results on UCI benchmark datasets show that the proposed algorithm reduces error-rate degradation from 15-25% to 10-15% at noise levels of 10-40%, and achieves superior classification performance compared with conventional AdaBoost and existing noise-robust AdaBoost variants. Our weight-adjustment strategy achieves an effective trade-off between noise suppression and clean-sample retention, demonstrating stable and robust performance across all noise levels.
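The weight-update rule described above can be sketched as follows. This is a minimal illustration, not the paper's exact formula: the specific way the per-instance noise estimate `noise_level` scales the exponent is an assumption chosen to reproduce the stated behavior (damped increments and amplified decrements for noisy instances, and the reverse for clean ones).

```python
import numpy as np

def noise_aware_weight_update(weights, misclassified, noise_level, alpha):
    """Hypothetical noise-aware weight update in the spirit of NA2daBoost.

    weights       : current instance weights (sums to 1)
    misclassified : boolean mask, True where the weak learner erred
    noise_level   : per-instance estimated probability of being mislabeled, in [0, 1]
    alpha         : weak-learner coefficient from the current boosting round
    """
    # Misclassified: up-weight, but damp the increment for likely-noisy instances.
    up = np.exp(alpha * (1.0 - noise_level))
    # Correctly classified: down-weight, amplifying the decrement for noisy instances.
    down = np.exp(-alpha * (1.0 + noise_level))
    new_w = np.where(misclassified, weights * up, weights * down)
    return new_w / new_w.sum()  # renormalize to a probability distribution

# Tiny demo: a clean misclassified instance should end up with more weight
# than an equally misclassified instance flagged as likely noisy.
w = np.full(4, 0.25)
mis = np.array([True, True, False, False])
noise = np.array([0.0, 0.8, 0.0, 0.8])
new_w = noise_aware_weight_update(w, mis, noise, alpha=0.5)
```

Setting `noise_level` to zero everywhere recovers a standard AdaBoost-style exponential update, so the mechanism reduces to the conventional algorithm when no noise is detected.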