Recently, many classification methods have been widely applied to real-life data, and the k-nearest neighbor (KNN) classifier is among the most popular. Although KNN is simple and widely used, it has two well-known problems: its classification accuracy is often lower than that of nonlinear classifiers such as the support vector machine (SVM), and its performance is sensitive to the choice of the parameter k. To improve classification accuracy and to reduce the sensitivity to k, we propose a novel modified KNN method, the distance-based k-nearest neighbor voting classifier (DBKNNV). In our study, the classification accuracy and the sensitivity to k of DBKNNV are compared with those of KNN and two modified KNN methods. The experiments show that DBKNNV often achieves higher and more stable classification accuracy. Moreover, DBKNNV is not sensitive to the size of k: whereas the accuracy of KNN and the two modified KNN methods varies with different settings of k, the accuracy of DBKNNV remains stable. Furthermore, the experiments also show that the classification accuracies of DBKNNV and SVM are comparable.
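The abstract does not specify how DBKNNV combines distance information with neighbor voting. As a point of reference only, the minimal sketch below illustrates one common way of doing so, inverse-distance-weighted k-NN voting; the function name, the inverse-distance weighting scheme, and the toy data are illustrative assumptions and should not be read as the authors' definition of DBKNNV.

```python
# Minimal sketch of distance-weighted k-NN voting (illustrative only;
# the exact DBKNNV weighting scheme is not given in this abstract).
import numpy as np
from collections import defaultdict

def distance_weighted_knn_predict(X_train, y_train, x_query, k=5, eps=1e-12):
    """Predict the label of x_query by letting each of the k nearest
    neighbors cast a vote weighted by the inverse of its distance."""
    # Euclidean distances from the query point to every training sample.
    dists = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k closest training samples.
    nn_idx = np.argsort(dists)[:k]
    # Accumulate inverse-distance weights per class label.
    votes = defaultdict(float)
    for i in nn_idx:
        votes[y_train[i]] += 1.0 / (dists[i] + eps)  # eps avoids division by zero
    # Return the label with the largest accumulated weight.
    return max(votes, key=votes.get)

# Usage example with toy two-class data.
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(distance_weighted_knn_predict(X_train, y_train, np.array([0.2, 0.1]), k=3))  # -> 0
```

Weighting votes by inverse distance lets closer neighbors dominate the decision, which is one plausible reason a distance-based voting rule can be less sensitive to the exact choice of k than plain majority voting.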