Feature selection can effectively eliminate irrelevant or redundant features without changing feature semantics, thereby improving learning performance and reducing training time. In most existing rough-set-based feature selection methods, measuring the relevance of features to decisions and removing the redundancy among features are performed as separate steps, which greatly increases the time needed to search for a feature subset. To remove redundant features quickly, we define a series of feature evaluation functions that jointly consider the consistency between features and decisions and the redundancy among features, and propose a novel feature selection method based on minimal redundancy and maximal consistency. First, we define the consistency of features with respect to decisions and the redundancy among features from neighborhood information granules. Then, we propose a combined criterion to measure feature importance and design a minimal-redundancy-maximal-consistency (mRMC) feature selection algorithm. Finally, on UCI data sets, mRMC is compared with three popular neighborhood-based feature selection algorithms in terms of classification accuracy, number of selected features, and running time. The experimental results show that mRMC quickly removes redundant features and selects useful ones while maintaining classification accuracy.
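To make the greedy min-redundancy-max-consistency idea concrete, the sketch below is a rough illustration only, not the paper's exact definitions: it treats consistency as the fraction of samples whose neighborhood (under the currently selected features) is pure with respect to the decision, and approximates feature redundancy by mean absolute correlation with the already-selected features. The neighborhood radius `delta`, the Euclidean neighborhood construction, and the correlation-based redundancy are illustrative assumptions.

```python
import numpy as np

def neighborhood(Xs, i, delta):
    """Indices of samples within Euclidean radius delta of sample i (illustrative granule)."""
    dists = np.linalg.norm(Xs - Xs[i], axis=1)
    return np.where(dists <= delta)[0]

def consistency(X, y, feats, delta):
    """Fraction of samples whose neighborhood on the chosen features is decision-pure."""
    if not feats:
        return 0.0
    Xs = X[:, feats]
    pure = sum(np.all(y[neighborhood(Xs, i, delta)] == y[i]) for i in range(len(X)))
    return pure / len(X)

def redundancy(X, f, selected):
    """Mean absolute correlation between candidate f and selected features
    (a stand-in for the granule-based redundancy; constant columns are not handled)."""
    if not selected:
        return 0.0
    return float(np.mean([abs(np.corrcoef(X[:, f], X[:, s])[0, 1]) for s in selected]))

def mrmc_select(X, y, delta=0.15):
    """Greedy forward selection: add the feature maximizing
    (consistency gain) - (redundancy with current subset); stop when no gain remains."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining:
        current = consistency(X, y, selected, delta)
        gains = {f: consistency(X, y, selected + [f], delta) - current - redundancy(X, f, selected)
                 for f in remaining}
        best = max(gains, key=gains.get)
        if gains[best] <= 0:
            break
        selected.append(best)
        remaining.remove(best)
    return selected
```

With features scaled to [0, 1], `mrmc_select(X, y)` returns the indices of the retained features; a single combined score per candidate means relevance to the decision and redundancy with the current subset are evaluated in one pass rather than in separate stages.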