Feature selection is a term commonly used in data mining to describe the tools and techniques available for reducing inputs to a manageable size for processing and analysis. Feature selection implies not only cardinality reduction, which means imposing an arbitrary or predefined cutoff on the number of attributes that can be considered when building a model, but also the choice of attributes, meaning that either the analyst or the modeling tool actively selects or discards attributes based on their usefulness for analysis. Feature Selection Methods Best Practices is the must-have reference that practitioners and researchers have long been seeking, and an obvious choice for academic and research scholars. This and much more can be found in Feature Selection Methods Best Practices by Subramanian Appavu alias Balamurugan.
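To make the two ideas above concrete, here is a minimal sketch (not taken from the book) of feature selection in Python, assuming scikit-learn is available: the `k` parameter imposes the cardinality cutoff, while the scoring function performs the attribute choice by ranking each attribute's usefulness for the analysis.

```python
# Minimal illustration of feature selection as both cardinality reduction
# and attribute choice: score each attribute, keep only the k most useful.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, chi2

X, y = load_breast_cancer(return_X_y=True)

# Cardinality reduction: cap the number of attributes at k = 5.
# Attribute choice: chi-squared scores decide which attributes survive the cut.
selector = SelectKBest(score_func=chi2, k=5)
X_reduced = selector.fit_transform(X, y)

print(X.shape, "->", X_reduced.shape)  # e.g. (569, 30) -> (569, 5)
```

The dataset and scoring function here are placeholders; in practice the analyst chooses a score (or a wrapper/embedded method) suited to the data and the model being built.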