Feature Selection Toolbox FST3 Library / Documentation

demo61.cpp File Reference

Example 61: Feature selection that respects pre-specified feature weights.

#include <boost/smart_ptr.hpp>
#include <exception>
#include <iostream>
#include <cstdlib>
#include <string>
#include <vector>
#include "error.hpp"
#include "global.hpp"
#include "subset.hpp"
#include "data_intervaller.hpp"
#include "data_splitter.hpp"
#include "data_splitter_5050.hpp"
#include "data_splitter_cv.hpp"
#include "data_scaler.hpp"
#include "data_scaler_void.hpp"
#include "data_accessor_splitting_memTRN.hpp"
#include "data_accessor_splitting_memARFF.hpp"
#include "criterion_wrapper.hpp"
#include "criterion_sumofweights.hpp"
#include "criterion_negative.hpp"
#include "distance_euclid.hpp"
#include "classifier_knn.hpp"
#include "seq_step_straight.hpp"
#include "search_seq_sffs.hpp"
#include "result_tracker_regularizer.hpp"
Include dependency graph for demo61.cpp (figure omitted).

Functions

int main ()

Detailed Description

Example 61: Feature selection that respects pre-specified feature weights.


Function Documentation

int main ()

Example 61: Feature selection that respects pre-specified feature weights.

In many applications it is desirable to optimize feature subsets not only with respect to the primary objective (e.g., decision rule accuracy), but also with respect to additional factors such as known feature acquisition cost. There may be only a negligible difference in discriminatory ability among several features, while the cost of measuring their values differs considerably; in such a case it is clearly better to select the cheaper feature. In other scenarios it may even be advantageous to trade a minor degradation of classification accuracy for a substantial saving in measurement acquisition cost. For such cases FST3 implements a mechanism that makes it possible to control the feature accuracy vs. feature cost trade-off. This is achieved through result tracking and the subsequent selection of an alternative solution that minimizes the sum of pre-specified feature weights. The lower-weight solution is selected from the pool of all known solutions that differ from the best one by less than a user-specified margin (the permitted difference of the primary criterion value from the known maximum). In this example we illustrate how to add this mechanism to standard wrapper-based feature selection. Here we select features so as to maximize 3-Nearest Neighbor accuracy; then several lower-weight solutions are identified and validated for various margin values.
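To make the reselection step concrete, the following is a minimal, self-contained C++ sketch of the margin-based idea only. The names TrackedSolution, subset_weight() and select_lower_weight_solution(), as well as the weights and solution pool, are hypothetical illustrations and are not part of the FST3 API; in FST3 this functionality is provided by the result tracker and criterion classes included above (e.g., result_tracker_regularizer.hpp, criterion_sumofweights.hpp).

#include <algorithm>
#include <iostream>
#include <limits>
#include <vector>

// Hypothetical record of one tracked solution: the feature subset found during
// the search and the primary criterion value (e.g., 3-NN accuracy) achieved on it.
struct TrackedSolution {
    std::vector<int> features;   // indices of selected features
    double criterion_value;      // primary criterion (higher is better)
};

// Sum of pre-specified per-feature weights (e.g., acquisition costs) over a subset.
double subset_weight(const std::vector<int>& features, const std::vector<double>& weights)
{
    double sum = 0.0;
    for (int f : features) sum += weights[f];
    return sum;
}

// From all tracked solutions whose criterion value lies within 'margin' of the
// best known value, pick the one with the lowest total feature weight.
const TrackedSolution* select_lower_weight_solution(const std::vector<TrackedSolution>& pool,
                                                    const std::vector<double>& weights,
                                                    double margin)
{
    double best_value = -std::numeric_limits<double>::infinity();
    for (const TrackedSolution& s : pool) best_value = std::max(best_value, s.criterion_value);

    const TrackedSolution* chosen = nullptr;
    double lowest_weight = std::numeric_limits<double>::infinity();
    for (const TrackedSolution& s : pool) {
        if (s.criterion_value >= best_value - margin) {          // inside the margin
            const double w = subset_weight(s.features, weights);
            if (w < lowest_weight) { lowest_weight = w; chosen = &s; }
        }
    }
    return chosen;
}

int main()
{
    // Illustrative feature weights (acquisition costs) and a pool of solutions,
    // as a result tracker might have collected them during a wrapper search.
    const std::vector<double> weights = {1.0, 5.0, 2.0, 0.5, 1.0};
    const std::vector<TrackedSolution> pool = {
        {{0, 1, 2}, 0.94},   // best accuracy, but contains the expensive feature 1
        {{0, 2, 3}, 0.93},   // slightly worse, much cheaper
        {{3, 4},    0.90},   // cheapest subset
    };

    for (double margin : {0.0, 0.02, 0.05}) {
        const TrackedSolution* s = select_lower_weight_solution(pool, weights, margin);
        std::cout << "margin " << margin << ": accuracy " << s->criterion_value
                  << ", total weight " << subset_weight(s->features, weights) << '\n';
    }
    return 0;
}

With a zero margin the highest-accuracy (but most expensive) subset is returned; widening the margin progressively admits cheaper subsets whose accuracy is only slightly lower, which is the trade-off the demo controls via the margin parameter.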

References FST::Search_SFFS< RETURNTYPE, DIMTYPE, SUBSET, CRITERION, EVALUATOR >::search(), and FST::Search_SFFS< RETURNTYPE, DIMTYPE, SUBSET, CRITERION, EVALUATOR >::set_search_direction().
