Uses of Class
weka.classifiers.SingleClassifierEnhancer

Packages that use SingleClassifierEnhancer
weka.classifiers   
weka.classifiers.lazy   
weka.classifiers.meta   
weka.classifiers.meta.nestedDichotomies   
weka.classifiers.mi   
 

Uses of SingleClassifierEnhancer in weka.classifiers
 

Subclasses of SingleClassifierEnhancer in weka.classifiers
 class IteratedSingleClassifierEnhancer
          Abstract utility class for handling settings common to meta classifiers that build an ensemble from a single base learner.
 class RandomizableIteratedSingleClassifierEnhancer
          Abstract utility class for handling settings common to randomizable meta classifiers that build an ensemble from a single base learner.
 class RandomizableSingleClassifierEnhancer
          Abstract utility class for handling settings common to randomizable meta classifiers that build an ensemble from a single base learner.
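The relationship among these abstract classes can be sketched in plain Java. The member names below (`m_Classifier`, `setClassifier`, `m_NumIterations`) follow Weka's naming conventions, but the `Classifier` interface and method signatures are simplified stand-ins for illustration, not Weka's actual source:

```java
// Simplified stand-in for Weka's Classifier interface (hypothetical signatures).
interface Classifier {
    void buildClassifier(double[][] data, int[] labels);
    int classifyInstance(double[] instance);
}

// Mirrors the role of weka.classifiers.SingleClassifierEnhancer:
// holds exactly one wrapped base learner.
abstract class SingleClassifierEnhancer implements Classifier {
    protected Classifier m_Classifier;              // the wrapped base classifier
    public void setClassifier(Classifier c) { m_Classifier = c; }
    public Classifier getClassifier() { return m_Classifier; }
}

// Mirrors the role of IteratedSingleClassifierEnhancer: adds a
// number-of-iterations setting and room for one copy of the base
// learner per ensemble member.
abstract class IteratedSingleClassifierEnhancer extends SingleClassifierEnhancer {
    protected int m_NumIterations = 10;
    protected Classifier[] m_Classifiers;           // one copy per iteration
    public void setNumIterations(int n) { m_NumIterations = n; }
    public int getNumIterations() { return m_NumIterations; }
}
```

The Randomizable variants listed above additionally carry a random seed option, following the same pattern of pushing shared settings into an abstract superclass.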
 

Uses of SingleClassifierEnhancer in weka.classifiers.lazy
 

Subclasses of SingleClassifierEnhancer in weka.classifiers.lazy
 class LWL
          Locally weighted learning.
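The idea behind locally weighted learning can be sketched in a few lines: weight each training point by its closeness to the query, then predict from the weighted data. The class and method names below are hypothetical, and a real LWL implementation trains a weighted base classifier rather than taking a weighted average:

```java
// Hypothetical sketch of the locally weighted learning idea behind LWL:
// points within the bandwidth get a linearly decaying weight, points
// beyond it get weight zero, and the prediction is a weighted average.
public class LocalWeightingSketch {
    static double weightedPredict(double[] xs, double[] ys,
                                  double query, double bandwidth) {
        double num = 0, den = 0;
        for (int i = 0; i < xs.length; i++) {
            double d = Math.abs(xs[i] - query) / bandwidth;
            double w = Math.max(0, 1 - d);   // linear kernel: 0 beyond bandwidth
            num += w * ys[i];
            den += w;
        }
        return den == 0 ? 0 : num / den;
    }
}
```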
 

Uses of SingleClassifierEnhancer in weka.classifiers.meta
 

Subclasses of SingleClassifierEnhancer in weka.classifiers.meta
 class AdaBoostM1
          Class for boosting a nominal class classifier using the AdaBoost M1 method.
 class AdditiveRegression
          Meta classifier that enhances the performance of a regression base classifier.
 class AttributeSelectedClassifier
          Dimensionality of training and test data is reduced by attribute selection before being passed on to a classifier.
 class Bagging
          Class for bagging a classifier to reduce variance.
 class ClassificationViaRegression
          Class for doing classification using regression methods.
 class CostSensitiveClassifier
          A metaclassifier that makes its base classifier cost-sensitive.
 class CVParameterSelection
          Class for performing parameter selection by cross-validation for any classifier.
 class Dagging
          This meta classifier creates a number of disjoint, stratified folds out of the data and feeds each chunk of data to a copy of the supplied base classifier.
 class Decorate
          DECORATE is a meta-learner for building diverse ensembles of classifiers by using specially constructed artificial training examples.
 class END
          A meta classifier for handling multi-class datasets with 2-class classifiers by building an ensemble of nested dichotomies.

For more info, check

Lin Dong, Eibe Frank, Stefan Kramer: Ensembles of Balanced Nested Dichotomies for Multi-class Problems.
 class FilteredClassifier
          Class for running an arbitrary classifier on data that has been passed through an arbitrary filter.
 class GridSearch
          Performs a grid search of parameter pairs for a classifier (Y-axis, default is LinearRegression with the "Ridge" parameter) and the PLSFilter (X-axis, "# of Components") and chooses the best pair found for the actual prediction.

The initial grid is worked on with 2-fold CV to determine the values of the parameter pairs for the selected type of evaluation (e.g., accuracy).
 class LogitBoost
          Class for performing additive logistic regression.
 class MetaCost
          This metaclassifier makes its base classifier cost-sensitive using the method specified in

Pedro Domingos: MetaCost: A general method for making classifiers cost-sensitive.
 class MultiBoostAB
          Class for boosting a classifier using the MultiBoosting method.

MultiBoosting is an extension to the highly successful AdaBoost technique for forming decision committees.
 class MultiClassClassifier
          A metaclassifier for handling multi-class datasets with 2-class classifiers.
 class OrdinalClassClassifier
          Meta classifier that allows standard classification algorithms to be applied to ordinal class problems.

For more information see:

Eibe Frank, Mark Hall: A Simple Approach to Ordinal Classification.
 class RacedIncrementalLogitBoost
          Classifier for incremental learning of large datasets by way of racing logit-boosted committees.

For more information see:

Eibe Frank, Geoffrey Holmes, Richard Kirkby, Mark Hall: Racing committees for large datasets.
 class RandomCommittee
          Class for building an ensemble of randomizable base classifiers.
 class RandomSubSpace
          This method constructs a decision tree based classifier that maintains highest accuracy on training data and improves on generalization accuracy as it grows in complexity.
 class RegressionByDiscretization
          A regression scheme that employs any classifier on a copy of the data that has the class attribute (equal-width) discretized.
 class RotationForest
          Class for constructing a Rotation Forest.
 class ThresholdSelector
          A metaclassifier that selects a mid-point threshold on the probability output by a classifier.
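Many of the meta classifiers above (Bagging, Dagging, RandomCommittee, RandomSubSpace) follow the same recipe: train copies of one base learner on resampled or transformed data, then combine their votes. A minimal bagging-style sketch, with hypothetical names and a deliberately trivial base learner:

```java
import java.util.Random;

// Hypothetical sketch of the bagging recipe: train a fresh copy of the
// base learner on a bootstrap resample each round, then majority-vote.
public class BaggingSketch {
    interface BaseLearner {
        void train(int[][] X, int[] y);
        int predict(int[] x);
    }

    // Trivial base learner: always predicts the majority class it was trained on.
    static class MajorityLearner implements BaseLearner {
        private int majority;
        public void train(int[][] X, int[] y) {
            int ones = 0;
            for (int label : y) ones += label;
            majority = (2 * ones >= y.length) ? 1 : 0;
        }
        public int predict(int[] x) { return majority; }
    }

    static int baggedPredict(int[][] X, int[] y, int[] query,
                             int rounds, long seed) {
        Random rnd = new Random(seed);
        int n = y.length;
        int votesForOne = 0;
        for (int r = 0; r < rounds; r++) {
            // Bootstrap resample: draw n rows with replacement.
            int[][] Xs = new int[n][];
            int[] ys = new int[n];
            for (int i = 0; i < n; i++) {
                int j = rnd.nextInt(n);
                Xs[i] = X[j];
                ys[i] = y[j];
            }
            BaseLearner learner = new MajorityLearner();   // fresh copy per round
            learner.train(Xs, ys);
            votesForOne += learner.predict(query);
        }
        return (2 * votesForOne >= rounds) ? 1 : 0;
    }
}
```

Boosting variants such as AdaBoostM1, LogitBoost and MultiBoostAB differ mainly in reweighting the training data between rounds instead of resampling it uniformly, and in weighting the votes.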
 

Uses of SingleClassifierEnhancer in weka.classifiers.meta.nestedDichotomies
 

Subclasses of SingleClassifierEnhancer in weka.classifiers.meta.nestedDichotomies
 class ClassBalancedND
          A meta classifier for handling multi-class datasets with 2-class classifiers by building a random class-balanced tree structure.

For more info, check

Lin Dong, Eibe Frank, Stefan Kramer: Ensembles of Balanced Nested Dichotomies for Multi-class Problems.
 class DataNearBalancedND
          A meta classifier for handling multi-class datasets with 2-class classifiers by building a random data-balanced tree structure.

For more info, check

Lin Dong, Eibe Frank, Stefan Kramer: Ensembles of Balanced Nested Dichotomies for Multi-class Problems.
 class ND
          A meta classifier for handling multi-class datasets with 2-class classifiers by building a random tree structure.

For more info, check

Lin Dong, Eibe Frank, Stefan Kramer: Ensembles of Balanced Nested Dichotomies for Multi-class Problems.
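The structure these three classes share can be sketched without any learning machinery: a nested dichotomy recursively splits the set of class labels into two non-empty subsets, and each internal node would hold a 2-class classifier deciding between its two subsets. The class below is a hypothetical illustration of that tree-building step only:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

// Hypothetical sketch of the nested-dichotomy tree behind ND and its
// balanced variants: split the class labels randomly until each leaf
// holds a single class.
public class NestedDichotomySketch {
    final List<Integer> classes;            // class labels reaching this node
    NestedDichotomySketch left, right;      // null at leaves

    NestedDichotomySketch(List<Integer> classes, Random rnd) {
        this.classes = classes;
        if (classes.size() > 1) {
            List<Integer> shuffled = new ArrayList<>(classes);
            Collections.shuffle(shuffled, rnd);
            // Random split point; both sides stay non-empty.
            int cut = 1 + rnd.nextInt(classes.size() - 1);
            left = new NestedDichotomySketch(shuffled.subList(0, cut), rnd);
            right = new NestedDichotomySketch(shuffled.subList(cut, shuffled.size()), rnd);
        }
    }

    // A k-class problem always yields exactly k leaves,
    // and therefore k-1 binary classifiers at the internal nodes.
    int countLeaves() {
        return (left == null) ? 1 : left.countLeaves() + right.countLeaves();
    }
}
```

ClassBalancedND and DataNearBalancedND constrain the split point so the two subsets are balanced by class count or by instance count, rather than cutting at a uniformly random position.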
 

Uses of SingleClassifierEnhancer in weka.classifiers.mi
 

Subclasses of SingleClassifierEnhancer in weka.classifiers.mi
 class MIBoost
          MI AdaBoost method: considers the geometric mean of the posteriors of the instances inside a bag (the arithmetic mean of the log-posteriors); the expectation for a bag is taken inside the loss function.

For more information about AdaBoost, see:

Yoav Freund, Robert E. Schapire: Experiments with a new boosting algorithm.
 class MIWrapper
          A simple wrapper method for applying standard propositional learners to multi-instance data.
 class SimpleMI
          Reduces MI data into mono-instance data.
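The reduction SimpleMI performs can be illustrated with one of its simplest forms: collapsing each bag of instances into a single propositional instance, here by averaging every attribute over the bag. The class and method names are hypothetical stand-ins:

```java
// Hypothetical sketch of the SimpleMI idea: turn a multi-instance bag
// (several feature vectors sharing one label) into one mono-instance
// feature vector by averaging each attribute over the bag.
public class BagAverager {
    static double[] toMonoInstance(double[][] bag) {
        int numAttrs = bag[0].length;
        double[] mean = new double[numAttrs];
        for (double[] instance : bag)
            for (int a = 0; a < numAttrs; a++)
                mean[a] += instance[a];
        for (int a = 0; a < numAttrs; a++)
            mean[a] /= bag.length;
        return mean;
    }
}
```

After this reduction, any standard single-instance classifier, such as the subclasses of SingleClassifierEnhancer listed throughout this page, can be applied to the transformed data.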
 



Copyright © 2012 University of Waikato, Hamilton, NZ. All Rights Reserved.