Class PerceptronTrainer

All Implemented Interfaces:
Trainer, EventTrainer

public class PerceptronTrainer extends AbstractEventTrainer
Trains models using the perceptron algorithm.

Each outcome is represented as a binary perceptron classifier. This supports standard (integer) weighting as well as averaged weighting, as described in:

Discriminative Training Methods for Hidden Markov Models: Theory and Experiments with the Perceptron Algorithm. Michael Collins, EMNLP 2002.
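
The following is a minimal usage sketch built only from the methods documented on this page. It assumes the class's no-argument constructor and a DataIndexer that has already been populated with training events elsewhere; the tolerance, iteration count, and cutoff are illustrative values, not recommendations.

  import opennlp.tools.ml.model.AbstractModel;
  import opennlp.tools.ml.model.DataIndexer;
  import opennlp.tools.ml.perceptron.PerceptronTrainer;

  public class PerceptronTrainingSketch {

      // Trains a perceptron model from an already indexed event set.
      public static AbstractModel train(DataIndexer indexer) {
          PerceptronTrainer trainer = new PerceptronTrainer();

          // Optional tuning; see setTolerance(double) below. The value is an example only.
          trainer.setTolerance(0.00001);

          // 100 iterations and a cutoff of 5 are example values only.
          return trainer.trainModel(100, indexer, 5);
      }
  }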

  • Method Details

    • validate

      public void validate()
      Checks the configured parameters. If a subclass overrides this method, it should call super.validate().
      Overrides:
      validate in class AbstractEventTrainer
      Throws:
      IllegalArgumentException - Thrown if the algorithm name is not equal to PERCEPTRON_VALUE.
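      For illustration, a trainer whose algorithm parameter names the perceptron passes this check. The sketch below assumes the init(TrainingParameters, Map) method and ALGORITHM_PARAM constant inherited from AbstractTrainer, plus imports of opennlp.tools.util.TrainingParameters and java.util.HashMap:

        // Configure the algorithm name so validate() accepts the parameters.
        TrainingParameters params = new TrainingParameters();
        params.put(AbstractTrainer.ALGORITHM_PARAM, PerceptronTrainer.PERCEPTRON_VALUE);

        PerceptronTrainer trainer = new PerceptronTrainer();
        trainer.init(params, new HashMap<>());
        trainer.validate(); // a different algorithm name would raise IllegalArgumentException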
    • isSortAndMerge

      public boolean isSortAndMerge()
      Specified by:
      isSortAndMerge in class AbstractEventTrainer
    • doTrain

      public AbstractModel doTrain(DataIndexer indexer) throws IOException
      Specified by:
      doTrain in class AbstractEventTrainer
      Throws:
      IOException
    • setTolerance

      public void setTolerance(double tolerance)
      Specifies the tolerance: training stops iterating once the change in training-set accuracy falls below this value.
      Parameters:
      tolerance - The level of tolerance. Must not be negative.
      Throws:
      IllegalArgumentException - Thrown if tolerance is negative.
    • setStepSizeDecrease

      public void setStepSizeDecrease(double decrease)
      Enables and sets step size decrease. The step size is decreased every iteration by the specified value.
      Parameters:
      decrease - The step size decrease in percent. Must not be negative.
      Throws:
      IllegalArgumentException - Thrown if decrease is negative.
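      As a combined sketch of this setter and setTolerance(double) above: the numeric values are arbitrary examples, and indexer stands for a DataIndexer built elsewhere.

        PerceptronTrainer trainer = new PerceptronTrainer();
        trainer.setTolerance(0.00001);     // stop once accuracy changes by less than this
        trainer.setStepSizeDecrease(0.05); // shrink the step size by 0.05 percent each iteration
        AbstractModel model = trainer.trainModel(300, indexer, 0);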
    • setSkippedAveraging

      public void setSkippedAveraging(boolean averaging)
      Enables skipped averaging; this flag replaces the standard averaging scheme with a sampled ('skipped') one.

      If averaging is enabled and the current iteration is one of the first 20, or is a perfect square, then the summed parameters are updated.

      The reason we don't take all of them is that the parameters change less toward the end of training, so they drown out the contributions of the more volatile early iterations. The use of perfect squares allows us to sample from successively farther apart iterations.

      Parameters:
      averaging - true to use skipped averaging, false to keep standard averaging.
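      The sampling rule described above can be paraphrased as the predicate below (assuming 1-based iteration counting); it illustrates the documented behavior and is not the library's internal code.

        // Sum the parameters during the first 20 iterations,
        // or whenever the iteration number is a perfect square.
        static boolean updateSummedParameters(int iteration) {
            if (iteration <= 20) {
                return true;
            }
            int root = (int) Math.sqrt(iteration);
            return root * root == iteration;
        }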
    • trainModel

      public AbstractModel trainModel(int iterations, DataIndexer di, int cutoff)
      Trains a PerceptronModel with the given parameters.
      Parameters:
      iterations - The number of iterations to use for training.
      di - The DataIndexer used as data input.
      cutoff - The AbstractTrainer.CUTOFF_PARAM value to use for training.
      Returns:
      A valid, trained perceptron model.
    • trainModel

      public AbstractModel trainModel(int iterations, DataIndexer di, int cutoff, boolean useAverage)
      Trains a PerceptronModel with the given parameters.
      Parameters:
      iterations - The number of iterations to use for training.
      di - The DataIndexer used as data input.
      cutoff - The AbstractTrainer.CUTOFF_PARAM value to use for training.
      useAverage - Whether to use averaged weights or not. See setSkippedAveraging(boolean) for details.
      Returns:
      A valid, trained perceptron model.
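      An illustrative call: indexer again stands for a DataIndexer built elsewhere, and the iteration count and cutoff are example values.

        PerceptronTrainer trainer = new PerceptronTrainer();
        // Passing true selects averaged weights, which Collins (2002) reports
        // generalize better than the raw final weights.
        AbstractModel model = trainer.trainModel(100, indexer, 5, true);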