public class NGramLanguageModel extends NGramModel implements LanguageModel
LanguageModel based on a NGramModel, using Stupid Backoff to get the probabilities of the ngrams.

| Constructor and Description | 
|---|
| NGramLanguageModel() | 
| NGramLanguageModel(InputStream in) | 
| NGramLanguageModel(InputStream in, int n) | 
| NGramLanguageModel(int n) | 
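A minimal construction sketch for the constructors listed above, assuming the class is `opennlp.tools.languagemodel.NGramLanguageModel` from opennlp-tools; the file name is hypothetical and only illustrates the stream-based constructors:

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

import opennlp.tools.languagemodel.NGramLanguageModel;

public class ConstructExample {
    public static void main(String[] args) throws IOException {
        // Empty model with the default ngram size.
        NGramLanguageModel defaultLm = new NGramLanguageModel();

        // Empty model with an explicit ngram size of 3.
        NGramLanguageModel trigramLm = new NGramLanguageModel(3);

        // Model initialized from a previously serialized ngram model
        // ("ngrams.bin" is a hypothetical file name).
        try (InputStream in = new FileInputStream("ngrams.bin")) {
            NGramLanguageModel loadedLm = new NGramLanguageModel(in, 3);
        }
    }
}
```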
| Modifier and Type | Method and Description | 
|---|---|
| void | add(String... tokens) | 
| double | calculateProbability(String... tokens): Calculate the probability of a series of tokens (e.g. a sentence). | 
| double | calculateProbability(StringList tokens): Calculate the probability of a series of tokens (e.g. a sentence). | 
| String[] | predictNextTokens(String... tokens): Predict the most probable output sequence of tokens, given an input sequence of tokens. | 
| StringList | predictNextTokens(StringList tokens): Predict the most probable output sequence of tokens, given an input sequence of tokens. | 
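A hedged end-to-end sketch of the methods summarized above. Per the class description, probabilities come from Stupid Backoff (an unseen higher-order ngram backs off to a shorter one with a fixed penalty, 0.4 in the original Brants et al. formulation), so the printed values depend entirely on what was added; the training token sequences below are made up:

```java
import java.util.Arrays;

import opennlp.tools.languagemodel.NGramLanguageModel;

public class LmWorkflowExample {
    public static void main(String[] args) {
        // Trigram model populated with two made-up token sequences.
        NGramLanguageModel lm = new NGramLanguageModel(3);
        lm.add("the", "cat", "sat", "on", "the", "mat");
        lm.add("the", "dog", "sat", "on", "the", "rug");

        // Probability of a token sequence under the model (Stupid Backoff).
        double p = lm.calculateProbability("the", "cat", "sat");
        System.out.println("P(the cat sat) = " + p);

        // Most probable continuation of a token sequence.
        String[] next = lm.predictNextTokens("sat", "on", "the");
        System.out.println("next tokens: " + Arrays.toString(next));
    }
}
```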
Methods inherited from class NGramModel: add, add, add, contains, cutoff, equals, getCount, hashCode, iterator, numberOfGrams, remove, serialize, setCount, size, toDictionary, toDictionary, toString

Methods inherited from interface java.lang.Iterable: forEach, spliterator

public NGramLanguageModel()
public NGramLanguageModel(int n)
public NGramLanguageModel(InputStream in) throws IOException
Throws: IOException

public NGramLanguageModel(InputStream in, int n) throws IOException
Throws: IOException

public void add(String... tokens)
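Since serialize is inherited from NGramModel (see the inherited-methods list above), a model populated with add can be written out and read back through the InputStream constructors documented here. A sketch assuming serialize takes an OutputStream; the file name is hypothetical:

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import opennlp.tools.languagemodel.NGramLanguageModel;

public class PersistenceExample {
    public static void main(String[] args) throws IOException {
        NGramLanguageModel lm = new NGramLanguageModel(3);
        lm.add("hello", "world");

        // Write the collected ngram counts out ("lm.bin" is hypothetical).
        try (OutputStream out = new FileOutputStream("lm.bin")) {
            lm.serialize(out);
        }

        // Reload the counts into a new model with the same ngram size.
        try (InputStream in = new FileInputStream("lm.bin")) {
            NGramLanguageModel reloaded = new NGramLanguageModel(in, 3);
            System.out.println(reloaded.calculateProbability("hello", "world"));
        }
    }
}
```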
public double calculateProbability(StringList tokens)
Description copied from interface: LanguageModel
Specified by: calculateProbability in interface LanguageModel
Parameters: tokens - the text tokens to calculate the probability for

public double calculateProbability(String... tokens)
Description copied from interface: LanguageModel
Specified by: calculateProbability in interface LanguageModel
Parameters: tokens - the text tokens to calculate the probability for

public StringList predictNextTokens(StringList tokens)
Description copied from interface: LanguageModel
Specified by: predictNextTokens in interface LanguageModel
Parameters: tokens - a sequence of tokens

public String[] predictNextTokens(String... tokens)
Description copied from interface: LanguageModel
Specified by: predictNextTokens in interface LanguageModel
Parameters: tokens - a sequence of tokens

Copyright © 2018 The Apache Software Foundation. All rights reserved.