
5 Jun 2008, 3:35 p.m.
Henning Thielemann wrote:
"Markov chain" means that you have a sequence of random experiments, where the outcome of each experiment depends only on a fixed number (the level, or order) of experiments immediately before the current one.
Right. So a "Markov chain" is actually a technical way of describing something that's intuitively pretty obvious? (E.g., PPM compression works by assuming that the input data is some sort of Markov chain with as-yet unknown transition probabilities.)
If the level is too high, you will just reproduce the training text.
Yeah, I can see that happening! ;-) The key, I think, is for the training text to be much larger than what you want to produce, so that most contexts have more than one possible successor...
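The idea being discussed can be sketched in a few lines of Python. This is an illustrative word-level text generator, not anything from the thread itself; the names `build_chain` and `generate` are made up for the example. Note how a high `order` with a small training text leaves each context only one successor, which is exactly the "just reproduce the training text" failure mode mentioned above.

```python
import random
from collections import defaultdict

def build_chain(words, order=1):
    """Map each length-`order` context to the list of words that follow it.
    Repeated successors stay in the list, so random.choice picks them with
    their empirical frequency -- the 'unknown transition probabilities'
    are estimated simply by counting."""
    chain = defaultdict(list)
    for i in range(len(words) - order):
        context = tuple(words[i:i + order])
        chain[context].append(words[i + order])
    return chain

def generate(chain, length, seed=0):
    """Walk the chain, picking a random successor of the current context."""
    rng = random.Random(seed)
    order = len(next(iter(chain)))
    out = list(rng.choice(list(chain)))
    while len(out) < length:
        successors = chain.get(tuple(out[-order:]))
        if not successors:       # dead end: context never seen in training
            break
        out.append(rng.choice(successors))
    return out

training = "the cat sat on the mat the cat ate the rat".split()
chain = build_chain(training, order=1)
print(" ".join(generate(chain, 8, seed=42)))
```

With `order=1` the context `("the",)` has four possible successors here, so the output can wander away from the training text; at `order=3` every context in this tiny corpus has exactly one successor, and generation just replays the input.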