Using Markov chains (or something similar) to produce an IRC bot

遥遥无期 2021-01-30 15:09

I tried Google and found little that I could understand.

I understand Markov chains at a very basic level: it's a mathematical model in which the next state depends only on the current state, not on the full history of inputs.

3 Answers
  • 2021-01-30 15:12

    Yes, a Markov chain is a finite-state machine with probabilistic state transitions. To generate random text with a simple, first-order Markov chain:

    1. Collect bigram (adjacent word pair) statistics from a corpus (collection of text).
    2. Build a Markov chain with one state per word. Reserve a special state for end-of-text.
    3. The probability of jumping from state/word x to y is the probability of word y immediately following x, estimated from relative bigram frequencies in the training corpus.
    4. Start with a random word x (perhaps determined by how often that word occurs as the first word of a sentence in the corpus). Then pick a state/word y to jump to randomly, taking into account the probability of y following x (the state transition probability). Repeat until you hit end-of-text.
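    The four steps above can be sketched as a minimal, illustrative Python script. The names `train` and `generate` and the list-of-followers representation are assumptions of this sketch, not part of the answer:

```python
import random
from collections import defaultdict

END = None  # sentinel state for end-of-text (step 2)

def train(corpus):
    """Collect bigram statistics: chain[x] lists every word observed after x."""
    chain = defaultdict(list)
    starts = []
    for text in corpus:
        words = text.split()
        if not words:
            continue
        starts.append(words[0])  # candidate first words (step 4)
        for x, y in zip(words, words[1:]):
            chain[x].append(y)   # duplicates act as frequency weights (step 3)
        chain[words[-1]].append(END)
    return chain, starts

def generate(chain, starts):
    """Random walk through the chain until the end-of-text state is reached."""
    word = random.choice(starts)
    out = [word]
    while True:
        word = random.choice(chain[word])  # transition weighted by bigram frequency
        if word is END:
            return " ".join(out)
        out.append(word)
```

    Because `random.choice` draws uniformly from a list that may contain duplicates, the transition probabilities automatically match the relative bigram frequencies of step 3.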

    If you want to get something semi-intelligent out of this, then your best shot is to train it on lots of carefully collected texts. The "lots" part makes it produce proper sentences (or plausible IRC speak) with high probability; the "carefully collected" part means you control what it talks about. Introducing higher-order Markov chains also helps in both areas, but requires more storage for the necessary statistics. You may also look into things like statistical smoothing.
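    A higher-order chain simply keys the transitions on the previous k words instead of one. A sketch of a second-order (word-pair) variant, with hypothetical names `train_order2` and `generate_order2`:

```python
import random
from collections import defaultdict

def train_order2(text):
    """Second-order chain: each follower is keyed by the two preceding words."""
    words = text.split()
    chain = defaultdict(list)
    for w1, w2, w3 in zip(words, words[1:], words[2:]):
        chain[(w1, w2)].append(w3)  # duplicates act as frequency weights
    return chain

def generate_order2(chain, seed, max_words=30):
    """Walk the chain from a seed word pair; stop at a dead end or max_words."""
    w1, w2 = seed
    out = [w1, w2]
    for _ in range(max_words):
        followers = chain.get((w1, w2))
        if not followers:
            break
        w1, w2 = w2, random.choice(followers)
        out.append(w2)
    return " ".join(out)
```

    The trade-off mentioned above is visible here: the state space grows from the vocabulary size to (roughly) its square, which is why higher orders need more storage.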

    However, having your IRC bot actually respond to what is said to it takes a lot more than Markov chains. One approach is to perform text categorization (aka topic spotting) on what is said, then pick a domain-specific Markov chain for text generation. Naïve Bayes is a popular model for topic spotting.
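    As an illustration of that pipeline (a hedged sketch, not the answer's code), a tiny multinomial Naïve Bayes classifier with add-one smoothing can pick the topic, and the bot would then generate from the chain trained on that topic's corpus:

```python
import math
from collections import Counter, defaultdict

class NaiveBayes:
    """Minimal multinomial Naive Bayes for topic spotting (illustrative only)."""

    def fit(self, labelled_docs):
        """labelled_docs: iterable of (topic, text) pairs."""
        self.word_counts = defaultdict(Counter)  # per-topic word frequencies
        self.topic_counts = Counter()            # document counts per topic
        self.vocab = set()
        for topic, text in labelled_docs:
            words = text.lower().split()
            self.topic_counts[topic] += 1
            self.word_counts[topic].update(words)
            self.vocab.update(words)
        return self

    def predict(self, text):
        """Return the topic maximizing log P(topic) + sum log P(word | topic)."""
        words = text.lower().split()
        total_docs = sum(self.topic_counts.values())
        best, best_score = None, float("-inf")
        for topic in self.topic_counts:
            score = math.log(self.topic_counts[topic] / total_docs)
            denom = sum(self.word_counts[topic].values()) + len(self.vocab)
            for w in words:
                # add-one (Laplace) smoothing so unseen words don't zero out a topic
                score += math.log((self.word_counts[topic][w] + 1) / denom)
            if score > best_score:
                best, best_score = topic, score
        return best
```

    Once `predict` returns a topic, the bot would select the Markov chain trained on that topic's texts and generate a reply from it.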

    Kernighan and Pike in The Practice of Programming explore various implementation strategies for Markov chain algorithms. These, and natural language generation in general, are covered in great depth by Jurafsky and Martin in Speech and Language Processing.

  • 2021-01-30 15:17

    It seems to me you are trying multiple things at the same time:

    1. extracting words/sentences by idling in IRC
    2. building a knowledge base
    3. listening to some chat, parsing keywords
    4. generating a sentence based on those keywords

    Those are basically very different tasks. Markov models are often used for machine learning. I don't see much learning in your tasks though.

    larsmans' answer shows how to generate sentences from word-based Markov models. You can also train the weights to favor the word pairs that other IRC users have used. Nonetheless, this will not generate keyword-related sentences, because building/refining a Markov model is not the same as "driving" it.

    You might try hidden Markov models (HMMs), where the visible outputs are the keywords and the hidden states are made from those word pairs. You could then dynamically favor sentences more appropriate to specific keywords.
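    The keyword/word-pair HMM above is a speculative design; as a generic starting point, finding the most likely hidden-state sequence for an observation sequence is done with the standard Viterbi algorithm. A sketch on a purely illustrative toy model (the classic weather example, nothing to do with IRC):

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most probable hidden-state path for an observation sequence."""
    # best[s] = (probability of the best path ending in state s, that path)
    best = {s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}
    for o in obs[1:]:
        best = {
            s: max(
                (best[r][0] * trans_p[r][s] * emit_p[s][o], best[r][1] + [s])
                for r in states
            )
            for s in states
        }
    prob, path = max(best.values())
    return path

# Toy model: hidden weather states, observed activities.
states = ["Rainy", "Sunny"]
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
```

    For the bot idea, the observations would be keywords and the states word pairs; estimating sensible transition and emission probabilities for that setup is the hard part.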

  • 2021-01-30 15:34

    You may want to look for Ian Barber's articles on text generation (phpir.com). Unfortunately, the site is down or offline. I have a copy of his text and can send it to you.
