Hidden Markov model classifying a sequence in Matlab

梦谈多话 2021-01-06 16:03

I'm very new to machine learning. I've read about Matlab's Statistics Toolbox for hidden Markov models, and I want to classify a given sequence of signals using it. I have 3D co-

2 Answers
  • 2021-01-06 16:42

    Here is a general outline of the approach to classifying d-dimensional sequences using hidden Markov models:

    1) Training:

    For each class k:

    • prepare an HMM model. This includes initializing the following:
      • a transition matrix: Q-by-Q matrix, where Q is the number of states
      • a vector of prior probabilities: Q-by-1 vector
      • the emission model: in your case the observations are 3D points, so you could use a multivariate normal distribution (with a specified mean vector and covariance matrix) or a Gaussian mixture model (a set of MVN distributions combined with mixture coefficients)
    • after properly initializing the above parameters, you train the HMM model by feeding it the set of sequences belonging to this class (EM algorithm).
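    The emission term above can be sketched in plain Python. This is only an illustration with made-up names, and it assumes a diagonal covariance for brevity (a toolbox implementation would support full covariance matrices):

```python
import math

def mvn_logpdf_diag(x, mu, var):
    """Log-density of a multivariate normal with diagonal covariance.
    x, mu, var are equal-length lists (e.g. 3D points)."""
    d = len(x)
    log_det = sum(math.log(v) for v in var)
    quad = sum((xi - mi) ** 2 / v for xi, mi, v in zip(x, mu, var))
    return -0.5 * (d * math.log(2 * math.pi) + log_det + quad)

def gmm_logpdf_diag(x, weights, mus, vars_):
    """Log-density of a Gaussian mixture: log sum_k w_k * N(x; mu_k, var_k),
    computed with log-sum-exp for numerical stability."""
    logs = [math.log(w) + mvn_logpdf_diag(x, m, v)
            for w, m, v in zip(weights, mus, vars_)]
    mx = max(logs)
    return mx + math.log(sum(math.exp(l - mx) for l in logs))
```

    In an HMM, each state q would carry its own `mu`/`var` (or mixture), and these log-densities plug into the forward recursion as the per-state observation scores.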

    2) Prediction

    Next, to classify a new sequence X:

    • you compute the log-likelihood of the sequence under each model: log P(X|model_k)
    • then you pick the class that gave the highest probability. This is the class prediction.
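    The train-one-model-per-class, pick-the-highest-likelihood scheme can be sketched in Python as follows (`class_models` and `loglik_fn` are illustrative placeholders, not toolbox names):

```python
def classify_sequence(X, class_models, loglik_fn):
    """Pick the class whose trained HMM assigns the highest
    log-likelihood to the sequence X.

    class_models: dict mapping class label -> trained model parameters
    loglik_fn(X, model): returns log P(X | model)
    Returns the winning label and all per-class scores."""
    scores = {label: loglik_fn(X, model)
              for label, model in class_models.items()}
    best = max(scores, key=scores.get)
    return best, scores
```

    With K classes this costs K forward passes per sequence; since all scores are log-likelihoods of the same X, comparing them directly is valid.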

    As I mentioned in the comments, the Statistics Toolbox only implements discrete-observation HMM models, so you will have to find another library or implement the code yourself. Kevin Murphy's toolboxes (HMM toolbox, BNT, PMTK3) are popular choices in this domain.

    Here are some answers I posted in the past using Kevin Murphy's toolboxes:

    • Issue in training hidden markov model and usage for classification
    • Simple example/use-case for a BNT gaussian_CPD

    The above answers are somewhat different from what you are trying to do here, but it's a good place to start.

  • 2021-01-06 16:52

    The task is to build and train a hidden Markov model with the following components, here using murphyk's HMM toolbox as requested:

    1. O = size of each observation vector (number of coefficients)
    2. Q = number of states
    3. T = number of vectors in a sequence
    4. nex = number of sequences
    5. M = number of mixtures

    Demo Code (from murphyk's toolbox):

        O = 8;          % Number of coefficients in a vector
        T = 420;        % Number of vectors in a sequence
        nex = 1;        % Number of sequences
        M = 1;          % Number of mixtures
        Q = 6;          % Number of states

        data = randn(O, T, nex);

        % initial guess of parameters
        prior0 = normalise(rand(Q, 1));
        transmat0 = mk_stochastic(rand(Q, Q));

        if 0
            Sigma0 = repmat(eye(O), [1 1 Q M]);
            % Initialize each mean to a random data point
            indices = randperm(T*nex);
            mu0 = reshape(data(:, indices(1:(Q*M))), [O Q M]);
            mixmat0 = mk_stochastic(rand(Q, M));
        else
            [mu0, Sigma0] = mixgauss_init(Q*M, data, 'full');
            mu0 = reshape(mu0, [O Q M]);
            Sigma0 = reshape(Sigma0, [O O Q M]);
            mixmat0 = mk_stochastic(rand(Q, M));
        end

        [LL, prior1, transmat1, mu1, Sigma1, mixmat1] = ...
            mhmm_em(data, prior0, transmat0, mu0, Sigma0, mixmat0, 'max_iter', 5);

        loglik = mhmm_logprob(data, prior1, transmat1, mu1, Sigma1, mixmat1);
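    For intuition, `mhmm_logprob` computes log P(data | model) with the forward algorithm. Here is a minimal Python sketch of the forward pass in log space, using discrete emissions for brevity (the toolbox's version uses Gaussian-mixture emissions instead; all names here are my own):

```python
import math

def logsumexp(vals):
    """Numerically stable log(sum(exp(v) for v in vals))."""
    m = max(vals)
    return m + math.log(sum(math.exp(v - m) for v in vals))

def forward_loglik(obs, log_prior, log_trans, log_emit):
    """Forward algorithm in log space: returns log P(obs | model).
    obs: sequence of symbol indices
    log_prior[q]: log prior probability of state q
    log_trans[p][q]: log P(state q | state p)
    log_emit[q][o]: log P(symbol o | state q)"""
    Q = len(log_prior)
    # Initialize with the first observation
    alpha = [log_prior[q] + log_emit[q][obs[0]] for q in range(Q)]
    # Recurse over the rest of the sequence
    for o in obs[1:]:
        alpha = [logsumexp([alpha[p] + log_trans[p][q] for p in range(Q)])
                 + log_emit[q][o]
                 for q in range(Q)]
    return logsumexp(alpha)
```

    Working in log space avoids the underflow that plain probabilities hit on long sequences; this per-model score is exactly what the classification step compares across classes.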
    