How to train a neural network on a supervised data set using pybrain black-box optimization?


I have played around a bit with pybrain and understand how to generate neural networks with custom architectures and train them on supervised data sets using the backpropagation algorithm. What I would like to know is how to train a network on a supervised data set using pybrain's black-box optimization algorithms instead.

1 Answer

    I finally worked it out! It's always easy once you know how!

    Essentially, the first argument to the GA is the fitness function (called evaluator in the docs), which must take the second argument (an individual, called evaluable in the docs) as its only argument.
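
    For illustration, here is a minimal sketch of a hand-rolled evaluator (the xor_mse function is my own example, not from the docs); any callable that takes the network and returns a score works:

    # hypothetical standalone evaluator: mean squared error over the XOR truth table
    def xor_mse(net):
        samples = [([0., 0.], 0.), ([0., 1.], 1.), ([1., 0.], 1.), ([1., 1.], 0.)]
        return sum((net.activate(inp)[0] - target) ** 2
                   for inp, target in samples) / len(samples)
    # the GA below could then be constructed as GA(xor_mse, nn, minimize=True)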

    In this example we will train on XOR, using the dataset's built-in evaluateModuleMSE method as the fitness function:

    from pybrain.datasets.classification import ClassificationDataSet
    # below line can be replaced with the algorithm of choice e.g.
    # from pybrain.optimization.hillclimber import HillClimber
    from pybrain.optimization.populationbased.ga import GA
    from pybrain.tools.shortcuts import buildNetwork
    
    # create XOR dataset
    d = ClassificationDataSet(2)
    d.addSample([0., 0.], [0.])
    d.addSample([0., 1.], [1.])
    d.addSample([1., 0.], [1.])
    d.addSample([1., 1.], [0.])
    d.setField('class', [ [0.],[1.],[1.],[0.]])
    
    nn = buildNetwork(2, 3, 1)
    # d.evaluateModuleMSE takes nn as its first and only argument
    ga = GA(d.evaluateModuleMSE, nn, minimize=True)
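    # each call to learn() returns a (best evaluable, best fitness) pair; keep the best network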
    for i in range(100):
        nn = ga.learn(0)[0]
    

    Test results after the above script:

    In [68]: nn.activate([0,0])
    Out[68]: array([-0.07944574])
    
    In [69]: nn.activate([1,0])
    Out[69]: array([ 0.97635635])
    
    In [70]: nn.activate([0,1])
    Out[70]: array([ 1.0216745])
    
    In [71]: nn.activate([1,1])
    Out[71]: array([ 0.03604205])
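
    The raw activations sit close to 0 and 1; a minimal sketch (assuming a 0.5 decision threshold, my own choice) for reading them as class labels:

    for inp in ([0., 0.], [0., 1.], [1., 0.], [1., 1.]):
        print(inp, int(nn.activate(inp)[0] > 0.5))  # recovers the XOR truth table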
    
