DL4J linear regression

Submitted by 笑着哭i on 2020-12-04 03:55:54

Question


I am new to neural networks. I am trying to implement and train a simple neural network with DL4J. My function:

y = x * 2 + 300

My vision (screenshot in original post)

My result (screenshot in original post)

Parameters:

    public final int seed = 12345;
    public final int iterations = 1;
    public final int nEpochs = 1;
    public final int batchSize = 1000;
    public final double learningRate = 0.01;
    public final Random rng = new Random(seed);
    public final int numInputs = 2;
    public final int numOutputs = 1;
    public final double maxX = 100; // xmax = 100; ymax = 500
    public final double scale = 500; // for scaling x and y

Network configuration:

    public MultiLayerConfiguration createConf() {
        return new NeuralNetConfiguration.Builder()
                .seed(seed)
                .iterations(iterations)
                .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                .learningRate(learningRate)
                .weightInit(WeightInit.XAVIER)
                .updater(new Nesterovs(0.9))
                .list()
                .layer(0, new OutputLayer.Builder(LossFunctions.LossFunction.MSE)
                        .activation(Activation.IDENTITY)
                        .nIn(numInputs).nOut(numOutputs).build())
                .pretrain(false).backprop(true).build();
    }

Training data:

    public DataSetIterator generateTrainingData() {

        List<DataSet> list = new ArrayList<>();

        for (int i = 0; i < batchSize; i++) {

            double x = rng.nextDouble() * maxX * (rng.nextBoolean() ? 1 : -1);
            double y = y(x);

            list.add(
                    new DataSet(
                            Nd4j.create(new double[]{x / scale, 1}),
                            Nd4j.create(new double[]{y / scale})
                    )
            );
        }

        return new ListDataSetIterator<>(list, batchSize);
    }
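As a sanity check on the scaling above (not part of the original post): after dividing both x and y by 500, the target y = 2x + 300 becomes y/500 = 2·(x/500) + 0.6·1, so with the two inputs {x/500, 1} an exact linear solution with weights [2.0, 0.6] exists. The class and method names below are illustrative, not DL4J API:

```java
// Verifies that the scaled regression problem has an exact linear solution,
// i.e. the single identity-activation output layer can in principle fit it.
public class ScaledRegressionCheck {
    static final double SCALE = 500.0;

    // The target function from the question.
    static double y(double x) {
        return 2 * x + 300;
    }

    // What the linear layer must compute on the scaled inputs {x/SCALE, 1}.
    static double scaledPrediction(double x) {
        double w1 = 2.0, w2 = 0.6; // weights for x/SCALE and the constant 1
        return w1 * (x / SCALE) + w2 * 1.0;
    }

    public static void main(String[] args) {
        for (double x = -90; x < 100; x += 10) {
            double expected = y(x) / SCALE;
            double got = scaledPrediction(x);
            if (Math.abs(expected - got) > 1e-12) {
                throw new AssertionError("mismatch at x=" + x);
            }
        }
        System.out.println("exact solution exists: w = [2.0, 0.6]");
    }
}
```

So the setup is learnable in principle; the issues are in the training configuration, as the answer below explains.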

Testing:

    public void test() {

        final MultiLayerNetwork net = new MultiLayerNetwork(createConf());
        net.init();
        net.setListeners(new ScoreIterationListener(1));

        for (int i = 0; i < nEpochs; i++) {
            net.fit(generateTrainingData());
        }

        int idx = 0;
        double[] x = new double[19];
        double[] y = new double[19];
        double[] p = new double[19];
        for (double i = -90; i < 100; i += 10) {
            x[idx] = i;
            y[idx] = y(i);
            p[idx] = scale * net.output(Nd4j.create(new double[]{i / scale, 1})).getDouble(0, 0);
            idx++;
        }
        plot(x, y, p);
    }

Please tell me what I am doing wrong, or whether my understanding is incorrect...

Thank you in advance, Regards, Minas


Answer 1:


Take a look at this example: https://github.com/deeplearning4j/dl4j-examples/tree/master/dl4j-examples/src/main/java/org/deeplearning4j/examples/feedforward/regression

A few tips:

Use our built-in normalization tools; don't do this yourself. Our normalization tools allow you to normalize labels as well.

Turn minibatch off (set minibatch(false) on the neural-net config near the top). Ultimately, you still aren't actually doing "minibatch learning".

Also, you're regenerating the dataset on each call. There's no need to do that; just create it once and pass it to fit.

For visualization purposes, use the restore mechanism I mentioned earlier (this is in the example; you can pick any one of the normalizers, such as NormalizerMinMaxScaler, NormalizerStandardize, etc.)

Your iterations are also wrong. Just keep that value at 1 and keep your for loop. Otherwise you're just overfitting and spending far more training time than you need to. An "iteration" is actually the number of updates you want to run per fit call on the same dataset. We are getting rid of that option in the next release anyway.
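To make the normalization tip concrete, here is a plain-Java sketch of what a min-max normalizer does conceptually: fit the min/max on the data, map values into [0, 1], and revert predictions back to the original range afterwards. This is an illustration of the idea only, not the actual DL4J NormalizerMinMaxScaler API (which operates on DataSets/INDArrays):

```java
import java.util.Arrays;

// Conceptual sketch of min-max normalization: fit, transform, revert.
public class MinMaxSketch {
    private double min, max;

    // Record the range of the training data.
    void fit(double[] data) {
        min = Arrays.stream(data).min().getAsDouble();
        max = Arrays.stream(data).max().getAsDouble();
    }

    // Map a value from [min, max] into [0, 1].
    double transform(double v) {
        return (v - min) / (max - min);
    }

    // Map a normalized prediction back to the original range.
    double revert(double v) {
        return v * (max - min) + min;
    }

    public static void main(String[] args) {
        // Labels for y = 2x + 300 with x in [-100, 100] lie in [100, 500].
        double[] labels = {100, 300, 500};
        MinMaxSketch norm = new MinMaxSketch();
        norm.fit(labels);
        System.out.println(norm.transform(300)); // 0.5
        System.out.println(norm.revert(0.5));    // 300.0
    }
}
```

In DL4J you would fit the normalizer on the training DataSetIterator, set it as the iterator's pre-processor, and call the revert method on network output before plotting.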



Source: https://stackoverflow.com/questions/48135551/dl4j-linear-regression
