What is a Learning Curve in machine learning?

有刺的猬 2020-12-23 02:15

I want to know what a learning curve in machine learning is. What is the standard way of plotting it? I mean what should be the x and y axis of my plot?

10 answers
  • 2020-12-23 02:21

    In Andrew Ng's machine learning class, a learning curve is a plot of the training and cross-validation error versus the training set size. The learning curve can be used to detect whether the model suffers from high bias or high variance. If the model has a high-bias problem, then as the sample size increases the training error rises and the cross-validation error falls, until the two curves converge, but both settle at a high error rate. Increasing the sample size will not help much with a high-bias problem.

    If the model suffers from high variance, there is a large gap between the training and cross-validation error. As you keep increasing the sample size, the training error rises slowly and the cross-validation error keeps decreasing, and the two curves converge toward a low error rate. So gathering more samples will improve the model's prediction performance when the model suffers from high variance.
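
    A minimal sketch of how such a plot could be produced, assuming a scikit-learn-style estimator on synthetic placeholder data (the model, the dataset, and the use of 1 - accuracy as the error metric are illustrative choices, not the only possible ones):

        import numpy as np
        import matplotlib.pyplot as plt
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        # Placeholder data and model; substitute your own.
        X, y = make_classification(n_samples=600, n_features=20, random_state=0)
        X_train, X_cv, y_train, y_cv = train_test_split(X, y, test_size=0.3, random_state=0)

        sizes = np.linspace(20, len(X_train), 10, dtype=int)
        train_err, cv_err = [], []
        for m in sizes:
            model = LogisticRegression(max_iter=1000).fit(X_train[:m], y_train[:m])
            train_err.append(1 - model.score(X_train[:m], y_train[:m]))  # error on the m training samples
            cv_err.append(1 - model.score(X_cv, y_cv))                   # error on the fixed CV set

        plt.plot(sizes, train_err, label="training error")
        plt.plot(sizes, cv_err, label="cross-validation error")
        plt.xlabel("training set size m")
        plt.ylabel("error")
        plt.legend()
        plt.show()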

  • 2020-12-23 02:22

    In simple terms, a learning curve is a plot of the number of training instances (or training iterations) against a metric such as loss or accuracy. The plot shows how the model improves as it gains experience, hence the name learning curve. Learning curves are widely used in machine learning for algorithms that learn (optimize their internal parameters) incrementally over time, such as deep neural networks.
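
    A small sketch of that idea for an incrementally trained model, recording a metric after every pass over the data (the model, the synthetic data, and the 30-epoch budget are illustrative assumptions):

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import SGDClassifier
        from sklearn.model_selection import train_test_split

        # Placeholder data; any incrementally trainable model would do.
        X, y = make_classification(n_samples=1000, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        clf = SGDClassifier(random_state=0)
        history = []  # held-out accuracy after each epoch
        for epoch in range(30):
            clf.partial_fit(X_tr, y_tr, classes=np.unique(y))  # one pass over the data = one "epoch"
            history.append(clf.score(X_te, y_te))

        # 'history' is the y-axis of the learning curve; the epoch number is the x-axis.
        print(history)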

  • 2020-12-23 02:25

    Basically, a learning curve allows you to find the point from which the algorithm starts to learn. If you take the derivative (the slope of the tangent) along the curve, the point where the slope starts to approach a constant is where the algorithm's learning really takes hold.

    Depending on how your x and y axes are mapped, one axis will start to approach a constant value while the other keeps increasing; this is when you start seeing some learning. The curve as a whole lets you measure the rate at which your algorithm is able to learn. The maximum point is usually where the slope starts to recede, and you can take a number of derivative measures up to the maximum/minimum point.

    So from the examples above you can see that the curve gradually tends towards a constant value. The model initially harnesses its learning from the training examples, and the slope flattens near the maximum/minimum point as the curve approaches that constant state. At this point it is able to handle new examples from the test data. A typical choice of axes for such a plot is epochs versus error.
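
    One way to make the slope idea concrete is to take the numerical derivative of the recorded error curve; where it approaches zero, learning has plateaued. A rough sketch, with made-up error values:

        import numpy as np

        # Hypothetical per-epoch validation error recorded during training.
        error = np.array([0.90, 0.55, 0.38, 0.30, 0.26, 0.24, 0.23, 0.228, 0.227])

        slope = np.gradient(error)                      # numerical derivative w.r.t. epoch
        plateau = int(np.argmax(np.abs(slope) < 0.01))  # first epoch where the curve flattens
        print(f"learning roughly plateaus at epoch {plateau}")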

  • 2020-12-23 02:30

    It is a graph that compares the performance of a model on training and testing data over a varying number of training instances. Learning curves are a widely used diagnostic tool in machine learning for algorithms that learn incrementally from a training dataset. They allow us to verify when a model has learned as much as it can about the data.

    There are three common kinds of learning curves (a rough way of telling them apart is sketched after the list):

    • Bad Learning Curve: High Bias
    • Bad Learning Curve: High Variance
    • Ideal Learning Curve
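
    As a rough illustration of how those three cases are usually told apart from where the two curves end up (the error thresholds below are arbitrary assumptions, not standard values):

        def diagnose(train_error, val_error, high=0.30, gap=0.10):
            """Hypothetical rule of thumb for reading the right-hand end of a learning curve."""
            if train_error > high and val_error > high:
                return "high bias: both errors are high and close together"
            if val_error - train_error > gap:
                return "high variance: large gap between training and validation error"
            return "ideal: both errors are low and close together"

        print(diagnose(train_error=0.05, val_error=0.25))  # -> high variance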
  • 2020-12-23 02:32

    I just want to leave a brief note on this old question to point out that learning curve and ROC curve are not synonymous.

    As indicated in the other answers to this question, a learning curve conventionally depicts improvement in performance on the vertical axis when there are changes in another parameter (on the horizontal axis), such as training set size (in machine learning) or iteration/time (in both machine and biological learning). One salient point is that many parameters of the model are changing at different points on the plot. Other answers here have done a great job of illustrating learning curves.

    (There is also another meaning of learning curve in industrial manufacturing, originating in an observation in the 1930s that the number of labor hours needed to produce an individual unit decreases at a uniform rate as the quantity of units manufactured doubles. It isn't really relevant but is worth noting for completeness and to avoid confusion in web searches.)

    In contrast, a Receiver Operating Characteristic curve, or ROC curve, does not show learning; it shows performance. An ROC curve is a graphical depiction of classifier performance that shows the trade-off between the true positive rate (on the vertical axis) and the false positive rate (on the horizontal axis) as the discrimination threshold of the classifier is varied. Thus only a single parameter (the decision / discrimination threshold) associated with the model changes at different points on the plot. The ROC curve below (from Wikipedia) shows the performance of three different classifiers.

    [Figure: ROC curves for three different classifiers, from Wikipedia; see the previous link for CC licensing.]

    There is no learning being depicted here, but rather performance with respect to two different kinds of success/error as the classifier's decision threshold is made more lenient or strict. By looking at the area under the curve, we get an overall indication of the classifier's ability to distinguish the classes. This area-under-the-curve (AUC) metric is insensitive to the number of members in the two classes, so it may not reflect actual performance if class membership is unbalanced. The ROC curve has many subtleties, and interested readers might check out:

    Fawcett, Tom. "ROC graphs: Notes and practical considerations for researchers." Machine Learning 31 (2004): 1-38.

    Swets, John A., Robyn M. Dawes, and John Monahan. "Better decisions through Science." Scientific American (2000): 83.
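
    For contrast with the learning-curve examples elsewhere in this thread, here is a minimal sketch of how an ROC curve and its AUC are typically computed with scikit-learn (the data and classifier are placeholders):

        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_curve, roc_auc_score
        from sklearn.model_selection import train_test_split

        # Placeholder data and classifier.
        X, y = make_classification(n_samples=1000, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        scores = LogisticRegression(max_iter=1000).fit(X_tr, y_tr).predict_proba(X_te)[:, 1]

        # Each (fpr, tpr) point corresponds to one value of the decision threshold.
        fpr, tpr, thresholds = roc_curve(y_te, scores)
        print("AUC:", roc_auc_score(y_te, scores))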

  • 2020-12-23 02:34

    How can you determine for a given model whether more training points will be helpful? A useful diagnostic for this is a learning curve.

    • A plot of the prediction accuracy/error vs. the training set size (i.e., how much better the model gets at predicting the target as you increase the number of instances used to train it)

    • A learning curve conventionally depicts improvement in performance on the vertical axis against changes in another parameter (on the horizontal axis), such as training set size (in machine learning) or iteration/time

    • A learning curve is often useful to plot for algorithmic sanity checking or improving performance

    • Plotting a learning curve can help diagnose the problems your algorithm is suffering from

    Personally, the two links below helped me understand this concept better:

    Learning Curve

    Sklearn Learning Curve
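
    In case those links move, here is a rough sketch of the scikit-learn helper they describe, sklearn.model_selection.learning_curve (the estimator and dataset below are placeholders):

        import numpy as np
        from sklearn.datasets import load_digits
        from sklearn.model_selection import learning_curve
        from sklearn.svm import SVC

        # Placeholder estimator and dataset.
        X, y = load_digits(return_X_y=True)
        train_sizes, train_scores, val_scores = learning_curve(
            SVC(gamma=0.001), X, y,
            train_sizes=np.linspace(0.1, 1.0, 5), cv=5)

        print(train_sizes)                # number of training examples used at each point
        print(train_scores.mean(axis=1))  # mean training accuracy across the CV folds
        print(val_scores.mean(axis=1))    # mean cross-validation accuracy across the CV folds

    Fractions passed in train_sizes are interpreted relative to the maximum training set size allowed by the cross-validation split.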
