There are two classes, let's call them X and O. A number of elements belonging to these classes are spread out in the xy-plane. Here is an example where the two classes are
Train a linear SVM and then check which side of the resulting maximum-margin hyperplane each point falls on: if every point ends up on its own class's side, the points are linearly separable.
This is overkill, but if you need a quick one-off solution, there are many existing SVM libraries that will do this for you.
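For instance, here is a minimal sketch using scikit-learn (the library choice and the helper name are my assumptions, not part of this answer): train a linear SVM with a very large C to approximate a hard margin, then check whether every training point is classified correctly.

```python
import numpy as np
from sklearn.svm import SVC

def linearly_separable(X, y):
    """Rough separability test: fit a (nearly) hard-margin linear SVM
    and see whether it classifies every training point correctly."""
    clf = SVC(kernel="linear", C=1e9)  # very large C approximates a hard margin
    clf.fit(X, y)
    return bool(np.all(clf.predict(X) == y))

# Example: class O clustered near the origin, class X shifted far away -> separable.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(10, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
print(linearly_separable(X, y))  # True
```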
As mentioned by ElKamina, the linear perceptron is guaranteed to find a separating hyperplane if one exists, but this approach is not efficient in high dimensions. Computationally, the most effective way to decide whether two sets of points are linearly separable is linear programming.
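A minimal sketch of the LP feasibility test, assuming SciPy's `linprog` is available (the formulation and names below are my own): the classes are linearly separable exactly when some w, b satisfy y_i (w·x_i + b) >= 1 for every point, which is a pure feasibility problem.

```python
import numpy as np
from scipy.optimize import linprog

def separable_by_lp(X, y):
    """Feasibility LP: find w, b with y_i * (x_i . w + b) >= 1 for all i."""
    n, d = X.shape
    y = np.where(y > 0, 1.0, -1.0)               # labels as +1 / -1
    # Variables z = [w_1..w_d, b]; constraints  -y_i * (x_i . w + b) <= -1.
    A_ub = -y[:, None] * np.hstack([X, np.ones((n, 1))])
    b_ub = -np.ones(n)
    c = np.zeros(d + 1)                          # zero objective: feasibility only
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * (d + 1), method="highs")
    return res.status == 0                       # 0 = feasible, 2 = infeasible

X = np.array([[0.0, 0.0], [1.0, 1.0], [3.0, 3.0], [4.0, 4.0]])
y = np.array([-1, -1, 1, 1])
print(separable_by_lp(X, y))  # True
```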
Example code solving this with the perceptron in Matlab is here; a rough Python sketch of the same idea follows.
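This sketch is not the linked Matlab code, just an illustration of the perceptron test. Note that the perceptron is only guaranteed to terminate when the data is separable, so a practical test caps the number of passes and treats running out of budget as "not separable / inconclusive".

```python
import numpy as np

def perceptron_separable(X, y, max_epochs=1000):
    """Run the perceptron; if it reaches a pass with no mistakes,
    the data is linearly separable."""
    y = np.where(y > 0, 1.0, -1.0)
    Xb = np.hstack([X, np.ones((len(X), 1))])    # absorb the bias term
    w = np.zeros(Xb.shape[1])
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(Xb, y):
            if yi * (w @ xi) <= 0:               # misclassified point
                w += yi * xi                     # perceptron update
                mistakes += 1
        if mistakes == 0:                        # converged: separable
            return True
    return False                                 # inconclusive within the budget

X = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 5.0], [6.0, 5.0]])
y = np.array([-1, -1, 1, 1])
print(perceptron_separable(X, y))  # True
```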
Well, both the perceptron and SVMs (Support Vector Machines) can tell you whether two data sets are linearly separable, but an SVM can also find the optimal separating hyperplane. Moreover, it works with n-dimensional vectors, not only points in the plane.
It is used in applications such as face recognition. I recommend digging deeper into this topic.