Here, $x_1,x_2,x_3,x_4$ are the support vectors.
In the Support Vector Machine (SVM) statistical learning method, we maximize the margin. Support vectors (the "difficult" points) are the data points lying closest to the decision boundary, the ones the margin pushes up against.
The hard-margin SVM applies only to linearly separable data; when the data is not linearly separable, we use the soft-margin SVM, which introduces slack variables to tolerate some margin violations. Separately, the kernel trick can be applied to either formulation to handle non-linear decision boundaries by implicitly mapping the data into a higher-dimensional feature space.
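To make the hard/soft distinction concrete, the two primal objectives can be written as follows (standard formulation; $C$ is the regularization constant and $\xi_i$ are the slack variables):

$$
\text{Hard margin:}\quad \min_{w,b}\ \frac{1}{2}\|w\|^2 \quad \text{s.t.}\quad y_i(w^\top x_i + b) \ge 1 \ \forall i
$$

$$
\text{Soft margin:}\quad \min_{w,b,\xi}\ \frac{1}{2}\|w\|^2 + C\sum_i \xi_i \quad \text{s.t.}\quad y_i(w^\top x_i + b) \ge 1 - \xi_i,\ \ \xi_i \ge 0 \ \forall i
$$

As $C \to \infty$, the soft-margin problem recovers the hard-margin one, since any margin violation becomes infinitely costly.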
If $\alpha_i$ is the Lagrange multiplier associated with point $x_i$ in the optimization of the objective function, then $\alpha_i > 0$ exactly for the support vectors and $\alpha_i = 0$ for all other points. So we can verify whether a point is a support vector by solving the convex (dual) optimization problem for the SVM and checking whether its Lagrange multiplier is non-zero.
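A minimal sketch of this check using scikit-learn's `SVC`, on a small hypothetical toy dataset (the data values are made up for illustration). `SVC` exposes the support-vector indices via `support_`, and `dual_coef_` holds $y_i \alpha_i$ for each support vector, so every entry there is non-zero by construction:

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical toy data: two linearly separable clusters
X = np.array([[1.0, 1.0], [2.0, 2.0], [1.0, 2.0],
              [4.0, 4.0], [5.0, 5.0], [4.0, 5.0]])
y = np.array([0, 0, 0, 1, 1, 1])

# A very large C approximates the hard-margin SVM
clf = SVC(kernel="linear", C=1e6)
clf.fit(X, y)

# Indices of the support vectors: exactly the points with alpha_i > 0
print("support vector indices:", clf.support_)

# dual_coef_ stores y_i * alpha_i for each support vector; all non-zero
print("y_i * alpha_i:", clf.dual_coef_)
```

Points not listed in `clf.support_` have $\alpha_i = 0$ and do not influence the decision boundary; removing them and refitting would leave the boundary unchanged.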