
Consider the dataset with six datapoints $\left\{\left(x_{1}, y_{1}\right),\left(x_{2}, y_{2}\right), \ldots,\left(x_{6}, y_{6}\right)\right\}$, where $x_{1}=\left[\begin{array}{c}1 \\ 0\end{array}\right]$, $x_{2}=\left[\begin{array}{c}0 \\ 1\end{array}\right]$, $x_{3}=\left[\begin{array}{c}0 \\ -1\end{array}\right]$, $x_{4}=\left[\begin{array}{c}-1 \\ 0\end{array}\right]$, $x_{5}=\left[\begin{array}{c}2 \\ 2\end{array}\right]$, $x_{6}=\left[\begin{array}{c}-2 \\ -2\end{array}\right]$, and the labels are given by $y_{1}=y_{2}=y_{5}=1$ and $y_{3}=y_{4}=y_{6}=-1$. A hard-margin linear support vector machine is trained on the above dataset.


Which ONE of the following sets is a possible set of support vectors?

  1. $\left\{x_{1}, x_{2}, x_{5}\right\}$
  2. $\left\{x_{3}, x_{4}, x_{5}\right\}$
  3. $\left\{x_{4}, x_{5}\right\}$
  4. $\left\{x_{1}, x_{2}, x_{3}, x_{4}\right\}$


1 Answer


Here, $x_1, x_2, x_3, x_4$ are the support vectors, so option 4 is correct. By the symmetry of the data, the maximum-margin separating hyperplane is $x^{(1)} + x^{(2)} = 0$, i.e. $w = \left[\begin{array}{c}1 \\ 1\end{array}\right]$, $b = 0$: the points $x_1, x_2, x_3, x_4$ all lie at the minimum distance $1/\sqrt{2}$ from it (on the margin), while $x_5$ and $x_6$ lie strictly farther away, at distance $2\sqrt{2}$.
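As a quick numerical check, here is a minimal NumPy sketch; the separator $w = [1, 1]^{T}$, $b = 0$ is the candidate read off from the symmetry of the data, as noted above.

```python
import numpy as np

# Dataset from the question
X = np.array([[1, 0], [0, 1], [0, -1], [-1, 0], [2, 2], [-2, -2]])
y = np.array([1, 1, -1, -1, 1, -1])

# Candidate maximum-margin separator: w = [1, 1], b = 0
w, b = np.array([1.0, 1.0]), 0.0

# Signed distance of each point from the hyperplane
dist = y * (X @ w + b) / np.linalg.norm(w)
print(np.round(dist, 4))
# [0.7071 0.7071 0.7071 0.7071 2.8284 2.8284]
# x1..x4 sit at the minimum distance 1/sqrt(2): these are the support vectors.
# x5 and x6 are strictly farther away, so they cannot be support vectors,
# which rules out options 1, 2, and 3.
```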

In the support vector machine (SVM) learning method, we maximize the margin between the two classes. The support vectors are the "difficult" points: the training points that lie exactly on the margin boundaries, against which the margin pushes, and they alone determine the decision boundary.

A hard-margin SVM requires the data to be linearly separable. When it is not, we use a soft-margin SVM, which allows some points to violate the margin (via slack variables); additionally, the kernel trick can map the data into a higher-dimensional space where it becomes separable. (See the illustrative sketch below.)
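For illustration only (this question's data is linearly separable, so none of this is needed here): in scikit-learn, the softness of the margin is controlled by the `C` parameter, and a non-linear kernel such as RBF handles non-separable data.

```python
import numpy as np
from sklearn.svm import SVC

# XOR-style data: not linearly separable, so a hard-margin linear SVM fails
X = np.array([[0, 0], [1, 1], [0, 1], [1, 0]])
y = np.array([1, 1, -1, -1])

# Soft margin (finite C) combined with the RBF kernel separates it
clf = SVC(kernel="rbf", C=100.0).fit(X, y)
print(clf.predict(X))  # expected: [ 1  1 -1 -1]
```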

If $\alpha_i$ is the Lagrange multiplier of datapoint $x_i$ in the dual of the SVM optimization problem, then $\alpha_i > 0$ exactly for the support vectors, and $\alpha_i = 0$ for all other points. So, we can verify the answer by solving the convex optimization problem for the SVM and checking which Lagrange multipliers are non-zero.
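That check can be done numerically. Below is a hedged sketch using scikit-learn: `SVC` solves the soft-margin dual, so a very large `C` is used here to approximate the hard-margin SVM rather than solve it exactly.

```python
import numpy as np
from sklearn.svm import SVC

X = np.array([[1, 0], [0, 1], [0, -1], [-1, 0], [2, 2], [-2, -2]])
y = np.array([1, 1, -1, -1, 1, -1])

# A very large C makes the soft-margin SVM behave like a hard-margin one
clf = SVC(kernel="linear", C=1e10).fit(X, y)

print(clf.support_)               # indices of support vectors -> {0, 1, 2, 3}, i.e. x1..x4
print(clf.dual_coef_)             # y_i * alpha_i for the support vectors (all non-zero)
print(clf.coef_, clf.intercept_)  # recovers w ~ [1, 1] and b ~ 0
```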
