SVM: Are all the support vectors necessarily used to construct the weights
I know exactly what the SVM is and am clear about its principles and algorithms, but I'm curious whether all support vectors are necessarily used in constructing the weights $w$.

Let $w^Tx + b$ denote the SVM model. By the KKT dual complementarity condition,
$$\alpha_i\left[y_i(w^Tx_i+b)-1\right]=0,\quad i=1,\cdots,N,$$
where $w$ and $b$ are the weights and bias, and $\alpha_i$ is a Lagrange multiplier. We know that if $\alpha_i \neq 0$, then $y_i(w^Tx_i+b)-1=0$, and such an $x_i$ is called a support vector. Consequently, the weights are estimated as $w = \sum_{i=1}^m \alpha_{(i)} y_{(i)} x_{(i)}$, where the $x_{(i)}$'s are the support vectors.

What I'm confused about is:

Is it possible that $\alpha_0=0$ and $y_0(w^Tx_0+b)-1=0$ hold simultaneously, so that such an $x_0$ lies on the margin (a support vector in the geometric sense) but is not used in constructing the weights $w$, since $\alpha_0=0$?

Thanks for any ideas. :)
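To make the question concrete, here is a minimal numerical sketch (my own illustrative construction, assuming the hard-margin formulation above): six points arranged so that the max-margin hyperplane is $x_1 = 0$ and *all* six points lie exactly on a margin, yet a valid dual optimum assigns $\alpha_i = 0$ to four of them.

```python
import numpy as np

# Six linearly separable points; the max-margin hyperplane is x1 = 0,
# i.e. w = (1, 0), b = 0, and all six points lie exactly on a margin.
X = np.array([[1.0, 1.0], [1.0, -1.0], [1.0, 0.0],
              [-1.0, 1.0], [-1.0, -1.0], [-1.0, 0.0]])
y = np.array([1, 1, 1, -1, -1, -1])

# One valid dual solution: only the points (1, 0) and (-1, 0) carry
# nonzero multipliers.  Note sum_i alpha_i * y_i = 0.5 - 0.5 = 0, so
# dual feasibility holds.
alpha = np.array([0.0, 0.0, 0.5, 0.0, 0.0, 0.5])

# Primal weights recovered from the dual: w = sum_i alpha_i * y_i * x_i.
w = (alpha * y) @ X
b = 0.0
print(w)  # [1. 0.]

# Every point satisfies y_i (w^T x_i + b) = 1, i.e. all lie on the margin,
# so KKT complementarity alpha_i * [y_i (w^T x_i + b) - 1] = 0 holds
# trivially for every i, including those with alpha_i = 0.
margins = y * (X @ w + b)
print(margins)  # [1. 1. 1. 1. 1. 1.]

# ...yet four margin points have alpha_i = 0 and contribute nothing to w.
on_margin_unused = np.isclose(margins, 1.0) & (alpha == 0)
print(int(on_margin_unused.sum()))  # 4
```

So at least in degenerate configurations like this one, the answer to the question appears to be yes: a point can satisfy $y_0(w^Tx_0+b)-1=0$ while its multiplier is zero, because complementarity only forces $\alpha_i \neq 0 \Rightarrow$ on-margin, not the converse.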
statistics machine-learning
asked Nov 13 at 22:29
Jing Zeng