Minimizing Jensen-Shannon Divergence with constraints
I am trying to minimize the following functional:
$$J(p) = \mathrm{JSD}(p_u \,\|\, p)$$
subject to the constraints
$$\int p = 1,$$
$$p(x) \geq \hat{\pi}\, p_p(x),$$
where $\mathrm{JSD}$ is the Jensen-Shannon divergence, $p_u = \pi\, p_p + (1-\pi)\, p_n$, $\hat{\pi} > \pi$, and $p_p$ and $p_n$ are probability distributions with disjoint supports.
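To fix notation, by $\mathrm{JSD}$ I mean the symmetrized divergence
$$\mathrm{JSD}(p_u \,\|\, p) = \frac{1}{2}\,\mathrm{KL}\!\left(p_u \,\Big\|\, \frac{p_u+p}{2}\right) + \frac{1}{2}\,\mathrm{KL}\!\left(p \,\Big\|\, \frac{p_u+p}{2}\right).$$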
I formulated this as a constrained minimization problem in order to attack it with a Lagrangian, but I don't really know how (or whether) this machinery works in a function space.
Edit:
Let $L$ be the Lagrangian:
$$L(p,\lambda,\mu) = \mathrm{JSD}(p_u\,\|\,p) - \lambda\left(\int p - 1\right) - \int \mu(x)\,\bigl(\hat{\pi}\, p_p - p\bigr)(x)\,dx.$$
Taking the functional derivative yields the stationarity condition
$$\frac{\partial L}{\partial p} = \frac{1}{2}\ln\!\left(\frac{2p}{p+p_u}\right) - \lambda + \mu(x) = 0.$$
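In more detail (a sketch, assuming the densities are regular enough to differentiate under the integral sign): writing $m = \frac{p + p_u}{2}$, we have
$$\mathrm{JSD}(p_u\,\|\,p) = \frac{1}{2}\int p_u \ln\frac{p_u}{m} + \frac{1}{2}\int p \ln\frac{p}{m},$$
and differentiating pointwise with respect to $p(x)$ gives
$$\frac{\delta}{\delta p}\,\mathrm{JSD}(p_u\,\|\,p) = -\frac{p_u}{4m} + \frac{1}{2}\ln\frac{p}{m} + \frac{1}{2} - \frac{p}{4m} = \frac{1}{2}\ln\frac{2p}{p+p_u},$$
since $\frac{p+p_u}{4m} = \frac{1}{2}$. The $-\lambda$ and $+\mu(x)$ terms come directly from the two constraint integrals.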
After a case discussion on whether $\mu(x)$ vanishes or not, I show, as I suspected, that

on $\operatorname{supp}(p_p)$: $\;p(x) = \hat{\pi}\, p_p(x)$,

on $\operatorname{supp}(p_n)$: $\;p(x) = (1-\hat{\pi})\, p_n(x)$.

Two questions remain:

1. How can I rigorously prove this necessary condition at the minimum (with the Lagrangian)?
2. How can I prove that it is a global minimum (sufficient condition)?
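While working out the rigorous argument, here is a small numerical sanity check (a sketch, not part of any proof): it discretizes the problem on a finite grid and lets SciPy's SLSQP solver handle the constraints. The grid size, the shapes of $p_p$ and $p_n$, and the values of $\pi$ and $\hat{\pi}$ below are illustrative choices.

```python
# Numerical sanity check (not a proof): discretize the problem on a finite
# grid, solve the constrained minimization with SciPy, and compare the result
# with the conjectured closed form. Grid size, the shapes of p_p and p_n, and
# the values of pi and pi_hat are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 20                                    # grid points; halves = disjoint supports

p_p = np.zeros(n); p_p[: n // 2] = rng.random(n // 2); p_p /= p_p.sum()
p_n = np.zeros(n); p_n[n // 2 :] = rng.random(n // 2); p_n /= p_n.sum()

pi, pi_hat = 0.3, 0.4                     # requires pi_hat > pi
p_u = pi * p_p + (1 - pi) * p_n

def kl(a, b):
    """Discrete KL divergence, with the convention 0 * log(0/.) = 0."""
    mask = a > 0
    return float(np.sum(a[mask] * np.log(a[mask] / b[mask])))

def jsd(a, b):
    """Discrete Jensen-Shannon divergence."""
    m = 0.5 * (a + b)
    return 0.5 * kl(a, m) + 0.5 * kl(b, m)

# Minimize J(p) = JSD(p_u || p)  subject to  sum(p) = 1  and  p >= pi_hat * p_p.
res = minimize(
    lambda p: jsd(p_u, p),
    x0=np.full(n, 1.0 / n),
    method="SLSQP",
    bounds=[(0.0, 1.0)] * n,
    constraints=[
        {"type": "eq",   "fun": lambda p: p.sum() - 1.0},
        {"type": "ineq", "fun": lambda p: p - pi_hat * p_p},
    ],
)

# Conjectured minimizer: pi_hat * p_p on supp(p_p), (1 - pi_hat) * p_n on supp(p_n).
p_star = pi_hat * p_p + (1 - pi_hat) * p_n

print("numerical minimum   :", jsd(p_u, res.x))
print("conjectured value   :", jsd(p_u, p_star))
print("max |p_num - p_star|:", np.abs(res.x - p_star).max())
```

If the conjectured form is correct, the two printed objective values should agree up to solver tolerance on such toy instances.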
functional-analysis optimization calculus-of-variations lagrange-multiplier
asked Nov 7 at 11:40 by Mathieu, edited Nov 16 at 12:36