Prove $f(x)=x|x|$ is differentiable
I am trying to prove that $f:\mathbb{R}^2\rightarrow\mathbb{R}^2$,
$$f(x)=x|x|$$
is differentiable, as part of a larger task.
I think there are two ways to approach this:

1. By proving that for every $x\in\mathbb{R}^2$ there is a linear function $A$ for which
$$f(x+h)=f(x)+Ah+|h|\epsilon(h)$$
where $\epsilon(h)\rightarrow0$ when $h\rightarrow0$.
2. By proving that all of the first-order partial derivatives of $f$ exist and are continuous.

For the sake of my own understanding, I would like to know how to prove this both ways. Here are my attempts so far:
- $f(x+h)=(x_{1}\sqrt{x_{1}^2+x_{2}^2}+h_{1},\,x_{2}\sqrt{x_{1}^2+x_{2}^2}+h_{2})=(x_{1}\sqrt{x_{1}^2+x_{2}^2},\,x_{2}\sqrt{x_{1}^2+x_{2}^2})+(h_{1},h_{2})=f(x)+(h_{1},h_{2})$
I don't know how to go on with this, since I'm not sure how I'm supposed to choose $A$. If $f$ is differentiable, $A$ should be $Df(x)$, but I don't know how I should manipulate the expression to achieve that.
- For the second way, I'm not sure how to write out the general form of the first-order partial derivatives of $f$.
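As a numerical sanity check (an illustrative sketch, not part of the original question), the attempted expansion in the first bullet would force $f(x+h)=f(x)+(h_{1},h_{2})$, which a single evaluation already contradicts:

```python
import numpy as np

def f(x):
    # f(x) = x * |x|, with |x| the Euclidean norm
    return x * np.linalg.norm(x)

x = np.array([1.0, 0.0])
h = np.array([0.1, 0.0])

# If f(x + h) = f(x) + h held, this difference would equal h.
print(f(x + h) - f(x))  # ~ [0.21, 0.0], not h = [0.1, 0.0]
```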
multivariable-calculus derivatives partial-derivative
For $x \ne 0$ (see José Carlos Santos's answer for $x = 0$), one could argue that $\mathbb{R}^2 \times \mathbb{R}^2 \to \mathbb{R}$, $(x, y) \mapsto \langle x, y \rangle$ is bilinear, so differentiable, with derivative $(h, k) \mapsto \langle x, k \rangle + \langle h, y \rangle$; so $|x|^2 = \langle x, x \rangle$ is differentiable, with derivative $h \mapsto 2\langle x, h \rangle$; so $|x| = \sqrt{\langle x, x \rangle}$ is differentiable, with derivative $h \mapsto \langle x, h \rangle/|x|$; and scalar multiplication is bilinear, so $f'(x)(h) = \langle x, h \rangle x/|x| + |x|h$.
– Calum Gilhooley
Nov 18 at 13:53
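The closed form in this comment can be checked numerically; the following sketch (with an arbitrarily chosen point and increment) confirms that $f(x+h) - f(x) \approx \langle x, h \rangle x/|x| + |x|h$ for small $h$:

```python
import numpy as np

def f(x):
    # f(x) = x * |x|, with |x| the Euclidean norm
    return x * np.linalg.norm(x)

def df(x, h):
    # Candidate derivative from the comment: <x,h> x / |x| + |x| h  (x != 0)
    r = np.linalg.norm(x)
    return (x @ h) * x / r + r * h

x = np.array([3.0, 4.0])       # arbitrary nonzero point, |x| = 5
h = np.array([1e-6, -2e-6])    # small increment
print(np.allclose(f(x + h) - f(x), df(x, h), atol=1e-9))
```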
asked Nov 18 at 12:32
Joe
1 Answer
Part 1.
Here is a derivation (no pun intended) of $f'(x)(h)$ from first
principles. It is valid not just in $\mathbb{R}^2$, but in any real inner
product space $E$, not necessarily even finite-dimensional. ($E$ is
not even assumed to be complete; but if it isn't, then I don't think
one is allowed to speak of $f$ being "differentiable" at $x$.)
First, we make some estimates:
(i) By the Triangle Inequality,
$\lvert\lVert x + h \rVert - \lVert x \rVert\rvert \leqslant
\lVert h \rVert$.
(ii) If $x \ne 0$, then by (i), as $h \to 0$,
$$
\left\lvert\frac{2}{\lVert x + h \rVert + \lVert x \rVert}
- \frac{1}{\lVert x \rVert}\right\rvert =
\frac{\lvert\lVert x \rVert - \lVert x + h \rVert\rvert}
{(\lVert x + h \rVert + \lVert x \rVert)\lVert x \rVert}
\leqslant \frac{\lVert h \rVert}{\lVert x \rVert^2} =
O(\lVert h \rVert).
$$
(iii) The Cauchy-Schwarz inequality
$\lvert\left\langle x, h \right\rangle\rvert \leqslant
\lVert x \rVert \lVert h \rVert$ gives
$\left\langle x, h \right\rangle = O(\lVert h \rVert)$.
Now, for all $x, h \in E$,
\begin{align*}
f(x + h) - f(x) & = \lVert x + h \rVert(x + h) - \lVert x \rVert x
\\ & = \lVert x \rVert h +
(\lVert x + h \rVert - \lVert x \rVert)(x + h)
\\ & = \lVert x \rVert h +
(\lVert x + h \rVert - \lVert x \rVert)x + O(\lVert h \rVert^2),
&& \text{by (i).}
\end{align*}
This proves that $f'(x)(h) = 0$ for all $h$ when $x = 0$. From now
on, we assume that $x \ne 0$.
\begin{gather*}
f(x + h) - f(x) - \lVert x \rVert h =
\frac{\lVert x + h \rVert^2 - \lVert x \rVert^2}
{\lVert x + h \rVert + \lVert x \rVert}x +
O(\lVert h \rVert^2)
\\ =
\frac{2\left\langle x, h \right\rangle + \lVert h \rVert^2}
{\lVert x + h \rVert + \lVert x \rVert}x +
O(\lVert h \rVert^2)
=
\frac{2}{\lVert x + h \rVert + \lVert x \rVert}
\left\langle x, h \right\rangle x +
O(\lVert h \rVert^2)
\\ =
\frac{\left\langle x, h \right\rangle}{\lVert x \rVert}x +
\left(\frac{2}{\lVert x + h \rVert + \lVert x \rVert} -
\frac{1}{\lVert x \rVert}\right)
\left\langle x, h \right\rangle x +
O(\lVert h \rVert^2).
\end{gather*}
Therefore, by (ii) and (iii),
$$
f(x + h) = f(x) + \lVert x \rVert h +
\frac{\left\langle x, h \right\rangle}{\lVert x \rVert}x +
O(\lVert h \rVert^2).
$$
This agrees with the formula for $f'(x)(h)$ in my earlier brief comment.
(The main result used there - apart from the Chain Rule, and the formula for
the derivative of the square root function on $\mathbb{R}_{>0}$ - is
the Fréchet derivative of a bilinear map.)
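As an illustrative check of Part 1 (a sketch with arbitrarily chosen test values, not part of the original answer), shrinking $h$ by a factor of $10$ should shrink the $O(\lVert h \rVert^2)$ remainder by a factor of roughly $100$:

```python
import numpy as np

def f(x):
    # f(x) = x * |x|, with |x| the Euclidean norm
    return x * np.linalg.norm(x)

def remainder(x, h):
    # Norm of f(x+h) - f(x) minus the derived linear term
    # |x| h + <x,h> x / |x|; this should be O(|h|^2).
    r = np.linalg.norm(x)
    linear = r * h + (x @ h) * x / r
    return np.linalg.norm(f(x + h) - f(x) - linear)

x = np.array([1.0, 2.0])      # arbitrary nonzero point
h = np.array([0.3, -0.1])     # arbitrary increment
r1 = remainder(x, h)
r2 = remainder(x, h / 10)
print(r1 / r2)  # roughly 100, confirming quadratic decay
```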
Part 2.
For simplicity [but at some risk of confusion with the earlier use of the symbol
'$x$'!], I'll use the notation $(x, y)$ instead of $(x_1, x_2)$, and write
$$
(u, v) = f(x, y) = r(x, y) = (rx, ry),
\text{ where } r = \sqrt{x^2 + y^2}.
$$
The case $(x, y) = (0, 0)$ was dealt with in an earlier answer, but as that
answer has now been deleted, I'll go over the same ground here.
We have $f(h, 0) = (h|h|, 0)$, $f(0, k) = (0, k|k|)$, and so
\begin{align*}
|u(h, 0)| & = h^2, & v(h, 0) & = 0, \\
|v(0, k)| & = k^2, & u(0, k) & = 0,
\end{align*}
showing that the partial derivatives
$D_1u(0, 0)$, $D_1v(0, 0)$, $D_2v(0, 0)$, $D_2u(0, 0)$ exist and are all zero.
Assume now that $(x, y) \ne (0, 0)$. Then $r > 0$, and
$\partial r/\partial x = x/r$, $\partial r/\partial y = y/r$, whence
$$
\begin{pmatrix}
\frac{\partial u}{\partial x} & \frac{\partial u}{\partial y} \\
\frac{\partial v}{\partial x} & \frac{\partial v}{\partial y}
\end{pmatrix}
\begin{pmatrix} h \\ k \end{pmatrix} =
\begin{pmatrix}
\frac{x^2}{r} + r & \frac{xy}{r} \\
\frac{xy}{r} & \frac{y^2}{r} + r
\end{pmatrix}
\begin{pmatrix} h \\ k \end{pmatrix} =
r \begin{pmatrix} h \\ k \end{pmatrix} +
\frac{xh + yk}{r} \begin{pmatrix} x \\ y \end{pmatrix},
$$
in agreement with the previous result.
In a convenient but admittedly loose notation, simply denoting the
separate convergence of all four matrix entries,
$$
\lim_{(x, y) \to (0, 0)}
\begin{pmatrix}
\frac{\partial u}{\partial x} & \frac{\partial u}{\partial y} \\
\frac{\partial v}{\partial x} & \frac{\partial v}{\partial y}
\end{pmatrix} =
\lim_{(x, y) \to (0, 0)}
\begin{pmatrix}
\frac{x^2}{r} + r & \frac{xy}{r} \\
\frac{xy}{r} & \frac{y^2}{r} + r
\end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix},
$$
showing that all four partial derivatives are continuous everywhere. $\square$
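The Jacobian computed in Part 2 can be compared against central finite differences at an arbitrary nonzero point (a numerical sketch, not part of the original answer):

```python
import numpy as np

def f(p):
    # f(x, y) = (r x, r y) with r = sqrt(x^2 + y^2)
    x, y = p
    r = np.hypot(x, y)
    return np.array([r * x, r * y])

def jacobian(p):
    # Closed-form Jacobian from Part 2 (valid for p != 0)
    x, y = p
    r = np.hypot(x, y)
    return np.array([[x * x / r + r, x * y / r],
                     [x * y / r,     y * y / r + r]])

p = np.array([0.6, -0.8])  # arbitrary nonzero point, r = 1
eps = 1e-6
# Central finite differences, one coordinate direction per column
num = np.column_stack([
    (f(p + eps * e) - f(p - eps * e)) / (2 * eps)
    for e in np.eye(2)
])
print(np.allclose(num, jacobian(p), atol=1e-6))  # True
```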
edited Nov 22 at 20:32
answered Nov 22 at 10:52
Calum Gilhooley