Prove $f(x)=x|x|$ is differentiable











I am trying to prove that $f:\mathbb{R}^2\rightarrow\mathbb{R}^2$,
$$f(x)=x|x|$$
is differentiable, as part of a larger task.

I think there are two ways to approach this:

  1. By proving that for every $x\in\mathbb{R}^2$ there is a linear function $A$ for which
     $$f(x+h)=f(x)+Ah+|h|\epsilon(h),$$
     where $\epsilon(h)\rightarrow0$ as $h\rightarrow0$.

  2. By proving that all of the first-order partial derivatives of $f$ exist and are continuous.

For the sake of my own understanding, I would like to know how to prove this in both ways. Here are my attempts so far:

  1. $f(x+h)=(x_{1}\sqrt{x_{1}^2+x_{2}^2}+h_{1},x_{2}\sqrt{x_{1}^2+x_{2}^2}+h_{2})=(x_{1}\sqrt{x_{1}^2+x_{2}^2},x_{2}\sqrt{x_{1}^2+x_{2}^2})+(h_{1},h_{2})=f(x)+(h_{1},h_{2})$

     I don't know how to go on with this, since I'm not sure how I'm supposed to choose $A$. If $f$ is differentiable, $A$ should be $Df(x)$, but I don't know how to manipulate the expression to achieve that.

  2. For the second way, I'm not sure how to write out the general form of the first-order partial derivatives of $f$.









  • For $x \ne 0$ (see José Carlos Santos's answer for $x = 0$), one could argue that $\mathbb{R}^2 \times \mathbb{R}^2 \to \mathbb{R}$, $(x, y) \mapsto \langle x, y \rangle$ is bilinear, so differentiable, with derivative $(h, k) \mapsto \langle x, k \rangle + \langle h, y \rangle$; so $|x|^2 = \langle x, x \rangle$ is differentiable, with derivative $h \mapsto 2\langle x, h \rangle$; so $|x| = \sqrt{\langle x, x \rangle}$ is differentiable, with derivative $h \mapsto \langle x, h \rangle/|x|$; and scalar multiplication is bilinear, so $f'(x)(h) = \langle x, h \rangle x/|x| + |x|h$.
    – Calum Gilhooley
    Nov 18 at 13:53
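As a quick numerical sanity check of the formula in this comment (a sketch, not part of the argument: the sample point, direction, and step size below are arbitrary choices), one can compare the candidate derivative $f'(x)(h) = \langle x, h \rangle x/|x| + |x|h$ against a central finite difference:

```python
import math

def f(x):
    # f(x) = |x| * x on R^2, with x = (x1, x2)
    r = math.hypot(x[0], x[1])
    return (r * x[0], r * x[1])

def df(x, h):
    # candidate derivative from the comment: f'(x)(h) = <x,h>/|x| * x + |x| * h  (x != 0)
    r = math.hypot(x[0], x[1])
    dot = x[0] * h[0] + x[1] * h[1]
    return (dot / r * x[0] + r * h[0], dot / r * x[1] + r * h[1])

x, h, t = (1.0, 2.0), (0.3, -0.4), 1e-6
fp = f((x[0] + t * h[0], x[1] + t * h[1]))
fm = f((x[0] - t * h[0], x[1] - t * h[1]))
# central difference approximation of the directional derivative at x in direction h
fd = ((fp[0] - fm[0]) / (2 * t), (fp[1] - fm[1]) / (2 * t))
cand = df(x, h)
assert all(abs(fd[i] - cand[i]) < 1e-5 for i in range(2))
```

The two vectors agree to within the finite-difference error, as the formula predicts.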















multivariable-calculus derivatives partial-derivative






asked Nov 18 at 12:32









Joe

1 Answer
Part 1.
Here is a derivation (no pun intended) of $f'(x)(h)$ from first
principles. It is valid not just in $\mathbb{R}^2$, but in any real inner
product space $E$, not necessarily even finite-dimensional. ($E$ is
not even assumed to be complete; but if it isn't, then I don't think
one is allowed to speak of $f$ being "differentiable" at $x$.)

First, we make some estimates:

(i) By the Triangle Inequality,
$\lvert\lVert x + h \rVert - \lVert x \rVert\rvert \leqslant \lVert h \rVert$.

(ii) If $x \ne 0$, then by (i), as $h \to 0$,
$$
\left\lvert\frac{2}{\lVert x + h \rVert + \lVert x \rVert}
- \frac{1}{\lVert x \rVert}\right\rvert =
\frac{\lvert\lVert x \rVert - \lVert x + h \rVert\rvert}
{(\lVert x + h \rVert + \lVert x \rVert)\lVert x \rVert}
\leqslant \frac{\lVert h \rVert}{\lVert x \rVert^2} =
O(\lVert h \rVert).
$$

(iii) The Cauchy-Schwarz inequality
$\lvert\langle x, h \rangle\rvert \leqslant \lVert x \rVert \lVert h \rVert$
gives
$\langle x, h \rangle = O(\lVert h \rVert)$.

Now, for all $x, h \in E$,
\begin{align*}
f(x + h) - f(x) & = \lVert x + h \rVert(x + h) - \lVert x \rVert x
\\ & = \lVert x \rVert h +
(\lVert x + h \rVert - \lVert x \rVert)(x + h)
\\ & = \lVert x \rVert h +
(\lVert x + h \rVert - \lVert x \rVert)x + O(\lVert h \rVert^2),
&& \text{by (i).}
\end{align*}

This proves that $f'(x)(h) = 0$ for all $h$ when $x = 0$. From now
on, we assume that $x \ne 0$.
\begin{gather*}
f(x + h) - f(x) - \lVert x \rVert h =
\frac{\lVert x + h \rVert^2 - \lVert x \rVert^2}
{\lVert x + h \rVert + \lVert x \rVert}x +
O(\lVert h \rVert^2)
\\ =
\frac{2\langle x, h \rangle + \lVert h \rVert^2}
{\lVert x + h \rVert + \lVert x \rVert}x +
O(\lVert h \rVert^2)
=
\frac{2}{\lVert x + h \rVert + \lVert x \rVert}
\langle x, h \rangle x +
O(\lVert h \rVert^2)
\\ =
\frac{\langle x, h \rangle}{\lVert x \rVert}x +
\left(\frac{2}{\lVert x + h \rVert + \lVert x \rVert} -
\frac{1}{\lVert x \rVert}\right)
\langle x, h \rangle x +
O(\lVert h \rVert^2).
\end{gather*}

Therefore, by (ii) and (iii),
$$
f(x + h) = f(x) + \lVert x \rVert h +
\frac{\langle x, h \rangle}{\lVert x \rVert}x +
O(\lVert h \rVert^2).
$$

This agrees with the formula for $f'(x)(h)$ in my earlier brief comment.
(The main result used there - apart from the Chain Rule, and the formula for
the derivative of the square root function on $\mathbb{R}_{>0}$ - is the
formula for the Fréchet derivative of a bilinear map.)
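The expansion above can be illustrated numerically (a sketch, with an arbitrarily chosen point and shrinking steps): the remainder $f(x+h) - f(x) - \lVert x \rVert h - \langle x, h \rangle x / \lVert x \rVert$ should stay bounded by a constant times $\lVert h \rVert^2$ as $h \to 0$.

```python
import math

def f(x):
    # f(x) = ||x|| x on R^2
    r = math.hypot(x[0], x[1])
    return (r * x[0], r * x[1])

def remainder(x, h):
    # f(x+h) - f(x) - ||x|| h - <x,h>/||x|| x, claimed to be O(||h||^2)
    r = math.hypot(x[0], x[1])
    dot = x[0] * h[0] + x[1] * h[1]
    fxh = f((x[0] + h[0], x[1] + h[1]))
    fx = f(x)
    return tuple(fxh[i] - fx[i] - r * h[i] - dot / r * x[i] for i in range(2))

x = (1.0, 2.0)
ratios = []
for k in range(1, 5):
    s = 10.0 ** (-k)
    h = (0.3 * s, -0.4 * s)
    rem = remainder(x, h)
    hn = math.hypot(h[0], h[1])
    ratios.append(math.hypot(rem[0], rem[1]) / hn ** 2)
# ||remainder|| / ||h||^2 stays bounded as ||h|| shrinks by four orders of magnitude
assert all(0.0 <= rt < 5.0 for rt in ratios)
```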





Part 2.
For simplicity [but at some risk of confusion with the earlier use of the symbol
'$x$'!], I'll use the notation $(x, y)$ instead of $(x_1, x_2)$, and write
$$
(u, v) = f(x, y) = r(x, y) = (rx, ry),
\text{ where } r = \sqrt{x^2 + y^2}.
$$

The case $(x, y) = (0, 0)$ was dealt with in an earlier answer, but as that
answer has now been deleted, I'll go over the same ground here.

We have $f(h, 0) = (h|h|, 0)$ and $f(0, k) = (0, k|k|)$, and so
\begin{align*}
|u(h, 0)| & = h^2, & v(h, 0) & = 0, \\
|v(0, k)| & = k^2, & u(0, k) & = 0,
\end{align*}
showing that the partial derivatives
$D_1u(0, 0)$, $D_1v(0, 0)$, $D_2v(0, 0)$, $D_2u(0, 0)$ exist and are all zero.

Assume now that $(x, y) \ne (0, 0)$. Then $r > 0$, and
$\partial r/\partial x = x/r$, $\partial r/\partial y = y/r$, whence
$$
\begin{pmatrix}
\frac{\partial u}{\partial x} & \frac{\partial u}{\partial y} \\
\frac{\partial v}{\partial x} & \frac{\partial v}{\partial y}
\end{pmatrix}
\begin{pmatrix} h \\ k \end{pmatrix} =
\begin{pmatrix}
\frac{x^2}{r} + r & \frac{xy}{r} \\
\frac{xy}{r} & \frac{y^2}{r} + r
\end{pmatrix}
\begin{pmatrix} h \\ k \end{pmatrix} =
r \begin{pmatrix} h \\ k \end{pmatrix} +
\frac{xh + yk}{r} \begin{pmatrix} x \\ y \end{pmatrix},
$$
in agreement with the previous result.

In a convenient but admittedly loose notation, simply denoting the
separate convergence of all four matrix entries,
$$
\lim_{(x, y) \to (0, 0)}
\begin{pmatrix}
\frac{\partial u}{\partial x} & \frac{\partial u}{\partial y} \\
\frac{\partial v}{\partial x} & \frac{\partial v}{\partial y}
\end{pmatrix} =
\lim_{(x, y) \to (0, 0)}
\begin{pmatrix}
\frac{x^2}{r} + r & \frac{xy}{r} \\
\frac{xy}{r} & \frac{y^2}{r} + r
\end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix},
$$
showing that all four partial derivatives are continuous everywhere. $\square$
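The Jacobian entries derived in Part 2 can be spot-checked against central finite differences (a sketch; the sample point and step size are arbitrary choices, and the check only applies away from the origin where the closed forms hold):

```python
import math

def f(x, y):
    # f(x, y) = r (x, y) with r = sqrt(x^2 + y^2)
    r = math.hypot(x, y)
    return (r * x, r * y)

def jacobian(x, y):
    # from Part 2: [[x^2/r + r, xy/r], [xy/r, y^2/r + r]], valid for (x, y) != (0, 0)
    r = math.hypot(x, y)
    return ((x * x / r + r, x * y / r),
            (x * y / r, y * y / r + r))

x, y, t = 1.0, 2.0, 1e-6
# central differences for each of the four partial derivatives
du_dx = (f(x + t, y)[0] - f(x - t, y)[0]) / (2 * t)
du_dy = (f(x, y + t)[0] - f(x, y - t)[0]) / (2 * t)
dv_dx = (f(x + t, y)[1] - f(x - t, y)[1]) / (2 * t)
dv_dy = (f(x, y + t)[1] - f(x, y - t)[1]) / (2 * t)
J = jacobian(x, y)
assert abs(du_dx - J[0][0]) < 1e-5 and abs(du_dy - J[0][1]) < 1e-5
assert abs(dv_dx - J[1][0]) < 1e-5 and abs(dv_dy - J[1][1]) < 1e-5
```

Note also that the matrix is symmetric, as the closed forms show, matching the structure $rI + \frac{1}{r}(x, y)(x, y)^{\mathsf T}$ used in the display above.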






share|cite|improve this answer























    Your Answer





    StackExchange.ifUsing("editor", function () {
    return StackExchange.using("mathjaxEditing", function () {
    StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
    StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
    });
    });
    }, "mathjax-editing");

    StackExchange.ready(function() {
    var channelOptions = {
    tags: "".split(" "),
    id: "69"
    };
    initTagRenderer("".split(" "), "".split(" "), channelOptions);

    StackExchange.using("externalEditor", function() {
    // Have to fire editor after snippets, if snippets enabled
    if (StackExchange.settings.snippets.snippetsEnabled) {
    StackExchange.using("snippets", function() {
    createEditor();
    });
    }
    else {
    createEditor();
    }
    });

    function createEditor() {
    StackExchange.prepareEditor({
    heartbeatType: 'answer',
    convertImagesToLinks: true,
    noModals: true,
    showLowRepImageUploadWarning: true,
    reputationToPostImages: 10,
    bindNavPrevention: true,
    postfix: "",
    imageUploader: {
    brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
    contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
    allowUrls: true
    },
    noCode: true, onDemand: true,
    discardSelector: ".discard-answer"
    ,immediatelyShowMarkdownHelp:true
    });


    }
    });














    draft saved

    draft discarded


















    StackExchange.ready(
    function () {
    StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3003478%2fprove-fx-x-x-is-differentiable%23new-answer', 'question_page');
    }
    );

    Post as a guest















    Required, but never shown

























    1 Answer
    1






    active

    oldest

    votes








    1 Answer
    1






    active

    oldest

    votes









    active

    oldest

    votes






    active

    oldest

    votes








    up vote
    1
    down vote













    Part 1.
    Here is a derivation (no pun intended) of $f'(x)(h)$ from first
    principles. It is valid not just in $mathbb{R}^2$, but in any real inner
    product space $E$, not necessarily even finite-dimensional. ($E$ is
    not even assumed to be complete; but if it isn't, then I don't think
    one is allowed to speak of $f$ being "differentiable" at $x$.)



    First, we make some estimates:



    (i) By the Triangle Inequality,
    $lvertlVert x + h rVert - lVert x rVertrvert leqslant
    lVert h rVert$
    .



    (ii) If $x ne 0$, then by (i), as $h to 0$,
    $$
    leftlvertfrac{2}{lVert x + h rVert + lVert x rVert}
    - frac{1}{lVert x rVert}rightrvert =
    frac{lvertlVert x rVert - lVert x + h rVertrvert}
    {(lVert x + h rVert + lVert x rVert)lVert x rVert}
    leqslant frac{lVert h rVert}{lVert x rVert^2} =
    O(lVert h rVert).
    $$



    (iii) The Cauchy-Schwarz inequality
    $lvertleftlangle x, h rightranglervert leqslant
    lVert x rVert lVert h rVert$
    gives
    $leftlangle x, h rightrangle = O(lVert h rVert)$.



    Now, for all $x, h in E$,
    begin{align*}
    f(x + h) - f(x) & = lVert x + h rVert(x + h) - lVert x rVert x
    \ & = lVert x rVert h +
    (lVert x + h rVert - lVert x rVert)(x + h)
    \ & = lVert x rVert h +
    (lVert x + h rVert - lVert x rVert)x + O(lVert h rVert^2),
    && text{by (i).}
    end{align*}

    This proves that $f'(x)(h) = 0$ for all $h$ when $x = 0$. From now
    on, we assume that $x ne 0$.
    begin{gather*}
    f(x + h) - f(x) - lVert x rVert h =
    frac{lVert x + h rVert^2 - lVert x rVert^2}
    {lVert x + h rVert + lVert x rVert}x +
    O(lVert h rVert^2)
    \ =
    frac{2leftlangle x, h rightrangle + lVert h rVert^2}
    {lVert x + h rVert + lVert x rVert}x +
    O(lVert h rVert^2)
    =
    frac{2}{lVert x + h rVert + lVert x rVert}
    leftlangle x, h rightrangle x +
    O(lVert h rVert^2)
    \ =
    frac{leftlangle x, h rightrangle}{lVert x rVert}x +
    left(frac{2}{lVert x + h rVert + lVert x rVert} -
    frac{1}{lVert x rVert}right)
    leftlangle x, h rightrangle x +
    O(lVert h rVert^2).
    end{gather*}

    Therefore, by (ii) and (iii),
    $$
    f(x + h) = f(x) + lVert x rVert h +
    frac{leftlangle x, h rightrangle}{lVert x rVert}x +
    O(lVert h rVert^2).
    $$



    This agrees with the formula for $f'(x)(h)$ in my earlier brief comment.
    (The main result used there - apart from the Chain Rule, and the formula for
    the derivative of the square root function on $mathbb{R}_{>0}$ - is
    Frechet derivative for bilinear map.)





    Part 2.
    For simplicity [but at some risk of confusion with the earlier use of the symbol
    '$x$'!], I'll use the notation $(x, y)$, instead of $(x_1, x_2)$, and write
    $$
    (u, v) = f(x, y) = r(x, y) = (rx, ry),
    text{ where } r = sqrt{x^2 + y^2}.
    $$



    The case $(x, y) = (0, 0)$ was dealt with in an earlier answer, but as that
    answer has now been deleted, I'll go over the same ground here.



    We have $f(h, 0) = (h|h|, 0)$, $f(0, k) = (0, k|k|)$, and so
    begin{align*}
    |u(h, 0)| & = h^2, v(h, 0) = 0, \
    |v(0, k)| & = k^2, u(0, k) = 0,
    end{align*}

    showing that the partial derivatives
    $D_1u(0, 0)$, $D_1v(0, 0), D_2v(0, 0)$, $D_2u(0, 0)$ exist, and are all zero.



    Assume now that $(x, y) ne (0, 0)$. Then $r > 0$, and
    $partial r/partial x = x/r$, $partial r/partial y = y/r$, whence
    $$
    begin{pmatrix}
    frac{partial u}{partial x} & frac{partial u}{partial y} \
    frac{partial v}{partial x} & frac{partial v}{partial y}
    end{pmatrix}
    begin{pmatrix} h \ k end{pmatrix} =
    begin{pmatrix}
    frac{x^2}{r} + r & frac{xy}{r} \
    frac{xy}{r} & frac{y^2}{r} + r
    end{pmatrix}
    begin{pmatrix} h \ k end{pmatrix} =
    r begin{pmatrix} h \ k end{pmatrix} +
    frac{xh + yk}{r} begin{pmatrix} x \ y end{pmatrix},
    $$

    in agreement with the previous result.



    In a convenient but admittedly loose notation, simply denoting the
    separate convergence of all four matrix entries,
    $$
    lim_{(x, y) to (0, 0)}
    begin{pmatrix}
    frac{partial u}{partial x} & frac{partial u}{partial y} \
    frac{partial v}{partial x} & frac{partial v}{partial y}
    end{pmatrix} =
    lim_{(x, y) to (0, 0)}
    begin{pmatrix}
    frac{x^2}{r} + r & frac{xy}{r} \
    frac{xy}{r} & frac{y^2}{r} + r
    end{pmatrix} = begin{pmatrix} 0 & 0 \ 0 & 0 end{pmatrix},
    $$

    showing that all four partial derivatives are continuous everywhere. $square$






    share|cite|improve this answer



























      up vote
      1
      down vote













      Part 1.
      Here is a derivation (no pun intended) of $f'(x)(h)$ from first
      principles. It is valid not just in $mathbb{R}^2$, but in any real inner
      product space $E$, not necessarily even finite-dimensional. ($E$ is
      not even assumed to be complete; but if it isn't, then I don't think
      one is allowed to speak of $f$ being "differentiable" at $x$.)



      First, we make some estimates:



      (i) By the Triangle Inequality,
      $lvertlVert x + h rVert - lVert x rVertrvert leqslant
      lVert h rVert$
      .



      (ii) If $x ne 0$, then by (i), as $h to 0$,
      $$
      leftlvertfrac{2}{lVert x + h rVert + lVert x rVert}
      - frac{1}{lVert x rVert}rightrvert =
      frac{lvertlVert x rVert - lVert x + h rVertrvert}
      {(lVert x + h rVert + lVert x rVert)lVert x rVert}
      leqslant frac{lVert h rVert}{lVert x rVert^2} =
      O(lVert h rVert).
      $$



      (iii) The Cauchy-Schwarz inequality
      $lvertleftlangle x, h rightranglervert leqslant
      lVert x rVert lVert h rVert$
      gives
      $leftlangle x, h rightrangle = O(lVert h rVert)$.



      Now, for all $x, h in E$,
      begin{align*}
      f(x + h) - f(x) & = lVert x + h rVert(x + h) - lVert x rVert x
      \ & = lVert x rVert h +
      (lVert x + h rVert - lVert x rVert)(x + h)
      \ & = lVert x rVert h +
      (lVert x + h rVert - lVert x rVert)x + O(lVert h rVert^2),
      && text{by (i).}
      end{align*}

      This proves that $f'(x)(h) = 0$ for all $h$ when $x = 0$. From now
      on, we assume that $x ne 0$.
      begin{gather*}
      f(x + h) - f(x) - lVert x rVert h =
      frac{lVert x + h rVert^2 - lVert x rVert^2}
      {lVert x + h rVert + lVert x rVert}x +
      O(lVert h rVert^2)
      \ =
      frac{2leftlangle x, h rightrangle + lVert h rVert^2}
      {lVert x + h rVert + lVert x rVert}x +
      O(lVert h rVert^2)
      =
      frac{2}{lVert x + h rVert + lVert x rVert}
      leftlangle x, h rightrangle x +
      O(lVert h rVert^2)
      \ =
      frac{leftlangle x, h rightrangle}{lVert x rVert}x +
      left(frac{2}{lVert x + h rVert + lVert x rVert} -
      frac{1}{lVert x rVert}right)
      leftlangle x, h rightrangle x +
      O(lVert h rVert^2).
      end{gather*}

      Therefore, by (ii) and (iii),
      $$
      f(x + h) = f(x) + lVert x rVert h +
      frac{leftlangle x, h rightrangle}{lVert x rVert}x +
      O(lVert h rVert^2).
      $$



      This agrees with the formula for $f'(x)(h)$ in my earlier brief comment.
      (The main result used there - apart from the Chain Rule, and the formula for
      the derivative of the square root function on $mathbb{R}_{>0}$ - is
      Frechet derivative for bilinear map.)





      Part 2.
      For simplicity [but at some risk of confusion with the earlier use of the symbol
      '$x$'!], I'll use the notation $(x, y)$, instead of $(x_1, x_2)$, and write
      $$
      (u, v) = f(x, y) = r(x, y) = (rx, ry),
      text{ where } r = sqrt{x^2 + y^2}.
      $$



      The case $(x, y) = (0, 0)$ was dealt with in an earlier answer, but as that
      answer has now been deleted, I'll go over the same ground here.



      We have $f(h, 0) = (h|h|, 0)$, $f(0, k) = (0, k|k|)$, and so
      begin{align*}
      |u(h, 0)| & = h^2, v(h, 0) = 0, \
      |v(0, k)| & = k^2, u(0, k) = 0,
      end{align*}

      showing that the partial derivatives
      $D_1u(0, 0)$, $D_1v(0, 0), D_2v(0, 0)$, $D_2u(0, 0)$ exist, and are all zero.



      Assume now that $(x, y) ne (0, 0)$. Then $r > 0$, and
      $partial r/partial x = x/r$, $partial r/partial y = y/r$, whence
      $$
      begin{pmatrix}
      frac{partial u}{partial x} & frac{partial u}{partial y} \
      frac{partial v}{partial x} & frac{partial v}{partial y}
      end{pmatrix}
      begin{pmatrix} h \ k end{pmatrix} =
      begin{pmatrix}
      frac{x^2}{r} + r & frac{xy}{r} \
      frac{xy}{r} & frac{y^2}{r} + r
      end{pmatrix}
      begin{pmatrix} h \ k end{pmatrix} =
      r begin{pmatrix} h \ k end{pmatrix} +
      frac{xh + yk}{r} begin{pmatrix} x \ y end{pmatrix},
      $$

      in agreement with the previous result.



      In a convenient but admittedly loose notation, simply denoting the
      separate convergence of all four matrix entries,
      $$
      lim_{(x, y) to (0, 0)}
      begin{pmatrix}
      frac{partial u}{partial x} & frac{partial u}{partial y} \
      frac{partial v}{partial x} & frac{partial v}{partial y}
      end{pmatrix} =
      lim_{(x, y) to (0, 0)}
      begin{pmatrix}
      frac{x^2}{r} + r & frac{xy}{r} \
      frac{xy}{r} & frac{y^2}{r} + r
      end{pmatrix} = begin{pmatrix} 0 & 0 \ 0 & 0 end{pmatrix},
      $$

      showing that all four partial derivatives are continuous everywhere. $square$






      share|cite|improve this answer

























        up vote
        1
        down vote










        up vote
        1
        down vote









        Part 1.
        Here is a derivation (no pun intended) of $f'(x)(h)$ from first
        principles. It is valid not just in $mathbb{R}^2$, but in any real inner
        product space $E$, not necessarily even finite-dimensional. ($E$ is
        not even assumed to be complete; but if it isn't, then I don't think
        one is allowed to speak of $f$ being "differentiable" at $x$.)



        First, we make some estimates:



        (i) By the Triangle Inequality,
        $lvertlVert x + h rVert - lVert x rVertrvert leqslant
        lVert h rVert$
        .



        (ii) If $x ne 0$, then by (i), as $h to 0$,
        $$
        leftlvertfrac{2}{lVert x + h rVert + lVert x rVert}
        - frac{1}{lVert x rVert}rightrvert =
        frac{lvertlVert x rVert - lVert x + h rVertrvert}
        {(lVert x + h rVert + lVert x rVert)lVert x rVert}
        leqslant frac{lVert h rVert}{lVert x rVert^2} =
        O(lVert h rVert).
        $$



        (iii) The Cauchy-Schwarz inequality
        $lvertleftlangle x, h rightranglervert leqslant
        lVert x rVert lVert h rVert$
        gives
        $leftlangle x, h rightrangle = O(lVert h rVert)$.



        Now, for all $x, h in E$,
        begin{align*}
        f(x + h) - f(x) & = lVert x + h rVert(x + h) - lVert x rVert x
        \ & = lVert x rVert h +
        (lVert x + h rVert - lVert x rVert)(x + h)
        \ & = lVert x rVert h +
        (lVert x + h rVert - lVert x rVert)x + O(lVert h rVert^2),
        && text{by (i).}
        end{align*}

        This proves that $f'(x)(h) = 0$ for all $h$ when $x = 0$. From now
        on, we assume that $x ne 0$.
        begin{gather*}
        f(x + h) - f(x) - lVert x rVert h =
        frac{lVert x + h rVert^2 - lVert x rVert^2}
        {lVert x + h rVert + lVert x rVert}x +
        O(lVert h rVert^2)
        \ =
        frac{2leftlangle x, h rightrangle + lVert h rVert^2}
        {lVert x + h rVert + lVert x rVert}x +
        O(lVert h rVert^2)
        =
        frac{2}{lVert x + h rVert + lVert x rVert}
        leftlangle x, h rightrangle x +
        O(lVert h rVert^2)
        \ =
        frac{leftlangle x, h rightrangle}{lVert x rVert}x +
        left(frac{2}{lVert x + h rVert + lVert x rVert} -
        frac{1}{lVert x rVert}right)
        leftlangle x, h rightrangle x +
        O(lVert h rVert^2).
        end{gather*}

        Therefore, by (ii) and (iii),
        $$
        f(x + h) = f(x) + lVert x rVert h +
        frac{leftlangle x, h rightrangle}{lVert x rVert}x +
        O(lVert h rVert^2).
        $$



        This agrees with the formula for $f'(x)(h)$ in my earlier brief comment.
        (The main result used there - apart from the Chain Rule, and the formula for
        the derivative of the square root function on $mathbb{R}_{>0}$ - is
        Frechet derivative for bilinear map.)





        Part 2.
        For simplicity [but at some risk of confusion with the earlier use of the symbol
        '$x$'!], I'll use the notation $(x, y)$, instead of $(x_1, x_2)$, and write
        $$
        (u, v) = f(x, y) = r(x, y) = (rx, ry),
        text{ where } r = sqrt{x^2 + y^2}.
        $$



        The case $(x, y) = (0, 0)$ was dealt with in an earlier answer, but as that
        answer has now been deleted, I'll go over the same ground here.



        We have $f(h, 0) = (h|h|, 0)$, $f(0, k) = (0, k|k|)$, and so
        begin{align*}
        |u(h, 0)| & = h^2, v(h, 0) = 0, \
        |v(0, k)| & = k^2, u(0, k) = 0,
        end{align*}

        showing that the partial derivatives
        $D_1u(0, 0)$, $D_1v(0, 0), D_2v(0, 0)$, $D_2u(0, 0)$ exist, and are all zero.



        Assume now that $(x, y) ne (0, 0)$. Then $r > 0$, and
        $partial r/partial x = x/r$, $partial r/partial y = y/r$, whence
        $$
        begin{pmatrix}
        frac{partial u}{partial x} & frac{partial u}{partial y} \
        frac{partial v}{partial x} & frac{partial v}{partial y}
        end{pmatrix}
        begin{pmatrix} h \ k end{pmatrix} =
        begin{pmatrix}
        frac{x^2}{r} + r & frac{xy}{r} \
        frac{xy}{r} & frac{y^2}{r} + r
        end{pmatrix}
        begin{pmatrix} h \ k end{pmatrix} =
        r begin{pmatrix} h \ k end{pmatrix} +
        frac{xh + yk}{r} begin{pmatrix} x \ y end{pmatrix},
        $$

        in agreement with the previous result.



        In a convenient but admittedly loose notation, simply denoting the
        separate convergence of all four matrix entries,
        $$
        lim_{(x, y) to (0, 0)}
        begin{pmatrix}
        frac{partial u}{partial x} & frac{partial u}{partial y} \
        frac{partial v}{partial x} & frac{partial v}{partial y}
        end{pmatrix} =
        lim_{(x, y) to (0, 0)}
        begin{pmatrix}
        frac{x^2}{r} + r & frac{xy}{r} \
        frac{xy}{r} & frac{y^2}{r} + r
        end{pmatrix} = begin{pmatrix} 0 & 0 \ 0 & 0 end{pmatrix},
        $$

        showing that all four partial derivatives are continuous everywhere. $square$






        share|cite|improve this answer














        Part 1.
        Here is a derivation (no pun intended) of $f'(x)(h)$ from first
        principles. It is valid not just in $mathbb{R}^2$, but in any real inner
        product space $E$, not necessarily even finite-dimensional. ($E$ is
        not even assumed to be complete; but if it isn't, then I don't think
        one is allowed to speak of $f$ being "differentiable" at $x$.)



        First, we make some estimates:



        (i) By the Triangle Inequality,
        $lvertlVert x + h rVert - lVert x rVertrvert leqslant
        lVert h rVert$
        .



        (ii) If $x ne 0$, then by (i), as $h to 0$,
        $$
        leftlvertfrac{2}{lVert x + h rVert + lVert x rVert}
        - frac{1}{lVert x rVert}rightrvert =
        frac{lvertlVert x rVert - lVert x + h rVertrvert}
        {(lVert x + h rVert + lVert x rVert)lVert x rVert}
        leqslant frac{lVert h rVert}{lVert x rVert^2} =
        O(lVert h rVert).
        $$



        (iii) The Cauchy-Schwarz inequality
        $lvertleftlangle x, h rightranglervert leqslant
        lVert x rVert lVert h rVert$
        gives
        $leftlangle x, h rightrangle = O(lVert h rVert)$.



        Now, for all $x, h in E$,
        begin{align*}
        f(x + h) - f(x) & = lVert x + h rVert(x + h) - lVert x rVert x
        \ & = lVert x rVert h +
        (lVert x + h rVert - lVert x rVert)(x + h)
        \ & = lVert x rVert h +
        (lVert x + h rVert - lVert x rVert)x + O(lVert h rVert^2),
        && text{by (i).}
        end{align*}

        This proves that $f'(x)(h) = 0$ for all $h$ when $x = 0$. From now
        on, we assume that $x ne 0$.
        \begin{gather*}
        f(x + h) - f(x) - \lVert x \rVert h =
        \frac{\lVert x + h \rVert^2 - \lVert x \rVert^2}
        {\lVert x + h \rVert + \lVert x \rVert}x +
        O(\lVert h \rVert^2)
        \\ =
        \frac{2\left\langle x, h \right\rangle + \lVert h \rVert^2}
        {\lVert x + h \rVert + \lVert x \rVert}x +
        O(\lVert h \rVert^2)
        =
        \frac{2}{\lVert x + h \rVert + \lVert x \rVert}
        \left\langle x, h \right\rangle x +
        O(\lVert h \rVert^2)
        \\ =
        \frac{\left\langle x, h \right\rangle}{\lVert x \rVert}x +
        \left(\frac{2}{\lVert x + h \rVert + \lVert x \rVert} -
        \frac{1}{\lVert x \rVert}\right)
        \left\langle x, h \right\rangle x +
        O(\lVert h \rVert^2).
        \end{gather*}

        Therefore, by (ii) and (iii),
        $$
        f(x + h) = f(x) + \lVert x \rVert h +
        \frac{\left\langle x, h \right\rangle}{\lVert x \rVert}x +
        O(\lVert h \rVert^2).
        $$
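        The formula can also be sanity-checked numerically in $\mathbb{R}^2$ (this is no part of the proof): for a small $h$, the difference $f(x+h)-f(x)$ should agree with $\lVert x \rVert h + \frac{\langle x, h \rangle}{\lVert x \rVert}x$ up to $O(\lVert h \rVert^2)$. The helper names `f` and `df` below are mine, not from the question.

```python
import numpy as np

def f(x):
    # f(x) = |x| x
    return np.linalg.norm(x) * x

def df(x, h):
    # Claimed derivative at x != 0: f'(x)(h) = |x| h + (<x, h>/|x|) x
    r = np.linalg.norm(x)
    return r * h + (np.dot(x, h) / r) * x

x = np.array([1.0, 2.0])
h = np.array([1e-6, -2e-6])
exact = f(x + h) - f(x)
approx = df(x, h)
# The remainder should be of order |h|^2, i.e. far smaller than |h|
print(np.linalg.norm(exact - approx))
```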



        This agrees with the formula for $f'(x)(h)$ in my earlier brief comment.
        (The main result used there, apart from the Chain Rule and the formula for
        the derivative of the square root function on $\mathbb{R}_{>0}$, is
        the formula for the Fréchet derivative of a bilinear map.)





        Part 2.
        For simplicity [but at some risk of confusion with the earlier use of the symbol
        '$x$'!], I'll use the notation $(x, y)$, instead of $(x_1, x_2)$, and write
        $$
        (u, v) = f(x, y) = r(x, y) = (rx, ry),
        \text{ where } r = \sqrt{x^2 + y^2}.
        $$



        The case $(x, y) = (0, 0)$ was dealt with in an earlier answer, but as that
        answer has now been deleted, I'll go over the same ground here.



        We have $f(h, 0) = (h\lvert h \rvert, 0)$, $f(0, k) = (0, k\lvert k \rvert)$, and so
        \begin{align*}
        |u(h, 0)| & = h^2, & v(h, 0) & = 0, \\
        |v(0, k)| & = k^2, & u(0, k) & = 0,
        \end{align*}

        whence, e.g., $u(h, 0)/h = \lvert h \rvert \to 0$ as $h \to 0$, and similarly
        for the other three difference quotients, showing that the partial derivatives
        $D_1u(0, 0)$, $D_1v(0, 0)$, $D_2u(0, 0)$, $D_2v(0, 0)$ exist, and are all zero.
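        As a small numerical illustration (again, no part of the proof), the one-sided difference quotients at the origin do tend to zero; the helper name `f` is mine.

```python
import numpy as np

def f(v):
    # f(v) = |v| v
    return np.linalg.norm(v) * v

# Difference quotients at the origin, e.g. u(h, 0)/h = |h| -> 0 as h -> 0.
for h in [1e-2, 1e-4, 1e-6]:
    quotients = np.concatenate((f(np.array([h, 0.0])) / h,
                                f(np.array([0.0, h])) / h))
    print(h, quotients)
```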



        Assume now that $(x, y) \ne (0, 0)$. Then $r > 0$, and
        $\partial r/\partial x = x/r$, $\partial r/\partial y = y/r$, whence
        $$
        \begin{pmatrix}
        \frac{\partial u}{\partial x} & \frac{\partial u}{\partial y} \\
        \frac{\partial v}{\partial x} & \frac{\partial v}{\partial y}
        \end{pmatrix}
        \begin{pmatrix} h \\ k \end{pmatrix} =
        \begin{pmatrix}
        \frac{x^2}{r} + r & \frac{xy}{r} \\
        \frac{xy}{r} & \frac{y^2}{r} + r
        \end{pmatrix}
        \begin{pmatrix} h \\ k \end{pmatrix} =
        r \begin{pmatrix} h \\ k \end{pmatrix} +
        \frac{xh + yk}{r} \begin{pmatrix} x \\ y \end{pmatrix},
        $$

        in agreement with the previous result.
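        If it helps, this Jacobian can be checked against central finite differences (a quick numerical sketch; the function names below are mine). Note that the matrix is exactly $r I + \frac{1}{r} \binom{x}{y}\binom{x}{y}^{\!\top}$.

```python
import numpy as np

def f(v):
    # f(v) = |v| v
    return np.linalg.norm(v) * v

def jacobian_formula(v):
    # J = r I + (1/r) v v^T, i.e. the matrix displayed above (r = |v| > 0)
    r = np.linalg.norm(v)
    return r * np.eye(2) + np.outer(v, v) / r

def jacobian_numeric(v, eps=1e-6):
    # Central-difference approximation of the Jacobian of f at v
    J = np.zeros((2, 2))
    for j in range(2):
        e = np.zeros(2)
        e[j] = eps
        J[:, j] = (f(v + e) - f(v - e)) / (2 * eps)
    return J

v = np.array([0.6, -0.8])
print(np.max(np.abs(jacobian_formula(v) - jacobian_numeric(v))))
```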



        In a convenient but admittedly loose notation, simply denoting the
        separate convergence of all four matrix entries (each of which is bounded
        in absolute value by $2r$),
        $$
        \lim_{(x, y) \to (0, 0)}
        \begin{pmatrix}
        \frac{\partial u}{\partial x} & \frac{\partial u}{\partial y} \\
        \frac{\partial v}{\partial x} & \frac{\partial v}{\partial y}
        \end{pmatrix} =
        \lim_{(x, y) \to (0, 0)}
        \begin{pmatrix}
        \frac{x^2}{r} + r & \frac{xy}{r} \\
        \frac{xy}{r} & \frac{y^2}{r} + r
        \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix},
        $$

        showing that all four partial derivatives are continuous at the origin; away
        from the origin they are clearly continuous, being built from $x$, $y$, and
        $r > 0$ by arithmetic operations. $\square$
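        As a final numerical illustration (not needed for the proof), the entries of the Jacobian do shrink like $2r$ as $(x, y) \to (0, 0)$; the helper name `jacobian` is mine.

```python
import numpy as np

def jacobian(v):
    # Jacobian of f(v) = |v| v for v != 0: r I + (1/r) v v^T
    r = np.linalg.norm(v)
    return r * np.eye(2) + np.outer(v, v) / r

# Each entry is bounded in absolute value by 2|v|, so the matrix tends to 0.
for t in [1e-1, 1e-3, 1e-6]:
    v = t * np.array([0.6, -0.8])
    print(t, np.max(np.abs(jacobian(v))))
```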







        edited Nov 22 at 20:32

























        answered Nov 22 at 10:52









        Calum Gilhooley
