Proof that a matrix is invertible if and only if it meets this property.











We have $A\in \operatorname{Mat}_n(K)$ and I am told to prove that $A\in GL_n(K)$ if and only if it meets the following condition:
if $A\begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix}=\begin{pmatrix} 0 \\ \vdots \\ 0 \end{pmatrix}$, then $x_1=\dots=x_n=0$.



As a tip, I am told to consider a basis $\beta$ of a vector space $V$ of dimension $n$ and $f\in \operatorname{End}(V)$ whose associated matrix is $A$.



I don't know how to use that "tip" to prove that.
Thanks in advance.


  • Suppose that $A$ is invertible and $Ax=0$. Then $x=A^{-1}Ax=A^{-1}(0)=0$ as required.
    – Dietrich Burde
    Nov 16 at 19:03


Tags: linear-algebra


asked Nov 16 at 19:01 by Andarrkor
edited Nov 16 at 20:20 by Bernard


3 Answers
Accepted answer (score 2), by Bernard, answered Nov 16 at 20:27, edited Nov 17 at 19:53:









As indicated by your tip, consider the endomorphism $f$ represented by this matrix in a given basis of a $K$-vector space $V$ of dimension $n$.

Remember that $A$ is invertible if and only if $f$ is an isomorphism (more exactly, an automorphism, since $f$ is an endomorphism).

Now, in a finite-dimensional space, $f$ is an automorphism if and only if it is injective (and also if and only if it is surjective).

Now, how do you characterise an injective linear map?

Some more details:

$f$ is injective (hence bijective) if and only if $\ker f=\{0\}$, i.e. if and only if
$$f(v)=0 \implies v=0.$$
What is the relation between $f(v)$ and $A$?


  • A linear transformation is injective when $\ker f = \{0_V\}$. But I do not know how to get to the matrix form from there.
    – Andarrkor
    Nov 16 at 20:34










  • I've added some details. Is that clearer now?
    – Bernard
    Nov 16 at 20:42










  • Sorry for asking so many questions, but there is something that I don't get. To get the images of the linear transformation $f$, with $v$ any vector of $V$ and $\{v_1,\dots,v_n\}$ a basis of $V$, we compute $f(v) = (v_1 \dots v_n)A\begin{bmatrix}\lambda_1\\ \vdots \\ \lambda_n\end{bmatrix}$, where $\begin{bmatrix}\lambda_1\\ \vdots \\ \lambda_n\end{bmatrix}$ are the coordinates of $v$ in the basis $\{v_1,\dots,v_n\}$. I don't understand what the relation between that and $Ax=0$ is.
    – Andarrkor
    Nov 16 at 21:02










  • Let's call $(x_1, x_2,\dots, x_n)$ the (unknown) coordinates of a vector $v$ in the kernel. Then $f(v)=A\begin{bmatrix}x_1\\x_2\\ \vdots \\x_n\end{bmatrix}$. So $v\in\ker f$ means the coordinates of $v$ satisfy the linear system of equations defined by $A$.
    – Bernard
    Nov 16 at 21:09












  • Thanks for answering. But what happens with the matrix with the base vectors $(v_1...v_n)$ that are also in the multiplication?
    – Andarrkor
    Nov 17 at 1:34
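The correspondence discussed in this comment thread can be written out as a short derivation. This is an editorial sketch, not part of the original thread, using the basis $\beta=(v_1,\dots,v_n)$ from the question and writing $A=(a_{ij})$, so that $f(v_i)=\sum_j a_{ji}v_j$:

```latex
f(v) = f\Bigl(\sum_{i=1}^{n} x_i v_i\Bigr)
     = \sum_{i=1}^{n} x_i f(v_i)
     = \sum_{i=1}^{n} x_i \sum_{j=1}^{n} a_{ji} v_j
     = \sum_{j=1}^{n} \Bigl(\sum_{i=1}^{n} a_{ji} x_i\Bigr) v_j .
```

Since the $v_j$ are linearly independent, $f(v)=0$ exactly when $\sum_i a_{ji}x_i=0$ for every $j$, i.e. when $Ax=0$. Hence $\ker f=\{0\}$ is equivalent to "$Ax=0$ implies $x=0$", and $A\in GL_n(K)$ is equivalent to $f$ being an automorphism.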
Answer (score 0), by The Count, answered Nov 16 at 19:16:
To get you started: a matrix, viewed as a linear map, is invertible if and only if it is injective and surjective. If $Ax=0$ for some non-zero vector $x$, then $A$ is not injective, so it is not invertible. The contrapositive is that $A$ being invertible implies $Ax=0$ only for $x=0$.
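This non-injectivity direction can also be checked concretely. Below is a minimal sketch in plain Python, not part of the original thread; the matrix `A` and the helper `rank` are illustrative choices. The third column of `A` is the sum of the first two, so the non-zero vector $x=(1,1,-1)$ satisfies $Ax=0$, and the rank comes out as $2<3$.

```python
from fractions import Fraction

def rank(M):
    """Rank of a small matrix, computed exactly by Gaussian elimination."""
    M = [[Fraction(v) for v in row] for row in M]
    n_rows, n_cols = len(M), len(M[0])
    r = 0  # number of pivots found so far
    for c in range(n_cols):
        pivot = next((i for i in range(r, n_rows) if M[i][c] != 0), None)
        if pivot is None:
            continue  # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(n_rows):
            if i != r and M[i][c] != 0:
                factor = M[i][c] / M[r][c]
                M[i] = [a - factor * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# The third column of A is the sum of the first two, so the non-zero
# vector x = (1, 1, -1) satisfies Ax = 0 and A cannot be invertible.
A = [[1, 2, 3],
     [0, 1, 1],
     [2, 0, 2]]
x = [1, 1, -1]

Ax = [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]
print(Ax)        # [0, 0, 0] -> non-trivial kernel
print(rank(A))   # 2, i.e. rank < n, so A is singular
```

Exact `Fraction` arithmetic is used so the rank computation has no floating-point pivoting issues on small integer matrices.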
Answer (score 0), by Euler Pythagoras, answered Nov 16 at 19:40:
If $A$ is invertible, then using the rank-nullity theorem we can prove that $\operatorname{Ker}(A) = \{0\}$, so if $Ax = 0$ then $x=0$.

Conversely, if $A$ meets the condition

if $A\begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix}=\begin{pmatrix} 0 \\ \vdots \\ 0 \end{pmatrix}$ then $x_1=\dots=x_n=0$,

it means that $\operatorname{Ker}(A) = \{0\}$. Using the rank-nullity theorem again, we can prove that $\operatorname{Rank}(A) = n$, which ensures that $A$ is invertible.
  • We have learnt the rank-nullity theorem only for linear maps, not for matrices. So I have to use the linear map whose associated matrix is $A$ to prove that. I know that there is a link between them, since matrices and linear maps are isomorphic, but I do not know how to "pass" from matrices to linear maps and vice versa. Hope you understood my problem.
    – Andarrkor
    Nov 16 at 19:56
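For readers who want the rank-nullity step in this answer spelled out, here is an editorial sketch (not part of the original thread), with $f$ the endomorphism of $V$ whose matrix in the chosen basis is $A$:

```latex
\dim \ker f + \operatorname{rank} f = \dim V = n .
```

If the condition holds, then $\ker f=\{0\}$, so $\operatorname{rank} f = n$; thus $f$ is surjective, hence bijective, and $A$ is invertible. Conversely, if $A$ is invertible then $f$ is bijective, so $\ker f=\{0\}$ and $Ax=0$ forces $x=0$.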