Proving that the input space of a surjective linear transformation has a dimension at least as large as the output space











Say we have a linear transformation $T : V \to W$. We know that $T$ is surjective, or “onto $W$”.



I'm trying to prove that $n = \dim(V) \geq m = \dim(W)$. Intuitively, this makes sense: $V$ would have to be at least as “large” as $W$ in order for every vector in $W$ to be the image of at least one vector in $V$.



I'm having trouble “wrapping up” this proof. Here's what I have.





Let $x \in V$, $y \in W$.



We know $x$ must be a linear combination of some basis for $V$. Say that basis is $\{v_1, \cdots, v_n\}$, so that $x = c_1v_1 + \cdots + c_nv_n$.



From the properties of a linear transformation we know that $T(x) = c_1T(v_1) + \cdots + c_nT(v_n) = y$.



This tells me that $y$ is a linear combination of the set $\{T(v_1), \cdots, T(v_n)\}$. This means that some (non-strict) subset of $\{T(v_1), \cdots, T(v_n)\}$ must form a basis for $W$. A basis for $W$ must contain exactly $m$ linearly independent vectors. Thus, $n \geq m$.





My gut says this feels incomplete. I feel like I've assumed that the set $\{T(v_1), \cdots, T(v_n)\}$ is linearly independent because we know $\{v_1, \cdots, v_n\}$ is, but I haven't shown that. Any hints?
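(For a concrete instance of this worry, take the projection $T\colon\mathbb{R}^3\to\mathbb{R}^2$, $(x,y,z)\mapsto(x,y)$: it is surjective, yet $T(e_3)=0$, so the images of a basis need not be linearly independent. A minimal numpy sketch of this example, with the matrix chosen purely for illustration:)

```python
import numpy as np

# Projection T : R^3 -> R^2, (x, y, z) |-> (x, y). Surjective,
# but the images of the standard basis are NOT independent: T(e3) = 0.
T = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

images = T @ np.eye(3)                 # columns are T(e1), T(e2), T(e3)
print(images)                          # third column is the zero vector
print(np.linalg.matrix_rank(images))   # 2: the images still span R^2
```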










linear-algebra linear-transformations






asked Nov 17 at 20:00 by Emily Horsman, edited Nov 17 at 20:55

  • Can you use the rank-nullity formula?
    – Bernard
    Nov 17 at 20:12














2 Answers

















Your idea is good: if $\{v_1,\dots,v_n\}$ is a basis of $V$, then $\{T(v_1),\dots,T(v_n)\}$ is a spanning set for $W$, from which you can extract a basis. Therefore $\dim W\le\dim V$.



How can you extract a basis? Suppose that $\{u_1,\dots,u_r\}$ is a spanning set of the vector space $U$. Then the set is either linearly dependent or linearly independent. In the latter case you're finished. Otherwise, one vector is a linear combination of the others; without loss of generality, it can be taken to be $u_r$, and then $\{u_1,\dots,u_{r-1}\}$ is again a spanning set. Repeat until you have to stop because the set you get is linearly independent.
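Here is a minimal numpy sketch of the same extraction idea, phrased the other way around: instead of discarding dependent vectors, keep each vector that enlarges the span. The helper name `extract_basis` and the rank-based dependence test are illustrative choices, not part of the answer.

```python
import numpy as np

def extract_basis(spanning):
    """Keep each vector that is independent of those kept so far;
    the result spans the same space and is linearly independent."""
    basis = []
    for v in spanning:
        candidate = np.array(basis + [v])
        # v enlarges the span iff appending it increases the rank
        if np.linalg.matrix_rank(candidate) == len(candidate):
            basis.append(v)
    return basis

# Spanning set of R^2 with a redundant vector: [1, 1] = [1, 0] + [0, 1]
print(extract_basis([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]))
# -> [[1.0, 0.0], [0.0, 1.0]]
```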





The rank-nullity theorem tells even more: if $T\colon V\to W$ is a linear map, then
$$
\dim V=\dim\ker T+\dim\operatorname{im}T
$$
(where $\ker T$ is the kernel and $\operatorname{im}T$ is the image). If $T$ is surjective, then $\dim W=\dim V-\dim\ker T\le\dim V$.
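As a quick numeric sanity check of the formula (a sketch; the $2\times 3$ matrix is an arbitrary full-row-rank example, not taken from the question):

```python
import numpy as np

# A surjective T : R^3 -> R^2 given by a full-row-rank 2x3 matrix.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])

rank = np.linalg.matrix_rank(A)        # dim im T
nullity = A.shape[1] - rank            # dim ker T, by rank-nullity
print(rank, nullity)                   # 2 1
assert rank + nullity == A.shape[1]    # dim V = dim ker T + dim im T
assert rank <= A.shape[1]              # dim W <= dim V for surjective T
```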






answered Nov 17 at 21:23 by egreg (accepted)
  • "If $T$ is surjective, then $dim W = dim V - dim ker T$" Does this imply that $T$ maps any vector in $V$ to either a vector in $W$ or to 0?
    – Emily Horsman
    Nov 17 at 21:34












  • @EmilyHorsman Isn't $0$ a vector in $W$? Don't read too much into the formula: it's a numeric relation between dimensions.
    – egreg
    Nov 17 at 21:34












  • Ah right, that's true.
    – Emily Horsman
    Nov 17 at 21:34










  • Okay here's my summary to ensure I understand. $\text{rank}(T) = \dim(V) - \text{nullity}(T)$. $\dim(W) = \text{rank}(T)$ since $T$ is surjective and thus every vector in $W$ is the image of a vector in $V$. $\dim(W) = \dim(V) - \text{nullity}(T) \leq \dim(V)$.
    – Emily Horsman
    Nov 17 at 21:52












  • @EmilyHorsman Right so.
    – egreg
    Nov 17 at 22:03


















I would start from a basis of $W$: $w_1,\ldots,w_m$. Since $T$ is surjective, there are $v_1,\ldots,v_m$ such that $w_k=T(v_k)$. It is straightforward to verify from the definition that $v_1,\ldots,v_m$ are linearly independent:
$$
\sum c_kv_k=0\implies T\big(\sum c_kv_k\big)=\sum c_k w_k=0\implies \text{all }c_k=0,
$$
hence $n\ge m$.
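A minimal numpy illustration of this construction (the particular surjective map and the use of the pseudoinverse to pick preimages are illustrative choices, not part of the answer):

```python
import numpy as np

# A surjective T : R^3 -> R^2 (full row rank), so n = 3 and m = 2.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])

# Pick preimages v_k of the standard basis w_k of R^2: since A has
# full row rank, A @ pinv(A) = I, so the columns of pinv(A) qualify.
V = np.linalg.pinv(A)                  # 3x2; A @ V[:, k] = w_k
print(np.allclose(A @ V, np.eye(2)))   # True: T(v_k) = w_k
print(np.linalg.matrix_rank(V))        # 2: the v_k are independent, so n >= m
```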






answered Nov 17 at 21:34 by A.Γ.