My computer doesn't recognize the monitor's resolution on some VGA cables, what's going on?

My Mac doesn't recognize my TV's resolution with (so far) three VGA cables I've tried, but it works just fine with other, seemingly old, VGA cables. My roommate had the same experience on Linux Mint.



What's the difference between VGA cables? What should I look for when buying one? Are they still being made?










Tags: display, macbook, vga, cable, external-display






asked Jul 21 '15 at 1:42 – Leo Jweda












  • TVs don't commonly have VGA, and I've generally had to set it manually. What sort of Mac? Macs haven't had VGA out in years so what sort of adaptors are you using?
    – Journeyman Geek
    Jul 21 '15 at 2:51










  • @JourneymanGeek It's a 3-year-old Samsung TV, relatively low-end. It has an HDMI port, which I use for my RPi, and a VGA port, which I use for my 2013 MacBook Air. I have an Apple adapter that converts, I believe, DisplayPort to VGA.
    – Leo Jweda
    Jul 21 '15 at 2:53










  • I half suspect the adaptor may come into play. I'd also add that, as of 2015, no major computer manufacturer is still supporting VGA or DVI, so chances are those cables will stop being made in a few years. You can still find some systems with support, but in general DisplayPort is the preferred standard for monitors, and HDMI for TVs. PCs will output both, of course.
    – Journeyman Geek
    Jul 21 '15 at 3:14










  • @JourneymanGeek Like I said: in the exact same setup, with the exact same adapters, some cables work and others don't. It's most definitely NOT the adapter. Cute dog, BTW.
    – Leo Jweda
    Jul 21 '15 at 3:17


















3 Answers
6 votes, accepted
As @Wyzard says, computers talk to monitors via DDC, and this uses 4 pins on the VGA connector. See the pinout on Wikipedia.

A working VGA cable needs all the pins wired independently of each other, and also independently of the cable shielding, which connects to the chassis, i.e. the metal shells of the plugs and sockets. Only pins 4 and 11 may be missing or left unwired.

Inside a monitor there is a small EEPROM holding information on the monitor's resolutions (the EDID). This EEPROM is accessible even when the monitor is powered off, as it is powered independently from the computer by 5 volts on pin 9, with ground on pins 5 and 10. The EEPROM is read using I²C data and clock signals on pins 12 and 15. These days, with DDC, the I²C bus can do other things too.
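
As an aside on what that EEPROM holds: the EDID is a 128-byte block, and the monitor's preferred resolution lives in its first detailed timing descriptor. Here is a minimal parsing sketch in Python; the offsets follow the standard EDID 1.x layout, and obtaining the raw bytes is left out (see the sketch under the next answer for one way).

    def preferred_mode(edid: bytes):
        """Return (width, height) from the first detailed timing
        descriptor of a 128-byte EDID 1.x base block, or None."""
        d = edid[54:72]                       # first 18-byte descriptor
        if d[0] == 0 and d[1] == 0:           # zero pixel clock means this
            return None                       # is not a timing descriptor
        width = d[2] | ((d[4] & 0xF0) << 4)   # horizontal active pixels
        height = d[5] | ((d[7] & 0xF0) << 4)  # vertical active lines
        return width, height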



The analogue video signals on pins 1, 2, and 3 have independent analogue grounds on pins 6, 7, and 8.

Some bad VGA cables lack these analogue grounds and the I²C ground, and assume the cable shielding will do instead. This often does not work. The only way to check a VGA cable is with a continuity tester (multimeter), verifying that every pin is wired through and that no pins are connected in common, including to the shield.
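
For reference when buzzing a cable out, here is the pinout described above gathered into a small Python checklist. It is only a sketch: the pin functions are the standard DE-15 assignments, and the example wired-pin set at the bottom is hypothetical.

    # Standard VGA (DE-15) pin functions, per the answer above.
    VGA_PINS = {
        1: "red video",         2: "green video",       3: "blue video",
        4: "ID2 (optional)",    5: "ground",            6: "red ground",
        7: "green ground",      8: "blue ground",       9: "+5 V DDC power",
        10: "ground",           11: "ID0 (optional)",   12: "DDC data (I2C SDA)",
        13: "horizontal sync",  14: "vertical sync",    15: "DDC clock (I2C SCL)",
    }
    OPTIONAL = {4, 11}  # the only pins that may legitimately be unwired

    def report_missing(wired):
        """List required pins absent from the set that buzzed through."""
        for pin, function in sorted(VGA_PINS.items()):
            if pin not in wired and pin not in OPTIONAL:
                print(f"pin {pin:2d} not wired: {function}")

    # Example: a cheap cable that skips the analogue and DDC grounds.
    report_missing({1, 2, 3, 9, 12, 13, 14, 15})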






answered Jul 21 '15 at 7:27 – meuh
  • Update: I got a C2G cable that works. Thanks.
    – Leo Jweda
    Jul 24 '15 at 3:41


















4 votes
Detecting a monitor's resolution depends on information provided by the monitor itself via DDC. In a VGA cable, this uses several pins and wires that are separate from the ones used for the actual video signal.



A cheap VGA cable might be missing some of these wires (to save on costs), or the wires might be present but broken (due to wear and tear).
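
One quick way to tell whether the DDC wires in a particular cable are intact is to check whether the operating system received a valid EDID at all. A minimal sketch, assuming Linux and the kernel's DRM sysfs interface (connector path names vary by GPU):

    import glob

    # The kernel exposes the raw EDID it read from each connector.
    for path in glob.glob("/sys/class/drm/card*-*/edid"):
        with open(path, "rb") as f:
            edid = f.read()
        # A valid EDID starts with a fixed 8-byte header, and each
        # 128-byte block sums to zero modulo 256.
        if edid[:8] == bytes.fromhex("00ffffffffffff00"):
            ok = sum(edid[:128]) % 256 == 0
            print(f"{path}: EDID present, checksum {'OK' if ok else 'BAD'}")
        else:
            print(f"{path}: no EDID read (DDC wires missing or broken?)")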






answered Jul 21 '15 at 2:47 – Wyzard
  • How do I find a cable that has all the wires? I tried cheap ones from Amazon and a $20+ one from a brick-and-mortar store, and none of them worked. The only ones that have worked are old ones, for some reason.
    – Leo Jweda
    Jul 21 '15 at 2:54










  • Don't know, sorry. I've never encountered a cable with that problem, but I haven't used VGA cables in a long time.
    – Wyzard
    Jul 21 '15 at 3:18


















0 votes
Simple answer: stop using VGA cables; use HDMI or DisplayPort.

Long answer: the quality and length of VGA cables have a lot to do with what resolutions they can carry.






answered Jul 21 '15 at 2:03 – nwgat
  • I'm aware of the drawbacks of using VGA cables, but my current setup requires me to use one. I highly doubt it's the length, because I've tried different lengths and that did not seem to have an effect on the quality. After all, my setup only requires about 2 feet.
    – Leo Jweda
    Jul 21 '15 at 2:11










  • However, note that a quality VGA cable (and most other kinds of cable) need not cost a fortune nor have gold-plated pins. The important factors are being wired properly with the right gauge of wire, and being shielded.
    – martineau
    Jul 21 '15 at 9:29










  • Length also plays a big role; longer cables lose signal.
    – nwgat
    Jul 21 '15 at 10:00










