PCWorld Forums


VGA vs. DVI

#1 gamersim17

  • Advanced Member
  • Group: Members
  • Posts: 127
  • Joined: 09-December 12
  • Location: Wisconsin

Posted 27 February 2013 - 05:30 PM

Hi there, I was just wondering: is there a quality difference between VGA and DVI? If not, what is the difference between the two?

Thanks,

Adam B. A.K.A. gamersim17

#2 LiveBrianD

  • Elite
  • Group: Members
  • Posts: 12,210
  • Joined: 31-December 09
  • Location: ::1

Posted 27 February 2013 - 05:35 PM

VGA is analog (and thus slightly lower quality), and should generally be avoided if possible. DVI, HDMI (which is electrically compatible with DVI btw, making adapters between the two cheap and passive), and DisplayPort (as well as Mini DisplayPort) are all digital.
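A toy sketch of the principle behind that advice (nothing here models real VGA electronics; the hop counts and noise level are made up for illustration): every analog hop adds a little noise to the signal, while a digital hop delivers the bits exactly as sent.

```python
import random

random.seed(0)  # reproducible run

def analog_hop(samples, noise=0.01):
    """One analog cable hop: every sample picks up a little Gaussian noise."""
    return [s + random.gauss(0.0, noise) for s in samples]

def digital_hop(samples):
    """One digital (DVI/HDMI-style) hop: the bits arrive exactly as sent."""
    return list(samples)

line = [i / 255 for i in range(256)]  # one scanline of brightness levels, 0.0-1.0

a = d = line
for _ in range(3):  # e.g. source -> KVM switch -> extender -> monitor
    a = analog_hop(a)
    d = digital_hop(d)

analog_err = max(abs(x - y) for x, y in zip(a, line))
digital_err = max(abs(x - y) for x, y in zip(d, line))
print(f"worst-case analog error after 3 hops:  {analog_err:.4f}")
print(f"worst-case digital error after 3 hops: {digital_err:.4f}")
```

The digital error stays at exactly zero no matter how many hops you add, which is why DVI/HDMI either looks perfect or fails visibly, while VGA degrades gradually.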
"The Internet will be used for all kinds of spurious things, including fake quotes from smart people." -Albert Einstein
Need a Windows ISO image?

#3 Szczecinianin

  • Advanced Member
  • Group: Members
  • Posts: 441
  • Joined: 12-January 11
  • Location: Szczecin, Poland

Posted 28 February 2013 - 01:05 AM

Hi,

VGA is [b]analog only[/b], whereas DVI can be DVI-A (analog), DVI-D (digital) or DVI-I, which carries both. This last option is the most flexible and is what makes the interface better. It's also possible to use an HDMI adapter with DVI, so it is by no means archaic.


#5 gamersim17

  • Advanced Member
  • Group: Members
  • Posts: 127
  • Joined: 09-December 12
  • Location: Wisconsin

Posted 28 February 2013 - 07:21 AM

Awesome, thanks for clearing that up; this information has been very helpful.

Yours Truly,

Adam B. A.K.A. gamersim17

#6 mjd420nova

  • Expert
  • Group: Members
  • Posts: 3,340
  • Joined: 05-August 06
  • Location: Fremont, California

Posted 28 February 2013 - 11:00 AM

I don't have many clients that play games, but most do some video editing, and their desktop units span around five generations of processors. There was a point where DVI became available alongside a VGA port on upper-end video cards. Then it kind of stalled out before the HDTV craze and the HDMI standard was accepted. DVI was the first step beyond 640 x 480 and into the HD realm, and the monitor market began to feed the frenzy with widescreen displays (1440 x 900). In the home video market it started with composite video and expanded to component (RGB: red-green-blue), and then an entry by VCR makers opened up S-Video and the beginnings of the HD market. HDMI was the first video interface standard that had handshaking with attached devices.

#7 waldojim

  • Elite
  • Group: Members
  • Posts: 16,412
  • Joined: 29-October 08
  • Location: Texas

Posted 04 March 2013 - 07:54 PM

mjd420nova, on 28 February 2013 - 11:00 AM, said:

I don't have many clients that play games, but most do some video editing, and their desktop units span around five generations of processors. There was a point where DVI became available alongside a VGA port on upper-end video cards. Then it kind of stalled out before the HDTV craze and the HDMI standard was accepted. DVI was the first step beyond 640 x 480 and into the HD realm, and the monitor market began to feed the frenzy with widescreen displays (1440 x 900). In the home video market it started with composite video and expanded to component (RGB: red-green-blue), and then an entry by VCR makers opened up S-Video and the beginnings of the HD market. HDMI was the first video interface standard that had handshaking with attached devices.


For what little it is worth, VGA (well, actually into the XGA realm) scaled up to 2048x1536 on that analog interface, well beyond most DVI/HDMI-based monitors sold these days. With a quality cable and a decent enough video card, it can scale even beyond that.
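That 2048x1536 figure is easy to sanity-check with rough pixel-clock arithmetic. The 60 Hz refresh and ~30% blanking overhead below are assumptions typical of CRT timings, not exact VESA numbers:

```python
# Rough pixel-clock math for a 2048x1536 mode. The refresh rate and blanking
# overhead are assumptions; exact VESA timing figures differ slightly.
width, height, refresh = 2048, 1536, 60
blanking_overhead = 1.30  # ~30% of each frame is spent in blanking intervals

pixel_clock_mhz = width * height * refresh * blanking_overhead / 1e6
print(f"approx. pixel clock needed: {pixel_clock_mhz:.0f} MHz")

# Single-link DVI tops out at a 165 MHz pixel clock, while the RAMDACs feeding
# the VGA port on cards of that era were commonly rated around 400 MHz.
print("fits single-link DVI (165 MHz)?", pixel_clock_mhz <= 165)  # False
print("fits a 400 MHz VGA RAMDAC?", pixel_clock_mhz <= 400)       # True
```

Under these assumptions the mode needs roughly a 245 MHz pixel clock, which is why it was out of reach for single-link DVI yet fine over a good VGA cable.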
"There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.'" -- Isaac Asimov

Steam Machine: MSI 970A-G46, AMD Phenom 955 @ 4.0Ghz, 8GB Gskill ram @1600mhz, 128GB Plextor M5s, EVGA GTX 550Ti
Laptop: Alienware 14, Intel i7-4700MQ, 8GB DDR3 ram, Nvidia GTX 765M 4GB DDR5, Plextor M3 256GB SSD, 1080P IPS display, Killer GigE, Killer 1202 wifi
Hackintosh: Gigabyte H61m-HD2, Celeron G1610, 4GB Patriot ram @1333Mhz, Asus GT210, WD 1TB Black, Silverstone ES50 500watt PSU, OS-X Mountain Liion
0

#8 mjd420nova

  • Expert
  • Group: Members
  • Posts: 3,340
  • Joined: 05-August 06
  • Location: Fremont, California

Posted 05 March 2013 - 07:21 PM

waldojim, on 04 March 2013 - 07:54 PM, said:

For what little it is worth, VGA (well, actually into the XGA realm) scaled up to 2048x1536 on that analog interface, well beyond most DVI/HDMI-based monitors sold these days. With a quality cable and a decent enough video card, it can scale even beyond that.



XGA was a perplexing setup from the beginning, as users had to buy an expensive monitor ($2,000) and then have it modified to the tune of another $1,500. They were very impressive. The first ones I saw were for Apple machines and had a whopping 64 MB of video memory. It was accompanied by a software program called Knowledgeware. It was the first stage of photo editing on a computer. And noisy, wow; most users had to move the units into another room separate from their studios. Some of the monitors were huge (for that time) at 35 inches. Originally a color TV, but the tuners were removed and the video bandwidth expanded from the normal 3.58 MHz to over 12 MHz. And it got HOT; I even saw melted cases. I don't think monitors should have fans, but there's always the exception. They used RGB to match the color circuits on the TV. HDMI has now become the universal interface for consumer electronics and has bridged the audio/video separation, allowing one cable to handle both.
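Those bandwidth figures line up with a common analog-video rule of thumb: horizontal resolution scales with bandwidth. The 52.6 µs active-line time below is the NTSC value, and this is only a back-of-the-envelope estimate; real resolution depends on the whole signal chain.

```python
# Back-of-the-envelope: each cycle of video bandwidth can render one
# light/dark pair across the active portion of a scanline.
ACTIVE_LINE_US = 52.6  # NTSC active line time, microseconds

def lines_of_resolution(bandwidth_mhz, aspect=4 / 3):
    total_lines = 2 * bandwidth_mhz * ACTIVE_LINE_US  # across the full width
    return total_lines / aspect                       # TV lines per picture height

print(f"~{lines_of_resolution(3.58):.0f} TVL at 3.58 MHz (stock color TV)")
print(f"~{lines_of_resolution(12):.0f} TVL at 12 MHz (the modified monitors)")
```

By this estimate the stock 3.58 MHz set resolves under 300 TV lines, while the 12 MHz modification pushes toward a thousand, consistent with the jump in picture quality described above.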

This post has been edited by mjd420nova: 05 March 2013 - 07:23 PM


#9 waldojim

  • Elite
  • Group: Members
  • Posts: 16,412
  • Joined: 29-October 08
  • Location: Texas

Posted 06 March 2013 - 10:15 AM

mjd420nova, on 05 March 2013 - 07:21 PM, said:

...to the tune of another $1,500. They were very impressive. The first ones I saw were for Apple machines and had a whopping 64 MB of video memory. It was accompanied by a software program called Knowledgeware. It was the first stage of photo editing on a computer. And noisy, wow; most users had to move the units into another room separate from their studios. Some of the monitors were huge (for that time) at 35 inches. Originally a color TV, but the tuners were removed and the video bandwidth expanded from the normal 3.58 MHz to over 12 MHz. And it got HOT; I even saw melted cases. I don't think monitors should have fans, but there's always the exception. They used RGB to match the color circuits on the TV. HDMI has now become the universal interface for consumer electronics and has bridged the audio/video separation, allowing one cable to handle both.

As we both know, prices drop quickly as technology becomes more available. My trusty CRT was a 21" ViewSonic with a 1600x1200 resolution, and it was purely analog. That was about $300 new, as I recall.

Technology evolves rapidly; even the analog side does.
"There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.'" -- Isaac Asimov

Steam Machine: MSI 970A-G46, AMD Phenom 955 @ 4.0Ghz, 8GB Gskill ram @1600mhz, 128GB Plextor M5s, EVGA GTX 550Ti
Laptop: Alienware 14, Intel i7-4700MQ, 8GB DDR3 ram, Nvidia GTX 765M 4GB DDR5, Plextor M3 256GB SSD, 1080P IPS display, Killer GigE, Killer 1202 wifi
Hackintosh: Gigabyte H61m-HD2, Celeron G1610, 4GB Patriot ram @1333Mhz, Asus GT210, WD 1TB Black, Silverstone ES50 500watt PSU, OS-X Mountain Liion
0

#10 LiveBrianD

  • Elite
  • Group: Members
  • Posts: 12,210
  • Joined: 31-December 09
  • Location: ::1

Posted 06 March 2013 - 11:22 AM

In that regard, it's a wonder that PCs have been so stagnant in screen resolution (particularly compared to phones). All those 15" 1366x768 laptops look terrible, IMO, and I'm not seeing many signs of improvement.

#11 waldojim

  • Elite
  • Group: Members
  • Posts: 16,412
  • Joined: 29-October 08
  • Location: Texas

Posted 06 March 2013 - 07:07 PM

LiveBrianD, on 06 March 2013 - 11:22 AM, said:

In that regard, it's a wonder that PCs have been so stagnant in screen resolution (particularly compared to phones). All those 15" 1366x768 laptops look terrible, IMO, and I'm not seeing many signs of improvement.

People have settled into the "good enough" groove and really don't seem to care. Call it how you will.

At this point, I want a monitor that goes far above and beyond what I really need. I would like to go with a very large, 3D-capable, high-resolution beastie... if possible.
"There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.'" -- Isaac Asimov

Steam Machine: MSI 970A-G46, AMD Phenom 955 @ 4.0Ghz, 8GB Gskill ram @1600mhz, 128GB Plextor M5s, EVGA GTX 550Ti
Laptop: Alienware 14, Intel i7-4700MQ, 8GB DDR3 ram, Nvidia GTX 765M 4GB DDR5, Plextor M3 256GB SSD, 1080P IPS display, Killer GigE, Killer 1202 wifi
Hackintosh: Gigabyte H61m-HD2, Celeron G1610, 4GB Patriot ram @1333Mhz, Asus GT210, WD 1TB Black, Silverstone ES50 500watt PSU, OS-X Mountain Liion
0

#12 mjd420nova

  • Expert
  • Group: Members
  • Posts: 3,340
  • Joined: 05-August 06
  • Location: Fremont, California

Posted 07 March 2013 - 04:35 PM

I have to second that opinion. Consumers become complacent with what they have until it dies or they find a new game that the old machine can't run. Then their eyes bug out and they start to drool. A week later they wonder how they could ever have looked at 640 x 480. It's like the difference between SD TV and HDTV; makes me want to clean my glasses. How did we watch that stuff??

#13 LiveBrianD

  • Elite
  • Group: Members
  • Posts: 12,210
  • Joined: 31-December 09
  • Location: ::1

Posted 07 March 2013 - 05:32 PM

waldojim, on 06 March 2013 - 07:07 PM, said:

LiveBrianD, on 06 March 2013 - 11:22 AM, said:

In that regard, it's a wonder that PCs have been so stagnant in screen resolution (particularly compared to phones). All those 15" 1366x768 laptops look terrible, IMO, and I'm not seeing many signs of improvement.

People have settled into the "good enough" groove and really don't seem to care. Call it how you will.

At this point, I want a monitor that goes far above and beyond what I really need. I would like to go with a very large, 3D-capable, high-resolution beastie... if possible.


I still wonder, though: when 5-year-old PCs are good enough for most tasks, what motivates manufacturers to keep improving them and making them faster, unlike most monitors?

#14 waldojim

  • Elite
  • Group: Members
  • Posts: 16,412
  • Joined: 29-October 08
  • Location: Texas

Posted 10 March 2013 - 07:03 PM

mjd420nova, on 07 March 2013 - 04:35 PM, said:

I have to second that opinion. Consumers become complacent with what they have until it dies or they find a new game that the old machine can't run. Then their eyes bug out and they start to drool. A week later they wonder how they could ever have looked at 640 x 480. It's like the difference between SD TV and HDTV; makes me want to clean my glasses. How did we watch that stuff??

Truthfully, I think we have a different problem these days. Analog signals (in their purest forms) have very few real limitations. It is entirely possible to capture an image on analog media, transmit it (in analog form), and reproduce it while maintaining enough detail that it would be virtually indistinguishable from a digital SHD signal. Truth be told, the only giveaway would be the absolute lack of digital artifacts.

It is amazing how much we gave up moving to digital, and how much of that people just accept...
"There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.'" -- Isaac Asimov

Steam Machine: MSI 970A-G46, AMD Phenom 955 @ 4.0Ghz, 8GB Gskill ram @1600mhz, 128GB Plextor M5s, EVGA GTX 550Ti
Laptop: Alienware 14, Intel i7-4700MQ, 8GB DDR3 ram, Nvidia GTX 765M 4GB DDR5, Plextor M3 256GB SSD, 1080P IPS display, Killer GigE, Killer 1202 wifi
Hackintosh: Gigabyte H61m-HD2, Celeron G1610, 4GB Patriot ram @1333Mhz, Asus GT210, WD 1TB Black, Silverstone ES50 500watt PSU, OS-X Mountain Liion
0

#15 waldojim

  • Elite
  • Group: Members
  • Posts: 16,412
  • Joined: 29-October 08
  • Location: Texas

Posted 10 March 2013 - 07:08 PM

LiveBrianD, on 07 March 2013 - 05:32 PM, said:

I still wonder, though: when 5-year-old PCs are good enough for most tasks, what motivates manufacturers to keep improving them and making them faster, unlike most monitors?

There is always a desire for more power, or bigger monitors/TVs. That is why monitors were 12" some 20 years ago and are upwards of 30" today. The problem is that size and performance are the only two aspects that mean anything to most people, and that is largely what drives the research. Quality is a back-burner item, left for the "pros" who are willing to pay two or even three times the actual value for a display.
"There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.'" -- Isaac Asimov

Steam Machine: MSI 970A-G46, AMD Phenom 955 @ 4.0Ghz, 8GB Gskill ram @1600mhz, 128GB Plextor M5s, EVGA GTX 550Ti
Laptop: Alienware 14, Intel i7-4700MQ, 8GB DDR3 ram, Nvidia GTX 765M 4GB DDR5, Plextor M3 256GB SSD, 1080P IPS display, Killer GigE, Killer 1202 wifi
Hackintosh: Gigabyte H61m-HD2, Celeron G1610, 4GB Patriot ram @1333Mhz, Asus GT210, WD 1TB Black, Silverstone ES50 500watt PSU, OS-X Mountain Liion
0

#16 LiveBrianD

  • Elite
  • Group: Members
  • Posts: 12,210
  • Joined: 31-December 09
  • Location: ::1

Posted 10 March 2013 - 08:18 PM

I guess so. Come to think of it, that's probably for the same reason consumer machines tend to be cheaply built despite having good specs at a low price.

#17 mjd420nova

  • Expert
  • Group: Members
  • Posts: 3,340
  • Joined: 05-August 06
  • Location: Fremont, California

Posted 11 March 2013 - 06:42 AM

That's what drives the industry. The recyclers are loaded with huge plasma and LCD display panels from early units, less than ten years old, that have died already. Low prices, acceptable specs, but dubious reliability. Every store or chain has a "house" brand. The LCD panels are almost exclusively made by Sharp; the plasma elements are mostly Samsung. I'm from the old school, where analog-to-analog transfers come with a loss of some signal content. Converting from one medium to another with consumer-grade equipment, some content is lost, but quite little. In digital format, the signal and its content can be copied EXACTLY, with no loss. There are limits, however, both in the visual and audio ranges. These bandwidth limits are just that: LIMITS. That doesn't exist in the analog world. Will I miss that content?? Depends on whether I've heard/viewed the content in analog form and then in digital media. With audio I can surely tell; not so much with video.
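The copying point above can be illustrated with a toy generation-loss simulation. Tape hiss is modeled as Gaussian noise and the numbers are purely illustrative, not measurements of any real deck:

```python
import math
import random

random.seed(1)  # reproducible run

def analog_dub(signal, noise=0.02):
    """One analog-to-analog copy: a little hiss is added on every generation."""
    return [s + random.gauss(0.0, noise) for s in signal]

def snr_db(copy, original):
    """Signal-to-noise ratio of a copy relative to the original, in dB."""
    sig = sum(s * s for s in original)
    err = sum((c - s) ** 2 for c, s in zip(copy, original)) or 1e-12
    return 10 * math.log10(sig / err)

# One second of a 440 Hz tone sampled at 8 kHz stands in for "the content".
original = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(8000)]

snrs = []
copy = original
for gen in (1, 2, 3):
    copy = analog_dub(copy)
    snrs.append(snr_db(copy, original))
    print(f"analog generation {gen}: ~{snrs[-1]:.0f} dB SNR")

digital_copy = list(original)  # a digital copy is bit-exact every generation
print("digital copy identical:", digital_copy == original)  # True
```

Each analog generation loses a few dB of SNR, while the digital copy stays bit-identical forever, which matches the EXACT-copy observation above (and the bandwidth cap applies once, at digitization, not per copy).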
