Talk:Video Graphics Array/Archive 1


Memory base

From the article: the video memory for color mode is mapped at 0xb8000-0xbffff. I thought VGA graphics memory started at 0xa0000? At least, in linear (320x200x256) mode, where each byte was one pixel? -- pne 05:07, 26 Aug 2004 (UTC)

  Answer: In graphics mode, yes, it's 0xa0000. Colour text mode is at 0xb8000. -- Funkymonkey.

What about the VESA standard for successors to VGA?

  Feel free to add your own info :)  -- Funkymonkey.
0xb8000-0xbffff is also used for the old CGA color modes (320x200x4, 640x200x2) and text mode. Everything else (320x200x256, 640x480x16, and the EGA 16-color modes) uses 0xa0000-0xaffff. 0xb0000-0xb7fff isn't usually touched by VGAs since it's the MDA text buffer. The VESA extensions to VGA are described over in Super VGA. -lee 17:17, 21 Sep 2004 (UTC)
  0xb0000-0xb7fff is a perfectly valid address space for the VGA when operating in a Mono text mode (Mode 7).  
  -- Funkymonkey
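A quick sketch of the address arithmetic behind the mappings discussed above (the segment bases are the conventional real-mode values; the helper names are just for illustration):

```python
# Conventional PC real-mode frame-buffer windows (physical addresses).
GRAPHICS_BASE   = 0xA0000  # mode 13h, 640x480x16, EGA graphics modes
COLOR_TEXT_BASE = 0xB8000  # colour text modes and the old CGA modes
MONO_TEXT_BASE  = 0xB0000  # MDA / VGA mono text (mode 7)

def mode13h_pixel_addr(x, y):
    """Linear 320x200x256 (mode 13h): one byte per pixel, row-major."""
    return GRAPHICS_BASE + y * 320 + x

def text_cell_addr(row, col):
    """80x25 colour text: two bytes per cell (character, then attribute)."""
    return COLOR_TEXT_BASE + (row * 80 + col) * 2

print(hex(mode13h_pixel_addr(319, 199)))  # last mode 13h pixel: 0xaf9ff
print(hex(text_cell_addr(24, 79)))        # last text cell: 0xb8f9e
```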

Vector Graphics Array

Can anyone tell me how to connect an Xbox 360 console to a monitor using a VGA AV cable? What is meant by a female/female adapter?

Removed 640x400 Mode-X and Direct-X reference

I removed the reference to 800x600 and 640x400 modes, as I'm pretty sure they're not possible using standard VGA hardware. 800x600 maybe at a low refresh rate? Remember the distinction between a clone VGA and a "Super VGA" is blurred; some clone VGAs, such as Oak's OTI037 256K VGA, were capable of 800x600 as I remember. However, this page is about the true-blue IBM original. The main reason these modes should be near impossible (especially 640x400 in 256 colours) on standard VGA hardware is that the video bandwidth (28 MHz max) is too low; the horizontal scan rate would be unacceptably low. I'd love to be proved wrong, however. If someone can demonstrate the CRTC settings for a 640x400 256-colour, or 800x600, mode that would run on an IBM VGA with a multisync monitor, I'd be interested to see it. I also removed the reference to the 'Direct-X' term double buffering; "double buffering" was in use as a term long before the introduction of DirectX. -- Funkymonkey May 8, 2005

Hi Funkymonkey! There was an old MS-DOS program "FRACTINT", which claimed to support output resolutions up to 800x600 (16-color) on a true "IBM VGA adapter." Years ago, I fooled around with the program, but on an SVGA adapter, so I can't verify the program's claims. I do remember "ModeX" allowed up to 360x480 (60Hz) without resorting to outrageous refresh rates -- a handful of MS-DOS games used this mode (my favorite was "Bananoid", a shareware clone of Arkanoid.) http://spanky.triumf.ca/www/fractint/hardware_modes.html#video_notes_anchor June 23rd, 2005.

I read that link and it seems you're right about the 800x600 mode - très cool, and a good find! We should integrate the information from that link back into the VGA page; I think I'll do that! -- Funkymonkey.

I can provide working programs that display 800x600 in 16 colors and 640x400 in 256 colors on a STOCK VGA so I know it is possible. The only drawback is that most monitors couldn't display it (it requires a very low refresh rate). If those were taken out, I'm going to put them back in. Trixter 21:38, 12 January 2006 (UTC)

Cool - I'd love to see those programs, if you can make them available. What monitors have you found are able to sync to these low refresh rates? Thanks. Funkymonkey

I'd have to find the code :-) But fractint is one of the 800x600@16 programs so just grab that and try it in DOS fullscreen. As for monitors, not many unfortunately. Trixter 23:59, 16 January 2006 (UTC)

I'm willing to accept that 800x600 is possible, and it has been shown (Fractint, as you say), but 640x400 in 256 colours seems unlikely due to the very low horizontal frequency that would be involved. In 256-colour modes, the VGA operates at half the usual horizontal pixel rate, as two normal clocks elapse for each pixel. Can we leave out the reference to 640x400 in 256 colours until there is some proof of its existence? --Funkymonkey.

I guess I'm going to have to find that code then. But unless you have a low-scanning monitor (one that can do 15KHz horizontal) you probably won't get it to work. It *is* possible, I've done it, but just like 800x600x16 it displays on about 1% of monitors out there.
But what's wrong with the paragraph right under it? Does it not explain the limitations of trying to use such extreme modes? This was a big deal back then, I remember a lot of discussion around getting ColoRIX to use additional tweaked modes... Trixter 21:53, 25 January 2006 (UTC)

Leave it in if you like, then; I just think it's nice if we have some evidence to back up the modes described as possible. --Funkymonkey.

The article says "360x480 (highest resolution compatible with standard VGA monitors)". I guess that's for 256-color modes. What is the highest 16-color mode resolution compatible with a standard VGA monitor? Calvero2 11:32, 8 April 2007 (UTC)
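As a back-of-envelope check of the bandwidth objection in this thread, here is the timing arithmetic for a hypothetical 640-wide 256-colour mode. The figures assumed (not measured): the standard 25.175 MHz crystal, two dot clocks per pixel in 256-colour mode, the stock 800:640 blanking ratio, and a 441-line frame for 400 active lines.

```python
DOT_CLOCK = 25_175_000       # Hz, the standard VGA pixel crystal
CLOCKS_PER_PIXEL_8BPP = 2    # 256-colour modes consume two dot clocks per pixel

active_clocks = 640 * CLOCKS_PER_PIXEL_8BPP   # 1280 clocks of live video per line
total_clocks  = active_clocks * 800 // 640    # keep the stock blanking ratio -> 1600
h_freq = DOT_CLOCK / total_clocks             # line rate the monitor must sync to
v_freq = h_freq / 441                         # 400 active lines plus blanking

print(f"{h_freq / 1000:.2f} kHz horizontal, {v_freq:.1f} Hz vertical")
```

This lands right around the ~15 kHz "low-scanning monitor" figure mentioned above, with a refresh rate in the mid-30s of Hz, consistent with the claim that very few VGA monitors could display it.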

VGA 256x177?

Could it be that VGA can display 256x177? I found that when playing Rastan on DOSBox. Or perhaps it's an EGA tweak.

You're correct. Rastan tweaks a 256x177 mode. The 256 pixel columns across are not only easy but common; the 177 lines are the unconventional part.
256 pixel columns were a huge boon to game programmers -- it meant you could treat screen coordinates as going from 0 to 255 across, which meant your "X" variable could fit into a single byte-sized register. Some games used this and put a status display in pixel columns 256-319, but a few games, like Rastan, actually tweaked a 256-column mode. Trixter 23:10, 10 January 2006 (UTC)
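The addressing advantage of a 256-byte-wide buffer can be sketched like this (illustrative only; the register pairing referred to is the 8086's AH/AL trick, and the helper names are made up):

```python
def offset_256_wide(x, y):
    """In a 256-byte-wide buffer the row base is y << 8, so x is a pure
    low byte: on an 8086, load AH = y and AL = x and the offset is ready."""
    assert 0 <= x <= 255
    return (y << 8) | x

def offset_320_wide(x, y):
    """The normal 320-wide layout needs a real multiply (or shifts + adds)."""
    return y * 320 + x

print(offset_256_wide(255, 176))  # last pixel of a 256x177 screen: 45311
```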
Thanks. I played Rastan on my old computer with a TV plugged in. The screen displayed odd effects when playing Rastan.
That was CGA composite output, which displayed odd artifacts if the game didn't specifically support it. See the wikipedia article on CGA for more info. Trixter 21:39, 12 January 2006 (UTC)
My old computer does not have CGA composite output. I used a device to connect the TV via the VGA output. Perhaps the device was not designed to handle the 256x177 mode.
It's even worse when connecting the TV to my newest computer via a 3D card and playing Rastan. Everything flickers.

Meaning of VGA

It is my understanding that VGA was an important landmark in that, for the first time, standard IBM PC color computer monitors came to have the same resolution as North American TV broadcasts (the NTSC standard); hence the name "Video Graphics Array", an array of pixels matching the resolution of standard video. As written in the article named "NTSC", color TV broadcasts have 486 "viewable" horizontal scanning lines per frame, which closely matches VGA's 480. As for the 640 number, it surely must match the equivalent resolution if the scanning were done with vertical lines instead of horizontal lines, but I'm unsure about this. I don't know where to add this comment without disturbing the balance of this article. Perhaps if the original author sees this note he/she may add it appropriately.

IIRC, the 486 comes from some digital video format (DV, D1?) while broadcast NTSC has indeed 480 lines (but often more than 640 pixels per line when transmitted digitally). Christoph Päper 22:04, 1 June 2006 (UTC)
This doesn't really make much sense to me. First, the VGA was never really meant for digital video applications (among other things, it didn't have video-in/video-out capabilities). Second, VGA was not actually the first PC video system to use the 640x480 resolution; the Professional Graphics Controller before it supported 640x480, as well as several "super EGA" cards meant for use with multi-standard monitors like the NEC MultiSync II. Third, and probably most important, the VGA was the first IBM PC graphics adapter to fit into a single gate array, and at first was not offered as a separate "adapter" card. -lee 13:15, 25 September 2007 (UTC)
Original Poster: False. (1) NTSC's analog (key word) resolution is variable, from 330 lines left-to-right (VHS) to 440 lines (broadcast) to 660 lines (DVD or other "lossless" sources). There's no limitation or fixed resolution, because it's an analog system. (2) VGA was hardly the first computer to feature 640x480 resolution. Several other computers already had the same capability (the Commodore Amiga and Atari ST spring immediately to mind), and they were better suited to television production since they had interlace, whereas VGA used the non-compatible progressive scan. Interlace was better for 1980s TV production. (3) The earlier CGA and EGA cards had composite and S-video outputs for connection to standard televisions, as did most computers of the era, so connecting your computer to a TV was nothing new. It dated back to the 1978 Atari 800, and probably earlier than that. ---- Theaveng (talk) 15:32, 7 January 2008 (UTC)

HVGA?

Can someone familiar with this topic take a look at the recently created article on HVGA? Thanks. -- Rick Block (talk) 01:25, 8 June 2007 (UTC)

Technical details/History

AFAIK, the first VGA appeared in the IBM Personal System/2. It was part of the Micro Channel architecture memory controller, like a modern IGP in the northbridge. Originally, there was no such thing as a "VGA chipset" in the PS/2. The first VGA-compatible ISA chipset was a Paradise VGA (later Western Digital). - Alecv 19:24, 8 June 2007 (UTC)

Answer: Shame I don't have my old PS/2 to hand, but IIRC when looking at the board you could identify discrete chips that made up the VGA. Things weren't that integrated back then. If you have ever seen an 8514/A board (I own one), it is made up of a very large number of discrete chips. It may well be that some of the core logic was integrated into the MCA memory controller, but I'm pretty sure it was a chipset all in all. (I may be talking crap though, because it depends on whether you count the video RAM and the DAC as separate or not.) 149.254.200.220 18:02, 5 September 2007 (UTC)funkymonkey

The VGA itself was indeed a separate chip, IBM part number 15F6864. A complete VGA setup included the '6864, 8 NEC 4464 RAM chips (64kx4, a total of 256k), an Inmos G171 6-bit/3-channel RAMDAC with 256-entry CLUT, and two dot clock crystals (25.175 MHz and 28.322 MHz). Pretty much everyone used this same layout, which made upgrading things piece-by-piece easy (adding more RAM and higher dot clocks were some of the first mods people did to the basic design). The early Paradise chips were very similar to the 15F6864, even to the point of using external clock crystals (I saw a Paradise board with 5 of them once, sometime in the mid-1990s...yeah, it was old even then).
While I'm thinking about it: The MCGA, VGA's short-lived, cut-down sibling, was embedded somewhere in the chipset of the PS/2 Models 25/30-8086, and at least two of those chips were semi-custom Epson gate arrays. -lee 18:22, 24 September 2007 (UTC)
Lee, that is awesome information; very impressed. How did you come by that info? -FunkyMonkey
Partially from trolling the web to refresh my memory, partially from having a basement full of old PCs (including an IBM PS/2 Model 80 at one point) and tons and tons of old ISA video cards. From about 1993-2000, my brother was friends with a second-hand store owner who frequented Government surplus auctions, and he would just give us stuff (most of it dating from the late 1980s and very early 1990s). We ended up having to trash most of it when we moved house in 2000, though. -lee (talk) 16:26, 10 June 2008 (UTC)
funkymonkey, could you please take a picture of your 8514/A board? The 8514 article is pretty sparse at the moment, and it's kind of become a pet project of mine because of a game called Mah Jongg -8514- that made a bad bet on the 8514 standard becoming widely adopted. DOSGuy 21:21, 24 September 2007 (UTC)
Only just saw this, but yes, I'd be happy to post a pic on the 8514 article (might need a dust off first!!). The girlfriend is coming over tonight, so I'll ask her to bring the camera over. I'll also post you a URL to a website with all the pics here once done. -FunkyMonkey

The default palette?

From experimentation, I don't believe that the mode 13h palette is the graphics adapter's default palette. By setting the graphics mode directly, I found 8 shades of the EGA colouring scheme and 192 blank colours. I did this under QEMU, though, so it might not be right; in any case, the so-called "default palette" is almost definitely stored in the BIOS. I'll try on actual hardware (no original VGA chipset though :\) --thematrixeatsyou 08:49, 2 November 2007 (UTC)

I'm not sure what you are trying to say here, but mode 13h is not a palette. VGA has a default palette defined that is EGA and CGA compatible, but applications are free to reprogram that palette as they see fit. It is likely that the default palette is stored in the BIOS or Video BIOS, but that is just an implementation detail.
--Anss123 09:18, 2 November 2007 (UTC)
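One detail worth illustrating for this palette discussion: whatever the default palette contains, the VGA DAC stores only 6 bits per channel (values 0-63). A common way to expand those to modern 8-bit channel values is bit replication, sketched below (the helper name is made up; the "light/dark grey" levels are the conventional EGA-compatible DAC values):

```python
def dac6_to_8(v):
    """Expand a 6-bit VGA DAC channel value (0-63) to 8 bits by
    replicating the top two bits into the bottom, so 63 -> 255 exactly."""
    assert 0 <= v <= 63
    return (v << 2) | (v >> 4)

print(dac6_to_8(63))  # full intensity -> 255
print(dac6_to_8(42))  # EGA "light grey" level -> 170
print(dac6_to_8(21))  # EGA "dark grey" level -> 85
```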

Simple Answers.

Is a simple answer really so much to ask for? All that technical information is great and wonderful, but I can't tell from the article what the hell a VGA does, only that they are (or maybe were; even that's not really clear) used in video games. (I was told by a salesman that they can be used to convert a TV screen to a computer screen, but thinking he was telling me what HE wanted me to hear, I thought I'd check it out. Obviously, this article was less than no help, or I wouldn't be leaving a comment.)

Include a SIMPLE explanation for those of us who aren’t computer nerds. —Preceding unsigned comment added by 71.34.68.223 (talkcontribs) 08:06, 8 March 2009 (UTC)

Linear 320x240x8 mode

It's possible on the official VGA cards to put the card into a linear 320x240x8 mode instead of the more common planar "mode X". There's a register that can be used to enable a 128k window from A0000 to BFFFF instead of the usual 64k at A0000 to AFFFF. Combined with some of the register tweaks of "mode X", you'd get a linear 320x240x8. I doubt it was used on many games, if any, because some clone cards didn't support this setting. --Myria (talk) 21:47, 17 November 2007 (UTC)
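The arithmetic behind needing the 128K window is simple enough to sketch: a flat 320x240 byte-per-pixel buffer just doesn't fit in the usual 64K, while the planar "mode X" layout does.

```python
WINDOW_64K  = 0x10000  # the usual A0000-AFFFF graphics window
WINDOW_128K = 0x20000  # A0000-BFFFF when the wider mapping is enabled

linear_320x240   = 320 * 240        # flat layout: one byte per pixel
mode_x_per_plane = 320 * 240 // 4   # planar "mode X": CPU sees 4 pixels per address

print(linear_320x240)    # 76800 bytes: over 64K, within 128K
print(mode_x_per_plane)  # 19200 bytes: why the planar mode fits the 64K window
```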

Can we put-up a 256 color VGA image?

I'd like the readers to be able to see how a 256-color image actually appeared back in the 1980s. (I suppose a 256 color GIF image would be an appropriate substitute.) ---- Theaveng (talk) 15:34, 7 January 2008 (UTC)

The Signal graphic has a slight error

It shows the front porch as the start of the line; however, the front porch happens at the END of the line, whereas the sync pulse is the official start of the line (when the electron gun moves back from one edge to the other). ---- Theaveng (talk) 15:46, 18 May 2009 (UTC)
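For reference, here are the segments of one 640x480@60 scanline in the order the signal actually emits them, in dot-clock counts (the widely published 25.175 MHz / 800-clock figures):

```python
# One horizontal line of 640x480@60 VGA, in signal order.
LINE_SEGMENTS = [
    ("active video", 640),
    ("front porch",  16),   # after the active pixels, before sync
    ("h-sync pulse", 96),
    ("back porch",   48),
]

total_clocks = sum(n for _, n in LINE_SEGMENTS)
h_freq = 25_175_000 / total_clocks
print(total_clocks, f"{h_freq / 1000:.3f} kHz")  # 800 clocks, ~31.469 kHz line rate
```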

Port / Plug pic

Needs a picture of a VGA plug and/or port. See the page for DVI (Digital Video Interface) for an example. 76.2.89.37 (talk) 17:20, 12 April 2010 (UTC)

 Done --nn123645 (talk) 03:51, 24 January 2011 (UTC)

VGA

Hi, I want to know: if VGA is not installed in a computer, what is the effect on the PC? —Preceding unsigned comment added by 59.95.109.207 (talk) 16:29, 20 May 2010 (UTC)

The PC then doesn't have a VGA card.
(I'm not sure what else you're after, really) 193.63.174.211 (talk) 18:40, 23 January 2012 (UTC)

Diagram of VGA Signal

A diagram of a VGA signal would be really helpful. This is a good one: http://www.vga-avr.narod.ru/vga_timing/vga_timing.gif — Preceding unsigned comment added by 209.94.128.118 (talk) 18:55, 4 December 2011 (UTC)

VGA has same Resolution as NTSC color TV

I think this fact should be emphasized in this article. As I recall, at the time color computer screens were of lesser resolution than a standard color TV, which was regarded as the benchmark of picture quality. Hence, when VGA first appeared it was highly regarded for being "as good as" a color TV set of the time (the North American NTSC standard), which can be loosely described as having a resolution of 640x480 pixels. The name "Video Graphics Array" can be read as "having the same resolution as NTSC color video". By the way, the "A" was commonly mistaken to stand for Video Graphics ADAPTER, because in the earlier CGA and EGA standards both "A"s stood for "Adapter", whereas in VGA the "A" stands for ARRAY.

No, VGA is higher resolution than TV, at least from the point of view of displaying text - 80-column text on a CGA is vile, but you can have very sharp 80-column text on a VGA screen. (And even CGA displayed better on a dedicated monitor than through a TV set.) --Wtshymanski (talk) 22:56, 25 November 2010 (UTC)
Um, nope, VGA was never as lousy as TV. NTSC had 525 scanlines (486 visible). CRT monitors have no native resolution, but colour monitors had to be capable of at least 640x480, and quickly raced to 800x600, 1024x768 and higher. Color televisions were based on a decades-old standard by the time VGA came along. DOSGuy (talk) 18:52, 27 November 2010 (UTC)
/facepalm ... No, no, fellas, you misunderstand. You're looking at it backwards. The critical part of VIDEO graphics array is that it could be used with a genlock or other simplistic scan converter to insert broadcast-quality overlay text (and graphics, both overlay and fullscreen) into a TV-standard video stream. It's not so that you could hook up a standard TV to use as your everyday computer monitor. PCs were largely business machines at the time, it wasn't meant for gaming but e.g. display of stock price charts, or sports scores in a nice large colourful font to compensate for the terrible fuzziness of contemporary NTSC signals.
It IS, at least in terms of the vertical scanning (the most critical part), double NTSC standard, so it would work well with genlockers (needing a minimal amount of buffer RAM, and quite simple scanrate conversion hardware), or even just with a locked-off camera pointed at a monitor (camera scan always hits the same part of the screen at the same point in the monitor's own refresh, so the image remains steady - moired, perhaps, but at least not flickering or rolling).
The scan rate is exactly double that of NTSC, and there are 525 scan lines - but only 480 of them are "active". Six fewer than NTSC itself, but this was considered well within the normal overscan allowance and therefore irrelevant. It instead was a nice convenient multiple of 8 and a significantly higher rez than almost all of its rivals. The horizontal rez was more simply just a replication of the existing CGA and EGA standards (it's enough for 80-column text; you don't want to make it NARROWER than the previous generation; and it's comfortably above the available analogue resolution, so pixelisation isn't an issue), the lower ones either half the top one or replications of prior standards, and the colour depths a matter of "how many bitplanes can we fit into an affordable amount of memory at these resolutions?". 16 colours doesn't seem much by modern standards but is plenty for putting up a nameplate or a league table with some simple icons or logos, and you can get away with 320x480 (or 640x200) with a bit of care if you want to use 256 colours instead.
And of course, besides that use, it's also quite good for normal everyday computer use. Take a regular, if high-quality TV. Fiddle with the guts so it now scans at double speed, non-interlaced, and accepts plain RGB input. Voila, a cheap VGA monitor. Solid picture (for the 80s, and with slow phosphors), good resolution that's about at the limit of what the CRT mask can support, and good colour depth. Hi rez with reasonable colour for word processing, business graphics, file handling and the like. Low rez for games and art packages.
Put them together and you have a fair coup vs the Mac and a host of other rivals of the time which were using non-TV compatible and typically lower resolutions with poor upgradeability. Excepting the Amiga of course, but that still had a lower colour depth (excepting the rather limited-utility HAM, and the highest rez 32-colour mode) and max NTSC resolution, and used an interlaced video mode for anything greater than 256 lines, so it wasn't so hot for hi-rez productivity.
Clear now? 193.63.174.211 (talk) 18:35, 23 January 2012 (UTC)
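The "exactly double NTSC" claim above is easy to check numerically, using the published NTSC line rate of ~15.734 kHz and the standard VGA figures:

```python
NTSC_H_FREQ = 15_734.26         # Hz, NTSC horizontal scan rate
VGA_H_FREQ  = 25_175_000 / 800  # 31468.75 Hz, the VGA line rate

ratio = VGA_H_FREQ / NTSC_H_FREQ
print(f"{ratio:.4f} VGA lines per NTSC line")  # ~2.0000

# Both standards also build their frame on 525 total scanlines,
# which is part of what makes cheap scan-rate conversion plausible.
```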

changes I made re: 16 and 256 colour modes available on standard adaptor

OK, there were two things, which the "edit note" line just wasn't long enough to contain.

One: stating the VGA's 16-colour modes (in 640x350 and 320x200, though it also holds for 640x200) as distinct from those of EGA. These are listed either separately, or on combined lines, in most VGA mode tables, e.g. "mode 1+" vs "mode 1". Although the great majority of programs using 16-colour PC graphics at 320x200 (or using 640x350 at all) just go for the EGA palette, as that's the easiest method and provides the greatest compatibility, and save the custom palettes for the 256-colour (or 640x480) VGA-specific modes, I have seen it done. Particularly where programs have been converted to the PC from the Amiga or ST platform, where they used either a more limited but fully customised palette (typically 16 colours), or a medium-resolution colour mode with decidedly non-PC-standard colours (e.g. from a Mac II Color). For example, the early-90s "Space Shuttle Simulator" game was VGA/MCGA only, but used 16-colour 320x200 mode with a customised palette matching that of the non-PC computers, offering a much subtler and more variable range of shades than the EGA despite its similarly narrow range of simultaneous colours.

One presumes it made the porting easier, without losing any graphical fidelity or having to put in more work to upgrade the graphics, whilst reducing the on-disk and in-RAM data footprint, and giving faster screen updates, for titles where having a great spread of colour was not as important so long as certain bases were covered.

There's at least a couple other notable PC games I've played which use 320x200x16 and 640x350x16 with non-EGA colours but, of course, will they come back to mind right now? Will they hell. I can just see a vague outline of the loading screen in my mind's eye but no actual words or anything. :( ((Corncob 3D looms large, but I think that was actually 640x200 in 16 colours, either only slightly customised or just plain EGA))

Two: Removing the notes about ColoRIX doing 640x400 in 256 colours. I'm sorry, maybe there's a print citation for this somewhere? But I can't find anything online. In fact, what I DO find is evidence that it will do that mode, and 640x480, and 800x600, and even 1024x768, but ONLY with specifically compatible, non-VGA-standard cards... XGAs, SVGAs, and "expanded VGA" adapters (i.e. standard VGAs with their own proprietary add-on SVGA-like modes), which are of course NOT "VGA" for the purposes of an article such as this. This was from a wide-ranging review of different high-rez and high-colour paint packages in a 1988 PC magazine that had been put online. A smaller article from another edition of the same magazine a few months later also mentioned that it now had the capability to go up to 360x480 in 256 colours on a standard VGA adaptor, in addition to the higher-rez 256-colour modes available on more expensive non-standard cards with suitable drivers.

Other resources, including some I've recently cited for other information, show that the VGA specification simply doesn't allow for the combination of 256 colour mode with high horizontal resolution. The hardware can't drag pixel data out of the VRAM, fling it down the internal bus and drop it into the DAC fast enough. The most you're ever going to get is 800 (900?) pixels from front to back porch in 4-bpp mode, or 400 (450?) in 8-bpp mode (or, 400/450 bytes per line), with 640/720 and 320/360 being the standard within normal overscan limits.

With a fair bit of overscan, you might achieve something exotic like 400x600 in 256 colours, giving decent resolution with an uneven aspect ratio (and slow, choppy refresh as you're using very nearly the entire 256kb at once; halving this to 400x300 actually seems to work well in the case of a very few games that offer it, like the original Grand Theft Auto), but not 512, 640, 720 or 800-by-anything (not 600 lines, not 480, not 400, not 350, not 240, not 200...) without dropping the bit depth.
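The per-line byte budget quoted above follows directly from the two stock crystals and the (near) fixed ~31.47 kHz line rate; a sketch of that arithmetic:

```python
H_FREQ = 31_469  # Hz, the VGA's (near) fixed horizontal scan rate

for crystal in (25_175_000, 28_322_000):
    clocks = round(crystal / H_FREQ)   # total dot clocks in one line
    # 4bpp modes emit one pixel per clock; 8bpp modes need two clocks per pixel.
    print(f"{crystal/1e6:.3f} MHz: {clocks} clocks/line -> "
          f"{clocks} max 4bpp pixels, {clocks // 2} max 8bpp pixels")
```

This reproduces the 800/900-pixel (4bpp) and 400/450-pixel (8bpp) front-to-back-porch ceilings quoted above.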

I'm sorry, it just doesn't seem to be possible within the limits of the hardware, and there's no evidence I've ever seen, or can find now, that suggests it's possible to exceed it. You may as well ask a C64 to run in 80-column mode, an Amiga to give more than 4bpp in 70ns or 2bpp in 35ns pixel mode, a Spectrum to offer a full-rez mode with single-pixel colour addressability, or an ST to pump out hi-rez colour or 16 colours in 80 column mode with full pixel level colour addressability. It's not going to happen.

However, someone, somewhere, might have been very very clever. If you've hard evidence that ColoRIX or some other software can actually break this barrier and do a 256KB, 640x400x8bpp mode on a STANDARD, fully spec-compliant, no-proprietary-anything VGA card, then it can go back in. And I'll be both impressed and rather pleased at mankind's ingenuity. Particularly as this would mean somehow tricking the internal (4bpp) pixel bus into running at a minimum of almost 36 MHz when it only has crystals for 28 and 25 MHz onboard.

However, I highly doubt it, as some bugger would have done it already and become legend for their achievement. Even though it would very quickly have been rendered pointless by the rise of VESA-compatible SVGAs, competitive demo-hackers and the like put a lot of stock in what you're able to do within the limits of pure stock hardware.

Hell, no-one ever even made serious use of 360x240/350/400/480 (or 400x350?). The former two would have been very good for games, giving a twin-buffered 256-colour display within the 256kb limit, and something resembling a normal aspect ratio (i.e. more horizontal pixels than vertical), with the choice being between squarer pixels or a slightly better monitor refresh rate. I think the only places I've seen them on offer are graphics demos, Fractint, Quake and GTA. 193.63.174.211 (talk) 13:18, 7 November 2012 (UTC)
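The page-count arithmetic for those tweaked modes can be sketched as follows (assuming the planar/unchained layout with 64K per plane, 256K total; the function name is just for illustration):

```python
PLANE_SIZE = 64 * 1024  # bytes per plane; four planes = 256K total

def pages_available(width, height):
    """In unchained modes each plane holds width/4 bytes per line,
    so one page costs (width * height) / 4 bytes of a 64K plane."""
    per_page = width * height // 4
    return per_page, PLANE_SIZE // per_page

for w, h in ((320, 240), (360, 240), (360, 350), (360, 480)):
    per_page, pages = pages_available(w, h)
    print(f"{w}x{h}: {per_page:5d} bytes/plane/page, {pages} page(s)")
```

By this arithmetic 360x240 and 360x350 both leave room for at least two pages (hence double buffering), while 360x480 fills the card almost completely with a single page.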

That big list of "higher resolution and other modes", bloody hell.

Who put that in? It's completely unnecessary. As the adaptor could output a fair variety of horizontal and vertical resolutions, each in free combination with the others so long as the rules over memory use and colour depth were respected, it should only be necessary to add a note to that effect.

IE, that it could display any combination of X, Y, or Z columns of pixels horizontally with up to 256 colours, U, V or W with up to 16 colours, and A, B, or C lines at 60Hz and D, E, F at 70Hz (or half as many with line doubling) vertically.

Rather than a long, exhaustive list of every single possible graphics mode, which is really not the sort of thing Wikipedia is about.

Besides, I'm highly skeptical about some of them - I don't believe, for example, that a 600-line (or even anything taller than 512-line) mode was possible on a standard VGA. 800 pixels across may have been possible, but not that many high. Otherwise, whence cometh the difference between VGA and SVGA? The 256/512-pixel-wide modes seem unlikely as well, unless they were effectively just 320/640 pixel ones masked down. The pixel and line timings ran off fixed quartz crystal standards after all, not a freely reprogrammable PLL loop like in modern cards, hence the reasonably well-fixed 320/640/720 width and 200/350/400/480 height modes (working within 800 and 900 pixel-clock wide and 441/525 scanline frames, just using more or less of the available height for 350 vs 400 line, and width for 720 vs 800 column modes, rather than changing the line/frame display time). 25ish MHz for 320/640, and 28 for 720-800 width, plus a fixed 31kHz line frequency (which was kinda inviolable because changing it could cause monitor damage).
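To illustrate the fixed-crystal point above: the standard refresh rates all fall out of just the two stock crystals and a ~31.47 kHz line rate (figures are the commonly published ones).

```python
# (crystal Hz, total dot clocks per line) for the two stock VGA dot clocks.
for crystal, clocks_per_line in ((25_175_000, 800), (28_322_000, 900)):
    h = crystal / clocks_per_line            # ~31.47 kHz either way
    for total_lines, active in ((525, 480), (449, 400)):
        print(f"{crystal/1e6:.3f} MHz, {active}-line frame: "
              f"{h / total_lines:.2f} Hz refresh")
```

Either crystal gives the same ~59.94 Hz (480-line) and ~70.09 Hz (400-line) refresh rates, which is exactly why the mode list collapses so neatly into a few fixed combinations.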

I'm going to see what I can do about editing that down now. Please think about what you're doing, in future, before dumping such a big ol' list of stuff like that. 193.63.174.211 (talk) 12:01, 20 June 2014 (UTC)


4throck (talk) 13:26, 20 June 2014 (UTC)

I put the list there. Agreed that what you suggest would be better:

IE, that it could display any combination of X, Y, or Z columns of pixels horizontally with up to 256 colours, U, V or W with up to 16 colours, and A, B, or C lines at 60Hz and D, E, F at 70Hz (or half as many with line doubling) vertically.

It's the same information, but on a better layout.

I also suggest that you make that change in the article itself. Complaining about it here will only take up your time and not get it done.

Agreed that some combinations are non-standard, but I remember using MAME in 256x240. Then again, perhaps that was Super VGA. I agree that some references are needed on that part of the article.

Well, that's mighty big of you to own up to it :) and I apologise for my tone, as editing it actually proved to be quite a job - but it WAS a bit sprawling to look at originally.
Just to clear things up - this article is about the original VGA card from the late 80s, and not so much the standards built on it afterwards, or shared by later hardware (including the not-much-later SVGA, or any extended capability VESA card that co-opted the "VGA" port and name). It's quite possible that MAME still uses good old ModeX, but whether you'd get a good play speed out of it using an original card is a whole different question. There's also a fair likelihood that it actually uses some kind of VESA mode with entirely customised timings, perhaps based on a down-rezzed version of SVGA or XGA, in order to give you that console-like mode without either chunky side borders, or you needing to adjust your monitor every time you go in and out of fullscreen (EG something based off of a mix of 1024x768 and 960x720, dividing the pixel and line clocks down by (4x3=) 12 and 3 to achieve 256 x 240 then cutting the V-rez back slightly... or basing off 800x560, which was a briefly used multisync standard, and dividing by (3x2.5=) 7.5 and 2.5 for 267 and 224... or something more exotic besides, maybe even a 1024x560 combi...).
I recall ZSNES using ModeX by name way back in the day, and its "old computer"-compatible bog-standard VGA mode claimed to be 256x224 (or 239) but really seemed rather more like an arbitrary squarish bunch of pixels matted down inside a 320x240 frame... Perhaps it was setting the VGA to do that using wider front/rear porches, or maybe it did it all in software, who knows, but it sure didn't fill the screen.
Anyway, I've done what I can with it. It probably needs chopping down again by someone who's a better editor than I, but it seemed that it might be useful to use some of the space freed up by removing/reformatting the list to add some better explanation. 193.63.174.211 (talk) 15:24, 20 June 2014 (UTC)

4throck (talk) 22:38, 20 June 2014 (UTC) You did a nice job and provided good information :-) Completely agree with the scope of the article. As it is, at least the reader will understand that the original standard was expanded with new modes, either through better hardware or through software hacks. But the drive to have better graphics was there, thus SVGA appears shortly after. In the future some of that info may be moved into more technical articles, but it's good as it is. Thanks!

Intro edits

I removed:

"when using a DVI or HDMI connection, especially on larger sized LCD/LED monitors or TVs, quality degradation, if present, is prominently visible."

This made little sense in context. Perhaps whoever wrote this meant that the quality degradation is ***noticeable*** in comparison to digital formats like DVI or HDMI. Those formats themselves, if used properly, should not show any VGA-style degradation.

If that is what was meant, I guess we can add it back in. If not, well, it'll just confuse people...

Bilditup1 (talk) 17:19, 11 October 2014 (UTC)

Technically a misnomer, I think

I am almost certain that the 'V' in VGA actually stands for 'Versatile' and NOT 'Video', as defined by IBM, the inventor of the standard.

I think 'Video' is actually a (very) widespread misnomer, which arose because it seemed 'obvious' that the 'V' stood for 'video'.

A more common misnomer is thinking that the 'A' stands for 'Adapter', which is incorrect and I'm pleased to see that this mistake is not repeated here.

A few sources around the Web cite 'Versatile Graphics Array', but really we need to find the original IBM published standard, or at least a similarly authoritative source. I've searched but can't find it.

Two questions: 1/ Amongst our older contributors, does 'Versatile' rather than 'Video' ring any bells of recognition? 2/ Can anyone help pin down some suitably authoritative source, such as IBM's own documentation?

If 'Versatile' turns out to be correct, we should still mention 'Video' as a commonly-used alternative, simply because it has *almost* become accepted usage, despite being wrong.

Steve — Preceding unsigned comment added by Steve Thackery (talkcontribs) 18:58, 13 April 2014 (UTC)

If the above is accurate, I would say that the use of "video" is primary at this point. If we can find some WP:RS sources that state the original IBM spec was "versatile", then we should certainly state that "versatile" was the original usage. However, I would not put it as primary in the article. Such acronyms can change definition over time despite the intentions of the original group of people using it. Note that we would need something that explicitly indicates that "Versatile Graphics Array" was the original intended expansion of the acronym, not that someone just happens to use it now.
While I do not consider the number of hits from a search engine anything close to proof, they can be indicative of usage patterns. On Google, "Versatile Graphics Array" returns 29 results – it appears that Google can't count its own number of displayed results – with about 62 if the omitted results are shown. However, "Video Graphics Array" produces about 384,000 hits. That is a difference of 4 orders of magnitude. — Makyen (talk) 20:38, 13 April 2014 (UTC)

Colleagues, I think I am wrong. I have found some vaguely contemporary IBM documents and they definitely refer to "Video Graphics Array". Thus it looks very much like - even if it did once stand for "Versatile..." - IBM themselves definitely adopted "Video..." early on (or perhaps always did, and "Versatile..." is the misnomer).

I feel I should apologise for wasting your time over this. As I cannot find any supporting evidence it is clear I must withdraw this suggestion.

Steve Steve Thackery (talk) 21:31, 14 April 2014 (UTC)

I was one of the techies responsible for the UK announcement of PS/2 (I did Micro Channel) and I can assure everyone that the term is (and always was) VIDEO Graphics Array. — Preceding unsigned comment added by 31.49.247.15 (talk) 17:29, 28 June 2015 (UTC)

Add monochrome (1-bit) mode to the standard list?

I'm going to go and add this, and see if it gets reverted. As far as I know, 640x480 (and possibly 640x400?) with a single bitplane - i.e. classic monochrome - is also a standard VGA resolution. I remember being able to set it... in QBasic, in DOS, in Windows 3.1 and even Win 95 with the standard VGA driver (you know... the one which set the dark colours to be not-quite-primary, but not EGA standard either)... and really, it has to be there as an option, otherwise what do you do if you only have 128k or 64k of RAM onboard? Only use the lower resolutions? In which case why not buy an EGA or MCGA board?

Only 256k allows 640x480 in 16 colours (and 320x480 in 256), even though 64k allows 320x200 in 256.

128k limits you to 640x400 (or 640x350, or 320x480) in 16, 640x200/320x400 in 256, and 640x480 in mono (or potentially 4-colour/4-grey, if anyone had bothered to think about offering it... it might have been available as a hack?)

64k makes 640x480 mono-only, along with 640x400, 640x350 and 320x480 (again, 4-colour is a neglected possibility), with 640x200, 320x400 and 320x240(!) all dropping to 16... 193.63.174.211 (talk) 18:49, 23 January 2012 (UTC)

Right, someone zapped it, but I'm putting it back in. If there's a reason for its removal, please come and argue it out instead of being a silent scalpel-wielding dick. Who the hell is going to buy a graphics board with only 128k of RAM, and 640x480 capability, that doesn't have a mode that grants access to its highest resolution? Mono is a standard VGA mode, just as it was on EGA for the lower-RAM boards.
For that matter, so is 640x400. For a start, if you had said 128k board, it'd be the highest rez you could set which would provide 16 colours, and with the 256k you could have two graphics pages for quicker updates (and indeed it'd also be quicker on the 64k for mono, for the same reason). It gets used for BIOS displays and the like, and there's no reason they wouldn't allow it to be settable on grounds of monitor compatibility or the like because the text mode is also 400 lines. I have video card checking programs that run it, in the middle of 350 and 480 line modes, which means it's part of the standard complement rather than an add-on.
Both mono and 640x400 might have faded almost instantly into obscurity thanks to everyone plumping for the 256k boards off the bat but it doesn't mean they weren't there in the first place.
Or maybe I'm wrong? Bring the evidence. Right now it's uncited vs uncited (though I'm not beyond looking up the modelist). Prove 20+ years of experience is mistaken. 193.63.174.211 (talk) 12:03, 7 November 2012 (UTC)
-drinks own kool aid- ... ok, I DID go look it up, and I'll throw my hands up here: It's not a deadset standard mode, so I will now immediately go delete it.
Monitors support it, software supports it, card and monitor testers use it, but it's not on the default list. Which is totally baffling - it means if you have less than full memory you're basically limited to EGA resolution for colour, meaning they really missed a trick in the specification, as there was the chance for even the low-end cards to supersede EGA's abilities - but, there you go. Can't change the facts, even if they're stupid.
However, 640x480 in 2 colours IS on the list, so THAT'S staying. 193.63.174.211 (talk) 12:08, 7 November 2012 (UTC)
I think what you've been saying is quite interesting - why not add it anyway? Instead of listing non-standard monochrome modes, you could say something about how the standard missed out on these modes, so cards with insufficient memory had to add non-standard methods of accessing them. Then you could add some examples and references and it would fit nicely in the article. -- Malvineous (talk) 23:39, 7 November 2012 (UTC)
Sadly, unless someone else has already covered this in an off-wiki article somewhere, that would definitely count as Original Research ... and quite a lot of it too, with the associated workload. Not worth it, and would probably get deleted anyway.
I am now trying to work out exactly what I was trying to say up there, some 18+ months ago... monochrome mode removed, but 640x480 in 2 colours stays? Eh? 2 colours is for all intents and purposes monochrome anyway, just that you can change what each of the colours is (and as that's just tweaking chip registers, it won't use any memory). Did I mean 2 *bitplanes* (so, 4 colours)? A different resolution (640x400 or more likely 640x350 in 2 colours)? Something else entirely? 193.63.174.211 (talk) 12:14, 20 June 2014 (UTC)
VGA supports 640x480 in 16 colors and 2 colors.
640x480 in 2 colors requires 37.5 KB memory (just over 32 KB, so 64 KB is needed)
800x600 in 2 colors requires 58.6 KB memory (that's better fit for 64 KB)
So any color depth that can be used at 640x480 can also be used at 800x600 without increasing the size of card needed.
Here is an overview of different card memory sizes:
KEY MODE: 640x480 monochrome (also 16 colors if possible) (resolution race for larger cards)
256 KB supports 640x480 (also 800x600) 16 colors. Honorable mentions include 1280x800 or 1360x768 (WXGA) in 4 colors and 1920x1080 (full HD) in 2 colors.
128 KB supports 640x480 in 4 colors or 640x400 in 16 colors. Also WXGA in 2 colors.
64 KB supports 640x480 (max 832x624) in 2 colors, but 640x400 in 4 colors.
32 KB supports 640x400 in 2 colors. That's almost enough. 720x350 in 2 colors is also supported.
KEY MODE: Most resolution in 256 colors
256 KB supports 640x400 in 256 colors!
128 KB supports 640x200 or 400x300 in 256 colors. Still good.
64 KB supports 320x200 in 256 colors.
32 KB supports 160x200 in 256 colors. That's similar to CGA problem with 16 color.
16 KB supports 160x100 in 256 colors. Wish CGA had this mode!
KEY MODE: Color depth race
256 KB supports 640x200 in 65536 colors and 320x200 in 4294967296 colors.
128 KB supports 320x200 in 65536 colors and 160x200 in 4294967296 colors.
64 KB supports 160x200 in 65536 colors and 160x100 in 4294967296 colors.
32 KB supports 160x100 in 65536 colors and 80x100 in 4294967296 colors.
16 KB supports 80x100 in 65536 colors. What if CGA had this mode?

217.99.252.102 (talk) 11:28, 27 September 2015 (UTC)
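For what it's worth, the arithmetic behind the list above is just width × height × bits-per-pixel ÷ 8, compared against the available card sizes. A minimal Python sketch checking a few of the quoted figures (the helper names are made up for illustration; no display padding or page alignment is assumed):

```python
def framebuffer_kib(width, height, bpp):
    """Video RAM needed for one frame, in KB (no padding assumed)."""
    return width * height * bpp / 8 / 1024

def smallest_card(width, height, bpp, sizes=(16, 32, 64, 128, 256)):
    """Smallest standard card size (KB) holding one frame, or None if none fits."""
    need = framebuffer_kib(width, height, bpp)
    for size in sizes:
        if need <= size:
            return size
    return None

# A few of the figures quoted above:
print(framebuffer_kib(640, 480, 1))   # 37.5 KB -> needs a 64 KB card
print(framebuffer_kib(800, 600, 1))   # ~58.6 KB -> also fits in 64 KB
print(smallest_card(640, 480, 4))     # 256 (a 150 KB frame)
print(smallest_card(320, 200, 8))     # 64 (a 62.5 KB frame)
print(smallest_card(640, 480, 8))     # None: 300 KB exceeds even a 256 KB card
```

This matches the list's pattern: 16-colour (4 bpp) 640x480 only fits on a full 256 KB card, while monochrome 640x480 and even 800x600 squeeze into 64 KB.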

Comments

vga cable signaling spec?

  • Off the top of my head, the analog RGB cabling is RS-170 compliant, 75-ohm shielded video cable, and the video itself is referenced to 700 mV peak-to-peak. According to the 1992 IBM VGA/XGA Technical Reference, the H-sync and V-sync are both 5-volt TTL, though better cables run those lines through coax as well. The old video ID bits are TTL, too, but they've been replaced by DDC2, which is I2C. -lee (talk) 20:16, 25 November 2015 (UTC)

Exact VGA timings

The expression in the article does imply an exact frequency of 60/1.001, but how is this determined from the rounded numbers in the source or other timings provided through EDID, which I believe are rounded too and don't allow fractional math? Indeed, the microsecond durations seen here appear to have been calculated with these rounded numbers. 83.93.8.224 (talk) 20:10, 27 February 2016 (UTC)

Error in first paragraph

VGA is analog, but it is not "‎Amplitude Modulated". Modulation requires a carrier which is altered by the modulating signal. VGA is a baseband signal without modulation. — Preceding unsigned comment added by Whitcwa (talkcontribs) 17:36, 29 January 2016 (UTC)

Done (WP:BOLD). "Amplitude modulated" is nonsense for VGA, it would be sort of true for video standards like PAL, NTSC and SECAM, in which the color information is quadrature-amplitude modulated, while the brightness information is transferred in baseband. Greets from the electrical engineering department of ETH Zurich. 129.132.3.139 (talk) 10:04, 2 April 2016 (UTC)