Talk:Framebuffer/Archive 1


Frame buffer

This and frame buffer should be merged. Should we keep the title with space or without? --Brion 07:23 Nov 23, 2002 (UTC)

This now seems to have been done. StuartBrady 22:42, 11 January 2006 (UTC)

Definition of Framebuffer

This page is just plain wrong. A framebuffer is a memory-driven display device. I'll see to adopting this article and updating it. Jbanes 18:11, 11 January 2006 (UTC)

I'm not so sure - I think a framebuffer is a memory buffer containing a frame. As for your adoption of the article, thanks very much! StuartBrady 22:14, 11 January 2006 (UTC)
Nope. Read the PDFs under "external links". A framebuffer is a device that drives a display from pixel memory. The memory itself isn't the framebuffer, though it is a critical (and defining) component. This distinction has sort of been blurred in many people's minds over time with the introduction of Virtual framebuffers such as Xvfb and the Linux Framebuffer device. However, you'll note that the documentation for both of these states that the software is emulating a device, not just providing a backbuffer. Jbanes 04:38, 12 January 2006 (UTC)
Quoting from Digital Paint Systems: An Anecdotal and Historical Overview: "A frame buffer is nothing more than a piece of computer memory with a means for viewing what it holds." However, I accept that virtual framebuffers don't count as true framebuffers (that's why they're "virtual"). StuartBrady 14:34, 12 January 2006 (UTC)
Precisely. Without the "means for viewing what it holds", it's not a framebuffer. The "virtual" framebuffers emulate a real device in much the same way that virtual memory emulates a real memory device. Jbanes 14:52, 12 January 2006 (UTC)
Right. I agree with that. As far as I can tell, though, a CRT controller is not a component of a framebuffer. Neither is a texture-mapper, or texture memory. The framebuffer is just the memory which is used to drive the display device. StuartBrady 15:24, 12 January 2006 (UTC)
Exactly. In 3D cards the texture memory feeds into the texture-mapper which writes the results into the framebuffer, which is then what we see on the screen. Sort of.
This is complicated by two factors that I can think of. The first is that a video overlay device might modify part of the video signal. This is commonly how the mouse cursor is implemented. The second is that some 3D cards run the video signal through filters to perform various effects. IIRC, 3DFX added hardware to one of their Voodoo cards that blurred the signal a bit to make it appear as if the screen were anti-aliased. It was a cute trick, but it meant that the "real-time anti-aliasing" couldn't be shown on screen captures. Jbanes 15:58, 12 January 2006 (UTC)

Can I just confirm, before I edit the article: you'd agree that a framebuffer is just memory with a special purpose? (I.e. it's mapped to the video output.) BTW, interesting point about the real-time anti-aliasing! StuartBrady 17:16, 12 January 2006 (UTC)

If by special purpose you mean that it's memory coupled with a display output, then yes. The two together form a complete framebuffer. Jbanes 18:11, 12 January 2006 (UTC)
I told you what I meant by special purpose, in brackets. I'm still not convinced. I'm sorry to complain, but I'm not getting anywhere here, so I give in. StuartBrady 19:48, 12 January 2006 (UTC)
I really shouldn't have said that. I'm sorry. I think you're in agreement with me, but I'm not sure. I'm saying that a framebuffer is memory coupled with a display output, but does not include that display output. This is what all the definitions I've been able to find say. If we don't agree, we should seek some means for resolving the dispute. It seems as though you thought that I was arguing that any memory containing pixel data counts as a framebuffer. I was not, as that would be preposterous. StuartBrady 20:28, 12 January 2006 (UTC)
S'Ok. Sorry if I was being obtuse. I wasn't quite sure what you were getting at. I think I might understand your question now. Is your question something along the lines of: Does a framebuffer consist of the memory and signal generator (NTSC/VGA/PAL/etc.), but not the display device (TV/Monitor/LCD Panel/etc)? If that is your question (am I on track here?) then you are perfectly correct. The signal generator tells the display device what to do. (In the case of CRTs, it's telling the electron beam where to move, when, and how.) What differentiates a framebuffer from, say, the display on Tempest, is that the signal generator on Tempest doesn't have the data for all those lines. It has only the vertices, then relies on the natural, analog path of the beam to trace the correct line. Another example is an oscilloscope, which drives the electron beam as if it were a physical plotter. With a framebuffer, the theoretical resolution of the analog device is digitized into discrete packets of information. That information is what the signal generator uses to decide how to drive the display device.
I mean that the signal generator itself isn't part of the framebuffer. Of course, I acknowledge that you can't have a framebuffer without a signal generator. Otherwise, it'd just be a buffer. Yes, I've only ever heard the term used with raster graphics. StuartBrady 23:56, 12 January 2006 (UTC)
The term is fairly loose in that it doesn't refer to any specific implementation of digital samples (e.g. RGB vs. color palettes) or the type of signal that is output. To the best of my knowledge, it is assumed that a framebuffer is intended to produce visible output. I.e. you can have a framebuffer that outputs to a television, but calling a buffer used by something like VNC a "framebuffer" is stretching it. This definition gets confusing, however, when we're talking about Virtual Framebuffers. Virtual Framebuffers are emulations that present software interfaces to programs as if a real framebuffer device existed somewhere. In the case of the fbdev device, the device maps memory access to the virtual "dumb" framebuffer to the bankswitched memory of a "real" framebuffer. In the case of Xvfb, however, the purpose is only to allow X-Windows to run, not to abstract away devices. Xvfb is intended for things like taking screenshots on a headless machine, testing that your X code runs, or running an X-Server because your version of Java requires a display in order to dynamically generate images on the server.
I agree with everything in that paragraph. Except for "X-Windows". *cough* Yes, I'm a pedant, and I'm sorry. StuartBrady 23:56, 12 January 2006 (UTC)
*chuckle* I'll keep in mind to always refer to it as "The X Windowing System". Promise. Cross my heart. (At least until the next time I slip up and revert to colloquial speech.) ;-) Jbanes 15:34, 17 January 2006 (UTC)
Umm. Does that answer your question, or is it a case of too much info for not the right question? If not, at least I can cut and paste half of that right into the article. :-P Jbanes 21:55, 12 January 2006 (UTC)
It helps — go for it! StuartBrady 23:56, 12 January 2006 (UTC)
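The raster-versus-vector distinction discussed in this thread can be sketched in a few lines of Python. This is purely illustrative (the function and variable names are invented for this sketch, not taken from any real graphics API): a framebuffer stores a discrete sample for every pixel, while a vector display list stores only the endpoints and relies on the beam to trace the line.

```python
# Illustrative sketch: a framebuffer holds one discrete sample per pixel,
# so a line must be rasterized into it (here with Bresenham's algorithm).
# A vector display, by contrast, stores only the endpoints.

def draw_line_raster(fb, width, x0, y0, x1, y1):
    """Rasterize a line into the pixel buffer using Bresenham's algorithm."""
    dx = abs(x1 - x0)
    dy = -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        fb[y0 * width + x0] = 1          # one discrete sample per pixel
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy

WIDTH, HEIGHT = 8, 8
framebuffer = [0] * (WIDTH * HEIGHT)     # the pixel memory itself
draw_line_raster(framebuffer, WIDTH, 0, 0, 7, 7)

# A vector display (like Tempest's) never stores the intermediate pixels;
# the analog beam path fills in the line:
vector_display_list = [((0, 0), (7, 7))]
```

In the raster case, the signal generator reads the digitized samples back out of the buffer to drive the display; in the vector case there is nothing to read back between the endpoints.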

Video RAM

Perhaps this should be merged into Video RAM? Or perhaps not? What do you think? StuartBrady 22:42, 11 January 2006 (UTC)

I personally think that Video RAM should be expanded to cover the details of Video RAM in the various architectures of video cards. A framebuffer is only one example of what Video RAM is used for in a modern video card, thus making the topic much more complex. In addition, a discussion of memory types and configurations (such as DRAM and VRAM) is pertinent to a Video RAM article, but not to a discussion of framebuffers. Jbanes
Agreed, but that wasn't what I was suggesting. StuartBrady 17:24, 12 January 2006 (UTC)
I probably wasn't clear. No, I don't think they should be merged. I think that the two topics each contain a lot of independent points that should be explored separately. The original request to merge was added at a point when the page claimed that a framebuffer was memory. In all reality a framebuffer is a device that consists of memory coupled with an output device, and not just Video Memory. It's important to note, however, that a framebuffer only uses pixel data. Video Memory, OTOH, may hold much more than a framebuffer. Information such as textures, GPU programs, backbuffers, and vertex lists also tend to share Video RAM with the framebuffer in a modern graphics card. When I'm done with this article, I may consider adopting the Video RAM article to add this info. That being said, perhaps it makes sense to merge Video RAM with the "Graphics Card" article? Jbanes 18:24, 12 January 2006 (UTC)
I'm a bit concerned over using the term Video RAM at all for this topic. When I think of VRAM, I think of the memory technology with the same name. --Swaaye 01:54, 16 January 2006 (UTC)
I see your point. Webopedia solves this dilemma by having separate articles on Video Memory and VRAM. Perhaps we should do the same? I.e. have an article that focuses on VRAM, and an article that focuses on Video Memory in general? (Currently, the link to Video Memory points to the Video RAM article.) Alternatively, we could delete the Video RAM article altogether. The DRAM article has a section that covers the subject in about as much detail as the Video RAM article. In my mind, this makes the Video RAM article redundant. What do you think? Jbanes 15:56, 17 January 2006 (UTC)
I think that using Video Memory is the way to go for sure. Video RAM is just too ambiguous and will definitely confuse. I say we just move everything to Video Memory and delete Video RAM like you suggested. --Swaaye 16:58, 17 January 2006 (UTC)
I agree, FWIW. StuartBrady 23:32, 18 January 2006 (UTC)

Since we have a three-person consensus here, I've taken the liberty of removing the merge tag on Framebuffer, and updating the tag on Video RAM to point to DRAM instead. When I get the chance I'll review the Video RAM article for any info that isn't in DRAM, then replace the Video RAM page with a redirect. Anyone who wants to start the Video Memory article is free to do so. Otherwise, placing some sort of stub there will probably be the last step I get to. Jbanes 15:27, 1 February 2006 (UTC)

Thanks. BTW, I certainly saw "Video RAM" as a more generic term, which is why I originally proposed the merge. AFAIK, the term was in use way before VRAM became popular. (Oh, you forgot your sig.) StuartBrady 21:40, 19 January 2006 (UTC)

Request for Info

This is a laundry list of info that would be helpful in producing a complete article.

  • Details on Joan Miller's device. The only thing I have to go on is that it was a 3-Bit framebuffer. I have no info on the signal it produced, the resolution of the pixels, its architecture, or even if it produced a television signal. (For all I know, it might have driven a custom display.) All the citations I have point to a personal communication from Miller. I haven't been able to track this communication down.
  • Information on other framebuffer experiments would be nice to have.

Image Question

A question on "Fair Use". I'd like to upload a few images of the historical framebuffer systems to help round out this article. These images are used in Shoup's article as well as by the Computer History Museum and many other sources. However, I haven't figured out if Wikipedia has yet accepted such historical images as "fair use" in the context of an article. While my personal feeling is that Wikipedia would be within its right to use such images, it could be argued either way. Any thoughts? Jbanes 15:28, 1 February 2006 (UTC)

Article Complete

The article is effectively complete now. Feel free to improve upon the prose or information at any time. Jbanes 15:28, 1 February 2006 (UTC)

P.S. Thanks to everyone who's made corrections to spelling, links, and prose. You guys rock! :-) Jbanes 15:10, 6 February 2006 (UTC)

"Modern operating systems ... do not usually bother with display modes"

Is this quite true? Linux itself does use the framebuffer even if the X server is configured to use the nvidia (for example) display driver. I don't know about Windows, but surely it supports using the framebuffer? -- Borb (talk) 13:57, 30 March 2008 (UTC)

There are a lot of shades of gray here. Generally, the FB is abstracted so that, for example, the data layout is not relevant. Some other components may still need to be aware of the FB, for example the layout manager, to figure out button sizes. Whether the statement is true or false probably requires more information on the exact context. For example, is X11 considered part of the operating system? Typically not. However, some X11 builds do have kernel-reserved instructions in their code, so technically those are part of the runtime OS. The statement reads:
Modern operating systems such as Linux and Windows do not usually bother with display modes and attempt to manipulate the hardware directly through device drivers.
This has a few issues - sure, device drivers could be vendor-supplied (as such, they effectively abstract the FB wrt the OS) but the OS, just like any other program, still has to track some information. Maybe "do not usually bother" is stretching it a bit, but it is more or less OK. D3D9, for example, will happily "destroy" framebuffers and other resources in various scenarios, so it has a notion of a FB, and it could be considered part of the OS.
MaxDZ8 talk 07:12, 1 April 2008 (UTC)

"The term video card can also be synonymous with a GPU."

I removed this line, as the terms 'video card' and 'GPU' each have a specific technical meaning, and conflating the two is, at least in my opinion, always an error. This was reverted by 174.141.208.112 without adequate explanation.

A practical note: the existence of video cards featuring multiple GPUs underscores the problem with conflating the two terms.

Comments?

Wootery (talk) 14:15, 23 September 2014 (UTC)

Video cards are incorrectly called GPUs. A GPU, or Graphics Processing Unit, is a specialized chip, whereas a video card is a secondary 'daughter' board that connects to the main 'mother board' and whose primary purpose is to render video, as opposed to physics cards and GPGPU cards like the Tesla, whose purpose is general calculation. It is worth noting that many devices like cell phones and other embedded devices have the GPU built alongside the CPU, and Intel sometimes builds a GPU directly into the CPU itself.
Hicklc01 (talk) 09:41, 5 March 2015 (UTC)

Colloquial Usage

On the 21st of October, 83.255.36.199 added an additional definition to the page that stated that "framebuffer" could be used to refer to any memory used for graphical storage. I attempted to clarify this as an incorrect usage of the term on the 14th of November. Presumably the same user posting from 83.255.36.148 reverted the change on the 17th of November, posting the challenge, 'Who are you to say that's "incorrect"?'

To clarify my stance, I am the original author of this article. I created it back in 2006 when I realized that the previous article was propagating the confusing definition that the unnamed user attempted to add to the article. I feel it is important that Wikipedia information be as correct as possible, and therefore I attempted to source the proper definition of a framebuffer as well as I possibly could.

Since my original work, a number of Wikipedia editors have added additional sources and references to back up the precise definition of what a framebuffer is. Thus I feel that the definition of "framebuffer" as hardware is unassailable, a statement I am sure the unnamed editor will agree with.

What has not been proven is whether there is any merit to the concept that "framebuffer" can refer to something other than a device that generates graphical output. From the majority of people I've spoken with, much of the confusion over this issue appears to derive from the existence of "virtual framebuffers": devices that pretend to be a framebuffer device, but do not really exist. The article goes into some detail on these "fake" devices in an attempt to clarify their relationship to the physical devices.
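As a toy illustration of the distinction drawn above (the class and method names here are invented for this sketch, not taken from Xvfb or fbdev): a virtual framebuffer presents the same pixel-memory interface as a real device, but there is no signal generator behind it, so the only way to "view" it is to read the memory back, e.g. as a screenshot.

```python
# Toy model (invented names, not a real API): a virtual framebuffer
# emulates the memory interface of a framebuffer device, but is not
# coupled to any signal generator or display. Clients such as a
# headless X server can still draw into it and read the result back.

class VirtualFramebuffer:
    """Pixel memory with a framebuffer-like interface but no display."""

    def __init__(self, width, height):
        self.width, self.height = width, height
        self.pixels = [0] * (width * height)

    def write_pixel(self, x, y, value):
        self.pixels[y * self.width + x] = value

    def screenshot(self):
        # With no display attached, reading the memory back is the
        # only way to "view" what the buffer holds.
        return list(self.pixels)

fb = VirtualFramebuffer(4, 4)
fb.write_pixel(1, 2, 255)
shot = fb.screenshot()
```

A real framebuffer would add exactly one thing this sketch lacks: a signal generator continuously scanning the pixel memory out to a display.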

That being said, I am not opposed to mentioning the colloquial usage of the term. In fact, my most recent update changed the text to "colloquial usage" and attempted to explain the difference between proper usage and such "common" usages. If that is acceptable, then great! We're done! :-)

If my update is unacceptable to the unnamed poster, then I request that we open a dialog here and hash out what (s)he wants said, why (s)he thinks it's valid to add it, and what references (s)he wishes to use to source the statement. Per Wikipedia's guidelines, I'm sure we can come to an amicable solution.

Jbanes (talk) 03:38, 23 November 2009 (UTC)

The whole article is vague, incorrect, and confusing. For example, is the article implying that a framebuffer is the same thing as video card, GPU, or PPU? — Preceding unsigned comment added by 169.139.19.96 (talk) 20:44, 19 October 2013 (UTC)
Of the statements "vague, incorrect, and confusing", the only one I can give any weight to is "confusing". The article has certainly changed over the years as a number of editors have touched it. Many of the changes are not cohesive and some are simply irrelevant. As for the other two, disagreeing with an article does not make it incorrect or vague. The article contains a very complete history of framebuffers going all the way back to Richard Shoup's experiments at Xerox PARC. The article may be different from how you've understood "framebuffer" your entire life, but that's what makes Wikipedia great! Articles get to provide the reality, history, and truth rather than opinion. That being said, if you see something that needs improvement, please improve it! That's why Wikipedia is here. :) Jbanes (talk) 19:29, 14 April 2017 (UTC)