Talk:Computer/Archive 1


I heard something about variable computers. If someone knows what they are and how they work, can they type it in?


Is my thermostat a computer?  :-)

No, it is a simple feedback device, unless it is a programmable thermostat.

Suppose it's programmable.

Then yes, it is a computer. I think a more natural way of speaking would be to say that it has a computer in it.

Well, no; it has an embedded chip...that doesn't make it a computer, does it?

We have to start with a definition. To most people, 'computer' means a personal computer, and even when they think about a supercomputer they picture a more powerful PC. If, however, we stick to the definition 'a device that processes data', then 'computer' takes on a much broader meaning. ENIAC was a computer, but it did not resemble present computers. A computer computes data, so any device that does so can be called a computer. A programmable thermostat has a small computer inside, and one of the more sophisticated thermostats might be more powerful than ENIAC.


I think a strong connotation of 'computer' nowadays is that it is universal (i.e. it can perform any computational task). A thermostat can be incredibly sophisticated, but it will still only tell you when to turn on the heater. A Pac-Man machine will only play Pac-Man. But a computer can do either of those things, or much more, so long as you give it instructions on how to do so.


A computer used to mean a person who computed, eventually a person who computed using an adding machine. Many of these computers were women. The computations were often systems of differential equations (or other linear systems), for example, solving problems in ballistics.



I intend to give this page a serious working over as the result of some interesting discussions on talk:Konrad Zuse and on the other "history of computing" related pages. Robert Merkel


As the author of the page (though it has been improved somewhat since), I think a complete rewrite would be nice. I wrote it mostly in desperation that so important a topic had only a one-line entry. The current page is better than that but not particularly good.

However, I suggest not deleting anything from the page until you have a complete article that covers all the important stuff already there (and hopefully more!). One way might be to rule a line at the top (or bottom) and start your rewrite in a separate section. When you have enough there, the old version could be removed.

I have seen a few other pages where mediocre articles were deleted by someone who then ran out of steam before completing their rewrite, leaving something worse than the original. Leaving both versions available during the transition protects somewhat against this disaster. Best of luck here!


I didn't see the above comment until I had committed my rewrite (it was actually a good idea you had; if somebody can restore the old article and hang it somewhere, that'd be good).

(Done. It's at the end of the new one. New one is looking good!)

It is approximately half "feature-complete" at this point. Seeing we already have a great deal of other material on computing topics, I intend to concentrate merely on the "what is a computer" question, with very brief overviews of the other two subheadings.


Any suggestions (or just plain edits) on how to improve my explanation of why Turing-completeness is important would be appreciated. Robert Merkel


On the commercial computing side, data processing machines began to enter serious use circa 1895 in the USA, and during the early 20th century in many other places. These machines were usually based on punched cards and could do operations such as sorting, tabulating, counting, and in some cases calculating a total, but were probably not general enough to be considered computers. When computers became cheap enough to be competitive, they took over, because they can do all this and have much more flexibility. Many of the technologies used in computers from 1945 to 1970 were originally developed for earlier data processing machines, and some of the same companies were involved (IBM and Burroughs, maybe Sperry, probably others in other countries). In the history section this seems somehow relevant, but you write so much better than I do, so I leave it to you to decide if, or how, to add it.


Yes, the new one is really looking good! --LMS


I think the old version should be moved here. Also, even though the main article could be expanded almost without limit, it might be good to move or remove all the metacomments so that it will read like most of the other articles. David 19:49 Sep 21, 2002 (UTC)


"It's now commonplace for operating systems to include web browsers, text editors, e-mail programs, network interfaces, movie-players and other programs that were once quite exotic special-order programs." Is this realy true even in windows? In Linux vim or emacs, xine or mplayer, kmail, evolution mutt or pine ,konqueror galeon opera mozilla or firebird would never be considered part of the OS which most people would consider the kernel, likewise for OSX and its BSD derived kernel. Do people realy regard notepad and IE and outlook and windows media player as part of the OS - I know microsoft claims it for IE, but the rest? I'm sure Microsoft would like to claim almost any aplication where they have competition is "part of the OS" so they can happily bundle it with the OS, but bundeling with and being part of are two different things. This passage should be defended or removed, is there any example of an OS where eg the text editor is part of the OS? Generally where the functionality can be provided by an alternative program it cannot be considered part of the OS. The example of network interfaces is the one example of a function which has genuinely been taken over by the os as compared to a third party program (cf. trumpet winsock) Htaccess

There's not really a clear line between what is part of the OS and what isn't; usually the core programs are considered a part of it. Programs like QuickTime and Safari could be considered part of Mac OS X because they are first-party and are integrated at the system level instead of just the application level. Even a text editor may be: vi is part of the Single Unix Specification. Network interfaces have moved from userland to the kernel, but the kernel is not the same thing as the OS. --HunterX 22:51, 2005 May 11 (UTC)

I really like this article but feel it is missing an important piece about the use of computers for codebreaking in WW2. Is that somewhere else, and if so, can we put in a link to it? --(talk)BozMo 13:37, 14 May 2004 (UTC)

Well, that's what most of the work in building an encyclopedia consists of, isn't it? :-) As a start, see maybe: Ultra . Kim Bruning 14:09, 14 May 2004 (UTC)



Oh dear. For some reason John von Neumann isn't mentioned in the definition section. Odd that. Other missing people are Alan Turing and the pair Alonzo Church and Stephen Kleene.

These folks provide 3 different detailed definitions of computer, all of which are currently in actual use in the field. :-)

The 3 POVs (each with a very short summary, so YMMV) are:

  • von Neumann: a computer is a machine built on the stored-program (von Neumann) architecture.
  • Turing: a computer is any device that can simulate a universal Turing machine.
  • Church and Kleene: computation is anything that can be expressed in the lambda calculus.

These 3 definitions overlap:

  • A von Neumann architecture can simulate a Turing machine, with the understanding that most implementations can't actually simulate an infinite-length tape, just a very long one.
  • Originally people thought that von Neumann machines weren't particularly suited to performing lambda calculus. Over time people have gotten more practice. Nowadays, a language like OCaml might actually run faster than a language (like C) that was specifically designed for von Neumann architectures.
  • A Turing machine can be simulated in lambda calculus, and lambda calculus can be performed by a Turing machine.

This means that all 3 POVs are logically equivalent, but they do each bring a slightly different way of looking at computers to the table.
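To make the equivalence a little more concrete, here is a minimal sketch of my own (in Python, purely illustrative, not taken from any of the texts above): Church numerals let the lambda calculus do arithmetic using nothing but functions, while an ordinary von Neumann machine happily simulates the whole thing.

    # Church numerals: the number n is "apply a function f, n times".
    zero = lambda f: lambda x: x                     # f applied zero times
    succ = lambda n: lambda f: lambda x: f(n(f)(x))  # one more application of f
    add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

    def to_int(n):
        """Decode a Church numeral into a plain Python integer."""
        return n(lambda k: k + 1)(0)

    two = succ(succ(zero))
    three = succ(two)
    print(to_int(add(two)(three)))  # prints 5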

I guess I'd better look into this better sometime. Kim Bruning 23:06, 28 Jul 2004 (UTC)

Partially done, and extended this comment as a result. Kim Bruning 07:40, 29 Jul 2004 (UTC)

I just completely rewrote the definition section. Turing and von Neumann now get mentions but Church does not (maybe you could add him in a suitable place). Given the general title, I tried to stay away from too much theory and leave the details for other pages.

I think the following would be useful:

  • A schematic diagram of a simple von Neumann architecture to illustrate the "How a computer works" section.
  • More information in the computer applications section
  • Also a sprinkling of more up-to-date references in various sections

I also moved a lot of the etymology to wiktionary:computer, which seems a much better place for it.

John Harris


What do I want from a computer? Well, love, but I've pretty much given up hope on that one. Ten years ago a guy named Steve made a computer called NeXT, and things still suck. This is intolerable. Why don't I have a minimal instruction set computing (MISC)-type box on my desk? Something like a Forth engine (cf. Chuck Moore's F21 CPU and similar), with Lisp as the app language (coded itself in machine Forth, of course). Forth can be implemented surprisingly efficiently with a stack machine. No register mess, which makes context switching basically painless and trivial. Easy, cheap, frequent procedure calling is encouraged, with implicit low coupling and modularity thanks to the ever-present stack. Programs are small and efficient because so few instructions are required on a complete stack machine. Oh, yes, while I'm at it, I'd like the whole thing implemented in asynchronous logic CMOS please, so I can implant it in my cerebral cortex and run it off body heat and the occasional shot of glucose. Sigh. Well, I'm a cs geek, but there are times I wish I was in comp. engg...
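To illustrate why stack machines get away with so few instructions (a toy sketch of my own in Python, nothing like a real Forth engine such as the F21): every operation implicitly takes its operands from, and leaves its result on, the stack, so there is no register allocation, naming, or saving anywhere.

    # Toy stack machine: a handful of zero-operand instructions, no registers.
    def run(program):
        stack = []
        for op, *args in program:
            if op == "push":
                stack.append(args[0])      # the only instruction with an operand
            elif op == "add":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == "mul":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
            elif op == "dup":              # Forth's DUP: copy the top of stack
                stack.append(stack[-1])
        return stack

    # (3 + 4) squared, with the stack carrying every intermediate value:
    print(run([("push", 3), ("push", 4), ("add",), ("dup",), ("mul",)]))  # [49]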


The picture below shows one of my motherboards, which fried mysteriously; it was connected to a high-end UPS which showed no power issues at all. But obviously something went wrong. Anyway, I thought the pic might be of use to someone. Belizian 20:26, 6 Oct 2004 (UTC)


Fried motherboard. It didn't come from outside; something with the board itself caused this.

Computar redirects here. Is that right? Neither Dictionary.com nor Google seem to have anything useful. Brianjd


I removed the information on CPU manufacturers because it was misleading and incomplete. Intel and AMD both make x86 chips, and it doesn't make sense to mention Motorola (68k) and Cyrix (another x86) but not PPC, which is still used for a minority of desktops (or, for that matter, other low-key x86 players like Transmeta or VIA). In any case, information on manufacturers isn't really necessary to a general discussion of ALUs. --HunterX 08:16, 2005 Jan 7 (UTC)


I think the newly added general principles section is a good idea, but I'm not sure Claude Shannon deserves special mention. His work on information theory was very important, but it is only incidental to computers. Also, he was not the only one thinking about Boolean algebra; Konrad Zuse was doing the same thing in Germany. Alan Turing was certainly more important than Shannon, but again he is only one of many. In general I think name-dropping in a general principles section is not very helpful, since it will evolve into a who's who of computer pioneers. -- John Harris 2005, Jan 8

Uh, the reason Shannon is there is NOT because of his work on information theory. Before you make a fool of yourself further, please read the Shannon article in full, and the attached discussion page, to understand why he is listed in this article. Shannon had two major accomplishments in life: (1) founding digital circuit design (logic design) theory, and (2) founding information theory. It is the first that was his major fundamental contribution to modern computers. Otherwise we might still be designing electronic computers the old-fashioned way, totally ad hoc, like Zuse, Stibitz, Atanasoff, etc., instead of the fancy way, with modern Boolean algebra.

--Coolcaesar 05:37, 12 Feb 2005 (UTC)
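(To make the Boolean-algebra point concrete, here is a minimal sketch of my own in Python, not anything from Shannon's paper: once wires are treated as Boolean variables, a circuit like a half adder is just two algebraic expressions, which can be checked and simplified systematically rather than ad hoc.)

    # Half adder in Boolean algebra: sum = a XOR b, carry = a AND b.
    def half_adder(a, b):
        return (a != b, a and b)   # (XOR, AND)

    # The full truth table, checked exhaustively:
    for a in (False, True):
        for b in (False, True):
            s, c = half_adder(a, b)
            print(int(a), int(b), "->", int(s), int(c))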


In the etymology section, it might be worth mentioning that "computer" is of Latin origin: it comes from the verb "computare", which means "to count". It also gave rise to the French verb "compter", which has the same meaning.


Error on graph

Somebody needs to fix the image Image:PPTSuperComputersPRINT.jpg -- the vertical axis says that FLOPS = "floating point operations" where FLOPS is actually "floating point operations per second." Gary D Robson 17:33, 14 July 2005 (UTC)


In case anyone was looking at the history, the reversion that I made was against the vandalism by 219.95.97.41, not 219.95.97.4 as I accidentally said. DarthVader 14:04, 21 July 2005 (UTC)

History?

Where's the history section? I know, I know...be bold, but is it just somewhere I haven't seen yet? --Wtshymanski 19:47, 22 July 2005 (UTC)

See History of computing hardware. But also see my comment below - history has been displaced by all the other random crud that this article has accumulated. --Robert Merkel 04:35, 29 July 2005 (UTC)

Stop adding random crap!

Don't take this personally, but this article has become a near-unreadable collection of people's personal hobby-horses.

One unfortunate problem of encyclopedia articles by committee is that while committees are very good at adding things, they are terrible at removing them. This particularly shows itself in "overview"-type articles like this one, because everybody wonders why their little bit of expertise isn't mentioned in the main article (it's in fact discussed at length in more narrowly focussed ones). The "classification" section, for instance, is incredibly unwieldy, and in my view adds little of value for the reader. This article therefore needs a thorough renovation, which I intend to start work on tonight.

Please, just don't add whatever random fact comes into your head to an overview article! Think about whether it actually adds value to the likely reader! --Robert Merkel 04:34, 29 July 2005 (UTC)

New version started at Computer/Temp. --Robert Merkel 13:59, 29 July 2005 (UTC)
Yep, have at it. An article here will always be mediocre until someone takes it upon themselves to make it better. Lots of people can do that, and do so for lots of articles all the time, but not every article has someone get to it. Keep us updated here on major changes in the temp page version. My general comment is to try to think generally about the whole topic and prioritize what are truly the most important topics, and what needs to be covered and what doesn't. This article will be particularly dicey to keep free of POV, because everyone has a different way of interacting with the computer and a different focus. Try to be as general as possible. - Taxman Talk 15:19, July 29, 2005 (UTC)
Nice re-write. Since I wrote the original "unwieldy" classification section, I thought I might add a few comments. Don't worry, I'm not offended; I'm just amazed it lasted so long! The new introduction seems a bit heavy to me. Introducing the Church-Turing thesis in the second sentence seems a bit aggressive. I think the first paragraph of this type of general article should be aimed at an educated 12-year-old. I'd suggest leaving the theory of computers till at least the second paragraph. The History section is a nice summary. But it does illustrate a problem I had: which names should be listed in such a section? You have Babbage, Zuse and Shannon. Obviously important, but why not Turing and Eckert etc.? I could never find a solution to this problem; someone is always going to want to add someone else. Finally, I think there is a bit of revisionism going on here. The computer scientists and theorists seem to be taking all the credit when in reality it was a bunch of anonymous engineers who really made computers work (I understand what I am saying here and will stand by the claim). Where are the mentions for Tommy Flowers, Mauchly and Stibitz? All these men made contributions of significance. They used to say that "science owes more to the steam engine than the steam engine owes to science"; I think a similar statement can be made about computers. Computers were made to work by men who did not always understand the theoretical implications of what they were doing. But so what! They still made them work! The theorists were often left to explain what had been created, not to specify it. Virtual Traveler
Thanks for your understanding about the need for a rewrite; while I happened to pick on the classification section, my whinge was about the profusion of material the page had collected on many topics, which had completely obscured the forest for the trees.
As far as the Church-Turing thesis goes, I was trying to get across the vital idea that computers are universal information processors; I tried to mention it in the most painless way possible. Feel free to change it (of course); I'll have a think about a less hardcore version.
The history section was a nightmare to get to the state it is in; in the end, I avoided mentioning too many names because computers were largely team efforts, and singling people out is simply too difficult (and controversial in the case of Eckert, Mauchly, and von Neumann). As for the three names that did get mentioned: maybe Shannon's name could be removed without interrupting the flow of the article, but there's no other way to describe Zuse's projects without mentioning Zuse himself (and if you didn't mention his work, any random German who came across this page would go ballistic at the slight). And, as far as Babbage is concerned, he was the first person to dream up (in anything other than a fanciful sense) the idea of a computer. I think it's worth a sentence in a potted history, and if you mention his work you need to mention him.
As to the anonymous engineers, there's a certain truth to that, but I did try in the history section to mention the importance of the many precursor technologies. --Robert Merkel 08:36, 21 August 2005 (UTC)

New version drafted at Computer/Temp

I have drafted a heavily revised version of this article at Computer/Temp. I have deliberately narrowed the focus to the modern, digital computer, and removed a lot of the definitional stuff, and tried to give more comprehensive treatment of what a stored-program computer is and what it does. It still needs a lot of work, but I think it's to a point where it can replace the present article as the basis for future work. If anybody's got a violent objection to me replacing the present version with the new version, could they say so here? --Robert Merkel 13:31, 12 August 2005 (UTC)

Halting problem

You know, I'm on crack today, it seems. Let me re-state: The halting problem is not NP, as I had stated. It is not solvable, as stated in the edit, which I will restore, but make a bit clearer. -Harmil 15:01, 15 August 2005 (UTC)
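For anyone following along, the classic diagonalization argument is easy to sketch (in Python, purely as an illustration; the function names are hypothetical): if a total, always-correct halts() existed, you could build a program that halts exactly when it doesn't.

    # Suppose, for contradiction, someone handed us this oracle:
    def halts(program, argument):
        """Hypothetical: returns True iff program(argument) halts."""
        ...

    def trouble(program):
        # Do the opposite of whatever the oracle predicts for program(program).
        if halts(program, program):
            while True:   # predicted to halt, so loop forever
                pass
        else:
            return        # predicted to loop, so halt immediately

    # Consider trouble(trouble): if halts() says it halts, it loops forever;
    # if halts() says it loops, it halts. Either way halts() is wrong, so no
    # such function can exist -- the problem is undecidable, not merely hard.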

Replaced with new version

OK, In the light of some suggestions and mainly positive comments, I have bitten the bullet and replaced the old version with the version from Computer/Temp. Have at it. --Robert Merkel 05:54, 20 August 2005 (UTC)

Illustrations...

One thing the rewrite hasn't really touched is the question of illustration. While I'm sure we can find suitable photographs, is there anything in this article that really needs a diagram for further elucidation? Maybe a conceptual diagram of the stored program architecture? --Robert Merkel 04:47, 23 August 2005 (UTC)

It would be nice if we had some pictures for the "History" section. --Keramida 05:04, 23 August 2005 (UTC)

"The network is the computer"

The recent edit by Jjshapiro is wrong: the quote from Sun was as above, not "the computer is the network". Also, most of what's been added is better covered in the Internet article. And there's a big difference between networking and internetworking, which was left alone before (not particularly relevant to the Computer article anyway) but is nicely muddied here now... I just don't have time at the moment to try to sort all this out for him/her. Maybe it should just be reverted, but 210.211.241.195's new para about the WWW wasn't much good either - "shopping or marketing a product"? Sorry, guys. --Nigelj 19:18, 11 September 2005 (UTC)

Whoops! Thanks for correcting my reversal of that quote. But don't you agree that that principle involves a redefinition of the scope of the computer and should be in the Computer article? I do agree that in principle internetworking doesn't necessarily belong in the Computer article except by extension. On the other hand, under today's conditions, the internetwork is the network. Jeremy J. Shapiro 15:18, 22 September 2005 (UTC)

VANDALISM ALERT!

Someone inserted words like "gay" and "bumming" into the article and broke some links in the process! Please restore this article to an honest state. Thanks.

Atanasoff-Berry Computer

I have reverted the history section to an earlier version. This is an *overview* article, and giving so much prominence to a special-purpose machine like the Atanasoff-Berry Computer is inappropriate. While its contributions of using binary numbers and all-electronic computation were important, it wasn't a general-purpose machine. It was another step along the path to the general-purpose, stored-program machines, perhaps comparable in importance to the ENIAC and Zuse's various machines, and thus worthy of approximately the same amount of space. Frankly, the ABC page needs some NPOV and cleanup work too (the article confuses the concepts of Turing completeness and the stored program architecture), but that's another issue. --Robert Merkel 13:49, 5 October 2005 (UTC)

I am familiar with the history of the ABC and agree with user Robert Merkel's edit. Well argued. --Coolcaesar 05:12, 6 October 2005 (UTC)

Revert.

I recently reverted some edits that seemed to confuse computer with personal computer as explained here. I thought I'd put this here since it could look like vandalism on my part to an outside party. Indium 10:01, 16 November 2005 (UTC)


revert of first paragraph

I have removed the bit about networking from the lead paragraph, as networking is *not* an essential component of all computers, and it's a hell of a lot older than 10 years. See embedded computer for an example of computers that often don't have any networking capabilities, and ARPANET for networking that long predates 1995. Computers are more than just home PCs, people. --Robert Merkel 12:32, 20 November 2005 (UTC)

That is indeed very conservative

It seems that Robert Merkel is extremely conservative and intolerant. When one talks about computers, whether server computers, embedded computers or personal computers, their ability to communicate is unquestionable. It appears that he belongs to an ancient tribe which believes that computers can only compute. In fact, communication in the modern world uses computers extensively. He is behaving like an ostrich. The definition of the word computer has changed through the decades, from a person who calculated, to the analytical engine of Charles Babbage, to electronic calculators (Konrad Zuse), to microprocessor-controlled mainframes, to the modern computers that are seen in several avatars. So nothing is served by citing the definition in Merriam-Webster's dictionary. Definitions of terms change with time, and one has to learn to accept the changes. Charlie 08:46, 21 November 2005 (UTC)

Charlie, are you familiar with the Wikipedia:no personal attacks policy? Please follow it. And, if you'll excuse me a little credentialism for a moment, I have a PhD in software engineering, and am currently a postdoc; my specific field happens to be software testing. I started using computers in 1980; I actually started writing real code in 1989. At one time, I did some research into network fault diagnosis for a multinational telecommunications hardware provider. After that, I worked for a company developing open source software; I was the only Australian-based employee of the company, and my primary communication with them was by email and private IRC. While I missed the BBS era, I was certainly aware of it, and was a regular reader and occasional contributor to Usenet back when it was the primary group communications medium of the internet. At the moment, I'm working on building a simple embedded system in my spare time, and, while it's not my field, I have friends who work on wireless networking stuff. As to my understanding of the history of computers, as well as reading many books on the topic (one particularly relevant to your claims is The Dream Machine by M. Mitchell Waldrop), one of my undergraduate lecturers, Peter Thorne, worked on CSIRAC (The Last of the First by McCann and Thorne is an excellent account of the machine); he was there at the very beginnings of the modern computer. So I think I have a pretty comprehensive perspective on what a computer is and isn't.
Firstly, the claim that computers only became communications devices in the last decade is an incredibly PC-centric view of the world. Perhaps the first wide-area computer network was SAGE; by the release of the IBM 360 series, remote terminals were a commercially available feature; see this press release from 1964. The ARPANET, the direct ancestor of the modern internet, was switched on in 1969. Even in the microcomputer world, Ethernet was common in offices in the 1980s, and BBSes were popular in the hobbyist underground (if you could afford the phone bills). There is nothing fundamentally new or changed about networking in the last decade; it's just that what was largely restricted to large businesses, the military, academia, and the hacker underground went mainstream. It's like claiming that cars were invented in the 1950s because that's when every family got one.
The second claim is that networking ability is somehow a fundamental characteristic of a computer. While the ability to receive input from and send output to the outside world is indeed a fundamental component of computation, the exact method by which this is achieved is not particularly fundamental to making a computer a computer. If I unplug the network cable from my PC, it does not become any less of a computer! Are the small pile of embedded computers in a BMW 7 series somehow not computers because they're not connected to the Internet? My telephone is an incredibly useful networked communications device, but it is not a computer (though mobile phones are essentially embedded computers with radios attached). The key thing that distinguishes computers from other devices is the ability to compute, not to communicate over a global network.
So, conservative or not, I think I'm right and you're wrong, and if you want to try to convince me you'll have to do better than gratuitous insults. --Robert Merkel 12:40, 21 November 2005 (UTC)

Sorry if I sounded like I was insulting you; I had indeed no such intention. The point I was trying to make is that at present the most common perception of a computer is a device that is (and can be) used for communication. You must be aware of the old saying "exceptions prove the rule". True, there can be computers that are bereft of any communication ability, but they are not the only computing machines. A vast majority of computers are now being deployed for communication purposes. So if a line is added to the definition saying that computers can also facilitate communication and are increasingly used to do so, I see no harm. But you seem to be extremely possessive of your lines and cannot tolerate anyone changing them, so let it be. But remember, we are communicating with each other only through computers, and that includes computers other than our PCs. Charlie 07:04, 22 November 2005 (UTC)

While I disagree with several aspects of his rewrite of the Computer article, I concur with Robert Merkel's point that computers are primarily devices for computing, and their modern use as communication devices is merely secondary, that is, only one of many possible applications of computing technology. If you do not understand Merkel's analysis, it is probably because you are unfamiliar with basic computer science, in which one learns that a computer can be built on the basis of any well-understood physical principle which can be mapped to a Turing-complete set of basic operations. Working computers have been built out of Tinker-Toys and toilet paper rolls. --Coolcaesar 07:14, 22 November 2005 (UTC)
By the way, what are those? This is the Wikipedia; the article ain't holy writ. I only get narky when edits happen to make the article worse rather than better. --Robert Merkel 11:53, 22 November 2005 (UTC)
If you go back to the earlier version of the Computer article, I had inserted an explanation of two of the most important concepts underlying computing: computers simply carry out mechanical operations which result in the drawing of shapes to which humans assign meaning. Computers do not really think (at least yet). Furthermore, computing is not theoretically linked to any specific implementing technology, and computers could be built out of anything, although we use electronic computers for now due to their reliability, and because other technologies suffer from the "Turing tar pit" problem.
In my experience, these concepts are probably the most difficult for laypersons and non-mathematicians (myself included) to grasp. It is because these ideas are so difficult that (1) computers have often been stereotyped by the media as "thinking machines" (and indeed there was a supercomputer maker of that name) and (2) electrical engineering is often confused with computer science. --Coolcaesar 03:57, 23 November 2005 (UTC)
With regards to the implementation technology, that's a reasonable point; it's sort of implied but perhaps not made explicit enough. However, you open a can of worms with your contention that "computers do not really think". How do you know that I'm not just carrying out mechanical operations that result in keys being pressed on my keyboard that *you* are assigning meaning to? :)
The Artificial intelligence article has also gone down the toilet over the years and I feel the urge to rewrite coming on (compare this old version with the current article), but it (or the associated articles) *should* explain this issue. The two famous thought experiments in the area are the Turing test and the Chinese room. Personally, I tend to dismiss the Chinese room; but then, I don't assign any "intelligence" to a chess-playing computer, and plenty of people at least used to think chess-playing ability was the sign of intelligence.
I agree that some brief mention of the question "can computers think" should be made on this page, though, but it needs to be done well.

It may be very true that computer scientists learn through their education that "computers are devices that process information only", but the readers of Wikipedia are not computer scientists alone. To press this point is rather puritanical, not practical. Charlie 08:25, 22 November 2005 (UTC)

What's the problem, Charlie? The communication aspects of modern computing are well covered in the section Computer#Networking and the Internet. It's not fundamental, but it is important, so it has its own subsection, which is well linked to other specialist articles. No problem, as far as I can see. --Nigelj 10:38, 22 November 2005 (UTC)
Computers are devices that process information. One of their major applications is as communication devices to the extent that the information is sent to and comes from a remote location. How hard is that? I don't know why Charlie is having such a hard time with this.
Furthermore, the tendency on Wikipedia is to respect the general usage of a term by professionals in a field (for example, Internet v. internet). This is due to Wikipedia's general "No original research" policy, which in turn gives rise to corollaries like a preference for objectivity over subjectivity. The objective definition that most editors of this article support is already established in computer science textbooks all over the world, and is taught in computer science courses every day.
Look, Charlie, if you want to modify the definition, go do some research and get articles from major newspapers or peer-reviewed journals to establish a shift in the formal or informal meaning of the term. Then come back here and post your evidence and maybe a consensus will develop in favor of a change. For help on research, see Wikipedia:How to write a great article.--Coolcaesar 03:57, 23 November 2005 (UTC)

That's it. Why can't this article be introduced with your definition: "Computers are devices that process information. One of their major applications is as communication devices, to the extent that the information is sent to and comes from a remote location." Charlie 07:14, 23 November 2005 (UTC)