Talk:Moore's law/Archive 2



YHz?????

I'm just starting out programming C++, and I wrote a console application about Moore's Law that produced this text:

  • Year: 2006 GHz:2
  • Year: 2008 GHz:4
  • Year: 2010 GHz:8
  • Year: 2012 GHz:16
  • .....
  • Year: 2032 THz:16.384
  • Year: 2034 THz:32.768
  • Year: 2036 THz:65.536
  • Year: 2038 THz:131.072
  • .....
  • Year: 2114 YHz:36028.8
  • Year: 2116 YHz:72057.6
  • Year: 2118 YHz:144115
  • Year: 2120 YHz:288230
  • Year: 2122 YHz:576461

Now, according to what I've read on the Talk Page, this is 1000% (sic) not possible. How could this be considered a law?
(this is based on transistor counts per processor doubling every 2 years).
(I'm not sure that 2 GHz for an average computer is true in 2006, but...) —The preceding unsigned comment was added by 24.49.75.47 (talk) 14:46, 27 December 2006 (UTC).
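
For reference, here is a minimal sketch of the kind of program that could produce output like the above (a reconstruction, not the original poster's code), assuming a 2 GHz baseline in 2006 and clock speed doubling every two years. It walks the full SI prefix ladder, so its values beyond the zettahertz range come out a factor of 1,000 smaller than the quoted output, which appears to skip one prefix:

  #include <iostream>
  #include <string>
  #include <vector>

  int main() {
      // Assumed baseline: 2 GHz in 2006, doubling every two years.
      const std::vector<std::string> prefixes = {"GHz", "THz", "PHz", "EHz", "ZHz", "YHz"};
      double value = 2.0;
      int unit = 0;
      const int last = static_cast<int>(prefixes.size()) - 1;
      for (int year = 2006; year <= 2122; year += 2) {
          std::cout << "Year: " << year << " " << prefixes[unit] << ":" << value << "\n";
          value *= 2.0;                       // one doubling per two-year step
          if (value >= 1000.0 && unit < last) {
              value /= 1000.0;                // move to the next SI prefix
              ++unit;
          }
      }
      return 0;
  }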

  • Again, Moore was speaking of the number of switches per processor, not the speed of the processor itself (although the two can be related). Assuming 50 transistors per square inch in 1965 (http://cactus.eas.asu.edu/Partha/Columns/09-03-Moore.htm), here's a more debatable plot going out to 2122 (badly assuming, of course, that binary switching will still be a relevant concept in 100+ years); a short script that reproduces the table is sketched after it:
  • 1965 50
  • 1967 100
  • 1969 200
  • 1971 400
  • 1973 800
  • 1975 1,600
  • 1977 3,200
  • 1979 6,400
  • 1981 12,800
  • 1983 25,600
  • 1985 51,200
  • 1987 102,400
  • 1989 204,800
  • 1991 409,600
  • 1993 819,200
  • 1995 1,638,400
  • 1997 3,276,800
  • 1999 6,553,600
  • 2001 13,107,200
  • 2003 26,214,400
  • 2005 52,428,800
  • 2007 104,857,600
  • 2009 209,715,200
  • 2011 419,430,400
  • 2013 838,860,800
  • 2015 1,677,721,600
  • 2017 3,355,443,200
  • 2019 6,710,886,400
  • 2021 13,421,772,800
  • 2023 26,843,545,600
  • 2025 53,687,091,200
  • 2027 107,374,182,400
  • 2029 214,748,364,800
  • 2031 429,496,729,600
  • 2033 858,993,459,200
  • 2035 1,717,986,918,400
  • 2037 3,435,973,836,800
  • 2039 6,871,947,673,600
  • 2041 13,743,895,347,200
  • 2043 27,487,790,694,400
  • 2045 54,975,581,388,800
  • 2047 109,951,162,777,600
  • 2049 219,902,325,555,200
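
A minimal sketch that reproduces the table above under the same assumption (50 transistors per square inch in 1965, doubling every two years); the table as archived stops at 2049, so the loop below does too:

  #include <cstdint>
  #include <iostream>

  int main() {
      // Assumed baseline: 50 transistors per square inch in 1965.
      std::uint64_t density = 50;
      for (int year = 1965; year <= 2049; year += 2) {
          std::cout << year << "  " << density << "\n";
          density *= 2;                      // one doubling per two-year step
      }
      return 0;
  }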


Somebody might want to update this with information on what actually happened between 1947 and 2007. The year of "1 switch" has got to be off, at the very least. I'm also semi-sure that the "2007" number has to be off by a factor of 10. 147.145.40.43 19:31, 26 April 2007 (UTC)

Believe it or not, we are actually at 820 million right now (Intel Yorkfield), which is 8 times more than the prediction above. --68.147.51.204 (talk) 09:13, 12 December 2007 (UTC)

The prediction is transistors per square inch, not total transistors, so what is the area of the Yorkfield die, and how many layers? Then you can work out the transistors per square inch. --152.78.202.157 (talk) 11:08, 13 December 2007 (BST)
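
A hedged back-of-the-envelope for that calculation, using figures recalled from Intel's 45 nm launch material rather than anything stated in this thread (roughly 410 million transistors on a roughly 107 mm² die, two dies per Yorkfield package):

  #include <iostream>

  int main() {
      // Assumed inputs, not taken from this discussion:
      const double transistors_per_die = 410e6;        // ~410 million per die (assumed)
      const double die_area_mm2        = 107.0;        // ~107 mm^2 per die (assumed)
      const double mm2_per_sq_inch     = 25.4 * 25.4;  // 645.16 mm^2 in a square inch

      const double per_sq_inch = transistors_per_die / die_area_mm2 * mm2_per_sq_inch;
      std::cout << "Transistors per square inch (single die): " << per_sq_inch << "\n";
      return 0;
  }

On those assumed inputs the figure comes out around 2.5 billion transistors per square inch per die.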

FPNI statement

I just removed the following statement that was just added: In January 2006 Hewlett-Packard announced a new technique using an architecture called “field programmable nanowire interconnect (FPNI)” that promises to jump three generations forward, in violation of Moore's Law.

I removed it because it was in the wrong section (Early forms, rather than Future trends) and because it was unsourced. The source I can find (here) doesn't support the assertion that Moore's Law has been violated. By the time this experimental technique is used to produce chips, it should fall right into line. -Amatulic 23:54, 16 January 2007 (UTC)

Telecommunication cost

Might the article's Formulations of Moore's Law section say something about telecommunication cost trends? Futurists#Future thinking mentions "the fall of telecom costs towards zero," but provides no link. I saw nothing here, nor in telecommunication, about the long-term historical cost trends for telecommunication. Might a graph similar to Kurzweil's show the cost of telecommunication dropping similarly over multiple paradigm shifts, from relay couriers to signal towers to telegraphs, up to fiber optics today? Does anyone know of suitable references? Thank you for your time. --Teratornis 02:18, 10 February 2007 (UTC)

Exponential power increase with increasing clock speed?

"This occurs because higher clock speeds correspond to exponential increases in temperature, making it possible to have a CPU that is capable of running at 4.1 GHz for only a couple hundred dollars (using practical, yet uncommon methods of cooling), but it is almost impossible to produce a CPU that runs reliably at speeds higher than 4.3 GHz or so."

The entire paragraph above seems to be bullshit.
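
For what it's worth, the usual first-order relation for CMOS dynamic power (an addition here, not something the quoted paragraph states) is

  P_{\mathrm{dyn}} \approx \alpha \, C \, V^{2} \, f

i.e. power grows roughly linearly with clock frequency at a fixed voltage, and faster than that only because higher frequencies generally demand a higher supply voltage; "exponential increases in temperature" is at best a loose description of that effect.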


Contrast with peak oil?

Peak Oil (or Hubbert's Curve) is similar to Moore's Law in that both are empirical observations about long-term resource or technology trends with far-reaching implications for society. However, while Moore's Law seems to violate Murphy's Law, Peak Oil obeys it with a vengeance. The two "laws" are related in that energy and information are to some extent interchangeable factors of production; the cost of information steadily falls due to Moore's Law, while the cost of energy appears likely to steadily increase, at some point, due to Hubbert's Curve, barring some seemingly unlikely breakthrough in energy technology (for the most part, a mature industry, where the pace of innovation is slow). Therefore, the challenge before technologists is to find ways to substitute information for energy, wherever possible. Sooner or later, the solution that substitutes information for (some) energy must become more profitable. For example, executives can meet via videoconferencing instead of flying around in business jets. At some point the overall economics must begin to favor videoconferencing/telecommuting/telepresence over physical travel by information workers, although the exact time when this occurs will obviously differ with the application, and there will be the usual cultural lag.

I'd like to mention the above observation, which seems plainly self-evident to little old POV me, but to avoid violating WP:OR I need to find some citations that discuss the information/energy tradeoff. I read about this sort of thing decades ago in some futurism-type books whose titles I don't recall (even before the Internet the economic shift from energy to information was obvious). There was also a Scientific American article, I think, some years back which discussed the steadily-falling energy input per unit of GDP in industrial economies. I'll do some looking, but if anyone else is aware of such thinking and can mention some references, I'd appreciate it. --Teratornis 02:18, 10 February 2007 (UTC)

Regarding "Misconception"

Someone edited the page to include a short opinion about labeling Moore's Law as a theory rather than a "Law". Their opinion is understandable, but Moore's Law is universally known by that title and, further, follows many other maxims with similar common names (e.g. Murphy's Law).

Also it seems like a bit of original research to me.

Anyone care if it's removed? Ar-wiki 12:49, 20 March 2007 (UTC)

Quote from Kurzweil (about there being 5 generations of exponential growth in computers)

I corrected an error in attributing Bletchley Park's "Heath Robinson" codebreaking machine to Turing (it was Max Newman's work). However - I didn't realise that I had corrected a verbatim quote from Kurzweil's article listed as [11] in the references.

I assume that my "correction" will have to come out of the verbatim quote, but that would leave us with a factual error in the resulting encyclopedia article. What's to do - point out the error within the quote with a "(sic)" note followed by a line or two pointing out the correction? What's the correct procedure here?

How about just using brackets? [Heath Robinson]

By the way, Moore's law breaks down long before 600 years. I give it less than a hundred years.

Removed unsourced digression

I am trying to improve the clarity of this article (especially for less technical readers). I removed this paragraph because it was a needless digression. It is also unsourced and (I'm guessing) somewhat controversial, so it should be removed for WP:VER. ---- CharlesGillingham 12:23, 3 September 2007 (UTC)

Under the assumption that chip "complexity" is proportional to the number of transistors, regardless of what they do, the law has largely held the test of time to date. However, one could argue that the per-transistor complexity is less in large RAM cache arrays than in execution units. From this perspective, the validity of one formulation of Moore's Law may be more questionable.

Forest and Trees

I've made a number of changes that are designed to make this article clearer and more accessible to the general reader. I have rearranged and re-titled some of the sections, removed a digression and written a new intro. The main idea here is to avoid getting into detailed technical discussions and disputes in the early sections of the article, and to put Moore's Law into its historical context. The article should make sense at first glance to any educated reader.

There is more to do: there is still some repetition, and some paragraphs are almost undecipherable to someone who is not familiar with chip manufacture. (See the guideline Wikipedia:Explain jargon.) Also, some sections need an editor to make sure each is complete and focussed. Does every one of these observations have to be here? Is there anything important being left out? ---- CharlesGillingham 19:01, 3 September 2007 (UTC)

Force or Result? Cause or Effect?

Am I the only one who has a problem with the last sentence of the introduction?

Moore's Law is a driving force of technological and social change ...

Moore's Law is an empirical observation and as such describes an effect, not a cause. I would like this sentence removed and will do so within a week unless someone has a better suggestion. --Tjconsult 22:48, 20 September 2007 (UTC)

Fixed. Is this okay with you? (The "driving force of social change" here is actually the "increasing usefulness of digital electronics", and Moore's Law "describes it" by roughly quantifying the rate of change.) ---- CharlesGillingham 01:34, 25 September 2007 (UTC)
Thanks. --Tjconsult 04:04, 1 October 2007 (UTC)

Oversupplied or Overpriced?

WP:NPOV: Dark fiber overcapacity and the optical network bandwidth oversupply greatly exceeding demand of even the most optimistic forecasts by a factor of up to 30 in many areas.

I never hear the word "overcapacity" applied to roads, or water. In both those cases, most infrastructure is publicly owned or governed, even if built by private firms. Bandwidth isn't "oversupplied", it's overpriced. Several US municipal agencies have demonstrated that -- by building infrastructure and offering bandwidth priced below private suppliers. Network design & deployment is a notable investment hurdle; and sadly, many of our governance mechanisms are not structured to express "the will of the people" or to strongly act "for the people" in that way. Data pipes have become as fundamental and universally needed as water & sewer pipes became in the past. Similar management and control issues apply, and we should follow the same provision models.

However, this Moore's Law article is not the place to engage that complex issue ... and definitely not the place for language that proselytizes for the theft of the commons. Lonestarnot 06:35, 21 September 2007 (UTC)

18 or 24 months?

Moore's law is not ambiguous at all. Moore stated that the number of components on an integrated circuit doubles every two years, not every 18 months. The confusion is with performance, which is supposed to double every 18 months (but in practice does so roughly every 20 months).

See this transcript of a conversation with Gordon Moore, more specifically paragraph 3:

ftp://download.intel.com/museum/Moores_Law/Video-Transcripts/Excepts_A_Conversation_with_Gordon_Moore.pdf

QcRef87 19:12, 2 May 2006 (UTC)
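
As a quick sanity check on the two figures (arithmetic added here, not from the transcript), a doubling time of T months gives an annual growth factor of 2^{12/T}:

  2^{12/24} \approx 1.41 \text{ per year (transistor count)}, \qquad 2^{12/18} \approx 1.59 \text{ per year (performance)}

The 18-month figure describes performance, which improves faster than transistor count alone because the transistors also get faster.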

Yep, that makes sense. Everyone who speaks of 18 months usually also says performance. Be bold, include the source and edit it yourself. Slicky 06:18, 17 September 2006 (UTC)
There's a source already there, but people that don't bother to read change the number to 18 on occasion. Fixed again. — Aluvus t/c 06:28, 17 September 2006 (UTC)

Why, if this is so clear, does this article use the 18-month number so many times? 205.157.110.11 23:00, 25 May 2007 (UTC)

I made some fixes, including a comment in the lede warning people not to change it back to 18 ... richi 21:12, 24 September 2007 (UTC)
I moved this into a footnote. (see next section) ---- CharlesGillingham 01:29, 25 September 2007 (UTC)

Forest & Trees

I think it's important to keep the lead as clear as possible. Someone who arrives here trying to find out what Moore's Law is doesn't need to know messy details unless they are absolutely necessary for accuracy. There is plenty of room to refine the definition and hash out details throughout the rest of the article. The lead should be written primarily for readers who are completely unfamiliar with Moore's law.

(The importance of Moore's Law is much wider than the chip industry. It has the same kind of significance as trends like the green revolution, globalization or urbanization, and so it gets mentioned occasionally in history, sociology and other humanities. The lead should be written primarily to help these readers, who may have no familiarity at all with the law, its history or its importance, and may know next to nothing about digital electronics. The rest of the article is for readers who are interested in the details.) ---- CharlesGillingham 01:29, 25 September 2007 (UTC)

Updated diagram?

Anyone know the transistor counts for, say, the Core 2 line of chips so we can update that diagram on the top of the page and see if the law has been holding for the past several years? Sloverlord 17:38, 18 October 2007 (UTC)

Removed spurious reference document

Removed the link: The Impact of Pervasive Symmetries on Hardware and Architecture from the Data reference section. The document is a fake machine-generated paper. —Preceding unsigned comment added by 130.235.34.47 (talk) 03:46, 1 November 2007 (UTC)