Talk:Broadway (processor)


CANADA[edit]

How could you say the chip is made in New York when it clearly states on the chip itself, when you open up your Nintendo Wii, that the chip is made in Canada; not in Japan, not in the USA, not in China, but in CANADA. I think we should stop being biased toward America being the best country made by God and give credit where it's due for this chip's origin. IT SAYS IT RIGHT ON THE CHIP!!! Otherwise it's like saying the first NES was fully designed, produced and implemented in the USA, except Japan just received delivery years before.

Curious though, that it contradicts the press release linked in the article, which says they are being fabricated in East Fishkill, New York. I'm not saying you are wrong, and I have to admit I haven't opened up my Wii to check.--203.6.205.22 03:46, 6 March 2007 (UTC)[reply]
Here are some other links to support East Fishkill as the source: [1] [2] I just did a Google search for "broadway wii processor fabrication"; I didn't see Canada mentioned anywhere. --203.6.205.22 03:57, 6 March 2007 (UTC)[reply]
The chip is fabricated in East Fishkill, New York, but IBM has a packaging plant in Bromont, Quebec, Canada where it's packaged. I.e., put on a substrate with connectors to the motherboard, sealed and labeled. I feel for the Canadians that want to feel important, but really.. this isn't that big a feat, packaging a microprocessor. Not if you compare it to designing and manufacturing the actual silicon. The text on the packaging is there for ID and tracing individual chips, not celebrating nationalities. -- Henriok 10:39, 6 March 2007 (UTC)[reply]
I've reverted the text, because it was plain wrong the way it was written. IBM's only 300mm fab is in Fishkill. The chip appears to be packaged in Canada. I see Bromont mentioned in a couple of places, but I can't confirm that. --plicease 20:05, 6 March 2007 (UTC)[reply]

I just thought I'd say something, since I actually work at the IBM plant in Bromont, Québec, where the chip is assembled. Sure, we don't make the part, but we do the bond, the assembly and the test. We make sure your Wii runs right. --Ezzeloharr 02:36, 20 July 2007 (UTC)

729MHz?[edit]

That's like G3...The Captain Returns 23:37, 15 May 2006 (UTC)[reply]

Does anyone have a solid reference to the Broadway operating at 729 MHz? I've never seen this from a trustworthy source. —The preceding unsigned comment was added by 203.217.44.35 (talkcontribs) 13:10, 2 August 2006 (UTC)

The source of the 729 MHz figure would appear to have been a posting on 4chan where it was seemingly invented as a way to make fun of Nintendo's new console for not even being as fast as the original Xbox, which was 733 MHz. The validity of this figure is therefore highly questionable.
This number seems to be related to the frequent 'Gamecube 1.5' postings seen on many forums, as 1.5 * 485 ≈ 729. 64.6.0.220 06:02, 20 September 2006 (UTC)[reply]
This number comes from IGN, who say it's the figure in the most recent official Nintendo documentation sent to developers, from whom they claim to have sourced the specs. It's right there in the references.
Let's see the Nintendo developer documentation on the numbers then, instead of the information from an anonymous developer that IGN cited. Until Nintendo says something official on this, it's too easy to hoax or simply invent numbers. 64.6.0.220 19:21, 20 September 2006 (UTC)[reply]
It most certainly is. However, as I said, the figure originated from IGN, not from 4chan or other suspect forums. That was the point I was correcting. IGN is a reputable source; if they say they received the information from an actual developer, then they most likely did. 208.152.231.254 19:09, 20 September 2006 (UTC)[reply]
Whether or not it came from IGN today, that number has been kicking around for months now, the only sources for it have been very questionable, and it has been most often repeated by internet trolls. If IGN has a new source of information I'd like to hear it. However, until Nintendo or IBM say anything I don't think we can list an actual value. Even if the number comes from a developer there is no guarantee that they have access to the final production hardware, and development kits often differ greatly from the machines that are actually released to the public, especially prior to release. 64.6.0.220 05:25, 21 September 2006 (UTC)[reply]
Oh I don't know... those numbers look pretty accurate to me... look at the October 2006 EGM, pg 26, where they interview the devs of Brothers in Arms. They ask him if the Wii is just a souped-up GC. The answer: yes. He also says it barely manages to outdo current-gen Xbox specs. 70.143.70.24 07:10, 22 September 2006 (UTC)[reply]
It didn't come from IGN "today", it came from them at the beginning of the year. Of course the numbers have been kicking around for months! Did you even READ the article? Geez. (BTW previous comment about Brothers in Arms is someone else, not me.) 208.152.231.254* 00:11, 25 September 2006 (UTC) (* from a different IP)[reply]
Please explain the EGM article then. 70.143.70.24 08:24, 25 September 2006 (UTC)[reply]
I'm not sure what your question is. My response was to 64.6.0.220 if you've misread my comment as addressed to you (the indent and fact I was saying I wasn't you should have given that away.) 208.152.231.254 19:38, 26 September 2006 (UTC)[reply]
Okay... here's what I'm saying. If the IGN article is wrong because it's old, then how come the article from EGM (which is an Oct 2006 issue) states almost the same thing?
Neither Nintendo nor IBM has said anything, and there is no guarantee that the devkit and the final hardware are the same (they often aren't), so until we hear something from them these numbers are all just rumors. 64.6.0.220 05:05, 27 September 2006 (UTC)[reply]
Yes, you're right. The dev kits are usually more powerful than the final hardware.
Again, why are you asking me? I was correcting 64.6.0.220's assertion that the first references to the 729MHz figure came from unfounded, dubious, X-Box fanbois. As I said, it didn't, it came from IGN, which I've described as a reputable source, something I stand by. I'm really unsure how you can read into that that I'm saying IGN is wrong because it's old. How do you get that out of what I wrote? I'm saying the exact opposite. 64.6.0.220 is the one saying it's wrong because it's not old enough, but he thinks it was just published, rather than published in March.
He thinks it's wrong because he thinks it copied various predictions from forums. He's made no references to such forum posts to back this up, and it strikes me as likely his conclusion is based upon reading those forums before seeing the information reported in the media, not realising the media's reportage predates those forum posts.
Realistically, the probability, as I see it, is that IGN is ''probably'' correct. IBM and Nintendo aren't saying anything, but specific developers known to named writers at named technology groups are saying that's what the official documentation from Nintendo says. Either those developers are hoaxing the journalists (unlikely, what would be their incentive? To ensure their games never get good reviews again?), Nintendo is hoaxing the developers (unlikely, what would be their incentive? To ensure the first games for the Wii suck because they underutilize the CPU and have mysterious compatibility problems that a G5 replacing a G3 would necessarily result in?), or the facts are more or less correct.
Further to that, there's just plain no reason to disbelieve it. The Wii is designed to be sold cheap and make a profit. The other consoles may have significantly more powerful CPUs, but they're being sold for up to 2.3x the price, and even at that price are loss making. The Wii is also designed to use very little power while staying on 24/7. It's not on standby when it's doing that either, it's connected to the network. The Wii is intended for use with an SDTV, and it uses SD resolutions. The requirement is low power consumption. There's no call for significantly higher CPU power.
Until IBM or Nintendo say something, I'm not in favour of biasing the main article one way or another. But the argument the 729MHz figure has been debunked because it appeared in forums apparently manufactured by some anti-Nintendo faction is clearly false. It has an origin. One reputable online magazine, quoting sources that have no reason to lie, appears to have staked its reputation upon this being true. The main reason to wait until Nintendo and IBM says otherwise is simply that Nintendo might have changed the design since then. 208.152.231.254 14:06, 27 September 2006 (UTC)[reply]
I remember hearing the Wii used an extension of the Gekko and the Flipper. Is this true? If so, wouldn't that make the Wii technically a G3?
Links : In speaking to officials at Nintendo, they mentioned that the software developer kits for the GameCube and the Wii share many commonalities.

http://www.officialmariobros.com/nintendowii.htm

Possibly, however there are also a number of rumors that indicate that it includes VMX (AltiVec), which by Apple's definition would seemingly make it a G4, and at least one rumor I've heard says that it's the IBM equivalent of a Sempron processor, basically a G5 without the 64-bit support; we really know nothing about this chip at all. Even if it is 729 MHz, that doesn't tell us much without knowing what the chip is. Two quick questions: does Nintendo have this information under an NDA? If so, we probably shouldn't trust anything at all till they do say something, or the NDA expires. If not, it shouldn't be too hard for someone with the proper credentials to obtain the information simply by calling Nintendo up and asking. 64.6.0.220 04:08, 28 September 2006 (UTC)[reply]
GameSpot's article on the Wii being 2-3x the power of the GC: "By Nintendo's own admission, according to a report from USA Today, the system is two to three times as powerful as its current-generation console, the GameCube. Sony's PlayStation 3, announced yesterday, is reportedly dozens of times more powerful than its predecessor, the PlayStation 2." http://www.gamespot.com/news/6125078.html 729 MHz is seeming more and more likely.
Since when does 1.5x == 2-3x?
It could be that they decided to lower it, or it could just be that 729 MHz doesn't mean it won't be 2-3x more powerful. Maybe they added some fancy instructions, a wider bus, more on-chip memory (cache), or whatever. Oh, and if you are referring to the quote in EGM about the Wii being a GC 1.5, well, that's just an expression or a figure of speech; I'm sure he didn't mean it literally. Anyways... anybody see the Nov 2006 EGM? There was an interview with the Factor 5 president. They said their jaws dropped when they heard of the path the Wii was taking, and they decided to go with the PS3 instead because it had enough power for what they wanted to do. He also described the Wii as a "GC 1.5".

This picture pretty much confirms the 729 figure. -- Henriok 09:07, 20 November 2006 (UTC)[reply]

Enough is enough! Several undisputed reports state that it's 729 MHz, and it says so on the packaging. -- Henriok 21:26, 5 December 2006 (UTC)[reply]
729.14 is written right on the processor. Look at the picture to the right of the article. That corresponds to 729.14MHz. It's very common for processors to have writing of that kind to denote the speed. Although the speed is known, that does not mean much nowadays when processors greatly vary in capabilities, co-processors, word width, and many other aspects. For all we know, this PowerPC-based processor could be the equivalent of the Pentium III 1.5GHz. Geekrecon 18:02, 6 January 2007 (UTC)[reply]
No there, Geekrecon, the Broadway processor won't be equivalent to a Pentium III 1.5 GHz, but to a Pentium III Coppermine 733 MHz. I daresay even lower, to a Celeron Covington 266 MHz. HAHAHAHA, why would people buy a Pentium III gaming system when there's Core 2 and the PowerPC X-3 (Xbox 360)? —The preceding unsigned comment was added by Ceecookie (talkcontribs) 06:22, 4 May 2007 (UTC).[reply]
Actually, an old G4 800 beats any P3/Celeron, and any P4 under ~2.5 GHz, for most tasks. Since Broadway is IBM and not Motorola, and has had a lot more development time, this chip should beat any G4 easily. This is more than enough power for a game console (I wouldn't buy it over current Cells, Intels and AMDs for scientific usage, though). Your question is akin to "why do people buy 100 hp cars when a Lamborghini can have 640 hp?" Different solutions for different problems. —Preceding unsigned comment added by 143.106.18.169 (talk) 12:25, 11 April 2008 (UTC)[reply]
An 800 MHz G4 will most certainly not beat a 2.5 GHz P3 in almost any task (perhaps on some heavily AltiVec-optimized code). The Broadway is essentially a PowerPC 750CL (G3) with some SIMD enhancements. It will not, under any circumstance, beat a G4 at the same clock, and G4s are nowadays dual-core and available at over 2 GHz. Broadway is only 729 MHz. -- Henriok (talk) 13:53, 11 April 2008 (UTC)[reply]

Not a G5[edit]

Jon "Hannibal" Stokes has retracted his belief it'll be a 64-bit processor. Given this was the only referenced claim that pointed towards 64-bit in the article, I've removed that speculation. If someone wants to put it back, I think something authorative should be cited at the same time to back that claim up as, at this point, it's an extraordinary claim requiring extraordinary proof. Kind of a shame, I'd have liked it to be a 64 bit system, but, well, if it does the job... Squiggleslash 21:08, 31 October 2006 (UTC)[reply]

Broadway was next Apple processor?[edit]

According to an article on Ars Technica, Broadway was destined to be the next processor for Apple notebooks.

"At the time of The Switch, Apple was entertaining a plethora of very attractive but unreleased and completely unproven PPC options. IBM had offered them Cell on the desktop and Broadway (the Wii processor) in their laptop line. Similarly, P.A. Semi was trying to lure the company with their forthcoming PWRficient line. Neither company apparently saw the switch coming. I've been told that the final decision to make the switch was done at the very last last minute before WWDC." -Ars Technica (5/20/06)

Maybe we need to incorporate some of this information into the article.

--Carcur 19:33, 21 May 2006 (UTC)[reply]

Speeds[edit]

Maxconsole.net is hardly a reliable source. They got it from a guy who calls himself "theguy." Nothing has come from Nintendo yet, I think this is safe to label as speculative. --71.82.94.124 20:19, 2 August 2006 (UTC)[reply]

Speculation[edit]

I've removed the speculation in this article. Remember: Wikipedia is not a crystal ball. - ZakuSage 05:14, 8 September 2006 (UTC)[reply]

Merge with PowerPC[edit]

I'm thinking that this article should be merged with the PowerPC_G3 article. The Gekko is already there along with a brief mention of the Broadway. fintler 12:05, 8 September 2006 (UTC)[reply]

I think that was a very rash thing to do that should have been given way more thought and discussion before it was done. There are no official sources saying that the Broadway belongs to the G3 or G5 families, Nintendo are being very vague and the only official information from IBM is that it's a PPC core specifically designed for the Wii.
The article should be unmerged from the G3 page and reverted to original form here as quick as possible. --Cryovat 17:22, 8 September 2006 (UTC)[reply]
I did a hasty undo of the merge. None of the sources actually confirmed that Broadway is G3-based. Dancter 18:27, 8 September 2006 (UTC)[reply]
Thanks. I think it's for the best to keep it on a separate page until we get some specs that Nintendo or IBM have confirmed --Cryovat 18:33, 8 September 2006 (UTC)[reply]
A console processor will always be custom. Quite different requirements compared to a desktop. In addition to this, it is usually produced in larger quantities over a longer time. Having one unnecessary feature, like support for a memory type the console does not have, would be a complete waste of money! (Silicon space is money.)

Revert Nov 9th, 2006[edit]

I'm going to revert The Captain Returns' changes, for the following reasons:

1. This is the article that discusses the current known information about the Broadway CPU. Information that is inappropriate for the main Wii article is appropriate here; it is not legitimate to use an argument about what should appear there as relevant to what should appear here.

2. The sub-paragraph removed was entirely factual and sourced. No less than three sources appeared to back up the statement that most reports are reporting a CPU of spec described. TCR has removed factual information that would have been of interest to anyone researching the Broadway CPU.

3. At this stage, it is dubious to even tip-toe around the notion that the CPU description is, indeed, fact. The only reason for keeping the words "Very few confirmed details have surfaced on this processor" centers around the fact that the confirmed details we do have are third party (programmers known to IGN quoted Nintendo documentation) and were, at one point, subject to change. The Wii will be released in less than two weeks. The probability that the specs have changed is virtually non-existent.

As such, I will be reverting the change. I also think it's time to remove the words "Very few confirmed details have surfaced on this processor", but I will leave them in for a few days so that that issue can be discussed further. Squiggleslash 13:52, 9 November 2006 (UTC)[reply]


1. This is the article that discusses the current known information about the Broadway CPU. Information that is inappropriate for the main Wii article is appropriate here; it is not legitimate to use an argument about what should appear there as relevant to what should appear here.
Well, putting it rather bluntly, you are wrong. Wikipedia is not a place to put speculation, be it in the main article or a sub-article related to the main article. Now, as I mentioned when editing the article, this has been discussed before and put aside as speculation, for now. As soon as the Wii's hardware details are released, be bold and add as much detail as you want to.
2. The sub-paragraph removed was entirely factual and sourced. No less than three sources appeared to back up the statement that most reports are reporting a CPU of spec described. TCR has removed factual information that would have been of interest to anyone researching the Broadway CPU.
Sure it was sourced, but no one knows quite for sure whether it's true or not. Pretty much every site is putting up "true", "leaked" specs from an "employee from Nintendo" or an "anonymous developer". I personally see it as just another way to get more people to go to their site.
3. At this stage, it is dubious to even tip-toe around the notion that the CPU description is, indeed, fact. The only reason for keeping the words "Very few confirmed details have surfaced on this processor" centers around the fact that the confirmed details we do have are third party (programmers known to IGN quoted Nintendo documentation) and were, at one point, subject to change. The Wii will be released in less than two weeks. The probability that the specs have changed is virtually non-existent.
Ah, I see we're pulling out t3h b1g gramerz0rz, I mean, you've got unneeded commas, MS Word-generated synonyms and everything. Well, as benevolent and munificent as I may find myself to be at times, I cannot seem to find your attempt at 'tip-toeing around' the fact that the substance on your side of our quarrel is 'virtually non-existent' to be at all pleasurable. Yes, those specs were subject to change at that point in time, and they could have changed. Unfortunately, I don't know, you don't know, and they do. Let's leave it to them to add it to the article before it's formally announced, savvy? It's a very nice quality that you have, being so urgent, and having such a desire to get information to the common people. Have you considered journalism as a possible career choice?

The Captain Returns 01:18, 10 November 2006 (UTC)[reply]

You were asked to not remove factual information concerning reports about the Broadway's specification. Instead of responding to that request, you, before launching a bizarre personal attack, have ignored what was asked of you, and mischaracterized what you've removed.
Please, stop removing information that will be of interest and use to anyone who wants to find out more information about the Broadway CPU from the Broadway article. Squiggleslash 02:44, 10 November 2006 (UTC)[reply]
Okay, let's try this again, a little simpler for you. (I included some links for your personal benefit.)
1. It's not factual, it's speculation.
No, it is factual, and it is not speculation. If you are going to remove text from an article, you should at least try to understand the text first. The article states that current reports point at an enhanced Gekko. That is a fact. The reports do state that. Nor can you proxy your argument by claiming that the published reports are speculation. The IGN report certainly isn't, unless you're accusing them of lying about their sources, in which case you need to back that up.
3. That wasn't a "bizzare personal attack", it was what is known as commentary.
Really? Because I seriously doubt anyone read it as literary criticism. Even if it had been, it's insulting, it's irrelevant, and it shows bad faith on your part.
4. I did not mischaracterize what I intended to remove. I defined it.
No, you mischaracterized it. You claimed it was speculation, despite being a fair characterisation of published reports, despite not claiming the claims were anything other than reports, and despite the fact that no less than three reports were linked to it.
If it had been "speculation", you would have either found four or more reports of equal merit that contradict the claim (Ars doesn't count, as they've specifically withdrawn their counter claims), or you would have proven that the description of the claims as pointing to an enhanced Gekko were false by showing that none of the three articles actually claim any such thing.
5. Please, try reading and understanding my view of the issue, before reverting my edits.
I've read what you had to say and it's wrong! Why the hell do you think I'm reverting the edits? It's you who's persisted in ignoring the evidence that shows your characterisation is wrong.
It's hardly unfair of us to ask you to hold off making such drastic changes and at least discuss in what way the sentence can be described as "speculation" and try to reach some sort of consensus before making the changes. You know, I know, that not only is it not speculation to say that the majority of (credible) reports are pointing at an enhanced Gekko, but they're also - unless IGN is actively lying, unless John "Hannibal" Stokes is actively lying - absolutely true. You know that this information is useful to those looking for information on the Broadway. You know that the Wikipedia page is, if anything, being overly careful and mistrustful about how it treats those sources, refusing to say outright that it is an enhanced Gekko, instead pointing people at the sources for that claim, under a large banner that, in my view wrongly, claims the entire article is speculation.
Please stop removing useful information from Broadway, or at least make an attempt to understand it before you do. You are not removing any speculation. You are removing a FACTUAL claim - that PUBLISHED REPORTS show an ENHANCED GEKKO. Published reports that are credible sources.
Quit it, or discuss your objections here. And wishy-washy "Well, I don't care, it's speculation" doesn't count as an objection unless you're prepared to explain how the statement is speculation. I can't see how you can honestly claim that. The REPORTS show an enhanced Gekko. They do. That's what they do. That's what they say. And hey, as an added bonus, of little relevance here, they're right. Squiggleslash 13:25, 10 November 2006 (UTC)[reply]
Now, now, there's absolutely no reason to get frustrated over the whole ordeal. :> Let's try this once again.
No, it is factual, and it is not speculation. If you are going to remove text from an article, you should at least try to understand the text first. The article states that current reports point at an enhanced Gekko. That is a fact. The reports do state that. Nor can you proxy your argument by claiming that the published reports are speculation. The IGN report certainly isn't, unless you're accusing them of lying about their sources, in which case you need to back that up.
No one knows who IGN got this from, and Nintendo hasn't said a word themselves. So what if it's a published article? There were published articles about aliens existing. Published means published; it doesn't mean it's true. I'm not accusing IGN of lying, I'm just saying that it isn't stable evidence of it being true, and shouldn't be put into the article.
No, you mischaracterized it. You claimed it was speculation, despite being a fair characterisation of published reports, despite not claiming the claims were anything other than reports, and despite the fact that no less than three reports were linked to it.
Since the information isn't reputable, it can pretty much be summed up as speculation. Judging by your logic, any site that says they got information from some developer is telling the truth. Why don't we put up specs saying that there's going to be an 8-core 3.2 GHz G5 in there? The websites said so. Why don't we say that the console's name is going to stay "Revolution"? Some site said so.
It's hardly unfair of us to ask you to hold off making such drastic changes and at least discuss in what way the sentence can be described as "speculation" and try to reach some sort of consensus before making the changes. You know, I know, that not only is it not speculation to say that the majority of (credible) reports are pointing at an enhanced Gekko, but they're also - unless IGN is actively lying, unless John "Hannibal" Stokes is actively lying - absolutely true. You know that this information is useful to those looking for information on the Broadway. You know that the Wikipedia page is, if anything, being overly careful and mistrustful about how it treats those sources, refusing to say outright that it is an enhanced Gekko, instead pointing people at the sources for that claim, under a large banner that, in my view wrongly, claims the entire article is speculation.
Whoa, whoa. I'm not saying the whole article is speculation. I'm just saying that tiny bit just isn't backed well enough. Now you're mischaracterizing what I said, Squig. All the rest of the article has been backed by the words of real companies, not these unnamed "close sources to Nintendo". And now you're criticising us all because we're being overly careful, for not going out and screaming from the tops of mountains that everything IGN says is true?
Thanks, The Captain Returns 01:31, 11 November 2006 (UTC). :>[reply]

P.S.: Chill.

Oh, I missed something there, bud. You say that there was information based on internal Nintendo documentation? This (http://img20.photobucket.com/albums/v61/zach2387/Nintendo_Nexus.jpg) is apparently internal Nintendo documentation too (it even has the seal in the bottom left corner). Should we make an article about this?
:>The Captain Returns 01:41, 11 November 2006 (UTC)[reply]
Maxconsole.net is hardly a reliable source. They got it from a guy who calls himself "theguy." Nothing has come from Nintendo yet, I think this is safe to label as speculative. --71.82.94.124 20:19, 2 August 2006 (UTC)
I've removed the speculation in this article. Remember: Wikipedia is not a crystal ball. - ZakuSage 05:14, 8 September 2006 (UTC)
None of the sources actually confirmed that Broadway is G3-based. Dancter 18:27, 8 September 2006 (UTC)
I think it's for the best to keep it on a separate page until we get some specs that Nintendo or IBM have confirmed --Cryovat 18:33, 8 September 2006 (UTC)
I think that was a very rash thing to do that should have been given way more thought and discussion before it was done. There are no official sources saying that the Broadway belongs to the G3 or G5 families, Nintendo are being very vague and the only official information from IBM is that it's a PPC core specifically designed for the Wii. - (unsigned)
Those specs aren't official. They were supposedly leaked, but they may have been inaccurate and it's possible the actual production specs are different. Nintendo hasn't really released technical specs for Wii. Scepia 00:59, 10 November 2006 (UTC)
I think no - quote from article "IGN Revolution contacted Nintendo of America for comment, but the company did not return our query in time for publish." and also from Slashdot http://games.slashdot.org/article.pl?sid=06/03/30/044254&threshold=-1 "Matt Casamassina hates Nintendo and takes every opportunity to talk about how weak and worthless their hardware is. Every three months for awhile now he's posted "leaked" specs about the Revolution. Every one of these "leak" stories takes care to talk about how much more powerful the XBox 1 is than the Revolution. In all cases the source is "sources". - (unsigned)
Nintendo has told us they are not interested in releasing the console specs. This would imply they would not like ANYONE doing this. Thus, any developer in possession of the specs and/or development kit would have to sign a non-disclosure agreement. We can pretty much be certain there is such an agreement, or the 'secret feature' of the Rev would've been revealed. Also... April Fools? Though I must agree a better theory is found in the unsigned post above, where Slashdot is quoted as saying Casamassina has been 'leaking' Rev specs for several months now, all unconfirmable, and all very negative. I don't think Casamassina counts as a trustable source for information anymore. --King Nintendoid 18:29, 30 March 2006 (UTC)
I added comments to the trouble sections (CPU/GPU and Technical Specs) in an attempt to dissuade people from putting the 729MHz/243MHz IGN information in without consulting this discussion. Hopefully this will tone down the number of reverts (being that there have been tons in the past 24 hours on this very topic). Zebov 02:35, 31 March 2006 (UTC)
I thought we had decided that this information should stay in the rumors section? (not the second paragraph, just the actual IGN rumor... see Some more specs above. Zebov 03:40, 14 April 2006 (UTC)
Read around. 70.113.74.80 02:10, 15 November 2006 (UTC) (The Captain Returns)[reply]

Miyamoto confirms IGN #'s?[edit]

I know this is not an actual confirmation, but he says he wanted to use existing hardware. http://wii.ign.com/articles/746/746380p1.html

Recently released IBM specs sheet[edit]

Possible Wii specs?
http://www-306.ibm.com/chips/techlib/techlib.nsf/techdocs/2F33B5691BBB8769872571D10065F7D5/$file/ppc750cl_ds_dd20_5oct06.pdf
It's the same as the IGN specs.

The IBM spec sheet actually says it is anywhere from a 400 MHz (IBMPPC750CLGEQ4024) to a 900 MHz (IBMPPC750CLGEQ9024) processor; look at page 12. There is no 729 at all in the document, but there is a 733. (Catprog 1:31 + 1000, 20 November 2006)

Nov 21, 2006 PC Magazine[edit]

They have the specs of the Wii there. It is the same as the IGN specs. Can I post the specs yet?

The fact is, PC Magazine got their specs by reading IGN. AFAIK, any and all repetitions of the "729MHz" figure are not any extra sources; it's all news media jumping on the only available source, being IGN's ancient, unverified claim. Nottheking (talk) 06:02, 15 February 2010 (UTC)[reply]

About processor documentation[edit]

According to the document, Broadway could be anything from 400 MHz to 900 MHz, and the only way to be sure would be checking the processor part number. Anyone ready to sacrifice a Wii for knowledge? Not me.. I'm betting on a speed of 900 MHz, because that's what the processor can pump out. I see no reason (other than heat) to leave the processor overclocked.

You mean underclocked? Anyways, I think 733 seems like a good number. I don't expect Nintendo to use the full CPU power, seeing as how the Wii is so small. I don't think it would have adequate cooling. Anyways... anyone got a Wii that they want to crack open? Maybe someone could ask PlanetBoredom what's written on the chips of the Wii they smashed.
This picture confirms the 729 figure. You can draw your own conclusions about what it means, but I don't need any more confirmation than this, really. Maybe the photo is doctored, but I doubt it. -- Henriok 12:23, 20 November 2006 (UTC)[reply]
I could see the numerals 72914, not "729 MHz". Is this really proof?
It could be 729.14 MHz. There's no "MHz" on any processor; Image:IBM_PPC604e_200.jpg, Image:XPC7450.jpg, Image:XPC855TZP66D4_3K20A.jpg, Image:XPC750.jpg. Looking at this picture of the Gekko, [3], it clearly states 486, and 486 * 1.5 = 729. Exactly. And.. even better than that, this figure appears in _exactly_ the same place as the "729" on the Broadway. No, it's not proof, but aren't we grasping at straws here? What are the odds that the supposedly false or incorrect report states 729 MHz, and then that exact figure ends up on the packaging of the product, but that those two figures have no connection? And that that figure is located at exactly the same place as the corresponding figure on the preceding product? But that this, again, has nothing to do with one another? What are the odds?
And.. for what it's worth, the 729 MHz figure might be a finely tuned 750CL to ensure optimal operation in the Wii, considering bus bandwidth, GPU specs, distances between components and so forth. And guess what? I think that IBM and Nintendo have done this kind of tweaking. The ratio between 485 and 729 is too close to 2:3 to be a coincidence too. Convinced now? -- Henriok
It's not 729.14 MHz. That's not how IBM's part numbering conventions work. But the 729 does indicate the clock speed. Dancter 20:37, 20 November 2006 (UTC)[reply]
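For anyone wanting to sanity-check the ratio arithmetic discussed above, here is a minimal C sketch. It only reproduces numbers already quoted in this thread (485, 486, 729, and the 243 MHz GPU figure mentioned elsewhere on this page); none of them are official Nintendo or IBM specifications.

#include <stdio.h>

int main(void) {
    /* Figures quoted in this discussion -- none officially confirmed. */
    const double gekko_marking   = 486.0;  /* number printed on the Gekko package */
    const double gekko_nominal   = 485.0;  /* clock usually quoted for the GameCube CPU */
    const double broadway_claim  = 729.0;  /* figure on the Broadway package / IGN report */
    const double hollywood_claim = 243.0;  /* GPU clock quoted alongside 729 MHz */

    printf("486 * 1.5 = %.2f\n", gekko_marking * 1.5);              /* 729.00 exactly */
    printf("485 * 1.5 = %.2f\n", gekko_nominal * 1.5);              /* 727.50, roughly 729 */
    printf("729 / 243 = %.2f\n", broadway_claim / hollywood_claim); /* 3.00 */
    printf("729 / 486 = %.4f\n", broadway_claim / gekko_marking);   /* 1.5000, a 2:3 ratio */
    return 0;
}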

This article is complete useless crap and dreck.

Citation Problem[edit]

I really have a problem with the following line:

Broadway's purported clock speed of 729 MHz is also exactly 50% faster than Gekko's, at 485 MHz. [original research?][citation needed]

We're saying that we need a citation to support the theory that the processor might be operating at 729 MHz. This is astonishingly ludicrous. You don't need "citations" for "maybe". If someone had written:

It has been confirmed that Broadway operates at 1100 MHz

then we would need a citation. Saying "It's speculation" automatically means that it's speculation. Since we're saying that the figure of 729 MHz is speculation, we've defined it as speculation, and thus identified it as speculation. Do I need to continue with the redundancy? I think not. I'll give it some time before I remove this citation tag in case anyone has a good argument for why it should be so tagged. CameronB 14:52, 22 May 2007 (UTC)[reply]

I just see that "citation needed" and "original research" flags as passive aggressive remarks from those who refuse to have on Wikipedia what is commonly accepted but unconfirmed truth. The 729 MHz figure is uncontested, and I rather have uncontested but what might not be entirely correct data here, than have none at all. It should not be advertised as facts though but Wikipedia have ways to display when the factuality of an article is under dispute. I vote for a nice article with data of what is not under contest, common knowledge and an explanations to why it might not be completely accurate data. There's a stale mate now where one can barely touch this article before the puritans cuts it to pieces or pollute it with {or}}{{Fact}}. LOTS of articles on Wikipedia is under dispute, probably all regarding history, religion, sexuality, evolution, and cutting edge science, but in those cases all sides get their share of the lime light. Why can't it be like this on this article? It's just silly. -- Henriok 15:27, 22 May 2007 (UTC)[reply]
I think the problem lies in the fact that it's ok if it's not confirmed, we just need to have something from somewhere outside of Wikipedia that references that figure, confirmed or not. Though having both tags at the same places in the article 3 times over seems rather redundant. --Chiklit 01:56, 2 July 2007 (UTC)[reply]

256-bit?[edit]

Isn't this technically a 256-bit processor? MessedRocker (talk) 20:59, 31 August 2007 (UTC)[reply]

No, it's 32-bit. A memory address is represented by a 32-bit value, which makes an addressable memory space of about 4 GB. That's the way this works. Read all about it in this article. –– Henriok 21:09, 31 August 2007 (UTC)[reply]
Hmm... why would they downgrade from a 64-bit processor in 1995 (that's when the N64 units were built I believe) to a 32-bit processor? MessedRocker (talk) 21:36, 31 August 2007 (UTC)[reply]
Because the Gekko is vastly superior in performance. The bitness of the CPU is of almost no consequence for the performance of the console as a whole and I can't honestly say why they chose a 64-bit processor for N64. Marketing perhaps. The 64-bit vs 32-bit debate is pretty lengthy and I can't explain it all for you here. Please do some reading on your own. For now, just accept fact. -- Henriok 11:06, 1 September 2007 (UTC)[reply]
It's possible that the N64 used a 64-bit CPU in order to take advantage of an existing, established 3D rendering-machine architecture, namely that pioneered by SGI, which used MIPS CPUs in their own workstations. In that case, the R4300i and its derivatives (including the VR4300 the N64 actually used) were the cheapest available example of MIPS' current generation. In that case, being 64-bit would've been a downside that Nintendo apparently decided was worth dealing with (possibly due to perceived marketability) in order to use the existing architecture. Of course, this is mostly speculation/OR, hence why I'm placing it here rather than in an article.
Additionally, to further clear things up, almost all consoles are/have been "32-bit;" they use 32-bit address widths and 32-bit general registers; the widths used elsewhere are inconsequential. An easy way to compare: look at Pentium/2/3/4 CPUs, and note that on almost all OTHER numbers measured in "bits," it's a width FAR greater than 32. (for instance, 64/128-bit external interfaces, as well as 128-bit SIMD units, very wide FP units, etc.) Specifically, for consoles, every 5th, 6th, and 7th-generation console save for the Nintendo64 has been 32-bit. Additionally, it can be argued that the Sega Genesis/Mega Drive, based on an (apparently) unmodified 68000, is ALSO 32-bit, suggesting Sega sold themselves short. (though a strong point can be made that the CPU isn't "true" 32-bit as the high-order address byte was ignored, hence it's usually called a "16/32-bit" CPU)
The misconception of high-level bitness of consoles was a natural consequence of how aggressively that aspect was marketed from 1989 onward; Sega is largely credited with starting the trend with the Genesis/Mega Drive, and (as mentioned above) Nintendo built on this with the outright naming of the "Nintendo 64." Sony continued this into the 6th generation, albeit with a false claim; while the PlayStation 2 has a 128-bit wide SIMD unit, the CPU remains a 32-bit one. Similarly, Nintendo caused a fair amount of confusion with their "Mario 128" tech demo, as the "128" referred to the number of Marios on the screen, not the bitness of the CPU. As for a "return" to real 64/128-bit CPUs, the former might come in either the 8th or 9th generation should the maker decide they need more than 4GB of address space, since 64-bit CPUs can have up to 16,384PB of address space, which, following a naive extension of Moore's Law, should last consoles for half a century; longer than CPUs themselves have been around. Nottheking (talk) 18:59, 6 April 2010 (UTC)[reply]
Nice anecdotes! Thanks! Xbox 360 and PS3 are 64-bit today but neither uses more than 4 GB of RAM. Wii is 32-bit. The next generation will most probably be 64-bit and have more than 4 GB of RAM. I guess that the next generation will be more about the GPU than the CPU. -- Henriok
First off, please be civil. As I meant, while the Cell technically could be called a 64-bit CPU as it has 64-bit general-purpose registers, the CPU's designated word size is only 32 bits, as is the instruction word length. Similarly, the SPUs ignore the higher-order four octets of the address, meaning that they can't even see beyond the first 32 bits of address space. If the Cell is 64-bit, it is more by "accident" than intent; one could argue it is a 32/64-bit CPU in the sense that the 68000 is often referred to as a 16/32-bit one.
As for GPUs... You're missing a point there; RAM is still needed for GPUs. Similarly, central CPU power is still needed for the core game code, so don't expect to see a stop in the increase in potency. As for memory size... Remember that address space isn't just for main memory. It's needed to assign to and access other I/O ranges. (Such as for input devices, as well as some output devices like audio.) Hence why in a 32-bit OS, you can have 4GB of physical memory yet won't be able to use it all. I don't know any more than anyone else what specific design choices Sony, Microsoft, and Nintendo make for their next generation; if they want 4GB+ of RAM, they'll take a 64-bit CPU, and if not, they won't push for it. Nottheking (talk) 23:27, 6 April 2010 (UTC)[reply]
I'm not arguing with you. I'm in complete agreement, even with regard to Power Architecture's 32/64-bitness. I sincerely enjoyed your anecdotes, I wasn't sarcastic at all. -- Henriok (talk) 17:53, 7 April 2010 (UTC)[reply]
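To make the address-space figures mentioned in the comments above concrete, here is a small C sketch that just derives the "about 4 GB" and "16,384 PB" numbers from the 32- and 64-bit address widths; it adds nothing beyond that arithmetic.

#include <stdio.h>
#include <math.h>

int main(void) {
    /* An N-bit address can reach 2^N bytes. */
    const double bytes32 = ldexp(1.0, 32);  /* 2^32 */
    const double bytes64 = ldexp(1.0, 64);  /* 2^64 */

    printf("32-bit address space: %.0f bytes = %.0f GB\n", bytes32, bytes32 / ldexp(1.0, 30)); /* 4 GB */
    printf("64-bit address space: %.0f PB\n", bytes64 / ldexp(1.0, 50));                       /* 16384 PB */
    return 0;
}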

Good source for the article[edit]

http://blog.newsweek.com/blogs/levelup/archive/2007/09/24/is-wii-really-gamecube-one-point-five-yes-says-beyond3d.aspx JACOPLANE • 2007-09-26 20:21

Mathematical Performance?[edit]

Has anyone found how much peak performance in gigaflops Broadway can deliver? From what I heard it's 1.94 GFLOPS, but the source is not reliable. 213.198.220.233 (talk) —Preceding comment was added at 13:29, 1 February 2008 (UTC)[reply]

In the case of peak theoretical FP performance, it's simply calculated by multiplying the number of floating-point operations the chip can perform per clock cycle by the number of clock cycles per second. Note that the results are usually DOUBLE what they may appear to properly be, since virtually every floating-point unit today has a commonly used "multiply-accumulate" instruction, which allows virtually any FP unit to handle two operations in one clock cycle. In the case of the Gekko/Broadway, since the CPU features an FP unit that can process 2 floating-point values in a vector/SIMD fashion, its floating-point peak performance would be 4 times its clock speed (2 numbers x 2 operations on each number per clock). Since, as far as is known, there actually is no verifiable clock rate for the CPU, the floating-point performance is unknown. Nottheking (talk) 05:58, 15 February 2010 (UTC)[reply]
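As a back-of-the-envelope illustration of the rule described above (peak FLOPS = clock rate × SIMD width × 2 for multiply-accumulate), here is a minimal C sketch. The 485 MHz and 729 MHz inputs are just the figures quoted elsewhere on this page, not confirmed specifications, and the two-wide "paired singles" FP unit is the assumption made in the comment above.

#include <stdio.h>

/* Peak theoretical single-precision FLOPS, assuming a 2-wide (paired-singles)
   FP unit with multiply-accumulate: clock * 2 lanes * 2 ops per lane per cycle. */
static double peak_gflops(double clock_mhz) {
    return clock_mhz * 2.0 * 2.0 / 1000.0;  /* MHz -> GFLOPS */
}

int main(void) {
    printf("Gekko @ 485 MHz   : %.2f GFLOPS peak\n", peak_gflops(485.0));  /* 1.94 */
    printf("Broadway @ 729 MHz: %.2f GFLOPS peak\n", peak_gflops(729.0));  /* 2.92 */
    return 0;
}

Note that 485 MHz × 4 works out to the 1.94 GFLOPS figure asked about at the top of this section, which suggests that number was derived the same way.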

Gecko-based unconfirmed?[edit]

Why is it unconfirmed? The fact that GameCube games run natively on Wii hardware (not emulated, as on Xbox) should make it clear that the CPU is based on the Gekko. —Preceding unsigned comment added by 173.2.224.226 (talk) 17:50, 16 December 2009 (UTC)[reply]

Because it's unconfirmed. Nintendo hasn't specified Broadway officially, nor has IBM. Strictly, confirming it on Wikipedia would be original research and not allowed. Since both Gekko and Broadway are PowerPC processors and PowerPC code is portable across several processor families, Broadway could be based on PowerPC e500, PowerPC 400 or PowerPC G4 too and the reasoning would still hold. Software would still be compatible without modification. In the real world (ignoring Wikipedia's policies), Broadway is not based strictly on Gekko but on the PowerPC 750CL, but they are very closely related. There's just no official documentation available for us to rely on, even if it's common knowledge. -- Henriok (talk) 19:26, 16 December 2009 (UTC)[reply]
Is it not enough that N64 Op-codes and constants exist in the RVL_SDK??
N64? Are you referring to the Nintendo 64? That console predates the GameCube and is most definitely not related hardware-wise to the GameCube, Wii, or Wii U. The N64 is a MIPS-based console, whereas the others are confirmed PowerPC. I think Nintendo's SDKs pretty much support development for many of their platforms, and they don't necessarily remove old cruft, such as support for Nintendo 64 development. This proves nothing regarding the kinship between Gekko and Broadway, essentially. -- Henriok (talk) 12:20, 26 May 2014 (UTC)[reply]
(a search in Notepad++)
...\RVL_SDK 2.1\include\revolution\ax.h (2 hits)
Line 540: #define AX_PB_SRCSEL_POLYPHASE 0x0000 // N64 type polyphase filter (4-tap)
Line 546: #define AX_PB_COEFSEL_12KHZ 0x0001 // 12.8KHz N64 type response
...\RVL_SDK 2.1\include\revolution\os\OSSerial.h (10 hits)
Line 79: #define SI_TYPE_N64 0x00000000u
Line 102: #define SI_N64_CONTROLLER (SI_TYPE_N64 | 0x05000000)
Line 102: #define SI_N64_CONTROLLER (SI_TYPE_N64 | 0x05000000)
Line 103: #define SI_N64_MIC (SI_TYPE_N64 | 0x00010000)
Line 103: #define SI_N64_MIC (SI_TYPE_N64 | 0x00010000)
Line 104: #define SI_N64_KEYBOARD (SI_TYPE_N64 | 0x00020000)
Line 104: #define SI_N64_KEYBOARD (SI_TYPE_N64 | 0x00020000)
Line 105: #define SI_N64_MOUSE (SI_TYPE_N64 | 0x02000000)
Line 105: #define SI_N64_MOUSE (SI_TYPE_N64 | 0x02000000)
Line 106: #define SI_GBA (SI_TYPE_N64 | 0x00040000)
Tcll5850 (talk) 17:34, 25 May 2014 (UTC)[reply]
EDIT: btw, I purposely selected 2.1 (SSBBrawl's SDK) instead of 3.2 to show early implementations, not recent ones. — Preceding unsigned comment added by Tcll5850 (talkcontribs) 18:28, 25 May 2014 (UTC)[reply]

External links modified[edit]

Hello fellow Wikipedians,

I have just modified 3 external links on Broadway (microprocessor). Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).

checkY An editor has reviewed this edit and fixed any errors that were found.

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 03:39, 9 November 2016 (UTC)[reply]

They work; I changed them to use {{cite web}}. Guy Harris (talk) 02:20, 10 November 2016 (UTC)[reply]