Wikipedia:Reference desk/Archives/Computing/2007 December 28

Computing desk
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


December 28

encrypting a single file or stream with a password

I know compressed archive formats like zip, 7z, and rar can encrypt their contents using a short user-typed password (not some special binary key). I want to know if there is an equivalent but more lightweight way of encrypting a single file (without the archive and compression functionality), like a filter program that takes a stream on standard input and encrypts it to standard output, using a user-typed password. Thanks. --71.141.111.63 (talk) 00:38, 28 December 2007 (UTC)

Unix provides crypt (Unix). Note that most implementations use an algorithm based on the Enigma machine, which "is considered to be far too cryptographically weak to provide any security against brute force attacks by modern, commodity personal computers." Nimur (talk) 01:20, 28 December 2007 (UTC)
PGP, better known for its public-key uses, can do symmetric encryption of files. Just run gpg -c file. --tcsetattr (talk / contribs) 01:54, 28 December 2007 (UTC)
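(A side note on the stdin-to-stdout filter the question asks for: GnuPG will act as exactly that when no filename is given. A minimal sketch, assuming a stock gpg installation; it prompts for the passphrase on the terminal:)
    tar c project/ | gpg -c > project.tar.gpg    # symmetric-encrypt a stream with a passphrase
    gpg -d project.tar.gpg | tar x               # decrypt it again
(The cipher behind -c is whatever gpg's symmetric default is — CAST5 in the 1.4-era versions — which is far stronger than the old Unix crypt.)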
What's wrong with some special binary key? The contents are encrypted; it's not just the program refusing to open it if the password doesn't match. --ffroth 03:25, 28 December 2007 (UTC)

free dvd to ipod video

Hi guys, does anyone know of any free software available that can rip DVDs to iPod video format (.mp4)? I've tried HandBrake, but it didn't work. Thanks —Preceding unsigned comment added by 91.109.44.206 (talk) 00:49, 28 December 2007 (UTC)

Not directly - I'd rip it to AVI first, then use SUPER to convert it to the iPod's video format. -Wooty [Woot?] [Spam! Spam! Wonderful spam!] 04:13, 28 December 2007 (UTC)
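(For the command-line inclined, the same two-step job can be done with MPlayer/MEncoder, which come up again further down this page. A rough sketch only - option names vary between MEncoder builds, and the bitrates and 320x240 frame size here are assumptions for a 5G iPod, not tested settings:)
    mplayer dvd://1 -dumpstream -dumpfile movie.vob          # 1. rip title 1 of the DVD
    mencoder movie.vob -vf scale=320:240 \
        -ovc lavc -lavcopts vcodec=mpeg4:vbitrate=1000 \
        -oac faac -faacopts br=128:mpeg=4:object=2 \
        -of lavf -lavfopts format=mp4 -o movie.mp4           # 2. convert to iPod-style MP4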

Digital camera as a mass storage device

In Windows 98 SE, I could access my Kodak DX6490 as a drive and use it as a memory card reader. However, in XP, WIA takes over and only allows me to transfer pictures. Is there some way to disable this feature (USBStor.ini?), or any software that communicates with the camera as a USB Mass Storage Device? The camera firmware doesn't have any setting to choose between MSD/PTP. WikiY Talk 01:09, 28 December 2007 (UTC)

You can try disabling WIA from the Services window, but I'm not sure if that would then cause the device to be picked up as a storage device. It couldn't hurt to try though. Click your start button, then Run. In the Run window, type services.msc and press Enter. A long list will then be displayed. Find Windows Image Acquisition in that list, right-click it, and choose "Properties". In the resulting window, click the "Stop" button and wait for the service to stop. Then in the same window, next to "Startup type", choose "Disabled". This will ensure that when you restart your computer the WIA service won't start again. Now try connecting your camera again and see what happens. You can always do those steps again if you want to re-enable the WIA service -- just choose "Automatic" instead of "Disabled" from that drop-down list, and click the button labeled "Start" in that window (not the regular Windows start button at the bottom of the screen). Equazcion /C 02:28, 28 December 2007 (UTC)
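(The same thing can be done from a command prompt - a sketch, assuming XP's standard internal name for the WIA service, stisvc; the space after start= is required by sc's syntax:)
    net stop stisvc
    sc config stisvc start= disabled
(and to undo it later:)
    sc config stisvc start= auto
    net start stisvc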
The WIA dialog doesn't appear when I plug in the camera, but nothing shows up in My Computer either. Device Manager still lists the camera under "Imaging Devices". WikiY Talk 03:05, 28 December 2007 (UTC)
Do you have the Kodak EasyShare software installed? Equazcion /C 03:16, 28 December 2007 (UTC)
No, I don't. It used to be installed on another computer, but never on this one. WikiY Talk 04:00, 28 December 2007 (UTC)
That could be the problem. If you had the software installed in Windows 98 when you could access the camera as a mass-storage device, it may have been the software that was allowing you to do that. I would try installing the software in XP and see if that works. Equazcion /C 04:06, 28 December 2007 (UTC)
I have a digital camera and sometimes use it to transfer files to and fro. However, I despise these colourful, "helpful", resource-consuming USB mass storage transfer apps - I just installed the USB driver and I access it as just another drive from My Computer. I did the same with my Zen Nano Plus. --Ouro (blah blah) 07:19, 28 December 2007 (UTC)
Not all devices are capable of doing that, though, and some included software doesn't actually require you to use it -- you just have to have it installed in order to access the device from Windows Explorer. In fact, I did some googling on this Kodak model and I can't find anything that says it can act as a mass-storage device. If it ever did for this user, and he had the software installed at the time, I'd say that's the best bet. Again, you might not actually need to use the software -- just have it installed. It may solve the problem. Equazcion /C 10:25, 28 December 2007 (UTC)
In many cases a USB mass storage driver is available for download from the manufacturer's website and will suffice, though. --Ouro (blah blah) 16:09, 28 December 2007 (UTC)
Not this one. Equazcion /C 22:08, 28 December 2007 (UTC)

Difference between CF cards

What is the difference between a Lexar 2GB Compact Flash Pro 300X Memory Card [1] and a SanDisk 2GB Ultra II Compact Flash Card [2] that warrants an $80 price difference ($100 compared to $20, respectively)? If I use the more expensive Lexar CF card when shooting 10 MP pictures with a Canon Rebel XTi, will I notice a difference in performance compared to a cheaper card? Thanks. Acceptable (talk) 02:46, 28 December 2007 (UTC)

The difference is UDMA capability, which means the more expensive card will give you much faster transfers (45mb/s vs. 10mb/s -- this is what the "300X" means for the Lexar). However, you'll only get the speed advantage if your device is UDMA-enabled, which I don't think your Canon Rebel XTi is (but I don't know for sure so you may want to check your camera specs yourself). Also, I don't know this for sure, but I would guess the speed increase would only be apparent when transferring photos to and from your computer, and not while taking pictures. Equazcion /C 02:59, 28 December 2007 (UTC)[reply]
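(To put those rated speeds in perspective - back-of-envelope arithmetic, assuming the cards actually sustain their rated figures, which real-world transfers rarely do:)
    2 GB card at 10 MB/s:  ~2000 MB / 10 MB/s = ~200 s to offload a full card
    2 GB card at 45 MB/s:  ~2000 MB / 45 MB/s =  ~45 s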
Will there still be just a little speed advantage if I'm using a UDMA card with a non-UDMA enabled camera? Acceptable (talk) 03:10, 28 December 2007 (UTC)
I'm not 100% on this but I'm fairly certain there would be no difference whatsoever. Equazcion /C 03:15, 28 December 2007 (UTC)

Oh, thanks a lot. What would be the fastest CF card out there for a Canon Rebel then? Acceptable (talk) 03:27, 28 December 2007 (UTC)

No problem. The SanDisk Ultra II is pretty much as fast as it gets for your camera, to my knowledge. The SanDisk Extreme III may give you slightly more speed, but those are significantly more expensive for not much gain (a 4-5% boost at the most). You could always order both to try, and return one of them :) Equazcion /C 03:43, 28 December 2007 (UTC)
I just read about a SanDisk Extreme IV. Again it's more money, but this one seems to be a lot faster. Equazcion /C 03:56, 28 December 2007 (UTC)

SETI@HOME screensaver -- it makes me seasick

Anyone know how to permanently stop the screensaver image from rolling all over the place? It makes me seasick. --TreeSmiler (talk) 03:54, 28 December 2007 (UTC)

It's been a few years since I used SETI@HOME, but I seem to remember that you can turn the screensaver image off. I am firmly of the belief that you should in fact do so, because it wastes a significant proportion of your cycles drawing pretty pictures instead of actually looking for aliens. Marnanel (talk) 20:55, 28 December 2007 (UTC)
Well, I like the screensaver. It's just that in the latest version of the software, the image rolls about all over the place. I want to keep it static, as in the earlier version. --TreeSmiler (talk) 00:02, 29 December 2007 (UTC)
Log in to your SETI account at http://setiathome.berkeley.edu/home.php. Click SETI@home preferences. Click the edit link at the bottom of that page. Then set the horizontal and vertical oscillation ranges to 0 degrees. That should do it! :) • Anakin (contribs • complaints) 16:02, 30 December 2007 (UTC)
Hey thanks! I'll try that. --TreeSmiler (talk) 22:18, 30 December 2007 (UTC)

Source File --> Executable File

Hello. I understand that Turing (programming language) is not well-known, but I will accept any clues. Other than clicking the Run button in the compiler, how can I convert a Turing source file with the file extension .t into an executable file (or any running file that will not show my source code)? Thanks in advance. Have a happy new year! --Mayfare (talk) 04:46, 28 December 2007 (UTC)

If you're on Windows with WinOOT >3.0, go to the Run menu and choose "Generate stand-alone program". That should do it. Turing is really not optimized for this though so be prepared for a larger-than-expected EXE file. Otherwise, enjoy :) Equazcion /C 10:44, 28 December 2007 (UTC)

Need help with using MEncoder

I want to convert an OGM file to some other format with MEncoder. The OGM has two audio tracks: the first is Italian, the second English. I want to either keep both audio tracks in the output file or keep only the second audio track. What should I do? -- JSH-alive (talk)(cntrbtns)(mail me) 08:55, 28 December 2007 (UTC)

To select the second audio track from the input, use -aid 2. I don't think mencoder supports multiple audio tracks on output. --tcsetattr (talk / contribs) 09:54, 28 December 2007 (UTC)
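(A concrete sketch of that advice - the output settings here are placeholders; it's worth running mplayer -identify first to confirm which ID the English track actually has, since OGM track numbering varies by file:)
    mplayer -frames 0 -identify input.ogm                          # list the stream IDs
    mencoder input.ogm -aid 2 -ovc lavc -oac mp3lame -o output.avi # re-encode with track 2 only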
Not working. (Here's the address where I got vids: http://www.usotsuki.info:6969/stats.html?info_hash=55b0bfb1d99eac4c3a66120b1162bc5a1f95906d ) -- JSH-alive (talk)(cntrbtns)(mail me) 11:45, 28 December 2007 (UTC)
Use an OGG splitter tool to extract the 2nd audio track and the video track first. Mux them with VirtualDub into an uncompressed AVI. Then convert that with mencoder. --ffroth 18:50, 28 December 2007 (UTC)
Well, is there any OGG splitting freeware? --JSH-alive (talk)(cntrbtns)(mail me) 10:06, 30 December 2007 (UTC)
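(One free option here, assuming the GPL ogmtools package (ogmdemux, ogmmerge, ogminfo), which has Windows builds - a sketch only; the exact naming of the extracted stream files varies by version:)
    ogmdemux input.ogm        # writes each video/audio stream out to its own file
(Then mux the video with the extracted second audio track in VirtualDub as suggested above, and feed the result to mencoder.)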

Linux Dependency Hell

Can somebody please advise me how to get through the endless dependency issues that come up when installing new apps on Linux? I originally gave up on Linux some years back for exactly this reason - every time you try to install something, there is a new dependency that is missing, so you download the dependency, only to find that the dependency itself needs another dependency, and so on ad nauseam. I thought I'd give it another bash with Ubuntu 7.10 and tried to install Vodafone Mobile Data Card Software for Linux, which needed the dependency python-twisted, which I downloaded, which needed python-zopeinterface, which I downloaded, only to find it needs Python 2.4 - it complained because my current version (2.5) is too new! Moral of the story... gave up again. (Note: I am trying to install my internet software, so for now I do not have internet access in Linux - mentioned on the off-chance that it is possible to download missing dependencies automatically. I have to keep switching back to Windows to download the missing dependencies.) --196.207.47.60 (talk) 09:53, 28 December 2007 (UTC)

If you don't have an Internet connection that's supported in the installer, you should forget about downloading the pieces (which, as you've noticed, is tedious and annoying) and just get a full set of CDs or DVDs. Then you will be able to apt-get install and it will do the dependencies for you, and all you'll have to do is put in whichever disk it asks for. --tcsetattr (talk / contribs) 09:59, 28 December 2007 (UTC)
As Tcsetattr indicated, you are not supposed to download the packages one at a time and try to install them. You are supposed to use a package manager such as APT or YUM. Then, you "apt-get install the-package-you-want" or "yum install the-package-you-want" and it will automagically install all the dependencies for you. -- kainaw 16:28, 28 December 2007 (UTC)
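(For the original poster's concrete case, once any network connection is available, that amounts to - a minimal sketch:)
    sudo apt-get update
    sudo apt-get install python-twisted    # apt pulls in python-zopeinterface and the rest automatically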
It is not always possible to use apt-get this way. I had an old PC at home, which I decided to install Linux on. All was well until I found out that whatever was available on the installation CD was not enough to get my wireless card to work. So, I had to resort to manual installation package-by-package as well (as, obviously, I could not get to the Internet without a working wireless card), and ran into the same kind of problems as the anonymous editor described above. Any pointers on what to do in such a situation?—Ëzhiki (Igels Hérissonovich Ïzhakoff-Amursky) • (yo?); 16:37, 28 December 2007 (UTC)
Exactly the reason why I abandoned Linux. I spent in the neighborhood of 10 hours trying to manually resolve dependencies for MadWifi, dual-booting back into Vista to download a package each time. I finally found the exact versions that the package manager wanted. When it wanted me to manually downgrade and recompile my kernel, I had had enough. No way I'm trying to figure out how to get all the C library dependencies sorted out... isn't that like 50 packages?! Grr --ffroth 18:55, 28 December 2007 (UTC)
I do not use Ubuntu, but my Fedora install CDs have had ndiswrapper for a long time. That allows you to use the Windows driver, which is likely on the install disk for your wireless card or laptop. No need to get anything that isn't on the install disks. Once you are up and running, you can try to get Linux-specific drivers, but I just continued using the ndiswrapper/Win32 driver. It is my opinion that the benefit of ndiswrapper has caused the production of Linux-only drivers to slow to a crawl. Developers would rather work on something that is missing as opposed to just trying to speed up a wireless driver a little with a native driver. -- kainaw 19:01, 28 December 2007 (UTC)
The single live CD of Fedora 8 doesn't include ndiswrapper, but after downloading the binary rpm of it, it still didn't work. Neither my OEM drivers nor the Intel drivers worked. --ffroth 22:01, 28 December 2007 (UTC)
Don't you people have an emergency ethernet cable just in case? Also, Ubuntu 7.10 (AFAIK) comes with ndiswrapper (plus a GUI!). --antilivedT | C | G 10:01, 29 December 2007 (UTC)
Apparently it's not cost-effective to wire a dorm when you can just cover it in WAPs --ffroth 01:22, 30 December 2007 (UTC)
The point of getting the full disk set is that you can do the complete installation without a network. That way you avoid fighting 2 battles at once: a half-installed operating system and a lousy network card whose driver isn't in the official kernel. --tcsetattr (talk / contribs) 19:53, 28 December 2007 (UTC)
Nobody is keeping you (anonymous 3rd person, not tcsetattr) from downloading each and every package file to a local disk. To limit external network usage, that is what we do here. Every night we have updates to our local package mirrors, and all our computers are set up to update from the local mirror, not the main repository. It is ridiculous to expect it all to fit on one CD, or to expect someone else to package up a big box of CDs and send them to you - for free. If you are willing to pay for it, I'm sure someone will burn all the packages for the OS you like to a monster box of CDs - but then you'll have to mount all of them at the same time to get programs like YUM or APT to automatically handle all dependencies, because they cannot know which disk which package is on. -- kainaw 20:02, 28 December 2007 (UTC)

(User:NoClutter took exception to the word "automagically" above in User:Kainaw's response and removed it; I have reverted this since it appears to be contrary to talk page guidelines. Marnanel (talk) 21:49, 28 December 2007 (UTC))

No problem with the revert, but just for the record, "automagically" is not a word. I didn't remove a word; I replaced gibberish with something that actually makes sense and can be found in a standard English dictionary. Since RD and talk pages seem to occupy some sort of mystifying "independent realm" within Wikipedia, I will be more than happy to accept your undoing of my entirely appropriate correction, for the sake of the "rules". NoClutter (talk) 22:01, 28 December 2007 (UTC)
Automagically is a jargon word that has wide acceptance in the computing community - it means something between 'magically' and 'automatically'. It is impossible to talk about computers and computing without using the jargon that goes with the field. If you see a jargon word and don't understand it, then just ask. Anyone with the knowledge to mess with Linux installations will likely have no problem understanding what "automagically" means. At any rate, it's extremely rude to edit other people's posts on Wikipedia - please drop that habit. SteveBaker (talk) 23:10, 28 December 2007 (UTC)
Spitting on the sidewalk and jaywalking have "wide acceptance" in the pedestrian community -- that doesn't legitimize the practice. Moreover, the "computing community" (whatever the flip *that* little gem of jargon means... I just love it when people glibly talk about the "foobar community" as if there were some monolithic battalion of drones walking in lock-step uniformity...) has *loads* of meaningless buzzwords that do little more than reflect the ability to parrot what someone else has said because that happened to "sound kewl d00d". To imply it is impossible to converse about computing topics without slinging such puerile, clinically-meaningless, 1337-speak gibberish (which I defy you to find sanctioned in a single style guide for technical writing) is a slap in the face to the dignity and legitimacy of both the English language and computer science in one fell swoop. I'd rather you just spat on the sidewalk.
As a compromise, I'll drop my supposed "habit" if you drop your "habit" of implying someone doesn't understand what they are talking about if and when they happen to hold a viewpoint inconsistent with your own. NoClutter (talk) 00:01, 29 December 2007 (UTC)
It appears that this would all have been cleared up had I linked it and said automagically instead. Then, it wouldn't be so difficult to hunt down the definition by itself. As it is, "automagically" means more than "automatic" and "magically" by themselves - as the definition explains. -- kainaw 00:25, 29 December 2007 (UTC)
Linking to the definition would have been nice, but it's really not necessary. "Automagically" is a common enough word that anyone knowledgeable enough to be talking about "dependency hell" is very likely to understand it. Likewise, anyone in a position to offer help is also very likely to understand it. To me, the word is no more difficult to understand than "unseasonably warm" in weather forecasts. --71.162.242.28 (talk) 04:25, 29 December 2007 (UTC)
OK - this sub-thread has gone too far - please take further posts on the appropriateness of using jargon terms to the Ref Desk discussion page. SteveBaker (talk) 05:04, 29 December 2007 (UTC)
My advice is to stick to the software suggested by Ubuntu. Use the package manager (System, Administration, Synaptic) and dependencies are not an issue. On Ubuntu you can enable extra software sources (System, Update, Software Sources). Software for Linux found elsewhere on the internet may be quite experimental, so it's best avoided. --h2g2bob (talk) 23:40, 28 December 2007 (UTC)
Undo what you have done, and do this in a terminal: sudo apt-get install python-twisted python-zopeinterface (frankly, python-twisted doesn't depend on python-zopeinterface in Ubuntu as far as I can see; installing it just in case), and voilà, you have the packages automagically/automatically installed! --antilivedT | C | G 10:01, 29 December 2007 (UTC)


Your best bet for installing a package without network access under Ubuntu is to first go to http://packages.ubuntu.com, look up the package in question, and then track its dependencies through the links on each package's page. There's a link to the .deb file on each package page. Download them all, and then start installing from the bottom of the chain up. Ptelder (talk) 07:35, 30 December 2007 (UTC)
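(A sketch of that last step, assuming all the downloaded .deb files sit in one otherwise-empty folder: handing dpkg the whole batch at once usually spares you working out the exact order by hand, since it unpacks everything before configuring anything:)
    cd ~/downloaded-debs
    sudo dpkg -i *.deb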

Firefox

Has anyone encountered problems with access to the internet using Firefox, even though firewall and proxy settings are set to allow it? Is it compatible with all ISPs? —Preceding unsigned comment added by 82.152.9.44 (talk) 10:20, 28 December 2007 (UTC)

Yeah, it's compatible with all ISPs, as far as I'm aware... Equazcion /C 10:58, 28 December 2007 (UTC)
This question is based on a misconception of what Firefox and an ISP are. An ISP is an Internet Service Provider. It is nothing more than a connection to the internet and all it has to offer. Firefox is a web browser (among other things). It connects to web servers, fetches web pages, and displays them to the user. Your ISP allows you to connect to web servers, regardless of the program you use. Compatibility does not have to do with the ISP. It has to do with the web pages themselves. Some web pages are specifically designed to not work properly on Firefox. Some are specifically designed to not work on Internet Explorer. This is done on purpose, by accident, and through pure laziness (ignorance in most cases). So, it should be obvious that there is no such thing as an incompatibility between Firefox and an ISP, but there is an incompatibility between Firefox and some web pages. -- kainaw 13:12, 28 December 2007 (UTC)
Not exactly correct. Free ISPs require a secure key to be sent from their proprietary browser before they'll serve you any content. It's to make sure you're using their ad-laden software --ffroth 18:57, 28 December 2007 (UTC)
I've used many of those. In my experience, you open the proprietary browser and connect. Then, you open Firefox and use it. The trick is that you must keep the proprietary browser open to keep the Internet connection open. -- kainaw 19:57, 28 December 2007 (UTC)

Human-Computer

Is there any technology (or any technology under development) to enable a human to directly transfer signals/instructions to a computer via the waves generated by a body part (say, the brain)? -Sumit (pardon my English) -59.94.153.78 (talk) 10:30, 28 December 2007 (UTC)

Yes, they do it with paralysis victims and amputees. A chip can be implanted within the head of a paralyzed individual and interpret very basic signals, enough so that with practice the person is able to move a cursor around a screen. Similar interpreters are implanted under the skin closer to the extremities so that amputees can operate robotic limbs. These are all very experimental at this point, though, and I'm speaking off the top of my head from news reports I've seen/read before, so I could be getting some of this wrong :) Equazcion /C 10:47, 28 December 2007 (UTC)

GT4

Hi,

basically I'm looking for a list of the prices at which you can sell cars on Gran Turismo 4...

thanks, --62.136.174.87 (talk) 13:30, 28 December 2007 (UTC)

Here is one. Recury (talk) 18:30, 28 December 2007 (UTC)

network speed vs. firewire speed

How fast does a network have to be so that we don't have to connect our hard drives directly to the computer?

I have several projects on multiple external hard drives. While not working on a project, I place each hard drive on a shelf. When I need the project, I bring the hard drive over to one of my computers and connect it with a 6-pin firewire cable. This makes the project load (or update or save) just as fast as using the internal C: drive.

My question is, how fast would a computer network (LAN) have to be to allow me to use one of my computers as a server? I'd like to plug half a dozen external hard drives into the server and just access my project files "across the network".

Can you help me figure this out? Or direct me to a web page? --Uncle Ed (talk) 16:40, 28 December 2007 (UTC)

Using standard network cables on a 100BT network, you max out at 100 Mbit/s. That does not include overhead for the data transfer (be it NFS or Samba or SFTP or whatever). Firewire maxes out at 3200 Mbit/s on paper (the new S3200 spec; the common 6-pin FireWire 400 port runs at 400 Mbit/s). Again, that does not include overhead. You obviously won't get Firewire speeds on a standard network. -- kainaw 16:49, 28 December 2007 (UTC)
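(Rough unit conversions for comparison - back-of-envelope figures only; real-world throughput is lower than all of these:)
    100BT:            100 Mbit/s / 8  = 12.5 MB/s theoretical, ~8-10 MB/s in practice
    Gigabit Ethernet: 1000 Mbit/s / 8 = 125 MB/s theoretical
    FireWire 400:     400 Mbit/s / 8  = 50 MB/s theoretical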
According to our article on Firewire, Firewire can be used to set up a network. The question of how fast the network needs to be depends on how intensively you are using the drive and how long you are willing to wait for files to load. Taemyr (talk) 17:34, 28 December 2007 (UTC)
These days, there is no reason not to use gigabit Ethernet within your home or workplace, since a large number of PCs support it right there on the motherboard. You typically don't get the full 1 Gbit/s as you'd hope - maybe you get 300 Mbit/s with a typical computer, which is 10x slower than Firewire on paper - but with a typical computer, you aren't going to be able to handle Firewire data at 3200 Mbit/s either. There is also a question of whether your hard drive can keep Firewire busy at 3200 Mbit/s. I strongly doubt it. Most hard drives range in speed (depending on where you are on the disk) from 350 to 800 Mbit/s - but that doesn't include seek times and such, which for typical access patterns can drop the average speed of the drive down to 200 Mbit/s or less. Using a client/server arrangement tends to smooth out seek times - and the extra RAM cache you have because the server is caching disk data in its local RAM means that in some cases a client/server setup can actually be faster than a local hard drive! Realistically, going from a local/firewire drive to a server setup probably halves your disk throughput - but a lot depends on your access patterns. I use a file server at home - all five of our day-to-day computers (2 desksides and 3 laptops) and our web/email server use the same file server, and I really can't tell whether files are on my local machine or on the server most of the time. The benefit of being able to sit at any of my computers and see the exact same user settings and the exact same set of files is ENORMOUS - something I'm more than happy to give up 50% of my disk speed for. I also like that I only have to back up my files from one central location.
Having said that, I have a Linux server and mostly Linux client machines - I'm less certain this would work out as well using a Microsoft OS on the server. There have been some dire concerns over this recently: [3] for example. I wouldn't touch the "Windows Home Server" product with a 10 foot pole! One problem appears to be that they do delayed writes over the network - so you can save a file out of your program, exit the program, and only later get a message telling you that the file could not be written onto the server and that you should try saving it again!! Since you already exited your app - you're basically doomed! So, if you plan to do this - put Linux on your server.
SteveBaker (talk) 17:47, 28 December 2007 (UTC)

Facebook Safari problem, Mac OS Leopard

I am reposting this because my question still hasn't been answered:

I use Safari on my MacBook and am having trouble accessing Facebook. I can access the sign-in page, but sometimes when I sign in, I am directed to a page that says: Safari can't find the server. Safari can't open the page "https://login.facebook.com/login.php" because it can't find the server "login.facebook.com". Other times, I get this message while I am logged in and surfing the site. Does anyone have any advice? —Preceding unsigned comment added by 71.56.231.40 (talk) 17:17, 22 December 2007 (UTC) Does this happen to you only with Facebook or with other sites, too? What Internet connection are you using? --Ouro (blah blah) 18:42, 22 December 2007 (UTC)

It just happens when I'm using Facebook. I am using wireless (wifi I think) internet, or whatever wireless a MacBook uses. —Preceding unsigned comment added by 71.56.231.40 (talk) 01:10, 23 December 2007 (UTC) I suppose that sometimes when you try to log on, the network is clogged up so much with traffic that you can't get through and subsequently lose the connection. This, coupled with a wireless internet connection, could be it. Sometimes traffic-demanding servers have narrow connections to cut costs or out of laziness, though it shouldn't be so with an oh-so-popular-and-great website like Facebook. --Ouro (blah blah) 11:01, 23 December 2007 (UTC)

It's weird though, because if I go to standby then boot up the computer again, everything works just fine. Any ideas? —Preceding unsigned comment added by 71.56.231.40 (talk) 04:12, 25 December 2007 (UTC)

I just ran out of ideas, that's all. One more thing - reposting questions like this is sometimes frowned upon. --Ouro (blah blah) 20:20, 28 December 2007 (UTC)
No idea. I have trouble with Facebook in Safari as well (the images don't always load) and trouble with a handful of other sites too (Google Maps barely works; only about half of the map tiles load up on any given screen, and the rest claim they can't be found). For sites where performance is important (Google Maps yes, Facebook not so much), I use Firefox instead, even though it is a lot slower. --24.147.86.187 (talk) 22:30, 28 December 2007 (UTC)

Gaming Problem

I just got a game and am finding it hard to run on a Vista machine. I spent almost an hour installing it, and now the disk won't fire up and play the game. The box says it is compatible with Vista, but I have checked the processor requirement, and it says I need a 'PIV 2.8GHz', whereas I have only 1.73GHz. Could this be the problem? It also talks about a graphics card I need, and says that "integrated/onboard graphics chipsets and laptops are not supported." I am using a laptop. Does this mean I am wasting my time trying to get it to work? Any help much appreciated. --ChokinBako (talk) 20:01, 28 December 2007 (UTC)

It depends on how important the game is to you. Basically, your computer is too slow and has poor graphics capability. So, you would need to patch the game to remove the need for a faster computer and the need for a dedicated graphics card. If you want to learn to patch games and it is of extreme importance that you play this one, then it may be worth a few years' investment in learning assembly language so you can decompile the game, hack it a bit, slow it down, remove the extra graphics, and run it on your laptop. Since I seriously doubt it is of that much importance, I feel that the answer is most likely "Remove the game to save some disk space and try playing it when you get a new computer - if you are still interested in it." -- kainaw 20:09, 28 December 2007 (UTC)
Yes - laptops are made to conserve battery life, so most (especially under $1000) lack a discrete graphics card, instead possessing a motherboard-specific "controller" with extremely reduced capabilities. You can buy external graphics cards now, IIRC, but they're not going to play CoD4 or other new games, because the bandwidth of the PCMCIA slot is not even close to something like PCI Express, and they're prohibitively expensive. Invest in a good desktop! - Wooty [Nyan?] 20:37, 28 December 2007 (UTC)
I have used programs that simply refused to operate (without telling me why) if the computer in question didn't meet the minimum CPU requirement (one program I used refused to even try to run on a 2.0 GHz machine when the minimum requirement was something like 2.1 GHz!). Anyway, yeah, it sounds like you are trying to run a game with very intense minimum CPU requirements; trying to get a laptop to do it is really probably not going to work. Your best bet would be to Google around a bit, though, and see if others have had any success. --24.147.86.187 (talk) 22:27, 28 December 2007 (UTC)
It is in the interests of game designers to sell you the game - so they don't go around demanding higher requirements than the game absolutely needs. If your PC doesn't meet the requirements, then PLEASE don't buy the game. Either it won't run at all or it will run so poorly as to be unplayable. You might get lucky and find that it just about runs acceptably - but the odds are not good. Contrary to what Kainaw says, the game design team will already have worked hard to get the game to run on the slower CPUs - and even with a lot of effort, there is simply no chance. Realistically, you need a better computer. SteveBaker (talk) 23:03, 28 December 2007 (UTC)
But there are plenty of good-looking games with low system requirements, like Guild Wars and San Andreas. You'd expect the old Half-Life and Counter-Strike 1.6 to be among them, but no. Other old games run fine on old hardware; just get used to substandard graphics and you'll find AMAZING gameplay gems. Anyone who calls himself a gamer will love the old Unreal Tournament 99, despite very outdated graphics. Download it and try it - the newer games aren't any good anyway --ffroth 04:37, 29 December 2007 (UTC)
O RLY? I think Half-Life 2 and its derivatives will run at the lowest settings, or go open source and play something like Frets on Fire (not sure how you're supposed to do that with a laptop though) or Tremulous. --antilivedT | C | G 09:51, 29 December 2007 (UTC)
Agree on the UT99 statement; they just don't make games like that anymore. I find it very hard to believe the above statement "the game design team will already have worked hard to get the game to run on the slower CPUs" - in my experience I find very few CPU/GPU-intensive games well optimized to work on "older" machines. By "older" I mean even two years. I think it's a conspiracy between CPU makers, GPU makers, Microsoft and games development houses so that we all have to fork out for the latest and greatest, when they could have taken the extra effort to optimize for older hardware. I think developers should be given older-spec machines to develop on, otherwise they get lazy and churn out code that will run brilliantly only on the latest super-duper-arctic-cooled machine. Sigh, gone are the days of optimized OSes and playable games that don't take a tenth of your hard drive. The Amiga had a GUI OS that fitted in 1 meg! Developers for the Amiga had to innovate for the hardware, so if they wanted to go to the next level of gaming brilliance, they had to write tight and optimized gaming engines, making use of every last bit of computing power the machine had to offer. Nowadays we are lucky if we get anything optimized after the third patch, released a year later. Sandman30s (talk) 10:03, 29 December 2007 (UTC)
I can assure you there is no conspiracy. I'm a game developer; I know! We work very hard to make our software run on older CPUs and GPUs - but there are limits. If you want to deliver a superb state-of-the-art game experience that will be fun, look and sound good, get good reviews, sell well and therefore make money (yes, making money IS the goal here!), you need to push the limits of the technology. The degree to which you can efficiently 'dial down' the content to make it work on older machines yet not ruin the game-play is limited - but you can be assured that we don't waste CPU cycles if we can possibly avoid it. On my desk at work, I have a high-end multi-core monster machine with 8 gigs of RAM, dual nVidia 8800 graphics feeding two big-screen monitors and a high-end sound card... but I also have a machine with a low-end CPU, not much RAM and a scruffy old nVidia 6800 card. I regularly test my code on both machines, and we track bugs and performance on a whole range of different platforms. It simply wouldn't make sense for us to arbitrarily abandon a section of our market unnecessarily. But there have to be limits. A game that'll just barely run on a 600MHz PC with a Voodoo-2 card would definitely look crap on a modern high-end gamer machine - so we have to trade off between using all of the power of the latest boxes versus failing to run on low-end machines from 5 years ago. As with everything else, it's a matter of knowing your customers and tuning the game to be desirable to the largest number of people. Once we've identified the performance span that we're trying to meet, we optimise the crap out of our code to make it run on those lower-end machines while we're also pushing the limits of the higher-end machines. So when the box says "You need a X GHz CPU and at least a YYYY GPU and at least ZZZ Mbytes of RAM"... we really mean it! SteveBaker (talk) 16:22, 29 December 2007 (UTC)
I hear you; perhaps you are one of the exceptions. I'm just so frustrated with the current rate of change and high prices, and companies expecting us to keep up. I have a 2-year-old PC with the maximum possible AGP card (upgraded four times just to keep up) and already there are games that are almost impossible to play. Bioshock is sluggish and has ridiculously long level load times, yet the quality is not that much better than some other FPSes that are playable on my machine. So what does that lead me to conclude? The developers, in an effort to release this game within a deadline, could very well have neglected to optimize it. Also, look at Crysis - I hear this game makes high-end graphics cards sweat! What is the use of releasing this kind of game? How will it make money if only a small percentage of gamers can play it? Even UT3 is a disappointment to me - something that requires tweaks and hacks just to run, with a large community waiting for a patch just to be able to make it run at all. Surely the developers picked this up during their 'extensive testing' cycle? I think the 'making money' part has gone one step too far in this multi-billion dollar industry. Be thankful you're on the end that can supply you with super-duper machines. Sandman30s (talk) 21:47, 29 December 2007 (UTC)
I understand where you're coming from - it can be frustrating. The Crysis engine is certainly notorious in that regard. There is actually no hardware you can buy that will allow the game to run at decent frame rates on its highest settings! Crysis is (I suspect) a somewhat special case. They are in the business of making a kick-ass game engine - which they hope other companies will buy and build games on top of (much as the Quake engine or the Unreal engine are used). They built their engine such that a company who buys it today can spend two or three years writing a game - and find that the hardware that'll be available THEN will be able to run the game fully. The first game using that engine is their own - and it's as much a technology demonstration for the likes of my management as it is a game to be played.
But certainly there is a general problem here. Much of it comes about because developers are cross-platform developing for game consoles as well as PCs. In the game console market, everyone is using the exact same hardware - if your game doesn't push that hardware to the utter limits, then the reviewers will 'ding' you for poor graphics or whatever and your console sales will stink... right now, that's death for a major game title. In those cases, the PC target tends to have to be at about the same level of general capability as the game consoles. That's tough, because we can get in and tinker at the lowest levels with the hardware in a game console - we know that hardware won't ever change. In a PC, we can't use some sneaky clever trick that only (say) an nVidia 7700 card can do - because we also have to run on the 7700GT (which doesn't have that trick) or the ATI 8000-and-whatever, which needs a different trick. On a PC, you have to use the higher-level APIs - and those are simply not as efficient as tinkering down at the nuts-and-bolts level.
In addition to the two PCs on my desk, I have an Xbox 360 developer kit and two PlayStation 3 developer kits piled up there too (yeah - it's pretty cluttered what with all the keyboards, mice, joysticks and whatnot - thank god I don't have to run on the Wii as well!) - and I have to make sure that everything works on all of those systems. The question is: what level of PC sophistication should I use as my low-end PC target? It can't be too far behind the two game consoles, because then I'll spend too much of my time trying to get it to run on that system and not enough time polishing it to look spectacular on the consoles and the high-end PC.
The determination of what the low-end PC requirements are is pretty much a marketing decision - they look at the demographics and decide how much of the market we can afford to leave behind as 'obsolete' and which level of system is still so commonplace that it's a "must have" market. Such things are down to cold hard economic realities - not software engineering. But if at the end of the day we can squeeze out enough CPU cycles that we can run on a 1.6GHz CPU rather than demanding 2GHz, we'll certainly make the effort. What we cannot do is things like reduce the number of enemies that leap out to attack you if your PC can't handle drawing that many - to do that would require major game-play rebalancing, and perhaps even mean that the game is essentially different on the low-end machines. Worse still, with networked game play, we have to somehow compromise in such a way that people with shiny new boxes don't dominate the scoring just because they are seeing more detail than everyone else. It's a tough call.
Sadly, at the end of the day, it is all about money. Game companies are not charities. We have to please our shareholders or we won't be writing ANY games! For that reason, we're pretty hard-nosed about where the effort has to go. If it costs us an extra third of a million dollars (that's about one mid-grade software engineer for the life of a typical project) to support a PC with an older graphics card, then that has to result in maybe a hundred thousand more game sales or it's a net loss. (Yes, I know how much games cost - but not a great percentage of that goes into paying engineers' salaries!) So the question becomes: if we support that lower-end graphics card, will 100,000 more people buy the game? If the answer is no, then we can't afford to do it - period. But if I can spot a sneaky trick that's not going to take too long to implement, and that'll let the game run on a lower-performance card than we'd planned for, then I'll certainly take the time to sneak it in - and the requirements list on the box will reflect that.
SteveBaker (talk) 07:04, 30 December 2007 (UTC)
You could also view the whole process the other way around. Instead of different groups of programmers competing to get your money, think of it as different groups of game players competing for the attention of programmers. One group is the attention-deficit teenagers who are constantly buying new graphics cards because their self-esteem is tied to their frame rate. It's senseless to complain "they wouldn't have to upgrade so often if the games were better designed". They want to be on that upgrade treadmill, belittling those who are 6 months behind. Because this group throws money around freely, they succeed in attracting the attention of programmers, who give them what they want: games that suck on anything but the latest hardware. --tcsetattr (talk / contribs) 07:44, 30 December 2007 (UTC)

Microsoft Live Search Books problem

This URL - http://search.live.com/results.aspx?q=&scope=books - on my desktop machine gets me into Live Search Books. On my laptop, the same URL gets me only "web results". (Screen says "Book results" in a green font in the top left corner if it is showing Live Book Search, or else "Web results" if it is not).

Anyone have a clue why, or how I might coax my laptop to follow the desktop's lead? Clearing cookies & caches has no effect.

It has some relevance to Wikipedia - here's a reference I've just added to a page. It works fine on the desktop, but comes up as a mostly blank screen on the laptop. (Let me know if it works for you.) --Tagishsimon (talk) 20:30, 28 December 2007 (UTC)

Try going here: http://www.microsoft.com/windows/products/winfamily/ie/default. Click on the link at the top-left with the magnifying glass icon next to it, called "Add live search to your browser". That might do it. Equazcion /C 00:14, 29 December 2007 (UTC)
Thanks. No, it doesn't. Both machines are XP; the problem occurs in IE & Firefox on the laptop. I'm thinking that since it's a beta product from MS, perhaps some choice is being made at their end to ration the service? --Tagishsimon (talk) 00:42, 29 December 2007 (UTC)
You could try http://books.live.com (although that seems to be a redirect to the URL you posted). This gives me the "Book results" text but otherwise actually doesn't work for me at all. Searches seem to trigger a never-ending loop of reloads (at least in Firefox). Your suggestion about a gradual beta rollout could be correct. See this article. Equazcion /C 00:58, 29 December 2007 (UTC)
Thanks. That takes me to... web results. Interesting article. I wish they'd hurry up. I used to have the "never-ending reload" problem on the desktop, but it's gone away recently, and the user interface & content is excellent. When you can get it :( --Tagishsimon (talk) 01:02, 29 December 2007 (UTC)
It's happening in IE7 also. This is very strange. I've done some searching, and some people report the same problem while others say it works for them in both browsers. I don't see how Microsoft could be treating two computers in the same location and with the same software load differently. This must have something to do with a missing addon, update, certificate, or something. I'll be looking into this more. Equazcion /C 02:29, 29 December 2007 (UTC)
I'd be very grateful for continued input. Yes, to confirm: IE7 & Firefox on the laptop get "web results". IE7 & Firefox on the desktop get "Book results". Both are WinXP, both on the same LAN going through the same gateway. I'll add any additional soundings from other machines as I have the opportunity. Like you, I'm unsure how it is discriminating between the two machines... --Tagishsimon (talk) 14:49, 29 December 2007 (UTC) I'm on a different LAN now... cookies deleted... the laptop still gets web results, and the mac portable gets Books. Barking. --Tagishsimon (talk) 17:59, 29 December 2007 (UTC)
Browser language settings? The books search isn't available for me because I have en-gb as my primary language; however, if I promote en-us above it then I can search through books (Options, General tab, Language button). --Blowdart | talk 09:53, 30 December 2007 (UTC)
That's excellent! Very many thanks... works (as you know) like a dream. --Tagishsimon (talk) 11:53, 30 December 2007 (UTC)
You're welcome. --Blowdart | talk 11:56, 30 December 2007 (UTC)
They seem to have blocked that loophole. Anyone else found a way to get into http://books.live.com from the UK? --PeterR (talk) 12:15, 8 March 2008 (UTC)

Capturing USB audio stream

I have a gramophone player with built-in analogue-to-digital conversion and a USB interface so that you can plug it directly into your computer. The audio signal is obviously digital as it enters the computer and I'd like to capture it exactly as it is. How can I do that? How can I, to begin with, find out what format the audio is in (sampling rate and bit depth)? I am limited to Windows XP at the moment, but I could possibly boot with a Linux live CD if that's what it takes. Thanks. —Bromskloss (talk) 20:36, 28 December 2007 (UTC)

You should be able to trace the USB traffic using SnoopyPro (I've never used it). Beyond that you're probably into reverse-engineering land. -- Finlay McWalter | Talk 20:41, 28 December 2007 (UTC)
For Linux, usbsnoop (I have tried that) works fine. -- Finlay McWalter | Talk 21:21, 28 December 2007 (UTC)
Thanks for the answers so far. Just to clarify, I don't want to capture the raw USB communication, only bit-correct audio. —Bromskloss (talk) 23:46, 28 December 2007 (UTC)
I realise that. But unless the device driver supplied with the gramophone thing creates a Windows audio device (one that can be opened and recorded by a program like Audacity), then reading the USB may be your only option. If it only talks a special magic protocol to a special magic program (which it seems you find produces unsatisfactory results), then sniffing may be your only option. If you get the sniffer to work, and you can program a bit, then this shouldn't be too hard - it's very likely the data stream in the USB packets will be almost entirely PCM-encoded audio. Once you've obtained that, it won't take too many tries to figure out what the format is - it'll surely be either 8 or 16 bit, either signed or unsigned, and you don't need to get the sampling rate correct to make progress - if you guess at 16k you'll either hear sensible data or not, and you can fix the sampling rate later. -- Finlay McWalter | Talk 00:00, 29 December 2007 (UTC)
I'm sorry, I should have told you. It does create an audio device (Audacity actually comes with the player!). In Audacity, I can choose to record in 16-bit fixed, 24-bit fixed or 32-bit float. I want to choose the format that gives me all the audio data, but no more. Hopefully, one of the three possibilities is the correct one. Can the audio device it creates be inspected to see the format it uses? Windows knows it, obviously. —Bromskloss (talk) 00:52, 29 December 2007 (UTC)
I think the business about 24-bit or floats is the internal format Audacity uses to store your recorded data, not the actual format of the audio device (that setting is in Audacity's "quality" tab, not the "audio I/O" tab). Looking at the Windows audio API only shows 8- or 16-bit fixed formats - nerdy details follow.
A bit of poking around in MSDN shows (I think; no Windows box on which to test this) that you can ask an audio capture device its format(s). The process seems to be: 1) enumerate the audio devices (see here); 2) call waveInOpen to open a handle to the correct audio device; 3) call waveInGetDevCaps, which fills in a WAVEINCAPS structure that lists the formats the device supports. I note from the WAVEINCAPS list that all the formats are either 8- or 16-bit ints; no fixed 24-bit or floats. -- Finlay McWalter | Talk 01:30, 29 December 2007 (UTC)
Ooo, nice find! I have never done any Windows programming, so I don't even know how to call the functions, but it seems I can assume that 44100 Hz, 16-bit fixed, stereo is the format. Thanks for your help! —Bromskloss (talk) 11:34, 29 December 2007 (UTC)
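(A follow-up note for the Linux live CD route mentioned in the question - a sketch, assuming the turntable enumerates as a standard USB Audio Class device, shown here as sound card 1. The kernel exposes the device's native formats directly, no API digging required:)
    cat /proc/asound/card1/stream0      # lists the interface's formats, bit depths and sample rates
    arecord -D hw:1,0 -f S16_LE -r 44100 -c 2 capture.wav    # bit-for-bit capture at that format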

capturing preview images and multi-page printing of web pages

BACKGROUND: Sally has a huge website that is really disorganized and she wants to print out all the pages and review them so she can rebuild her site with all the useless stuff weeded out. She wants hard copy printouts, but I don't think she realizes just how much "stuff" she really has, and I don't want to kill a bunch of old-growth forest just for the paper it will take to prove it to her.

PROBLEM: How can I get multi-page printouts of webpages that are on the internet without manually going to each page and choosing "print", or opening each page in a word processor? I know there are tools that go out and automatically download entire websites onto your local machine, but I want to print, not download.

Also, to save paper, I'd like to be able to print multiple "screens" per piece of paper - for example, several page images tiled on a single sheet. NoClutter (talk) 21:54, 28 December 2007 (UTC)

The first step is to get all the pages of the website. I'd use wget. With the -m option, you can recursively suck down all the pages in one go with something like "wget -w 2 -m http://whatever.site.com". Then, you'll probably want to convert them all to PDF for printing. I'd actually convert through doc format and then to PDF to get a better conversion than most utilities give; if you want to go that route, openjade is handy for HTML-to-SGML-to-doc converting. Then htmldoc is good at converting the cleaned-up HTML to PDF with "htmldoc -f output.pdf input.html". That will give you a PDF for every HTML document, which you'll then want to merge into a single PDF; Ghostscript is great at that. Use "gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite -sOutputFile=finished.pdf file1.pdf file2.pdf file3.pdf..." Of course, that is a lot of work to do if you plan to do it more than once - but it wouldn't be a major ordeal to wrap it all up in a single shell script. Perhaps someone has done that already. As for the red links - those are common programs that should have articles by now. -- kainaw 01:45, 29 December 2007 (UTC)
I stopped a tad early - sorry. Once you have it all in one PDF, you should be able to select multiple pages per sheet when you print without any problems. -- kainaw 02:01, 29 December 2007 (UTC)
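(The steps above wrapped into one script - a rough, untested sketch; whatever.site.com is a placeholder, htmldoc usually wants --webpage for ordinary unstructured pages, and the loop assumes no spaces in filenames:)
    #!/bin/sh
    # 1. mirror the site, waiting 2s between requests
    wget -w 2 -m http://whatever.site.com
    # 2. one PDF per page
    find whatever.site.com -name '*.html' | while read f; do
        htmldoc --webpage -f "$f.pdf" "$f"
    done
    # 3. merge the lot into finished.pdf
    gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite -sOutputFile=finished.pdf \
        $(find whatever.site.com -name '*.pdf')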

Just a side note: don't worry too much about the old-growth forests; as far as I know paper is generally made from fast-growing, relatively sustainable pulpwood trees. —Steve Summit (talk) 17:41, 29 December 2007 (UTC)

Picasa to iPhoto

Hello,

I'm trying to migrate someone from Windows XP to OS X Leopard. Is there any way to transfer photos from Picasa (including the album data) into iPhoto?

Thank you for any help,

--Grey1618 (talk) 22:11, 28 December 2007 (UTC)

I think it should just automatically bring all the photos' data along to wherever you save them. You might want to save them into the photos section when you transfer those pictures - it will save you a lot of time. --67.84.12.248 (talk) 23:54, 2 January 2008 (UTC)

keyboard question

I recently bought an HP Pavilion laptop with an extended keyboard (meaning I have the separate number pad). However, I've been having a problem: I don't know how I do it, but I keep hitting something that changes my question marks into É or é. I've done this several times and I can't seem to figure out how to get it back so I can type question marks. Anybody able to help? Croat Canuck Say hello or just talk 22:39, 28 December 2007 (UTC)

Mayhaps you're hitting the key combination to change the keyboard map? Check out the keyboard settings in the Control Panel and deactivate this feature so as to avoid confusion in the future. Cheers, Ouro (blah blah) 07:41, 29 December 2007 (UTC)
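(A likely specific culprit, offered as an educated guess rather than a confirmed fix: laptops sold in Canada often ship with a Canadian French keyboard layout installed alongside US English, and XP's default Left Alt+Shift hotkey silently switches between them - on the Canadian French layout, the ?/ key types É/é. Removing the extra layout, or the toggle hotkey, under Control Panel > Regional and Language Options > Languages > Details... should stop the surprise switching.)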

Mac on a PC?

I've heard numerous rumours that Mac OS X, with some computer knowledge, can be installed onto a PC. Is this true? How is it done? Thanks, Perfect Proposal Happy Holidays! 22:51, 28 December 2007 (UTC)

Just so you know, it violates the Mac OS X license to do so. Generally, Wikipedia doesn't help people break the law. Atlant (talk) 23:14, 28 December 2007 (UTC)
However, we do provide all manner of information, such as reports of instances where the law has been broken, if you're interested in reading about that. Here's a news article at Wired [4], whose staff I sincerely hope the authorities track down already, cause the article's been up for nearly two years with no Wired personnel being arrested yet, what with all this illegal activity, helping the public break the law and such. The article states that a hacked version of OS X exists called OSx86, which people are acquiring via BitTorrent, and a project exists at http://www.osx86project.org/. FYI only. Cheers. Equazcion /C 23:34, 28 December 2007 (UTC)
There is a Wikipedia article on OSx86. —Bromskloss (talk) 23:44, 28 December 2007 (UTC)
Depending on what you're looking for, Darwin (operating system) may be of interest as well. -- 128.104.112.236 (talk) 20:05, 29 December 2007 (UTC)
For more info, be aware that if you have an Intel Core 2 based system, particularly with an Intel chipset based motherboard, your chances of success are probably greatest, since these are most similar to available Apple systems. A64-based systems are trickier, particularly since they lack SSSE3, which is present on all Core 2 (and therefore all Intel x86-64) based systems. If you have an older pre-Venice A64 without SSE3, you'll probably have even more problems. However, people have had success with all these systems. Nil Einne (talk) 09:45, 29 December 2007 (UTC)
Note that violating an EULA may or may not be violating the law. It may just mean you void any warranty with the company. There are boundaries to how much reach EULAs have in enforcing various arbitrary rules and practices (see the EULA article for more details). On top of that, we have no idea about the jurisdiction of the questioner. So it is probably better to just answer the question than to try and self-censor based on a half-understanding of the legal issues. If someone asked me on here how to assemble a nuclear weapon, I'd point them to the relevant article (nuclear weapon design) even though it would be against a variety of laws for them to actually do it. --24.147.86.187 (talk) 23:46, 28 December 2007 (UTC)
Yes. Exactly. Equazcion /C 23:53, 28 December 2007 (UTC)
Yes, exactly. Note that there is a big difference between downloading OS X from BitTorrent when you don't have any legitimate copy, and buying a copy of OS X which you either then modify yourself, or for which you download a modified version that you substitute for your legitimate media during an installation. Nil Einne (talk) 09:39, 29 December 2007 (UTC)

http://lifehacker.com/software/hack-attack/build-a-hackintosh-mac-for-under-800-321913.php - this is a good Lifehacker article on the subject. —Preceding unsigned comment added by 67.68.25.49 (talk) 20:02, 29 December 2007 (UTC)

Maximum file size: FAT32 vs. NTFS

I thought that the biggest file I could put on a drive formatted as FAT32 was 4 GB. Without getting into technical details, I assumed this meant you'd need to use the NTFS format for larger files, such as a 12 GB AVI file used in digital video capturing.

What is the maximum file size under FAT32? And do I need NTFS for bigger files? --Uncle Ed (talk) 22:53, 28 December 2007 (UTC)

That's right - FAT can only store files smaller than 4 GiB. So you could not store a 12 GB file on any FAT drive (including FAT32), unless you break it into chunks of less than 4 GiB. NTFS handles far larger files (terabytes and beyond), as can ext3 (up to 2 TiB, depending on block size). --h2g2bob (talk) 23:23, 28 December 2007 (UTC)
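(If you're stuck on FAT32, a minimal sketch of the chunking approach using the standard Unix split utility - also available on Windows through Cygwin; 4095m keeps each piece just under the 4 GiB ceiling:)
    split -b 4095m capture.avi chunk_     # produces chunk_aa, chunk_ab, ...
    cat chunk_* > capture.avi             # reassemble later on an NTFS/ext3 drive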
Tried checking the articles yet? Check FAT32 and NTFS. They answer your questions. — Kieff | Talk 23:19, 28 December 2007 (UTC)
Note also that although FAT32 supports 4 GB files theoretically, there may be OS limits. I have a 3.5 GB file here that I can read and write on FAT32 under Windows 2003, but Windows 98, for example, wouldn't be able to access it, being limited to 2 GB. • Anakin (contribs • complaints) 16:16, 30 December 2007 (UTC)

time machine

Dear Wikipedia

I tried to use Time Machine (in Leopard) on the network drive at the back of my wifi router, but it only works on my 2 internal drives. I was thinking there might be a program which makes the Mac think the network drive is actually an internal one, so Time Machine would work normally? Thank you --82.152.250.117 (talk) 23:00, 28 December 2007 (UTC)

I think it's a feature of the filesystem, nothing to do with Time Machine. If your network server is making "shadow copies" then Time Machine may work over the network, but your computer definitely isn't going to be doing that for the server. --ffroth 04:34, 29 December 2007 (UTC)
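(A hedged footnote: around the 10.5.1 era a hidden preference circulated that makes Time Machine list otherwise-unsupported network volumes. It is not supported by Apple, and backups made this way are not guaranteed to be reliable:)
    defaults write com.apple.systempreferences TMShowUnsupportedNetworkVolumes 1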