Wikipedia:Reference desk/Archives/Computing/2016 March 31

Computing desk
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


March 31

application that maintains perspective on subject when film is moving

I have seen this of late, and have no idea what it is called: old video that is shaky or taken from television news is somehow steadied to stay focused on the subject, so the captions at the bottom and even the shape of the screen often come out warped. It's a neat application, but I have no idea what one even calls such a thing. Help?--Kintetsubuffalo (talk) 10:49, 31 March 2016 (UTC)

You may be thinking of "motion tracking" software. We have articles at Match moving and Image stabilization that may be helpful. --Thomprod (talk) 12:33, 31 March 2016 (UTC)
Distorting the image seems like an odd way to do it. I realize that leaving it undistorted would allow the image to go partially out of frame, but that seems preferable, to me. If the shaking isn't too bad, you could crop the frame a bit so as not to see black bars intruding on the frame. Or maybe you could just extend the last column of pixels outward, which would look a bit better than black bars. StuRat (talk) 15:12, 31 March 2016 (UTC)
I'm nearly positive that the OP is talking about some form of image stabilization. The images don't get distorted; the frame does. So instead of a guy jumping around the frame, the frame jumps around a (nearly) stationary guy [1]. This has become hugely popular in the past 5 years or so (or longer?). Here's an entire subreddit devoted to stabilized .gifs, and here [2] is another. Here [3] is a tutorial on how to do it. SemanticMantis (talk) 15:48, 31 March 2016 (UTC)
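For anyone curious how this is done under the hood, here is a minimal sketch of feature-tracking stabilization, assuming Python with OpenCV (pip install opencv-python); the input filename is a placeholder, and a real stabilizer would also smooth the recovered camera path rather than lock everything to the first frame:

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("shaky.mp4")          # placeholder input clip
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
h, w = prev_gray.shape
acc = np.eye(3)                              # accumulated camera motion

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Find trackable corners in the previous frame...
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=30)
    # ...and locate where they moved to in the current frame.
    nxt, st, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    # Fit the rotation/translation/scale that maps the current frame's
    # points back onto the previous frame's points.
    m, _ = cv2.estimateAffinePartial2D(nxt[st == 1], pts[st == 1])
    acc = acc @ np.vstack([m, [0, 0, 1]])    # chain back to frame 0
    # Warp so the scene stays where it was in frame 0: now the visible
    # frame edge jumps around instead of the subject.
    stab = cv2.warpAffine(frame, acc[:2], (w, h))
    cv2.imshow("stabilized", stab)
    if cv2.waitKey(30) == 27:                # Esc quits
        break
    prev_gray = gray
cap.release()
cv2.destroyAllWindows()
```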
Right, but then you have to decide what to do about the part of the image that goes off the frame. There are several options (a short sketch of a few of them follows this list):
1) Distort the image so it doesn't go off the frame.
1a) Compression at the edge(s) that go off frame.
1b) Stretching at the edge(s) that fall short (or just extending the last pixel, as I suggested).
2) Shrink the image down so that no part goes off the frame. This would still have black bars, on one or more sides, changing in size.
3) Crop the frame so the parts of the image falling short of the original frame no longer do.
4) Some combo of the above. StuRat (talk) 15:56, 31 March 2016 (UTC)
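Here is how a few of those options might look in code, again a sketch assuming Python with OpenCV; the frame filename and the 40x25-pixel shift are made-up stand-ins for whatever the stabilizer produced:

```python
import cv2
import numpy as np

img = cv2.imread("frame.png")                 # placeholder stabilized frame
h, w = img.shape[:2]
shift = np.float32([[1, 0, 40], [0, 1, 25]])  # pretend 40x25 px camera shift

# Black bars: the strip the image vacated is filled with a constant color.
bars = cv2.warpAffine(img, shift, (w, h),
                      borderMode=cv2.BORDER_CONSTANT, borderValue=0)
# Option 1b: extend the last row/column of pixels into the gap instead.
extended = cv2.warpAffine(img, shift, (w, h),
                          borderMode=cv2.BORDER_REPLICATE)
# Option 3: crop the frame so the gap never shows (here, 40x25 px margins).
cropped = bars[25:h, 40:w]
```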
HERE and HERE are some good examples. In the original Star Trek footage, the camera was bounced around as the cast of the show jiggled around in their seats... the result is fairly convincing. When you stabilize the image to remove the camera bounce, it reveals just how lame the actors look when they are doing this! Aside from curiosity value, there is practical value in stabilizing an image, for example in hand-held camera footage where some camera jiggle is inevitable and undesirable. SteveBaker (talk) 19:54, 31 March 2016 (UTC)
I have done this exact thing in Adobe After Effects, but it is NOT very easy; it took several hours of studying tutorials before I got it to work. Vespine (talk) 02:04, 1 April 2016 (UTC)
They should just film with a Steadicam. The most impressive Steadicam work I've seen was in Akeelah and the Bee, where the operator took a steady shot while jumping rope. If they gave Oscars to Steadicam operators, he would deserve one. StuRat (talk) 02:08, 1 April 2016 (UTC)

Thank you all, that's it! BTW, what is OP? Offending Party? Me? Sometimes...--Kintetsubuffalo (talk) 16:51, 1 April 2016 (UTC)

Original Post or Original Poster. StuRat (talk) 17:40, 1 April 2016 (UTC)

Windows 10

My desktop runs Win7 Home Premium, SP1. It has an Intel Celeron CPU G540 @ 2.50 GHz, with 6.00 GB installed RAM (5.90 GB usable). I keep receiving offers to install Win10, which say my system is suitable. Is it worth installing, or would it be stretching my system's resources a bit far? Thanks, @@@@

Others have had more experience with this than I have, but this says that it meets the minimum requirements. However, meeting the minimums is not enough to guarantee a good experience. My daughter is running it on a computer that had only 4GB of RAM, and it was quite sluggish. I bumped it up to 6GB and she saw a noticeable difference. A little later I raised it to 8GB, but she didn't comment on the difference. So I think 6GB will be OK. (But memory is about $5 per GB, so you can add two 2GB sticks for about $20.) My guess is that the processor will give you performance about like what you are used to. Bubba73 You talkin' to me? 20:21, 31 March 2016 (UTC)

Would a 4K Galaxy Note 6 have shorter battery life?

Sure, it's a phablet rumored to have a slightly bigger 6-inch screen, so the battery will be big, but the tech is so new that the only 4K phone so far (only 8 months older than the Galaxy Note 6) just interpolates what the 4K pixels should be from a 1080p image; otherwise the CPU and GPU load would cause poor battery life. As much as I believe we need ubiquitous 4K OLED phablets to enter the future, if it doesn't browse for 10 hours like the Note 4 or 5, or is fake 4K for endurance's sake, then anything my eyes might see would make so little difference that I'd just get a 1440p Note instead. Sagittarian Milky Way (talk) 18:34, 31 March 2016 (UTC)
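On the interpolation point: compositing at 1080p and scaling the result up to the 2160p panel keeps the GPU cost near 1080p levels, because each extra pixel is estimated from its neighbours instead of being rendered natively. A rough illustration, assuming Python with OpenCV and a made-up source image:

```python
import cv2

frame = cv2.imread("rendered_1080p.png")  # placeholder 1920x1080 render
# Interpolate up to 3840x2160: new pixels are blends of 1080p neighbours,
# so no extra content is rendered, only estimated.
upscaled = cv2.resize(frame, (3840, 2160), interpolation=cv2.INTER_LINEAR)
```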

Just to note, 4K and 2160p are the same thing. --Wirbelwind(ヴィルヴェルヴィント) 20:07, 31 March 2016 (UTC)
Brain fart. I of course meant 1440p, the other thing that starts with a 2 (2560x1440). Sagittarian Milky Way (talk) 20:29, 31 March 2016 (UTC)
Nobody is going to give you an answer for the battery life of an unreleased product with an unknown battery size. If you want speculation, see http://www.androidcentral.com/samsung-galaxy-note-6. --Guy Macon (talk) 23:52, 31 March 2016 (UTC)
4K on a phone/tablet? Can you possibly tell the difference on a screen that small? Seems like worrying about how many angels can be carved on the head of a pin, to me. StuRat (talk) 03:08, 1 April 2016 (UTC)
  • "Can you possibly tell the difference on a screen that small"
Yes, the tiny colon in the corner of my s key looks cloth-like because of the black lines between the pixels, and it's rendered at 441 pixels per inch. Anything that's a uniform lighter color has pixel texture, at least if I squint from 4 inches. I also see mild color haloing from the right side of the pixels being redder than the left. A 1440p screen that is 5.7 inches would only shrink the pixels to 76% as big. I think I could still see them then, but maybe only on primary-color backgrounds, since the subpixels look like this (I can see the 1080p texture up to about the distance I normally use the phone, half a foot, if I squint). The inevitable 6-inch screens (from smaller bezels) would make them 80% as big. Also, there are some slight benefits to more than a minimal retina screen. It'd be cool to browse Google Maps satellite view with a magnifying glass. Scratches so small that you wouldn't even care if the white pixels were really white won't show the subpixels as much, or at all (I don't use a screen protector because it makes the thickness feel less futuristic, and even with care it'll likely get microscratched eventually). If it's so humid that the screen fogs where you touch it, the parts under the fog will look better. And water on the screen won't make the subpixels as obvious (hey, you can use a non-waterproof phone with wet hands or in the shower if you're really, really careful). That said, 4K is such a minor, minor plus that there are a lot of things I'd rather have instead, like 0.3 inches less bezel, mobile Java, or a handlaser that can cut wood to play with (whether it's part of a phone or not). Sagittarian Milky Way (talk) 01:37, 2 April 2016 (UTC)
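For anyone who wants to sanity-check pixel-size comparisons like these: linear pixel size is the inverse of pixels per inch, so the ratio of two PPI figures gives the relative pixel width. A quick sketch in Python (the panel diagonals here are assumptions, not confirmed Note 6 specs):

```python
# Pixel density from resolution and diagonal. Panel sizes are assumptions.
def ppi(w_px: int, h_px: int, diag_in: float) -> float:
    """Pixels per inch of a w_px x h_px panel with the given diagonal."""
    return (w_px ** 2 + h_px ** 2) ** 0.5 / diag_in

p1440 = ppi(2560, 1440, 5.7)   # ~515 ppi on a 5.7-inch 1440p panel
p2160 = ppi(3840, 2160, 5.7)   # ~773 ppi on a 5.7-inch 4K panel
print(f"{p1440:.0f} ppi vs {p2160:.0f} ppi")
# Linear pixel size scales as 1/ppi:
print(f"4K pixels would be {p1440 / p2160:.0%} the width of 1440p pixels")
```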
OK, thanks. Sounds like we are nearing the limit of visual improvement with smaller pixels, even if we aren't quite at the limit yet. StuRat (talk) 16:36, 5 April 2016 (UTC)
I wholeheartedly agree with you on that. Sagittarian Milky Way (talk) 17:07, 5 April 2016 (UTC)

You ask, "Who is my Internet provider?" WHY?

I just bought a Dell Inspiron computer, and in setting it up they wanted to know who my Internet provider is. Why on earth would they ask that??? --Halcatalyst (talk) 22:42, 31 March 2016 (UTC)

I am confused. Who is "they"? 175.45.116.66 (talk) 00:59, 1 April 2016 (UTC)

The setup software. --Halcatalyst (talk) 03:39, 1 April 2016 (UTC)

Probably to install updates. Depending on where you live and what kind of internet setup you and your neighbors have, that might narrow down which connections to bother checking. Ian.thomson (talk) 03:45, 1 April 2016 (UTC)
(EC) This used to be fairly common in the days when you still had to "dial up" to your internet service provider: the computer would ask who your ISP was and had the settings for the major providers saved, purely so you didn't have to put them in manually. These days you generally just connect to your wifi network and your internet works, so I can't really see any reason for your computer to ask who your ISP is. Vespine (talk) 03:56, 1 April 2016 (UTC)
It just occurred to me: unless your laptop has a built-in 3G modem or something like that? Vespine (talk) 03:57, 1 April 2016 (UTC)
"The Setup Software" --- setup for what? My gut tells me that it is the setup for Outlook. It asks who your service provider is because most people get their email from their service provider. So, it uses that information for a default account setup. However, it could be the setup for Windows or the setup for Office or the setup for Norton Antivirus or the setup for Leather Goddesses of Phobos 2020... 209.149.114.77 (talk) 12:31, 1 April 2016 (UTC)[reply]
  • It might have been Outlook, which I don't want to use. In any case, I'm not one to bite on something I'm not sure about, so I got outta there.
  • My wife had a Surface 2 (RT!!!!) and Outlook totally hosed her email. All she wanted to do was connect with Yahoo.
  • It's strange that MS, which in a way pioneered open systems, has also always wanted to keep you in its grasp. --Halcatalyst (talk) 00:57, 6 April 2016 (UTC)