Wikipedia:Reference desk/Archives/Computing/2014 September 18

Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


September 18

make query

Hello. kubuntu linux. I am used to the ritual "./configure ; make ; make install". Why is it that about 80% of software packages simply configure and make straight out of the box, while about 20% end up in nightmare dependency hell? I don't have a bizarre set-up, just a bog-standard plain-vanilla kubuntu box. I'm not a power user. I don't muck about with obscure configuration files or recompile my kernel on the bleeding edge. Right now I'm trying to compile inkscape and have just about given up. Any insights appreciated, Robinh (talk) 08:28, 18 September 2014 (UTC)[reply]

Is there any reason why you're building from the source, rather than using "apt-get install inkscape" to install the pre-built package? If ./configure says it needs libfoo, then you need to install both libfoo and libfoo-dev. The latter tells the compiler building the program how to use libfoo. However, installing libfoo-dev should autoinstall libfoo as well. CS Miller (talk) 10:17, 18 September 2014 (UTC)[reply]
Agreed, a package manager is the easiest way to avoid dependency hell. IMO, the ease and ubiquity of these tools is one of the major improvements to the *nix world over the past decade or so. I don't think there really is a way to make "./configure ; make ; make install" work with 100% reliability on any *nix. Some of the reasons are outlined in the linked articles. SemanticMantis (talk) 14:43, 18 September 2014 (UTC)[reply]

(OP) thanks guys. I want to do this because there is no precompiled version of inkscape 91, AFAICS. But I was asking about why (whether?) there is such a severe split between the majority of works-out-of-the-box software and the install-a-billion-weird-libraries-and-eventually-give-up software. Best wishes, Robinh (talk) 20:16, 18 September 2014 (UTC)[reply]

It depends on what development packages are installed by default on your machine, and which (un)common packages inkscape needs. As I noted above, you can install the dev packages that inkscape needs. You can also try "apt-get build-dep inkscape" to install the build dependencies for the version that kubuntu ships with; they might be the same as 91. CS Miller (talk) 20:28, 18 September 2014 (UTC)[reply]
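For what it's worth, a rough sketch of that workflow on a Kubuntu/Debian-style box (assuming the packaged build dependencies are close enough to what the newer inkscape needs; "libfoo" is the same placeholder name used above, not a real package):

  sudo apt-get build-dep inkscape   # pull in the build dependencies of the packaged inkscape
  ./configure                       # run from the unpacked source tree
  make
  sudo make install

If ./configure still complains about a missing libfoo, installing the matching -dev package ("sudo apt-get install libfoo-dev") is usually the fix, as CS Miller says above.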

For example, I would like to do a Google search about Mickey, but every time, results for Mickey Mouse come up. Is there a way to specify that results for Mickey Mouse should be excluded, e.g. ONLY Mickey, NOT Mickey Mouse? --Jondel (talk) 10:11, 18 September 2014 (UTC)[reply]

Search for 'Mickey -"Mickey Mouse"'. This will (mainly) exclude the phrase "Mickey Mouse" from the results. If you search for 'Mickey -Mouse' instead, then it will exclude all pages that have "Mouse" in them, even if it's nowhere near "Mickey". CS Miller (talk) 10:20, 18 September 2014 (UTC)[reply]

Awesome! Thanks CsMiller--Jondel (talk) 10:40, 18 September 2014 (UTC)[reply]

Java two-way conversion

In Java, what's the typical method of providing a polymorphic two-way conversion between sibling classes? An example would be where text was sometimes HTML-escaped and sometimes unescaped, with only the runtime type indicating which was which (e.g. in an e-mail or forum app that accepted multiple formats), but where functions required either one or the other. NeonMerlin 11:53, 18 September 2014 (UTC)[reply]

The standard way to provide such polymorphism is to define your requirement - e.g., "provide text in a specific format" - as a Java interface; and then ensure that both classes implement this interface. Design an interface that meets your needs - and any instance of any class implementing the interface will automatically be able to satisfy your requirement. This can be enforced at compile time.
If you actually want to perform a conversion, the language doesn't force you into any specific design. You could convert the text, convert the object, or both. You could re-convert on demand, or you could cache the result anywhere you like. You could provide a copy constructor for each class that takes an instance of the other class as its argument; each time you needed an instance of the other class, you could regenerate it. If this operation occurs so frequently that it incurs a high CPU cost, you could generate the text in all necessary formats ahead of time, and store it for later use. If that implementation uses memory in such quantity that it scales poorly, you could re-think your utility functions so that they could handle text data in a more intelligent way.
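A minimal sketch of the interface approach, assuming the two sibling classes hold escaped and unescaped text (the names MessageText, RawText and EscapedText are invented for illustration, not taken from any library):

  // Common requirement: "provide the text in the form I ask for".
  interface MessageText {
      String asHtml();   // HTML-escaped form
      String asPlain();  // unescaped form
  }

  // Holds unescaped text; escapes on demand.
  final class RawText implements MessageText {
      private final String text;
      RawText(String text) { this.text = text; }
      public String asPlain() { return text; }
      public String asHtml() {
          // '&' must be escaped first so the entities added afterwards aren't double-escaped
          return text.replace("&", "&amp;").replace("<", "&lt;")
                     .replace(">", "&gt;").replace("\"", "&quot;");
      }
  }

  // Holds HTML-escaped text; unescapes on demand.
  final class EscapedText implements MessageText {
      private final String html;
      EscapedText(String html) { this.html = html; }
      public String asHtml() { return html; }
      public String asPlain() {
          // reverse order: '&amp;' must be handled last
          return html.replace("&quot;", "\"").replace("&gt;", ">")
                     .replace("&lt;", "<").replace("&amp;", "&");
      }
  }

Any function that needs one particular form just takes a MessageText and calls asHtml() or asPlain(); the runtime type of the instance no longer matters to the caller, and the compiler enforces that both classes can supply either form.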
Nimur (talk) 14:31, 18 September 2014 (UTC)[reply]

Storing images

I want to store 1000's of images on my computer and I want them to stay the same quality as when I downloaded them. What should I convert all the files to so that I ensure they will stay in perfect condition for later viewing? I was thinking about converting all the images to a PNG format but I wanted to know what the best idea would be before I do that. — Preceding unsigned comment added by 204.42.31.250 (talk) 21:47, 18 September 2014 (UTC)[reply]

You can just store them in the format they were in when you downloaded them; that's how most digital storage works. Lossy compression may mean some data was lost from the originals (although with modern cameras, the closest thing to an original is often already lossy), but that loss has already happened by the time you download the images. (If you are editing the images, you do have to consider generation loss.) Of course, this doesn't protect against data loss on the storage medium (a defective medium, malware, filesystem problems, accidental deletion, etc.), malicious or dumb software tampering with the images, or anything of that sort, but choosing a specific format won't generally help with that (except against dumb software that modifies and rewrites lossy images). Use multiple backups in different geographical locations to cover that. There is a minor advantage to a format which stores an error-detecting code for the image data, since in some cases you could get corruption that isn't easily detected, but it's far better to ensure your backup system has decent error detection. Of course, even if your backup system is robust, you have no guarantee that software capable of opening the images will exist in 50 years; if that's a concern, it may be worth considering the storage format. Nil Einne (talk) 22:05, 18 September 2014 (UTC)[reply]
I agree with Nil Einne. Converting the original files you have downloaded to, say, a lossless format doesn't have any benefit, as it isn't going to improve the quality; it could even degrade it if done improperly. -- Tahlorz
I made some modifications to my comment after you posted. I don't think these affect your comment but since there are no links to your user, talk or contributions page, I'll just notify you here. Nil Einne (talk) 22:20, 18 September 2014 (UTC)[reply]
As long as you're talking about common, open-standard image formats (JPEG, PNG, GIF) then I agree with Nil Einne - you'll surely be able to find programs to read or at least convert them in 10, 20, probably 50 years from now (getting the media to work even in 10 years is more of a challenge, ditto the filesystem the files are stored on). Previously common formats like TIFF, BMP, and TrueVision Targa, which have fallen mostly out of use, are still widely supported. But file formats that are used by only one program, like Adobe Photoshop or CorelDRAW files, would probably mean you'd need some future incarnation of that program (or hope that some open-source project has successfully reverse-engineered these complex, proprietary file types). It's not even a completely safe assumption that Photoshop 2025 will read a CS5 document properly. -- Finlay McWalterTalk 16:28, 21 September 2014 (UTC)[reply]
To be clear, my point was not that I believe the images could be opened in 50 or even 10 years by a future system, but rather that the condition of the images should generally remain the same, and if it didn't, that changing format wouldn't help. I do agree that if you want to ensure the images stay viewable, you need to consider the storage format, hence my last comment. While it's useful to talk more about being able to open the format in the future, I concentrated on the other issues (the images remaining the same and the need for good storage practices) because it's not clear to me the OP understands such basics, but they seem essential, and also because it's not that common to download stuff besides PNG, TIFF and JPEG. Admittedly, as you hinted at, I also glossed over the backup issue somewhat. I thought of mentioning the need for regular refreshes, to ensure both that the backups still work and are correct and that you can still access them on a modern system, but decided not to further complicate my reply, partly because I'm not sure the OP is really concerned about keeping the images that long. But if the hard disk containing their only copy of the images dies tomorrow, whatever format they store them in isn't likely to make much difference. Well, unless it's only partially dead, in which case a format that recovery tools can more easily recognise is more likely to be recoverable. Of course the OP's post is fairly unclear, so I could easily be wrong. Nil Einne (talk) 14:37, 22 September 2014 (UTC)[reply]
The whole point about using DIGITAL technology is that things like images can't gradually degrade. Either the entire image is as perfect as the first time you viewed it - or it's gone. In that sense, the format you choose is irrelevant. There are, however, two concerns that might affect your choice of file format:
  1. Longevity: Will there still be software that understands your file format in the distant future? We can't predict what'll happen in the future - but your best chance is to stick with the most common formats - and those that are not proprietary (belonging to one specific company) or tied to specific hardware. So avoid Photoshop's own file format - and don't use the ".raw" format that your camera delivers.
  2. Inherently lossy formats: Some file formats (JPEG, particularly; GIF too, in the sense that its 256-colour palette throws away colour detail in photos) use "lossy" image compression - that is to say that they toss out seemingly unimportant quality in favor of getting a smaller file size. Other file formats (PNG, for example) preserve every detail - even if the human eye can't perceive it - which results in a vastly larger file size. This doesn't affect the longevity of your pictures - but it does affect quality. The very first time you convert your image to JPEG (".jpg"), some details are discarded and will be lost forever if you archive the JPEG file. If you store your images in PNG, they'll retain all of that detail. (A quick way to check this yourself is sketched just below.)
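A rough demonstration of the lossy/lossless difference, assuming ImageMagick is installed (the filenames are placeholders):

  convert photo.png copy.png                    # PNG to PNG round trip: nothing is lost
  compare -metric AE photo.png copy.png null:   # prints 0 (no differing pixels)
  convert photo.png copy.jpg                    # PNG to JPEG: some detail is discarded
  compare -metric AE photo.png copy.jpg null:   # prints a non-zero count of changed pixels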
A bigger concern is the media you use. Early photos of my son, soon after birth, were taken by my father on a fancy new camera he had that recorded pictures on a 2" floppy disk. Even before we could ask to get copies, the company went bust and my father was left with no way to get the pictures off of the camera - except by recording them to video tape! So we have PNG images grabbed from a crappy VHS recording as our only record of those days. The lesson here is to avoid anything super-new and one-off. Pick the most common thing - and these days, uploading it to the web seems like the best option.
The Internet provides hope that we'll have more longevity. The images you can view in your browser are SO ubiquitous that it seems impossible that humans will ever fail to support those file formats without first converting the entire 2014 Internet archive into whatever new formats they'll have. Similarly, the idea that some disk format will go out of style and your data will be stuck on some impossible-to-read physical disk is unlikely to be an issue. The Internet has transitioned through several generations of storage technology - and nobody has even particularly noticed. Sure, if you'd kept data on 8" floppy disks, you'd have a hard time extracting them now - but if they were on an early 20Mbyte "Winchester" hard drive, and that hard drive was progressively copied onto newer and newer technology, then your files would still be readable.
Another trick to keeping files accessible is to put them in lots of places. Don't trust just one Internet provider. If you put all of your photos on Facebook and they change their policies in 20 years' time to delete all files more than 10 years old...then you're screwed. But if you keep them on memory sticks AND Facebook AND Google AND WikiCommons AND your phone, your computer...then the odds of all of those disappearing at once are essentially zero. But you'll need to be vigilant. If some storage mechanism goes dark - then find another up-start format or service, and make another copy there. Ultimately, your very best bet is keeping your pictures in as many places (both physically and logically) - and on as many media types as possible - and in the most common/standard file format you can find. SteveBaker (talk) 19:08, 22 September 2014 (UTC)[reply]