
Wikipedia:Reference desk/Archives/Science/2016 January 27

Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


January 27


Can a man's epididymis grow back if *all* of it is removed?


For reference: Epididymis. Futurist110 (talk) 00:05, 27 January 2016 (UTC)[reply]

I don't think so. Organs generally don't regenerate. Semi-exceptions: liver and brain. Why are you asking? Are you thinking of something like a vasectomy spontaneously reversing, which can happen? If so, that involves the vas deferens, not the epididymis. --71.119.131.184 (talk) 00:26, 27 January 2016 (UTC)[reply]
Why exactly can the vas deferens grow back but not the epididymis, though? Futurist110 (talk) 02:34, 27 January 2016 (UTC)[reply]
Indeed, if one tube/duct can grow back, then why exactly can't another tube/duct likewise grow back? Futurist110 (talk) 02:36, 27 January 2016 (UTC)[reply]
Good question. It doesn't really "grow back" in the sense of sprouting a new one from scratch. Exact vasectomy methods can vary a little (see the article), but in general the vas deferens is severed. Sometimes a portion is removed, but sometimes it's just cut, and the cut segments closed off with surgical clips or something along those lines. So, you can get minor tissue growth that winds up reconnecting the segments. Some additional procedures, like forming a tissue barrier between the vas deferens segments, have been tried to reduce the likelihood of spontaneous reversal. --71.119.131.184 (talk) 02:45, 27 January 2016 (UTC)[reply]
OK. Also, though, out of curiosity--can the vas deferens grow back if *all* of it is surgically removed? Futurist110 (talk) 02:58, 27 January 2016 (UTC)[reply]
In general, the less differentiated a tissue is, the easier it is for it to regenerate. The vas deferens are fairly simple muscular tubes, in contrast to the epididymis and testes, which are specialized organs, so it's not surprising that you can get some regrowth of the vas deferens. --71.119.131.184 (talk) 02:45, 27 January 2016 (UTC)[reply]
Is regrowth of the epididymis and testicles (after *complete* removal of the epididymis and testicles, that is) completely impossible or merely unlikely, though?
Also, please pardon my ignorance, but isn't the epididymis a tube just like the vas deferens is? Futurist110 (talk) 02:58, 27 January 2016 (UTC)[reply]
The epididymis is a 'tube', but a longer and more complex one, and if it's removed completely it won't grow back (parts of it may, but not the entire connection). If you snip the tube, then you've got a similar situation to a vasectomy, which can in rare circumstances reverse (repair) itself, but that's a very small step as opposed to regenerating an entire epididymis. The body is good at 'protecting' itself by repairing damage, whether that's by growing new tissue to reverse a vasectomy or repairing a damaged organ. What we lack is the regenerative capability to re-grow any parts that have been removed/destroyed completely, including the epididymis. I think a quick Google search will make it clear that testicles don't grow back. Mike Dhu (talk) 03:21, 27 January 2016 (UTC)[reply]
An aside on skin growth and stretch marks
Another exception is skin, which grows just fine, if given enough time (like when you gain weight). But, if you try to grow it too quickly, you get stretch marks and scars. StuRat (talk) 00:29, 27 January 2016 (UTC) [reply]
That's not an exception, just wrong. Skin contains elastic fibers that allow it to stretch during weight gain and to recoil and shrink during weight loss. So someone who gains a large amount of weight does not grow new skin. Their skin stretches to accommodate the accumulation of fat tissue. Scicurious (talk) 14:13, 27 January 2016 (UTC)[reply]
The definition of "growth" here may be tricky. According to these two papers ( [1][2] ) the skin consists of epithelial proliferative units (though there may be some equivocation on the details) and each unit has its own stem cell. Given the chance, they clonally expand, but a unit without a stem cell can also be colonized. If you simply look at a section of skin, you're not going to see a lot of gaps where cells no longer contact -- something is taking up the slack. Yet at the same time, the hair follicles don't increase in number, and they have their own stem cells that can provide regeneration in case of injury. So when a baby's scalp becomes a woman's, you can say her skin grew, in the sense that it is probably thicker and stronger and has more cells in it than when she was a baby. But yet, the hair follicles are no more numerous, so the regenerative potential from that source is presumably reduced. I'm not sure exactly what happens to the average EPU size. Wnt (talk) 14:58, 27 January 2016 (UTC)[reply]
Just compare the surface area of a man at 150 lbs to the surface area of the same man at 350 lbs a few years later. The skin "grew" under most any definition: more of it, more cells, new cells, more area, more mass, etc. Here's a nice paper that breaks down relative skin proportion in mice [3]. Unfortunately it's about two different strains of mice rather than weight gain within strains, but the point is that the bigger mice get more skin as they grow, just as humans grow more skin as they grow. SemanticMantis (talk) 15:11, 27 January 2016 (UTC)[reply]
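As a rough numerical check of the surface-area claim above, the Du Bois formula (a standard clinical estimate of body surface area) predicts a substantial increase in skin area with weight. The formula and the 178 cm height are my additions for illustration, not anything cited in the thread:

```python
# Du Bois & Du Bois (1916) estimate of body surface area (BSA):
# BSA (m^2) = 0.007184 * weight_kg^0.425 * height_cm^0.725
def bsa_du_bois(weight_kg, height_cm):
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

KG_PER_LB = 0.45359237  # exact conversion factor

# Same 178 cm man at two weights:
print(bsa_du_bois(150 * KG_PER_LB, 178))  # ~1.85 m^2 at 150 lbs
print(bsa_du_bois(350 * KG_PER_LB, 178))  # ~2.65 m^2 at 350 lbs
```

By this estimate the heavier man has roughly 40% more skin area; whether that extra area comes from stretching or from new cell growth is exactly the point in dispute above.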
Ahem. Stretch_marks "are often the result of the rapid stretching of the skin associated with rapid growth or rapid weight changes." That sentence is not sourced, but see table 2 here [4], and perhaps add it to the article for your penance :) Now, this may have been avoided had Stu given a reference, and you also are correct that the skin can stretch a lot. Some specific types of stretch marks are less influenced by weight gain, but that's a small detail in an issue unrelated to OP. Please let's endeavor to include references when posting claims to the ref desks. Thanks, SemanticMantis (talk) 15:04, 27 January 2016 (UTC)[reply]

U.S. currency subjected to microwaves


See HERE. Is there any validity to this? It's hard to filter out the nonsense/conspiracies. If so, what is the mechanism of action? 199.19.248.82 (talk) 02:07, 27 January 2016 (UTC)[reply]

I wouldn't say it's that hard. Youtube videos and anything associated with Alex Jones or spreading conspiracy theories like godlikeproductions.com and prisonplanet.com are obvious stuff to filter out. Of the top results, that will probably leave you with [5], [6] & [7]. A quick read should suggest both the first and second link are useful. In fact despite the somewhat uncertain URL, the snippet probably visible in Google for the first link does suggest it may be useful, which is confirmed by reading.

A critical eye* on the lower-ranked results will probably find others like [8] which are useful.

I don't think it's possible to be certain how big a factor the metallic ink (or perhaps just ink) was in the cases when the bills did burn, and how much of it is simply that stacking a bunch of pieces of paper and microwaving them for long enough will generally mean they catch fire. Suffice it to say they are probably both factors. It's notable that these stories alleging something sinister lack controls: they didn't try a stack of similar-sized ordinary paper (though obviously it would be very difficult to obtain the paper used for bank notes).

BTW, [9] isn't actually too bad in this instance if you ignore the crazies. It does look like one of the more critical posters there made a mistake. While the idea that some minimum-wage employee is going to be microwaving $1000 worth of $20 bills, or for that matter would just so happen to have that much cash in their wallet, isn't particularly believable, if you read the original story carefully, the minimum-wage employee was someone other than the person who had the money. Still, as some of the other sources above point out, there are obvious red flags in the original story, like the fact that they claimed to microwave over $1000 in $20 bills but only show 30 bills (i.e. $600) there, and that their claim that the burning is uniform isn't really true: the amount of burning shown varies significantly, and while it's normally at the head, sometimes it seems much more in the left eye than the right.

Critical eye = at a basic level, anyone who seriously believes there are RFID tags in bank notes is best ignored. And while forum results can have useful info, it often pays to avoid them due to the number of crazies, unless you can't find anything better. Instead, concentrate on pages that sound like whoever is behind them is trustworthy, and check by reading. To some extent, anything which sounds like it's claiming to be scientific is potentially useful in this instance: while there are a lot of sites and people who claim science when they are actually into pseudoscience, this is much more common with stuff like alternative medicine, climate-change denial or anti-evolutionism than with conspiracy theories about RFID tags in banknotes.

A lot of this can be assessed without even having to open the sites/links in the Google search results. Some of it does depend on existing knowledge, e.g. knowing the URLs for Alex Jones or conspiracy-theorist sites. Still, you only need a really brief look to realise godlikeproductions or prisonplanet are not places you should count on for useful info.

Nil Einne (talk) 08:55, 27 January 2016 (UTC)[reply]

Given how extensive known privacy invasions have become, and how obvious the government's motive for spotting large amounts of money is, I don't think condescension is deserved. The people I saw made various hypotheses and tested them. However, some of the assumptions may be questionable. For example, I doubt that RFID is the only way to track a bill by penetrating EM radiation, and I doubt that RFID chips inevitably catch on fire in a microwave. I am very suspicious of the uses of terahertz radiation and lower frequency radio waves - obviously, the higher the frequency/shorter the wavelength, the smaller the receiver can be and the more readily it can dissipate heat to its surroundings. Alternatively, terahertz can simply penetrate the human body, as with airport scanners, and so if someone designed a set of terahertz dyes, probably some conjugated double bond system that goes on a really long but tightly controlled distance, then they can have their damned identifying codes marked out in a way you will see only if you can scan through the terahertz spectrum with a more or less monochromatic emitter and view what is transmitted and reflected. If I see someone do that experiment on a bill, I'll believe it's not being tracked... maybe... otherwise, I should assume it is (it's just a question of whether those interested in you are important enough to have access) Wnt (talk) 15:13, 27 January 2016 (UTC)[reply]
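The frequency/wavelength/receiver-size relationship invoked above is just λ = c/f; a minimal sketch, with the two frequencies (a microwave-oven band and 1 THz) chosen by me for illustration:

```python
# Wavelength from frequency: lambda = c / f
C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_m(freq_hz):
    return C / freq_hz

print(wavelength_m(2.45e9))  # ~0.122 m: typical microwave-oven frequency
print(wavelength_m(1e12))    # ~0.0003 m (0.3 mm): 1 THz
```

So a receiver resonant at terahertz frequencies could indeed be hundreds of times smaller than anything tuned to microwave-oven frequencies, which is the size argument being made.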
Their tests were very poorly planned (if you genuinely believe that something has an RFID either feel for the tag or look at it under a microscope as someone else who didn't believe their nonsense did, don't microwave). And as I already said lacked even the most basic control for even the stupid test they were doing. And they were either incapable of even counting, or didn't even show all their results. Results which didn't even show what they claimed to show. So condescension is well and truly deserved.

BTW most terahertz radiation can barely penetrate the human body (our article says "penetrate several millimeters of tissue with low water content"). The main purpose of most airport scanners using terahertz radiation is to penetrate clothes not the human body (they may be able to see the body which is quite different from penetrating the body).

Note that in any case, the issue of whether bills are being tracked is unrelated to the question (unless you're claiming the cause of the notes catching fire in microwaves really was RFID chips, which it doesn't seem you are) and wasn't discussed by anyone here before you. I only mentioned that the specific claim mentioned in some sources discussing microwaving money (the presence of RFID chips in money as a cause of them catching fire) was incredibly stupid.

Nil Einne (talk) 19:23, 27 January 2016 (UTC)[reply]

This is a multi-layered question:
  1. Does paper money catch fire in a microwave? (We don't trust YouTube videos!)
  2. Does all paper catch fire in a microwave when stacked in the same manner as the money? (All experiments need a 'control')
  3. If (1) is true and (2) is false - then is the fact that the money's paper is made of cotton rather than wood-pulp the cause of this?
  4. If (3) is false - then are metal particles in the anti-counterfeiting ink the cause of this difference?
  5. If (4) is false - then...and so on...
The idea that there are sneaky RFID tags embedded in the money seems really unlikely given the size of antenna you'd need - and the fact that you'd see them if you held the bill up to the light. So that comes in as question #20 or so after we've speculated about the fact that maybe 30% of all paper money has traces of cocaine in it and maybe that's what catches fire.
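The antenna-size point above can be made concrete with a quarter-wave estimate. The two frequencies below are common RFID bands I've picked for illustration; nothing in the thread names specific bands:

```python
# Quarter-wavelength antenna element length at common RFID frequencies.
C = 299_792_458.0  # speed of light, m/s

def quarter_wave_m(freq_hz):
    return C / freq_hz / 4

print(quarter_wave_m(13.56e6))  # ~5.5 m: HF RFID (real HF tags use multi-turn coils instead)
print(quarter_wave_m(915e6))    # ~0.082 m (8 cm): UHF RFID
```

Even the UHF case needs roughly an 8 cm element, and the HF coils that sidestep the problem are plainly visible structures, so a covert antenna hidden inside a banknote is hard to credit.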
If you're determined to find a conspiracy, what seems more plausible is that the pattern made by the metal in the ink could cause some kind of unique signature in reflected or transmitted radio waves - and this would somehow make the money detectable...but the behavior of the money in a microwave oven doesn't really prove that either way. However, that seems more plausible than RFID tags.
This image shows that paper money is clearly detectable in x-rays - so for sure the metal inks make it detectable in some manner.
Microwave ovens are a continual source of surprising effects. Cut a grape *almost* in half, leaving a thin shred of skin between the halves - lay the two halves spread apart inside a microwave oven and zap them - and you get an impressive light show of sparks. Does this prove that government can 'track' grapes? No! That's ridiculous! So why would you assume that money catching fire in a microwave would imply that?
We know that putting some kinds of metal into a microwave causes unusual effects - so why not the metal inks or the cotton fibers or some other effect inherent in the structure of paper money?
SteveBaker (talk) 13:58, 28 January 2016 (UTC)[reply]

Are dogs racist?


Do dogs prefer their own breed for mating or at least, are more aggressive towards breeds far away from them? --Scicurious (talk) 12:46, 27 January 2016 (UTC)[reply]

Intriguing question. I have absolutely no idea how to answer it though... just picture the kind of laboratory you'd have to set up to try to socialize dogs under highly consistent conditions, then see whether they act differently. I'm tempted just to read anecdotes here, like [10]. Individual people describe dogs with out-of-breed associations, even as others say that you can just tell at a dog show, etc. The existence of the mutt is proof that any breed loyalty is not absolute ... it's also a reminder that the dogs people buy are often not the result of freely assortative (or non-assortative) mating. Wnt (talk) 15:25, 27 January 2016 (UTC)[reply]
The studies will be done more like sociology/ethology, not through controlled exposure experiments. They'll use things like surveys and observations and medical records and lots of relatively fancy statistics. E.g. these [11] people have survey data on dog-dog aggression by breed, but I can't see that they reported it! Even if breed was not a significant factor, they should say so... This paper [12] does have relevant data (tables 2, 3), but the data are sparse and the breed of the other dog is not reported. Here are a few more scholarly papers that look promising [13] [14]. OP can find many more by searching Google Scholar for things like /dog breed intraspecies aggression/. If OP is interested in anecdotes and OR (which is potentially valuable here), I'd suggest asking at a dog forum. SemanticMantis (talk) 15:56, 27 January 2016 (UTC)[reply]
If you go to any large dog park, you'll see dogs of all breeds playing together - even when there are enough of one common breed for them to potentially group together. So it seems rather unlikely that they care very much. The only preferences I think I see are that there seems to be some kind of broad preference for other dogs of similar size. Our lab gets visibly frustrated with very small dogs...but whether that is due to their behavioral tendencies is hard to tell. SteveBaker (talk) 16:04, 27 January 2016 (UTC)[reply]
Steve, I just misread your message and began wondering why your laboratory was getting frustrated with small dogs! Presumably, you mean your labrador! This breed, rather surprisingly, has topped several lists for aggression - particularly when their home territory is "invaded" by people such as postmen. Regarding the OP, I have no references to support this, but I very much doubt there would be a psychological racism about mating amongst dogs. There may be preferences according to size, but just the other day I saw a rather humorous photo of a male Chihuahua perched on the back of a female Great Dane so he could mate with her. Very probably staged though. DrChrissy (talk) 16:24, 27 January 2016 (UTC)[reply]

Excessive inbreeding as practiced by humans on pedigree dogs (controversy) has caused genetic defects that would not survive under natural selection, while there is likely evolutionary survival value to hybrid vigor. Dogs sensibly rely more on their vomeronasal organ to evaluate the pheromones of a potential mate than on any version of a Kennel Club breed checklist. AllBestFaith (talk) 17:17, 27 January 2016 (UTC)[reply]

It's not about mating and it's not about dogs, but rats can be racist, see Through the Wormhole S06E01 Are We All Bigots. Of course, they can be educated not to be racist. Tgeorgescu (talk) 20:28, 27 January 2016 (UTC)[reply]

I interpreted the headline differently: I once knew a dog who was racist regarding humans. He had been mistreated by Mexican men, and was consequently suspicious of all men, but he got crazy in the presence of Mexicans. — Sebastian 18:10, 29 January 2016 (UTC)[reply]

I think you need to read the question itself - the headline rarely carries sufficient information with which to formulate a decent answer. SteveBaker (talk) 16:59, 31 January 2016 (UTC)[reply]

Why don't some species have a common name?


Many species have a common name. Human. Squirrel. Rat. Dog. Whale. Dolphin. Fern. Some species don't seem to have common names. Entamoeba histolytica. Staphylococcus aureus. Candida albicans. Why don't scientists invent common names for specific parasites, bacteria, and fungi? Instead of Staphylococcus aureus, which can be a mouthful to say, the common name may be Staphaur bacteria. 140.254.70.165 (talk) 12:49, 27 January 2016 (UTC)[reply]

Also, of the above names, only two are single species in common use, H. sapiens and C. familiaris. Robert McClenon (talk) 21:59, 27 January 2016 (UTC)[reply]
My answer as to why scientists don't invent common names is that they don't need to, because scientists refer to the species by its taxonomic name. It is up to non-scientists to invent common names, since the scientists are satisfied with the scientific name. Why journalists and others don't invent common names for every species is described below. Robert McClenon (talk) 21:56, 27 January 2016 (UTC)[reply]
It is "an attempt to make it possible for members of the general public (including such interested parties as fishermen, farmers, etc.) to be able to refer to" them, according to common name. I find it difficult to find an exception, but if common people relate somehow to a species, then a common name exists. Otherwise not. --Scicurious (talk) 12:57, 27 January 2016 (UTC)[reply]
As to fishermen, I will note that often the same common word, such as "trout" or "bass", may be used differently in different English-speaking regions. Fishermen who are aware of regional inconsistencies in naming will often use the unambiguous scientific name to disambiguate. Robert McClenon (talk) 21:56, 27 January 2016 (UTC)[reply]
It could also be that scientists just aren't that creative with names... FrameDrag (talk) 14:46, 27 January 2016 (UTC)[reply]
It's the other way round. Scientists give each known species a name. Common people are not prolific enough to keep up with them. Scicurious (talk) 14:54, 27 January 2016 (UTC) [reply]
Staphylococcus aureus is known just as 'Staph', similarly Streptococcal pharyngitis is known as 'Strep'[15], so those are the common names. Mikenorton (talk) 13:10, 27 January 2016 (UTC)[reply]
Golden staph.
Sleigh (talk) 13:49, 27 January 2016 (UTC)[reply]
Bacteria are a special case, where the usual test for a species, whether it breeds with itself and not with related species, does not apply. This results among other things in so-called species, such as E. coli, that consist of a multitude of so-called varieties that are really so different in their behavior that they are probably multiple species. But the question originally had to do primarily with plants and animals. Robert McClenon (talk) 21:53, 27 January 2016 (UTC)[reply]
Does that refer to the color of the snot ? :-) StuRat (talk) 16:40, 27 January 2016 (UTC) [reply]
The thing about common names is that they need to be popular and commonly used - and it's hard to dictate that. People name things if they need to - and not if they don't. People don't need common names for organisms they'll never encounter or care about. Also, there are far too many organisms out there to have short, simple, memorable names for all of them. We tend to lump together large groups of organisms into one common name. "Rat" (to pick but one from your list) formally contains 64 species...but our Rat article lists over 30 other animals that are commonly called "Rat" - but which aren't even a part of the genus Rattus. So allocating these informal names would soon become kinda chaotic and nightmarish. Which is why we have the Latin binomial system in the first place. That said, scientists very often do invent common names for things - so, for example, Homo floresiensis goes by the common name "hobbit" because the team that discovered this extinct species liked the name and it seemed appropriate enough that it's caught on. Whether that kind of common name 'catches on' is a matter of culture. All efforts to get members of the public to understand that there is no difference between a "mushroom" and a "toadstool" and to adopt a single common name fail, because the public believe that there are two distinct groups of fungi even though there is no taxonomic difference between fungi tagged with one or other of those two terms. Another problem is that common names are (potentially) different in every language...so would you have these scientists invent, document, and propagate around 5,000 common names - one in each modern human language? It's tempting to suggest that the same name would be employed in every language - but pronunciation difficulties and overlaps with names for existing organisms or culturally sensitive terms would make that all but impossible. SteveBaker (talk) 16:00, 27 January 2016 (UTC)[reply]
The OP is overlooking the more obvious Hippopotamus and Rhinoceros, which have local names but in English are known by these Latin-based names - or by "hippo" and "rhino", which mean "horse" and "nose" respectively. ←Baseball Bugs What's up, Doc? carrots→ 17:07, 27 January 2016 (UTC)[reply]
Of course, the full names in Greek being "River Horse" and "Horned Nose". (The names derive originally from Greek rather than Latin, though they arrive in English through Latin transcription. The native Latin word for horse is "equus", cf. equine. The native Latin word for nose is "nasus", hence "nasal".) Of course, both names are wrong. Hippos are not particularly closely related to horses, and the growths on the faces of rhinos are not true horns. So even in the original Greek, neither name is related to actual biology in any way. Such is language. --Jayron32 20:45, 27 January 2016 (UTC)[reply]
  • The other issue is that the vast majority of species don't have common English names at all, because English speakers don't commonly encounter them. Consider the 400,000 different species of Beetle. Of course, we have some names for beetles English-speaking people run into every day, like ladybugs/ladybirds or junebugs, or Japanese beetles (even these names have multiple species they cover though, and often mean different unrelated species in different geographies). We have 400,000 different Latin binomial names for these species, because each needs a unique identifier, but seriously, we don't also need 400,000 unique different English names for them, especially where they aren't beetles anyone runs into in their everyday lives. --Jayron32 20:37, 27 January 2016 (UTC)[reply]
In many cases, the differences between the species may not be significant enough that a non-zoologist recognizes them as different species. The common name may refer to a genus, a family, or an order. Most beetles are just called beetles, unless someone has a reason to identify them more specifically, such as "Japanese beetle" as a garden pest. Even with mammals, and even with large mammals, people don't always see the need for distinctive common names. "Zebra" and "elephant" are not species but groups of species. There usually really isn't a need for a common name for every species. Robert McClenon (talk) 21:50, 27 January 2016 (UTC)[reply]
Yes, but every species and subspecies of zebra also has a common name, like Chapman's zebra and Hartmann's mountain zebra. In the UK, there are common names for hundreds of different types of beetles, for example, those without common names are ones that are really uncommon - beetles nobody refers to in everyday speech. Alansplodge (talk) 22:19, 28 January 2016 (UTC)[reply]

Disadvantages of iontophoresis for administering drugs ?


Iontophoresis#Therapeutic_uses doesn't list the disadvantages, but they must be substantial, or I would have expected this method to have replaced needle injections entirely. So, what are the disadvantages ? I'd like to add them to the article. StuRat (talk) 16:43, 27 January 2016 (UTC)[reply]

This [16] is a very specific study about a specific thing, but it says in that one case (of problem, treatments, drugs, etc.) "In contrast, electromotive administration of lidocaine resulted in transient pain relief only" compared to other treatments, which were concluded to be better. Here is a nice literature review [17] that has lots of other good refs. I'm no doctor, but I don't get the idea that it was ever intended to replace needles entirely. For one, it seems much slower. Another is that the permeability of skin is different with regard to different-size compounds, so some things may be too big to pass through easily. Another potential factor is the stability and reactiveness of the compounds to the electrical field. It's also clearly more expensive and rather new, compared to injections via syringe and hypodermic needle, which are cheap and have been thoroughly studied for efficacy. If you search Google Scholar, you'll see lots of stuff about bladders and chemotherapy, and nothing about using it as a method to deliver morphine or flu vaccine. I'll leave you to speculate why that might be... Also I think you are vastly underestimating the time scale at which the medical field changes. The key ref from the article [18] is preaching that we should do more research on this, and cites small trials. And it is only from 2012! SemanticMantis (talk) 17:17, 27 January 2016 (UTC)[reply]
I just saw it in a 1982 episode of Quincy, M.E., so it's been around for at least 34 years. If it really could replace all injections, then I would think it would have, by now. StuRat (talk) 17:28, 27 January 2016 (UTC)[reply]
Well, sure, the idea has been around for a while. I'd also suggest a TV show isn't a great record of medical fact. My second ref says "The idea of using electric current to allow transcutaneous drug penetration can probably be attributed to the work done by Veratti in 1745." I agree it can't replace all injections. I agree there must be things it won't work for, and cases where syringes are just better. I'm trying to help you find out what those cases and things are. The references I gave above, and especially the refs within the second, discuss some difficulties and problems, but you'll have to read them to see what they're really talking about and to understand the details. To clarify what I said above, EDMA only works well with an ionized drug. That alone would probably be useful to clarify in the article. As for timing, when I see research articles on EDMA written in the past few years talking about "potential" and "new frontiers," I conclude it is not yet widely used for many things, but it may become more widespread in the future. Maybe someone else wants to find additional references or summarize them for you in more detail, but that's all I've got for now. SemanticMantis (talk) 17:58, 27 January 2016 (UTC)[reply]
I think it's unlikely a TV show would completely make up a medical procedure that didn't exist. StuRat (talk) 21:08, 27 January 2016 (UTC)[reply]
As for flu vaccine, picture running an electrophoresis gel with a mixture of large protein complexes and a small molecule like lidocaine. I'm thinking you wouldn't get one out of the well before the other runs off the bottom. Flu antigen is just not going to go far under the influence of a few plus or minus charges; it's like putting a square sail on a supertanker. Wnt (talk) 18:31, 27 January 2016 (UTC)[reply]
Isn't there a nasal spray flu vaccine ? That implies that it can pass through the skin on the inside of the nose. Is the diff between that skin and regular skin so much that electricity can't overcome it ? StuRat (talk) 21:10, 27 January 2016 (UTC)[reply]
Live attenuated influenza vaccine goes through cells by the usual receptor-mediated process. Even the most delicate mucosa, like the rectum, shouldn't let viruses or other large proteins slip past - HIV actually finds its CD4 receptors on the epithelial cells, as far as I recall. Wnt (talk) 23:15, 27 January 2016 (UTC)[reply]
An obvious limitation is spelled out in the article but is easily missed: the substance needs to be charged. But for chemicals to penetrate where they need to be in cells, often you want them to be neutral. To give kind of a bad example, crack cocaine is a neutral alkaloid, while the cocaine powder is a salt, and clearly the user notices the difference. There are many substances of course which can be charged if the pH is weird enough... but I think that means exposing not only the outside of your skin but the interstices of the cells to the weird pH; otherwise the stuff could end up stuck somewhere in the outer layers of skin with no prospect of moving further. That said, testing my idea, I found [19] which says that lidocaine and fentanyl have been delivered by this route. Fentanyl has a strongest basic pKa of 8.77 [20] so apparently this is not insurmountable. That reference also says it has been used on pilocarpine and ... tap water??? Reffed to this, which says the mechanism is not completely understood (!) but I don't expect to have access. Well, this is biology, a field that is under no obligation to make sense, since the cells can react however they want to an applied current. I should look further... Wnt (talk) 18:25, 27 January 2016 (UTC)[reply]
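The "needs to be charged" point above can be quantified with the Henderson-Hasselbalch equation. Taking the fentanyl pKa of 8.77 cited above and assuming physiological pH 7.4, most of the drug turns out to be ionized, which is consistent with its successful delivery by this route. The calculation is my own sketch, not from the cited sources:

```python
# Henderson-Hasselbalch for a base B + H+ <-> BH+:
# [BH+]/[B] = 10^(pKa - pH), so fraction ionized = ratio / (1 + ratio).
def fraction_ionized_base(pka, ph):
    ratio = 10 ** (pka - ph)
    return ratio / (1 + ratio)

print(fraction_ionized_base(8.77, 7.4))  # ~0.96: ~96% of fentanyl protonated at pH 7.4
```

So even well short of "weird" pH, a base with a pKa a point or so above 7.4 is mostly charged, which helps explain why fentanyl is workable for iontophoresis while neutral compounds are not.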
Yes, that tap water mention in our article shocked me, too. Is it really safe to inject that, or get it under your skin by any other mechanism ? (I realize some is absorbed through the skin when you take a bath, but even that can cause cell damage given enough time.) StuRat (talk) 21:00, 27 January 2016 (UTC)[reply]
I would call attention again to the question of cost (which SemanticMantis did bring up). I'm pretty sure an iontophoresis machine costs more than a needle and syringe. And even though the main machine is probably reusable, I imagine the part applied to the skin needs to be single-use for hygiene reasons. If the issue is simply the patient disliking injections, there are probably cheaper measures, like applying topical anesthetic before the injection. There's also been increasing attention given to intradermal injections, which require a much smaller needle and thus reduce discomfort. --71.119.131.184 (talk) 05:01, 28 January 2016 (UTC)[reply]
There are lots of issues with injections, beyond discomfort. It is an injury after all, and repeated injuries to the same area cause cumulative damage. They somewhat reduce this problem by changing injection sites, but for people who need constant injections, it's still an issue. StuRat (talk) 05:44, 28 January 2016 (UTC)[reply]

As for it being slower than an injection, that could actually be an advantage. My Dad had iron injections, and there was apparently a problem with too much iron in too small of an area, causing severe cramps. If that could be done more slowly, hopefully the iron would have time to distribute more evenly. They could also do this with a slow IV drip, but then you have the issue of that excess fluid. StuRat (talk) 05:48, 28 January 2016 (UTC)[reply]

Why do humans around the world cover the genitals?
Depending on the culture, humans may or may not cover the breasts or the nipples. However, across most cultures, it seems that humans cover the genitals. Is this a universal human trait? Are there human societies that don't cover the genitals? I remember watching a film adaptation of Romeo and Juliet, and the setting looked as if it took place during the Italian renaissance. The men in the motion picture dressed themselves in long pants that really highlighted their genitals. But they still wore clothing that covered them. 140.254.229.129 (talk) 18:29, 27 January 2016 (UTC)[reply]

We have many good articles that relate to this issue. See modesty, nudity, nudism, taboo, as well ass public morality and mores for starters. No, the trait of hiding one's genitals is not completely universal among humans. If you look through the articles above, you'll see there are exceptions in various places/times/cultures. Nature vs. nurture and enculturation may also be worth looking in to. SemanticMantis (talk) 18:58, 27 January 2016 (UTC)[reply]
"ass public morality" isn't covered in the linked article. Unless you count the links to regulation of sexual matters, prostitution and homosexuality and other articles which may cover ass public morality. Nil Einne (talk) 19:01, 27 January 2016 (UTC)[reply]
Codpiece. Sagittarian Milky Way (talk) 20:10, 27 January 2016 (UTC)[reply]
Merkin too if we're listing such things. SemanticMantis (talk) 20:56, 27 January 2016 (UTC)[reply]
If merkins are outerwear in a Romeo and Juliet film it isn't historically accurate. Sagittarian Milky Way (talk) 22:44, 27 January 2016 (UTC)[reply]
Simplifiable: If merkins are outerwear in a Romeo and Juliet film it isn't historically accurate.  ;-). --Stephan Schulz (talk) 14:05, 28 January 2016 (UTC)[reply]
Aside from moral/sexual issues, there are also practical reasons:
1) Hygiene. Do you really want to sit in a chair after a woman menstruated on it or a man's penile discharge dripped on it ? Of course, exposed anal areas are even more of a hygiene problem, but it's hard to cover one without the other (especially in the case of women).
2) Safety. An exposed penis is a good target for dogs or angry people, as are testicles.
3) Cold. Unless you happen to live in a tropical area, it's likely too cold for exposed genitals a good portion of the year. StuRat (talk) 21:06, 27 January 2016 (UTC)[reply]
On StuRat's #3, note that the Yaghan people of Tierra del Fuego were one of the better-known <insert politically-correct word for "tribes" here> who didn't wear clothes, despite the maximum temperature in the summer in that part of the world being only about 10 C. Tevildo (talk) 22:17, 27 January 2016 (UTC)[reply]
Why did they do that? I've heard that's why it's called Tierra del Fuego (they just stood around fires their whole lives). Sagittarian Milky Way (talk) 22:48, 27 January 2016 (UTC)[reply]
Nay, the article says that they didn't spend *all* their time around the fires ... to the contrary, the women went diving in very cold ocean waters for shellfish. I have no idea, but I wonder if their lifestyle helped with cold adaptation, so these dives wouldn't be fatal?? Wnt (talk) 01:25, 28 January 2016 (UTC)[reply]
The assumption in the question is false. In Ecuador, in some tribes the men tie the penis to a string around the waist. Supposedly this is to keep fish from swimming up it when they bathe in the river. National Geographic in past decades always had photos of naked natives. Edison (talk) 04:46, 28 January 2016 (UTC)[reply]
The candiru is a real thing... though it seems a rare accident to us, I imagine that people out doing survivalism daily have more exposure ... and it sure makes a big impression. Wnt (talk) --- hmmm, reading our article I just linked I'm not so sure it's a real thing. Wnt (talk) 16:18, 28 January 2016 (UTC)[reply]
One idea I've heard: When other primates stand face to face, their naughty bits typically are not readily visible; but we walk around in what amounts to a display posture, which our relic instincts can see as a challenge or invitation (depending on the parties' sexes), creating unnecessary social awkwardness. —Tamfang (talk) 09:27, 28 January 2016 (UTC)[reply]
I hadn't thought of that. Are there any animals whose eyes and genitals or pubes are both visible from in front, rather than from behind or from ground level? How close to humans do they get (evolutionarily)? Sagittarian Milky Way (talk) 12:07, 28 January 2016 (UTC)[reply]
Not really. Humans are fairly unusual among mammals in that our mostly hairless nature and bipedalism make our genitals quite visible from the front. But many primates have far more visible genital areas, related to estrus signalling (just google /[primate of choice] estrus signal/); baboons are particularly notable. But humans don't telegraph ovulation, see links below. SemanticMantis (talk) 15:13, 28 January 2016 (UTC)[reply]
Maybe because humanoids are sexy all the time it became impractical for most (but not all) societies to continue the nakedness. Other primates might be able to get away with very obvious estrus because losing all attraction for females without the rare red buttocks was rewarded by Darwin. (The promiscuous bonobo seems to have gotten around this by becoming so jaded by sex that they go back to eating etc. in seconds: if a bonobo loses patience and smacks an annoying child, the mother will retaliate, and then they will have sex for like 3 seconds and all is forgiven.) Sagittarian Milky Way (talk) 13:35, 29 January 2016 (UTC)[reply]
Yes, this is the sort of thing Desmond Morris gets in to. Also related to how, even naked, female human genitals are reduced and less visible, compared to many primate analogs. Fair warning though, many current scientists see such views of sociobiology as largely Just-so stories, though there is some slightly more rigorous contemporary research along these lines. Some slightly related info at Concealed_ovulation#Concealed_ovulation_as_a_side_effect_of_bipedalism. Here's a popular article about visibility of human genitals compared to other primates, [21], and the related scholarly article is here [22]. SemanticMantis (talk) 15:13, 28 January 2016 (UTC)[reply]
One of my personal favorites is the traditional penis gourd (koteka) of Papua New Guinea. Natives traditionally wore a dried gourd over the penis (and tied to the scrotum and waist) and nothing else, which leaves even the testicles visible. Natives who lost their gourd for whatever reason would nonetheless hide themselves out of an apparent sense of modesty, even though the difference in what was visible with and without a penis gourd would be very small. Dragons flight (talk) 12:29, 28 January 2016 (UTC)[reply]
I don't know if you intended to generalize about PNG, but if you did, don't, and if you didn't, this is for other readers: New Guinea is so mountainous, and its tribes so often isolated from each other, that it has a surprisingly big fraction of the world's living languages; any assertion that's true about one NG tribe is likely to be false about another. —Tamfang (talk) 12:16, 29 January 2016 (UTC)[reply]

Just my own observation, but humans are over-fascinated with sex far beyond any other species. Culturally, we pay far more attention to sexuality than nearly any species. Any nature TV show on animals always focuses on mating and mating habits. Humans also seem to be unique (or at least a rarity) in that the female, even though she is rate-limited in reproduction, puts a lot of effort into being visually appealing (i.e. see cosmetics industry, supermodels for clothing, shampoo, etc, etc) even though males aren't particularly choosy in who they will copulate with (birds are rate-limited too, but it's the male that generally works on appearance). Humans, it seems to me, are very close to bonobo primates with a non-stop emphasis on sexual activity (heck, just read section 4.3 to see the over-sexed species called humans writing about an over-sexed chimpanzee - WP even throws in a Great Ape face-to-face sexual encounter gratis). Clothing seems like it is used to control that emphasis, especially concealing arousal but also causing it. I just finished watching a show on dogs/wolves. Since it was made by humans, a large segment was devoted to mating, dominance, hierarchy and even showing sneaky mating by non-alpha wolves. Then I looked at my dog and it's obvious they don't give a shiat about human mating, but they are very interested in what we eat. I imagine documentaries made by dogs would be 40% food, 40% butts and 10% on mating behavior regardless of what animal. People seem to make documentaries that are 80% on mating behavior and 20% on how mating behavior affects them. The emphasis on sex and sexuality seems wired in to how humans view the world. If dogs wrote wikipedia, section 4.3 of the bonobos article would be what they taste like. --DHeyward (talk) 07:23, 29 January 2016 (UTC)[reply]

Scientific description of Anelasmocephalus hadzii (Martens, 1978)
Hello! Anelasmocephalus hadzii is a species of harvestman, described by someone called Martens (don't ask) in 1978, but in what paper did Martens describe this harvestman? I cannot find the answer in any obvious sources, but maybe you will have more luck? Megaraptor12345 (talk) 21:50, 27 January 2016 (UTC)[reply]

Google scholar finds 7 relevant records from Martens in 1978 [23], but I think there are only two publications, and the rest are spurious bibliographical records of the one rather famous work Spinnentiere, Arachnida: Weberknechte, Opiliones. People have been citing it as recently as the last few months (presumably some as a species authority), but I don't read any of the languages of the most recent citing works listed here [24].
This [25] Opiliones wiki says the book is great and describes many European species, and has a photo of the title page, but says the book is hard to find (surprise). Anyway, it seems very likely that the species is described in the book published by Fischer Verlag, Jena, 1978. Either that, or Martens published a paper describing a harvestman species in 1978 without using the word "Opiliones" (unlikely), or Google doesn't know about it (possible, but still unlikely IMO). This is all just subjective evidence of course; if you need to be sure, I think you'll need to get a hard copy and someone who reads German. I'd imagine most research libraries could get you a copy through interlibrary loan. SemanticMantis (talk) 22:35, 27 January 2016 (UTC)[reply]