Wikipedia:Reference desk/Archives/Science/2016 February 8

Science desk
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


February 8

marshy gas from mines

During mining, marsh gas evolves. Why does this happen? Please give the scientific reason. — Preceding unsigned comment added by Shahjad ansari (talkcontribs) 02:23, 8 February 2016 (UTC)[reply]

See Methane#Occurrence. AllBestFaith (talk) 10:55, 8 February 2016 (UTC)[reply]
See also firedamp. The methane is produced as coal is heated (due to progressive burial) and some of it is retained in the rock when the coal becomes uplifted sufficiently to mine, where it can be a problem. Mikenorton (talk) 21:42, 8 February 2016 (UTC)[reply]

Formula for lens

Give the formula (equation) for a lens in which one longitudinal part is at refractive index n1, the second part is at refractive index n3, and the lens itself has refractive index n2. — Preceding unsigned comment added by Shahjad ansari (talkcontribs) 02:32, 8 February 2016 (UTC)[reply]

Sorry, we don't do your homework for you. Check the articles Refraction and Lens (optics) for the info you need. 2601:646:8E01:9089:14B5:216D:30B1:F92 (talk) 10:35, 8 February 2016 (UTC)[reply]

Possible to change taste buds in adulthood?

I'm 20 and hate the taste of vegetables unless they've been thoroughly cooked and/or mixed with other flavours. Could I change that, and if so, is there a known method? 2.103.13.244 (talk) 02:54, 8 February 2016 (UTC)[reply]

Apparently it's in your genes. Googling "why some people vegetables" throws up some interesting links, including this one which suggests you need "bitter blockers".--Shantavira|feed me 11:08, 8 February 2016 (UTC)[reply]
Technically that's a medical diagnosis, and we aren't supposed to do that. It's certainly possible that there would be some other mechanism in this case besides genetics, which is almost never 100%. Wnt (talk) 12:41, 8 February 2016 (UTC)[reply]
Technically, that isn't a medical diagnosis, it's a biology reference. See User:Kainaw/Kainaw's criterion. Unless we're telling someone that a) they have a disease or b) what the disease is likely to do to them personally or c) how to treat their diseases, there is no problem with providing answers about human biology. --Jayron32 15:09, 8 February 2016 (UTC)[reply]

"Apparently it's in your genes" diagnosis "this one which suggests you need "bitter blockers" treatment. μηδείς (talk) 18:55, 8 February 2016 (UTC)[reply]

I think you're a bit too keen to be jumping on the 'medical advice' bandwagon. This isn't a question about a medical complaint; pointing out that it's genetic is not a diagnosis, and offering links for the OP to follow up is not prescribing treatment. Mike Dhu (talk) 10:12, 9 February 2016 (UTC)[reply]
Have a look at our long, detailed, and well-referenced article taste. It's complicated, and involves taste buds, but also psychology, nutritional needs, evolutionary past, culture, childhood development, exposure, etc. etc. Most people I know enjoy some foods at age 40 that they did not at age 20. Here's a selection of articles that discuss aspects of how taste perception can change with age [1] [2] [3]. Here's a freely accessible article that discusses a bit about how children's diet preferences are shaped by the adults around them, and you might find it interesting background reading [4]. We have some references for treatment of [[5]] and also Avoidant/restrictive_food_intake_disorder#For_adults, so I would look at the refs there if I wanted to learn more details about methods for expanding my taste preferences. SemanticMantis (talk) 15:40, 8 February 2016 (UTC)[reply]
My experience is that a lot depends on how the food is cooked. Generally (as our OP mentions), brief cooking retains flavor and long cooking destroys it. Generally, short cooking is what people want because they crave the maximum amount of flavor - but I suppose that if you don't like those flavors then the reverse might be the case. Unfortunately, cooking for too long destroys much of the nutritional benefits of eating vegetables - and also destroys any crunchy, textured sensations and reduces them to an unpleasant mush. Honestly, I'd recommend re-visiting the taste of lightly cooked (or even raw) veggies...and if that's still unpleasant, dump them into some kind of sauce that you like. A chili or curry-based sauce will annihilate the taste of almost anything! Also, it's a horrible generalization to say that you don't like "vegetables" - there are hundreds of different kinds out there - and they don't all taste the same. Gone are the days when your only choice was carrots/broccoli/cabbage/peas/french-beans/corn. Now you can get 'baby' versions of lots of things - there are 50 kinds of beans out there - there are leafy greens of 20 different kinds to choose from - there are things like asparagus (which used to be ruinously expensive - and now isn't), avocado and artichokes to play around with. It would be really surprising if you hated all of them, and even more surprising if you hated all of them no matter how they were prepared. Modern cuisine encourages us to mix weird, contrasting things together - so go ahead and mix jalapeno peppers, a little melted chocolate and peas (yes, really!) - or cook your cabbage in orange juice instead of water (one of my personal favorites!) - or mix nuts and fruit into a green salad. There is no "wrong" answer here.
I grew up in an environment where veggies were low in variety, and invariably over-cooked. When I married my first wife (who is an excellent French cook) - my eyes were opened to the incredible array of better options out there. SteveBaker (talk) 17:24, 8 February 2016 (UTC)[reply]
My experience changing what I drink may be helpful. In my 20s I drank Mountain Dew (a high-sugar soft drink). Then I switched to herbal tea, but needed lots of sugar in it to make it palatable. I then gradually reduced the amount of sugar, and now I don't need any. So, I suggest you initially mix just a bit of veggies with something you like, then gradually change the ratio until it's mostly veggies. StuRat (talk) 17:30, 8 February 2016 (UTC)[reply]
Incidentally, I notice that our OP recently asked a question about eating fruit that suggests that (s)he doesn't eat that either. That's a more worrying thing. SteveBaker (talk) 17:41, 8 February 2016 (UTC)[reply]
I think Mouthfeel is something you may want to look at, along with food neophobia, and there's also ARFID (avoidant/restrictive food intake disorder), an escalated version of picky eating. It's interesting that SteveBaker mentions the texture of food. I wouldn't touch vegetables until my early 30s, even though I had a girlfriend who worked as a chef at The Savoy in London (I'm sure your wife is much better, Steve!). I disliked the "flavor" of foods from my childhood until my early 20s, and retrospectively I think it was more the texture I didn't like. Mike Dhu (talk) 17:09, 9 February 2016 (UTC)[reply]
The thing with texture is that you can play around with it to an amazing degree. Consider just the potato. You can have creamy mashed potato, mashed potato with deliberate chunks of potato and/or skin in it, you can have french fries, boiled potatoes (with and without skin) and also roasted and baked potato. You can do hash-browns or fry crispy potato skins - or you can make potato chips. That's a MASSIVE variation in texture and crunch with just one vegetable being involved. With creativity, you can do similar transformations with other veggies too. If you don't like (say) peas - rather than just having warm round things - you can cook them, mash them, form them into patties, then fry them ("Peaburgers"!) - or you can blend them into a smoothie or a soup - there are lots of options if you're prepared to be creative and are open to trying new techniques. SteveBaker (talk) 17:27, 9 February 2016 (UTC)[reply]
I totally agree with your points re the texture of food, but my point to the OP was that the texture and the flavor of food may be interlinked. I like the taste of creamy mashed potato (not a vegetable of course), but lumpy mashed potato is something I can't eat; I find the lumps in it unpalatable, not because of the taste per se, but because I don't like the texture of it. Mike Dhu (talk) 19:19, 9 February 2016 (UTC)[reply]
Yeah - you probably don't want to go there. What is a "vegetable" and what isn't is a topic of frequent and prolonged debate around here. Bottom line is that there is a strict scientific definition, a strict culinary definition and a whole messy heap of what-people-think-a-vegetable-is. From the lede of Vegetable:
"In everyday usage, a vegetable is any part of a plant that is consumed by humans as food as part of a savory meal. The term "vegetable" is somewhat arbitrary, and largely defined through culinary and cultural tradition. It normally excludes other food derived from plants such as fruits, nuts and cereal grains, but includes seeds such as pulses. The original meaning of the word vegetable, still used in biology, was to describe all types of plant, as in the terms "vegetable kingdom" and "vegetable matter"."
So...um...I claim victory. A potato is a vegetable. <ducks and runs> SteveBaker (talk) 20:57, 9 February 2016 (UTC)[reply]
I can see how that could lead to a very lengthy discussion, and in my mind I always thought of potatoes as a vegetable, in the same way that I think of poultry and fish as meat (although I've just looked at the meat article and see the same situation applies). Anyway, good job you ducked (bad pun, I know!) Mike Dhu (talk) 11:08, 10 February 2016 (UTC)[reply]

Falling from a building

If someone fell from the fifth floor of a building, would they die or just be badly hurt? 2607:FB90:1225:2047:A4E6:5421:24F2:7B82 (talk) 03:49, 8 February 2016 (UTC)[reply]

It depends how they land and what they land on. ←Baseball Bugs What's up, Doc? carrots 03:59, 8 February 2016 (UTC)[reply]
If they land on concrete? 2607:FB90:1225:2047:A4E6:5421:24F2:7B82 (talk) 04:12, 8 February 2016 (UTC)[reply]
Then it depends on how they land. But their odds are not good. Here is someone's idea for a strategy. ←Baseball Bugs What's up, Doc? carrots 04:16, 8 February 2016 (UTC)[reply]
It would be far better to land on a Life net. That's a little article I wrote a few years ago. Cullen328 Let's discuss it 04:20, 8 February 2016 (UTC)[reply]
Obviously. But the OP specified concrete. ←Baseball Bugs What's up, Doc? carrots 05:02, 8 February 2016 (UTC)[reply]
On page 17 of this OSHA document [6], figure 6 shows the distribution of workplace fatalities as a function of the number of feet fallen. From that, you can see that a small number of people died after falls of less than six feet - and most people in the workplace who die after falling fell less than 40 feet...which is less than 5 floors. So for sure, lots of people die every year from falls of considerably less height than the 5th floor.
A few other sources I checked suggest that the risk of death starts to go up sharply for falls of around 8 to 10 meters - with about a 50/50 chance of dying if you fall from 15 meters and a near certainty of dying at around 25 meters. A typical building floor height is about 3.5 meters - so 5 floors would be 17.5 meters - and that's about a 75% chance of death. But there really is no 'safe' fall height. People trip and fall and whack their heads against something as they reach ground level and die as a result - so even a fall from zero height can be fatal.
CONCLUSION: If you fall from the 5th floor - you have roughly a 3 in 4 chance of dying - there is no 'safe' distance.
SteveBaker (talk) 04:59, 8 February 2016 (UTC)[reply]
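For concreteness, here is a minimal sketch of that estimate - not taken from the OSHA data, just straight-line interpolation between the rough anchor points quoted above (negligible risk below about 8 m, 50% at 15 m, near-certainty at 25 m; both the anchor values and the linearity are assumptions):
<syntaxhighlight lang="python">
# Back-of-the-envelope fall-lethality estimate by linear interpolation
# between the rough anchor points quoted above (assumed, not measured data).
import numpy as np

heights_m = [0.0, 8.0, 15.0, 25.0]   # fall height in metres
p_fatal   = [0.0, 0.0, 0.5, 1.0]     # rough probability of death

def death_probability(height_m: float) -> float:
    """Linearly interpolate the probability of death for a given fall height."""
    return float(np.interp(height_m, heights_m, p_fatal))

floor_height_m = 3.5                           # typical storey height quoted above
print(death_probability(5 * floor_height_m))   # 5 floors = 17.5 m -> 0.625
</syntaxhighlight>
Straight-line interpolation gives about 63% at 17.5 m rather than the 75% quoted; the real height-lethality curve is not linear, so the two ballpark figures are not in conflict.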
Would it be a quick death or a long and agonizing one? 2607:FB90:1225:2047:A4E6:5421:24F2:7B82 (talk) 15:13, 8 February 2016 (UTC)[reply]
I don't see any data on that. One would presume that a head-first impact would be quick - and feet-first much less so - but it's very hard to say, and as skydivers soon discover, bodies rotate during free-fall in ways that can be hard to control. I wouldn't want to make any bets on that one. SteveBaker (talk) 17:07, 8 February 2016 (UTC)[reply]
Quick, call the Mythbusters before they're cancelled! FrameDrag (talk) 20:48, 8 February 2016 (UTC)[reply]

Is it best for a man/woman to see a male/female psychiatrist respectively?

Just curious if it's generally best for a man to see a male or female psychiatrist and for a woman to see a male or female psychiatrist, or if there's no recommendation in the psychology community. 2.103.13.244 (talk) 05:22, 8 February 2016 (UTC)[reply]

Most psychiatrists base their treatment on pills. I hardly see how the gender of the person who prescribes you pills could matter. Psychiatrists are also not necessarily part of the psychology community; they could be psychotherapists too, but primarily they are physicians. I suppose you want to know whether the gender of psychologists, psychotherapists, counselors and the like matters.
In practice, psychiatrists are mostly male, and the psychology community is mostly female. That reduces your chances of picking a specific gender. Anyway, the role of gender in the quality of psychotherapy seems to be negligible, in the same way that you don't need a therapist of the same age, religion, or race as you. I can see that it could even be an advantage to have a certain distance from your therapist, since the two of you are not supposed to enter into a private relationship. --Llaanngg (talk) 11:35, 8 February 2016 (UTC)[reply]
[citation needed] for a lot of this, perhaps most importantly on the first sentences of each paragraph. SemanticMantis (talk) 15:30, 8 February 2016 (UTC)[reply]
SemanticMantis, here they are:
[7] "Like many of the nation’s 48,000 psychiatrists, Dr. Levin, in large part because of changes in how much insurance will pay, no longer provides talk therapy, the form of psychiatry popularized by Sigmund Freud that dominated the profession for decades. Instead, he prescribes medication, usually after a brief consultation with each patient"
[8] "Psychiatry, the one male-dominated area of the mental health profession, has increasingly turned to drug treatments."
[9]: The changing gender composition of psychology.
And [10] Need Therapy? A Good Man Is Hard to Find. "He decided to seek out a male therapist instead, and found that there were few of them."
I do admit though that the effect of gender matching with your therapist (or not) is debatable; the debate is still open. I suppose it comes down to the patient's world-view. If it's important for the patient, then it probably can influence the outcome. The same probably applies to ethnicity. --Llaanngg (talk) 09:56, 9 February 2016 (UTC)[reply]
[11]"As Carey's timely article notes, there is nothing in the rather limited mainstream scientific literature on gender and treatment outcome suggesting unequivocally that either males or females make better, more effective psychotherapists."
[12] "a female therapist genuinely is able to help a male client as well as a female client, and a male therapist is truly able to help a male client as well as a female client, the fact is that if a client comes in with a pre-conceived notion about the therapist based on gender, it has the potential to affect treatment if not addressed."
--Llaanngg (talk) 09:56, 9 February 2016 (UTC)[reply]
User:Llaanngg, thank you. Your claims sounded reasonable, but this is, after all, a reference desk :) SemanticMantis (talk) 14:51, 9 February 2016 (UTC)[reply]
For some people, maybe. A psychiatrist is indeed different from a psychologist, but gender match in medical and therapeutic professions can indeed be a factor in outcomes. Here is a study that specifically looks at effects of gender matching in adolescents [13]. That one is freely accessible; these two studies [14] [15] are not, but they also discuss gender matching in therapeutic contexts. Note that all three also discuss matching of ethnicities as a potentially important factor. SemanticMantis (talk) 15:30, 8 February 2016 (UTC)[reply]

Having been treated by half a dozen psychiatrists and therapists, I will say that the race/culture, age and gender of your treatment providers definitely matters in some cases, even for "pill prescribers" because your story may sound different to different doctors. For example, I've been routinely noted to have "poor eye contact" and be diagnosed with borderline personality disorder and bipolar disorder by old white men, but younger psychiatrists are more up to date on neuroscience research and my female psychiatrists (including a South Asian) tend to agree with post-traumatic stress disorder or complex PTSD. Also Asian treatment providers definitely get cross-cultural struggles and Asian cultural values like conflict aversion, whereas white providers often don't, frequently chalking it up to some personality defect or saying that you're "non-assertive". Yanping Nora Soong (talk) 16:06, 8 February 2016 (UTC)[reply]

I'd say that if it's important for you as a patient, then it is important for the outcome. However, I don't believe it is a general factor per se. Llaanngg (talk) 09:56, 9 February 2016 (UTC)[reply]
That's easy for one to say if one isn't queer (LGBT), a person of color, etc. Yanping Nora Soong (talk) 15:19, 15 February 2016 (UTC)[reply]

cramps or a "charley horse" after orgasm

My girlfriend often has serious cramps (or a charley horse) after she has an orgasm. The cramp is usually in her lower left calf. This is not a medical question. I am just curious how an orgasm and a cramp in the lower leg can be connected (given the very different muscles involved). 147.194.17.249 (talk) 05:41, 8 February 2016 (UTC)[reply]

For bemused readers.... Charley horse. Ghmyrtle (talk) 08:49, 8 February 2016 (UTC)[reply]
Orgasm often involves muscular contractions not just in the groin area, but throughout the body -- so in some cases, different muscles can cramp after orgasm. (I know first-hand, I've pulled a leg muscle once or twice during sex.) FWIW 2601:646:8E01:9089:14B5:216D:30B1:F92 (talk) 08:42, 8 February 2016 (UTC)[reply]
Differ love and porn! Porn can be violent. In some cultures sex is a secret and porn is the only “manual” and not a good advice at all. We have wikipedia and it sould give some more reliable information. The next step is You to care what You are doing. But some human are very fragile. When the charley horse is always on the same place You can find the reason. --Hans Haase (有问题吗) 11:37, 8 February 2016 (UTC)

Does Hans Haase 有问题吗's post above make sense to anyone? In this case, and in previous cases too, I am unable to even guess what he's trying to say. --Llaanngg (talk) 11:45, 8 February 2016 (UTC)[reply]
Yes, I get the basic gist of it, and I usually can with Hans' posts. Then again, I have lots of experience reading and listening to ESL. Respectfully, this is not the best place for such comments and discussion. SemanticMantis (talk) 15:19, 8 February 2016 (UTC)[reply]
Our articles on this are really, really bad. Charley horse confounds multiple conditions and multiple colloquial terms until there's no telling what is what. Cramp does virtually the same - it is hard for me to accept that the usual sort of "charley horse" has anything to do with failure of ATP to loosen muscles, since generally it is a sudden onset of a muscle contraction. We'll have to look this one up from scratch... after which, we might want to rewrite those articles quite nearly from scratch. Wnt (talk) 12:06, 8 February 2016 (UTC)[reply]
I should share the first good reference I found at [16] (I just did a PubMed search for leg cramp and this was one of the first things). Apparently there is a treatment for leg cramps: it involves injecting 5 ml of 1% lidocaine into the "bifurcation of the branches that is located in the distal two-thirds of the interspace between the first and second metatarsals" - this is a nerve block of "the medial branch, which is the distal sensory nerve of the deep peroneal nerve". The site is on the inside of the base of the big toe. The effect was to reduce cramps by 75% over a two-week study period. As part of their discussion they say

The mechanism(s) of leg cramps are yet to be clarified, but disturbances in the central and peripheral nervous system and skeletal muscle could be involved (Jansen et al. 1990; Jansen et al. 1999; Miller and Layzer 2005). Electrophysiologically, cramps are characterized by repetitive firing of motor unit action potentials at rates of up to 150 per sec. This is more than four times the usual rate in maximum voluntary contraction (Bellemare et al. 1983; Jansen et al. 1990). In a human study, Ross and Thomas indicated a positive-feedback loop between peripheral afferents and alpha motor neurons, and that this loop is mediated by changes in presynaptic input. This loop is considered a possible mechanism underlying the generation of muscle cramps (Ross and Thomas 1995). The frequency of nocturnal leg cramps has also been suggested to result from changes in hydrostatic pressure and ionic shift across the cell membrane in the calf muscles in the recumbent position, inducing hyperexcitability of the motor neurons. Consequently, the pain of the cramps may be caused by an accumulation of metabolites and focal ischemia (Miller and Layzer 2005). The difference in these conditions in each patient may explain the diverse symptomatology of the cramps.

So the thing I'm thinking of is possibly, not certainly, related to some kind of feedback, possibly via the spine only, between sensation of what the body part is doing and a motor response. It seems easy to picture how infrequent activities might somehow jiggle such a sensitive mechanism. Honestly, because this is a regulated phenomenon with different characteristics than usual contraction, I'm not even entirely sure it is pathological - for all I know, the body might be administering it as some sort of health intervention on itself. Note that I definitely cannot and will not diagnose the woman involved here - there are a thousand things she could be experiencing that aren't what I have in mind. Wnt (talk) 12:25, 8 February 2016 (UTC)[reply]

Have the OP and his girlfriend tried different positions? Seriously: I myself often used to (and still occasionally do) get leg cramps when sitting on a hard chair for extended periods – this first arose during long services in a cramped (heh!) school chapel – but avoiding such a position makes them much rarer. It may be that different postures during the act might change the forces on the relevant muscles sufficiently to lessen the problem. {The poster formerly known as 87.81.230.195} 185.74.232.130 (talk) 15:19, 8 February 2016 (UTC)[reply]

Jump cushion

Are jump cushions ever used in firefighting in lieu of life nets? If so, how effective are they? Do they even actually exist, given that they're not on Wikipedia? 2601:646:8E01:9089:14B5:216D:30B1:F92 (talk) 10:31, 8 February 2016 (UTC)[reply]

See [17]. Quoted maximum jump height is 40m. AllBestFaith (talk) 10:49, 8 February 2016 (UTC)[reply]
Thanks! 2601:646:8E01:9089:14B5:216D:30B1:F92 (talk) 05:57, 9 February 2016 (UTC)[reply]

How many defecators?

Is it possible to come up with a reasonable estimate of how many humans are defecating at any given moment? -- Jack of Oz [pleasantries] 11:56, 8 February 2016 (UTC)[reply]

If I were to pull a number out of my ass...50 million. Make a ballpark assumption the average human spends 10 minutes a day pooping, seven billion humans, and there you go. Should be within an order of magnitude of reality. Someguy1221 (talk) 11:59, 8 February 2016 (UTC)[reply]
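Spelling out Someguy1221's ballpark arithmetic (both inputs - 10 minutes per person per day and seven billion people - are the assumptions stated above):
<math>7\times 10^{9}\ \text{people} \times \frac{10\ \text{min}}{1440\ \text{min/day}} \approx 4.9\times 10^{7} \approx 50\ \text{million at any given moment}</math>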
Thanks for the key, Someguy (I've been away). (I quibble with the assumption of 10 minutes per day per person, but I can adjust the calculation.) -- Jack of Oz [pleasantries] 09:37, 14 February 2016 (UTC)[reply]
Given that there are certain times when defecation is more likely (when you get up in the morning, and perhaps also before bed in the evening), the number doing it at any given time may depend on the population density of the time zones matching those times of day. First thing in the morning in China is likely to see a lot more poopers than the similar time in the mid-Pacific. — Preceding unsigned comment added by 81.131.178.47 (talk) 14:37, 8 February 2016 (UTC)[reply]
Today's SMBC comic [18] is highly relevant to this question [19] . SemanticMantis (talk) 18:29, 8 February 2016 (UTC)[reply]
Which of those two links should I follow? —Tamfang (talk) 08:10, 10 February 2016 (UTC)[reply]

Perspective machines

What's a perspective machine, or in particular, a railroad perspective machine? The main source for Nester House (Troy, Indiana) says "The building's 1863 design is attributed to J. J. Bengle, the inventor of the railroad perspective machine." Google returns no relevant results for <perspective machine>, and the sole result for <"railroad perspective machine"> is this main source. Nyttend (talk) 15:46, 8 February 2016 (UTC)[reply]

I haven't the foggiest but my guess would be that he invented a machine that helped with making accurate perspective drawings. Architectural drawings showing a building from an angle are normally axonometric projections where parallel lines stay parallel rather than using perspective. A nice perspective drawing helps with selling a design to a client. Dmcq (talk) 16:20, 8 February 2016 (UTC)[reply]
Just had a look around, and a machine like the one I was thinking of, the 'perspectograph plotter', was made in 1752 by Johann Heinrich Lambert, see [20], which is before that man's time. So it was either something else or a refinement of it. Dmcq (talk) 16:39, 8 February 2016 (UTC)[reply]
There are several kinds of quasi-realistic perspective - "single point" and "two point" being the most commonly mentioned. I wonder whether the term "railroad perspective" might refer to single-point perspective - alluding to the way that two parallel railroad rails seem to meet at the horizon. This is just a guess though...take it with a pinch of salt! SteveBaker (talk) 17:04, 8 February 2016 (UTC)[reply]
Yes, long parallel straight lines are relatively rare in nature, and in that time frame railroad rails would have been an ideal application for a perspective drawing. StuRat (talk) 17:22, 8 February 2016 (UTC)[reply]
My thoughts exactly. Thinking about a railroad "perspective-machine" didn't get me very far - but thinking in terms of a "railroad-perspective" machine definitely makes me suspect that we're thinking in terms of a single-point projection. Our article on Perspective mentions the word "railroad" three times when discussing this - so I'm starting to believe that this must be what's meant here. SteveBaker (talk) 17:31, 8 February 2016 (UTC)[reply]
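For readers unfamiliar with the term: single-point perspective is just perspective division by depth, which is why parallel rails appear to converge. A minimal Python sketch (the focal length f and the rail coordinates are made-up values, purely for illustration):
<syntaxhighlight lang="python">
# One-point perspective: a 3D point (x, y, z) maps to (f*x/z, f*y/z) on the
# image plane. Two parallel rails converge toward the vanishing point (0, 0).
f = 1.0  # assumed focal length

def project(x: float, y: float, z: float) -> tuple[float, float]:
    """Project a 3D point onto the image plane by perspective division."""
    return (f * x / z, f * y / z)

for z in (2.0, 10.0, 100.0, 1000.0):   # increasing distance down the track
    left = project(-0.7, -1.5, z)      # left rail, eye 1.5 m above the ties
    right = project(0.7, -1.5, z)      # right rail
    print(f"z={z:7.1f}  left={left}  right={right}")
# As z grows, both rails' projected positions tend to (0, 0): the vanishing point.
</syntaxhighlight>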
Typeset content describing the building in the cited PDF says "railroad perspective machine" and "Bengle", but the hand-written inscription on the drawing of the building says "railway perspective machine" and spells the name "Begle" (no "n" in it). Googling for "railway perspective" finds tons of hits for the same one-point perspective that SteveBaker suspected. I'm not finding anything in Google's patent database for Begle though ("perspective" is a poor search term, since damn near every object patent includes a perspective drawing of it). DMacks (talk) 20:29, 8 February 2016 (UTC)[reply]
This newspaper article confirms that a "J. J. Bengle" lived in Denison, TX in 1907. I don't know how that ties in with any other known dates and places of residence of the architect. The newspaper article does not give any helpful details - "J. J. Bengle has returned from a trip to Galveston and other points." That's it in its entirety, I'm afraid. Tevildo (talk) 21:01, 8 February 2016 (UTC)[reply]
I often wonder how people would feel, knowing that their only mark on modern history is the fact that they once returned from Galveston. :-( SteveBaker (talk) 15:17, 11 February 2016 (UTC)[reply]
When my father was employed by the State Railways many years ago, as an Inspector of Permanent Way, he showed me a device he used which I recall was called a "perspective sight". It was essentially a modified pair of binoculars. It is critical that railway lines be accurately parallel and straight, but they get out of true over time for various reasons. Bad weather (washouts from exceptionally heavy rain) and extremely hot days can cause the lines to buckle. If you look with the naked eye, you cannot see buckling that will derail a speeding train. Binoculars foreshorten perspective, so if you stand between the two railway lines and look along the track with binoculars, you see the distance reduced, and because of the binoculars' magnification, any buckling becomes easily visible. The binoculars the Railway supplied (the "perspective sight") had an adjustable pair of lines that converge on a point (the vanishing point). You adjusted the lines so that they aligned with the railway lines - giving a minor advantage in seeing any buckling. There were horizontal calibration marks (which have non-linear spacing due to viewing height & perspective) so that the inspector could say to the maintenance crew things like "go forward 320 metres and straighten there." They had a special instrumented carriage for detecting rail misalignment, but the binoculars facilitated a quick response to any problem due to extreme weather, regardless of where the instrument carriage was. 1.122.229.42 (talk) 00:53, 9 February 2016 (UTC)[reply]
That might explain why there was little concern about curved tracks...L-O-N-G stretches of dead straight train tracks there. SteveBaker (talk) 20:52, 9 February 2016 (UTC)[reply]
The South Australian Railways actually. And I'm not within 1000 km of Perth. The poster previously at 1.122.229.42. 58.167.227.199 (talk) 03:11, 11 February 2016 (UTC)[reply]
Nothing quite as long and straight as the Trans-Australian Railway I'd guess though, the curvature of the earth probably matters more there! Dmcq (talk) 16:26, 11 February 2016 (UTC)[reply]
Excellent info ! StuRat (talk) 00:58, 9 February 2016 (UTC)[reply]
Wow! That's a typically ingenious invention for the era. Sadly, these days a couple of visible light laser beams would make a much simpler and more efficient solution. I wonder how they coped with warping around curves and across varying slope though. SteveBaker (talk) 03:38, 9 February 2016 (UTC)[reply]
"Sadly"? What an odd perspective to find a simpler and more efficient solution to be sad. (No insult intended, just an observation.) Deli nk (talk) 14:20, 9 February 2016 (UTC)[reply]
Sadly - because I love the ingenuity of the binocular approach...while recognizing that using a couple of lasers is probably a more efficient way to do it these days. SteveBaker (talk) 20:52, 9 February 2016 (UTC)[reply]
Evenings and mornings were just what I was going to suggest, when you still have enough light to see the tracks, but not so much as to drown out the laser. That would make the inspector crepuscular. StuRat (talk) 03:31, 11 February 2016 (UTC)[reply]

Technology for the disabled

What is the current status for:

  1. Body part less people.
  2. Blind sighted people. exclude surgery.

Are there any satisfactory mechanisms out there to grant capability?

Apostle (talk) 18:31, 8 February 2016 (UTC)[reply]

Fixed title to be proper English. StuRat (talk) 18:33, 8 February 2016 (UTC) [reply]
-- Apostle (talk) 22:36, 8 February 2016 (UTC)[reply]
1) I assume you mean people missing body parts. See prosthetics.
2) I don't think most causes of blindness can be addressed without surgery, assuming implanting electrodes into the brain is considered to be surgery. I think there was some research on attaching a grid of electrodes (with just tape) on the back, and using those to convey visual images, so that might qualify. StuRat (talk) 18:35, 8 February 2016 (UTC)[reply]
There is an enormous amount of technology for the blind - from talking clocks to software able to scan a printed document and turn it into artificial speech. — Preceding unsigned comment added by 81.131.178.47 (talk) 18:56, 8 February 2016 (UTC)[reply]
Some blind people use a device that helps them to "see" using their tongues [21] [22]. SemanticMantis (talk) 21:16, 8 February 2016 (UTC)[reply]
I'll go through the links... Thank you -- Apostle (talk) 22:36, 8 February 2016 (UTC)[reply]
And about number 2): the BBC was showing a program where a blind woman was seeing through her eyes (black & white), fuzzily. The mechanisms they implanted inside her eyes apparently have to be repaired every 6 months. There was also an electrical box; her brain was probably connected to it... I can't recall properly.
The technology was very depressing, knowing that it's the 21st century (or something). -- Apostle (talk) 22:36, 8 February 2016 (UTC)[reply]
See visual prosthesis for this particular type of device. Tevildo (talk) 23:10, 8 February 2016 (UTC)[reply]
The technology to interface nerve fibers to electronics is extraordinarily difficult. It's not like there is a red wire labelled "Video In" in the interface between eyes and brain - instead there is a large bundle of unlabelled nerves - all different from one person to another. It's not like each nerve is a "pixel" or anything useful like that. Maybe one of them says "There is a high contrast, vertical line, about a quarter the height of the retina that's moving left to right" - figuring out what to say to each nerve from a camera is beyond what we can currently do...we can try to rely on brain plasticity to cope with whatever incorrect data we're sending - but that's how you end up with fuzzy, low-resolution monochrome - and experimental devices that don't survive long-term implantation. Also there are at least a dozen reasons why someone might be blind - and each one needs a separate, and equally difficult solution. This is an exceedingly difficult problem and it may be decades before we have something that truly works and is actually useful to people. SteveBaker (talk) 03:34, 9 February 2016 (UTC)[reply]
The neural plasticity is exactly what they rely on. The brain has an amazing ability to learn, and this includes learning which nerve corresponds to which pixel. And, for people who have been blind all their life, the mapping would never have been defined in the first place, since that happens as a baby, based on visual feedback. As for how to teach the brain quickly, I would suggest hooking up only the corner pixels in the image frame first, then, once they have been learnt, add more pixels, maybe one at a time, until the full grid has been learned. StuRat (talk) 18:44, 9 February 2016 (UTC)[reply]
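Purely as an illustration of the schedule StuRat describes - a hypothetical sketch, not any actual device's protocol; the grid size and the nearest-corner ordering rule are invented:
<syntaxhighlight lang="python">
# Toy calibration order: hook up the four corner pixels first, then introduce
# the remaining pixels gradually, nearest-to-a-corner first.
def calibration_order(width: int, height: int) -> list[tuple[int, int]]:
    corners = [(0, 0), (width - 1, 0), (0, height - 1), (width - 1, height - 1)]
    rest = [(x, y) for y in range(height) for x in range(width)
            if (x, y) not in corners]
    # introduce remaining pixels in order of squared distance to nearest corner
    rest.sort(key=lambda p: min((p[0] - cx) ** 2 + (p[1] - cy) ** 2
                                for cx, cy in corners))
    return corners + rest

print(calibration_order(4, 3)[:6])  # the first six pixels to enable
</syntaxhighlight>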
My mistake. I recall now that it was a gray-black background instead of black, with white/light colour objects that she had to differentiate; that was the only colour she could see. The image via her eyes looked as if you were turning a TV on and off every 3-5 milliseconds or so. She did/might have had a box (unless I'm confusing it with another program).
Thank you all once again. I'll definitely look into it... Regards. -- Apostle (talk) 22:09, 9 February 2016 (UTC)[reply]

StuRat, SemanticMantis, Tevildo, SteveBaker: Just for clarification - Will it ever be possible to create glasses (or any other thing) for the blind people so that they can see, without an operation; given all the above facts still? -- Apostle (talk) 21:09, 11 February 2016 (UTC)[reply]

This isn't a field in which I can confidently speculate, but it might be possible to stimulate the visual cortex with some form of RF or magnetic system fitted to the glasses - see Deep transcranial magnetic stimulation. Whether that will ever be safer than a surgical implant is another matter. Tevildo (talk) 21:40, 11 February 2016 (UTC)[reply]
Thanks Tev, as long as there is a way... I'll definitely look at it thoroughly now, in the near future. Regards -- Apostle (talk) 06:52, 12 February 2016 (UTC)[reply]
In a sense, this already exists - the BrainPort device uses a pair of glasses with a camera - a small electronics box to process the data - and a small paddle that you place on your tongue. The paddle stimulates the tongue (which is really sensitive and has dense nerve-endings). The difficulty is that it takes a lot of time and practice to recognise pictures this way - and it relies on brain plasticity for the user's brain to "see" the image presented by the tongue. But people who stick with it are able to recognize objects and navigate while walking - so they can, in a sense, "see". Similar tricks have been done with arrays of electrodes attached to the skin - or even a grid of pins that mechanically push against the skin.
However, what you're presumably asking for is full color images, fast updates for motion - a wide field of view and so forth - and that seems much harder. The idea of having a camera that stimulates the optic nerve remotely isn't going to be easy. But even if it were possible, we're expecting to re-use parts of the normal visual system and just to replace the retina. Whether that can be done or not depends critically on the REASON the person is blind. Sure, if their retina has stopped functioning, then 'plugging' the video into the cells behind the eye might work - but if the reason for blindness is brain damage or some problem in the optic nerve - then you'd need to find a different route. People who are recently blinded may still have a fully functional visual cortex - but people who were blind at birth may not develop all of that brain structure, so even if the original cause of blindness can be corrected, it may still be hard for them to recover fully. So expecting to find a single device that works for everyone is almost certainly impossible and we'll need a wide range of solutions in order to fix it.
There is also a matter of cost and practicality. A surgical approach may well be cheaper and more effective than non-surgical methods. SteveBaker (talk) 16:30, 14 February 2016 (UTC)[reply]
Yes, thanks for the clarification. Plan cancelled due to some other reason... -- Apostle (talk) 20:35, 14 February 2016 (UTC)[reply]

Accelerating a particle with light

If I accelerate a tiny speck of dust using light, what max speed could it reach? Let's suppose that, hypothetically, we know exactly where this speck of dust is and how to point a laser at it. --Scicurious (talk) 19:22, 8 February 2016 (UTC)[reply]

Theoretically you could accelerate it to almost the speed of light. StuRat (talk) 19:24, 8 February 2016 (UTC)[reply]
Assuming you find a void in space that (with much luck) presents no molecule of gas to hinder the speck's progress, there is still the microwave background radiation defining an approximate cosmic rest frame, which would become blue-shifted as the particle speeds up, even as the light source you use becomes red-shifted - also starlight, of course, which is similarly in a fairly consistent rest frame all around. As a result, if you assume a constant light intensity in a perfectly focused beam, I think there would be a maximum level that you can use at the beginning to avoid vaporizing the particle, which eventually becomes weaker than the oncoming radiation. On the other hand, if you continue to turn up your light source (or increase its frequency) then I suppose the particle might accelerate without limit, coming arbitrarily close to light speed. Unless, of course, I forgot something else... Wnt (talk) 19:52, 8 February 2016 (UTC)[reply]
Isn't this how solar sails work? Nyttend (talk) 21:10, 8 February 2016 (UTC)[reply]
So, you can approach the speed of light as much as you want, but not reach it ever? --Scicurious (talk) 16:15, 10 February 2016 (UTC)[reply]
Yes, for two reasons.
1) Just with conventional Newtonian physics, you could never accelerate one object with mass to match the speed of another by having them hit each other. Even if a star runs into a proton, the mass of the proton + star is now slightly more, meaning the combined speed is slightly less, for momentum to be conserved.
2) Relativity prevents objects with mass from being accelerated to the speed of light, although this is tricky as it depends on precisely how "mass" is defined. See rest mass. StuRat (talk) 21:17, 10 February 2016 (UTC)[reply]
The faster something moves, the heavier it becomes (relativistic mass). The kinetic energy of its motion, as viewed from your rest frame, is a kind of energy, and has mass per E=mc². The more kinetic energy you add to the particle, the more massive it becomes, and the more energy it takes to speed it up. In the extreme case, all of the (relativistic) mass of a photon is energy - you might add more energy to it, but the mass increases in direct proportion, so the speed never changes. I should note that relativistic mass has become unpopular in recent years, but I feel like that's a fad - since ultimately, many kinds of "rest" mass are predicated on the kinetic and potential energy of the constituent particles. Wnt (talk) 16:02, 11 February 2016 (UTC)[reply]
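Putting that description in symbols (standard special relativity, not specific to radiation pressure; m0 is the rest mass and Ek the kinetic energy delivered by the beam):
<math>\gamma = 1 + \frac{E_k}{m_0 c^2}, \qquad v = c\sqrt{1 - \frac{1}{\gamma^2}}</math>
Each additional unit of energy raises γ by the same amount, but the corresponding gain in v shrinks, so v approaches c only in the limit of infinite energy.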

Is it possible that it is just a respect for observation? (I am not waiting for answer) ~~~~Like sushi — Preceding unsigned comment added by 49.135.2.215 (talk) 01:19, 15 February 2016 (UTC)[reply]

Immunity vs resistance

Is there a difference, and if so, what is it? Are they the same but used for different species, or is there a clear but subtle difference? In other words, does "She is immune to the flu" mean the same as "She is resistant to the flu"? What about "This strain is resistant to drug X" and "This strain is immune to drug X"? 140.254.77.216 (talk) 19:51, 8 February 2016 (UTC)[reply]

"Immune" means 100%, unless some qualifier is added like "partially immune". "Resistance" is less than 100%. StuRat (talk) 19:54, 8 February 2016 (UTC)[reply]
The problem here is that you are using a literary definition of immune, StuRat, and that while I agree with you in that way, SemanticMantis and the heretical Wnt much more closely approach the received biological notion. In the school where I got my undergrad biology major (focusing in botany), you had to have four years of chemistry and four years of bio-major "bio" before you could even apply to take Immunology 396. So I would take their comments as read. μηδείς (talk) 02:47, 9 February 2016 (UTC)[reply]
You know, I can see how you'd think that. The problem is that your explanation is completely incorrect in terms of medical and physiological terminology. Immunity_(medical) discusses how the term is used. An easy example sentence: "All vaccines confer immunity, but not all vaccines are 100% effective, and so some people who have acquired immunity from a vaccine may still get infected." My dictionary says "Immune: resistant to a particular infection or toxin..." Wiktionary says "Protected by inoculation", and Merriam-Webster says "having a high degree of resistance to a disease <immune to diphtheria>". The only time immune means 100% resistance is in fiction, games, or legal matters. SemanticMantis (talk) 21:28, 8 February 2016 (UTC)[reply]
Active immunity represents a process of natural selection within immune cells of the body (cell mediated immunity or antibody mediated immunity) by which molecules become common that (in some context) interact with a pathogen and allow it to be destroyed. In drug resistance, bacteria produce molecules that neutralize a drug, frequently by enzymatic means, often using plasmids to allow trading of useful resistances within a broader genetic background. So the selection for immunity takes place within an organism, but the selection for resistance occurs between organisms - most bacteria die, a few live and become resistant. So to be "resistant" to something is more of an inborn trait, generally speaking, while "immunity" usually implies past exposure to the agent or a vaccine etc. Exception, sort of: multidrug resistance in cancer occurs within an organism. But if you look at it another way, every cancer cell is out for itself, and (apart from the one that mutates) is either born resistant or not. Another exception, sort of: innate immunity may not require a selective response; the thing is, we rarely hear that someone is innately immune to a pathogen because they never know they might have gotten sick. This reminds me, say, of toxoplasmosis, which preferentially affects those of blood type B. (There was actually a huge outbreak in postwar Japan, and Japanese became known for "blood type personality theory", to this day never having been aware of the role of the protozoan in affecting their minds...) Wnt (talk) 20:05, 8 February 2016 (UTC)[reply]
Wnt I work at a research institution where several groups study Toxoplasma gondii and I don't think I've ever heard of a connection between ABO blood type and susceptibility to infection. For the sake of satisfying my curiosity, could you link me to where you read that, (or maybe I misunderstood what you said up above). Thanks, PiousCorn (talk) 06:03, 9 February 2016 (UTC)[reply]
@PiousCorn: I don't remember which source I originally went by, but [23][24] mention it. On the other hand [25] reports a lack of association with B blood type ... but rather, with Rh negative status! Also [26] says that. I had found the B blood type association in an older source ( [27] ) in a question I asked back in 2010 about it. [28] I think even back then I had lost track of some earlier source specifically about the Japan postwar outbreak... Wnt (talk) 09:22, 9 February 2016 (UTC)[reply]