
Wikipedia:Reference desk/Archives/Science/2013 December 11

Science desk
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


December 11

Programming in non-computer science/software dev degrees

Programming skills seem to be creeping into a lot of degrees that didn't use to require them and aren't connected to computer science, like business, economics or biology. Is programming turning into a basic requirement in the academic landscape? Something like learning English if you are into IT, business or natural science. OsmanRF34 (talk) 01:58, 11 December 2013 (UTC)[reply]

I would say that learning to write macros is very common; learning to use scripting languages like Python or Matlab is pretty common; learning to use full-fledged programming languages like C or Java is not so common. I wish it was more common, actually -- there are lots of philosophers and biologists who would benefit from an understanding of programming. Looie496 (talk) 02:29, 11 December 2013 (UTC)[reply]
Maybe even less. Biologists don't need to learn TeX or LaTeX anymore to publish. --DHeyward (talk) 06:06, 11 December 2013 (UTC)[reply]
Has it ever been common for biologists? My impression is that TeX/LaTeX has never been particularly common there, except perhaps for those who are maths-heavy. Although I admit my knowledge is limited for the period before the mid-90s and for book publishing (as opposed to theses/dissertations and journal articles). Nil Einne (talk) 13:47, 12 December 2013 (UTC)[reply]
There is a strong movement to suggest that learning at least the basics of programming should be taught in all degree-level courses - and perhaps even in high school. There are several reasons for this - one is that being able to program a computer (even in the simplest possible way) unlocks the other 95% of what a computer can do for you - because without that, you're using maybe 5% of the capability you have in your computer, tablet or cellphone. Another reason is that for children and teenagers, learning math seems "pointless" because it's so abstract and (seemingly) without real-world application. But when you get them interested in programming, they'll soon find that they want to learn math as a way to get more out of their growing programming skills. Even if people never actually use their programming skills later in life - knowing what's going on under the hood of these computers that are absolutely everywhere in our lives these days seems like a worthy activity.
The world has changed drastically over the last 20 or so years - and in one way in particular that human minds need to adapt to. When I was in school, it was essential to cram facts into kids' heads, because if you needed a fact in later life and it wasn't in your head, you'd have to resort to a trek to the local library and a protracted search to find it. These days, there are three million pages of facts here on Wikipedia alone - and you can often find a fact faster here than by digging it out of your own memory! What brains are now called upon to do is learn procedures - ways of doing things. Meta-knowledge: the understanding of how to find facts. We need to be able to do 'synthesis' - to take a bunch of facts and combine them into the results we need. A classic example is the question (above) about what size of plant consumes the same amount of energy as a human being. There is no "knowledge" that will tell you the answer to that question. It can only come about from a combination of knowledge-search skills and the ability to take facts that you can easily find and combine them (using skills like critical thinking, math and programming) to produce an answer.
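As a rough illustration of that kind of synthesis - the sketch and its round figures below are an editor's assumptions, not something from the original question - a few lines of Python are enough to combine two easily-found facts into an estimate: a resting adult dissipates something like 100 W, and productive vegetation captures very roughly 1 W of usable chemical energy per square metre when averaged over a year.

    # Back-of-the-envelope estimate: how much vegetated area captures as much
    # energy as a resting human dissipates? All figures are rough assumptions.
    HUMAN_POWER_W = 100.0        # resting metabolic rate of an adult, ~100 W (assumed)
    PLANT_POWER_W_PER_M2 = 1.0   # net energy captured by productive vegetation,
                                 # averaged over a year, ~1 W per square metre (assumed)

    area_m2 = HUMAN_POWER_W / PLANT_POWER_W_PER_M2
    print(f"Roughly {area_m2:.0f} square metres of vegetation matches one resting human.")

The answer (on the order of a hundred square metres) could easily be off by a factor of a few; the point is the habit of finding two defensible figures and letting a couple of lines of code combine them.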
These days, teaching facts (beyond the most basic) is quite pointless. Making kids memorize the dates of battles and the deaths of kings is tantamount to child abuse! What they need are mental and physical skills - math, programming, critical thinking, logic, search-engine skills, music composition, literacy - both input (reading) and output (writing) - communication skills, graphic design...and physical skills - things like welding, 3D printing and CAD that allow you to use modern tools to make things.
As to precisely which programming language is taught, I don't think it matters. C++ is a hard thing to learn - brutal, vicious, hard to write and hard to debug...but the ultimate in speed and power. So I don't think I'd want to teach it to non-Computer Science specialists. But JavaScript, certainly.
SteveBaker (talk) 13:27, 11 December 2013 (UTC)[reply]
I agree with a lot of that, but I don't agree that learning facts has become pointless. It's impossible to understand the significance of information without facts to put it in context. How is a person who doesn't know any non-basic facts going to understand the significance of the fact that Barack Obama shook hands with Raul Castro at Nelson Mandela's funeral? Looie496 (talk) 16:11, 11 December 2013 (UTC)[reply]
That's an excellent example of what I'm saying. I was never taught a thing about Raul Castro in school or college - and I'm fairly sure that nobody else who graduated more than 6 years ago was either. With what I was taught in school and (more importantly) what I'd learned from being a voracious reader and news-hound, I was puzzled as to why the news reports seemed upset that Obama shook hands with a man who "has blood on his hands". That puzzled me because all I'd learned about him was that he took power from his brother fairly recently and has been a fairly benevolent dictator ever since. My assumption (from what I'd learned or been taught) was that there shouldn't be a reason for Obama not to shake his hand. In the past, I'd have been 100% dependent on news reporters to tell me about him - and quite honestly, I'd have been none the wiser. But now we have articles on Raul Castro, Nelson Mandela and Barack Obama that tell you all of the background you need - and (specifically) that Raul fought alongside his brother through the revolution and is responsible for a lot of the same 'issues' that Fidel Castro has been labelled with. Far from being a "clean slate" and a new direction for Cuba, it now starts to look like "more of the same". But honestly, I learned nothing whatever about this subject in any formal learning situation - or even from news sources. I'm happier that I found out for myself why this handshake was or was not appropriate rather than just accepting a media sound-bite.
History classes taught me the names and dates of the kings and queens of England and numerous battles and so forth. Geography probably taught me the primary exports of Venezuela and where bauxite is mined...but I really don't need to know any of those things - because I can trivially look them up if I ever need to know!
So what does this say about education? To me it says that it's pointless to give more than a broad-brush outline of history and geography to students who aren't planning on working in those fields - and the time that would save would be better spent engendering the kind of curiosity that made me read the Raul Castro article when I was puzzled, teaching the knowledge-search skills (kinda minimal in this case!) needed to find the relevant information - and also teaching the "critical thinking" skills needed to make me wonder whether the media had just had a knee-jerk reaction to the event - and then to interpret what I read in the context in which it matters.
Teach more skills, tools, methods and algorithms that are difficult to gain later - and fewer facts which can be recalled in half a second by asking your phone. It goes deeper than that - should you trust a fact that comes from a TV advert - or one that comes from a research paper in "Nature"? How do you know which things you're told are likely to be true? When does skepticism replace blind faith and when does trust in reliable sources shorten your search for information?
SteveBaker (talk) 16:46, 11 December 2013 (UTC)[reply]
To me that actually supports my point. Having read a whole bunch of history books, it was immediately clear to me that (a) the handshake was not an accident, and (b) that it was a signal, readable by any diplomat, that the two countries want better relations but don't want to say so explicitly. There's just no way that a superficial knowledge base and a few Wikipedia articles would enable anybody to figure that out. The deeper point is that the more facts you know, the more connections you can make for any new fact you encounter. Without a rich base to build on, searching only yields confusion. Looie496 (talk) 00:17, 12 December 2013 (UTC)[reply]
Yes, but the trouble is that there are too many important facts for anyone to know. Indeed, I could interpret this particular situation to a degree based on my school history lessons, but for other (equally or more important) news stories, my school knowledge provides little or no help - I was not taught economics, cryptography or information security at school, and would therefore be clueless about Bitcoin, data collection by the NSA or even the entire 2008 financial crisis and associated events, without spending the time to inform myself about those topics. Thus my contention that it is far more important to teach research skills, assessment of evidence etc. than just facts - if you have the skills you can acquire more facts pretty easily; if you only get given the facts, you're stuck. Of course, teach a foundation of facts too, but that is always of secondary importance to teaching the skills. My view is also that, apart from the basics, schools don't really do a good job of deciding which facts are important and which are not so important - why is basic Latin (which I was taught in school) judged to be more important than the basics of how the internet protocol works (which I most certainly wasn't taught)? I cannot come up with a single reason, except that the people deciding on the syllabus were probably humanities rather than computer science graduates. N.B., I am young enough that the internet had been around for quite a while when I was in school. Equisetum (talk | contributions) 11:47, 12 December 2013 (UTC)[reply]
Almost precisely my own view on the matter, and stated far more eloquently than I ever could. I would also add that, with the appropriate skills training (in this case research, quick reading/scanning and memorisation), it is really quite easy to gain a working collection of facts on any given subject. Therefore, if there were a requirement at some point to, say, become passingly conversant with a particular period in history, it is a far better idea to teach someone how to take a few hours to research it than to hope (usually in vain) that they will remember what they were taught on the subject many years earlier at school. Equisetum (talk | contributions) 21:25, 11 December 2013 (UTC)[reply]
Turn the clock back 40 or more years and we all had to be conversant with the use of a slide rule. Being able to program today is on a par with being able to compute with the technology available back then. Oh, I have my 10" Thornton in my hands now (as it seldom leaves my side) and I have not had to change the batteries even once, even though I bought it over forty years ago. Before Windows has had time to boot up, my Thornton has given me the answer! Study whatever software language the course advises you to learn. Move on from there... as the need or inclination leads you. --Aspro (talk) 18:43, 11 December 2013 (UTC)[reply]
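For readers who have never handled one: a slide rule multiplies by adding lengths proportional to logarithms, because log(a×b) = log(a) + log(b). A minimal Python sketch of that principle (an editor's illustration, not part of the post above):

    import math

    def slide_rule_multiply(a: float, b: float) -> float:
        # Multiply the way a slide rule does: add the logarithms, then take the antilog.
        return 10 ** (math.log10(a) + math.log10(b))

    print(slide_rule_multiply(3.1, 4.2))   # prints ~13.02; a real slide rule reads to about 3 digits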
[Banned user]
I imagine that you must be getting on in years, which might explain why your memory isn't as reliable as it used to be—the 1950s certainly enjoyed their share of fad diets and quack nonsense. The grapefruit diet (also known as the 'Hollywood diet', and based on the notion that grapefruit contains magical weight-loss enzymes) has existed since at least the 1930s, and enjoyed a resurgence in the '50s. The same goes for the cabbage soup diet. The Gerson 'therapy' diet (which claims to cure cancer, migraines, and tuberculosis using organic juice, assorted supplements, and coffee enemas) originated in 1928; Gerson himself kept offering it to trusting cancer patients until he died in 1959. Johanna Budwig's Budwig protocol was published in 1952, and claimed to cure cancer with flaxseed oil and cottage cheese. Various forms of radioactive quackery flourished in the first half of the twentieth century—did you know that you could supposedly cure arthritis by lying in a box full of mildly radioactive sand? I could go on.
You're on very shaky ground if you want to pretend that your generation was better immunized against pseudoscientific nonsense than the generations who followed; bad ideas had plenty of followers then, too. In the 1950s, it was just plain harder for similarly-minded but geographically-separated individuals to connect with one another, which meant that low-popularity ideas (for good or ill) tended to have more difficulty spreading from one place to another, or across social or class boundaries. TenOfAllTrades(talk) 06:00, 12 December 2013 (UTC)[reply]