Wikipedia:Reference desk/Archives/Science/2013 November 6

From Wikipedia, the free encyclopedia
Science desk
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


November 6

Why does ice do more damage than liquid water?

Like if you throw a block of ice at someone's head, it's going to hurt or even kill them. But if you dump an equal amount of water on someone's head, they just become wet even though water is more dense than ice. Why does ice do more damage? Shouldn't liquid water have greater kinetic energy if it's traveling at the same speed? ScienceApe (talk) 00:44, 6 November 2013 (UTC)[reply]

A block of ice is going to hit the poor unfortunate in a much smaller area and the impact will occur in a short time period, while the water will sheet all over the somewhat annoyed person and over a longer interval. (Why would water, assuming the same mass, have more kinetic energy?) Clarityfiend (talk) 00:58, 6 November 2013 (UTC)[reply]
As you indicate, density is not the whole story. Flexibility also figures into it significantly. ←Baseball Bugs What's up, Doc? carrots→ 01:10, 6 November 2013 (UTC)[reply]
If, however, she's a wicked witch.... μηδείς (talk) 01:43, 6 November 2013 (UTC)[reply]
So if Dorothy had thrown a freshly-filled bucket of ice, instead of a bucket of water, history might have turned out differently? "You curséd brat! Just look at this red bump you put on my green head!" ←Baseball Bugs What's up, Doc? carrots→ 01:48, 6 November 2013 (UTC)[reply]
"I'll get you my pretty...and your little cubes, too !" StuRat (talk) 07:45, 6 November 2013 (UTC) [reply]
The following discussion has been closed. Please do not modify it.
Take it to your talk pages. Your 'clever' discussion has nothing to do with this question or page. --Onorem (talk) 01:51, 6 November 2013 (UTC)[reply]
Take it to where the moon don't shine. ←Baseball Bugs What's up, Doc? carrots→ 01:53, 6 November 2013 (UTC)[reply]
I'm not sure where that is. I'd be happy to accommodate you if possible. I hope you're willing to do the same. --Onorem (talk) 01:57, 6 November 2013 (UTC)[reply]
I'm guessing you haven't heard the real quote before. Regardless, you came here strictly to criticize other editors, not to answer the OP's question. Go find someone else to nanny. ←Baseball Bugs What's up, Doc? carrots→ 02:00, 6 November 2013 (UTC)[reply]
  • Ice is solid (not flexible). Getting hit by solid things hurts. A bucket of ice would have had little effect on the wicked witch, unless some of it melted quickly enough. Drmies (talk) 02:09, 6 November 2013 (UTC)[reply]
The following discussion has been closed. Please do not modify it.
    • Does yokel Ono's saying F.U. to me in an edit summary[1] qualify as a blockable offense? Or would it be best to just delete everything below Clarity's answer? ←Baseball Bugs What's up, Doc? carrots→ 02:20, 6 November 2013 (UTC)[reply]
      • If my invective is blockable, your calling me a yokel is stupidly on the same level. Deleting (or hatting which I tried) is fine with me. --Onorem (talk) 02:24, 6 November 2013 (UTC)[reply]

A "question" like this brings to mind an interesting quote by Aristotle:

What nature is, then, and the meaning of the terms 'by nature' and 'according to nature', has been stated. That nature exists, it would be absurd to try to prove; for it is obvious that there are many things of this kind, and to prove what is obvious by what is not is the mark of a man who is unable to distinguish what is self-evident from what is not. (This state of mind is clearly possible. A man blind from birth might reason about colours.) Presumably therefore such persons must be talking about words without any thought to correspond. - Aristotle's Physics Book 2, chapter 1

What he means is that it is foolish to ask for a technical explanation of a simple fact, when no possible technical explanation could be given unless you already understood the simple fact you want explained. If ScienceApe were an artificial intelligence that had never experienced ice or water, but had only heard of them theoretically, asking this question would make sense. But to any human old enough to speak, the answer is that ice is solid and water is liquid, and if you don't already understand that intuitively, the technical answer will be mere squiggles on the page. μηδείς (talk) 02:47, 6 November 2013 (UTC)[reply]

Oddly enough, if you try to walk across a river of ice, you might well suffer less damage than if you try to walk across a river of water. ←Baseball Bugs What's up, Doc? carrots→ 02:59, 6 November 2013 (UTC)[reply]

As with Medeis, I do sometimes raise my eyebrows at a question or two. But I'll assume good faith and answer this one instead of being snarky. Ice is rigid, water is not. When water strikes you, only the molecules in contact with your face impart energy and momentum as they stop and/or move out of the way. The water molecules behind them continue to move forward unimpeded. There is a little bit of energy being transferred from the molecules in the rear of the ball of water due to the cohesive forces holding the water roughly together, but it's not much. When a block of ice hits you, it stays rigid. As the molecules in the front of the block strike you, the ones behind continue to push forward, as do the ones behind those. Thus, your face has to absorb the energy/momentum of the entire block of ice at once. So although the amount of momentum your head might absorb from the blow could be the same, it will be absorbed far faster. This is similar to why gently slowing your car from 80 mph to 0 is perfectly harmless, but crashing it into a brick wall is not. Someguy1221 (talk) 03:44, 6 November 2013 (UTC)[reply]
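
A minimal numerical sketch of the point above, in Python. All of the figures (mass, speed, stopping times) are made-up round numbers chosen only for illustration; the only physics used is that average force equals momentum divided by stopping time:

    # Same momentum, different stopping times -- all figures are illustrative assumptions.
    mass = 1.0               # kg of ice or of water
    speed = 10.0             # m/s at impact
    momentum = mass * speed  # kg*m/s the head must absorb either way

    stop_time_ice = 0.005    # s: a rigid block stops almost at once
    stop_time_water = 0.050  # s: water sheets away over a longer interval

    force_ice = momentum / stop_time_ice      # average force in newtons
    force_water = momentum / stop_time_water

    print(f"average force from ice:   {force_ice:.0f} N")    # 2000 N
    print(f"average force from water: {force_water:.0f} N")  # 200 N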

Thank you for assuming good faith, believe me, this question leads to what I really want to ask about: So solids will generally do more damage than liquids? So if I project mercury in a laminar stream (perhaps using an electromagnetic pump or a railgun that can fire fluids) at the same muzzle velocity as a rotary cannon like say... the vulcan cannon firing lead bullets. Assuming the volume of fire is more or less the same, the lead bullets will do more damage even though mercury is more dense? ScienceApe (talk) 07:22, 6 November 2013 (UTC)[reply]
Above a certain speed (someone will be able to tell you what sort of range of muzzle velocity, perhaps around the speed of sound in the liquid), liquids start behaving more like solids, and then the mercury might do more damage than lead, and water more damage than ice because of the higher momentum and energy. I don't know whether the effect is sufficient, or whether it might be counteracted by solids melting at these energies. Dbfirs 08:08, 6 November 2013 (UTC)[reply]
Condensed matter physics can be complicated, and a third possibility (superplastic metals) is used in some weapons. Perhaps you might look at High-explosive anti-tank warhead, Munroe effect and Superplasticity. Cardamon (talk) 08:18, 6 November 2013 (UTC)[reply]
Yes, I'm out of my depth here. I was searching for some references, but found only [this] if anyone has access to it. Things get much more complicated at hypersonic speeds. Dbfirs 08:29, 6 November 2013 (UTC)[reply]
See Water jet cutter. Oda Mari (talk) 09:54, 6 November 2013 (UTC)[reply]
Yes, but it is not strictly just water. The water is usually loaded with an abrasive material that does the actual cutting. The water is used primarily as a medium to deliver the abrasive. Plasmic Physics (talk) 01:21, 7 November 2013 (UTC)[reply]
The OP specifically said "throw a block of ice" and then "dump an equal amount of water on someone's head". In the normal use of those terms, you've already got a greater velocity behind the ice than the water. Certainly if you fire some quantity of water at a very high speed, it's going to be damaging on a par with firing the block of ice at the same speed. (Consider how a tornado can drive a straw into a tree, etc.) ←Baseball Bugs What's up, Doc? carrots→ 00:53, 7 November 2013 (UTC)[reply]
The rhetorical device is bait and switch. Had the OP not been being cute, he'd have gotten the answer that kinetic energy is proportional to velocity squared a lot quicker. μηδείς (talk) 02:09, 7 November 2013 (UTC)[reply]
The following discussion has been closed. Please do not modify it.
Being as how the OP admits to jerking us around, boxing up the entire section might be the best thing. ←Baseball Bugs What's up, Doc? carrots→ 14:58, 7 November 2013 (UTC)[reply]
No. But boxing up and/or refactoring these insults wouldn't be a bad idea. Blocks don't have to be traveling fast to do extensive damage. [2] and straws are not water drops. -Modocc (talk) 16:21, 7 November 2013 (UTC)[reply]
Yours included. ←Baseball Bugs What's up, Doc? carrots→ 16:25, 7 November 2013 (UTC)[reply]
The OP never "admitted" to "jerking us around". Please stop slandering other people. --Bowlhover (talk) 17:19, 7 November 2013 (UTC)[reply]
"...this question leads to what I really want to ask about..." qualifies. And you, who have come here not to answer any questions from the OP but strictly to attack other users, qualify as a nanny. Stop it. ←Baseball Bugs What's up, Doc? carrots→ 17:26, 7 November 2013 (UTC)[reply]
That is simply a legitimate generalization of what was a legitimate question. And if insisting that you follow Wikipedia's policies, such as WP:AGF, qualifies as "nannying", I refuse to stop "nannying". Your attitude seems to be that you're above the law and should be allowed to do whatever you want with impunity. --Bowlhover (talk) 18:11, 7 November 2013 (UTC)[reply]
When you come here specifically to issue personal attacks, you're also in violation of wikipedia "law". Yet you cop the attitude that that "law" does not apply to you??? ←Baseball Bugs What's up, Doc? carrots→ 19:28, 7 November 2013 (UTC)[reply]

Mainstream theories of intelligence

Hello, I am not aware of it and I wonder what the current mainstream, widely accepted sources on intelligence are, and how much of it is heritable (genetic). Thank you!74.14.29.128 (talk) 05:08, 6 November 2013 (UTC)[reply]

We have an article on heritability of IQ that you can read for discussion of the concept from a variety of viewpoints. In general, it is extremely difficult to prove that apparently inherited behaviors are genetic because twins or other siblings will also typically share similar fetal environments, similar childhoods, identical parents, and identical socioeconomic conditions. That is to say, without the ability to clone you and stick the clone with a random other family (and repeat this 1000 times), it's hard to say to what extent your genes are responsible for your IQ. This and other issues are discussed at the article I linked. That said, it is certainly true that there are many, many recognized genetic disorders that subtly or dramatically reduce a person's intelligence, but these would normally be excluded from studies of IQ and intelligence as trivial cases. Someguy1221 (talk) 07:16, 6 November 2013 (UTC)[reply]
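
A minimal sketch of how classical twin studies turn such correlations into a heritability estimate (Falconer's formula). The correlation values below are invented for illustration, not taken from any study, and the simple model itself is one of the things criticized in the linked article:

    # Falconer's formula with made-up correlations -- illustration only.
    r_mz = 0.85  # assumed IQ correlation between identical (monozygotic) twins
    r_dz = 0.60  # assumed IQ correlation between fraternal (dizygotic) twins

    h2 = 2 * (r_mz - r_dz)  # additive genetic share ("heritability")
    c2 = 2 * r_dz - r_mz    # shared (family) environment share
    e2 = 1 - r_mz           # unshared environment and measurement error

    print(f"h^2 = {h2:.2f}, c^2 = {c2:.2f}, e^2 = {e2:.2f}")  # 0.50, 0.35, 0.15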

I dare say SomeGuy obfuscates the issue. Some twins do not share similar childhoods and identical socioeconomic conditions. There is a body of research on identical twins separated at birth. What it shows is that those twins (separated) demonstrate differences indistinguishable (statistically) from twins who haven't been separated, which leads us to believe that the contribution of the environment is nil. — Preceding unsigned comment added by 168.178.74.153 (talk) 17:14, 7 November 2013 (UTC)[reply]

How do you account for situations where one twin is a cigarette smoker and the other isn't? ←Baseball Bugs What's up, Doc? carrots→ 17:17, 7 November 2013 (UTC)[reply]
Presuming the figures really can't be statistically distinguished (our article gives different figures and doesn't comment on which figures are not statistically different from each other), all you can actually say is you can't say from those figures that there's an environmental influence. Looking at the other figures, I strongly suspect that some of them suggest there is an environmental influence which shows the complexity and problems with trying to draw too much of an inference from limited data. Nil Einne (talk) 17:17, 8 November 2013 (UTC)[reply]

It is sooo abstract. Do you know a case like this or it is your fantasy? Are you sure they are identical? Were they reared apart? What are you talking about? — Preceding unsigned comment added by 168.178.75.70 (talk) 22:09, 7 November 2013 (UTC)[reply]

IGNORE them, they are leftist PC brainwashed

Conversion of finances to binary

I work for a finance company who are currently looking for an "outside the box" idea for time savings in the department. Since any ideas are considered, and I am unlikely to come up with anything actually implementable as I've only worked here for a year or so, I came up with the very "outside the box" idea of converting all of the numbers we process to binary. Huge amounts of computer-based numerical processing are done in the company, which can take many days - presumably where the computer converts them to binary behind the scenes, calculates and then converts back to base 10? Since maybe 0.1% of the numbers are ever seen by human eyes, those few could be converted when needed, and the rest left as binary (or just teach everyone to read numbers in binary). I realise this will probably never amount to anything, but it's a good thought experiment. My question is - would this decrease calculation time by avoiding binary to text conversion at each stage? How much time would be saved (per calculation/million calculations)? Can anyone find any references for the efficiency of working with binary files? Thanks! 80.254.147.164 (talk) 12:52, 6 November 2013 (UTC)[reply]

You might want to review Arithmetic logic unit and see where it leads, and thus get some thoughts as to whether your premise is valid. ←Baseball Bugs What's up, Doc? carrots→ 13:02, 6 November 2013 (UTC)[reply]
Additionally, ignoring that your premise isn't actually valid, I don't think you would be able to teach everyone to read numbers in binary. The base is too small and people won't be able to easily read numbers at a glance (can you tell the difference between 1000000 and 10000000? That's a whole factor of two!) Hexadecimal would be better for human comprehension and compatibility with binary. Double sharp (talk) 13:39, 6 November 2013 (UTC)[reply]
Suggesting some IT solution if you are not in the IT field can turn out to be just crazy speculation or, even worse, turn you into the laughing stock of the IT department, especially if you are a beautiful woman. IT specialists can deal with these. If you want to suggest anything relating to time savings, think about something that is kind of useless for all. For outsiders, it would be difficult to suggest anything. OsmanRF34 (talk) 13:49, 6 November 2013 (UTC)[reply]
I should have made it clearer - this wasn't supposed to be a serious suggestion, more picking fun at thinking so far outside the box your brain falls out. I do realise that this would be a terrible idea in practice, but was more looking for a spherical cows type approach for time saving 80.254.147.164 (talk) 14:40, 6 November 2013 (UTC)[reply]
Here's a very rough order-of-magnitude estimate. I'm working with a data file right now that contains 6.7 million floating-point numbers, all in plain text (and base 10). It takes about 4 seconds to read the entire file, or 0.6 microseconds per number. That's the absolute maximum amount of time you can save. If you mean that 99.9% of numbers at your company are intermediate results stored in text files in base 10, it's definitely not a stupid idea to suggest storing them in binary files instead--but only to save space, since saving 2.412412 in binary takes up much less space than saving the string 2.412412. --Bowlhover (talk) 15:11, 6 November 2013 (UTC)[reply]
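
To make the space/speed trade-off concrete, here is a small Python sketch comparing the same million floats stored as base-10 text and as raw 8-byte binary. The sizes and timings are illustrative only and will vary by machine; it is not a benchmark of any particular system:

    # Text vs. raw binary storage of the same floating-point numbers.
    import array
    import time

    numbers = [2.412412 + i * 1e-6 for i in range(1_000_000)]

    text_blob = "\n".join(repr(x) for x in numbers).encode()  # base-10 text, one per line
    binary_blob = array.array("d", numbers).tobytes()         # 8 bytes per IEEE-754 double

    print(f"text size:   {len(text_blob)} bytes")
    print(f"binary size: {len(binary_blob)} bytes")

    t0 = time.perf_counter()
    parsed = [float(s) for s in text_blob.split()]  # text must be re-parsed into floats
    t1 = time.perf_counter()
    arr = array.array("d")
    arr.frombytes(binary_blob)                      # binary is just copied into memory
    t2 = time.perf_counter()

    print(f"parse text:  {t1 - t0:.3f} s")
    print(f"read binary: {t2 - t1:.3f} s")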


In the abstract, it's not a terrible idea. In fact, it's a great idea: why wouldn't we optimize the machines to perform great at a specific application? Ancient computers used EBCDIC and binary coded decimal and fixed point math - because at some point in the 1950s, somebody wanted to use computers for finance, and the engineers in the 1950s worked out that EBCDIC and BCD actually stored and processed financially-relevant data better. (By the late 1960s, their efforts paid off and they shipped products! That gives a good benchmark for the time-scales involved in planning and implementing).
But that was half a century ago; those machines were very different from most computers you find today. You'd be hard-pressed to make a case that a change in fundamental representations of primitive data types on today's technology would yield an improvement in total system performance. Half a century of research and development has yielded a computer architecture that is highly tuned.
If you work in finance, you might still come across some very exotic computers in the data center: IBM System Z, for example, whose central processing unit and system software directly descend from this simple idea: IBM built a computer that was more efficient for financial work. Yet, by the mid-2000s, even IBM started to recommend Linux on System Z, and most software for it was written in Java, because in the aggregate, the micro-optimizations that allow for finance calculation were less important than the total cost of maintenance and the overhead of ensuring compatibility with other types of computer.
Today's computers are very complicated - billions and billions of times more complicated than the first IBM 360 from half a century ago. It is impractical for a non-specialist to really enhance performance at the fundamental level, because it can take years of training to learn the details of operation. When computer theorists want to improve performance, they usually attack the problem at several levels:
  • software engineering: using profiling and instrumentation to determine inefficiencies in application software
    • Sometimes, there are immense inefficiencies that can be fixed just by improving the software
  • compiler architecture: expressing the software in a way that is mathematically-provable to be the fastest possible set of machine instructions
    • Compiler design is very complicated, and there are lots of trade-offs to choose.
  • Processor architecture - this is the most challenging item to modify.
    • You will spend years learning the operational theory of current technology, and then try to find some entity that can be improved.
    • Examples are instruction level parallelism and vector instruction sets. But those things already exist! So if you want to improve, you'll have to spend some time researching what does and does not already exist, and then prototype it (in software).
    • If you have a good idea - and those are a lot more rare than most people think - then you'll have no problem finding employment with a microprocessor company. You could try to build your own technology, but microprocessors are very difficult and expensive to build: they are made of microscopic parts and manufactured using expensive equipment, hazardous chemicals, and a lot of engineering talent.
The thought experiment is actually a very good one. We are always trying to squeeze a little performance out of these machines. But the realist and the engineer in me can't emphasize enough: millions of man-hours of research and engineering have already been done at thousands of corporations, universities, and think-tanks. You'll have to be very clever, and very very well informed about the state of the art, to make a measurable improvement. Nimur (talk) 15:25, 6 November 2013 (UTC)[reply]
Well that doesn't really surprise me now I think about it - if it did work then I guess a lot more companies would be doing it... I'll probably suggest it and hope I get the booby prize for "most impractical suggestion" 80.254.147.164 (talk) 15:47, 6 November 2013 (UTC)[reply]
There can be cases where something seemingly obvious was missed by the "experts", though. They tend not to think about the simple things, so miss such improvements. StuRat (talk) 22:02, 6 November 2013 (UTC)[reply]
Storing binary financial data to hard disk instead of base 10 makes some sense, in that it saves disk space and makes it quicker to read and write a file. However, it does have a big disadvantage, that a human can't read the files. This makes debugging a program a lot more work. So, it typically works out that small files should just be saved in a human-readable form, while huge files should be stored in binary form. StuRat (talk) 21:56, 6 November 2013 (UTC)[reply]
Tell them to hire us instead. :) Really, converting finances to binary only has one real advantage, namely, if you get to keep the thousandths of a penny round-off error! The computing time difference is going to be trivial compared to the risk of any bug. Of course, binary finances have been used -- see pieces of eight. :) And for certain data like stock quotes I guess they still use that, even after all this time. Wnt (talk) 22:54, 6 November 2013 (UTC)[reply]


OK - this thread is a mess - almost every previous answer is wrong for one reason or another - and I don't have time to correct them all!

Firstly, once the numbers are read into the computer program, they are either going to be in binary already - or perhaps in a format called "BCD" (Binary-coded decimal). The latter is a rather ancient idea that was popular 20 or 30 years ago - but is pretty much obsolete now. If your software currently uses BCD throughout its innards, then there might be a case for switching to simple binary notation for speed...maybe. But if your systems are still using BCD then the reason is likely to be embedded deeply in old, old software or database records that NOBODY dares to tinker with - so this change won't happen.

But if the data inside the computer is in true binary - in order to be displayed on the screen or sent to a printer, or presented in human-readable form of any kind, it has to be converted into ASCII characters. I can only presume that you seek to save time in binary->decimal and decimal->binary conversions - but those really don't happen in computers. What we do is to convert the binary value (such as 11110001001000000) into a string of digits "123456". To humans, this is "A decimal number" but to the computer, it's a "string" - a set of letters, digits and punctuation characters that happens to have only decimal digits in it. A number like 123456 has six decimal digits and is represented as 11110001001000000 in binary...17 binary digits or in ASCII as six bytes '1', '2', '3', '4', '5', '6'. So to send 123456 to the end user as a human-readable decimal number requires six binary-to-ASCII conversions and six trips through all of the graphical software or whatever to display them. The conversion time is utterly negligible compared to the time it takes to get the information onto a disk drive, out to a printer or onto a screen...probably ten thousand times less. Given all of that, to send the same number to a human in binary takes 17 trips through that complicated software instead of just six.

So you've done three things here:

  1. Slowed down the computer by making it generate 17 ASCII characters instead of 6.
  2. Made the number all-but incomprehensible to real humans (I can just about remember a 10 decimal digit phone number - but I'd stand no chance of remembering the 30 binary-digit representation of it!).
  3. Increased the amount of storage for ASCII numbers by a factor of three.

No net win whatever - actually, a massive loss.

The one thing you *COULD* suggest (although any sane manager will still shoot it down in flames) is to go to a higher base than decimal. You could save storage, time and have shorter numbers for people to remember in Hexadecimal notation. 123456 becomes 1E240 - which saves a digit. Heck, go hog-wild and suggest "radix 50" notation - which reduces 123456 to a three digit representation.

Bottom line here is that this is (to be very honest) an incredibly BAD suggestion...don't tell anyone at work about it or you'll be laughed out of the office!

SteveBaker (talk) 23:00, 6 November 2013 (UTC)[reply]
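
For the curious, the digit counts quoted above are easy to check with a few lines of Python; the radix-50 loop is just a generic digit counter written for this illustration, not any standard library feature:

    n = 123456
    print(len(str(n)))       # 6 decimal digits
    print(len(hex(n)) - 2)   # 5 hexadecimal digits: 0x1E240
    print(len(bin(n)) - 2)   # 17 binary digits: 0b11110001001000000

    # Count base-50 digits the generic way.
    digits, m = 0, n
    while m:
        m //= 50
        digits += 1
    print(digits)            # 3 radix-50 digits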

This thread reminds me of a Dilbert cartoon. Gandalf61 (talk) 09:34, 7 November 2013 (UTC)[reply]
Higher bases may be more desirable for conciseness, but they have serious problems when it comes to doing computation: most significantly, nobody is ever going to remember the base-50 multiplication table. I'd submit that the only practical bases for everyday use are {8, 10, 12, 14, 16}. I'm also not too keen on octal and hexadecimal because they are prime powers and therefore fail and produce repeating decimals whenever you divide by anything that isn't a power of 2. This also makes them have irregularities in the multiplication table, much as we see in the decimal 7 times table. So I would say the average person could only use {10, 12, 14}. But the problem is that you can't change it just at work and nowhere else because it would create endless confusion (does "29" mean twenty-nine, two dozen and nine, or two fourteens and nine?) So in the one thing you could correctly suggest, you have two options: (1) force the whole world to convert away from base 10 or (2) leave well enough alone. Double sharp (talk) 10:25, 7 November 2013 (UTC)[reply]

I don't think this would necessarily be a bad suggestion. Binary/decimal conversions are inefficient at best, so for large volumes of data the effect on processing may indeed be significant. For example, once while working with an arbitrary-precision-integer library (dealing with numbers in the thousands of bits) I found that I could turn a five-minute calculation into a five-second one simply by converting the screen output to hexadecimal rather than decimal! The reason being that the former requires little more than lookup tables whereas the latter requires division. Sebastian Garth (talk) 16:44, 7 November 2013 (UTC)[reply]
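
A quick way to see the effect described here, using Python's built-in big integers. The integer size and repetition count are arbitrary, and the exact ratio will differ between machines and Python versions:

    # Hexadecimal vs. decimal conversion of a large integer.
    import time

    n = 1 << 10_000   # about 10,000 bits, roughly 3,000 decimal digits
    REPS = 200

    t0 = time.perf_counter()
    for _ in range(REPS):
        hex(n)        # regrouping bits into nibbles: cheap
    t1 = time.perf_counter()
    for _ in range(REPS):
        str(n)        # repeated division by powers of ten: costly
    t2 = time.perf_counter()

    print(f"hex():  {t1 - t0:.4f} s")
    print(f"str():  {t2 - t1:.4f} s")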

I believe that by default, COBOL stores numbers as BCD: my COBOL reference even devotes a page to the different storage options for "numbers that people will see" vs. "numbers that people won't see." There's also a LOT more COBOL code out there than many people realize: for example, the payroll system for the entire Executive branch of the federal government (including the military) is written in COBOL. So I don't think it is as preposterous a suggestion as it first appears, although I agree with SteveBaker that the gains to be had from modifying legacy code are not worth it. OldTimeNESter (talk) 21:36, 7 November 2013 (UTC)[reply]

(I'm surprised this is here rather than RD:C)
I would look into base 100 for financial data; you can pack two decimal digits into one byte and save about half the space, and it's very easy and fast to convert between base 10 and base 100. The only "special" case that arises is if the number of decimal digits happens to be odd, which cannot even happen in some financial applications.
For other purposes where there's much input and output but not many arithmetic operations (so base-conversion time is an issue), I used a base-10^9 representation, which saves slightly more bandwidth and is still convenient on virtually all 32-bit CPUs.
217.255.176.48 (talk) 07:38, 8 November 2013 (UTC) (One.Ouch.Zero via wifi)[reply]
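
A minimal sketch of the base-100 packing idea in Python. The helper names are made up for this illustration, and it only handles non-negative values with an even number of decimal digits, matching the "odd number of decimals" caveat above:

    def pack_base100(digits: str) -> bytes:
        """'123456' -> bytes 12, 34, 56 (three bytes instead of six characters)."""
        return bytes(int(digits[i:i + 2]) for i in range(0, len(digits), 2))

    def unpack_base100(packed: bytes) -> str:
        return "".join(f"{b:02d}" for b in packed)

    packed = pack_base100("20131106")
    print(packed.hex())            # 140d0b06 -> bytes 20, 13, 11, 6
    print(unpack_base100(packed))  # 20131106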

Confusing mental with physical illness

What illnesses (mental and physical) could have similar symptoms? Think about something like thyroid diseases affecting mood, like depression. OsmanRF34 (talk) 14:04, 6 November 2013 (UTC)[reply]

Consider Münchausen syndrome: many apparent illnesses can be caused by that mental disorder. Of course, those diseases with no visible signs are easiest to fake. However, the mentally ill might be willing to harm themselves to get sympathy, so break their own leg, etc. Still, it would be difficult for them to give themselves something like cancer, although they could increase the chances by exposing themselves to carcinogens.
Then there are cases where diseases have both a physical and mental component, and it may not be clear which is causing the other. For example, say a patient can't sleep and also has mental issues. The mental issues might cause the lack of sleep, or vice versa, or perhaps the two have a common cause, or they might be unrelated.
Also, "mental" issues might be considered a subtype of physical diseases, since all mental problems may stem from a physical problem with the brain. Sometimes these physical problems are obvious, like a brain tumor, and sometimes less so, like a chemical imbalance or microscopic structural flaw. StuRat (talk) 16:12, 6 November 2013 (UTC)[reply]
There is by no means a clear separation between physical and mental illness; see Causes of mental disorders#Biological Factors. Red Act (talk) 16:30, 6 November 2013 (UTC)[reply]
To clarify, while all mental illness may ultimately have a physical cause, the reverse is not true. Physical problems exist which are not caused by the mind. I suppose some people believe that all diseases are caused by the mind (or soul), but science doesn't support that. StuRat (talk) 16:37, 6 November 2013 (UTC)[reply]
So the emaciation, and on occasion death, of anorexia nervosa sufferers or cirrhosis of the liver caused by the addiction to alcohol don't count then? With the greatest respect to you StuRat I find the inclusion of weasel words in your answers attenuates their credibility. Richard Avery (talk) 08:03, 7 November 2013 (UTC)[reply]
I don't think you read my statement correctly. I said that science doesn't support that all diseases are caused by the mind. Obviously, some are, as I've stated. StuRat (talk) 08:21, 7 November 2013 (UTC)[reply]
As an example of StuRat's point, sickle-cell disease clearly isn't caused by mental illness, or by some kind of "soul illness". Red Act (talk) 17:44, 7 November 2013 (UTC)[reply]
I think that might be a long list. Take schizophrenia for example: the differential diagnosis mentions drug intoxication and drug-induced psychosis, metabolic disturbance, systemic infection, syphilis, HIV infection, epilepsy, and brain lesions, as well as stroke, multiple sclerosis, hyperthyroidism, hypothyroidism and dementias such as Alzheimer's disease, Huntington's disease, frontotemporal dementia and Lewy Body dementia.
Depression (mood) can be caused by non-psychiatric illnesses like hypoandrogenism (in men), Addison's disease, Lyme disease, multiple sclerosis, chronic pain, stroke, diabetes, cancer, sleep apnea, disturbed circadian rhythm, hypothyroidism or by psychiatric syndromes.
Anorexia nervosa (differential diagnoses) is a full article.
See the disorders listed in the mental and behavioural disorders navigation box for more... Ssscienccce (talk) 22:16, 8 November 2013 (UTC)[reply]

David Suzuki's recent warning about the Fukushima situation

Mr. Suzuki recently stated that "I have seen a paper which says that if in fact the fourth plant goes under in an earthquake and those rods are exposed, it's bye bye Japan and everybody on the west coast of North America should evacuate".[3] Knowing what I do about these things I'm a little skeptical, and after reviewing our article on the matter I could find nothing saying as much. I was wondering if there are any informed parties here who could speak to this issue. I suspect this was hyperbole on Suzuki's part in order to coax the Japanese government into accepting US help. Thanks. Vranak (talk) 14:59, 6 November 2013 (UTC)[reply]

Yea, some severe exaggeration going on there, particularly in the case of North America. I doubt if anything that happens in Fukushima can cause even one death there. StuRat (talk) 16:24, 6 November 2013 (UTC)[reply]
I don't know how the currents would operate in such a case, but consider that the distance between Japan and the US is huge. Fukushima - SF is 8101 kilometers or 5034 miles or 4374 nautical miles. Enough space to dilute anything flowing on the sea. OsmanRF34 (talk) 16:43, 6 November 2013 (UTC)[reply]
Why would one more reactor be so much worse than the three already damaged? Rmhermen (talk) 18:24, 6 November 2013 (UTC)[reply]
The Fukushima disaster actually went relatively well - containment just barely held, enough that the fuel rod storage areas were never allowed to overheat and catch fire. There is a lot more waste in the storage areas than there is in the reactor itself. It could very well have turned out to be what (in retrospect) could be called a "level 8" nuclear accident. Wnt (talk) 23:15, 6 November 2013 (UTC)[reply]

Cancer, hair follicles and nails

Do cancer cells consume the food of hair follicle and nail (bed) cells? What is the relation among cancer cells, hair and nails? — Preceding unsigned comment added by Anandh chennai (talkcontribs) 15:19, 6 November 2013 (UTC)[reply]

I believe cancer cells get their "food" from the blood, like other cells do. There may be exceptions, though, like immune system cells which "eat" other cells. Thus, a cancer of the immune system might eat other cells, as well. Of course, I wouldn't expect such a cancer to show up at hair follicles and nail beds. StuRat (talk) 16:22, 6 November 2013 (UTC)[reply]
It's certain types of chemotherapy that cause cells to stop replicating, and hence interfere with the growth of nails and hair, as well as the lining of the gastrointestinal system. Most cancers themselves wouldn't normally have any direct effect on the hair or nails. μηδείς (talk) 16:43, 6 November 2013 (UTC)[reply]
Yes, the link between cancer cells, hair and nails is that they are all relatively rapidly dividing cells (hair and nails both grow by cell division in the hair follicle and nail bed respectively, while of course in the hair shaft and nail plate themselves the cells are no longer living). As Medeis says, some types of chemotherapy target cell division, which affects the growth of the cancer, but also is responsible for many of the side effects of such drugs. Equisetum (talk | contributions) 14:51, 7 November 2013 (UTC)[reply]
So baldness occurs when the hair follicles, the rapidly dividing cells, stop dividing and the club hairs fall out. Somewhat related to cancer cells, which grow as a bundle whose top layers lack blood supply from the substratum, but the difference in this case is that they do not die and instead separate out to a new place in search of a nutrient supply. — Preceding unsigned comment added by 122.164.200.67 (talk) 01:48, 8 November 2013 (UTC)[reply]

Ashkenazi intelligence

Is the theory accurate that Ashkenazi Jews evolved to have a high IQ due to societal pressure? — Preceding unsigned comment added by 74.14.29.128 (talk) 17:41, 6 November 2013 (UTC)[reply]

What's your basis for that claim? ←Baseball Bugs What's up, Doc? carrots→ 17:45, 6 November 2013 (UTC)[reply]
It's not my theory, but it holds that throughout history the Ashkenazi apparently faced societal pressures favouring high IQ. — Preceding unsigned comment added by 74.14.29.128 (talk) 17:56, 6 November 2013 (UTC)[reply]
What's the basis for that claim? ←Baseball Bugs What's up, Doc? carrots→ 17:58, 6 November 2013 (UTC)[reply]
There is an article, Ashkenazi Jewish intelligence, about this. Much of the stuff there is disputed but you're unlikely to get better information here than there. Dmcq (talk) 18:10, 6 November 2013 (UTC)[reply]
What are the criticisms of the theory? — Preceding unsigned comment added by 74.14.29.128 (talk) 20:43, 6 November 2013 (UTC)[reply]
Have a look at the talk page for people discussing the article. Dmcq (talk) 21:26, 6 November 2013 (UTC)[reply]
  • The theory, as I remember having read it, is that since Jews in the Holy Roman Empire couldn't own land and farm (they lived in ghettos because of this) they had to seek employment in trained professions, putting a selection pressure on men to succeed in a profession, so they could have enough money to marry. μηδείς (talk) 21:21, 6 November 2013 (UTC)[reply]
Wouldn't this improve the Ashkenazi intelligence because of selection of intelligence genes?74.14.29.128 (talk) 21:27, 6 November 2013 (UTC)[reply]
The problem with that notion is there aren't really cost-free intelligence genes, or they would spread to the entire population. (Indeed, where such genes do exist, not having a working copy is described as a form of mental disability or retardation.) What you can get in such a high selection pressure environment is genes like that for cystic fibrosis, which confers resistance to cholera, or various blood "diseases" that confer resistance to malaria. It is possible, but nowhere beyond the speculation stage, that the prevalence of such things as bipolar disease[4] and schizophrenia may be related to genes that also confer gifts to those who don't suffer full-blown disease. But most genetic diseases associated with Ashkenazim,[5] such as Tay-Sachs are believed to be due to a genetic bottleneck, basically, inbreeding. μηδείς (talk) 01:59, 7 November 2013 (UTC)[reply]
Medeis, you seem to be assuming that intelligence per se enhances reproductive success, or at least does not harm it. I think that's a questionable assumption. If intelligence above some threshold actually reduces reproductive success, then there might exist genes that increase intelligence and have no other harmful effects, and yet do not "spread to the entire population". --Trovatore (talk) 04:10, 7 November 2013 (UTC)[reply]
Yes, I am. Yet I do not define intelligence as being a green vegan with one's tubes tied, but rather as the ability to correctly identify relevant patterns quickly. That, combined with a love for one's close kin, is probably more evolutionarily successful than its alternatives. μηδείς (talk) 04:26, 7 November 2013 (UTC)[reply]
I think this is a big assumption; it could well be the reverse. It could well be, say, that the people who are best able to "correctly identify relevant patterns quickly" also tend to delay parenthood until they're out of grad school. That's just one possible mechanism — there could be lots of others. --Trovatore (talk) 05:53, 7 November 2013 (UTC)[reply]
Most racial theories are bunk. If any are valid, they are probably successors to the work of Trofim Lysenko, an early epigenetics researcher (whether he knew it or not...). The other possibility, I guess, is that Ashkenazi Jews could have such a narrow gene pool that by random chance they are all smarter, dumber, or both relative to humanity as a whole, as members of a single family might be, but I'd take that with a grain of salt. (I'd take it all with a couple of teaspoons, really) Wnt (talk) 22:59, 6 November 2013 (UTC)[reply]
Lysenko? These modern theories are coming out of Western institutions that have nothing to do with crop science or a lack of understanding of the modern synthesis. μηδείς (talk) 02:03, 7 November 2013 (UTC)[reply]
The OP, or for that matter anyone considering this kind of theory, might be confusing intelligence with industriousness - something that humans in general are very good at. ←Baseball Bugs What's up, Doc? carrots→ 00:46, 7 November 2013 (UTC)[reply]
What is the modern synthesis when it comes to racial theories, Medeis? 74.14.29.128 (talk) 02:33, 7 November 2013 (UTC)[reply]
I don't understand your question. The term "racial theory" is yours, not mine. The modern synthesis was mentioned in contrast to Lysenkoism. I linked the article if you want to read it. Read the papers I linked to see how they defined their Ashkenazim populations. μηδείς (talk) 02:43, 7 November 2013 (UTC)[reply]
Thanks, btw is the Ashkenazim intelligence due to selection theory mainstream at this moment?74.14.29.128 (talk) 02:50, 7 November 2013 (UTC)[reply]
What is the evidence for the allegedly superior Ashkenazim intelligence? Define "intelligence". Define what makes Ashkenazim allegedly superior. ←Baseball Bugs What's up, Doc? carrots→ 03:08, 7 November 2013 (UTC)[reply]
The higher IQ than other populations, and according to this theory it is because the Ashkenazim have over time accrued genes for intelligence because of evolutionary pressure for it.74.14.29.128 (talk) 03:10, 7 November 2013 (UTC)[reply]
Who says they have a higher IQ, and how are they defining "intelligence"? ←Baseball Bugs What's up, Doc? carrots→ 03:15, 7 November 2013 (UTC)[reply]
Roth et al. says so in The Bell Curve (cited here in Race and intelligence): "The Bell Curve (1994) stated that the average IQ of African Americans was 85, Latinos 89, whites 103, East Asians 106, and Ashkenazi Jews 113. Asians score relatively higher on visuospatial than on verbal subtests. The few Amerindian populations who have been systematically tested, including Arctic Natives, tend to score worse on average than white populations but better on average than black populations.[44]" There's your "citation needed", right in front of you! 24.23.196.85 (talk) 02:34, 8 November 2013 (UTC)[reply]
First, there's no the theory. There are speculative studies based on correlation that have been published in peer-reviewed journals, as mentioned in Ashkenazi Jewish intelligence which Dmcq linked to above. Such papers wouldn't be considered crankery, but they certainly would be open to a lot of criticism and skepticism, and no such theory is "mainstream" in the way that the dinosaur origin of birds theory is mainstream. We're nowhere near the stage where we could prove this based on direct evidence and a full knowledge of the mechanisms of intelligence. I wouldn't listen to anyone who says the theory is true or false a priori. μηδείς (talk) 03:19, 7 November 2013 (UTC)[reply]
I meant what is the acceptance vs. criticism rate, if you get what I mean.74.14.29.128 (talk) 03:23, 7 November 2013 (UTC)[reply]
That would require a meta-study. It's also not the way scientific consensus works. I suspect there will be all sorts of genetic differences in intelligence, just as there are genetic differences in metabolism. That doesn't mean that with a heart-healthy diet and exercise, your breakfast cereal may not help prevent weight-gain. One plays the cards one's dealt the best one can. μηδείς (talk) 04:02, 7 November 2013 (UTC)[reply]
Yes. The article you're looking for might be Origin of birds. And as regards IQ, regardless of how well someone does on an IQ test, all it proves is their aptitude for taking IQ tests. ←Baseball Bugs What's up, Doc? carrots→ 03:25, 7 November 2013 (UTC)[reply]
Well, no, the article I was looking for was indeed entitled "dinosaur origin of birds", but I'll accept your link to an article that actually exists, over the one I wanted that doesn't, gis. As for IQ, the question is definable, and a question of fact, whatever the usefulness of the results. One can stipulate that one defines an Ashkenazim population by whatever criteria, say males carrying the "Jewish" Y-chromosome and at least 6 great-grandparents of provable central or eastern European Jewish descent. And one can choose some criterion as a proxy for intelligence, say a standard IQ test administered to natively English-speaking students of various ethnicities of the same age, education, family circumstance, and other controlled factors. And one can see whether this leads to a statistical difference in populations, regardless of one's ideological stance on IQ tests or whether races really exist. That being said, a correlative study wouldn't give us causes or mechanisms (it might give us clues toward them) and it wouldn't tell Mrs. Cohen whether her 3 year old son would win a Nobel prize due to his Y-chromosome. It certainly wouldn't tell us that anyone should not be given the most rigorous education possible. But the issue can still be framed in questions of verifiable fact, and such studies can be replicated to see if they reach the same results. μηδείς (talk) 03:49, 7 November 2013 (UTC)[reply]
Thanks, by the way, I think race is a biological construct, and the social construct thing is an overreaction by the left against it, so would it make sense for the difference in intelligence to be genetically different between groups? And this intelligence determines what the person of a certain race would become, like Ashkenazim, or Whites vs. Blacks, because tests reaffirm the IQ gaps74.14.29.128 (talk) 03:56, 7 November 2013 (UTC)[reply]
I think I have said about all I can. Just be aware that "group" is poorly defined. Scientific studies are done (one hopes) on well-defined populations, not on self-identified ethnic groups. μηδείς (talk) 04:06, 7 November 2013 (UTC)[reply]
This whole argument is like a broken record. It turns up here from time to time under various guises. But it's always based on some bogus premise. Intelligence is not in groups - it's in individuals. And I think it was TR who said (or was attributed) something to the effect that persistence trumps so-called "intelligence". I think back to the elections of 2000 and 2004, and how the critics kept making fun of George W. Bush and his frequent malapropisms, and hence how "un-intelligent" he was/is compared to his opponents. So the punch line is, if his opponents were so much freakin' smarter, why didn't they win? I have to agree with Teddy that IQ is vastly overrated. ←Baseball Bugs What's up, Doc? carrots→ 04:49, 7 November 2013 (UTC)[reply]
I think there are inheritable properties in any group and what we are looking at, usually, are the extreme ends or tails of the distribution. For example, the racial makeup of the NBA compared to society may be highly skewed but it's really just a very small tail compared to a large population. It's useless as a predictive metric for any single individual; it's simply an observable. I read the article and spotted a few false comparisons, like requiring full heritage when looking at populations but including anyone with a partial relationship when looking at prizes or awards. Even still, it's a tail end of a distribution. There is no predictive value for an individual associated with any group. Additionally, the achievements are individual in and of themselves, not a group achievement. Being Einstein's cousin means very little other than being his cousin. Winning the lottery is a good analogy. The winners might have a correlation but would that correlation extrapolate to the larger population to predict the next lottery winner? I don't think so. --DHeyward (talk) 07:10, 7 November 2013 (UTC)
Yes, Lysenko. For eighty years we heard about how Lysenko was the demagogue of Communist pseudo-science, with the crazy idea picked up from illiterate Russian peasants that when you plant a seed in different conditions, the seed somehow adapts to withstand them, in a way that can be inherited. And then, finally, after the fall of Communism, some people tried the experiment ... and guess what: It works! It was really the capitalists holding an ideological line on science, saying that you can't possibly hope to make the low IQ races at the bottom of The Bell Curve ever get any smarter just by putting them in conditions where they can use their intelligence without getting beaten down for it.
The consequences of living in a world where epigenetics matters are far-reaching. A time of starvation for grandparents can mean heart disease for the current generation. Maybe good conditions pay off as I suggest - I hope so, and I think so, but there's no guarantee. For all I know the exposure of a couple of generations to computer monitors and a torrent of information is going to be kids with a 30% chance of growing up autistic. And then there's the dramatic and mysterious increase of height and weight over the centuries. The world of genetics is a whole lot less predictable than it ever seemed to the sedate operations of classical Darwinism, but it certainly is getting even more interesting. Wnt (talk) 04:36, 7 November 2013 (UTC)[reply]
But maybe the races evolved in entirely different environments and lead to increase in brain size, intelligence and other abilities. For example, evolving in cold environment vs. a tropical environment for human races.74.14.29.128 (talk) 04:46, 7 November 2013 (UTC)[reply]
Also, speaking of persistence, it could be that certain behaviours evolved in certain races, so they developed behaviours like trying hard at certain tasks while others didn't because their environment wasn't challenging enough?74.14.29.128 (talk) 04:54, 7 November 2013 (UTC)[reply]
I'd like to see some Hottentots develop an IQ test and see how well the average European would do. ←Baseball Bugs What's up, Doc? carrots→ 04:56, 7 November 2013 (UTC)[reply]
And how, may I ask, would such a test have the slightest relevance to the kinds of intelligence that are actually useful for civilized people to have? 24.23.196.85 (talk) 02:34, 8 November 2013 (UTC)[reply]
Think of all the environments there are, even in Africa. Deserts, rainforests, tropical, subtropical, even well below freezing in the Climate of South Africa and other deserts on winter nights. And the genes of humans and their predecessors, while seeming sedentary for a few generations, endlessly flow from population to population, from one end of the continent to the other given any significant amount of evolutionary time. So while organisms need to evolve a developmental program to deal with such environments, there is quite an opportunity for short-term (i.e. over a few generations) epigenetic adaptation to take on this function. Wnt (talk) 04:59, 7 November 2013 (UTC)[reply]
Okay, but does that negate the idea that certain races evolved in environments favouring intelligence and innovation, so you have Europeans who became advanced and bushmen who still use bows and arrows? 74.14.29.128 (talk) 05:03, 7 November 2013 (UTC)[reply]
The differences are absurdly small. Places like Germany and Scotland were dark, mysterious woods or desolate moors full of backward tribesmen until the past millennium or two. And Africans have a long tradition of iron smelting. The Americas lagged behind, due to small population sizes and possibly North America being blasted from space, leaving some useful species like the horse unavailable - nonetheless, allegedly without any contact with the Old World, they came up with agriculture at about the same time. Wnt (talk) 05:17, 7 November 2013 (UTC)[reply]
The Etruscans (predecessors of the Romans) and the Hittites beat the Africans by at least a thousand years in terms of metallurgical knowledge! And so did the Germans and the Scots, though by a smaller margin. 24.23.196.85 (talk) 02:34, 8 November 2013 (UTC)[reply]
See Iron Age. The Africans had nothing to be ashamed of. I realize that the past few centuries have created a widely propagated narrative of civilization spreading out from Europe, but it's strictly a myth. Even the Roman Empire was a pretty short-term aberration, using technology that was largely imported from Greece and the Middle East, and soon receding right back to Istanbul for its later years. I think the real moral of that historical story is that the crevices where three continents come together ended up at the technological forefront, probably because they were simply at the middle of everything. Wnt (talk) 05:27, 8 November 2013 (UTC)[reply]
I wonder how the OP figures that having better killing machines indicates "superior intelligence" or "advancement". ←Baseball Bugs What's up, Doc? carrots→ 06:33, 7 November 2013 (UTC)[reply]
Why wouldn't development of more advanced weapons be a sign of technological advancement? 24.23.196.85 (talk) 02:34, 8 November 2013 (UTC)[reply]
Let me think. The Mongols invaded Europe with gunpowder, holding sway for a short time until the Europeans picked up the trick and kicked them out. Then the Europeans invaded Africa, holding sway for a short time until the Africans picked up the trick and kicked them out. Which proves... that Europe is closer to China than Africa, but not much about intelligence. Wnt (talk) 06:15, 8 November 2013 (UTC)[reply]
The difference is, the Europeans did not just "pick up the trick" like you say -- they independently reinvented gunpowder, which was not the case with the Mongols and the Africans (the former stole the technology from the Chinese, and the latter from the Europeans)! And, in fact, Africans do not manufacture their own firearms or ammunition even today -- they relied on guns and ammo supplied by the Europeans (mainly Russia and other Marxist nations) to kick the colonists out, and they still rely on imported guns and ammo to fight their tribal wars today! 24.23.196.85 (talk) 02:29, 9 November 2013 (UTC)[reply]
Then how come Liber Ignium was translated from Arabic? And as for Africa not making their own guns, it's because they have so many outsiders dumping them in on the cheap to try to get control over mineral resources. You might as well say Americans are genetically inferior because they can't make computers. Wnt (talk) 05:05, 9 November 2013 (UTC)[reply]
First of all, your claim about the Liber Ignium is disputed, so quit trying to present it as an incontrovertible fact: "The work has been subjected to numerous academic analyses, and with contradictory conclusions with regards to origin and influence on its contemporaries.[1][2]" And secondly, Roger Bacon did not use it, but instead came up with his own recipe, just like I said: "Iqtidar Alam Khan writes that while the Liber Ignium contents can be traced back to Arabic and Chinese texts, the work of Bacon appears to represent a parallel tradition, especially because the decoded formulas of Bacon contain considerably less nitrate.[10]" As for us Yanks not making computers, this is absolutely FALSE and an INSULT to our great nation -- we DO make some of our own computers (ever heard of Texas Instruments, or the Cray supercomputers, ignoramus?), and it is ONLY the personal computer market that's dominated by foreign manufacturers because of cost issues! In any case, we HAVE made many computers in the past, so it's perfectly clear that we CAN -- whereas the Africans NEVER manufactured their own firearms or ammunition (let alone computers or other electronics!) 24.23.196.85 (talk) 20:32, 9 November 2013 (UTC)[reply]
Africa has a way to go, both on making and using computers, but manufacturers like Zinox and Mecer have a significant role. And just because Bacon had a "parallel tradition" doesn't mean it was one indigenous to Europe - to the contrary, he was clearly very close to the import of the idea from further east.
At this point I would like to invoke a highly potent magical incantation against racism: "Non Angli, sed Angeli". Pope Gregory the Great spoke these words in his efforts to purge slavery from the ruins of Rome, after their society of inequality had finally burned itself out to its bitter end. It is the direct counterpart of our culture's "Black is beautiful", but it referred to an acceptance and appreciation of the pale-skinned Anglos brought in the slave ships from the backward provinces of the far north. Wnt (talk) 03:50, 10 November 2013 (UTC)[reply]

It seems to me that the selection pressure from society is toward less intelligent people. People who don't complete their education and who then don't get into a good position in society tend to get (or make someone) pregnant at younger ages, and they end up having more children. These children start at a disadvantaged position in society from which being a bit more intelligent isn't going to be of much more benefit compared to children who grow up in families where e.g. both parents are professors (these children may become scientists who choose not to have children so that they can devote their lives to their work). Count Iblis (talk) 18:44, 7 November 2013 (UTC)[reply]

Maybe only the smart ones are "welcomed" as Ashkenazi Jews, and the rest are rejected, or encouraged to assimilate with non-jews... Ssscienccce (talk) 05:02, 8 November 2013 (UTC)[reply]

Somebody up there mentioned Lysenko apparently confusing him with Michurin. Lysenko was a fraud pure and simple. Michurin's hybrids all died out eventually. None is extant today. — Preceding unsigned comment added by 168.178.75.207 (talk) 17:30, 8 November 2013 (UTC)[reply]

Well, I was laying it on a bit thick, and you do have a point that Lysenko even called it Michurinism. Efforts to vilify him certainly have some good reasons, but I suspect they go too far in the sense that, in the Stalinist Soviet Union, somebody was bound to end up in a camp. The core political point, however, is what I find most interesting: that it may be possible to elevate individuals and social groups of people even from what appears to be an inborn lack of intelligence, by giving them a chance to use their intellect, exposing their gonads to circulating hormones that might reprogram their DNA to direct more embryonic and metabolic investment in brain function. Wnt (talk) 23:11, 8 November 2013 (UTC)[reply]
That is just tabula rasa nonsense, scholars like Steven Pinker and Razib Khan have absolutely destroyed that, face it, some people will just be born unequal and some races are just disadvantaged in intellect174.88.155.12 (talk) 02:51, 9 November 2013 (UTC)[reply]
That is just a fact of human biodiversity.

Fact or fiction: is it possible to be germophobic without having OCD?

There was this episode on Arthur - the popular children's TV series about the titular anthropomorphic aardvark and his friends - where Buster Baxter had germophobia or extreme fear of germs because his friends once told him that he should clean up. Then, he had a nightmare one night. Having dreams is a motif throughout the whole series, intending to teach the dreamer something before the dreamer wakes up. So anyway, once Buster Baxter woke up, he became a super neat freak and expected everyone to be clean, wearing gloves to eat his lunch. At the end of the program, Buster seemed to return back to normal instantaneously, giving an impression that it's not a psychiatric disorder. The characters also do not use the term "OCD", but rather "nervous wreck". The episode title is "Germophobia". Is it possible to be germophobic without having OCD, or is this entirely a work of fiction with no basis in reality? 140.254.227.44 (talk) 18:39, 6 November 2013 (UTC)[reply]

Well, there was a time before we knew of any benefit to "germs", and only that they were potentially deadly, where many people were probably what we would describe as "germophobic" today. Also, if you work in certain fields, like a biological weapons lab, it pays to be germophobic. StuRat (talk) 18:45, 6 November 2013 (UTC)[reply]
Mysophobia is the technical term, incidentally. Tevildo (talk) 21:05, 6 November 2013 (UTC)[reply]

What's this object falling?

What's the object falling in this music video starting at about 3:55?[6] I assume it's some sort of space program accident, but I don't recognize the footage. A Quest For Knowledge (talk) 20:31, 6 November 2013 (UTC)[reply]

Looks more like an airplane, to me. And the way it's falling straight down is very strange, like it stalled before it burst into flames. Might be something staged. StuRat (talk) 21:36, 6 November 2013 (UTC)[reply]
I've seen the footage before, as I remember it was after a collision between two aircraft, possibly at an air show. I can't find the original footage yet though. I will keep looking217.158.236.14 (talk) 09:41, 7 November 2013 (UTC)[reply]
Challenger? May 1962 explosion of the first Atlas-Centaur? Seems to be a fragment of the longer sequence shown in Koyaanisqatsi. Beginning at 1:50 http://www.youtube.com/watch?v=i1kOW-luAoI Ssscienccce (talk) 05:07, 8 November 2013 (UTC)[reply]

Charge a battery in the microwave

This is not about charging a mainstream cell-phone in a microwave. However, could we develop a battery that transforms microwaves into electricity? The plastic encasing could be transparent to microwaves. OsmanRF34 (talk) 23:08, 6 November 2013 (UTC)[reply]

Not really. It would be possible to design a device you could put into a microwave oven which converted the microwave energy to DC, but it wouldn't work as a battery charger. First, and most importantly, it would be appallingly inefficient compared to a charger that just plugs into the mains. Secondly, a microwave is designed to deliver a fairly large amount of power (about 1 kW) for a short period (about 10 mins), but a battery needs a much lower power for several hours. The charger would need something like a very large capacitor to store the energy from the microwave and feed it into the battery at a suitable rate, and such capacitors, although they exist, are very expensive and too big to fit into a microwave. If we had a battery that could be charged at 1 kW for 10 seconds, then it might work - but we don't have such batteries, and it would still be more efficient to use the mains to power the charger even if we did. Tevildo (talk) 23:37, 6 November 2013 (UTC)[reply]
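
For a sense of scale, a back-of-the-envelope comparison in Python. The 1 kW output and the 10 Wh phone battery are assumed round figures, not specifications of any particular oven or phone:

    oven_power_w = 1000            # assumed magnetron output, W
    run_time_s = 10 * 60           # a ten-minute run

    phone_battery_wh = 10          # assumed phone battery capacity
    phone_battery_j = phone_battery_wh * 3600

    oven_energy_wh = oven_power_w * run_time_s / 3600
    print(f"oven delivers about {oven_energy_wh:.0f} Wh in ten minutes")                  # ~167 Wh
    print(f"phone battery holds about {phone_battery_wh} Wh")
    print(f"seconds at 1 kW to match one battery: {phone_battery_j / oven_power_w:.0f}")  # 36 s
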
There is no time limit I know of on microwave ovens. Haven't you ever cooked a turkey in one?[7] (Me neither, but my mother used to cook pheasants that way). Rmhermen (talk) 03:32, 7 November 2013 (UTC)[reply]
Aww, I thought you'd linked to this. Tevildo (talk) 22:38, 8 November 2013 (UTC)[reply]
(edit conflict)The technique of Inductive charging has already been developed, though microwave frequencies are probably not the best to use (and Tevildo explains above why a microwave oven is not an appropriate device). As you correctly mention, it would be dangerous to put a normal battery in a microwave. Dbfirs 23:40, 6 November 2013 (UTC)[reply]
Recently, a Japanese laboratory has invented a device that can salvage microwave energy naturally leaking from a microwave oven when in use, and convert that energy into electrical energy. This device is capable of charging low power consumption devices such as cellphones. Plasmic Physics (talk) 01:17, 7 November 2013 (UTC)[reply]
Plasmic, if your microwave oven leaks sufficient energy to charge a cellphone battery, it a) has faulty door seals, door interlock, or other fault, and b) is very dangerous. While the amount of microwave energy inside the oven is of the order of 500 to 700 watts (for an oven rated at 800 to 1000 watts input), the amount of power required to activate oven safety testers is milliwatts. 1.122.165.223 (talk) 07:25, 7 November 2013 (UTC)[reply]
Tell that to the researchers. Plasmic Physics (talk) 09:57, 7 November 2013 (UTC)[reply]
What researchers? Where? Who? As far as cellphones go, it's pure fiction.
A Google search turns up a paper by a Yoshihiro Kawahara, Tokyo University, (http://dl.acm.org/citation.cfm?id=2493500) about harvesting the energy leaked from a microwave oven. It essentially shows it's a waste of time - novelty value only. He was able to harvest all of 1 milliwatt. That's enough to power a CMOS timer, and about 0.002% of what would be needed to charge a cellphone battery even if you ran the oven continuously, rather than the usual 1 to 5 minutes or so. 1.122.165.223 (talk) 10:20, 7 November 2013 (UTC)[reply]
OK, so I was mistaken about the cell phone. However, in my defense, you did make it sound as if it was my research project. Plasmic Physics (talk) 11:21, 7 November 2013 (UTC)[reply]
[8] 82.44.76.14 (talk) 20:49, 9 November 2013 (UTC)[reply]