Wikipedia:Reference desk/Archives/Science/2011 February 5

Science desk
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.

February 5

Homogenized Milks

Are all USA homogenized milks created equal in the amount of cream, water, etc.? Does the FDA have a standard? Is one brand richer than another? — Preceding unsigned comment added by Jza59 (talkcontribs) 01:16, 5 February 2011 (UTC)

Grading and verification of dairy is regulated and enforced in the United States by the United States Department of Agriculture, not the FDA. Here is the webpage for Dairy Standardization, which requires that certain marketing labels meet certain milk quality and content standards. The actual content and quality of milk may vary from brand-to-brand (and day to day, even from the same cow); but to be labeled with certain terms, milk and dairy products must meet USDA standards. The USDA is responsible for regulating the milk product all the way from the cow to the dairy to the wholesale distributor to the grocery store (if that is the path it takes). The FDA also has some regulatory power, specifically related to milk safety and pasteurization: see Pasteurized Milk Ordinance 2007 and the FDA's Milk Safety program. There is a little bit of overlap between FDA and USDA, particularly in dairy regulation. Nimur (talk) 01:31, 5 February 2011 (UTC)
I'm pretty sure that homogenized milk is simply milk from the cow that gets homogenized. They don't adjust the levels of cream, water etc. It's used as is. Maybe the diet of the cows varies from brand to brand, but I doubt it. Ariel. (talk) 03:28, 6 February 2011 (UTC)
They certainly do that in Germany, where "whole milk" has a fat content of 3.5-3.8% (3.5% is the legal minimum, 3.8% is what premium brands offer as "natural" fat content). The milk is centrifuged to separate out the fat, and then remixed to the desired fat content. Leftover milk fat becomes butter or cream. When I was in Scotland, they sold Channel Island milk with 5% fat content - it's delicious. Arteries? Who needs arteries? --Stephan Schulz (talk) 14:39, 6 February 2011 (UTC)
Note that has a lot to do with the breed of cattle: Jersey cattle and Guernsey cattle (Channel Island cows) are renowned for their high butterfat content, at 5-6%. In contrast, the majority of (beverage) milk produced in the United States is from Holstein cattle, which produce large volumes of milk but with one of the lowest butterfat contents (not listed in the article, but given variously as 2.5-3.8%). Holstein milk may go into the homogenizer as-is, but if your German milk is from another breed like Brown Swiss/Braunvieh (4% butterfat), they may partially skim it first to reduce the butterfat (and increase profits by selling the cream separately). -- (talk) 16:09, 7 February 2011 (UTC)

Hydrogen bonds

I know that hydrogen bonded to Nitrogen, Oxygen, or Fluorine is capable of forming hydrogen bonds with other, similar molecules. However, I looked on an electronegativity table and it shows Oxygen has an electronegativity of 3.44, Fluorine of 3.98, and Nitrogen of 3.04 (again I forget the units), but chlorine has an electronegativity of 3.16 (greater than N) and yet I am told it does not form hydrogen bonds. I think there might be something other than electronegativity going on here; why doesn't hydrogen attached to chlorine form hydrogen bonds even though hydrogen attached to nitrogen does? Thanks. (talk) 01:47, 5 February 2011 (UTC)

Size. Chlorine has an occupied energy level that is empty in the row two elements nitrogen, oxygen, and fluorine. Being larger, the unbalanced charge from its bonding to hydrogen is more widely distributed than it is in the smaller molecules, resulting in a lower dipole moment. Indeed, the dipole moment of hydrogen chloride is just over half that of hydrogen fluoride. And I'm sure there are also quantum reasons it has a weaker tendency to donate either electrons or a proton, but my chemistry is too far behind me to remember why. Someguy1221 (talk) 05:21, 5 February 2011 (UTC)
You've pretty much got it. The main issue is that hydrogen bonding is determined by Bond dipole moment. In HCl, the longer bond means that the hydrogen atom is far enough away from the chlorine atom that it gets enough electron density to shield its nucleus. In the shorter H-N bond of ammonia, though nitrogen isn't as electronegative as chlorine, the fact that the bond is significantly shorter means that the nitrogen can effectively remove enough electron density from the hydrogen to deshield its nucleus. The deshielded hydrogen nucleus in H-N, H-O, and H-F bonds is what gives rise to hydrogen bonding. --Jayron32 01:06, 6 February 2011 (UTC)
So factors beyond the simple electronegativity difference are in effect, though ionic bonding is usually stronger than the hydrogen bonds. ~AH1(TCU) 19:18, 6 February 2011 (UTC)


How long does it take for bone to completely disintegrate into 'dust?' Say this bone is a large leg bone of an animal, 2 meters long and 1 meter around. Thanks. schyler (talk) 02:48, 5 February 2011 (UTC)

It could be more than a billion years. See fossil. Looie496 (talk) 05:13, 5 February 2011 (UTC)
Except that there weren't any bones that large a billion years ago. HiLo48 (talk) 05:15, 5 February 2011 (UTC)
Well no one said it had happened already Nil Einne (talk) 06:20, 5 February 2011 (UTC)
Excellent point. HiLo48 (talk) 06:53, 5 February 2011 (UTC)
It depends entirely upon the various forces acting upon it, e.g. whether it's exposed to weathering, or chewing, or the acidity of the soil in which it's buried, etc.--Shantavira|feed me 09:46, 5 February 2011 (UTC)
My understanding is that most fossils are not really the actual bone material, but minerals and other matter that have leached in and replaced the original material over the course of many years. See Fossil#Types_of_preservation . Vespine (talk) 10:01, 5 February 2011 (UTC)
Lucy's bones are not yet dust after 3.2 million years, how long are you prepared to wait? Richard Avery (talk) 11:15, 5 February 2011 (UTC)
Are the Lucy "bones" really human bone material or are they just rocks which are in a bone shape? That seems to be the main question. The article does not distinguish between the two, and indeed, most websites do not seem to either. --Mr.98 (talk) 01:47, 7 February 2011 (UTC)
Thanks to Vespine and Shantavira for the info. schyler (talk) 14:01, 5 February 2011 (UTC)
I think the Ship of Theseus is highly relevant to this discussion. Is it clear that the original bones still exist once fossilized? -- Scray (talk) 03:35, 6 February 2011 (UTC)
I don't think anyone, including me, has actually addressed the question properly yet. Ship of Theseus is one thing, but the fact is it is generally extremely unlikely that any one single bone will be fossilized in the first place. Fossilized bones represent an extreme minority of the bones that have ever existed. The vast majority of bones do decompose without leaving a trace. I imagine someone in the field of Forensic anthropology would be able to give a decent answer regarding how long bones last. If I had to guess, for bones just left out in average weather, I think bones would last maybe a few to several decades; however, local conditions would play an extremely important part, especially rain and humidity. I imagine in a rainforest a bone probably lasts not even a few years, whereas in a desert a bone might last centuries. Vespine (talk) 21:58, 7 February 2011 (UTC)


What's the difference between ceramic plates and porcelain plates? — Preceding unsigned comment added by Tommy35750 (talkcontribs) 03:12, 5 February 2011 (UTC)

Well, a porcelain plate is one type of ceramic plate. There are many other types of ceramics, so I suggest you read the articles.--Shantavira|feed me 09:50, 5 February 2011 (UTC)
There is no real difference; technically porcelain is a type of ceramic, but "ceramic" is such a general type that just about any such plate would count. Ariel. (talk) 03:10, 6 February 2011 (UTC)
Text in Section 1 of the previously linked article ceramic differs. (talk) 10:07, 6 February 2011 (UTC)

injection moulding

When making ABS plastic vacuum cleaners by injection moulding, is a mould release agent used? For example, PVA or oil. —Preceding unsigned comment added by (talk) 03:58, 5 February 2011 (UTC)

A number of release agents for ABS are listed in this book SpinningSpark 16:51, 5 February 2011 (UTC)

I understand they can be used, but are they usually used when making vacuum cleaners?

BTW, are these two questions also by you? Ariel. (talk) 03:07, 6 February 2011 (UTC)

no —Preceding unsigned comment added by (talk) 06:15, 6 February 2011 (UTC)

"Is it really so..."

...that if I make a theory based on 10,000 (of the same) experiments (by different scientists) that proved my hypothesis 51% of the time, it is accepted by the scientific world? I really have a problem with the suppositions surrounding Half-Life. The article almost puts it where the theory could be wildly false. Does this bring into question science as a whole? Are people today substituting Science for Truth? schyler (talk) 14:15, 5 February 2011 (UTC)

That's not what it's saying. The chance of any particular atom decaying after existing for the half-life is exactly 50% - so 50% of all atoms of that age will have decayed by that time; you just don't know in advance which ones will decay. Mikenorton (talk) 14:27, 5 February 2011 (UTC)
So if my hypothesis is that atoms have a definable half-life and the tests shows that "50% of all atoms of that age will have decayed" by that hypothesized half-life, I have a theory? And it is accepted as "probabilistic" truth? schyler (talk) 14:33, 5 February 2011 (UTC)
I don't know what you mean by "truth". The point is that your hypothesis will have passed the experimental test, and that's all that is expected from a good hypothesis. Dauto (talk) 15:14, 5 February 2011 (UTC)
You seem to have a misunderstanding of how everything fits together. There's an assumption that atomic decay is a constant-rate process, meaning that for an atom of a given type (say a particular isotope of a particular element), in a given period t, the probability p that an undecayed atom will remain undecayed at the end of the period is constant, and independent of the past history of the atom. This is something that can be confirmed or refuted experimentally. From that assumption, we can mathematically derive that the probability p is a function of t. There is a "rate of decay" that's an intrinsic property of the type of atom. We can describe it in many equivalent ways; one of them is the length of time after which the probability of non-decay is 50%—that's what we call the "half-life". We can, if we so choose, use an analogously-defined "quarter-life". There's also an assumption that the decay of one atom is independent of the decay of any other. I think experiments can be designed to confirm that, although I don't know how scientists did it. Once you have these assumptions, you can mathematically work out the expected fraction of undecayed atoms as a function of time. You can use it to design different kinds of experiments to measure the "decay rate" parameter of an atom. And you can characterize the rate using a half-life measure. For atoms that are synthesized and are available only one at a time, the experiment for measuring the decay rate may involve measuring the decay probability of a single atom in a given duration, one atom at a time. The measured decay rate is mathematically converted to a half-life measure because that is the conventional measure. Measuring the half-life of an atom should not be confused with proving a hypothesis 50% (or 51%) of the time, whatever that means. -- (talk) 15:45, 5 February 2011 (UTC)
It's not very polite to tell another editor that they "have a misunderstanding of how everything fits together" and then go on a barely followable rant that calls your own understanding into question. Did you come here to ask something, or to argue about some kind of vague notion half formed in your head? Beach drifter (talk) 16:02, 5 February 2011 (UTC)
I think that explanation was pretty good and understandable. Just my opinion. Dauto (talk) 18:00, 5 February 2011 (UTC)
I was responding to (the second comment of) the OP. My remark that he might "have a misunderstanding of how everything fits together" was not an insult, but rather an attempt to help by pointing out what I thought was causing his difficulty in understanding the article he referenced. If you think my explanation is incorrect, feel free to point out the errors, but please stick to the technical and be civil. -- (talk) 02:02, 6 February 2011 (UTC)
The direct answer to your question is no; 10,000 identical experiments of which only 51% have results consistent with a given hypothesis cannot be taken as proof of that hypothesis. Clearly something has been missed; there is some condition or conditions that prevent the other 49% from meeting the hypothesis. However, you are confused over half-life because the claimed hypothesis involved here is a stochastic process. Let's take another example, Boyle's law. We would expect 10,000 experiments measuring the pressure and volume of gas in a balloon to be pretty nearly 100% consistent with Boyle's law. It would be acceptable to discard the odd one or two wild results which might easily be put down to errors such as missing a decimal point in recording a result or the lab assistant filling the balloon with the "wrong" gas, but if only 51% of results agreed with Boyle's law then the law would be pretty much considered disproved. As it happens, Boyle's law does describe the behaviour of gases to a high degree of accuracy. Now coming on to half-life, the claim is not that an unstable atom will decay after the half-life time. The claim is that it will decay at an unknown time with a probability given by some probability function. The half-life time is merely the time for which the probability is one half. The law of radioactive decay is not stating a deterministic time; it is stating the probability function, which can be accurately measured and, like Boyle's law, gives results close to 100% accurate. Another way to look at radioactive decay is through the time constant, which is related to (but not equal to) the half-life. What this says is that if the rate of decay continues at the initial rate then the entire sample will have decayed in the time-constant. 
In fact, the rate of decay does not continue at the initial rate, it continually changes (gets slower) in such a way as to keep it true that decay would be complete in the time-constant if the current rate were maintained. Rate of decay is easily measured and shown to be in line with this time-constant law, which can be shown to be entirely equivalent to the half-life law. SpinningSpark 16:40, 5 February 2011 (UTC)
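The half-life and time-constant descriptions above are tied together by t½ = τ ln 2. Here is a minimal Python sketch checking that the two readings of the same decay law agree numerically (the iodine-131 figure is just an illustrative assumption, not something from the thread):

```python
import math

half_life = 8.02               # days; roughly iodine-131, purely illustrative
tau = half_life / math.log(2)  # time constant: half_life = tau * ln 2

def remaining_fraction(t):
    """Fraction of the original atoms not yet decayed after t days."""
    return math.exp(-t / tau)

# After one half-life, half the sample remains:
print(remaining_fraction(half_life))  # 0.5 (up to floating point)

# At the initial decay rate of 1/tau per day, extrapolated linearly,
# the whole sample would be gone at t = tau -- the time-constant reading:
print((1.0 / tau) * tau)              # 1.0, i.e. the entire sample
```

Rerunning with any other half-life gives the same two checks, since only the ratio t/τ matters.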
Thanks Spinningspark. Also the OP may also find it useful to consider the notion that an experiment to measure a rate constant of radioactive decay can have a very low p-value. SemanticMantis (talk) 16:50, 5 February 2011 (UTC)

Thanks for the direct answer, SpinningSpark. I appreciate the example using Boyle's Law. I think the crux of your answer is found where you say "The half-life time is merely the time for which the probability is one half." Maybe it's a mental wall when it comes to science that makes this phrase fly right over my head. What does this mean? schyler (talk) 16:48, 5 February 2011 (UTC)

Well, consider coin-tossing. The probability of heads is one-half on any one toss. This is not the same as claiming that the coin will land heads. The claim is that over a large number of tosses the proportion of heads will be very close to half. We could state this in terms of time; let's say the coin is tossed at regular time intervals. There will be a definite time in which the probability of getting heads is one half. How long it takes to actually get heads, however, could be anything, since any number of tosses, in principle, can come up tails. SpinningSpark 16:59, 5 February 2011 (UTC)
It means that after a half-life time has passed there is a 50% chance that any given atom will have decayed and a 50% chance that it won't have decayed. If there is only one atom in your hand then it is anyone's guess what may actually happen. If you have trillions of atoms then about half of them will have decayed and the other half won't have decayed. Dauto (talk) 17:58, 5 February 2011 (UTC)
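Dauto's "trillions of atoms" point is easy to check with a quick Monte Carlo sketch in Python (the atom count, half-life, and random seed are all arbitrary assumptions for illustration):

```python
import random

random.seed(1)
HALF_LIFE = 10      # time steps; arbitrary illustrative choice
N_ATOMS = 100_000
# Per-step survival probability chosen so half the atoms survive HALF_LIFE steps:
p_survive = 0.5 ** (1 / HALF_LIFE)

alive = N_ATOMS
for _ in range(HALF_LIFE):
    # Each atom decays independently; no atom "remembers" how old it is.
    alive = sum(1 for _ in range(alive) if random.random() < p_survive)

print(alive / N_ATOMS)  # close to 0.5, though no single atom was predictable
```

Rerunning with a different seed changes which atoms survive, but the surviving fraction stays near one half.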
The real interesting aspect about half lives is that they are truly probabilistic as far as we know: you can't predict, at all, when a single atom is going to undergo decay. But on the aggregate, with a large number of atoms, you can say, "well, half of them will decay in a year." It's one of the interesting facets of statistical knowledge. The same principle can be applied to a lot of other probabilistic functions: I can't tell you whether any individual cigarette smoker will or won't get lung cancer, but I can tell you that out of a certain population of smokers, a given percentage will get lung cancer. (This is what is generally meant by describing risk factors—I can't tell you that surfing will cause you to get bitten by a shark, but I can tell you that out of X surfers per year, Y% get bitten by sharks.) In the case of cancer, it's because the chances of individual cancers are probabilistic in some sense (whether a given cell becomes a tumor), and also because there are other, complicated factors involved (like genetics, and other environmental hazards). But statistics lets you get away from worrying about the direct causes, and instead come up with observations which seem in many cases to act like iron-clad laws. When people started keeping solid records on mortality statistics, they were shocked that they could predict, with reasonable accuracy, exactly how many suicides would occur in a given nation per year, for even though suicide seemed like an entirely individual phenomenon whose causes differed from person to person, over the aggregate they become as predictable as, say, flipping coins.
Exactly what kind of knowledge this statistical worldview really is was a major 19th century debate amongst scientists and philosophers (in the context of thermodynamics, specifically). Some said, "this is the root of all true knowledge, because it doesn't require us to pretend to understand the direct causes of everything in the world, just the outcomes." The others said, "this is just a crude way of understanding the world, in the aggregate, and is not a substitute for the true understanding of cause and effect." I'm not sure who won out in the end, personally — the debate seems to have just drifted out of consciousness in the 20th century, perhaps (one might guess) because probabilistic thinking became so abundantly popular in epidemiology and physics in particular. But it does occasionally surface in regards to what "risk factors" really mean, and the benefits and perils of what is sometimes called "statistical medicine".
Your larger question, about scientific consensus, is somewhat different. On the one hand, one very clever experiment can occasionally disprove a thousand other results, if it is really unambiguous. Such experiments seem rather rare. Galileo's observation of the phases of Venus might be one of them — it basically made classical (e.g. Ptolemaic) geocentrism totally impossible to argue in favor of, no matter how many other clever arguments or experiments you had. (And indeed, the Church's astronomers recognized this almost immediately, and switched over to the Tychonic system.) In most cases, though, such amazing, paradigm shifting experiments are pretty rare in science. More often you get slow building of consensus, or a slow descent into uncertainty. In any case, though, I don't think that what is considered to be scientific consensus is a matter of the raw number of articles saying one thing or another — more important is exactly who is making a given claim. The article of a scientist known to be reliable is usually given far more weight than a dozen articles from unknowns.
As for whether people confuse Science with Truth; indeed, it is a feature of our age that the two are considered quite synonymous in the minds of many. Whether they should be is a heady question that is best addressed separately from discussions of half-lives, though. --Mr.98 (talk) 21:22, 5 February 2011 (UTC)

Thanks all for your time and patience. I have learned something new to argue about ;) schyler (talk) 23:19, 5 February 2011 (UTC)

I know you said this was resolved, but Mr. 98 raises some very interesting points to ponder, and his discussion over the statistical predictability of suicide reminded me a bit of Isaac Asimov's Foundation series, especially Psychohistory (fictional), which was basically the same concept, but applied to all of humanity. It's an interesting thing to ponder: the ability to predict human behavior in large groups of people even if individually they behave rather randomly and unpredictably. Asimov uses the idea fictitiously, but it's a real part of scientific philosophy, especially as it relates to the "soft" sciences like psychology and sociology and anthropology. --Jayron32 00:55, 6 February 2011 (UTC)
Actually, we're at ground zero for some interesting psychohistory, e.g. User:Wnt#Psychohistory_of_Wikipedia. But the idea behind half-life is really much simpler than that. The reason why it works as it does is that atoms are very small, simple things, which are incapable of memory. If you have an ingot of 1% U-235 or 0.0001% U-235 the atoms decay at the very same rate, because they don't know what ingot they're part of - provided, that is, they're not part of a large 100% U-235 ingot that blows up, because once the atoms start talking to each other via abundant neutrons they are no longer quite so simple. But when half-life applies, it also means mathematically that 1/4 or 3/4 or 1/1000 of atoms will all decay within certain intervals that can also be calculated, simply based on logarithms from the half-life; thus the chance that an atom breaks down at any given "instant" is also constant. Now, I can understand the visceral desire to object to probabilistic laws of physics - after all, Einstein himself made the "God doesn't play dice" statement - but a great deal of physics on this level works in just such a way. Wnt (talk) 15:48, 6 February 2011 (UTC)
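Wnt's point that other fractions follow from the half-life via logarithms can be written as t = t½ · log2(1/f), where f is the fraction still remaining; a minimal Python sketch (the 10-unit half-life is an arbitrary assumption):

```python
import math

def time_until_fraction_remains(half_life, fraction):
    """Time after which `fraction` of the original atoms remain (any time units)."""
    return half_life * math.log2(1.0 / fraction)

# With a 10-unit half-life: 1/4 remains after exactly 2 half-lives,
# and 1/1000 remains after roughly 10 half-lives (since 2**10 = 1024):
print(time_until_fraction_remains(10, 0.25))    # 20.0
print(time_until_fraction_remains(10, 1/1000))  # about 99.66
```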

I suspect the OP will continue to have arguments until it is understood that half-life is merely a statistical measure that is often useful to characterise a large number of discrete events. It is not part of a nefarious conspiracy by scientists to substitute science for truth, nor does the "verify by experiment" phase of the Scientific Method work like democratic voting. A hypothesis that gets disproven 49% of the time is unconvincing. Cuddlyable3 (talk) 18:28, 6 February 2011 (UTC)

The "51%" or "49%" is actually misleading. The hypothesis is not that each and every nucleus will decay. The hypothesis is that approximately 50% of the nuclei will decay, according to a certain statistical distribution. This hypothesis has not been disproved. Wnt (talk) 20:38, 6 February 2011 (UTC)
How about a similar, simpler analogy: if I take 100 coins and flip them all, close to 50% will land heads up. I can't tell you which, but it's obvious this is a scientifically valid hypothesis that would be validated (or falsified) by the experiment. No appeal to truthiness required. 23:49, 7 February 2011 (UTC)
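That 100-coin experiment is simple to simulate (the seed and the 1,000-run repeat count are arbitrary choices for illustration):

```python
import random

random.seed(0)

def flip_coins(n=100):
    """Flip n fair coins and return the number of heads."""
    return sum(1 for _ in range(n) if random.random() < 0.5)

one_run = flip_coins()
trials = [flip_coins() for _ in range(1_000)]
print(one_run)                    # somewhere near 50; which coins, unknowable
print(sum(trials) / len(trials))  # the average over many runs is very close to 50
```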

Help identifying a spider

This spider was found in an apartment in Sao Paulo, Brazil. Sorry for the lack of size context, not my image. Thanks! --Shaggorama (talk) 15:10, 5 February 2011 (UTC)

Multiple Chemical Sensitivity

I have Multiple Chemical Sensitivity. I was diagnosed in 1994 by Dr. Rea. But the issue is that my illness is very controversial. The vast majority of doctors I go to cannot treat me and just end up taking my money to do no real good. I went to the head of the evidence-based medical teachers board and asked him what he thought of MCS; his response was "I'm sorry, that is not an eb (evidence-based) question. Please refer to pico by scott richardson."

My three foot tall stack of medical files begs to differ. What is a non evidence-based question, anyways? How do I prove the existence of mcs? I have evidence coming out the wazoo. Where do I take it?

Where can I go to be the subject of medical research?

I can be contacted on windows live messenger - <email removed> —Preceding unsigned comment added by (talk) 19:13, 5 February 2011 (UTC)

I've removed your email address to protect you from spam, per reference desk conventions all answers are given here. Dragons flight (talk) 20:53, 5 February 2011 (UTC)
Have you looked at the article on multiple chemical sensitivity? It explains, in broad terms, why MCS can be a problematic diagnosis. As for getting help, you probably need to seek out a specialist in MCS (or perhaps allergies) who would be willing to discuss your concerns. The reference desk doesn't offer specific medical advice. Dragons flight (talk) 21:06, 5 February 2011 (UTC)
See evidence-based medicine. Basically what it means is that there should be good evidence from scientifically based clinical trials in order to engage in the treatment of a given condition. Some conditions are controversial because they are hard to define, or they lack an objective diagnostic test that can definitively demonstrate the presence or absence of that condition in a given person, or because there is no treatment that has been clearly shown to be efficacious. The NIH has a database of clinical trials that lists one study that seems to be about multiple chemical sensitivity. That particular study is based in Copenhagen, so it may not be possible for you to participate in, but perhaps the investigators would be able to help you out further. --- Medical geneticist (talk) 21:21, 5 February 2011 (UTC)

Female and male squirrels: are they different colors?

I would like to know if female squirrels are different colors than male squirrels. It's an American red squirrel. —Preceding unsigned comment added by (talk) 21:32, 5 February 2011 (UTC)

Not that I know of. The article American Red Squirrel doesn't mention anything about sexual dimorphism. You may be confusing different species of squirrel, such as a Fox Squirrel or an Eastern Gray Squirrel; or you may have spotted a variety of American Red with different coloration; the Wikipedia article mentions a black and a white phase of the American Red Squirrel, though there is a [citation needed] tag on the statement. --Jayron32 00:47, 6 February 2011 (UTC)

What's up with Yellowstone right now?

I heard a few references to Yellowstone in mass media in the past few days. What is going on? Is there some quantitative knowledge about the risk, in light of some new data? I didn't find references to newer events in Yellowstone Caldera and Yellowstone National Park. --Mortense (talk) 23:19, 5 February 2011 (UTC)

This article was published in National Geographic. This has been picked up by media outlets around the world and given the usual 'it's all going to go boom, AAAAAAARGGGH, run run for your lives' spin. That's about it. 23:34, 5 February 2011 (UTC)
This is one of those confusions between "soon" on a geologic time scale and "soon" on a human time scale. "Any day now" has a range of +/- 1000 years on the geologic time scale. We've known most of this stuff about the Yellowstone Caldera for a long time now; I think the History Channel did a special on it 5-6 years ago at least. --Jayron32 00:42, 6 February 2011 (UTC)
Are you sure it wasn't Jellystone Park? There's a movie on in my neighbourhood that's making it famous again. HiLo48 (talk) 04:20, 6 February 2011 (UTC)
The Yellowstone system has occasional lava outbursts. We are in an earthquake cycle of the Pacific Ring of Fire. The probability of a lava outburst is higher now. --Chris.urs-o (talk) 04:34, 6 February 2011 (UTC)
That's an intriguing statement. Has someone, presumably someone who doesn't subscribe to the mantle plume model, found a correlation between activity around the edge of the Pacific Plate and activity at the Yellowstone hotspot ? Sean.hoyland - talk 06:45, 6 February 2011 (UTC)
The correlation is between big earthquakes and leaking magma chambers. --Chris.urs-o (talk) 14:57, 6 February 2011 (UTC)
It is just media sensationalism at work. CNN had a story a few weeks back that said that Betelgeuse would go supernova soon and people were freaking out even though soon is defined as sometime in the next 10,000,000 years. Googlemeister (talk) 15:10, 7 February 2011 (UTC)