Wikipedia:Reference desk/Archives/Science/2018 February 17

Science desk
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


February 17

"Molecules of light" -- break this stuff down for us[edit]

This news talks about having "molecules" of three photons bound together because they "remember" interactions with an atom they passed. The original paper is presumably this at ArXiv; I didn't check to see if it came out in Science yet. I tried to get a clue here about photonic molecules before ([1]) but this seems like a good time for another shot at it. Besides, I have more questions...

  • There is much mention of "Rydberg polaritons". But what kind of polariton is that - is it synonymous with one of the half dozen kinds listed in our article?
  • I would assume a bound photonic molecule would have less energy than the free photons. Some math is on pages 6–7, with the ultimate conclusion that the binding energy is around 10^10 times less than that of ordinary covalent bonds. However, the trimer binding energy is 4 times the dimer binding energy. Does this imply that with larger photonic "molecules" you could create much stronger binding energies as the number of combinations increases? Can you make it so that there is a net zero energy for a large photon assemblage? (Some rough numbers are sketched after this list.)
  • The authors say (page 8): "A threefold increase in the atomic density would render the interaction potential sufficiently deep for a second bound state to appear near zero energy, which should result in resonant photon-photon scattering and a tunable scattering length ([ref 22])" What does that mean?
  • How does the "mass of a photon" increase in proximity to matter? Why can a dilute gas create "slow light" when dense assemblages of matter have low refractive indices that indicate only a minor slowing of light??
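
For a rough sense of scale, here is a back-of-envelope conversion of that 10^10 figure. It assumes a typical covalent bond energy of about 4 eV, which is not a number quoted in the paper, just a generic chemistry value:

```latex
E_{\mathrm{bind}} \sim 10^{-10} \times 4\,\mathrm{eV} \approx 4\times10^{-10}\,\mathrm{eV},
\qquad
\nu = \frac{E_{\mathrm{bind}}}{h} \approx \frac{4\times10^{-10}\,\mathrm{eV}}{4.14\times10^{-15}\,\mathrm{eV\,s}} \approx 10^{5}\,\mathrm{Hz},
\qquad
T = \frac{E_{\mathrm{bind}}}{k_B} \approx 5\,\mu\mathrm{K}
```

i.e. a binding energy on the frequency and temperature scales of ultracold-atom experiments rather than of chemistry.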

Wnt (talk) 15:27, 17 February 2018 (UTC)[reply]

I've confirmed the abstract is the same as that of the free ArXiv article I cited above. Wnt (talk) 17:31, 18 February 2018 (UTC)[reply]

Multiple shootings statistics

Does anyone know where I can find statistics on shootings by the number of victims? For example, how often do shootings involve 1 victim, 2 victims, 3 victims, 4 victims, etc.? Dragons flight (talk) 19:27, 17 February 2018 (UTC)[reply]

This looks promising, but the information isn't set as a table, but rather as graphs. It does, however, break things down by various criteria in interesting ways. Matt Deres (talk) 20:02, 17 February 2018 (UTC)[reply]
List of serial killers by number of victims may be helpful if you're willing to sort out those which involved firearms. Akld guy (talk) 21:14, 17 February 2018 (UTC)[reply]
Don't think that helps. I'm interested in the distribution of the number of victims in single shooting attacks. Dragons flight (talk) 21:17, 17 February 2018 (UTC)[reply]
Yes, those would more typically be classified as "spree" or "rampage" killings, of which we have a number of articles to start research: List of rampage killers, List of rampage killers (Americas), List of rampage killers (school massacres), List of school-related attacks, and School shootings in the United States - among many others, but they are mostly targeted to a particular sub-topic or only involve murders. The original question asked for "victims", which I would read as "casualties" - hence my initial reference. Matt Deres (talk) 21:57, 17 February 2018 (UTC)[reply]
The United States Federal Bureau of Investigation publishes the Uniform Crime Reporting ("UCR") website, including extraordinarily complicated and detailed statistics, summarized in its annual publication "Crime in the United States". This publication is widely considered the authoritative source. However, you practically need a degree in law and statistics to decipher it.
Although it might seem trivial to aggregate statistics from such incidents, it's actually very difficult: when do you consider a crime to have factually occurred? A common gut-feeling response might be to believe that an event has occurred once you've read about it on the news; but holding ourselves to the highest standard of aggregation of legal data, a fact is true if and only if the evidence of that crime has been proven in a court. The next troublesome problem is that crimes, laws, and courts differ across the entire United States, so the F.B.I.'s UCR methodology seeks to bring some coherency to the different reporting conventions.
Perhaps a few concrete, controversial cases can highlight the difficulty: consider this 2009 shooting in Oakland - will it be included as a gun crime in your data set? A conviction was obtained and the defendant served time in prison. Or, equally troublesome: how about this shooting in 2015 in San Francisco: no conviction was obtained; it was the opinion of a jury, and the outcome of the criminal justice system, that the 2015 shooting was not a homicide. It was not even manslaughter. In fact, no crime was committed pertaining to the death of any person, even though a crowd of people watched a person get shot.
No matter how you feel about those potentially controversial outcomes, if you simply treat them as statistics, one event was a shooting, and one was not. In a very large database, imagine the confounding factors! Every time a firearm is illegally present in a motor vehicle during a fatal car accident, a court may convict for a firearm aggravation on an involuntary manslaughter charge. For example, consider this 2010 case in San Pablo, in which the defendant was convicted of murder. What if the hit-and-run had killed 4 and injured 6 - as this one did? It meets the bar for a "mass killing"! And if, hypothetically, an illegal firearm were also found in that vehicle - would that be a firearm aggravation, a conviction for murder, with multiple fatalities? How would that show up in your database? The court only convicts for homicide, because although homicide and other gun crimes are illegal, there is no crime of "mass shooting" to convict for.
As you begin to aggregate a "big data" database, it would be a fallacy to assume that you can ignore such incidents as outliers. The only trustworthy facts are the ones that are tried in court; and if you aggregate those facts instead of scrutinizing each individual case, the aggregate summaries are going to be pretty obscured by legalese detail.
To quote the FBI: "Figures used in this Report were submitted voluntarily by law enforcement agencies throughout the country. Individuals using these tabulations are cautioned against drawing conclusions by making direct comparisons between cities. Comparisons lead to simplistic and/or incomplete analyses that often create misleading perceptions adversely affecting communities and their residents. Valid assessments are possible only with careful study and analysis of the range of unique conditions affecting each local law enforcement jurisdiction. It is important to remember that crime is a social problem and, therefore, a concern of the entire community. In addition, the efforts of law enforcement are limited to factors within its control. The data user is, therefore, cautioned against comparing statistical data of individual agencies. Further information on this topic can be obtained in Uniform Crime Reporting Statistics: Their Proper Use."
If this seems unnecessarily pedantic, you can let a journalist or editorial publication do the difficult interpretation for you. For example, the journalism publication Mother Jones publishes data and editorial commentary on that data. They have applied their own subjective standard to decide which criminal cases constitute a "mass shooting." Their standard is significantly different from the standards applied by many other journalistic publications, as they explain in their editorial on that topic. This is why the New York Times counted 355 mass shootings during 2017, while Mother Jones counted only four.
If you really want your head to spin - have a read through this act of Congress, one that is commonly cited by newspapers as an authoritative definition of "mass killing" ("3 or more killings in a single incident").
Strikingly, this law does not make mass killing illegal. (Homicide is already illegal, most of the time, and this law does not alter those crimes.) Rather, it makes it legal to use federal dollars to investigate those crimes, on a case-by-case basis.
Nimur (talk) 22:24, 17 February 2018 (UTC)[reply]
@Nimur: That is a good answer, but only where a physical count of all the victims is concerned. The original question was one of statistics, which I think might be obtained with less effort. If you can get a list of all the people convicted of homicide in a jurisdiction, you can look some up at random, one by one, decide if they meet your individual criterion for a shooting, and determine how many victims they had, thereby coming up with a bunch of stats. I think. You could do a cluster analysis, e.g. take small amounts of random data from states according to population or some other criterion, rather than relying on the completeness of voluntary reporting, if desired. Of course, shootings with larger numbers of victims are rare, so their probabilities could not be estimated reliably from a data sample (perhaps not even from the entire data set). Would you agree this is a viable option? Wnt (talk) 00:53, 18 February 2018 (UTC)[reply]
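A minimal sketch, in Python, of the sampling approach described above. Everything in it is hypothetical - the input file, the column layout, and the two judgement functions stand in for the manual, per-case decisions Wnt describes, not for any real data source:

```python
import csv
import random
from collections import Counter

def load_convictions(path):
    """Load one row per homicide conviction from a (hypothetical) CSV export."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def is_shooting(case):
    """Placeholder for the manual judgement: does this case meet your criterion for a shooting?"""
    raise NotImplementedError

def victim_count(case):
    """Placeholder for the manual judgement: how many victims does this case involve?"""
    raise NotImplementedError

def sampled_victim_distribution(convictions, sample_size, seed=0):
    """Estimate the victims-per-shooting distribution from a random sample of convictions."""
    random.seed(seed)
    sample = random.sample(convictions, min(sample_size, len(convictions)))
    counts = Counter(victim_count(c) for c in sample if is_shooting(c))
    total = sum(counts.values())
    return {k: v / total for k, v in sorted(counts.items())}
```

As noted above, shootings with many victims are rare, so the tail of a distribution estimated this way will be unreliable unless the sample (or the underlying list) is very large.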
Indeed, Wnt: that approach can be done, and has been done; but as you already said - it applies an individual's subjective criteria to filter the data; to say that some crimes "qualify" according to some standard or definition. If we start to enumerate the precise detailed criteria that define our subjective idea, we rapidly diverge - perhaps most clearly exemplified by the huge divergence between the two journalistic sources I linked earlier - one that counted 4 "mass shootings" in 2017, and another that counted 355 "mass shootings" in 2017. Both statistics are backed by a definition of the subjective term that sounds very reasonable and informed; yet they yield entirely dissimilar statistical results. This arbitrariness of definition is one reason why parsing the numerical data isn't "objective fact"; it's why the FBI reminds data analysts to proceed with care when looking at statistical summaries.
Nimur (talk) 04:30, 18 February 2018 (UTC)[reply]
Well, one source was trying to define a sort of "public mass shootings" that would exclude, for example, stuff that happens in gang neighborhoods -- which does indeed reek of a certain bias. But the existence of specialized definitions doesn't mean that a more generic definition is impossible. Wnt (talk) 04:41, 18 February 2018 (UTC)[reply]
As an aside, this WaPo story discusses the complexities surrounding defining 'school shootings' especially when you're not limiting yourself to those which involved injuries or death [2] Nil Einne (talk) 23:10, 18 February 2018 (UTC)[reply]
It would be surprising if the vast majority of shootings didn't involve a small number of victims. The mass shootings like the one in Florida would be outliers. ←Baseball Bugs What's up, Doc? carrots→ 02:28, 18 February 2018 (UTC)[reply]
There has been no gun violence research in the United States for the last 20 years because a very influential lobbying organization (the NRA) advocates against it.[3] --Kharon (talk) 04:10, 18 February 2018 (UTC)[reply]
As written, that's ludicrously inaccurate. The factoid that you are attempting to describe stems from the fact that the Centers for Disease Control and Prevention may not spend federal dollars to conduct advocacy about gun violence. But even that summary explanation is a dramatic over-simplification of the CDC's ability to fund and collect research. Here is an article explaining some of the subtleties: A 1996 bill has had a chilling effect on the CDC’s ability to research firearms, from The Atlantic. Legislation limits what the CDC can do - and in my opinion, some of that legislation is incredibly and unnecessarily restrictive; but it's not accurate to say that there is "no gun violence research" - at the CDC, and certainly not in the United States at large.
For example, here is CDC's Fast Stats webpage on firearm homicide.
Many other organizations do conduct research on gun violence. Nimur (talk) 04:34, 18 February 2018 (UTC)[reply]
The article you cited implies that part of the chilling effect was not just from what that legislation forbade, i.e. the actual restrictions imposed, but also from the fact that the CDC's funding was reduced by the amount that had been used for such research. The fear would likely have been either that further legislation would be introduced or, maybe more likely, that if they continued to fund much research from their reduced budget, they would again have had their funding cut by that amount. The WaPo article linked by Kharon suggests the chilling effect may have extended to other federal agencies, I presume due to a fear that something similar would have happened to them. Nil Einne (talk) 23:16, 18 February 2018 (UTC) 08:41, 19 February 2018 (UTC)[reply]
To many it seems very out of place for an organization that at least used to fly all over the world collecting viral samples, so they'd have a small chance at an effective response to a natural or man-made pandemic, to divert resources to collecting statistics on who shoots whom, let alone to affecting policy. Some would like to see a doctor at the bottom of a ski slope patching up the occasional broken leg or severed testicle, but not at the top with a taser telling the tourists they can't get on the lift. The second kind of doctor seems more like a master or a farmer, I think. And while the abhorrence of a chilling effect is certainly very commendable from a free speech point of view, it is a near-universal but awful fallacy to offer government the same freedom as the people. People decide whether to get married or not, but the county clerk doesn't get to decide whether to issue the certificate. Likewise, CDC officials don't get to say they were hired to do whatever they want - they can use their knowledge and evaluate competing scientific priorities, but only within the purview the taxpayers allowed them. Wnt (talk) 14:12, 19 February 2018 (UTC)[reply]
I'm not sure of the relevance of nearly all of what you said. The CDC is, AFAIK, still involved, as it has been for a while, in research and advocacy on smoking [4]; motor vehicle safety [5]; violence in many forms, including domestic violence (including sexual violence and teen dating violence) [6] [7], youth violence [8], child abuse and neglect [9], elder abuse [10] and suicide [11]; occupational safety and health [12]; drug overdose prevention [13]; and more, e.g. look at our article on the Centers for Disease Control and Prevention. Americans may or may not agree with the CDC being involved in these areas of research and advocacy. It may very well have been reasonable for Congress to undertake a wide review of precisely what areas of research and advocacy the CDC was involved in. But they did not do so; instead they restricted only one specific area. (Well, I'm sure they also rejected funding for other areas [14], but it seems clear they never applied the same 'don't you dare research this' chilling effect that they did for guns.) Likewise, I don't think anyone suggested random people in the CDC be allowed to research random things using government money completely of their own volition. But that is entirely different from preventing the high-level bureaucrats from making decisions about what the funding they have been allocated should be used for, especially if it directly intersects with existing areas of research like the aforementioned research on violence [15]. Or, for that matter, from even asking for funding because they feel it directly relates to their mission and to what they are allowed to research, since daring to ask is likely to cause major problems. (Politicians have of course asked [16] [17] for funding, but it also seems clear, e.g. [18], that the CDC people themselves know that, whatever they personally feel, it would be a mistake to get involved.) For that matter, while individuals do have to focus on whatever it is they are researching, completely preventing them from touching an area of research which is a key part of what they are looking at, because it's a political hot potato and the agency may lose funding simply because they dared to look at it, is also another thing. (I haven't looked into the details on the advocacy restrictions, but I presume that as it stands, if research suggested that limiting paracetamol or weed killer or even rope sales would reduce suicides, they are free to advocate for that. But if they dare to advocate that parents should be required to have a gun safe because research suggests it would reduce suicides, they're in major shit. In fact the PRI article suggests their researchers are even afraid that if they simply report that they found that having a gun in your home makes you more likely to be injured, they could be accused of breaking the law.) Likewise, if a county clerk refuses to do their job, this is an issue. If they do things which are illegal for good reason, this is again an issue. But if, as part of doing their job, they realise it's beneficial to do something else as well, and this improves their ability to do their job and doesn't harm anyone, there's no reason why this should be an issue. Maybe this is some weird US thing, but in most of the developed progressive world people have found that while politicians may set the tone, and intervene in special cases etc., it's often best if they don't get too heavily involved in specific funding decisions, which are best made by the bureaucrats and researchers. (Which is not to say it never happens.)
In other words, if Congress really feels that, despite all the other areas of research the CDC is involved in, they don't want to specifically provide funding for gun research, that's questionable but okay, whatever. But to make it such that the CDC is extremely scared of any of the funding it does have for areas where guns intersect being used for gun research, that's a different matter. And where they could be penalised for daring to speak about one specific thing their research showed, well..... Nil Einne (talk) 05:57, 21 February 2018 (UTC)[reply]
FYI, I ended up parsing through multiple years of data from the Gun Violence Archive [19]. Not necessarily a perfect source, but good enough for my purposes. If anyone is interested, of the shooting incidents in their archive where at least one person died, 93.4% involved only 1 death, 5.1% involved 2 deaths, 1.1% involved 3 deaths, 0.22% involved 4 deaths, 0.067% involved 5 deaths, 0.020% involved 6 deaths, and 0.024% involved more than 6 deaths. Dragons flight (talk) 16:37, 19 February 2018 (UTC)[reply]
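A minimal sketch of that kind of tabulation, assuming a CSV export with one row per incident and a numeric column for the number killed. The file name and column name below are placeholders, not the Gun Violence Archive's actual schema:

```python
import csv
from collections import Counter

def death_distribution(path, killed_column="n_killed"):
    """Tally fatal shooting incidents by number of deaths and return percentages."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            killed = int(row[killed_column])
            if killed >= 1:                  # keep only incidents with at least one death
                counts[min(killed, 7)] += 1  # lump 7 or more together as "more than 6"
    total = sum(counts.values())
    return {k: round(100.0 * v / total, 3) for k, v in sorted(counts.items())}

# e.g. death_distribution("gva_incidents.csv") would return something like
# {1: 93.4, 2: 5.1, 3: 1.1, 4: 0.22, 5: 0.067, 6: 0.02, 7: 0.024}
# (the shape of the figures Dragons flight reports above).
```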
I know I must sound like I'm spreading FUD, but if you look at the methodology that Gun Violence Archive describes, they include at least one category described this way: "Included incidents may or may not have had a firearm fired."
Evidently even the "Gun Violence Archive", using a team of 200 professional humans to carefully sort and aggregate their data, cannot assure that every statistic collected corresponds to a crime or other event that a reasonable person would call "gun violence"!
If anything, this isn't an indictment of their skill or effort; on the contrary, they appear to be methodical and earnest in their efforts. Yet they still cannot always be sure that a reported event actually involved firing a gun! This only further strengthens my case: collecting this type of data accurately, at scale, is really non-trivial.
...and after all this, you either preserve or sacrifice your integrity as a statistician... and for what purpose? Do you think that correct and valid statistics will sway opinion on gun policy in the United States? Bringing valid scientific data to illuminate this issue is about as useful as bringing the proverbial knife to the gunfight.
Nimur (talk) 16:11, 20 February 2018 (UTC)[reply]
I don't see any reason to assume DF has any interest in the data other than for personal reasons. After all, they don't even live in the US AFAIK. But anyway, while it's probably true that quality data would make no difference, there is probably also a reason why there is such fierce opposition to there being any quality data. Nil Einne (talk) 05:49, 21 February 2018 (UTC)[reply]