Wikipedia:Reference desk/Humanities: Difference between revisions

From Wikipedia, the free encyclopedia


Profit(price)/item = price - ( FC/SV(price) ) - MC

Average Cost per item = ( FC/SV(price) ) + MC

So we have

Profit(price)/item = price - Average Cost per item


So you see it's very easy. All you have to do is find the mathematical function SV(price), the amount sold at each price
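The per-item relation above can be sketched numerically. This is a minimal illustration only: the fixed cost FC, marginal cost MC, and the linear demand function SV(price) below are hypothetical values chosen for the example, not figures from the discussion.

```python
# Sketch of the profit-per-item relation above.
# FC, MC, and the demand function SV(price) are hypothetical assumptions.

FC = 10000.0   # fixed cost
MC = 2.0       # marginal cost per item

def SV(price):
    """Assumed sales volume at a given price (simple linear demand)."""
    return max(1.0, 5000.0 - 400.0 * price)

def average_cost_per_item(price):
    return FC / SV(price) + MC

def profit_per_item(price):
    return price - average_cost_per_item(price)

# Check: profit_per_item(price) == price - FC/SV(price) - MC, as stated above.
for price in (5.0, 7.5, 10.0):
    assert abs(profit_per_item(price) - (price - FC / SV(price) - MC)) < 1e-9
```

With these assumed numbers the break-even price can then be found by scanning `profit_per_item` over a range of prices; the point of the original post is that everything hinges on knowing SV(price).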

Revision as of 01:34, 24 March 2010

Welcome to the humanities section of the Wikipedia reference desk.
Want a faster answer?

Main page: Help searching Wikipedia

   

How can I get my question answered?

  • Select the section of the desk that best fits the general topic of your question (see the navigation column to the right).
  • Post your question to only one section, providing a short header that gives the topic of your question.
  • Type '~~~~' (that is, four tilde characters) at the end – this signs and dates your contribution so we know who wrote what and when.
  • Don't post personal contact information – it will be removed. Any answers will be provided here.
  • Please be as specific as possible, and include all relevant context – the usefulness of answers may depend on the context.
  • Note:
    • We don't answer (and may remove) questions that require medical diagnosis or legal advice.
    • We don't answer requests for opinions, predictions or debate.
    • We don't do your homework for you, though we'll help you past the stuck point.
    • We don't conduct original research or provide a free source of ideas, but we'll help you find information you need.



How do I answer a question?

Main page: Wikipedia:Reference desk/Guidelines

  • The best answers address the question directly, and back up facts with wikilinks and links to sources. Do not edit others' comments and do not give any medical or legal advice.



March 18

Looking for a science book...

I read somewhere a while back about a book written by either a journalist or a scientist that described what the earth would look like if humanity suddenly disappeared off the face of the planet. I believe it described how long buildings would take to fall, land to be reclaimed, wild animals to come back and breed, that sort of thing. I'm also fairly certain it was quite popular/well-known and got a lot of reviews. Could anyone help out? Skinny87 (talk) 06:57, 18 March 2010 (UTC)[reply]

The World Without Us? Clarityfiend (talk) 07:09, 18 March 2010 (UTC)[reply]
Cheers, that's the one! And a Featured Article to boot! Skinny87 (talk) 07:37, 18 March 2010 (UTC)[reply]
There's a similar TV series on the History Channel titled Life After People. --Jayron32 14:14, 18 March 2010 (UTC)[reply]

traditional Francophone elite

How is Egypt a traditional Francophone elite? —Preceding unsigned comment added by 76.64.53.214 (talk) 15:23, 18 March 2010 (UTC)[reply]

This site refers to "the deep anchoring in Egypt of French cultural presence", and states: "Relations between Egypt and France date back to Napoleon Bonaparte's Expedition in 1789 which left its imprints on all aspects of Egyptian life. As time went on and with the ascension to power of Mohammad Ali The Great in 1805, ties of friendship deepened between the two countries. Egyptian missions were sent to France to specialize in modern sciences and fine arts." But there must be more to it than that. Ghmyrtle (talk) 15:39, 18 March 2010 (UTC)[reply]
Doesn't mention the Suez canal which dominated Egypt's relations with France and Britain from the 1850s until the Suez Crisis of 1956. It's probably still a touchy subject in Egypt. Alansplodge (talk) 17:16, 18 March 2010 (UTC)[reply]
As for the original question, I don't understand it. An elite is usually a group, particularly a group of people. I don't see how a single country can be a group of any kind. Beyond that, it is unclear what "a traditional Francophone elite" might be (as opposed, for example, to other kinds of Francophone elites). As recently as the 19th century, one might, I suppose, have referred to the European aristocracy as "a traditional Francophone elite", but I can't think of another way to apply that phrase. If the question is really "Does Egypt have a tradition of a Francophone elite?", then the responses above might help to answer that question. However, Egypt was dominated by Britain, not France, from 1882 until the 1950s, and Egypt's current elite are much more likely to speak English than French. Marco polo (talk) 17:56, 18 March 2010 (UTC)[reply]
I believe the question comes from the Organisation internationale de la Francophonie article. See Egypt's entry in the membership table. I don't understand that particular statement either. Maybe the original author there meant "ally". --Kvasir (talk) 18:17, 18 March 2010 (UTC)[reply]
This thread may be of interest. Apparently, up until the mid-1900s, the Egyptian elite tended to learn French (perhaps in addition to English and of course Arabic). This seems no longer to be true. So one could say that Egypt had a traditional Francophone elite, but an elite that had learned French as a second or third language. Marco polo (talk) 19:30, 18 March 2010 (UTC)[reply]

fishing sector of Canada

Is there a website where I can find all the aspects of the fishing sector in Canada like the present situation: facts and figures, economic strengths or successes, problems faced, main issues and controversies, policies adopted and implemented by the government and the extent of success and failure of these policies? —Preceding unsigned comment added by 76.64.53.214 (talk) 15:31, 18 March 2010 (UTC)[reply]

The Fisheries and Oceans Canada site would be a good place to start: http://www.dfo-mpo.gc.ca/index-eng.htm --Kvasir (talk) 15:37, 18 March 2010 (UTC)[reply]
A couple of Wikipedia pages include: Fishing industry in Canada (really poor article), Cod#Endangered-species controversies in Canada and Europe and No More Fish, No Fishermen. Rmhermen (talk) 15:49, 18 March 2010 (UTC)[reply]

Jeffrey Dahmer's mugshots

Question has been moved to Wikipedia:Help Desk#Jeffrey Dahmer's mugshots. Comet Tuttle (talk) 23:44, 18 March 2010 (UTC)[reply]


March 19

King of Tavolara

In 1836, King Charles Albert of Sardinia made a shepherd, Giuseppe Celestino Bertoleoni Poli, King of Tavolara. Did he have the right to do that? Wasn't it only the Pope who could create someone a King, e.g. Stephen I of Hungary? And it also wouldn't make sense for a king to grant someone a title equal to his own. I thought kings only created Dukes, Counts, Princes, etc. --Queen Elizabeth II's Little Spy (talk) 00:02, 19 March 2010 (UTC)[reply]

Similarly, the Cocos (Keeling) Islands were granted in perpetuity to the Clunies-Ross family by Queen Victoria. However, the "king" title was self-styled by the head of the family. It's essentially a fiefdom. --Kvasir (talk) 00:08, 19 March 2010 (UTC)[reply]
Newly formed kingdoms used to have monarchs elected by a body, not the Church. The Great Powers at the London Conference of 1832 elected Otto of Greece, originally a Bavarian Prince, to become the king of the newly formed Kingdom of Greece. He was later deposed and the Greek National Assembly elected a Danish prince to become king. The newly formed Belgian National Congress elected yet another Bavarian prince to be their first king when the country became independent from the Netherlands. Of course, one of the most notable examples is Napoleon crowning himself as Emperor. --Kvasir (talk) 07:19, 19 March 2010 (UTC)[reply]
The Pope crowned Charlemagne as emperor.
Sleigh (talk) 11:57, 19 March 2010 (UTC)[reply]
Under what code of laws do you suppose Charles Albert might not have had the right to do that? If you're a king (or a government) you can do more or less anything that the other polities will let you get away with. International law hasn't been around that long, at least in Europe. --ColinFine (talk) 18:26, 19 March 2010 (UTC)[reply]
The OP may be a bit confused because the pope traditionally crowned Emperors; the pope is Pontifex Maximus, i.e. chief priest of the Roman Empire, and as such is given the right to crown the Emperor; in medieval times this meant the Holy Roman Emperor. Kings, on the other hand, have either existed since "time immemorial" (like England, France, Spain, etc.), where the kingship can be traced back to the military leaders of major confederations of Germanic tribes (Anglo-Saxons, Franks, Visigoths, etc.), or were created by Emperors or even other kingdoms; for example, Ottokar I had his Duchy, Bohemia, elevated to a Kingdom as a reward for giving military aid to the Holy Roman Emperor (see Golden Bull of Sicily). Other times kings just up and declare themselves to be kings: King Zog I apparently got tired of merely being the President of Albania, so he had the constitution rewritten and made himself king. And then there are the kings assigned by the various Congresses of Europe. Thus, for example, when the Ottoman Empire was driven out of Europe, Otto, a Bavarian Prince, was placed in charge of the new Kingdom of Greece. --Jayron32 02:47, 20 March 2010 (UTC)[reply]

Crown Prince of Denmark

Who was the first Crown Prince of Denmark? When did Denmark start having crown princes instead of hereditary princes?

See List of Danish monarchs and Crown Prince. Crown princes are hereditary princes, and Denmark has always been an hereditary monarchy (and still is). To answer your question, the first (recorded) crown prince of Denmark was Sigfred, son of Ongendus, the first king of Denmark, who ruled in about 720 AD. Tevildo (talk) 09:28, 19 March 2010 (UTC)[reply]
Denmark has only been constitutionally a hereditary monarchy since 1660. Prior to that time, monarchs were elected and their children, including their eldest sons (even Hamlet), were not princes of Denmark since they had no hereditary right to the throne. Instead, from the accession of the House of Oldenburg to the Danish throne, the Danish kings' sons and their male-line descendants were Dukes of Schleswig and Holstein, at first dividing those fiefs up among the numerous cadet branches, but later simply bearing the title Duke/Duchess (not Prince/Princess) of Schleswig-Holstein or of Holstein-Gottorp, while the head of their dynasty, the King of Denmark, actually ruled the twin duchies. A crown prince is a form of hereditary prince, although the former term is more commonly used for the heir apparent to an empire or kingdom than to a grand duchy, sovereign duchy or sovereign principality. In Scandinavia, the term Hereditary Prince came to be used to refer to the heir apparent's heir apparent (i.e. usually the eldest son of the king's eldest son) or, when there was no heir apparent, the heir presumptive was accorded the title of Hereditary Prince. When Frederick IX of Denmark's daughters were given succession rights in 1953, his younger brother Prince Knud of Denmark ceased being heir presumptive de jure but continued to retain the style of "Hereditary Prince" (Arveprins) for his lifetime as a courtesy title. FactStraight (talk) 11:50, 19 March 2010 (UTC)[reply]

Original & Translation version of Gilgamesh

Does anybody know of a print version of the Epic of Gilgamesh which includes the original as well as a translation in parallel, such as what the Loeb Classics series does? Cevlakohn (talk) 07:29, 19 March 2010 (UTC)[reply]

The original cuneiform? The epic we read in translation is a synthesized version patched together out of various tablets. So there's no single authoritative "received text" such as we have with the Greek and Latin canon in the Loeb editions. --Wetman (talk) 12:37, 19 March 2010 (UTC)[reply]
The Penguin edition translated and edited by Andrew George is pretty good. Images of the original Sumerian or Akkadian tablets may be seen mainly in specialist journals. New tablets (or fragments) used to appear routinely, but the recent hostilities in the region may delay new recoveries. Weepy.Moyer (talk) 20:27, 19 March 2010 (UTC)[reply]

services industries

Why are service industries generally very competitive in the world today?

Please do your own homework.
Welcome to the Wikipedia Reference Desk. Your question appears to be a homework question. I apologize if this is a misinterpretation, but it is our aim here not to do people's homework for them, but to merely aid them in doing it themselves. Letting someone else do your homework does not help you learn nearly as much as doing it yourself. Please attempt to solve the problem or answer the question yourself first. If you need help with a specific part of your homework, feel free to tell us where you are stuck and ask for help. If you need help grasping the concept of a problem, by all means let us know. Comet Tuttle (talk) 16:41, 19 March 2010 (UTC)[reply]
Service industry leads to a couple of other articles which could lead to further information. However, Google is probably a better bet. ←Baseball Bugs What's up, Doc? carrots 16:48, 19 March 2010 (UTC)[reply]

sales revenue and employees

in 2001, General Motors was ranked 1st in sales revenue and 32nd in number of employees. McDonald's was ranked 108th in sales revenue and 4th in number of employees. What might explain these differences? —Preceding unsigned comment added by 76.64.53.214 (talk) 15:21, 19 March 2010 (UTC)[reply]

They are apples and oranges? They sell totally different types of products and have totally different business models, and they make use of their employees in a totally different way (McDonald's employees are mostly in service, GM I imagine are mostly in manufacturing). Is there any reason to think that ranked number of employees would have anything much to do with ranked sales revenue for all industries and products? I doubt it, but I'm no economist. --Mr.98 (talk) 15:49, 19 March 2010 (UTC)[reply]
Salary level. McDonald's employs teenage fry cooks, GM employs engineers. Rmhermen (talk) 15:49, 19 March 2010 (UTC)[reply]
I agree that there is not much in the way of a meaningful comparison to be made between these companies. One makes millions of easy to assemble burgers that sell for dollars each, whereas the other makes thousands of vehicles which sell for thousands of dollars each. Much better to compare GM to Ford or McDonald's to Wendy's. Googlemeister (talk) 15:54, 19 March 2010 (UTC)[reply]
Of course you can compare them. Investors compare them every day. Abstractly, that year, McDonalds utilized more employees to generate less revenue than GM did. GM's revenue-per-employee was much higher, which was great for GM. Of course, the two companies' fortunes have reversed in the subsequent 9 years, so revenue-per-employee clearly isn't everything. Comet Tuttle (talk) 17:04, 19 March 2010 (UTC)[reply]
But is sales revenue even what you are trying to compare? I was under the impression that McDonald's made a huge amount of its money from real estate, for example. --Mr.98 (talk) 19:35, 19 March 2010 (UTC)[reply]
I agree that it's comparing apples and oranges. However, when looking at such ratios between two auto companies, or between two fast food companies, the comparison is valid. And, if McD's goes to automated assembly lines for production of hamburgers, then maybe it can be compared with GM. StuRat (talk) 18:29, 19 March 2010 (UTC)[reply]
Claiming it is not possible to compare GM and McDonalds (as you did saying it would be an "apples and oranges" comparison) is nonsense. They are both businesses which exist primarily to turn a profit for investors, and investors compare them all the time using many indications of profitability and efficiency. Revenue-per-employee is one such indication. Comet Tuttle (talk) 18:43, 19 March 2010 (UTC)[reply]
Surely it is, but using revenue-per-employee on businesses in 2 wildly different industries will not give good results in and of itself any more than comparing only the relative fuel economy between a 747 and a pickup truck would give a good indication of which vehicle moves people more efficiently. Googlemeister (talk) 19:11, 19 March 2010 (UTC)[reply]
I'm not arguing that this is the best yardstick to use, but am arguing that comparing these two companies is not "apples to oranges". Comet Tuttle (talk) 19:19, 19 March 2010 (UTC)[reply]
You can compare them this way, just as you could compare their revenue to the heights of their buildings (GM probably wins in this respect, too), but it doesn't mean that you'll get meaningful investment or economic data out of it. Comparing random metrics is exactly what the phrase "apples and oranges" is all about. --Mr.98 (talk) 19:35, 19 March 2010 (UTC)[reply]
We are quibbling at this point, because I'd certainly agree that many, many metrics give a lot more information than this one when comparing companies in different industries; but I can't let it go: You are saying revenue per employee is actually meaningless in this comparison, which is untrue. In addition to being one indicator of efficiency or productivity, it implies the amount of managerial overhead each company must maintain; it can indicate how much each company will be affected by certain types of new taxation (or reductions in certain types of taxation); it says something about worker education and/or training, which in turn says something about each company's probable success if trying to commence operations in a new market. Some of these are second-order approximations but I think it's misleading the original poster to just say "The metric is actually useless when comparing two companies in different industries" — the indicators I just listed have some value. Comet Tuttle (talk) 19:59, 19 March 2010 (UTC)[reply]
How about this example then, CT: in 2009, AIG got $101B in revenues with 96,000 employees, or about $1.06M revenue/employee. Walmart got $405B in revenues off of more than 2 million employees, or about $200k revenue/employee. Using that metric, one would think trading 1% of AIG for 5% of Wal*Mart would be even, but in reality, it is a total sucker bet for the guy holding Wal*Mart (to the tune of $46M traded for $10B). Googlemeister (talk)
Straw man argument. Nobody but you in the above paragraph has ever suggested that anybody thinks that the "revenue per employee" metric gives you a literal valuation of Company A's shares versus the shares of Company B. My only argument is that comparing the revenue per employee, between two companies in different industries, has an amount of meaning that is nonzero. Comet Tuttle (talk) 20:46, 19 March 2010 (UTC)[reply]
But my argument shows that whatever "signal" you can get from such a comparison can be totally swamped by the "noise" from all the other, far more useful indicators. Googlemeister (talk) 20:58, 19 March 2010 (UTC)[reply]
Arbitrarily counting the number of employees doesn't prove much. Clearly someone involved in the manufacture of a car is going to add more value than someone involved in frying a burger. It would be relevant to compare their Sales Revenue to their wages bill. That way you can compare the efficiency of their utilisation of labour. Jabberwalkee (talk) 06:34, 20 March 2010 (UTC)[reply]
I used to work for a Chartered business valuator, and though revenue per employee was rarely the best way of assigning values to businesses, we did have access to an annual publication that estimated it for different industries. As the discussion above points out, it varies significantly. NByz (talk) 22:14, 20 March 2010 (UTC)[reply]

Am I the only one who thinks this is a homework question? DOR (HK) (talk) 00:44, 22 March 2010 (UTC)[reply]
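The revenue-per-employee arithmetic quoted in this thread is easy to verify; a short sketch using the approximate 2009 figures cited by the posters above (rounded figures as quoted, not independently sourced):

```python
# Revenue per employee for the approximate figures cited in the thread above.
companies = {
    "AIG (2009)":     {"revenue": 101e9, "employees": 96_000},
    "Walmart (2009)": {"revenue": 405e9, "employees": 2_000_000},
}

for name, c in companies.items():
    rpe = c["revenue"] / c["employees"]
    print(f"{name}: ${rpe:,.0f} revenue per employee")
```

This reproduces the roughly $1.05M versus $200k per-employee gap discussed above, which is the whole point of contention: the ratio is computable for any pair of companies, but whether it is meaningful across industries is what the thread disputes.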

drum beat in music

An awful lot of popular music has a prominent drum beat, and an awful lot of "classical" music doesn't (I know there are tons of exceptions). I've never personally developed an appreciation for classical music, but I think I'd probably like it more if it did have a drum beat. Can you explain why classical music often avoids including drum beats? ike9898 (talk) 17:26, 19 March 2010 (UTC)[reply]

The presence or absence of a drum beat in classical music is likely to determine if it's a march or not. Thus they might have tended to look down on drums as an expedient instrument for war, since hauling a brass band, piano, and violin section around was problematic. But drums were loud enough and portable enough to signal commands to soldiers during battle. Later on, the bugle was more likely to be used for this. And, just like the drum, this made it less desirable for normal music (Boogie Woogie Bugle Boy aside). StuRat (talk) 18:09, 19 March 2010 (UTC)[reply]
The brief simplistic answer is that since around the early 20th-century, "western" popular music has been significantly influenced by African rhythms through various channels (at first rather indirectly). In 19th-century classical music, they had a rather different understanding of percussion, in which non-rhythmic percussion tended to be used for kind of "special effects" (simulating thunderstorms, cannons in battle, etc), while rhythmic drumming had a strong association with military marches (as mentioned by StuRat). AnonMoos (talk) 18:22, 19 March 2010 (UTC)[reply]
Hooked on Classics has/had a number of classical medleys with a recurring beat as an undertone. As for the drum beat in rock, swing music and jazz also had it a lot. But even classical marches such as this one seem to de-emphasize the drumbeat. Classical music has more to do with strings and woodwinds, with drums as percussion when needed. ←Baseball Bugs What's up, Doc? carrots 18:24, 19 March 2010 (UTC)[reply]
It seems like there's something more to it than that. I've heard classical music that's been popularized with a drum beat (usually electronic) and it seems to be regarded by classical music lovers as extremely cheesy. Also, modern composers of classical-type music shouldn't associate drum beats with military action, and yet they continue to avoid them. ike9898 (talk) 20:10, 19 March 2010 (UTC)[reply]
I am sure music experts could identify a great many classical works which have a prominent drum beat, such as Ravel's Boléro. Others feature percussion with a steady rhythmic beat at least for several bars. But the tympani or other drums make their point and then some other part of the orchestra gets a turn, with the rhythm moving to the string bass, tuba, trombones, horns or other sections. In lots of modern pop music, a drum machine (or drummer) bangs out the same repetitive pattern bar after bar, "boom-CHUCK, boom-CHUCK" etc, or "one TWO-AND, three, four, one TWO-AND, three, four," etc for 12 or 32 bars, which would have a classical audience yawning, then leaving (with notable exceptions like the music of John Cage which repeats something seemingly forever with only tiny variations). Edison (talk) 03:30, 20 March 2010 (UTC)[reply]
I have the impression that a typical percussion section of an orchestra set up to play 19th-century works consists of two guys placed in a corner in the back who are surrounded by diverse miscellaneous equipment (kind of like sound-effects guys on a 1930's radio show). If you look at them during the performance, then most of the time the piece is being played they aren't doing anything (though from time to time they may be working strenuously for brief periods). AnonMoos (talk) 09:59, 20 March 2010 (UTC)[reply]
My interpretation involves timing. Classical music has a conductor who can be seen by the entire ensemble. Smaller groups in the recent - pre-amplification - past, needing an auditory equivalent, adopted a steady rhythm from a penetrating and concussive instrument. Though music may need no standard method to be enjoyed, it certainly needs one to be performed. Modern music seems to rely heavily on this concussive convention. Maybe a more enlightening question might be: "Why does so much modern music rely so heavily on drums as a rhythmic centre?" NByz (talk) 07:37, 20 March 2010 (UTC)[reply]
There is, unfortunately and disappointingly, very little information on this at the article on Rhythm - but, I've just discovered, a lot more here (in the .pdf document). Ghmyrtle (talk) 07:53, 20 March 2010 (UTC)[reply]
I'm not sure how true that can be. In my experience, groups of musicians from string quartets to moderately sized orchestras are perfectly capable of staying together without the conductor: the front desk of the first violins functions as the leader, and the rest of the orchestra usually sticks to them more closely than they do the conductor. The purpose of the conductor is more to shape the music and get specific effects: the basic keeping of time happens even if the conductor walks away. 86.177.124.127 (talk) 18:17, 20 March 2010 (UTC)[reply]
Very good point. Perhaps in the post-amplification and the rock band format, no instrument but the drums can fit this leadership role so robustly and flexibly? The bass can be muddy (though it often leads for long periods in smaller jazz groups, this is only practical when all instruments, including the bass, are using an abrupt clean tone; certainly not in a rock band with a rounded bass tone and a distorted or "effects"-riddled guitar tone), the rhythm or lead guitar can be as well (and relying on it to be always playing and always indicating the time signature can be straining to the composition). The drums, on the other hand, are the perfect time keeper in this modern format. They are loud and abrupt, and the lack of tonality (except on advanced sets. I do realize that drums are tuned, but their purpose is mainly percussive) forces the player to focus their musical growth on tempo and time signature. Furthermore, the lack of tonality in most percussion instruments allows them to create another layer to the music without interfering with the melody. This can serve to make the music more complex without being harder to understand. It seems that most bands you hear on the radio rarely take advantage of this opportunity. NByz (talk) 19:05, 20 March 2010 (UTC)[reply]
If you've ever seen Tous les matins du monde (film), it depicts a 17th Century French chamber orchestra, with the conductor keeping time by banging a big stick on the floor. Classical music has moved on since then. Alansplodge (talk) 17:22, 21 March 2010 (UTC)[reply]
A strong rhythm is very accessible. Perhaps because of its similarity to the human heartbeat, it causes an instant and visceral reaction. It also makes music very danceable. The most popular forms of music have always had a strong beat. Not only modern pop music and traditional African, but also consider Celtic fiddle music or German Volkstümliche Musik. The disadvantage with relying on rhythm is that it is hard to have much complexity. Popular music tends not to have the changes in tempo and metre, diversity of emotion, array of instruments, and complex melodies that are common in something like western classical music or Indian ragas. - SimonP (talk) 21:32, 22 March 2010 (UTC)[reply]

Free Churches Moderator

AKA the Moderator of the Free Churches Council, is a president of the Council of Christians and Jews [1] – but who is he, what does he do, how is he appointed, what is/are the Free Church(es)? Web-references are sparse! Thanks! ╟─TreasuryTagco-prince─╢ 18:24, 19 March 2010 (UTC)[reply]

One start is Free church. There is also the Free Church Federal Council. BrainyBabe (talk) 13:15, 21 March 2010 (UTC)[reply]
Hmm, thanks—I understand the concept of a free church, I was more interested in this specific British organisation... ╟─TreasuryTagsenator─╢ 14:59, 21 March 2010 (UTC)[reply]
Maybe the easiest thing would be to contact them directly? My answer tried to deal with your fourth question, but I think the previous three might be best answered by them. BrainyBabe (talk) 18:18, 21 March 2010 (UTC)[reply]

"it isn't _____ if you don't get caught"

This might belong to the language desk instead, but what do people mean when they say "it isn't stealing if you don't get caught", "it isn't cheating if you don't get caught", "it isn't illegal if you don't get caught", etc...? In what sense are these statements true, do we have an article about it?

Is there a history of this ideology? Where did it originate? Thank you. 82.113.106.89 (talk) 18:42, 19 March 2010 (UTC)[reply]

It's an (illogical) view of a victimless crime due to the idea that if you don't get caught doing something, then the victim will never know that they've been stolen from, cheated on, etc. It's also based on the idea, especially in the "cheating on someone" version, that if they don't know you cheated on them, then their feelings won't be hurt. Dismas|(talk) 18:59, 19 March 2010 (UTC)[reply]
That is assuming there are no such things as victimless crimes. I would refute such an idea by putting forth that running a stop sign where there is clearly no other traffic is such a victimless crime. If they caused an accident or near accident, their crime is no longer victimless. Googlemeister (talk) 19:04, 19 March 2010 (UTC)[reply]
I think the "victimless crime" concept is a red herring. There is one context where the claim is essentially true, and that's the presumption of innocence at law — or, in other words, the fact that criminal cases in court aren't about whether you committed a crime but about whether you can be proved to have committed it.
The extent (if any) to which that principle might be extensible to other contexts is a matter for one's personal morality. --Anonymous, 19:07 UTC, March 19, 2010.
They're true in the sense that you won't get prosecuted for something if you aren't caught at it, and thus don't necessarily fit the legal definition of the crime. In a non-legal (e.g. moral) sense, this doesn't work. Even in a legal sense, getting away with something, in the sense of not suffering a criminal conviction, does not mean you are off scot free (as O.J. Simpson found). --Mr.98 (talk) 19:27, 19 March 2010 (UTC)[reply]
In the case of the UK MPs expenses "scandal", it was a common feeling (in England at least) that the MPs felt that the real crime was not claiming expenses to which they were not entitled, but that they had been caught doing so. --TammyMoet (talk) 20:32, 19 March 2010 (UTC)[reply]
Related quote, from Dirty Rotten Scoundrels: "To be with another woman, that is French. To be caught, that is American." --Mr.98 (talk) 20:40, 19 March 2010 (UTC)[reply]
Italian saying: "Where there is no police, there are no speed limits." Gabbe (talk) 21:07, 19 March 2010 (UTC)[reply]
If you look at it from an entirely pragmatic point of view, then the difference between a legal act and an illegal one is that you get punished for the illegal one but not the legal one. Since you don't get punished if you don't get caught, it can only be illegal if you get caught. It's not intended to be sound logic, it's more a redefinition of legality in terms of the consequences, rather than the act. --Tango (talk) 09:56, 20 March 2010 (UTC)[reply]
"It isn't _____ if you don't get caught" could most properly be filled in with "punishable". If you drive 50 in a 35 zone, you've broken the law, but if you don't get caught, then you "got away with it", and as far as the legal system is concerned, you're clean. I could pose a more extreme example: If you murder someone and are never caught, does that mean you are not a murderer? No, you're still a murderer, except not in the eyes of the legal system (yet). I'm reminded of one of my favorite quotes from The Adventures of Superman, in an early episode where the writing was a bit crisper, in which Clark Kent wants a man detained despite a lack of evidence, and Inspector Henderson admonishes him with, "The law, Kent, what has he done to break the law? This man might beat up his mother every day and twice on Sunday, but as far as the law is concerned, he hasn't done a thing." ←Baseball Bugs What's up, Doc? carrots 15:10, 21 March 2010 (UTC)[reply]

The imminent resignation of Hillary Clinton?

Intrade is giving Hillary Clinton a 49% chance of resigning from her post before the end of Obama's first term. I don't understand why Clinton would want or have to do so; why then is this probability so high? Insight welcome, 86.45.173.139 (talk) 19:29, 19 March 2010 (UTC)[reply]

The "quotes" are based on user opinion and, for the most part, who is willing to click the green button or red button the most. It is not based on any tangible evidence one way or the other. It may as well be: "Do you like Hillary Clinton? Yes No." In other words, the website is purely for entertainment, not for factual evidence. -- kainaw 19:36, 19 March 2010 (UTC)[reply]
Intrade.com contracts trade in real money. -- Coneslayer (talk) 19:47, 19 March 2010 (UTC)[reply]
Unless I misread the Intrade quote matrix, Clinton's resignation has only been traded 31 times, so that's probably not a reliable sample...--达伟 (talk) 19:53, 19 March 2010 (UTC)[reply]
Well, it is not a sample but the entirety of the bids; still, yes, the low volume makes the implied probability less credible. 86.45.173.139 (talk) 21:13, 19 March 2010 (UTC)[reply]
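For background on how a number like "49%" is read off a prediction market: a binary contract pays a fixed amount if the event happens and nothing otherwise, so its quoted price (on a 0-to-100 scale, as Intrade used) is interpreted as the market's implied probability. A minimal illustrative sketch of that reading; the 49 figure comes from the question above, and the 0-100 quoting convention is an assumption stated here rather than anything from the thread:

```python
# Implied probability from a binary prediction-market quote.
# The contract pays a fixed amount if the event occurs and 0 otherwise,
# so (quote / scale) is read as the market's probability estimate.

def implied_probability(quote, scale=100.0):
    """Convert a contract quote on a 0..scale range to a probability."""
    return quote / scale

# A contract quoted at 49 on a 0-100 scale reads as a 49% implied chance.
print(implied_probability(49))  # 0.49
```

As the thread notes, this reading is only as good as the market behind it: with just a few dozen trades, the quote reflects a handful of bettors rather than a deep consensus.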

There have been rumors of an eventual Clinton resignation for a couple months now. The recent spike was due to New York Governor David Paterson's troubles, with speculation that Clinton would step down and run for governor. See here for example. Additionally, the current state of US relations with Israel probably feeds speculation that Clinton won't be around for the duration. —Kevin Myers 20:07, 19 March 2010 (UTC)[reply]

Aha, that makes sense, thank you Kevin. Do you know who her main rivals might be for a prospective gubernatorial run? 86.45.173.139 (talk) 21:13, 19 March 2010 (UTC)[reply]
This poll by Rasmussen (the article was written March 2) says the front-runner is currently Andrew Cuomo (D), and the two Republican front runners are currently Rick Lazio (R) and "wealthy Buffalo developer" Carl Paladino (R). If it matters to you, Rasmussen has been noted to yield more conservative-leaning poll numbers than other pollsters. Comet Tuttle (talk) 22:02, 19 March 2010 (UTC)[reply]
Secretary of State to NY Governor? I can't see it - HC has been on the national/international scene for too long now, and there's no political advantage to her in stepping back from it. She might do something like that as a political statement if she had a good enough reason, but at 1 year into the current administration it would speak to some internal frictions or some looming political disaster that doesn't even appear on the Fox News radar (if not even they can stretch things to see that kind of trouble, what's the likelihood?) --Ludwigs2 22:13, 19 March 2010 (UTC)[reply]
She had an interview (with Tavis Smiley ?) where she was asked if she could imagine herself serving another term or even finishing this one, and she didn't sound optimistic. She also commented on how hard her job is. StuRat (talk) 22:45, 19 March 2010 (UTC)[reply]

You might also want to do a little research into the average tenure of Secretaries of State. Off the top of my head, 4 years is a very long time to stay in office. DOR (HK) (talk) 00:48, 22 March 2010 (UTC)[reply]

Really? List of Secretaries of State of the United States would seem to suggest to me that over the past say 30 years, ~4 years is about average and longer not unheard of Nil Einne (talk) 20:48, 22 March 2010 (UTC)[reply]
The NYT ran a semi-puff piece a few days ago about how she is settling into the job.[2] I hadn't thought of it this way before, but "I have complete confidence in Secretary Clinton" (not that Obama said that) is the traditional kiss of death.

Olympics question

What determines who can have an Olympic team? I mean, Puerto Rico had their own team, even though they are kinda sorta part of the US, so could, say, the Sioux have a team, as their reservation is only kinda sorta part of the US? Could Wales have their own team if they wanted, like they do in soccer? What are the IOC methods for determining this stuff? Googlemeister (talk) 20:16, 19 March 2010 (UTC)[reply]

Apparently they need a National Olympic Committee, which then petitions for recognition by the International Olympic Committee. There is quite a hodgepodge of territories recognized, and at least one petitioner unrecognized, and Taiwan, which is only recognized so long as it calls itself "Chinese Taipei" to avoid irritating the People's Republic of China. As to whether the IOC would recognize the Sioux—who knows. They recognize the Palestinians, who are not quite in the same position as the Sioux, but it's not a huge stretch of an analogy (since the Sioux are self-governing). --Mr.98 (talk) 20:37, 19 March 2010 (UTC)[reply]
The Macau Sports and Olympic Committee has existed for years and has participated in international sporting events; it just has not been recognised by the IOC yet. I've not been able to find out why this is. Dependent territories are certainly not barred from joining the IOC. Hong Kong's NOC has participated in the Olympics since 1952; it now participates as "Hong Kong, China". On the other hand, the Kuwait Olympic Committee's membership has been suspended since January 2010 for violating "international regulation". From Kuwait's case, it seems the IOC requirements are stringent, technical and political. --Kvasir (talk) 20:38, 19 March 2010 (UTC)[reply]
It is also a question of funding. Establishing a separate NOC from the mother country means you are now independent from the funding structure and need to raise your own funds for the athletes. Macau may have their own NOC but in the case of Olympic sports, the athletes may want to keep their access to high performance national training program and funds. --Kvasir (talk) 20:45, 19 March 2010 (UTC)[reply]
As for Gibraltar, its non-recognition seems to be based on the fact that its sovereignty is not "recognised by the international community". This Swiss source is in French. Apparently the Gibraltar National Olympic Committee is suing the IOC in Swiss court. --Kvasir (talk) 21:08, 19 March 2010 (UTC)[reply]
I may have missed something (my French isn't that great), but I think that document is only about the IOC itself and does not talk about what it takes for a new NOC to be admitted to the club. Just for fun and practice, here's my translation (a bit rough in places) of just the first paragraph:
For more than a century, international sports have been primarily governed by a system of non-profit associations centered on the Olympic Games and on the world championships of the various disciplines. This system was itself designed under the name of the Olympic Movement and its principal agent is the International Olympic Committee (IOC), a club of individual members founded in 1894 by Pierre de Coubertin. Its members are responsible for perpetuating the modern Games. Despite the considerable evolution of sports in the 20th century and the growing importance of the Summer and Winter Games, the IOC has continued with no important change of structure through a century that has known, among other things, plenty of surprises. It was only in 1999 that the IOC was suddenly shocked to its foundations because of the implication of about 20 of its members in a corruption scandal linked to the Winter Olympics in Salt Lake City. It was also at this same time that doping and sports-related violence began to strongly preoccupy governments, which perceived that the Olympic Movement had not taken serious control of these consequences. At the end of the century, across the news media, the IOC was all at once confronted with opinions and public bodies that put its legitimacy into question. (Chappelet 2001).
--Anonymous, 21:35 UTC, March 19, 2010.
The bit about Gibraltar is in section IV: L’harmonisation des dispositifs de régulation (Harmonisation of regulatory mechanisms)

La deuxième affaire date de 2003 et n’est pas encore tranchée par les tribunaux vaudois. Elle concerne une association qui se nomme « Comité national olympique de Gibraltar » et qui veut se faire reconnaître par le CIO, depuis la fin des années quatre-vingts, comme CNO à part entière. Cette association remplissait les conditions nécessaires au moment de sa demande, notamment avant que le CIO n’exige que le territoire concerné soit « reconnu par la communauté internationale » (règle 31 de la Charte olympique). Elle se plaint de la lenteur de la décision à son sujet du CIO qui ne veut pas en prendre une qui soit positive (et qui ne serait pas acceptée en Espagne) ou négative (qui risquerait d’être contredite par un tribunal vaudois du fait de la Charte en vigueur à l’époque).

Immediately above this paragraph was the bit about Taiwan, and most of us are pretty familiar with that. Don't worry about my quality of French translation, not a pro here:
"The second business dated from 2003 and is not yet "sliced" (decided?) by the Vaud tribunals. It's about an association called "Gibraltar National Olympic Committee" and that it wants to be recognised by the IOC since the end of the 80s as an NOC. This association fulfilled the necessary conditions at the moment of its request, notably before the IOC insists only that the concerned territory be "recognised by the international community" (Olympic Charter Rule 31). The association accuses the length the IOC has taken to make the decision be it positive (and which would not be accepted by Spain) or negative (that risked being countered vigourously by a Vaud tribunal over the fact of the Charter of the time.)." --Kvasir (talk) 22:30, 19 March 2010 (UTC)[reply]
Thanks! I thought I had searched the PDF for "Gibraltar", but I must've missed it somehow. Sorry to have been misleading. I suspect that "sliced" here means something like "scheduled (for consideration)", i.e. assigned its slice of time. In the last parentheses, "risked" should be "would risk".
So an interesting point here is that the rules seem to have changed since Gibraltar made its application, but in this paragraph they don't say how. I don't have time now to see if they cover it anywhere else. --Anonymous, 00:28 UTC, March 20/10.

Falklands War -- Thatcher Speech

According to Lexis/Nexis, Mrs Thatcher gave a speech to the House of Commons, which began thus: "The Falkland Islands and their dependencies remain British territory; no aggression and no invasion can alter that simple fact. It is the Government's object to see the islands are free from occupation". Lexis Nexis finds that bit in a NY Times article dated April 3, 1982, apparently on the front page. I know it's probably a long speech, but I'm looking for the entire text. Can anyone give me a hint about how to find the transcript of that speech? Thank you. Llamabr (talk) 20:28, 19 March 2010 (UTC)[reply]

This link [3] takes you to the Hansard record of her speeches in Parliament. You should be able to search it to find the official record of the speech you are looking for. DuncanHill (talk) 20:34, 19 March 2010 (UTC)[reply]
Wow, you guys are quick, and efficient. Thanks so much. Llamabr (talk) 20:40, 19 March 2010 (UTC)[reply]
It's well worth reading the whole debate. DuncanHill (talk) 20:56, 19 March 2010 (UTC)[reply]
The audio recording of the debate was rebroadcast on BBC Parliament on 'Falklands Night' in 2007 (marking the 25th anniversary); I have a copy and there may be others out there. Note also that all important Margaret Thatcher speeches, including those outside Parliament, are available from the Margaret Thatcher Foundation website. Sam Blacketer (talk) 22:00, 19 March 2010 (UTC)[reply]

History of the bound book

Can someone briefly explain the history and chronology of going from scrolls (for example, the Library of Alexandria had nothing but scrolls in it, didn't it?) to bound leaves in a book that you can leaf through? Did scrolls and these bound books coexist for a while, or was it like an "aha" moment, where as soon as the bound pages appeared no one wanted scrolls anymore? Thank you. This is not homework. —Preceding unsigned comment added by 80.187.107.105 (talk) 22:30, 19 March 2010 (UTC)[reply]

I seem to recall that, in ancient India, they wrote on long objects (tree leaves ?), bound together into a book. StuRat (talk) 22:37, 19 March 2010 (UTC)[reply]
From the article for Codex (bound books): "First described by the 1st century AD Roman poet Martial, who already praised its convenient use, the codex achieved numerical parity with the scroll around 300 AD, and had completely replaced it throughout the now Christianised Greco-Roman world by the 6th century." The introduction of that article, and its #History section, go into further detail. Cheers, -M.Nelson (talk) 22:43, 19 March 2010 (UTC)[reply]
Christians were very significant "early adopters" and popularizers of the book format... AnonMoos (talk) 22:57, 19 March 2010 (UTC)[reply]
The British Museum had on exhibit a few years ago a pretty little bound book which had been buried with some British religious leader quite early, maybe the 6th century. I do not recall his name, only that it was a relatively early "book" as such, and rather nicely bound. It was definitely not the Codex Sinaiticus. (Now it's driving me mad trying to remember the details, but I can see it clear as day in memory). Edison (talk) 03:42, 20 March 2010 (UTC)[reply]
Perhaps this one: [6] Rmhermen (talk) 05:09, 20 March 2010 (UTC)[reply]
The one I saw was older as I recall and in better shape, and was found in the tomb of some famous early religious figure in England, Ireland or Scotland. Edison (talk) 03:03, 21 March 2010 (UTC)[reply]
We have history of the book, although it doesn't really seem to have the info we're looking for here. Adam Bishop (talk) 15:16, 20 March 2010 (UTC)[reply]
The followers of Manichaeism were known as bibliophiles, and often had splendidly bound codexes of the major writings of their religion. --Saddhiyama (talk) 10:50, 21 March 2010 (UTC)[reply]


March 20

"Realistic" Visual Art Outside of the West

This question seems like an obvious one—but I'm having trouble finding answers (perhaps because it might be prone to controversy—but, still, I think it's a legitimate query):

Why does realism (i.e. photorealism) seem to be associated only with traditions in Western visual art? Strict representation seems to be the most intuitive approach to visual art—and the most straightforward criteria by which visual art can be assessed—but, most cultures seem to have traditions consisting of more stylized visual art forms, to the exclusion of more realistic approaches. Am I wrong?

Alfonse Stompanato (talk) 04:42, 20 March 2010 (UTC)[reply]

Did you consider that maybe, let's take a very early example, Egyptians tried to portray people completely realistically and not in a stylized form, but just weren't talented enough to? I mean, by your standard, every 6 year old draws in a "stylized" form, and not a realistic form, even though a realistic one would "make more sense". You understand what I'm getting at. If you want to know how people long ago lived and thought, just think of little children. 82.113.121.93 (talk) 09:47, 20 March 2010 (UTC)[reply]
There have been several serious scholarly tomes published about the "canon of proportions" in ancient Egyptian art, analyses of how Akhenaten changed things somewhat for a relatively brief period, etc., though we don't seem to have much specifically about this on Wikipedia, from what I can turn up. 10:10, 20 March 2010 (UTC)

There was a school of painting in Song Dynasty China that portrayed figures in a realistic manner. --Ghostexorcist (talk) 10:15, 20 March 2010 (UTC)[reply]

I saw a TV documentary which suggested that realistic depictions didn't happen in Europe until artists started using optical projections as a drawing aid. See Camera obscura. Alansplodge (talk) 12:36, 20 March 2010 (UTC)[reply]
A boring and tautological answer is "different styles of art flourish under conditions that promote them." The art historian would point out that the Western "realist" forms of art are no less stylized, specific to their period, and no more "objective" than the forms of art found elsewhere. Even in Western art, realism is hardly universal. Consider the difference between Ancient Greek sculpture (extremely "realistic" in its attention to proportions, etc.—e.g.) and Ancient Greek painting (which is "flat" and obviously stylized—e.g.). Of course, even here we are cherry-picking—if you look at Art in ancient Greece as a whole you see a whole variety of styles of art (as one would expect for a period so long and so fruitful).
I think it is probably not the case that artists in other cultures couldn't have made "photorealistic" art if they had been trained to, had a tradition of it, and thought it was what one should do. It is certainly the case that artists of those cultures can do it now, and I doubt it is because raw artistic talent has increased in those countries. I am not an art historian in the slightest, but I would suggest that the institutions of art—how it is taught, who pays for it, what it is used for, how "fads" and "trends" work within its dynamics, how it interacts with the larger culture in which it is embedded—are probably where the credit lies here. The argument has been made in many other realms that the reason Europe had such great diversity in philosophy, art, writing, culture, etc., was in part because of its long periods of very carved-up political rule, where each little state had its own power structure, its own patrons, its own system of society. There are some limitations with such an argument (it is not like the rest of the world was exactly homogeneous), but it probably has some merit. --Mr.98 (talk) 13:07, 20 March 2010 (UTC)[reply]
Let's look at just one aspect of realism: size. In ancient times, size was used to indicate importance. Thus, the king and queen always had to be bigger than everyone else (as in a modern chess board). This precludes the use of perspective to actually show distance correctly. Thus, the movement toward equality for all may have also been reflected in more realistic art. I don't think it's an accident that the Greeks and Romans, who at least flirted with the idea of democracy, had the most realistic art, while the Egyptians had neither. StuRat (talk) 17:01, 20 March 2010 (UTC)[reply]
While an amusing theory, this doesn't have any correlation with any historical or artistic understanding that I have ever seen. There are large and small depictions of kings and queens in every culture. The idea that "in ancient times" (??) kings and queens (??) had to universally be depicted hugely is completely silly. Go to any good art museum, StuRat, and you will see there is a lot more variation than this! --Mr.98 (talk) 21:46, 20 March 2010 (UTC)[reply]
Care to show me some examples? (Note that I didn't say that depictions of kings and queens are always "huge", only that they tended to be larger than commoners, when both are included in the same piece.) StuRat (talk) 15:09, 21 March 2010 (UTC)[reply]
Nobody has mentioned the "invention" of perspective only in the West, from about 1413 according to the article Perspective (graphical). 78.147.151.89 (talk) 17:33, 22 March 2010 (UTC)[reply]
I would think that a talented artist could have made a realistic picture without understanding the science and math behind perspective, just like an author can write a good book and yet not have a clue as to how to diagram a sentence. StuRat (talk) 17:24, 23 March 2010 (UTC)[reply]
Today everyone is familiar with perspective and has seen countless examples of perspective drawings and photos. But before it was invented, like any invention which seems obvious after the event, it didn't happen. Can you supply an example of a perspective picture from before 1400? This drawing File:Qingming Festival 2.jpg shows stunning realism for something so early, but this and other Chinese paintings on close inspection look like they are based on isometric principles, not perspective. 78.149.133.100 (talk) 21:28, 23 March 2010 (UTC)[reply]
Here's a wall painting uncovered in Herculaneum, so from prior to the eruption of Vesuvius in 79 AD. It's a portrayal of the entrance to a theater, and it appears to use perspective, to me, with a vanishing point in the lower, right corner: [7]. StuRat (talk) 05:01, 24 March 2010 (UTC)[reply]
There is not enough depth in the scene to tell, so nothing that suggests it is anything more than an isometric projection. 78.149.167.173 (talk) 21:33, 24 March 2010 (UTC)[reply]
There's plenty of depth, as near the bottom, where the inside arch is drawn so much smaller than the front arch. StuRat (talk) 00:45, 25 March 2010 (UTC)[reply]
I presume that by the inside arch you mean the thin black line. That could be anything, most likely a decoration on the far wall. It is similar to the elongated shape on the right. Even if it were an arch, it may simply have been smaller in the scene depicted. Supposing this were an example of perspective, then why only one example? Why wasn't Roman art full of perspective after it was discovered? The picture is made to look as if you were looking up at it, but you could do that with an isometric projection. Sorry, not enough evidence. 84.13.22.69 (talk) 15:33, 25 March 2010 (UTC)[reply]
As to why one artist might have figured out how to do perspective, and it didn't get passed down, I can think of several reasons:
1) The artist may have simply done it at an intuitive level, and not have been able to describe his technique to others.
2) They may have jealously guarded their new technique, for fear that competitors would steal their customers.
3) It might have simply been unfashionable to do realistic art. Just as saints were at one time all drawn with halos, there may have been other symbolism that was considered more important than realism. StuRat (talk) 15:50, 25 March 2010 (UTC)[reply]
Ernest Gombrich was of the same opinion as the person who started this thread, but perhaps not 100% right. Some further examples of non-Western realist art: Japanese art, Maya wall paintings at Bonampak, Peruvian paintings/drawings on ceramics, Benin Bronzes (now seen as pre-contact). Post-contact: Afghan art after Alexander the Great, some Islamic painting (in response to Renaissance Art), Chinese Christian art after the Jesuits. The invention of true perspective in painting still seems to be a Renaissance thing.--Radh (talk) 10:12, 25 March 2010 (UTC)[reply]

Parting one's hair

I was just wondering when people started parting their hair. I've googled it and found nothing.149.125.176.38 (talk) 08:38, 20 March 2010 (UTC)[reply]

According to comb, the oldest combs found date to around 5000 years ago (although that doesn't mean they didn't exist before - we just haven't found any from earlier). I would guess that people started parting their hair at around the same time combs were invented (without a comb, you pretty much have to have dreadlocks or similar (or really short hair), so it can't have been sooner, and parting hair seems a pretty obvious way to comb your hair to me, so I doubt it was much after). --Tango (talk) 10:09, 20 March 2010 (UTC)[reply]
I can part my hair with just my fingers. It looks a bit messy, but when I forget to bring a comb, it's better than nothing. StuRat (talk) 16:50, 20 March 2010 (UTC)[reply]
Yes, but only because you have non-matted hair. If you never combed it, you would develop dreadlocks (or something similar), and you couldn't part those meaningfully. --Tango (talk) 17:12, 20 March 2010 (UTC)[reply]
Only if you did not wash it. I never comb my hair and it does not get matted. Granted, it is also less than 1" long. Googlemeister (talk) 18:29, 22 March 2010 (UTC)[reply]
The former article "Part (hairstyle)" is currently a redirect, but here is the version that existed before an AFD discussion redirected: [8] It has references, so you may be able to follow some of those refs to see if they lead anywhere. --Jayron32 20:19, 20 March 2010 (UTC)[reply]
In the history of "western" fashions, center-parting became the prevailing publicly-visible style for women in the 1830s, and especially the 1840s (see File:Maria Carolina di Borbone, principessa delle Due Sicilie.jpg). Some center-parting also occurred in the 1400s, but women usually covered most of their hair then. Churchh (talk) 07:50, 24 March 2010 (UTC)[reply]

Step Will

I don't think this is a request for legal advice. Under England & Wales law, is there such a thing as a "step Will" and if so what is it? - Kittybrewster 11:28, 20 March 2010 (UTC)[reply]

I've never heard of a "step will" and Google doesn't find anything useful, but that isn't surprising since "step will" appears frequently in everyday language ("the next step will be to...", etc.) and just "law" isn't enough to narrow down the search. Unless someone here has heard of one, we're going to need more context. Where have you heard the phrase? --Tango (talk) 13:04, 20 March 2010 (UTC)[reply]
(e/c) I've had a fair bit to do with wills in recent years, and it's not a term I have ever heard. As I suspect you have discovered, a pretty diligent Google search reveals no official-looking use of the term. If I was asked to guess the meaning, I would probably take it to signify either a will made with the intention of providing for step-relatives, particularly step-children, of whom there are many these days but who have no automatic right of inheritance from step-parents under the intestacy rules, or possibly a will in which the provisions are laid out as a series of steps and the eventual distribution of assets will depend on how the conditions in each step are fulfilled. But this is pure speculation. Is there a context in which you came across the term? Karenjc 13:05, 20 March 2010 (UTC)[reply]
Yes. I heard of it in the second of these two meanings. If a, then b, else if c, then d else e. I hadn't heard the term before. - Kittybrewster 14:00, 20 March 2010 (UTC)[reply]
Nor have I. I think the term isn't in wide use because many, if not most, wills have such things in place. For example, for married folks, a standard will would be to leave everything to their spouse, unless their spouse pre- or co-deceased them, in which case everything would go to the children, unless their children pre- or co-deceased them, in which case... etc. Matt Deres (talk) 20:24, 20 March 2010 (UTC)[reply]
Could it be STEP - the Society of Trust and Estate Practitioners (website), a professional group for people involved in wills, estates, and related matters? I haven't raked through their site, but they seem to offer courses and certification for such people, so I could imagine someone saying they're a "registered STEP will writer" or that they've been on a "STEP will course". -- Finlay McWalterTalk 15:30, 20 March 2010 (UTC)[reply]

Payment in cash

Watching the show Pawn Stars, I notice that some people are reassured or otherwise made to favor a deal more by an offer of money as cash rather than (I assume) check. What is more desirable about cash versus whatever other payment methods may be used? Ks0stm (TCG) 12:49, 20 March 2010 (UTC)[reply]

A cheque has to be paid into a bank account, which leaves a paper trail. Cash doesn't. That means you can easily get away with not paying tax on cash deals but can't with cheque payments (this is usually illegal, of course). A more legitimate reason would be to avoid bank fees associated with cheques or to avoid the delay in getting access to the money. The first reason is the real one, usually, I think. The other possibility is that people are favouring cash over credit (i.e. being invoiced and paying at the end of the month, or paying by instalments over the next year, or whatever). In that case, you can expect a discount for cash since money now is worth more than money later (the time value of money). That doesn't apply if the credit option would involve paying interest, though, since not paying the interest would be the discount. You aren't usually charged interest when invoiced and given a month to pay, or similar, though. --Tango (talk) 13:13, 20 March 2010 (UTC)[reply]
I think you've also left out the fact that checks can bounce, but cash cannot. --Mr.98 (talk) 16:06, 20 March 2010 (UTC)[reply]
True. Cheque guarantee cards are commonly used to prevent that problem when using a cheque to pay a company. --Tango (talk) 16:57, 20 March 2010 (UTC)[reply]
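The "time value of money" point made above can be sketched with a quick present-value calculation. This is a generic illustrative example, not anything from the thread: the $100 amount and the 5% annual discount rate are assumed figures chosen for the sake of the arithmetic.

```python
# Present value of a future payment, discounted at an assumed annual rate.
# Illustrates why "money now is worth more than money later": a payment
# promised in the future is equivalent to a smaller amount of cash today.

def present_value(amount, annual_rate, years):
    """Discount a payment due `years` from now back to today's value."""
    return amount / (1 + annual_rate) ** years

# Assumed figures: $100 due in one year, 5% annual discount rate.
# 100 / 1.05 is roughly 95.24, so about $95 cash today is equivalent to
# $100 paid in a year -- the logic behind a discount for immediate cash.
print(round(present_value(100.0, 0.05, 1), 2))  # 95.24
```

As the comment notes, this is why a seller might accept a somewhat smaller cash payment now over a larger amount on credit, provided the credit option carries no interest of its own.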
I think Tango has hit the nail on the head when he mentions time. That's why people are in the pawn shop to begin with. It's probably mostly psychological (How else do you expect a pawn shop to pay out?) but that's probably the urge that's being played at here.
It's actually surprising how many people are super eager to get their money on that show. Sometimes he tells them straight out that for one reason or another a pawn shop is not the most profitable place to sell an item, but they do it anyway for a fraction of what it's worth because they want the money right now. APL (talk) 17:47, 20 March 2010 (UTC)[reply]
I've not seen the show, but our article on it suggests people go there to sell items. Why would anyone go to a pawn shop to sell an item? You go to a pawn shop to pawn it. That is, take out a loan secured to it. If you want to sell it, go to a shop that specialises in second hand whatever-it-is and you'll get a far better deal. --Tango (talk) 20:10, 20 March 2010 (UTC)[reply]
And, even worse than a check can be a prize. At times game shows give away total crap, yet claim it's worth far more than you could actually get for it. That "one-of-a-kind" sculpture they say is worth $1000 may only sell for $100, and yet you're responsible for paying income tax on the "retail value", meaning you lose money in the deal, unless you can convince the tax assessor that the original valuation was wrong. StuRat (talk) 16:41, 20 March 2010 (UTC)[reply]
One advantage of cash over cheques is that if you have an overdraft and pay the cheque into your bank account you may not be able to withdraw the full value after it has cleared. Cash in your hand can go to the pub and have a good time with you that very night. DuncanHill (talk) 17:04, 20 March 2010 (UTC)[reply]
"Pawn Stars" is not a game show. It's a reality show about a pawn shop. It's clearly heavily directed and heavily edited, but the people on the show are supposedly off-the-street people who have showed up to either buy or (usually) sell something at a pawn shop. APL (talk) 17:47, 20 March 2010 (UTC)[reply]
The "Pawn Stars" shop is in Las Vegas, and some sellers may wish to get their hands on whatever small amount of cash is offered so they can rush back to the casino and gamble it away to stoke their gambling addiction or to avoid a creditor breaking their fingers for nonpayment of debts. Edison (talk) 02:58, 21 March 2010 (UTC)[reply]
One item to consider -- which is not related to pawn shops or game show prizes -- is that a purchase made in cash can be advantageous to the seller; the buyer can find it difficult to get his money back. I'll illustrate this from my own experience. Several years ago, my wife & I bought a car for her which was priced low enough that I could put the purchase on my credit card. The dealer refused to take my credit card, explaining that I could at a later date go to my bank & have them stop the payment. And then there is the cost banks charge for processing credit card payments, which I believe is 7%. Cash makes for a simple, clean transaction. -- llywrch (talk) 21:38, 25 March 2010 (UTC)[reply]

How feasible a job would doing voice-overs for radio advertisements, etc., be as a source of high school/college income, and what would be its pros and cons? What other jobs are there in voice-over besides radio/television advertising? Ks0stm (TCG) 15:17, 20 March 2010 (UTC)[reply]

Movies and video games often need voiceover, and your college may have programs in both. Of course a college department wouldn't pay much for such work. Anyway, I can think of several reasons this isn't a great prospect for a college student. Unlike a normal job, a VO actor gets an agent, provides them with a reel for potential customers to review, and then waits ... and waits ... and waits for a call. There is 0 income during that time. You may be asked by the agent to audition for a role, meaning you have to go into a studio and act out the script provided, on your own dime; and then you may or may not be selected over the 20 other people you are competing against. Depending on where you live, there may be very little demand for voiceover work, which reduces the prospect of income even further. We have an article, voiceover actor. Comet Tuttle (talk) 15:56, 20 March 2010 (UTC)[reply]
By the way, I'm not intending to discourage you from pursuing this — if you have an interest in the field, then go for it, absolutely. Just don't think you're going to get more than "a pittance" for a long time in the way of salary. Comet Tuttle (talk) 17:26, 20 March 2010 (UTC)[reply]
Sorry if this is stating the obvious... the folks who do voice-over work for ads on the radio are almost always the station's DJs, who are available, already trained on the equipment, and presumably have decent voices for the work. Some are the more experienced DJs, who provide some name recognition (at a higher cost), while some are the less experienced or fill-in jockeys who are doing it for less pay, but who need the exposure and experience. I was actually seriously considering starting that kind of thing as a career and attended some workshops on it. The field is not all that difficult to break into, but it's not one you're likely to make good money on for at least a few years. Matt Deres (talk) 20:36, 20 March 2010 (UTC)[reply]
That's not at all obvious to me and I am sceptical about it. Radio ads are produced by advertising companies and then sent to stations. I don't see why the DJs would have anything to do with the production. --Tango (talk) 12:50, 21 March 2010 (UTC)[reply]
I do think Matt Deres should cite a source for the questionable claim, and the DJs obviously do near-zero voiceover work for national advertising campaigns; but it sounds likely to me that they would do some VO work for local companies. Comet Tuttle (talk) 00:09, 23 March 2010 (UTC)[reply]

I must be full of questions today (three questions in a row!). How commonly do campus radio stations in the US feature local news and weather coverage, such as reporting from news scenes, severe weather coverage, etc.? If it is not common, why would the stations not feature such programming (in addition to normal programming) as part of the experience for broadcast media students? Ks0stm (TCG) 15:17, 20 March 2010 (UTC)[reply]

This is WP:OR, but I never saw my university's radio station do any local news coverage; they read off nationwide news feeds. Local news coverage would require local reporting and journalism, presumably provided by the students, which implies a journalism teaching program; and at my university, that was available at the college newspaper. I think only one radio station person was really interested in doing journalism; the others interested in journalism gravitated to the newspaper. There's no reason of course why any university with both programs couldn't incorporate the radio station students into their journalism efforts. Comet Tuttle (talk) 15:47, 20 March 2010 (UTC)[reply]
More OR, our campus newspaper office was about 20 feet away from the college radio station but I don't remember anyone who worked on both (although they shared the newsfeed). Perhaps the kind of people who are attracted to college radio are simply too culturally different from the journalism crowd. 75.41.110.200 (talk) 17:09, 20 March 2010 (UTC)[reply]
More OR. My campus radio studio many years ago had a news service and national and state news were read from there, along with weather. There was little or no local news reporting. Guys went on the radio either because they were geeks who liked the gadgets, or because they wanted to go into a career in broadcasting, or because it was a good way to meet girls who called in or visited the studio. This was back in the days when it would have been very expensive to have a remote radio link, so a "roving reporter" would have had to phone in a story from a pay phone or come back to the studio to report. Edison (talk) 02:55, 21 March 2010 (UTC)[reply]
Even more OR. I think it depends on the station and the ideology of the staff at the time. At our station in the 1980s, there were those of us who were more interested in news than music, so we did local news. Some of it was re-write from the local newspaper (hey, even commercial stations do that!), but from time-to-time we would cover actual events ourselves. During the 1980 election, several presidential candidates came to town, and we covered their appearances just like every other station did — perhaps even more so, because we didn't have to interrupt for commercials. On election night, we went wall-to-wall news, with staffers at polling places and the vote-counting locations, cutting in and out with network news. (We were a non-commercial affiliate of the American Information Network.) And on January 20, 1981, we went all-news, all-day, covering Ronald Reagan's inauguration and the release of the Iranian hostages, both using the network and local reaction coverage. — Michael J 23:41, 22 March 2010 (UTC)[reply]
Yet more OR: I went to a small technical school. The radio station existed mainly to provide a test signal for the electrical engineering department, so it tended to broadcast whatever the person fiddling around with the system was interested in. Some of those guys were quite creative; others had lousy taste in music. --Carnildo (talk) 01:22, 23 March 2010 (UTC)[reply]

Slovakia during WWII

Did Nazi Germany annex parts of the Slovak Republic? I know that Hungary annexed the lower third and that there were some adjustments with the Polish border. I seem to recall that Bratislava/Preßburg was to be somewhat incorporated because of its large German population, but I'm not sure. I also came across this map: [9] but I can't find the legend.
Moreover, was Slovakia intended to be an independent nation after the war, or were there plans for it to be annexed by Hungary or Germany? (I know that Nazi plans were usually nebulous.)--151.51.62.111 (talk) 19:00, 20 March 2010 (UTC)[reply]

This[10] page has details about the Karpatendeutsche (Carpathian German) minority in Slovakia and says that on 14th March 1939, when "the Slovak Provincial Parliament declared independence. The Slovak Republic lost territory...Germany received 43 square kilometers with 16,000 people, (Engerau, and the small city of Theben/Devin)." The Wikipedia article on Petržalka (a suburb of Bratislava or Pressburg in German) says "Petržalka is annexed by Nazi Germany on 10 October 1938 on the basis of the Munich agreement. It is renamed Engerau, and the Starý most bridge becomes a border bridge between the First Slovak Republic and Nazi Germany. Several thousand inhabitants of Slovak, Czech, and Hungarian ethnicity have to stay in Petržalka. They are considered citizens of Nazi Germany but are persecuted. The occupiers closed down all Slovak schools, and the German language replaces Slovak."
This[11] document confirms that: "The Germans did not occupy Pressburg but the bridgehead of the city, on the right bank of the Danube, Ligetfalu (Petrzalka, Engerau) was taken away from the CSR by Germany on October 10, 1938 without any previous notification. It was a great shock for the four-day old autonomous Slovak government, but the Germanophile Slovaks did not dare to disagree with Hitler" (p.157). "Hitler, accompanied by Marshall Goring, visited Engerau on October 25. This was not the only community which the Germans took away from the Slovak part of the CSR in 1938. On November 24 the German troops occupied, without incident and among the ovation of the predominantly German population, the zone of Devin (Theben, Deveny) on the left bank of the Danube at the estuary of the Morava River. It was considered a rectification of the borderline. (The Danube was the border at that point with Austria on the right bank.) As a consequence of this action, the waterworks of Pressburg fell in German hands 2 or 3 km from the city limits. The City of Pressburg requested that the government of the Reich rectify the borderline by several hundred meters to regain their aqueduct. This was the same friendly German government which two weeks earlier had occupied Ligetfalu" (p.158). Ligetfalu is the Hungarian name for Petrzalka / Engerau. Alansplodge (talk) 22:58, 20 March 2010 (UTC)[reply]
Slovak territory was used as a bargaining chip in negotiations between Germany, Poland, and Hungary. In the event of a partition, early plans called for annexation up to the river Nitra. Otherwise, always intended as a satellite state. Rich, N. D. (1974). Hitler's war aims. II. pp 55-67. OCLC 456645837.—eric 20:25, 21 March 2010 (UTC)[reply]

Medicinal brandy

In Victorian (and maybe a bit later) literature, doctors seem to use brandy to cure pretty much any illness. For example, Van Helsing and Dr. Watson both use it. Did real doctors at that time use brandy the same way? If so, why did they think it would help? --Tango (talk) 19:26, 20 March 2010 (UTC)[reply]

Because drunk people don't complain about their pain as much? Ethanol is a mild sedative and analgesic, so I imagine the idea was to provide a sort of general relief of aches and pains. --Jayron32 19:44, 20 March 2010 (UTC)[reply]
But it seems to be used to revive people who are barely conscious, too... --Tango (talk) 20:00, 20 March 2010 (UTC)[reply]
It is good for what ails ya... But seriously, I am not sure that pre-mid-20th-century medical science would be recognizable as particularly "medical" or "sciency" in any way. Remember that until the 1940s they were still lobotomizing people. It was probably the "Take two aspirin and call me in the morning" type of diagnosis; i.e. "Damned if I know how to fix him. Give him a shot of liquor and see what he does" is probably the depth of the science behind such treatments. --Jayron32 20:09, 20 March 2010 (UTC)[reply]
Actually, as an aside, the lobotomies were developed in the 30s, and were common in the 40s and 50s; incidence decreased after that, but they were still performed in the 80s. As an ever further aside, I once worked with someone who'd had one done in the 80s. She thought it was successful: it enabled her to lead a normal life and was better than being on drugs. So perhaps the idea's not entirely daft...? Gwinva (talk) 22:18, 21 March 2010 (UTC)[reply]
Not being of age to drink, I wouldn't know, but perhaps it is the (or so I hear) burning-like feeling of drinking high-alcohol-volume drinks that revives them. This is probably a worthless answer (as a shot in the dark), but still a theory. Ks0stm (TCG) 20:10, 20 March 2010 (UTC)[reply]
Brandy was considered a 'tonic' - in small quantities it induces a warm, sleepy, relaxed feeling that can offset some of the negative symptoms of things like head colds or flus. Any curative properties come from its ability to help you relax and get some uninterrupted sleep, but don't discount the value of that.
This Google-cached page may provide some perspective (the webpage appears to be down). A Google search for medicinal brandy seems to provide a couple of useful links as well. Matt Deres (talk) 20:41, 20 March 2010 (UTC)[reply]
Ah, here we go. Our article on Armagnac (a kind of brandy) seems to be what you want. In short, yes, it was definitely prescribed for medicinal use. Matt Deres (talk) 20:47, 20 March 2010 (UTC)[reply]
Until WWII, brandy or rum was used to treat hypothermia. It does make you feel warmer, but actually cools your body core. A good discussion here[12]. Alansplodge (talk) 21:49, 20 March 2010 (UTC)[reply]
A 19th century doctor giving brandy as a remedy would have been less harmful than the almost universal practice of bleeding (which quieted a person down due to loss of blood) and purging with a poisonous mercury compound, calomel (which had dramatic effects like causing teeth to fall out and causing nonstop drooling). Give me brandy instead any day. The following is for historical reference only and is not presented as medical advice. An 1800 publication said (page 245) "The stimulating nature of alcohol has been generally acknowledged" and said it could stimulate the heart muscles, but mostly noted in vitro experiments with animal tissue. An 1849 medical publication said "The value of alcohol as a medicine is universally acknowledged," and said "brandy, wine and porter" were the most valuable forms. It was said to be "a pure stimulant" imparting "temporary vigor and exhilaration," and its use was recommended for a variety of ailments. A popular book on natural science from 1869 said that alcohol in small quantities affects the body "like medicine" but in large quantities "like poison." It was said to "increase gastric juices" aiding digestion, to "excite the brain and nerves," to "accelerate the circulation," and to "strengthen the weary and him who is exhausted bodily or mentally." The book noted the negative reaction which followed such stimulation. The Lancet (British) (1872) noted that alcohol was sometimes given "in cases of atrophy in children and in tuberculosis" and for "marasmus" (severe infant malnutrition). It was then used for "wasted children" to make them "fatter and stronger." An 1877 book by a medical doctor who was an official of the American Medical Association and allied with the temperance movement noted that alcohol had little food value and little medical value, except as a stimulant with paradoxical properties as a depressant; that it had many impurities and lacked a standardized formula; and that it was subject to abuse if self-administered.
The book also noted that thousands of doctors regularly prescribed it (pretty popular with the patients, I would expect). An 1883 scholarly paper concluded that alcohol's "chief therapeutic use" was as a stimulant, "a temporary imparter of power, which shall enable the system to stand some strain of like duration." It might be given in the event of temporary reduction of heart action, fainting, exhaustion, or blood loss. It could aid in food digestion, in fevers, and in typhoid, to treat snakebite or certain poisons, or to lessen pain. He cautioned against treating depression with it. An 1888 publication discussed medicinal brandy specifically and seems very much on point. Brandy was then the first choice as a medicinal source of alcohol, as a stimulant and nutrient. Other publications noted the common adulteration of supposed brandy and advised simply using pure alcohol, diluted and flavored, if alcohol was desired for medicinal purposes. Some of these seem to be POV from the temperance movement. An 1899 publication said that earlier physicians regularly prescribed alcohol "to combat shock," but that that use was discredited. Edison (talk) 02:21, 21 March 2010 (UTC)[reply]
Thanks for that, it's very interesting. So doctors of that time had the impression that it was a stimulant, despite the evidence provided by drunks in gutters... It seems they were using it in a similar way to the modern use of epinephrine (although epinephrine isn't used as often). --Tango (talk) 11:07, 21 March 2010 (UTC)[reply]

The tradition of using distilled spirits for medicinal purposes goes a lot farther back. In fact one of the earliest uses of it was medicinal, hence the name aqua vitae. --Saddhiyama (talk) 10:53, 21 March 2010 (UTC)[reply]

Actually brandy does have a medicinal use which doesn't require drinking it: see Clothes_hanger#Unintended_uses. --TammyMoet (talk) 11:56, 21 March 2010 (UTC)[reply]
Yes, alcohol as an antiseptic is a valid use. --Tango (talk) 12:47, 21 March 2010 (UTC)[reply]

Warren Buffett valuing a business

In his 1989 letter to the shareholders of Berkshire Hathaway, Warren Buffett invites business owners to contact him if they want to sell their company. He then goes on to say this:

We can promise complete confidentiality and a very fast answer - customarily within five minutes - as to whether we're interested.

What precisely is it Buffett looks at that allows him to make the decision so quickly? I suppose it has to do, at least partially, with finding a solid performance during the last ten years or so, but is it known more specifically how he values the business? —Bromskloss (talk) 19:39, 20 March 2010 (UTC)[reply]

"as to whether we're interested" doesn't mean "we'll decide within 5 minutes whether to buy" - it means "we'll decide within 5 minutes whether to start thinking about whether to buy" -- Finlay McWalterTalk 19:42, 20 March 2010 (UTC)[reply]
You might be interested in the booklet "Warren Buffett and the Interpretation of Financial Statements" (a play on the original "Interpretation of Financial Statements" by Benjamin Graham). It's an extremely simple and short book that captures the simple and short way Warren looks at non-financial businesses. Warren looks for steady revenue and profitability (in good times and bad), a good return on assets and conservative financing. He also prefers it to be a simple business that he can understand and that operates in a market that isn't going away anytime soon. As long as those things are true, he will consider bidding on it, allowing the price he offers to determine his expected return (so begins the real thinking that Finlay mentioned above). My own interpretation is that he maintains this kind of folksy, non-threatening demeanor as part of his strategy to appeal to private business owners who deeply care about the business they have created. He rarely changes the operations of a business and rarely involves himself in them. He rarely sells off chunks of the business or attempts to merge it with Berkshire's existing businesses. He limits his meddling to hiring - and setting up proper compensation incentives for - management. I believe that long-term private owners would prefer to sell to Berkshire - and would accept a discounted price - because they know it's likely that their creation and legacy will continue in a largely unchanged way. NByz (talk) 20:14, 20 March 2010 (UTC)[reply]
He's also recently been talking about the scale of the business being an important factor. It's simply not worth his time to buy furniture stores anymore, however successful and central to Berkshire's culture they might end up. NByz (talk) 20:17, 20 March 2010 (UTC)[reply]
His standard answer is that he looks for companies that have great management, a strong competitive position (a "moat"), reliable revenue streams, and a price that is far enough below fair value to provide a margin of safety. As for fair value, he is reported to calculate a discounted cash flow that is based on what he calls "owner's income," which is EBIT (earnings before interest and taxes) minus capital expenditures, disregarding "one-time expenses" in annual accounting unless they happen often enough to be a red flag. Using actual "owner's income" numbers for the previous five to ten years, he estimates a growth rate applied to the next five years, a smaller growth rate for the five years after that, and a low "perpetuity" growth rate after that (roughly the long-term expected rate of real GDP growth). He discounts this via a rate of 8%-12% depending on the reliability of revenue streams and baseline generic estimates of the future cost of capital. The total derived provides the value of the company, which is reduced by a margin of safety to provide the purchase bid. 63.17.86.9 (talk) 09:34, 21 March 2010 (UTC)[reply]
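The two-stage-plus-perpetuity discounted cash flow described above can be sketched numerically. This is only a minimal illustration of the arithmetic: the growth rates, discount rate, starting "owner's income" and 25% margin of safety below are made-up inputs for demonstration, not Buffett's actual figures.

```python
# Illustrative sketch of a two-stage DCF with a growing perpetuity.
# All numeric inputs are assumptions for demonstration purposes only.

def intrinsic_value(owner_income, g1, g2, g_perp, r, years1=5, years2=5):
    """Discount "owner's income" (EBIT minus capital expenditures):
    years 1-5 grow at g1, years 6-10 at g2, then a Gordon-growth
    perpetuity at g_perp, all discounted at rate r (requires r > g_perp)."""
    value = 0.0
    income = owner_income
    year = 0
    for _ in range(years1):           # first (faster) growth phase
        year += 1
        income *= 1 + g1
        value += income / (1 + r) ** year
    for _ in range(years2):           # second (slower) growth phase
        year += 1
        income *= 1 + g2
        value += income / (1 + r) ** year
    # Terminal (perpetuity) value, discounted back to the present
    terminal = income * (1 + g_perp) / (r - g_perp)
    value += terminal / (1 + r) ** year
    return value

# Hypothetical company: $10M owner's income, 8% then 5% growth,
# 3% perpetual growth, 10% discount rate.
value = intrinsic_value(owner_income=10.0, g1=0.08, g2=0.05, g_perp=0.03, r=0.10)
bid = value * 0.75  # assumed 25% margin of safety below estimated fair value
print(value, bid)
```

Note that the valuation is very sensitive to the discount rate and the perpetuity growth rate, which is presumably why the 8%-12% range mentioned above matters so much to the final bid.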

Well, you should know that he never invested in technology companies, for example, since his primary criterion was a good, solid idea of how the company (and the world with respect to that company) would look ten years later. So anything, such as high-tech stocks, whose market ten years out no one has any idea about, he would not buy. By contrast, a family-owned, 70-year-old, traditional xyz manufacturer - now that he could start trying to picture. The main question he would ask within 5 minutes is: what is your durable competitive advantage? If the voice on the other end says "we, er, we always put 100% into everything we do, we work with such passion and dedication that, uh [click]. Hello? Hello?"
At least, that's how I imagine it :). If you would like to know more about making a company that Warren Buffett will want to buy, you can leave an e-mail address for me, in a properly obfuscated form (for example email -at- gmail dotcom), so that it doesn't get picked up by spammers, and we can see where it goes from there. 92.229.14.140 (talk) 20:08, 20 March 2010 (UTC) —Preceding unsigned comment added by 82.113.106.109 (talk) [reply]

Why did Argentina run its trains on corn when there was world famine after WWII?

Hi, I've been trying to read up on the food situation after World War II, when there was a serious shortage of food in the years 1945 - 1948, and I came across something very strange.

Argentina, traditionally one of the world's largest food exporters, especially during that era, was burning corn for fuel at the same time that people were starving in Europe and Asia.

Why was there such a shortage of fuel in Argentina in the years after the war? Surely, with the needs of the war over, there must have been more than enough oil to spare, especially to trade for precious food?--Stor stark7 Speak 23:27, 20 March 2010 (UTC)[reply]

Well, most of the world was rebuilding, which requires oil, and the US economy was booming then. Also, Argentina probably didn't want to become dependent on the nations that later formed OPEC, as the US did. By the time of the 1973 oil embargo, they may have been proven right. StuRat (talk) 04:47, 21 March 2010 (UTC)[reply]
That is an extremely interesting question. Chapter 6 of this [13] book is dedicated to the era. It's affordable, recently published (2003) and, according to this [14] link, the author's area of study "...focuses on grain farmers, the state, and changing economic and political conditions between the two world wars in Argentina." This political and economic situation should set the stage for - and hopefully directly address - your question. I should also mention that your question has encouraged me to add that book to my Amazon wish list; Argentina's history is very interesting and in many ways is a counter-factual to the history of my own home country, Canada. NByz (talk) 05:06, 21 March 2010 (UTC)[reply]
Famines are very rarely due to an absolute shortage of food on a world scale. The food shortages after WWII, at least in Europe, were due to economic, transport and infrastructure issues, not the non-existence of food. Argentina may well have had a surplus of corn. Additionally the increase in oil requirements to get the French and German economies moving again probably outweighed the oil savings through not needing to fight. There were still a huge number of troops in Germany needing to be supplied and transported, even if they are not fighting. DJ Clayworth (talk) 15:52, 22 March 2010 (UTC)[reply]
Thanks for the suggestions. So far I've been able to ascertain that the U.S. was putting much pressure on Argentina, including trade restrictions in 1944 and 1945, finally leading to Argentina yielding and declaring war on Germany & Japan in March 1945. Relations became seriously strained again in late 1945 and 1946, when the US worked hard to depose Argentine President/Dictator Peron (whom they saw as a crypto-Nazi). In the midst of this we have the UK, who were unwilling to impose sanctions against Argentina because they needed Argentine food, mainly beef. In 1947 they reneged on the agreement the US had pushed on them (and on countries such as Sweden) not to trade arms for food. (Argentina seemed to be in a paranoid phase: the U.S. was trying to interfere in Argentina's internal politics, was sending weapons to neighboring Brazil, and was embargoing weapons to Argentina, while Argentina in turn was rapidly expanding its armed forces.) However, I've also seen that the Argentinians themselves during this time were putting restrictions on exports, for whatever reasons. The book on Argentina was a good idea; I'll have a go at it. As for oil I don't know, but my gut feeling is that the European economies were not oil based; they were still running on coal. See this.[15] And although France was being reconstructed, the same did not apply to Germany until mid 1947 or possibly 1948; see this.[16] As for food surpluses existing elsewhere, that is very accurate: for example, the Swedish fishing fleet was working part time until 1948 due to lack of paying customers, and at some point the Dutch had to start destroying their harvest because Germany, which traditionally bought the food, was under production restrictions, divided into hermetically sealed occupation zones, and not in a position to pay for it. Meanwhile in the U.S. 
they were destroying potatoes[17] while during that very same winter in Germany under Allied occupation several hundred thousand starved to death[18]. It seems a bit messy to understand the whole thing for now, but perhaps it might have been a question of a reluctance in Congress to release funds for purchase of more than the very minimum of food-stuffs for the enemy aliens? --Stor stark7 Speak 01:26, 25 March 2010 (UTC)[reply]

American Culture

I live in Sweden, and on Swedish TV you can see quite a few American shows. One thing that I simply cannot understand is the doll-like appearance of men and women, especially talk-show hosts. Have a look at this picture, for instance: [19]. To me, and - I would be surprised if I were wrong - probably most Europeans over 18 years of age, the woman looks more like an ever-smiling plastic doll than a living human being with true emotions. (And the man is not far from this either.) Please believe me: I have absolutely nothing against Americans, and I truly believe that everyone should dress the way she likes. But although I try my very best, I cannot understand why American TV show hosts want to look like this. (Now I am only speaking for myself, but when it comes to women I find them attractive when they look natural, as humans do, without makeup and strange clothes. And when they are not ever-smiling (unless they really are happy all the time - maybe Americans really are that happy).) I would guess that most mature Europeans could almost laugh at this American "phenomenon", and I find it very hard to understand in what way Americans are different. What is so different about American culture? And again: please trust me, I have nothing against Americans. I simply want to understand. --Andreas Rejbrand (talk) 23:58, 20 March 2010 (UTC)[reply]

Um, how do Swedish talk-show hosts dress? -- Mwalcoff (talk) 00:19, 21 March 2010 (UTC)[reply]
Less like what I tried to describe above. Actually, I almost never watch TV, for most programmes on Swedish TV are ... sick (as I guess is the case worldwide). You know, people arguing and hurting each other without any reason, a lot of sex, very little respect and understanding, etc. It has to "sell". However, public-service TV is still a bit better. (In Sweden the public-service TV company is called SVT.) But, to provide a couple of screenshots from SVT: [20] and [21]. --Andreas Rejbrand (talk) 00:24, 21 March 2010 (UTC)[reply]
Just as an FYI, the woman in the picture is Suzanne Somers. I have little idea of her age, but I'd guess she's in her fifties. And though she has been a spokesmodel for the Thighmaster for a number of years, I wouldn't be surprised if she's had a little plastic surgery. Dismas|(talk) 00:38, 21 March 2010 (UTC)[reply]
Oops... I guess she was older than I thought when I used to watch Three's Company as a kid. Somers is 63. Dismas|(talk) 00:39, 21 March 2010 (UTC)[reply]
The picture in link 17 is not from a talk show but from the 1990s version of the hidden camera prank show Candid Camera, the joke in this case being that people at the license bureau were given offensive license plates. A sweater vest like the guy in link 19 is wearing would seem incongruously casual, like they picked some guy up off the street to host the show. -- Mwalcoff (talk) 01:14, 21 March 2010 (UTC)[reply]
Really? Most Swedes would think wearing a suit just because you are a TV show host is like saying: "Hey, I am the president of the United States". In Sweden, TV hosts would not wear a suit unless it is some really, really major traditional, classy, formal event, such as the Nobel banquet. Nevertheless, I still do not understand why women tend to look so utterly plastic and unnatural. (I mean, 50+ and trying to imitate a plastic doll?) I guess that many Americans react as I do, but it is still accepted. --Andreas Rejbrand (talk) 01:28, 21 March 2010 (UTC)[reply]
This is the kind of thing that feminists began griping about in the late 60s (if not much earlier), but it persists. The archetype would probably be Vanna White, who once joked that her job of turning the letters on Wheel of Fortune was challenging, "because you have to know the whole alphabet." ←Baseball Bugs What's up, Doc? carrots 01:38, 21 March 2010 (UTC)[reply]
As backwards as this might sound, I don't think it's for the male demographic that women look this way. I don't know any guy (I'm a 30 something American) that likes the look of plastic women. They may like plastic women but not the look thereof. I think it's the women who find the plastic look most interesting in a "look what she can afford to do to try and look younger" sort of way. It's a sign of wealth and opulence. The older looking woman from the second link looks like she would fit in on an American news program but only if she were a more respected newswoman. Then she'd be able to "get away" with looking older. As my reference for this last statement, Diane Sawyer and Connie Chung. Dismas|(talk) 01:54, 21 March 2010 (UTC)[reply]
That sounds quite reasonable (but, of course, a bit "sad"). Personally, I really do not like it when women alter their appearance. I think they look the best (and least "silly"/"immature"/"superficial") when they look natural, as human beings do look. (Also, I think that the wonder of (sexual) attraction is much about the insight that it is a real, living human being, just like yourself, that has opened herself to you.) Now, of course, I am 22 years old, and like women of my own age, so I cannot really tell for sure exactly how much I would dislike the plastic appearance, but I sure would not like it. --Andreas Rejbrand (talk) 02:02, 21 March 2010 (UTC)[reply]
Just FYI, without getting into who actually wants plastic people, from a feminist viewpoint "women should look more natural and not alter their appearance, which looks silly and superficial" is not a great improvement on idolising barbie. After all, you alter your appearance. Do you think all men who shave or trim their beards look silly and superficial? This isn't intended as an attack on you at all, just saying not to be surprised if women don't generally look very happy when you make this argument. 86.177.124.127 (talk) 12:55, 21 March 2010 (UTC)[reply]
Yes, that is actually a very good point. But even if it is not a great improvement, I think we can agree that it at least is a small improvement. I mean, I shave my beard, but I do nothing else, and I am afraid that there is quite a lot else going on here (excessive makeup, daring clothing (even in 50+ persons, who should be more mature than that), maybe even plastic surgery, etc.). --Andreas Rejbrand (talk) 13:28, 21 March 2010 (UTC)[reply]
Since you provided pics of your public broadcast station, here's a pic from ours, PBS: [22]. As you can see, they all wear suits. There's a definite movement toward casual wear in most US jobs, but it hasn't made it to news anchors yet. And as for the Nobel prize ceremony, wouldn't that require a tuxedo? StuRat (talk) 04:39, 21 March 2010 (UTC)[reply]
Gwen Ifill looks pretty casually dressed in that picture. Bus stop (talk) 22:49, 22 March 2010 (UTC)[reply]
I'd say that's a pantsuit, made infamous by Hillary Clinton. She has the top unbuttoned, but probably just because she had gained some weight and it looked too tight buttoned (and she didn't want to give us an "eyeful"). StuRat (talk) 14:30, 23 March 2010 (UTC)[reply]
Maybe it is linked to the different TV systems. I find that when I watch US TV which has been poorly converted from NTSC to PAL, the fine detail is smoothed out and the colours look false (unsurprisingly, NTSC is sometimes dubbed "never twice the same color"). This effect is particularly prevalent on entertainment/chat shows and studio news programs. Strangely, the same effect is not so obvious in syndicated TV shows. Add to that the excessive makeup, extensive plastic surgery, and expensive cosmetic dentistry, and it is no surprise the skin tone appears unnaturally smooth and plastic-like. Of course, in real life Americans are just as varied as the rest of us. Astronaut (talk) 11:57, 21 March 2010 (UTC)[reply]
StuRat: Yes, I realised that was a potential bias in my post. But still, even in Swedish "crap TV" (i.e. not public service), it is very hard to find this "doll-imitating" tendency. But truly: the people in the picture you provided sure look much more mature/trustworthy/understanding.
Astronaut: Well, the TV system might explain a few percent of the issue, but I doubt it can explain much more than that. But, I am curious, as a citizen of the United Kingdom, how do you react to this issue? --Andreas Rejbrand (talk) 13:01, 21 March 2010 (UTC)[reply]
You may be interested in the case of US news presenter Christine Craft, who in her late 20s was forced from her position because she was, in the phrase she made the title of her autobiography, Too Old, Too Ugly, and Not Deferential to Men (ISBN 0914629654). Note that this happened almost 30 years ago; the pressure for women on television to look a certain way is long standing. I commend your taste in not liking Barbie-look doll-women, but I would point out, on the other hand, that the potential girlfriends around you are Swedish women in their early 20s, who as a group are generally considered to be high up there on the attractiveness scale. (No, I don't have a source for that.) Taking your cohort into account, it does not surprise me that you find the doll-women unbelievable. BrainyBabe (talk) 13:33, 21 March 2010 (UTC)[reply]
Yes, that is also a very good point. --Andreas Rejbrand (talk) 13:47, 21 March 2010 (UTC)[reply]
A good place to look into this would be the ideas raised by Naomi Wolf in a book called The Beauty Myth. It's been a long time since I read it, and what I don't recall for sure is whether she totally blames this problem on men, or also points out how women themselves feed into it and help perpetuate it. ←Baseball Bugs What's up, Doc? carrots15:00, 21 March 2010 (UTC)[reply]
I do not want to look stupid, but I really do not understand what mechanisms there are that would make it the "fault" of men that mid-age and elderly females struggle to look like Barbie dolls, because I cannot believe that men actually like the plastic appearance of such women. (Personally, they almost frighten me!) I would not be surprised if American men were a bit more tolerant (and even slightly more attracted) to this appearance than European men, but I still would believe that the majority of American men would prefer a natural-looking (i.e. unmodified) mid-age woman to a mid-age woman with excessive makeup, daring clothing, and maybe even plastic surgery. And even if you would like this appearance, at least I would be frightened by the "mind"/"soul"/"personality" of this woman: how can she be that obsessed by her looks? If she is that obsessed with her looks, is she really tolerant/understanding towards minorities/etc.? Well, maybe it's just me having prejudices, but my point remains valid: if men do not like the plastic appearance of these women, how can you blame men for these women's "obsessions"? --Andreas Rejbrand (talk) 17:18, 21 March 2010 (UTC)[reply]
Well, I guess I have to admit to being immature, then. I can only view someone like Judi Dench for so long, then I turn the channel to view women I find attractive. From an evolutionary POV, there's no use in being attracted to post-menopausal women, as they won't help you pass on your genes. Therefore, to the extent that a woman can fool men into thinking she's still fertile, she should still be able to attract men. But, of course, there's a point where no amount of make-up or plastic surgery will fool anyone. I think of this being similar to a man's comb-over. When they only have a slight thinning of the hair, a comb-over works great and fools everyone. But, when they are almost completely bald, it's more like a bad joke. Still, once a man gets started down that path, it's hard to stop. StuRat (talk) 00:27, 22 March 2010 (UTC)[reply]
I have read quite a few of your posts at RD/C, and you appear to be highly mature. I agree totally with your reasoning. And as I said above, I cannot really tell whether I would be attracted to this plastic style or not, because I am far too young to be interested in either [23] or [24]. But I find the doll style so "silly" that I really believe I could never like it. But, I cannot help but wonder: You say that you do not like the appearance of Dench, but surely you would prefer her appearance to Dolly Parton or [25]? Personally, I feel that Dench looks so much more mature, wise, and understanding compared to Parton or the Candid Camera one, that I would not care a bit about their looks (even if I did not find the plastic look extremely awful). However, of course, we can probably both agree that the "sexual attractiveness" of either Dench or Parton is far, far below that of e.g. Emmy Rossum. (Although the latter appears to have a BMI that might be lower than what is healthy - I hope Rossum does not actively try to maintain a subnormal weight just because she thinks that she has to in order to "look good". She will look great at any healthy weight.) But: what is inside one's head is far more important than how one looks. --Andreas Rejbrand (talk) 08:49, 22 March 2010 (UTC)[reply]
Well, Suzanne Somers is 63, yet still looks fertile in that pic, so she seems to have pulled it off. (I suspect that she'd look worse up close, or in HD TV.) But, I'd still rather watch her than Judi Dench. StuRat (talk) 12:40, 22 March 2010 (UTC)[reply]
OK, I see. Personally I have nothing against Dench, but would prefer not to be in the same building as Somers, not because of her appearance, but because of what her appearance hints about her soul. Well, I was about to start an argument, but then I realized that this website is not the place for it. The factual question was about the cultural differences between US/EU when it comes to the dollification of mid-age to elderly women, and I think we have examined those rather well. My hypothesis is that American men and women are more tolerant to this phenomenon because they are more used to it. Indeed, they grow up watching "dolls" on TV. It also would appear as if American men really can be attracted to these "dolls", although - at least personally - that is somewhat of a mystery to me. I do find the appearance of these "dolls" terrible (without exception), and - more importantly - I find the pursuit of these women highly silly, even embarrassing. I think that a 40+ woman ought to be mature, ought to have a bit more distant view of things and life. After all: appearance is not important compared to personality, and everyone inevitably grows old, and - if you ask most people - attempts to dollify old women seldom (never, if you ask me) succeed. Without dollification they do look good - they look human. Also, I find it very tempting to associate these "superficial" women with properties like narrow-mindedness, poor understanding of minorities (by which I mainly mean unusual personalities), subnormal intellect, etc. To a large extent these associations are probably wrong, however (I hope). I do not know if European men are like me in general (it would be interesting to see some statistics), though. But perhaps the most important aspect of all is that these women probably cannot be very happy: they are in fact trying to win a war - the war against ageing - that they cannot win. And, as Lisa Simpson would point out: they set a really lousy example for young Americans.
I hope that Rossum will age with dignity. --Andreas Rejbrand (talk) 13:16, 22 March 2010 (UTC)[reply]
One thing I should clarify, in case it isn't obvious, is that women in the entertainment industry go much further in trying to look young than the rest of the women in the US. So, if you walk down a street here, you aren't likely to find women who look like that. So why do women in the entertainment industry do it ? Because they think it will help their careers, of course. StuRat (talk) 15:00, 22 March 2010 (UTC)[reply]
Yes, I know. I have actually been in NYC, and I do not recall that the people there looked any different from how they look in Stockholm. All people I met were very kind (maybe even more kind than people in Stockholm - Swedes are a bit reserved). But still, if people in media/the entertainment industry dollify themselves, it must be more or less accepted by the public? Also, I guess that the US is a vast country, with a lot of nuances, so the situation in LA is probably different from the one in NYC? --Andreas Rejbrand (talk) 16:14, 22 March 2010 (UTC)[reply]
I don't think it's so much about what American men like in their women as about the history of television. TV shows in the U.S. have always aspired to make the viewer think they're watching something special, something that's a big deal. Indeed, one of the first big TV series was called Your Show of Shows. Certainly it would not have looked right for Fred Astaire to do one of his specials wearing a turtleneck, or for Ed Sullivan to show up in a sweatshirt and jeans! More casual dress would be appropriate for a show aiming for a different sort of feel, such as something aimed at a young-adult audience. When David Letterman's show aired at 12:30 a.m., he dressed more "business casual," but he switched to a suit and tie when he moved to the 11:30 p.m. slot and a broader audience. Candid Camera may be a low-budget show, but they don't want to give that away to the viewer. If the hosts appeared in casual clothing, it would say either, "We're making a lame attempt at seeming young and hip" or "We're not even trying to look professional, so don't bother watching our show." -- Mwalcoff (talk) 22:35, 22 March 2010 (UTC)[reply]
Americans can't be casual before a large audience because casualness is more diverse than formality in dress. Casual dress would alienate portions of the audience more than formality does. America is a country of diverse ethnicities, and formal wear is more limited in scope than casual clothes. Bus stop (talk) 23:08, 22 March 2010 (UTC)[reply]
That last remark is very interesting. In Sweden it is quite the opposite: it would be very alienating for a TV show host to wear a black suit with tie... --Andreas Rejbrand (talk) 23:37, 22 March 2010 (UTC)[reply]
But it is a rather pleasant hypothesis. At first I thought American TV show hosts wear suits because, you know, they thought they were so important and rich, much more important and far richer than the average American. But the reason you suggest is more sane: They do not want to offend people, and so they are wearing "natural" clothes. And the reason why I did not recognise this is that the black suit is more uncommon in Sweden, and rather associated with... well... unbelievably rich and powerful people, or at least people who think they are that (or Matrix-styled agents or whatever!). --Andreas Rejbrand (talk) 23:47, 22 March 2010 (UTC)[reply]
I'm assuming Sweden is a less ethnically diverse population. Don't forget America also has a unique relationship between black people and white people. You have competing interests in the clothes one chooses to wear, among other appearance choices. Formalwear is off-putting, it is true. But one has to weigh what is to be gained by looking attractively "natural" against what is to be gained by appearing more "nondescript." My theory is that if it is a nondescript appearance that one values most, one is more likely to find it in that which adheres to a degree of formality. Casualness is too "expressive." That expressiveness is certainly valued too, but I think it may tend to be valued more in contexts in which one is relating to a more narrowly defined segment of people. Bus stop (talk) 23:50, 22 March 2010 (UTC)[reply]
Now I understand why formal clothing is used - why men wear suit and tie. But now we can study my major concern regarding "dollification" in the light of these new ideas: are not "normal" US women alienated by 50+ women who (mostly in vain) use plastic surgery, cosmetics, etc., to look like they are 20 to 30 years old? --Andreas Rejbrand (talk) 00:05, 23 March 2010 (UTC)[reply]
I think celebrity is about alienation. You wouldn't want your next door neighbor looking like an alien from another world. But stars (of the human type) lend themselves particularly well to outsized thoughts of the possibilities of human life. We never get to know them, so we can load them up with all our ideas about what people can be, and even though we are probably wrong, we blissfully never find out about it. Bus stop (talk) 00:20, 23 March 2010 (UTC)[reply]
I guess you are right. But personally I just get angry/afraid when I see how stupid people can be. I would boycott any TV show hosted by someone like Somers. --Andreas Rejbrand (talk) 08:35, 23 March 2010 (UTC)[reply]
To make sure we're comparing the same things, let's look at some Swedish actresses, for a fair comparison with the American actress you used in your example, Suzanne Somers. When I do a Google image search on "Swedish actress", I also get a lot of "dollified" women, such as Victoria Silvstedt here: [26]. Now, she looks just like a Barbie doll, to me. You might argue that she just naturally looks like that, but it appears to me, in the pic at the top of our article on her, that she dyes her hair blond. StuRat (talk) 04:24, 23 March 2010 (UTC)[reply]
Yes, I cannot deny that we have a lot of <can't find the right word, thought of "morons" but we should watch our language> in Sweden too. I had never heard of Silvstedt before, and she appears to be more of a pornographic model/actress than a regular actress. But I still think there is a difference between US and Swedish TV: First of all, men do not wear black suits and ties (unless it's an exceptional event), but this has been explained above, and it is very, even exceptionally, rare that well-known 50+ TV women use plastic surgery, cosmetics, etc., to try to look like 20. I think I have never seen that, at least. (I only watch SVT public-service TV, but occasionally I "overlook" when others are watching other channels, of course.) And I think that it would not even be accepted to look like Somers on Swedish TV. People would just say "Oh, my God, she is 60 years old and dresses like a teenager. And she has probably had plastic surgery too. She really embarrasses herself." People would just ignore her as a silly person who most likely has nothing wise to say. --Andreas Rejbrand (talk) 08:35, 23 March 2010 (UTC)[reply]
Note that Suzanne Somers is also a pornographic model: Suzanne_Somers#Playboy_pictorials, so my comparison is a valid one. StuRat (talk) 14:18, 23 March 2010 (UTC)[reply]
I see. Well, of course, one difference is the age: I think it is a bit worse that a 63 y.o. woman tries to look like 20, compared to a 35 y.o. woman trying to do the same. (And I would not be annoyed at all if Emmy Rossum tried to look like 20! (not factorial)). This does not mean that I am right and you are wrong, however: to make statistics we need a lot of American and Swedish TV hosts. Nevertheless, I do believe that such an analysis would show that older Swedish hosts make less of an effort to look (significantly) younger than they are. --Andreas Rejbrand (talk) 14:55, 23 March 2010 (UTC)[reply]
Well, of course it is not good to think that a 60 y.o. dollified woman has nothing wise to say. That's an oversimplification. There is nothing that says that such a woman cannot be wise. However, I suspect there is an increased probability of that, for a wise 60-year-old woman, who is satisfied with her life and has understood what is important in life, really should not use plastic surgery in a futile attempt to stop ageing.
I must also admit that I am a slightly untypical Swede. I react much more strongly to things I find (morally) wrong than most people I know. For instance, a lot of Swedes actually watch TV shows such as Expedition Robinson, and some even watch shows like Big Brother and Paradise Hotel, all shows I would not watch even if I was paid for it. (People being immature and hurting each other without rational reason, animal cruelty, too much ethanol consumption, strange views on sex, etc.) But most people are not constantly upset by these shows, as I am. --Andreas Rejbrand (talk) 08:56, 23 March 2010 (UTC)[reply]
Why do you even care how old the actress is? You've apparently invented a narrow, one-size-fits-all idea of what every 60-year-old woman "should" look like, and now you're unhappy because some of them don't.
If Somers can do her job -- and her employers apparently think that she can, because they keep hiring her -- then why do you care how old she is or how natural her hair color is or whether she meets your personal aesthetic ideal for a woman with a given, but 100% employment-irrelevant, characteristic? Do you also have ideas about what people born in a certain town should look like, or norms that you expect women with a given number of children to meet for their appearance?
BTW, I probably don't fit your curve, either. When I was fifteen, I had people assume that I was 25. I don't think that anyone has correctly guessed my age since I was a young child. It runs in my family. I have a pair of nearly identical photographs showing my grandfather at age 16 and age 50; he doesn't appear to have aged at all during this time. People regularly thought my mother was my sister -- my younger sister, even. When she was forty, she almost had a young man faint in front of her because her response to his wildly inaccurate guess (24 or 25) was to whip out her photos of her five kids. She died some years ago, but she'd be 60 now. Would you be equally offended if you saw her on television now? I guarantee that, despite shunning cosmetics and despising cosmetic surgery, she wouldn't happen to line up with your narrow-minded definition of what a 60-year-old woman "should" look like. WhatamIdoing (talk) 19:27, 23 March 2010 (UTC)[reply]
I am afraid that you have misunderstood my views. I do not care a bit about how people look. People can look exactly as they do, or as they wish. I just say that I find it a bit (to say the least) immature when a 60+ woman spends a lot of money, time, and effort trying to look like 20 by means of plastic surgery, excessive makeup, daring clothes, etc., more or less fighting a futile war against ageing, and setting a lousy example for young people. I mean - she must really think it is important how she looks. Which it is not, compared to how she acts. Personality, wisdom, understanding, etc., are so much more important than appearance. And it is seldom wise to fight unwinnable battles (she will become old, no matter what she does). --Andreas Rejbrand (talk) 20:54, 23 March 2010 (UTC)[reply]
Now, I would not be offended if I saw her. But I am pretty sure that Dolly Parton, Suzanne Somers and their likes actually have been modifying their appearance rather heavily. --Andreas Rejbrand (talk) 21:58, 23 March 2010 (UTC)[reply]
If you actually didn't care, you wouldn't have spent four days talking about it. It wouldn't bother you any more than things that you actually don't care about, like the color of the carpet in their bedrooms, or what they had for lunch three weeks ago.
If you actually didn't care, you also wouldn't be judging these women's moral characters on the basis of whether their appearance lines up with your personal standard for what women aged 60+ "should" look like. People who actually don't care don't declare that women are "immature" for caring about their appearance or deride them for "setting a lousy example for young people" (as if young people could be credibly expected to follow senior citizens for fashion standards).
Who cares why these women choose the appearance they have? Maybe they believe that it brings them millions of dollars (and it might), but who cares?! It doesn't hurt anyone, and it doesn't cost anyone anything. Isn't there some real injustice you could be worrying about, like whether children in Haiti are going to drown when the hurricane season begins? WhatamIdoing (talk) 00:17, 24 March 2010 (UTC)[reply]
Yes, there is. I am particularly interested in animal rights. And, yes, you are right to some extent: if I had not found these people - 60+ women trying to look like 20 by means of plastic surgery etc. - far more ugly than 60+ women who have not altered their appearances, I would maybe only have spent - say - 75 % of my effort in this discussion. I am only human. Still I find it hard to believe that these women are happy. They are fighting in vain - eventually even Dolly Parton will become 80. How will she handle that? And why did she spend maybe a huge amount of money on plastic surgery? Is that wise? As you say, people are starving and drowning around the globe. And her publicity is huge: her behaviour is likely to make people accept plastic surgery etc. to a greater extent. People might destroy their bodies, become frustrated, and waste a fortune on this. And one more thing: if I were a kid and saw Dolly Parton, I would be scared. And scaring people is not really good either. I know you will not accept my reasoning, so I think I will leave this discussion at this point. --Andreas Rejbrand (talk) 00:50, 24 March 2010 (UTC)[reply]
You did read my text
Well, of course it is not good to think that a 60 y.o. dollified woman has nothing wise to say. That's an oversimplification. There is nothing that says that such a woman cannot be wise.
above, right? If not, I understand why you react... --Andreas Rejbrand (talk) 01:16, 24 March 2010 (UTC)[reply]
I mean, if I knew that Dolly etc. do not feel bad about the race against ageing, that they can afford the surgery, that they are not narrow-minded but understand even the unusual people, and if it is not the case that the public is misled by their appearances, and if everyone is happy etc., then I do not mind how they look (although, of course, I would not marry any of them!). I do believe that we can understand each other if we only interpret each other's messages with the proper nuances. --Andreas Rejbrand (talk) 02:12, 24 March 2010 (UTC)[reply]
Please also notice that my original question was more or less a purely factual one: I have simply noticed that American TV show hosts dress and look in a way that probably would not be accepted by a Swedish audience, and I wondered what mechanisms have created these cultural differences between the US and Sweden. An excellent partial hypothesis was given by Bus stop above. --Andreas Rejbrand (talk) 02:23, 24 March 2010 (UTC)[reply]
They look weird to most Americans too. They are generally older than they are made up to look, and they use puffed-out hairstyles and clothing that cover or distract from the effects of aging. Also, most US TV shows are made around Los Angeles, which has sort of a distinctive appearance of its own. It's probably like that all over; you might be able to tell from someone's hairstyle and general demeanor that they live in a particular part of Stockholm, even though they didn't necessarily grow up there. 66.127.52.47 (talk) 07:15, 24 March 2010 (UTC)[reply]
Thank you! I guess that was what I wanted (or needed) to hear! --Andreas Rejbrand (talk) 12:47, 24 March 2010 (UTC)[reply]
Also note that I've worked in Woodland Hills, California, a suburb of LA, and there I did occasionally run into women who looked like that. When I did, I thought to myself "must be a porn star". StuRat (talk) 16:13, 24 March 2010 (UTC)[reply]

March 21

Religion

Do you have an accurate % of how many people are Catholic in the U.S.A.? Thank you and God Bless you, Father Jason Joseph Asche —Preceding unsigned comment added by Father Asche (talkcontribs) 05:37, 21 March 2010 (UTC)[reply]

Religion_in_the_United_States#Christianity says 23.9%. StuRat (talk) 05:47, 21 March 2010 (UTC)[reply]
Note that this counts people who identify as Catholic. Not all of them will be practising Catholics. If you want the percentage of people who actually attend a Catholic church service on a regular basis, we'll have to do more hunting (the values differ depending on whether you ask people how often they go to church or ask churches how many people attend!). --Tango (talk) 12:45, 21 March 2010 (UTC)[reply]
Such a caveat should not be restricted to Catholics, though. Many Protestants also are "only on holidays" churchgoers. ←Baseball Bugs What's up, Doc? carrots14:54, 21 March 2010 (UTC)[reply]
This page[27] gives a figure of 65% practicing to 35% non-practicing. More data here[28]. Alansplodge (talk) 17:03, 21 March 2010 (UTC)[reply]
Absolutely. I didn't restrict it, I just didn't consider protestants relevant to the question. --Tango (talk) 17:27, 21 March 2010 (UTC)[reply]

Any truth to reports of Chinese people making cooking oil from sewage

I want to know if there is any truth at all to certain websites on the net suggesting that Chinese people in China consume food made from cooking oil which is created from raw sewage. For example this url

http://www.theepochtimes.com/n2/content/view/31712/

122.107.207.98 (talk) 12:54, 21 March 2010 (UTC)[reply]

It's probably true that a certain amount of cooking oil is recycled in China (which is what that article is suggesting, it's not oil "created from raw sewage"). To what extent the particulars of the article - that oil is harvested from sewers, that oil is recycled in such large quantities, or the level of health risk - are true I couldn't say. How much credit do you give the sources (eg the Epoch Times)? FiggyBee (talk) 13:36, 21 March 2010 (UTC)[reply]
(ec)That sounds bizarre. There are two possible re-interpretations. One form of cooking gas is methane, which can be extracted from sewage or manure, a process known as biogas. Another is the more general cycle of nature: farmers may use night soil to fertilise their crops, which might include plants from which vegetable fats and oils derive, sunflowers for example. BrainyBabe (talk) 13:44, 21 March 2010 (UTC)[reply]
Or, y'know, you could have actually read the article he linked to before "reinterpreting" the question... FiggyBee (talk) 15:37, 21 March 2010 (UTC)[reply]
what about this url? http://www.recordchina.co.jp/group.php?groupid=40648&type=1 122.107.207.98 (talk) 14:25, 21 March 2010 (UTC)[reply]
This is a gamer url talking about it. http://gbatmw.net/showthread.php?tid=13428 122.107.207.98 (talk) 14:33, 21 March 2010 (UTC)[reply]
We need some perspective on this. We in the West drink water (tap or bottled) that was flushed down the toilet by others. It's just properly processed to remove contaminants, then returned to the rivers and lakes (from which it is drawn back out for people). The same can be done with oil. The differences are that some of the oil is 100% recycled by humans, while water is not, and the water treatment methods we use are better at removing contaminants than those used by individuals in China. Rather than stopping this process, they should have the government regulate it, so the oil never goes into the sewers, and is properly decontaminated before being reused. Also note that we have a similar issue in restaurants in the West, where the same vat of oil can be reused for long periods, accumulating contaminants. Perhaps a system to continually decontaminate the oil and reuse it would be better. StuRat (talk) 14:59, 21 March 2010 (UTC)[reply]

The oil is being recycled from waste, but not human waste. Based on press reports by government sources from China (and not from anti-government sources like the Epoch Times, which is affiliated with the Falun Gong), the current scandal is about unscrupulous merchants who siphon the oil from kitchen waste, purify it and re-sell it as cooking oil. Some say that one in ten restaurant meals in China is cooked with recycled oil. --PalaceGuard008 (Talk) 02:47, 22 March 2010 (UTC)[reply]

There was a scandal in Britain a few years ago when the French were discovered to be feeding chickens human sewage. It put me off eating chicken for a while. 78.149.193.98 (talk) 20:57, 22 March 2010 (UTC)[reply]

Life of Moreshwar Ramchandra Kale

Please help me find biography of Moreshwar Ramchandra Kale. He had a lot of Sanskrit works but I can't find his biography. Thank you. --ธวัชชัย (talk) 13:43, 21 March 2010 (UTC)[reply]

Israel and the Palestinians

What exactly are the issues between the Palestinians and the Israelis that no matter how many people try to help solve the issues, they are still where they were years ago? —Preceding unsigned comment added by 71.183.76.14 (talk) 13:44, 21 March 2010 (UTC)[reply]

Wikipedia's article Israeli–Palestinian conflict has many linked articles to help you get a full view. Remember, the Reference desk is not a forum for airing debate.--Wetman (talk) 13:56, 21 March 2010 (UTC)[reply]
The Palestinians assert that the territories (West Bank, Gaza Strip and Golan Heights) are occupied territories and demand that Israel vacate them so that they can form a new nation called "Palestine." The Israelis assert that the territories are annexed into Israel and that the Palestinians have no sovereign right to the land from any perspective (historical, geopolitical, religious, etc.). I don't think anyone's been able to help solve their issues because the media spins the issues and very few people are informed about the history behind the present state or really care about the issues. (And as suggested by Wetman, this question will likely develop into a big mess because it's controversial -- but I think I outlined the basics pretty simply.) DRosenbach (Talk | Contribs) 14:01, 21 March 2010 (UTC)[reply]
I don't think it's just the West Bank and Gaza Strip which are in dispute. Both the Jews and Palestinians claim the whole of Israel/Palestine. StuRat (talk) 14:43, 21 March 2010 (UTC)[reply]
On what basis would the Palestinians claim the entirety? DRosenbach (Talk | Contribs) 15:13, 21 March 2010 (UTC)[reply]
On the basis that they were living in it before being expelled by force of arms. DuncanHill (talk) 18:11, 21 March 2010 (UTC)[reply]
(ec) On the basis of Muslims having had control of Israel/Palestine, from 630 AD - 1918 (with the exception of short periods during the Crusades), from the conquest by Mohammed to the defeat of the Ottoman Empire during WW1. See Palestine#Islamic period (630–1918 CE). StuRat (talk) 18:27, 21 March 2010 (UTC)[reply]
1) Israel has yet to expel Palestinians from their land en masse. Arab nations have, however, expelled Jews numerous times.
2) Your argument would be valid if a) the Palestinians actually wanted to just live where they have lived for many years and b) Israel was not allowing them to do so. But the Palestinians don't just want to maintain their private property but demand a sovereign nation on land that "was their sovereign territory" -- but that's based on the false premise that there was ever a Palestinian sovereign territory to begin with. The claim that "Muslims lived there and so other Muslims can by proxy demand the land returned to them" (read "the other group of Muslims") doesn't possess any validity.
I just don't understand your argument -- do you deny these historical realities? DRosenbach (Talk | Contribs) 03:48, 22 March 2010 (UTC)[reply]
I pass no judgment on the validity of the claim, but simply state that this claim has been made (that "historically Muslim controlled land should remain so (or be so restored)"). StuRat (talk) 12:22, 22 March 2010 (UTC)[reply]
Validity of claims is apparently overlooked by the uninformed majority. DRosenbach (Talk | Contribs) 17:15, 22 March 2010 (UTC)[reply]
I don't think it's our place to discuss that here, since that would lead to a debate. The nature of the claims is a factual matter, but their validity is a matter of opinion. StuRat (talk) 03:47, 23 March 2010 (UTC)[reply]
Well, first of all, no Palestinian group claims the Golan Heights as Palestinian. The Golan Heights belong to Syria. Secondly, I think we must conclude that the Israeli-Palestinian conflict is not merely a conflict between Israelis and Palestinians, but involves a lot of other, international, interests. The conflict could be settled quite rapidly if the U.S. stopped propping up the Israeli war machine and if neighbouring Arab regimes would end their hypocritical attitude of denouncing Israel in rhetoric whilst allowing oil exports to Israel in practice. --Soman (talk) 14:28, 21 March 2010 (UTC)[reply]
It could also be settled if certain countries stopped openly vowing to destroy Israel and stopped conducting suicide bombings and the like. ←Baseball Bugs What's up, Doc? carrots14:52, 21 March 2010 (UTC)[reply]
@Baseball Bugs; no, on the contrary. The position that the conflict could only be ended by first enabling a sense of 'security' amongst the Israeli polity is the same as wishing perpetual conflict. The sense of security is elusive, and can never be obtained beforehand. Trust is something that is built in the process. There are numerous examples (for example, almost all of Europe) where previous mortal enemies are now happy neighbours. It is achieved through normalization of relations. In this case the ball is in Israel's court. They can withdraw from the occupied territories, and thus change the dynamics of the conflict. Such a move would enable a peaceful solution to the conflict, and in such a context whatever rhetoric might come out of Tehran would be just as irrelevant as Gaddafi's statements on dissolving Switzerland (see the other thread above). --Soman (talk) 15:26, 21 March 2010 (UTC)[reply]
No, the real problem is that each side says the other started it and won't back down from their respective positions. In fact, Israel and Egypt settled their differences, and Israel pulled out of the Sinai because they no longer regarded Egypt as a threat. The "he started it" mentality is what fuels this situation, and what has always fueled it - and until that mentality changes, the fighting will continue. The bottom line of what you're saying is, "We'll stop the suicide bombings if you'll surrender." Would you trust the word of someone who said that to you? I certainly wouldn't. ←Baseball Bugs What's up, Doc? carrots15:33, 21 March 2010 (UTC)[reply]
No, that is not what I'm saying. Even if there were a peaceful settlement of the conflict, there would still be spoilers on both sides. But gradually such elements could be marginalized. The notion that people turn into suicide bombers for sheer fun (or by some abstract philosophical reasoning) lacks material basis. Once Palestinians are able to live normal lives as human beings (without sieges, humiliation, blockades, curfews, check-points, arbitrary arrests, etc.), suicide bombings will be a historical phenomenon. We have to recognize that the Israeli-Palestinian conflict is an asymmetrical conflict. Israel has the possibility to withdraw and end the occupation; the Palestinian leadership lacks such options. The Israeli side is highly institutionalized; on the Palestinian side there are various different armed actors. If one group ends rocket launches from Gaza, another resumes it. Only when there is a viable Palestinian state will there be a Palestinian side that can be a partner for mutual security. In short, peace and justice must preclude security. (edit conflict, responding later to query below) --Soman (talk) 16:20, 21 March 2010 (UTC)[reply]
"...peace and justice must preclude security" ? I think you mean "prelude". To "preclude" means to "prevent". StuRat (talk) 04:00, 23 March 2010 (UTC) [reply]
Yes of course, English is not my first language and such mishaps are quite frequent for me. --Soman (talk) 14:02, 23 March 2010 (UTC)(Freudian slip?)[reply]
No problem, I just wanted to make sure you weren't misunderstood. StuRat (talk) 17:17, 23 March 2010 (UTC)[reply]
@Soman - why would Israel withdraw from land that it captured while defending itself from belligerents? The other countries attacked Israel during the War of Israeli Independence, and when Israel defended itself and did such a good job that it captured land in the process, those countries said "just kidding -- let me have the land back now." DRosenbach (Talk | Contribs) 16:07, 21 March 2010 (UTC)[reply]
Peace? Unless peace is attributed some sort of value in itself (and given prominence over arguments of national pride, etc.), any sort of solution of the conflict gets quite remote. From that line of reasoning (i.e. never give up land that was conquered in war), I would say that you are not particularly interested in a settlement of the conflict. Moreover, if you see the 1948 capture of Palestine as a war of defense, then you will have serious difficulties understanding the opposite side of the conflict. So, the question perhaps is to ask whether the Israeli majority at this point actually wants a peaceful and just 2-state solution. After all 90%+ vote for pro-war parties. In such a scenario, there are two ways of viewing the issue: 1) either we sit down and wait for the Israeli public to achieve a moral awakening and to turn around and see the benefits of ending the inhuman occupation of the Palestinian territories, or 2) we organize pressure on all fronts (political, economic, cultural, etc.), calling for boycotts, divestments and sanctions, on the grounds that Israel cannot be a full member of the international community as long as occupation persists. In the case of bringing down Apartheid in South Africa, option 2) worked. --Soman (talk) 16:36, 21 March 2010 (UTC)[reply]
As I said, each side justifies its behavior by saying it's the other side's fault. Until that mentality changes, nothing will change. ←Baseball Bugs What's up, Doc? carrots16:53, 21 March 2010 (UTC)[reply]
The validity of your elaboration rests on the assumption that the Israelis were at fault in the War of Independence. Perhaps there's no precedent (and that may very well be the issue) but Israel captured the territories in a war in which they were attacked. So land captured by the Israelis is not, for instance, similar to land captured by Germany in its conquest to a priori overtake other nations. And there was no capture of Palestine -- Palestine has never existed. The land belonged to the Ottomans and then the British, and then it was given over partly for the future State of Israel (which was declared and formed) and partly to Jordan. DRosenbach (Talk | Contribs) 18:05, 21 March 2010 (UTC)[reply]
Well, the vast majority of the world's disputes, from international hostility to that argument with your neighbour about the size of his tree, could be sorted if you just sat them down, strapped some logic to them, and asked "But what's the point, really?" Vimescarrot (talk) 15:12, 21 March 2010 (UTC)[reply]
Ask yourself how the ongoing Israel situation has helped solidify and strengthen the resolve of all three major groups involved in the struggle, namely Christianity, Islam and Judaism. The answer to that question is also the answer to the question, "What's the point?" ←Baseball Bugs What's up, Doc? carrots15:19, 21 March 2010 (UTC)[reply]

DRosenbach's first answer above is unfortunately mainly incorrect. When it comes to the formal, legal, settings, there is complete agreement, including Israel, that the West Bank, Gaza, East Jerusalem and the Golan Heights were militarily occupied by Israel in 1967. Israel would like to annex them, but has never squarely come out and formally annexed any of them, although one can make arguments concerning Jerusalem and the Golan Heights. The official representatives of the Palestinians for decades have recognized pre-1967 Israel and supported a two state solution with a state of Palestine based on the West Bank and Gaza with a capital in Jerusalem, with the aim of a future unification of Palestine and Israel through peaceful means. John Z (talk) 11:28, 22 March 2010 (UTC)[reply]

But the Palestinians were offered the Gaza Strip and the West Bank and rejected this offer! And one can perhaps wave off the lack of formality employed by Israel as a display of political submission to a greater power (USA?) -- with Israel relying on de facto rule for the past 43 years. Even if you reject such a proposal (which I admit is merely a possible explanation of the reality and not an attempt at political wiggling, if you will), Israel (as you have said) has considered unified Jerusalem as its capital for "the past 42 years". How then can Palestinian settlers (because that's what they are, in essence) demand "restoration" of "sovereign" territory that never existed in the first place?
And on your second point, that "the official representatives of the Palestinians" did or did not do something -- has there ever really been an official representation? Even when Yasser Arafat was involved, there would be terrorist attacks, and Arafat would bemoan his lack of control over the terrorist entities that perpetrated them -- but a lack of control would suggest that he was not the official representative after all (See September 1993). DRosenbach (Talk | Contribs) 12:01, 22 March 2010 (UTC)[reply]
Obviously it should be given back to the Canaanites, from whom it was originally taken. Adam Bishop (talk) 06:24, 23 March 2010 (UTC)[reply]

Kabul Times

Is there any way to find an archive of the English-language newspaper, the Kabul Times, later called the Kabul New Times, from the 1970s and 80s? —Preceding unsigned comment added by 143.229.178.175 (talk) 14:50, 21 March 2010 (UTC)[reply]

WorldCat indicates that there are microfilm copies in various university libraries. That's probably the best option out there. --Mr.98 (talk) 17:55, 21 March 2010 (UTC)[reply]

fishing sector of Canada 2

what were the main issues and controversies of the fishing industry of Canada? —Preceding unsigned comment added by 76.64.52.175 (talk) 16:01, 21 March 2010 (UTC)[reply]

Some links were provided for you three days ago. If you have more specific questions that the links did not answer, please be specific. Comet Tuttle (talk) 16:08, 21 March 2010 (UTC)[reply]

I meant to ask whether there were any controversies being faced in the fishing industry of Canada. —Preceding unsigned comment added by 76.64.54.19 (talk) 18:52, 22 March 2010 (UTC)[reply]

If you google canada fishermen dispute there are many links. The second one is about flounder. Comet Tuttle (talk) 00:11, 23 March 2010 (UTC)[reply]

Uganda-Tanzania War press conference

Hello. Apparently, during the Uganda–Tanzania War, the president of Uganda, Idi Amin (who was a former world champion), challenged the president of Tanzania (Julius Nyerere) to a boxing match, in order to settle the dispute. I believe he did so in a press conference, according to here. That page quotes this bit: "I am keeping fit so that I can challenge President Nyerere in the boxing ring and fight it out there, rather than having the soldiers lose their lives on the field of battle."

I've seen this reference in a few other places, too. I'd like to see the transcript of the entire press conference, or the entire address. I can't seem to find archives of major Ugandan newspapers at the time, and it's missing from the NY Times archive. Any help would be appreciated. Thank you. Llamabr (talk) 21:13, 21 March 2010 (UTC)[reply]

Apparently the quote was from Bob Astles in a phone call to reporters.[29].—eric 21:37, 21 March 2010 (UTC)[reply]
Thank you for the link to the Ledger article. I hadn't seen it. The way I read it, however, is that the author of the article received it as a quote during a telephone call. I still believe that it was announced at a press conference, which makes me think that there's a transcript, if not an audio or a video version of it somewhere. Llamabr (talk) 21:49, 21 March 2010 (UTC)[reply]
more[30].—eric 21:50, 21 March 2010 (UTC)[reply]
Hi. Yes, I read that the same way. In fact, it's the same quote, which makes me think that this author got it from the same wire service as the previous one. In this one, he goes on to say that he'd do it with one hand tied behind his back. That makes me think there's a full transcript somewhere. thanks again, Llamabr (talk) 22:07, 21 March 2010 (UTC)[reply]
Well, I was able to find a bit more, but I think it was a prepared statement and not from any public remarks. The blog you linked was probably mistaken in saying it was a press conference—as well as mixing up the chronology some. Amin announced the annexation of Kagera 1 November and offered to negotiate a peace at the same time. Nyerere rejected negotiations at a rally 2 November, calling Amin a "barbarian" and comparing him to a snake. The phone interview with Astles came on the 3rd and was a response to Nyerere's comments.
Astles (phoning from Kampala and speaking to reporters in Nairobi) stated that Amin had spent the 2nd "in the battle zone" and had flown back to the State House at Entebbe that morning where he began "basketball exercises" and dictated the following statement:

I am keeping fit so that I can challenge President Nyerere in the boxing ring, and we fight it out there rather than soldiers lose their lives on the field of battle. There is a saying in Africa that when elephants fight it is the grass that suffers.

That might be the whole of the prepared statement, with the rest (about Ali etc.) being Astles recounting their earlier conversation.—eric 06:53, 22 March 2010 (UTC)[reply]
As a minor correction, Amin was never a 'world champion' boxer. He was a champion of Uganda for a time (as his article mentions), and I believe had won British Army boxing titles, but (despite the British Army serving around the world) these would only have been open to British Army personnel and cannot be considered 'world championships' in the usual sense. 87.81.230.195 (talk) 22:48, 21 March 2010 (UTC)[reply]

Why was France so weak militarily during WW2?

Never really understood this. Were they also weak during WW1? I'm assuming it might be because they lost the Napoleonic Wars, but that was a long time ago relatively speaking. ScienceApe (talk) 21:58, 21 March 2010 (UTC)[reply]

In 1939, the French had almost a million men under arms, with 5 million trained reservists. However, the strategic thinking was still First World War, with massive defenses all along the border. In the end the Germans invaded through an unexpected route, captured Paris within a couple of months, and simply left the bulk of the French forces in their dust. FiggyBee (talk) 22:26, 21 March 2010 (UTC)[reply]
Our article Military history of France is a featured article. France has a long history of having one of the very best militaries in Europe. Specific to WW2, you want Battle of France and the enormous article Military history of France during World War II. Comet Tuttle (talk) 23:26, 21 March 2010 (UTC)[reply]
The Oxford Companion to World War II says that the French army in 1939 was "regarded by many as the best in the world, heavily armed, well equipped and led by highly acclaimed veterans of 1918... it was the German superiority in the operational deployment and use of tanks and planes that made these comparisons in retrospect look irrelevant."--Pondle (talk) 23:33, 21 March 2010 (UTC)[reply]
If one does a comparison between German and French military strength in 1939, do remember that France in 1939 also had a vast colonial empire that stretched across every continent on earth. Large parts of the French military were stationed outside of metropolitan France, not at its land border with Germany. --Soman (talk) 01:53, 22 March 2010 (UTC)[reply]
Vast, but mostly rather sandy. Moonraker2 (talk) 03:27, 22 March 2010 (UTC)[reply]
The main problem with the French Army in 1940 was one of poor morale and poor leadership. Gamelin's HQ didn't use radio or telephones for fear of espionage and was famously described as a "submarine without a periscope". There was no planning for a German breakthrough and stunned inaction when it happened. The British commander "Tiny" Ironside picked up his French counterpart Billotte by the lapels in a vain effort to get something done. Alan Brooke was astonished that Gamelin's successor Weygand was more concerned about his career than the defeat of France. The Allies had a clear numerical and material superiority (on the ground anyway), but lacked the co-ordination, aggression and initiative that the Germans showed from top to bottom. Alansplodge (talk) 13:35, 22 March 2010 (UTC)[reply]
  • It's not unreasonable to say that France wasn't weak - it just wasn't conceptually prepared to fight a modern mechanised war. The French army, to all intents and purposes, fought one battle and lost it. This would not have been a knock-out blow in 1914 or 1870, but the unexpected speed and mobility of the German army meant that the French army was unable to recover from its defeat in that battle; the country collapsed within a month, and the army effectively ceased to exist. Had it had a chance to recover - had it stalled the German advance as in 1914 and begun another "long war" - we would not remember it as "weak", just as having been caught off balance in May 1940. Shimgray | talk | 19:54, 22 March 2010 (UTC)[reply]

News orgs using "Mr. Obama"

I thought the US president was always referred to as "President <lastname>"; I thought it was improper to the point of rudeness to refer to him as "Mr. <lastname>", especially during his presidency. In fact, I thought it continued to be the correct way to address him even after he left office.

I've noticed more and more news agencies referring to our current president as "Mr. Obama"; often they use "President Obama" early in the story, and then use "Mr." (often multiple times) later on.

Have they done this with previous presidents? In particular, did they often refer to the previous president as "Mr. Bush"? I can't remember ever seeing it, but I tended to pay less attention to stories about him.

Ralphcook (talk) 23:07, 21 March 2010 (UTC)[reply]

Certain news outlets, including the New York Times, have a specific policy of using Mr. (or Ms., etc.) and the surname. They did follow this with Bush and with other leaders such as Tony Blair. I don't believe it's "generally" considered rude to use a president's surname + Mr.--达伟 (talk) 23:11, 21 March 2010 (UTC)[reply]
(ec's) This is a matter of house style. On the front page of today's New York Times, we have Obama referred to first as "President Obama" and later as "Mr. Obama" in each article concerning him, as well as "Speaker Nancy Pelosi" first and "Ms. Pelosi" thereafter. More casual media might drop the titles. (The NYT always uses titles, except on the sports pages. Rarely will one see Willie Nelson referred to as "Mr. Nelson" elsewhere.) PhGustaf (talk) 23:27, 21 March 2010 (UTC)[reply]
If you were speaking with him face-to-face, and you didn't have any prior relationship to justify any other familiarity, it would be expected of you to address him as "Mr. President", but that doesn't apply to reporting about him. —Akrabbimtalk 23:18, 21 March 2010 (UTC)[reply]
The protocol for conversations with the Queen of England is to address her on first meeting as "Your Majesty" and as "Ma'am" thereafter. Applying that in this case would yield "Mr. President" and "Sir". The more formal title at goodbye time would seem appropriate. If I recall, that's how they did it on West Wing. PhGustaf (talk) 01:19, 22 March 2010 (UTC)[reply]
Actually, I think it's from West Wing that I got my sensitivity to it; the only time I remember President Bartlet being referred to as "Mister", it was meant as an insult, spoken by a (retiring) Supreme Court Justice, and he was sort-of-gently corrected to "Dr. Bartlet." —Preceding unsigned comment added by Ralphcook (talkcontribs) 02:29, 22 March 2010 (UTC)[reply]
(ec) The first time I can recall the American press using "Mr." in lieu of "President" was with Richard Nixon, and I wouldn't be surprised if it predates that. And I also recall some commentators raising the same question as the OP, that it implied a lack of respect for the office. The OP is also correct that "Mr. President" continues to be used once a President has left office, as with other high officials such as "Senator". ←Baseball Bugs What's up, Doc? carrots23:21, 21 March 2010 (UTC)[reply]
The BBC has done this for a very long time. Searching their "on this day" section (which reproduces verbatim an historical report, juxtaposed with a later "in context" reflection) refers to Nixon as Mr here and here, Soviet Premier Leonid Brezhnev as Mr Brezhnev here, and FDR as Mr here. They also refer to German Chancellor Angela Merkel as Mrs here (not as Chancellor Merkel or Dr. Merkel) and no-one ever seems to call Gordon Brown "Dr. Brown". Conversely, the American habit of addressing British Prime Ministers as "Mr Prime Minister" or "Prime Minister Blair" is both grating and wrong. -- Finlay McWalterTalk 23:56, 21 March 2010 (UTC)[reply]
No one would call Brown "Dr" because he sensibly doesn't use it himself. In the UK it is generally looked on as a bit pompous for non-medical doctors to use their title unless they have an educational or clerical career. Moonraker2 (talk) 03:18, 22 March 2010 (UTC)[reply]
Which is weird because the vast majority of medical "doctors" do not have doctorates of anything, whereas the non-medical doctors do. If anyone is more entitled to use the title, it's the ones who have actually done the hard yards and actually are doctors. In medical parlance, it means a registered medical practitioner regardless of their actual educational attainments. It's analogous to architect, dentist, lawyer etc - yet nobody ever refers to "Lawyer Smith" or "Architect Jones" or "Dentist Brzezinski", do they? But "Doctor Hoffmann", ah, that's different. -- Jack of Oz ... speak! ... 07:27, 22 March 2010 (UTC)[reply]
Although, in the US lawyers do use a postnominal "Esq.". (In Britain, "Esq." or "Esquire" after a name is equivalent to "Mr." or "Mister" before the name, albeit a little old-fashioned.) --Tango (talk) 11:17, 22 March 2010 (UTC)[reply]
That really surprises me, Tango. I'd have thought "Esq" was as completely unknown in modern-day USA as it is in modern-day Australia. It appears only in reference books and old stories here, never in practice. -- Jack of Oz ... speak! ... 20:31, 22 March 2010 (UTC)[reply]
Spammers frequently use 'Barrister' as a title in their begging letters. --ColinFine (talk) 08:48, 23 March 2010 (UTC)[reply]
"Mr. Hoover" was widely used (31,000 Google News archive hits) to refer to FDR's predecessor. "Mr. Coolidge" was widely used for his predecessor. Etc. I did not find it used for George Washington at Google News archive, but perhaps the papers from 1789-1797 are not adequately represented there. Edison (talk) 03:29, 22 March 2010 (UTC)[reply]
Washington, when not addressed as "President Washington", was addressed as "General Washington". His immediate successors were addressed as "Mr. Adams" and "Mr. Jefferson". —Kevin Myers 03:51, 22 March 2010 (UTC).[reply]
Just an interesting side note on that: In July 1776, just as the Continental Congress was declaring independence for the US, the main part of the revolutionary army was in New York City, and the British began arriving with a huge fleet to dislodge them (which they did: see New York and New Jersey campaign). Hoping to negotiate an immediate surrender, the British commander, Lord Howe, began by sending a letter to George Washington -- who refused the delivery because it was addressed to "Mr. Washington" and not "General Washington". In other words, "If you won't even recognize that I'm not part of your empire any more, we have nothing to talk about." (Source: 1776 by David McCullough.) --Anonymous, 19:25 UTC, March 22, 2010.
Washington's efforts in NYC didn't go all that well. A book on the history of the Empire State Building and its location has a section titled, "George Washington shlepped here". Regardless, wasn't it Washington who came up with the subdued title of "Mr. President"? ←Baseball Bugs What's up, Doc? carrots20:59, 22 March 2010 (UTC)[reply]
(Some OR here): As a journalist, I have had to deal with people commenting to me about this very issue for years. Strictly speaking, according to the AP Stylebook, one uses a title and full name on first reference, and surname only [my emphasis] on subsequent reference. That is the listed journalistic style for all names; there is no exception for the president or any other public official. The use of "Mr." is actually an attempt to be more courteous than calling him simply "Obama." Frequently during the George W. Bush administration, we were accused of being Democratic sympathizers because we said "Mr. Bush" instead of "President Bush," even though proper style would have been just "Bush." Now the same journalists are tabbed as pro-Republican when they equivalently say "Mr. Obama" instead of "President Obama." ... At least, that's what many reporters do. As for me, I go strictly by-the-book and write just "Obama" and "Bush." — Michael J 23:23, 22 March 2010 (UTC)[reply]

Death penalty law wording

In the states that have it, how do they make an exception for the prison workers who throw the switch, so they won't be charged with murder? Or, where there are multiple switches and nobody knows whose was the one that did it, with attempted murder, since they knew theirs could be the live switch and threw it anyway? Do they simply not define the act of throwing the switch as murder, even though it obviously results in the person not living any more as a direct result of their action? 71.161.59.39 (talk) 23:26, 21 March 2010 (UTC)[reply]

Throwing the switch or otherwise participating in a legal execution is not murder. ←Baseball Bugs What's up, Doc? carrots23:32, 21 March 2010 (UTC)[reply]
What if I, at home, hooked someone up to the same exact system they have and threw the switch? I'm interested in how they word their making it legal for them to do it. 71.161.59.39 (talk) 23:43, 21 March 2010 (UTC)[reply]
I don't follow. Legal executions are legal. They are not murder. ←Baseball Bugs What's up, Doc? carrots23:47, 21 March 2010 (UTC)[reply]
Murder is the unlawful taking of life. Executions are the end product of a legal process. Hence, executions are not murder. ←Baseball Bugs What's up, Doc? carrots23:50, 21 March 2010 (UTC)[reply]
That's the pith. Is there a written definition of lawful execution on the books of the states that have it? The part on the books that differentiates a state where the death penalty is allowed from a state in which it isn't -- how do those states come out and say "we can legally perform executions" on the page of the law books? 71.161.59.39 (talk) 23:58, 21 March 2010 (UTC)[reply]
You may want to familiarise yourself with the idea of common law before asking where in the 'law books' something is. Not every legal provision is necessarily codified. AlexTiefling (talk) 00:02, 22 March 2010 (UTC)[reply]
Surely the ending of a human life at the state's hands is statutory. That's what I'm looking for and would like to read. 71.161.59.39 (talk) 00:34, 22 March 2010 (UTC)[reply]
Capital punishment has existed at common law for a long, long time. It's the abolition of capital punishment that requires a statute (or at least an amendment). --PalaceGuard008 (Talk) 02:50, 22 March 2010 (UTC)[reply]
In the US, executioners obtain an execution warrant. Our article says: "This protects the executioner from being charged with murder." Staecker (talk) 00:48, 22 March 2010 (UTC)[reply]
Side discussion
Does this question sound vaguely familiar? ←Baseball Bugs What's up, Doc? carrots03:05, 22 March 2010 (UTC)[reply]
No. Staecker was answering a question, not asking one. And no offense, but he gave a better answer than you did. ScienceApe (talk) 06:48, 22 March 2010 (UTC)[reply]
No offense, but you're missing the point. A few weeks ago also, an IP was raising questions about capital punishment that presupposed certain things that weren't true. As long as it doesn't devolve (again) into a moral discussion about executions, things will be peachy. ←Baseball Bugs What's up, Doc? carrots06:58, 22 March 2010 (UTC)[reply]
I understand. Well, whoever it was a few weeks ago wasn't me. My core interest was in how the switch guy is covered, which Staecker answered. 71.xxx and 20.xxx is me. The only reason I haven't joined yet is I don't know what to make my name. 20.137.18.50 (talk) 14:47, 22 March 2010 (UTC)[reply]
If that was your point, you should have replied to the original post, and not Staecker's comment. Didn't sound like he was trying to get into a moral debate. Sounded like you were jumping to conclusions. ScienceApe (talk) 16:32, 22 March 2010 (UTC)[reply]
I was asking Staecker if it sounded familiar. You jumped to a conclusion. ←Baseball Bugs What's up, Doc? carrots21:39, 22 March 2010 (UTC)[reply]
No, because I didn't draw a conclusion, I made a suggestion. ScienceApe (talk) 08:34, 23 March 2010 (UTC)[reply]

The first thing to note is that there's nothing special about capital punishment in this respect. When a suspect is arrested, when a criminal is punished by imprisonment or by a fine, the police officer or criminal officer or court officer carrying out such duties is still doing something that would be illegal if not for the proven or suspected crime that justifies them. Since we were talking about executions, I looked at the law in Texas and found Title 2, Chapter 9, of their Penal Code, in which section 9.21 reads:

PUBLIC DUTY. (a) Except as qualified by Subsections (b) and (c), conduct is justified if the actor reasonably believes the conduct is required or authorized by law, by the judgment or order of a competent court or other governmental tribunal, or in the execution of legal process.
(b) The other sections of this chapter control when force is used against a person to protect persons (Subchapter C), to protect property (Subchapter D), for law enforcement (Subchapter E), or by virtue of a special relationship (Subchapter F).
(c) The use of deadly force is not justified under this section unless the actor reasonably believes the deadly force is specifically required by statute or unless it occurs in the lawful conduct of war. If deadly force is so justified, there is no duty to retreat before using it.
(d) The justification afforded by this section is available if the actor reasonably believes:
(1) the court or governmental tribunal has jurisdiction or the process is lawful, even though the court or governmental tribunal lacks jurisdiction or the process is unlawful; or
(2) his conduct is required or authorized to assist a public servant in the performance of his official duty, even though the servant exceeds his lawful authority.

So this part of the law is a general rule that takes precedence over the section that says if you kill someone it's murder, and the section that says if you imprison someone it's kidnapping, and so on. (Other sections of Chapter 9 cover things like self-defense.) Other jurisdictions can be expected to have similar laws. --Anonymous, 19:52 UTC, March 22, 2010.

The general idea here is Hobbesian - i.e., that when individuals prove themselves incapable of proper moral restraint, the state (in the collective defense of its people) is entitled/obligated to exercise moral constraints over them, up to and including execution. An executioner is not acting as an individual any more than a soldier is: he is acting as an extension of the state. In fact, the use of multiple switches, executioner's hoods, guns with blanks in firing squads, etc., is specifically designed so that each participant can believe that it was not his individual act which carried out the state's will. Whether the state has a right to execute its citizens at need is a difficult question, on which you will find reams of philosophical discussion, but currently it is a well-established practice in many nations. --Ludwigs2 20:56, 22 March 2010 (UTC)[reply]
The Fifth Amendment to the United States Constitution says you cannot be deprived of life, liberty or property without due process of law. Implicit in that statement is that executions, incarcerations and confiscation can all be acceptable under the law. ←Baseball Bugs What's up, Doc? carrots21:55, 22 March 2010 (UTC)[reply]
Aha, when in doubt, go right to the source and ask the horse. The statement in execution warrant which reads, "This protects the executioner from being charged with murder", has been in that article, uncited and unchallenged, from the second edit that occurred, in November of 2002. While the warrant authorizes the execution, that's just part of a legal process. To say that it "protects the executioner" is original synthesis, as the executioner is not going to act until he has that warrant authorizing him to do it. ←Baseball Bugs What's up, Doc? carrots22:24, 22 March 2010 (UTC)[reply]

Is this a real photo of Hitler?

[31] Looks like a painting, maybe, I'm not sure. It certainly doesn't look like him and his posture is weird. A Quest For Knowledge (talk) 23:52, 21 March 2010 (UTC)[reply]

If it's a painting, it might be based on this[32] which looks like a legit photo. A Quest For Knowledge (talk) 23:55, 21 March 2010 (UTC)[reply]
This site says it's "Hitler with the children of Nazi dignitaries on his birthday"[33] A Quest For Knowledge (talk) 00:04, 22 March 2010 (UTC)[reply]
Well, I'll be darned. It appears to be a legit photo.[34]. A Quest For Knowledge (talk) 00:11, 22 March 2010 (UTC)[reply]
Neither of Quest's links works for me, but the original post looks like a badly colorized photo to me, by the way it includes every pastel known to man. StuRat (talk) 00:11, 22 March 2010 (UTC)[reply]
Looks like a still from a B-movie about an evil babysitter. ←Baseball Bugs What's up, Doc? carrots01:07, 22 March 2010 (UTC)[reply]
In the article Goebbels children, it says "Hitler was very fond of the children, and even in the last week of his life still took great pleasure in sharing chocolate with them". I recall other sources saying that he would get on the floor to play with visiting kids. Alansplodge (talk) 15:57, 22 March 2010 (UTC)[reply]
looking at the second image (which is obviously a photo - you have to refresh the page to get it to load), I think the first image is a painting based on the photo that tries to correct the fact that Hitler's face is hidden in the photo. the painter tried to rotate Hitler's upper body backwards and added a full-frontal face, but the head looks jutted out because the neck in the photo is bent downward. or it could be a modern colorized cut-and-paste job, I suppose, but notice the differences in hitler's hand positions (which would be difficult to achieve), and look at the face of the girl in the blue bonnet, which has the classic 'Norman Rockwell' too-smooth, rosy-cheeked look. lousy painter, though - why would anyone who wanted to paint hitler in a positive light like this give him an expression like he's suffering from a horrid attack of gas? can anyone find a picture that the painter might have used to copy hitler's facial features? maybe something with him saluting troops or giving a public speech - the artist captured that 'thousand-mile stare' that public speakers sometimes get. --Ludwigs2 21:12, 22 March 2010 (UTC)[reply]
I suspect they are both photos of the same event, almost certainly the same photographer. One is either colorized or was created with very poor quality color photography (something that would be fairly primitive even for WWII). I don't see any reason to think the first is a painting other than the colors being off (which is more an argument for it being colorized than painted). --Mr.98 (talk) 16:29, 23 March 2010 (UTC)[reply]


There is almost no one in the acting or "TV" or movie business that looks real on the set. But to get that doll-like look it takes about three inches of makeup, fake tans, and clothes their studio picks out for them to wear. —Preceding unsigned comment added by Iluvgofishband (talkcontribs) 23:13, 23 March 2010 (UTC)[reply]

March 22

Height and weight of Hitler

The above thread about the colorized picture of Hitler has me wondering, how big was he? The picture makes him look rather frail and skinny. He also doesn't look that tall. The link says that the photo was taken during his 50th birthday celebration. Though, judging by his posture, I'd put him probably 20 years past that. Dismas|(talk) 00:32, 22 March 2010 (UTC)[reply]

Adolf Hitler's health doesn't give either figure (bar saying he gained weight as he aged); any number of unreliable sources Google finds put his height at 5'8" or 5'9", but I can't find anything reliable. Given all the things allegedly wrong with him (in Adolf Hitler's health) it'd be no wonder he'd look old. -- Finlay McWalterTalk 00:45, 22 March 2010 (UTC)[reply]
According to this[35], Adolf Hitler weighed 175 lbs and stood at 5'9". But yes, he does look frail in that photo. And the weird part is that this is from April 1939, before Parkinson's and the weight of the war took their toll on him. A Quest For Knowledge (talk) 00:51, 22 March 2010 (UTC)[reply]
That would make him just about my size. Interesting... Dismas|(talk) 01:57, 22 March 2010 (UTC)[reply]
You don't have any desire for world domination, do you? A Quest For Knowledge (talk) 02:09, 22 March 2010 (UTC)[reply]
Desire? Yeah. Motivation? No. Dismas|(talk) 02:49, 22 March 2010 (UTC)[reply]
Being a dictator is a stressful job. I wonder if he liked that hat because it made him look taller. ←Baseball Bugs What's up, Doc? carrots01:11, 22 March 2010 (UTC)[reply]
They saved Hitler's height, see Heightism#In politics. meltBanana 03:47, 22 March 2010 (UTC)[reply]

I heard Lil Rob died. Is it true? They said he died from lung cancer. Does anybody know if it's true? —Preceding unsigned comment added by 76.168.105.49 (talk) 08:20, 22 March 2010 (UTC)[reply]

The article indicates that some users are trying to post a death date, but it's getting reverted due to unreliable sources. Probably the best bet would be to google ["Lil Rob"]. If he actually has died, it would likely pop up early in the list. And if so, see if any major news sources have it, or only blogs and other fly-by-night stuff. ←Baseball Bugs What's up, Doc? carrots09:29, 22 March 2010 (UTC)[reply]
The first story that comes up is this one[36] which addresses this internet rumor and denies it. ←Baseball Bugs What's up, Doc? carrots09:39, 22 March 2010 (UTC)[reply]
FYI, the page is now semi'd in order to keep most of the yokels away from it until this false rumor dies out. ←Baseball Bugs What's up, Doc? carrots08:44, 23 March 2010 (UTC)[reply]

US health care reform - pre-existing conditions

I understand that the health care reform bill that just passed in the US includes a provision to prevent insurance companies withholding cover from people with pre-existing conditions, but I can't find any explanation of precisely what that means. One news correspondent on TV said it's to do with being cut off after you get ill (does that really happen?), others seem to think its to do with applying for new coverage. Which is it? Also, are the insurance companies required to cover treatment for that pre-existing condition or just for other conditions that arise after the coverage is taken out? --Tango (talk) 12:35, 22 March 2010 (UTC)[reply]

Basically, when you applied for health insurance in any state that allowed pre-existing condition exclusions, the application asked you to list all of the health conditions you were coming in with (diabetes, cancer, whatever), and these would be used to assess your risk, whether they'd let you into their plan, and your premium if so. Now the trick here was that if they let you in and you were then diagnosed with cancer, and they could establish that you had the cancer before you applied (whether you knew it or not), they could drop you from the plan, saying the condition was pre-existing and undeclared in your initial application (and they had lawyerly ways of insinuating that you should have known about it even if you didn't) and thus violated the agreement you made with the insurance company, and thus they could terminate the agreement. Yes, this happened, and not infrequently. Additionally, if you did actually declare your cancer in your application, it would either be rejected or result in impossible premiums. So it's at the nexus of the two things you describe: it's about the application process, but it's also about how the application process gets invoked in disputing later conditions.
Even supporters of the health care status quo generally thought this was a particularly dastardly business practice (and counter to the very idea of insurance), and in many states it had already been outlawed, but it was not uniformly so. See Pre-existing condition for more details, statistics on public opinion, various state laws, etc. It was one of the more egregious examples of something that makes good, capitalist business sense (drop the expensive ones if possible), but is considered by most to be at worst morally repugnant, at best counter to the entire purpose of health insurance. --Mr.98 (talk) 13:50, 22 March 2010 (UTC)[reply]
They would also deny coverage to cancer patients who developed cancer well after they were insured, on the basis of them having "lied on their application, making the contract void", if they didn't report some minor thing, like teenage acne years ago. And I agree that unrestricted capitalism leads to some horrid practices in health insurance. StuRat (talk) 14:52, 22 March 2010 (UTC)[reply]
  • Incidentally, this exact trick (although not for cancer coverage) was dramatized in the most recent episode of the TV series The Good Wife. --Anon, 19:57 UTC, March 22, 2010.
Right, I had forgotten that part. Yes, any "undisclosed pre-existing condition" could be the basis for denying later coverage, even if it was not the real reason they were denying the coverage. There are also some really ugly definitions of "pre-existing condition," famously treating domestic violence as a pre-existing condition. --Mr.98 (talk) 15:46, 22 March 2010 (UTC)[reply]
Thank you - that is a very useful description of the status quo. The article you linked to is rather unclear, though - it uses terminology without defining it (it is probably commonly used in the US, but I'm not familiar with it). For example, what do phrases like "Maximum pre-existing condition exclusion period" mean? Likewise, the rest of the headings in that section. Do you know what the recent bill will actually change? --Tango (talk) 15:00, 22 March 2010 (UTC)[reply]
The exclusion period is how long, after enrolling, you have to wait before a pre-existing condition is covered. I think it's meant so that if you are dying of cancer, they can wait to see if it really crops up. The idea from the business standpoint is to limit people who buy health insurance only when they are sick. (Which makes sense in an ideal world—you don't want healthy people not paying into insurance rolls, only to sign up when they get sick. In a less ideal world—one in which health insurance is tied primarily to employment, and out-of-pocket costs for insurance are extraordinarily prohibitive—it is problematic.) "Look back" is how much time prior to your plan starting you have to declare/they can look for pre-existing conditions (so if it is unlimited, they can go back forever in your life; if it is six months, they have to find things prior to that). Permanent exclusion is about whether the health care plan can ban covering certain pre-existing conditions entirely—just because you have a pre-existing condition does not mean that you can't get health care, obviously—otherwise nobody would ever be able to change insurance carriers. This is I think primarily in reference to pre-existing conditions you have declared, which don't prohibit you from joining the plan. (Just because you have health issues doesn't mean you are automatically rejected.)
The bottom of the page discusses the recent bill's reforms (some of which existed in individual states already—Massachusetts, for example, had already banned most of this pre-existing condition stuff). I'll admit that some of this is insurance-ese to me, but the gist is that pre-existing conditions can no longer be used as a basis for denying or canceling health insurance.--Mr.98 (talk) 15:46, 22 March 2010 (UTC)[reply]

Evolution and redemption

My question is about two groups of people.

Where can I find population figures regarding the intersection set, that is to say, regarding people who believe in all of those things? (I used "figures" in the plural, in order to accommodate different times and different places and different surveys.) -- Wavelength (talk) 15:55, 22 March 2010 (UTC)[reply]

Catholicism, and Charles Darwin. Comet Tuttle (talk) 17:02, 22 March 2010 (UTC)[reply]
Your first category needs some careful definition to be useful. 'Belief in evolution' can mean a lot of things. 'Does evolution ever occur' is answered yes by most people, even those who believe that the earth is only 6000 years old. On the other hand 'did all life come into existence solely through a process of natural selection' is answered no even by a lot of people who accept science's conclusions about the fossil record and have no particular religious beliefs.
Your second category is very much a statement of Christian doctrine. I think you can safely say that believers in your second category are synonymous with Christians (someone who knows more about Islamic theology than me might want to correct me there).
If, purely for simplicity, we take your first to include anyone who accepts that the earth is millions of years old and that significant new species have arisen during that time, we can make some deductions.
Creationism gives some figures for those who agree with "Human beings are descended from earlier species of animals". The US figures say that 40% agree, with 20% unsure. Christianity by country says 78% of the US are Christian. Assuming that all the creationists (and unsure) are Christians, that means 18% of the population are Christians and believe in evolution, and 20% are Christians and aren't sure. The US is unusual in how many creationists it has, so let's take the UK as another example. Following the same steps we find that nearly 50% of the population believe in both Christianity and human evolution. DJ Clayworth (talk) 17:04, 22 March 2010 (UTC)[reply]
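As a sanity check, DJ Clayworth's rough US deduction can be reproduced in a few lines. The percentages are the ones quoted above; the assumption that every creationist and every "unsure" respondent is Christian is his, and it is a strong one (as Wavelength notes below, overlaps do not in general follow from marginal percentages):

```python
# Back-of-envelope overlap estimate from the survey figures quoted above (US).
christian = 0.78               # share of the US population identifying as Christian
agree = 0.40                   # agree humans descended from earlier species
unsure = 0.20                  # unsure about human evolution
disagree = 1 - agree - unsure  # creationist share, 0.40

# Strong assumption: all creationists and all "unsure" respondents are Christian.
# Then Christians who accept human evolution = all Christians minus those groups.
christian_and_accept = christian - disagree - unsure

print(round(christian_and_accept, 2))  # 0.18, the 18% figure above
print(round(unsure, 2))                # 0.2, Christians who aren't sure
```

The same arithmetic applied to the UK figures cited gives the "nearly 50%" estimate, but only under the same all-creationists-are-Christians assumption.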
You need to be careful with statistics here. There is no necessary contradiction between 'fall-of-man/redemption' beliefs (essentially theological arguments about the nature of human consciousness) and evolutionary theory (biophysical arguments about the evolution of physical traits and characteristics). any conflict here comes from relatively small groups who insist there is a conflict, either by arguing that the theological argument about consciousness must be interpreted as a literal biophysical description, or by arguing that the theological argument is pure poppycock. Darwin himself had no problem hanging onto both thoughts simultaneously, and Gregor Mendel (purported father of modern genetics) was a priest. There are and always will be people willing to make silly arguments in order to gain sociopolitical advantages. The real core of this dispute is philosophical: Abrahamic faiths say that man was designed in the image of God; evolutionary biologists say that man evolved from monkeys; few people on either side are willing to think that God is a Monkey (or even one of the Monkees). It's a confusion that will work itself out over time, in spite of all the political gnashing-of-teeth. --Ludwigs2 20:27, 22 March 2010 (UTC)[reply]
I think something has been lost in Ludwigs2's statement above. I am neither a scientist nor a theologian, but I have some difficulty with the statement that "evolutionary biologists say that man evolved from monkeys". I understand monkeys and human are supposed to be descended from a common ancestor, and not one "evolving" from the other. (If god were a monkey, god would then have a common ancestry with humans, which may satisfy theologians, but I doubt it.) Bielle (talk) 21:41, 22 March 2010 (UTC)[reply]
The notion that "man evolved from monkeys" is a common oversimplification. Man and ape are both considered primates, so they presumably would have a common ancestor somewhere along the way. That would also have been a common ancestor of other now-extinct hominids such as the Neanderthals. ←Baseball Bugs What's up, Doc? carrots21:44, 22 March 2010 (UTC)[reply]
Strictly speaking, humans are apes, though that term itself is a bit fuzzy. Both apes and monkeys are of course primates, and share a common ancestor about 40 million years ago. Of course, things are muddied by the fact that old world monkeys and apes are actually more closely related to each other than old world monkeys are to new world monkeys. Oh, the joys of taxonomy. Of course, to a creationist, I'm just spouting evolutionist rhetoric. Buddy431 (talk) 01:13, 23 March 2010 (UTC)[reply]
No-one here has said there is a contradiction between those beliefs. The OP is asking how many people believe both - implicit in that is the assumption that it is possible to believe both. --Tango (talk) 21:52, 22 March 2010 (UTC)[reply]
It most certainly is possible to believe both, and many do, as Clayworth's computations show. Where it can get interesting is the claim by some Christian sects that it is not possible to believe in both, for the simple reason that Darwinian evolution clearly takes a very long time to occur, whereas the Bible is "literally true", so the only way evolution can occur is by the direct hand of God, which in their assumption would be instantaneous. In my church, we were taught that the Adam-and-Eve stories were "symbolically" true, rather than "literally" true. That's another way to get around the apparent contradictions. ←Baseball Bugs What's up, Doc? carrots22:12, 22 March 2010 (UTC)[reply]
Well, they're right - creationism is not compatible with evolutionary biology (it's compatible with evolution, but not with evolution as the origin of species). I guess the issue is over whether you count people that don't believe in biblical literalism as Christians. Most people do, but there are some that hold onto a stricter definition. --Tango (talk) 22:22, 22 March 2010 (UTC)[reply]
Well, my point was a statistical one (or rather a methodological one): we can't really assert claims about belief in evolution by looking at statistics on religion. the debate is being carried on by some very small minorities with big axes to grind, and each minority is going to claim a lot more representation than they actually have. most people have never made the original sin of conflating scientific and religious beliefs, and so their answers to survey questions are probably not easily interpretable. You get the same result when you do research on product price comparisons: Such research invariably shows that people do not make logical price comparisons when they shop, but most such research admits that most people aren't even trying to make logical price comparisons (preferring to use brand-name heuristics or other decision strategies), so the value of the research is limited. --Ludwigs2 22:39, 22 March 2010 (UTC)[reply]

I was hoping that someone would find one or more pages (in Wikipedia or elsewhere) which already had figures from surveys that have actually been conducted. I did not anticipate that someone would use other figures as preliminary figures from which to calculate the desired figures. The results of such calculations do not necessarily follow logically. For example, if one half of Americans are female and if one in 100 Americans is a nurse (hypothetically), it does not follow logically that one in 200 Americans is a female nurse. -- Wavelength (talk) 02:46, 23 March 2010 (UTC)[reply]
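Wavelength's nurse example can be made concrete with a toy joint table (all numbers invented for illustration): the two marginal shares alone cannot pin down the overlap unless the traits are statistically independent.

```python
# Hypothetical population of 1,000 people, matching the marginals in the
# example above: half female, one in 100 a nurse. Numbers are invented.
total = 1000
female_share = 0.5
nurse_share = 0.01

# If the two traits were statistically independent, the overlap would be:
independent_guess = female_share * nurse_share * total  # 5 female nurses

# But nothing forces independence: in this toy population 9 of the
# 10 nurses happen to be female, so the true overlap is 9, not 5.
female_nurses = 9

print(independent_guess, female_nurses)
```

This is exactly why the belief-overlap figure has to come from a survey that asked both questions of the same respondents, rather than from two separate marginal percentages.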

Fascism and obedience

A slogan of Mussolini's fascists was "Believe! Obey! Fight!", and I also recall a translated speech of Hitler's where he ranted about obedience.

Yet the idea of fascism seems to be about forcefully taking power over others. So how did fascists square this with obedience? 78.149.193.98 (talk) 21:27, 22 March 2010 (UTC)[reply]

I've linked the article, which would be a good starting point. ←Baseball Bugs What's up, Doc? carrots21:49, 22 March 2010 (UTC)[reply]
Hitler didn't rant. He is widely acknowledged as an excellent orator. Just because you disagree with what someone is saying doesn't mean they are ranting. --Tango (talk) 21:54, 22 March 2010 (UTC)[reply]
Most of the news clips of Hitler focused on the end of his speeches, where he was speaking loudly, with great passion, gesturing; and the audience was shouting with him. That part might be considered a "rant", especially without a translation. As I understand it, the way it actually worked was that Hitler would come onstage and stand there silently for a minute or so, gazing over the crowd, getting a sense of their mood, getting their attention and anticipation. Then he would start talking softly and calmly, slowly working up to the frenetic ending that the news clips showed. He wasn't a ranter, he was a seducer. ←Baseball Bugs What's up, Doc? carrots22:04, 22 March 2010 (UTC)[reply]
In the fascism article that is linked to above I find the following:
Fascist states pursued policies of social indoctrination, through propaganda in education and the media, and through regulation of the production of education and media material. Education was designed to glorify the fascist movement and inform students of its historical and political importance to the nation. It attempted to purge ideas that were not consistent with the beliefs of the fascist movement, and taught students to be obedient to the state.
While it doesn't actually answer the question, it may, to an extent, confirm the premise of the question. Bus stop (talk) 21:59, 22 March 2010 (UTC)[reply]
The scary part comes when you ask yourself how different it really was from your own primary school education. ←Baseball Bugs What's up, Doc? carrots22:05, 22 March 2010 (UTC)[reply]
Do as I say rant, not as I do. Clarityfiend (talk) 22:29, 22 March 2010 (UTC)[reply]
Fascist ideology owes a lot to a peculiar reading of German philosophy (people like Heidegger and Nietzsche, and other authors I can't ever spell correctly). As I understand it, the fascist ideology was puritanical socialism. basically, the world is filled with corrupt ideals obeyed by a sheep-like populace and maintained by manipulative political figures (read the first few chapters of "Thus Spake Zarathustra" to get the right impression), and the only way for a man to liberate himself from this was to turn his back on the corrupt social forces and band together with others in adherence to a moral ideology. Obedience wasn't subservience in the Nazi lingo - obedience was the only path to liberation and what we would call self-determination. it's basically the same logic as "Dress for Success" taken to a homicidally philosophical extreme: conform to 'correct' social standards and society will reward you.
in other words, obedience was promoted as the first step in the path to personal realization. --Ludwigs2 22:30, 22 March 2010 (UTC)[reply]
Fascism qua fascism (as distinguished from Nazism, which is something of a variant) is a fundamentally collectivist philosophy, not an individualist one. It is about the strength of collective entities (for Italian fascism, this is the state; for Nazis, this is the race/Volk). Individuals only flourish in a fascist model when acting in the name of the state. An ideal fascist state is coordinated in order to support the will of the individual who truly embodies the will and interests of the state (the leader). (The obvious problem that someone raised with democratic values would put forward is, "How do you know who really embodies the will and interests of the state?") --Mr.98 (talk) 22:54, 22 March 2010 (UTC)[reply]

So the fascists believed that the only right and proper thing to do was to obey the boss figure, and that people who did things like thinking for themselves or taking the initiative were being willfully disobedient and had to be put down? 92.29.120.231 (talk) 15:37, 23 March 2010 (UTC)[reply]

Well, fascists in a philosophical sense are idealists. They believe that there is basically one right answer at all times. That happens to be the answer espoused by the leader in their cases (which of course is tautological... is he the leader because he has the right answers, or are his answers right because he is the leader?). If you don't have the right answer, you are wrong, and there was little tolerance for differences of opinion. Now that does not always mean execution. But it does imply negative consequences. To a true fascist, "thinking for yourself" is just a cloak that wrong people use in order to justify their wrong ideas. "Taking the initiative" was generally discouraged under the German model—the "coordination" of power meant that you would be expressing the direct will of the Fuehrer only if you were doing what you were told to do. Going outside of that will just showed that you were wrong (because the Fuehrer is never wrong). --Mr.98 (talk) 16:14, 23 March 2010 (UTC)[reply]
I just also want to add that my answers here are in terms of how a fascist would self-justify in philosophical terms (based on a course on the philosophy of fascism I took a long time ago now). In practice of course fascists in Italy and Germany were like many other politicians, but had greater political power. They forced out their enemies and those who threatened them. They were a pretty successful form of self-organization from the standpoint of taking over a country (running it in war, not so much). They were fairly ruthless and did not spend their time debating fine philosophical points about who was correct and who was not. Fascists are hardly unique in this respect, but the particularly linearly hierarchical nature of their systems allowed for it to be fairly streamlined. In practice even this was more complicated (organizations vied for power and could be played against each other in some situations, for example). --Mr.98 (talk) 23:52, 23 March 2010 (UTC)[reply]
The article definitions of fascism discusses different interpretations of the term. 66.127.52.47 (talk) 23:22, 23 March 2010 (UTC)[reply]
In the 1950s the Frankfurt School social theorists Theodor Adorno and Max Horkheimer came up with the concept of the 'Authoritarian Personality' - someone with a need to submit to authority figures, who is aggressive towards those they see as 'different'.[37] Most fascist movements were essentially opportunistic and based on demagoguery and the appeal of the 'strong man' who claimed that he could solve his country's problems, as long as he was given unswerving loyalty.--Pondle (talk) 00:07, 24 March 2010 (UTC)[reply]

Why pay banks to lend to the Fed when the largest risk is they aren't lending to their customers?

Is there any actual advantage to paying interest on bank reserves [38] when the largest risk is underwater commercial real estate [39]? 208.54.14.104 (talk) 22:11, 22 March 2010 (UTC)[reply]

The advantage is obviously to the banks. This was a provision rushed through Congress using extortionary rhetoric at a time of crisis by hundreds of Wall Street lobbyists (and major campaign contributors), former Goldman Sachs CEO, Hank Paulson, and by Ben Bernanke, chairman of the Federal Reserve System, which is essentially owned and, to a substantial extent, controlled by its member commercial banks. It is a handout to banks at the expense of every other holder of U.S. currency (in that the money supply is increased and therefore reduced in value relative to a shrinking GDP or a GDP growing very slowly). Marco polo (talk) 23:02, 22 March 2010 (UTC)[reply]
A decidedly insider point of view: [40] NByz (talk) 03:31, 23 March 2010 (UTC)[reply]

More accurately, interest on bank reserves is compensation aimed at encouraging banks to hold higher levels of reserves, thus (it is hoped) reducing the likelihood of their needing an expensive taxpayer-funded bailout. DOR (HK) (talk) 06:07, 23 March 2010 (UTC)[reply]

Is there any evidence that hope is grounded in reality? Isn't a bank allowed to lend a certain multiple of its reserves? If that multiple is re-deposited as additional reserves, isn't that akin to an individual kiting a check, and more importantly, doesn't that increase the risk of exposure if the bank's other assets decline in value, nullifying the benefit of larger reserves? 99.56.137.254 (talk) 09:16, 23 March 2010 (UTC)[reply]
Generally, a bank is allowed to lend out everything but the fraction that is required as a reserve (ignoring various capital requirement schemes like Basel II). Thus, it's assumed that, with profit-maximizing financial institutions, anything not held in reserve will be redeposited (see Fractional-reserve banking), expanding the money supply toward R × (1/RR) (R = total bank reserves, RR = the minimum reserve ratio; see Money multiplier). Fractional-reserve banking (also referred to as "multiple deposit expansion") does indeed increase the interconnectedness and risk of the financial system. It's unavoidable under our current system. An alternative would have to involve creating "classes" of money. (Class one money, for example, could be deposited and lent out. When it was lent out, however, it would have to turn into class two money, which could be deposited but not re-lent. Something like that.) Schemes like paying interest on reserves, some would argue, are a way of creating proper incentives - and giving central banks more control - in a monetary economy such as the one we live in. NByz (talk) 13:21, 23 March 2010 (UTC)[reply]
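The deposit-expansion arithmetic described above can be sketched numerically (round numbers, purely illustrative): iterating "keep the reserve fraction, re-lend the rest" drives total deposits toward R × (1/RR).

```python
# Fractional-reserve deposit expansion: each round the bank keeps the
# required reserve ratio RR of a new deposit and lends out the rest,
# which is redeposited elsewhere in the system.
RR = 0.10      # required reserve ratio (illustrative)
R = 100.0      # initial deposit of base money

deposits = 0.0
lendable = R
for _ in range(200):        # enough rounds to converge
    deposits += lendable
    lendable *= (1 - RR)    # portion re-lent and redeposited

# The geometric series converges to R * (1 / RR) = 1000.0 here.
print(round(deposits, 2))
```

The 10x expansion is the "money multiplier" at its theoretical maximum; in practice banks holding excess reserves (which is what paying interest on reserves encourages) keep the realized multiplier well below 1/RR.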
Is there any evidence that banks have used such control for their chartered purposes of facilitating commerce as opposed to enriching executives at the expense of the remaining vast majority leading to increased poverty because of the conservation of scarce resources? 99.56.137.254 (talk) 13:43, 23 March 2010 (UTC)[reply]
Well the idea actually would be to take a little bit of that decision-making power away from the banks and give it to the central bank (the Federal Reserve). With this tool, the central bank would be able to control the money supply more directly (specifically, it would be able to more effectively take money out of circulation by encouraging excess reserves). An integrated, efficient and competitive banking system is highly correlated with economic growth (efficiency) and, though a western-style central bank system certainly draws its critics (this is somewhat a side note, but Political positions of Ron Paul is a delightfully well-cited and thorough article), it just seems to work well enough. Many economists would say that ideas of income distribution (equity/equality) are a separate matter and perhaps ought to be addressed through the tax code and the welfare system. High executive pay could be handled through a British-style super tax [41], if thought appropriate, for example.
Interestingly and additionally, the Dodd plan, currently in the news, tackles some of the problems of interconnectedness by creating an oversight council that can declare a financial firm (even a non-bank) "integral to the financial system" and therefore put it under the supervision of the Federal Reserve (think AIG, Lehman Bros), requiring all financial firms with more than $50 billion in assets to pay into a special rainy-day fund (for future bailout purposes), and creating a supervised bankruptcy process designed to be as painful as possible to equity holders and long-term creditors while limiting defaults in many short-term or money markets through which systemic risk is spread.
In response to DOR (HK), there are more effective ways of reducing the likelihood that banks will need an expensive taxpayer-funded bailout. These ways include 1) tighter regulation, 2) breaking up large banks so that none is "too big to fail", 3) eliminating perverse incentives that enrich individual bankers for taking on risks. These policies would clearly have been in the interests of taxpayers and the general public. However, they would have limited profits and compensation at big banks. Instead of enacting policies that would have offered more certain protection for the public, our financial leadership pressured Congress into approving a policy that gives free money to banks. This policy in fact comes at the indirect expense of taxpayers and others by increasing the purchasing power of banks and bankers relative to other firms and individuals. In concrete terms, this preserves the privileged access of bankers to the best real estate and amenities of New York City and London (among other things) and keeps these things unaffordable to people from less favored economic spheres. Finally, by failing to address the root causes of the risky behavior that put the financial sector in need of a public rescue, and by leaving the implicit government guarantee in place, this policy allowed banks to keep playing essentially the same game with free money and made it likely that they will need an even more expensive taxpayer-funded bailout in the future. Marco polo (talk) 14:46, 23 March 2010 (UTC)[reply]

Paying interest on bank reserves gives the Fed more control over short-term interest rates. In addition, it gives banks more incentive to hold reserves, as opposed to, say, using sweep accounts to put customer assets in money market funds. The link given above by NByz gives much more detail. John M Baker (talk) 16:41, 23 March 2010 (UTC)[reply]

How much income equality is optimal?

I've recently become familiar with http://www.equalitytrust.org.uk which presents evidence that about a dozen separate quality-of-life measures are more highly correlated with the Gini index of income inequality than with economic (GDP) growth. That makes sense, but growth is very important for things like sanitation in the developing world, competition, technology, etc. Canada seems to be advanced in these respects, with Mexico trailing.

I would like to find a reliable discussion of the optimal degree of income equality in the developing and industrialized worlds. Firstly, does the most appropriate amount of income equality depend on how developed a country is? 208.54.14.40 (talk) 22:32, 22 March 2010 (UTC)[reply]

I understand what you're saying. Under capitalism, for there to be an incentive to work, you must let people who work harder get richer. This promotes an increase in total wealth; however, it also leads to an inequality of wealth. And, while an increase in total wealth helps with something like infant mortality, the inequality in wealth actually hurts it. So, what level would minimize infant mortality? (I just picked that as one measure, but we could use others.)
Rather than take a theoretical approach, I propose looking at infant mortality rates around the world, and then looking at the inequality rates in each nation (Gini index). This should give a good indication of what level of inequality is ideal, at least for that measure. You could then repeat this study looking at another measure, like life expectancy. Then it would be a matter of weighting the different results based on which measures you think are the most important, in order to come up with the ideal level of inequality overall.
You compared Canada and Mexico, but Canada and Mongolia might be a better comparison, since they have a similar Gini index, but Canada has far more per capita income and wealth. Note that I'm assuming a more or less zero-sum game between wealth and inequality. That is, I'm assuming that when one goes up, the other goes down. If this is not assumed, then the problem becomes much more difficult to solve. For example, if the inequality of wealth is too severe, total wealth may also suffer, due to the violent revolutions that this level of inequality will spawn. Also, an uneducated and unhealthy workforce may result, even in times of peace, and this workforce won't be very productive. So, we may find that there is also a certain Gini index that leads to maximum growth. Now, whether this same Gini index is also ideal for the health of the population, or not, might be more difficult to determine. StuRat (talk) 03:06, 23 March 2010 (UTC)[reply]
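The empirical approach proposed above can be sketched in a few lines of Python. Every figure below is a made-up placeholder rather than real country data, and the fixed-width binning is just one arbitrary way of grouping countries:

```python
# A sketch of the empirical approach proposed above: bin countries by
# Gini index and find the bin with the lowest average infant mortality.
# Every figure here is a made-up placeholder, not real country data.

# (country, Gini index, infant deaths per 1,000 live births)
data = [
    ("A", 25, 6.0), ("B", 28, 4.5), ("C", 33, 3.8), ("D", 36, 4.2),
    ("E", 41, 7.5), ("F", 48, 14.0), ("G", 52, 22.0), ("H", 58, 35.0),
]

def best_gini_bin(rows, width=10):
    """Group rows into fixed-width Gini bins and return the
    (bin_start, mean_outcome) pair with the lowest mean outcome."""
    bins = {}
    for _, gini, outcome in rows:
        start = (gini // width) * width
        bins.setdefault(start, []).append(outcome)
    means = {b: sum(v) / len(v) for b, v in bins.items()}
    return min(means.items(), key=lambda kv: kv[1])

print(best_gini_bin(data))  # with these placeholder numbers: (30, 4.0)
```

With real data one would also want far more countries per bin and some measure of uncertainty; with only a handful of observations per bin, the "best" bin is mostly noise.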

I really don't think that there's any one single invariable number (measured according to the Gini index or any other method) which marks the boundary between a healthy society and a sick society in all cases, ignoring the detailed specific context of each society. Furthermore, sometimes inequality of wealth can be at least as important as inequality of income (in the statistics of the United States, for example, it's very noticeable that the difference in average wealth between white families and black families is not leveling out anywhere nearly as rapidly as the difference in average income between white families and black families is leveling out). In simplistic broad-brush terms, if there's such a marked degree of inequality that a society is effectively polarized into two classes of "peasants" and "lords" (or whatever terms the two may be called by), and there's only a very small middle class between the two extremes, and only a very tiny chance that a "peasant" will ever be able to become a "lord", then that's a very strong disincentive for people to try to achieve new things. On the other hand, if taxes on high incomes are so confiscatory that people will see very little direct personal benefit to themselves if they significantly increase their pre-tax incomes, then that can also be a disincentive to economically productive innovations... AnonMoos (talk) 05:04, 23 March 2010 (UTC)[reply]

The Equity Trust’s solution for reducing inequality: more progressive income and property taxes; more generous benefits; higher minimum wages; more generous pensions; running the national economy with low levels of unemployment, (contradicting previous solutions, IMHO); better education and retraining policies; and the bargaining power of trade unions. No indication how to (or who) will pay for all of this.

I’m going to disagree with StuRat on his point that inequality in wealth hurts infant mortality, with an example. Consider two groups of people in the same society, A and B. A’s income goes up by 779%, and B’s income by 594%. In both cases, it would be nonsensical to posit that B’s infant mortality rate increases. Sure, it might not improve as much as A’s, but that is a whole different idea, and one that does nothing to encourage efforts to reduce inequality. (By the way, the case is a real one, and it represents urban and rural China, 1990-2008. Infant mortality among both groups fell during the period as their income inequality grew worse.)

As for the Canada-Mongolia (or, Mexico) comparisons, one has to assume that the legal systems, crime rates, infrastructure and a whole host of other things are identical before one can come to the conclusion that income inequality reduces infant mortality. DOR (HK) (talk) 06:28, 23 March 2010 (UTC)[reply]

To consider the effect of inequality, you must find a case where inequality changed and every other variable stayed the same. In the case of China, which you used, obviously total wealth has also improved dramatically (no doubt due to China ditching communism and moving to capitalism). Now ask yourself, had the wealth of those above the median income gone up, and those below gone down, would that improve infant mortality ? No, because there are far more people below the median, and those above were likely already getting good health care and nutrition, anyway. StuRat (talk) 13:57, 23 March 2010 (UTC)[reply]
By low levels of unemployment contradicting previous solutions, do you mean the balance between employment levels and inflation? Can that balance be addressed by reducing the money supply at the top brackets of steeply progressive income taxes? Don't more progressive income taxes themselves pay for more generous benefits, etc.? 99.56.137.254 (talk) 07:49, 23 March 2010 (UTC)[reply]
To be honest income equality is a bad measure (on its own) of welfare. It's all well and good to divide the pie up equally, but if it's a really small pie everyone's still gonna be doing pretty badly. You would be better looking at the median income as well as the lowest quartile and perhaps some measure of poverty (beware that poverty lines are normally constructed relative to average income, not some objective measure). To go back to your question, what do you mean by optimal? If you asked an economist what is the optimal level of income equality they would probably say "whichever level maximised GDP growth". It also depends on the way governments spend money. A high taxing and spending country with highly unequal income levels may provide better outcomes for the less fortunate than a low taxing and spending country with far more equal outcomes. Simply correlating outcomes and the Gini coefficient doesn't help you understand why those outcomes have come about. Jabberwalkee (talk) 08:05, 23 March 2010 (UTC)[reply]
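The point above that the Gini coefficient ignores the size of the pie follows directly from how it is defined. A minimal Python sketch (all income figures are invented for illustration) computes the coefficient from the standard rank-based formula and shows it is unchanged when every income is scaled up:

```python
# The Gini coefficient measures only relative dispersion: scaling every
# income by the same factor leaves it unchanged, so a perfectly equal
# but tiny "pie" scores 0 just like a perfectly equal large one.
# All income figures below are invented for illustration.

def gini(incomes):
    """Gini coefficient (0 = perfect equality, -> 1 = extreme inequality),
    via the standard rank formula over the sorted incomes:
    G = 2 * sum(i * x_i) / (n * sum(x)) - (n + 1) / n, ranks i = 1..n."""
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    rank_sum = sum((i + 1) * x for i, x in enumerate(xs))
    return 2 * rank_sum / (n * total) - (n + 1) / n

small_pie = [100, 100, 100, 100]              # equal but poor
big_pie = [50_000, 50_000, 50_000, 50_000]    # equal and rich
skewed = [1_000, 2_000, 4_000, 93_000]        # one person holds most income

print(gini(small_pie), gini(big_pie))  # both 0.0, despite 500x the income
print(round(gini(skewed), 3))          # 0.695
```

This is why the comment above suggests looking at median income and the lowest quartile alongside any inequality measure: the Gini coefficient alone cannot distinguish shared poverty from shared prosperity.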
To the questioner: the correlation of all those quality-of-life variables with the Gini index doesn't mean that the increase in the Gini index caused increases in those measures. Zain Ebrahim (talk) 09:29, 23 March 2010 (UTC)[reply]
Indeed, so the question becomes, does increased quality of life cause income equality, or do the two share a common cause? The book The Spirit Level: Why More Equal Societies Almost Always Do Better goes into detail about how, for example, income inequality can lead to problems with higher education and health care affordability, quality parenting (e.g., who can afford babysitters), and other factors which suggest that the chain of causation travels from income equality to quality of life. 99.56.137.254 (talk) 09:40, 23 March 2010 (UTC)[reply]
Before you can work out what is optimal, you need to work out what you want to achieve. What are your priorities? Happiness? High life expectancy? Low infant mortality? Peace? Freedom? Global influence? Chances are, you want a balance of all those things, so the question becomes how do you weight them. There is then the issue that inequality is not directly controllable. You have to change other things (tax structures and benefit structures are the obvious ones, but education has a big impact too, as do various other things). Those things will affect more than just inequality. That makes it all very complicated and I doubt there actually is a unique optimal level of inequality for most reasonable weightings of priorities. --Tango (talk) 09:53, 23 March 2010 (UTC)[reply]
I think with extreme inequality (feudalism, Czarist Russia, banana republics of various sorts) you get violent revolution, and with enforced equality (various attempts at socialism) you get a Handicapper General situation. That suggests there is an optimum somewhere, and it would be interesting to draw a graph comparing social unrest to inequality level in various societies through history. 66.127.52.47 (talk) 23:33, 23 March 2010 (UTC)[reply]
There may be an optimum of a sort, but it's unlikely to be something which can be validly expressed as one single abstract number, independent of all the particularities and specific characteristics of each society. AnonMoos (talk) 01:05, 24 March 2010 (UTC)[reply]
This probably isn't very helpful, but I remember one Economics professor from my past mentioning and emphasizing that the tradeoff between efficiency and equity necessarily required some sort of value judgement, and though economic theory could certainly help with quantifying or estimating results, it could say nothing of substance about at what rate one ought to trade one off against the other. NByz (talk) 01:22, 24 March 2010 (UTC)[reply]

Politics

Politics: Who was the last Republican president to balance the budget? —Preceding unsigned comment added by 216.237.69.212 (talk) 22:52, 22 March 2010 (UTC)[reply]

According to this source, the last Republican president under whom the Federal budget was not in deficit was Dwight D. Eisenhower, exactly 50 years ago in 1960. (I have not bothered to link to a source showing the deficits under Ford, Reagan, Bush I, and Bush II, but if you do the research, you will see that they ran deficits every year they were in office. The last president of any party under whom the budget was not in deficit was Bill Clinton.) Marco polo (talk) 23:19, 22 March 2010 (UTC)[reply]
Those were budgets balanced within a given fiscal year, of course. The last time the U.S. was out of debt or nearly so was around 1920. United States public debt has a good overall history and also a number of links to related articles, including one showing debt levels during administrations since WWII. ←Baseball Bugs What's up, Doc? carrots 01:20, 23 March 2010 (UTC)[reply]
Interesting, true and unrelated. Deficits aren't debt (yet). DOR (HK) (talk) 06:31, 23 March 2010 (UTC)[reply]
Although the questioner implies that it's the president who can balance the budget, the modern process is better understood as a joint effort (or battle) between Congress and the president. And for the two branches to achieve a balanced budget, factors outside their immediate control often play a major role. In The Federal Budget: Politics, Policy, Process (rev. 2000), Allen Schick concluded that Clinton had "little to do" with the underlying reasons that allowed for a balanced budget "on his watch" (particularly a booming economy and the end of the Cold War), but that it made political sense for him to take credit for the achievement, since he would have gotten the blame had it gone otherwise. —Kevin Myers 08:12, 23 March 2010 (UTC)[reply]
I agree that the parties in control of Congress are as important as the president in determining the budget balance. As it happens, Clinton balanced the budget during a time when Republicans controlled both houses of Congress. However, George W. Bush failed to balance the budget even though he had the benefit of a Republican House from 2001–2007 and a Republican Senate from 2003–2007. It's true that GWB's first years in office were during a recession, when it is difficult to avoid a budget deficit. The same excuse cannot be made for the six years from 2003–2009, during most of which he enjoyed a Republican majority in both houses. I would also point out that the last Republican president to balance a budget, Eisenhower, did so when Democrats controlled both houses of Congress. Marco polo (talk) 14:34, 23 March 2010 (UTC)[reply]
Clinton and the Democrats raised taxes in 1993 before the Republicans gained control of Congress the following year. Of course there is endless political spin on both sides about how that affected the economy in the years that followed, but as a zeroth order approximation, deficits are when expenditures exceed revenues, so increasing revenues may have had something to do with decreasing/eliminating the deficit. 66.127.52.47 (talk) 00:24, 24 March 2010 (UTC)[reply]

Looking for a book I read about five years ago

I can't remember anything except the plot. A man was expelled from a city for practicing "wild magic" (as opposed to regular magic). He escapes the subsequent manhunt with the help of a unicorn, and he must promise celibacy for 1 year in exchange. Outside, he meets his sister, elves, and other things. (Here's where my memory gets vague). There is discussion of the elves and humans joining forces in advance of some menace or another... Ring any bells? (The book is first in a series.) Mxvxnyxvxn (talk) 22:52, 22 March 2010 (UTC)[reply]

Googling "wild magic unicorn" finds The Outstretched Shadow. -- Finlay McWalterTalk 23:45, 22 March 2010 (UTC)[reply]
That's it! Thanks. Mxvxnyxvxn (talk) 17:28, 23 March 2010 (UTC)[reply]

Has Charles Manson ever lived in or near Lyons, Oregon? —Preceding unsigned comment added by Averybound (talkcontribs) 22:56, 22 March 2010 (UTC)[reply]

According to the article, now linked, Manson was born in the Midwest and stayed there until he moved to California. His mother once lived in "the Pacific Northwest", but not with Charles. ←Baseball Bugs What's up, Doc? carrots 01:23, 23 March 2010 (UTC)[reply]
You know, I'm always curious what the parents of people like Manson, Dahmer, Bundy, etc. think about the whole thing. Did anyone ever do an interview with Manson's parents? --Ludwigs2 03:54, 23 March 2010 (UTC)[reply]
I'm not aware of one. I recall seeing interviews with Bundy's mother and with Dahmer's parents. They were as perplexed as anyone about why things turned out as they did. I think Bundy came from a broken home. Dahmer is the puzzler. As I recall, they said he had a sadistic bent even when he was five years old - killing animals and dissecting them, stuff like that. I don't think there was any suspicion that his parents were abusive. He just had some weird genetic bent, apparently. As with H. H. Holmes, perhaps. ←Baseball Bugs What's up, Doc? carrots 05:26, 23 March 2010 (UTC)[reply]
http://www.wowessays.com/dbase/af5/dtb172.shtml suggests that the parents of terrible criminals are often poorly skilled at parenting and/or unable to obtain affordable parenting support facilitation. 99.56.137.254 (talk) 12:25, 23 March 2010 (UTC)[reply]
also, I just noticed that his next parole hearing is scheduled for 2012, the same year that the Mayan calendar ends. coincidence? I think not! --Ludwigs2 03:56, 23 March 2010 (UTC)[reply]
They should grant him parole just as the earth is breaking apart. That would be a little joke on the old guy. ←Baseball Bugs What's up, Doc? carrots 05:20, 23 March 2010 (UTC)[reply]

March 23

mining industry in Canada

What is the present situation: facts and figures, the economic strengths or successes, problems faced, main issues and controversies, policies adopted and implemented by the government, and the extent of success and failure of these policies? —Preceding unsigned comment added by 76.64.54.19 (talk) 01:39, 23 March 2010 (UTC)[reply]

Your assigned subject of research cannot be reduced to a question that the Reference Desk can answer.--Wetman (talk) 03:54, 23 March 2010 (UTC)[reply]
Natural Resources Canada and its website are probably a good place to start. Adam Bishop (talk) 04:07, 23 March 2010 (UTC)[reply]
You could also search the archives of the Hill Times [42]. They specialize in reporting on issues and controversies in key Canadian economic sectors. --Xuxl (talk) 14:54, 23 March 2010 (UTC)[reply]

Housewife

What percentage of teenage girls in Western countries want to be housewives rather than pursuing their own careers? Am I correct in my observation that this number is much lower than the percentage of women who are currently stay-at-home moms? --99.237.234.104 (talk) 05:47, 23 March 2010 (UTC)[reply]

It is going to depend on whether you mean being a housewife for life, or just while their children are growing up. The latter will give a much larger number. --Tango (talk) 09:55, 23 March 2010 (UTC)[reply]
Either for life or while their children are growing up. It takes 12 years to raise a child to adolescence, which is too long for a woman who's serious about having a career. --206.130.23.67 (talk) 12:14, 23 March 2010 (UTC)[reply]
I take exception to that one, as you'd expect: so far in my life I've had three careers: a secretary, a lecturer and a therapist. Plenty of time for me to have had children as well in a working life of 30 years and counting! --TammyMoet (talk) 13:03, 23 March 2010 (UTC)[reply]
I guess the dispute is over the definition of "career". Is it really a career if you only spend a few years doing it? I would tend to agree that taking 12 or so years off work to have children (and assuming you work part-time while they are at school during the period) doesn't preclude having a career, though. (I'm not sure why the OP has chosen the beginning of adolescence as the cutoff point, though. Puberty doesn't seem relevant to me.) --Tango (talk) 13:12, 23 March 2010 (UTC)[reply]
There is also a big difference between "want to..." and "forced to by circumstances". I think many women would prefer to pursue a career, but a lack of affordable child care causes many to give up work until their children are able to look after themselves. Astronaut (talk) 13:31, 23 March 2010 (UTC)[reply]
On the other hand, some women would like to stay at home, but are thwarted by the fact that it's very difficult for a family in the United States to lead a middle-class lifestyle without two incomes these days. AnonMoos (talk) 13:58, 23 March 2010 (UTC)[reply]

The whole idea of a "stay at home mom" (SAHM) is practically unknown in some countries, for example Denmark. There a SAHM is very lonely, and so are the children, since there are no other SAHM's or children in the neighborhood to socialize with or play with, like in the "olden days". Once the rather long maternity leave (which applies to both parents) is over, they are back to work: "Altogether parents are entitled to 52 weeks paid maternity leave." [43] It's a rather interesting fact that few men use it. This is likely because, even though job security is very good there, leaving work for too long can have a very negative effect on a career, depending on what profession one is in. -- Brangifer (talk) 14:10, 23 March 2010 (UTC)[reply]

Do I detect that you imagine that all "western" countries are just like America, but with different languages or a funny accent? That seems to be a common idea among Americans from what I've read here. The percentage will vary from country to country. In Britain, for example, it would be rare for a woman to be a "housewife", although she could be at home because she could not get a job or temporarily while looking after young children. The percent of teenage British girls who want to be "housewives" must be near zero I expect. "Housewife" is an antiquated and rather derogatory word here. 92.29.120.231 (talk) 15:54, 23 March 2010 (UTC)[reply]
I've no idea how much variation there is. I live in Canada; can you give an idea of how Canada compares to America, Britain, and other European countries? As for "housewife" being derogatory, I didn't know that, but if it's simply due to political correctness I couldn't care less. --206.130.23.67 (talk) 17:30, 23 March 2010 (UTC)[reply]
It's not really derogatory: it's not the word that is looked down on, it's the people. Housewives are not particularly respected. There is a perception of laziness, since housework is far from a full-time job in these days of refrigerators and washing machines - a housewife is presumed to be a "lady of leisure" (an exception is made for those with young children, but they wouldn't usually use the term "housewife", although there isn't a particularly accepted neutral term - most women that stay at home with their children would say just that, rather than try to put a name to it). There is also a feminist viewpoint that housewives are letting women down. --Tango (talk) 17:50, 23 March 2010 (UTC)[reply]
Tango, you obviously do not know any housewives who have children. It is more than a full-time job. Comet Tuttle (talk) 18:40, 23 March 2010 (UTC)[reply]
I explicitly said it's different if you have young children. --Tango (talk) 20:16, 23 March 2010 (UTC)[reply]
We should create a new job title for them. Perhaps call it, a Victorian Traditionalist or something. Googlemeister (talk) 18:26, 23 March 2010 (UTC)[reply]
"Homemaker", for example. The term "housewife", my Webster's defines as, "a married woman in charge of a household". That doesn't sound so demeaning, does it? But it was made so, by elements such as those described by Tango. A little history: In the old days, nearly everyone worked, particularly among the poor (hence the early-1900s comic song, "Everybody Works But Father"). Women only stopped working long enough to bear children. And they were still expected to manage the home. As with the old saying that my Mom used to bring up, "Man works from sun to sun, but woman's work is never done." And it was grueling, due to the lack of modern conveniences. For example, Monday was typically "Washday", which was an all-day activity. Not that being a man was a picnic either. But a woman not having to work for a living and/or being able to hire maids and the like, and truly manage the house as opposed to doing all the work, was a sign of status, of being well-off. Following WWII, when prosperity finally came after a couple of decades of Depression and War, and with the development and growth of labor-saving devices, a lot more women could become "just" housewives because they no longer "had to" work for a living as such. That's largely true about suburbia. But the reality is that women in poor families still have to work and manage the home, just as they always did, since feeding the family is top priority. ←Baseball Bugs What's up, Doc? carrots 19:25, 23 March 2010 (UTC)[reply]
All of this is very interesting, but it doesn't get me closer to an answer. Is anybody looking for a numerical answer to the original question? --99.237.234.104 (talk) 20:36, 23 March 2010 (UTC)[reply]

Health Care Bill

I read that the Health Care Bill will extend cover to 95% of Americans. Who are the 5% who are included out? - Kittybrewster 11:41, 23 March 2010 (UTC)[reply]

According to The Guardian, the 5% are illegal immigrants, people eligible for Medicaid who don't use it, and poor people exempted from having to buy insurance. But The Guardian says "Exact figures on who will make up this grouping are hard to find." Maybe we'll soon have a better idea since, as Ms Pelosi so aptly put it: "[W]e have to pass the bill so that you can find out what is in it...." —Kevin Myers 12:44, 23 March 2010 (UTC)[reply]
Mostly undocumented migrant workers from Mexico, who are undergoing a tuberculosis epidemic. Could the Canadians have been on to something when they did that universal thing? 99.56.137.254 (talk) 13:23, 23 March 2010 (UTC)[reply]
Also maybe the British, French, Germans, Australians, New Zealanders... DJ Clayworth (talk) 14:28, 23 March 2010 (UTC)[reply]
Illegal immigrants in Canada receive a health card? Our article Health care in Canada doesn't address the issue. —Kevin Myers 14:35, 23 March 2010 (UTC)[reply]
Apparently so. LOL 99.56.136.197 (talk) 15:51, 23 March 2010 (UTC)[reply]
Amusing, but the question remains. —Kevin Myers 16:04, 23 March 2010 (UTC)[reply]
Congratulations to the US of A in finally catching up with Britain (well, 95% of the way) sixty-two years later. 92.29.120.231 (talk) 15:58, 23 March 2010 (UTC)[reply]
Illegal immigrants aren't entitled to free-at-the-point-of-use (non-emergency) health care in the UK, either. However, even with the recent bill, American healthcare will be nothing like the NHS. For example, there are no deductibles here. As I understand it, for those in the US with the cheapest health insurance, health care can still be very expensive. The recent bill doesn't change that, as far as I know. --Tango (talk) 16:12, 23 March 2010 (UTC)[reply]
Patient Protection and Affordable Care Act is our article on the new law. Comet Tuttle (talk) 17:50, 23 March 2010 (UTC)[reply]

OFEX stock market, UK

Ofex re-directs to Plus. When did the change of name happen? Why did it happen? Is Plus exactly the same as Ofex except for a change in name? Thanks 92.29.120.231 (talk) 16:37, 23 March 2010 (UTC)[reply]

According to this corporate history page Ofex Holdings plc was renamed as PLUS Markets Group plc in November 2004. Gandalf61 (talk) 16:46, 23 March 2010 (UTC)[reply]

what qualifications can you get overnight

If you wake up tomorrow with all the mental skills (including memories of their experience) that anyone possesses, to include world-famous surgeons, scientists and mathematicians, and anything that is "mental" in nature, then what paper qualifications could you get within the next few days, assuming you have the money? 82.113.121.34 (talk) 18:27, 23 March 2010 (UTC)[reply]

Nothing worth having. You could gain membership to a high IQ society, but proper academic qualifications, medical licenses, etc., all require far more than a few days to acquire. WhatamIdoing (talk) 18:34, 23 March 2010 (UTC)[reply]
One of the easier ones may be passing the local bar exam to qualify to practice law; there is no requirement to have a law degree to take the bar exam (in the US and many other countries, at least). (If I'm reading our article correctly, though, the bar exam is generally given only twice a year throughout the US.) Medical degrees are probably out; the Doctor of Medicine ("MD") has training requirements that will take a long time; though your hypothetical savant would ace them quickly, it's not quick enough for your needs. You have to write a book, basically, to get a Ph.D, so that's out. Comet Tuttle (talk) 18:36, 23 March 2010 (UTC)[reply]
It is conceivable to me that a person of immense brilliance could write a Ph.D. dissertation in mathematics or theoretical physics in a few days—they aren't necessarily very long. There may be other degree requirements, such as classes, that would take longer if the requirements aren't waived. Convening a committee for the thesis defense can also take quite a while. -- Coneslayer (talk) 18:45, 23 March 2010 (UTC)[reply]
Immense brilliance and luck. If you are unlucky then, however brilliant you are, you could easily waste days (or longer) on a dead-end approach to a proof. The dissertations are only short because they miss out all the work that didn't go anywhere. --Tango (talk) 19:00, 23 March 2010 (UTC)[reply]
My condolences, Tango. Just remember: you fail 100% of the shots you don't take. — Preceding unsigned comment added by 82.113.121.38 (talk)
I think that's where getting everyone's experience (as stated in the question) helps. Probably somewhere in there is a brilliant, nearly-complete idea that someone just hasn't written up yet. -- Coneslayer (talk) 19:14, 23 March 2010 (UTC)[reply]
It seems you do need a law degree to be admitted to the bar in the US. There are some exceptions, but none that can be qualified for in a few days. --Tango (talk) 19:00, 23 March 2010 (UTC)[reply]
I stand corrected! Thank you, Tango; I struck my "easiest" claim above. This link lists some famous non-law-degree lawyers, like Abraham Lincoln, but as the article states, only 7 states in the US currently allow "reading" into the bar with the exam and no law degree; and all of those require some time apprenticing or the like. Comet Tuttle (talk) 20:38, 23 March 2010 (UTC)[reply]
aren't there certificates you can basically just sit for? 82.113.121.38 (talk) 19:06, 23 March 2010 (UTC)[reply]
Plenty of technology training companies offer week-long courses with exams at the end. They probably won't turn you away from the final exam just because you didn't show up for the course, provided you paid.
In Ontario you can get a license to operate a powered watercraft by just sitting the exam. DJ Clayworth (talk) 19:09, 23 March 2010 (UTC)[reply]
Assuming you have the money, you could get any qualification or degree you wanted in a few days. See Bribery. Googlemeister (talk) 19:41, 23 March 2010 (UTC)[reply]
That's assuming you could find someone willing to be bought. -- Jack of Oz ... speak! ... 20:12, 23 March 2010 (UTC)[reply]
Even just finding that corrupt official could require several days' effort. ←Baseball Bugs What's up, Doc? carrots20:16, 23 March 2010 (UTC)[reply]
They would not need to be an official. The poor schlub who prints the certificates would suffice. Googlemeister (talk) 20:52, 23 March 2010 (UTC)[reply]
You need someone to put your name in the database in case someone phones them to verify the qualification. --Tango (talk) 00:21, 24 March 2010 (UTC)[reply]
If you want a fairly meaningful qualification, you may well need to find several corrupt people. There are often moderation systems in place to make sure one person isn't solely responsible for giving out certificates. --Tango (talk) 20:33, 23 March 2010 (UTC)[reply]
That might change the size of the bribe, but not the end result. If a bribe will not do, there are other illegal means of persuasion that can be bought. Googlemeister (talk) 20:53, 23 March 2010 (UTC)[reply]

These answers bite. I was thinking that maybe it is possible you guys are all thinking of bachelors, masters and so forth. Obviously you can't get one of those, but there are a lot of other paper qualifications, and I'd like to hear some of them. An example would be getting a certificate like the DELF (French) saying you speak a certain foreign language. Any other ones? 82.113.121.37 (talk) 20:17, 23 March 2010 (UTC)[reply]

If we're so rubbish, why are you asking us for help? Answer the question yourself if you're so brilliant. --Tango (talk) 20:33, 23 March 2010 (UTC)[reply]
Apparently he (or is it "they") thinks we're being paid for this. :) And I have to wonder about the example he gave. How could you immediately get a certificate saying you know French? Maybe by taking a standardized French test? Yes, that by itself might be quickly done. But unless you speak French fluently as a second language and/or have been trained in how to speak French, it's not an "overnight" process. (Leaving out the bribery hypothesis.) ←Baseball Bugs What's up, Doc? carrots 21:38, 23 March 2010 (UTC)[reply]
And as far as "finding someone to bribe", well, consider the recent George Ryan case. While he was Secretary of State of Illinois, his office took bribes from guys who wanted to get special trucking licenses without having to go through the work needed to qualify. The smoking gun for that kind of thing eventually comes, though, and in this case it was a fatal accident involving a driver with an illegally bought license. ←Baseball Bugs What's up, Doc? carrots21:43, 23 March 2010 (UTC)[reply]

First of all, I am sorry about my tone of frustration; I have stricken it, though not from the record. As for Baseball Bugs' last statement above, it seems to show that I communicated very unclearly, as BB says "unless you speak French fluently as a second language". If I had phrased my original question more coherently, speaking French obviously falls into the category of a mental activity, including all of one's experience and memories. I don't know how I can phrase myself more clearly; I will think about it and come back and try again. 82.113.121.34 (talk) 22:14, 23 March 2010 (UTC)[reply]

Should we also assume you are assuming "by honest means", as opposed to bribery as some have semi-facetiously suggested here? ←Baseball Bugs What's up, Doc? carrots22:22, 23 March 2010 (UTC)[reply]


Hi, I just don't know how I can be more clear. If you look at these accreditation associations, for example, and then extrapolate to every other similar organization anywhere in the world: which ones will grant you a paper accreditation you can just sit for, or answer orally or by computer, or pass by means of a demonstration, and so on? Thank you. If you need further clarification of what I'm asking, please ask me and I will explain. Thank you for your contributions. 82.113.121.34 (talk) 22:55, 23 March 2010 (UTC)[reply]

Yeah I can't believe no one has answered the question yet (except the Ontario boating one, that was real). Here are a couple personal to me:
Canadian Securities Course: Requires only two sittings that can be scheduled, I think, as little as two weeks apart. You need to pay for the books even if you're not going to use them.
General Securities Representative Exam (Used to be called the "Series 7"): The US version, similar to above, requires only one sitting, limited only by scheduling.
CompTIA's A+ certification: Requires only two sittings and I don't think there are any time requirements except what you can schedule with your local provider. You DON'T need to buy any books. Just ~ $150 for the exam. (CompTIA has several other IT certs that can also be completed in a single exam)
Level one of the Chartered Financial Analyst program: Needs to be signed up for by Sept 15 for a mid December sitting (only 3 months; not bad). Levels two and three take another year each unfortunately. You need to buy the books even if you're not going to use them. NByz (talk) 00:53, 24 March 2010 (UTC)[reply]

Is there a psychological condition where people hear themselves being narrated?

Is there a psychological condition where people hear themselves being narrated (or at least fragments), so that if they start doing something but change their mind they hear the fragment "...but thought better of it..." and so on? I know this has been the premise of movies; I can't remember just which one at the moment, but it involves a watch and a chain-smoking writer. My question is whether hearing oneself narrated is an actual existing psychological condition/disorder. Thank you. 82.113.121.38 (talk) 19:03, 23 March 2010 (UTC)[reply]

Hearing voices is a type of auditory hallucination. The voices some people hear vary enormously in how they sound and what they say; suggestions, orders, threats, personal comments on the hearer and others, compliments, irrelevant nonsense, and general chit-chat may all be experienced. A running commentary on what the hearer is doing is an experience that is quite regularly reported, whether fragmentary as you suggest or more coherent and long-lasting. I haven't found any evidence that the narrative type of auditory hallucination is regarded as a particular disorder, separate from other experiences of hearing voices. Our article about the support movement for those who hear voices, Hearing Voices Movement, is well referenced and you may find more information via its links and citations. Karenjc 20:47, 23 March 2010 (UTC)[reply]

Stranger than Fiction is the movie you're thinking of. On a side note, isn't auditory hallucination also a side effect of schizophrenia? 24.189.90.68 (talk) 20:49, 23 March 2010 (UTC)[reply]

Hearing voices may be a symptom of schizophrenia (the term side effect generally refers to unintended effects of medicines). It is probably a dissociative disorder; it sounds vaguely like depersonalization, but there may be multiple causes of hearing oneself narrate one's own life, not all necessarily a mental disorder. Intelligentsium 21:00, 23 March 2010 (UTC)[reply]
Yes, hearing voices can be one symptom of diagnosable psychotic disorders such as schizophrenia, although the definition of such disorders has changed over time and continues to do so. However, some people who hear voices do not exhibit any other symptoms of a mental health disorder, and there is now some recognition that such people may be able to manage their voices and lead an otherwise normal life without medical intervention, which is the premise of the Hearing Voices Movement. See Anti-psychiatry. It is a controversial area. Karenjc 21:06, 23 March 2010 (UTC)[reply]
Sounds like Scrubs :) 76.229.239.145 (talk) 22:33, 23 March 2010 (UTC)John[reply]

Name the fallacy

Which logical fallacy is inherent in the following argument (putting aside the question of whether either statement is true): "Darwin recanted on his deathbed; therefore the theory of evolution must be false"? 137.151.174.176 (talk) 20:04, 23 March 2010 (UTC)[reply]

That sounds like a homework question. Logic 101 at Cal State? :) You're looking for a term. Maybe another Logic 101 student can come up with it. In this case, Darwin deciding he was wrong would not prove that he was wrong, except in his own mind, unless he did more than just "recant": he would also have to supply evidence sufficient to contradict his previous theory. Whether he would have time to do all that, while on his deathbed, would depend on what he was dying from. ←Baseball Bugs What's up, Doc? carrots20:12, 23 March 2010 (UTC)[reply]
Logically enough, the Fallacy article gives a similar example and labels it an Irrelevant conclusion. ←Baseball Bugs What's up, Doc? carrots20:15, 23 March 2010 (UTC)[reply]
It was not a homework question. This argument was brought up in Philosophy 333: Evolution and Creation and I just wondered which fallacy it was. 137.151.174.128 (talk) 21:15, 23 March 2010 (UTC)[reply]
It could also be considered the genetic fallacy, meaning that an idea is questioned based on its origin. StuRat (talk) 20:29, 23 March 2010 (UTC)[reply]
Very good. :) The OP said it was hypothetical, so another way to put it would be, "What is the truth value of the following statement: 'IF Darwin renounced his theory of evolution on his deathbed, THEN the theory of evolution is false.'" And the answer is that the truth value is PROBABLY FALSE, because given only the information we have (i.e. no elaboration on whether Darwin produced counterevidence), it would be an irrelevant conclusion. And I think the bold part is the answer to the OP's question. ←Baseball Bugs What's up, Doc? carrots21:33, 23 March 2010 (UTC)[reply]
doesn't the law of excluded middle (tertium non datur or whatever) say your another way to put it would be, "What is the truth value of the following statement: 'IF Darwin renounced his theory of evolution on his deathbed, THEN the theory of evolution is false.'" And the answer is that the truth value is PROBABLY FALSE is an impossible logical option? 82.113.121.34 (talk) 21:58, 23 March 2010 (UTC)[reply]
I say "probably false" because we don't have enough information to definitively say it's false. We have to make certain assumptions, i.e. that he was dying quickly, and therefore he didn't have time to construct a devastating counterargument. Maybe if he was dying from a long-term case of ringworm, he would have had time. It's been a long time since Logic 101, but typically with true/false questions, if it's not definitively true, then it's false. So if forced to choose one or the other, then it's FALSE. ←Baseball Bugs What's up, Doc? carrots22:19, 23 March 2010 (UTC)[reply]
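In classical two-valued logic, which the "excluded middle" question above invokes, a conditional is a plain truth function with only two possible values; there is no third "probably false" option. A minimal Python sketch (the propositions here are arbitrary placeholders, not claims about Darwin):

```python
# Material implication: "if P then Q" is false only when P is true
# and Q is false; every combination of inputs yields exactly True or False.
def implies(p, q):
    return (not p) or q

# Print the full truth table for the conditional.
for p in (False, True):
    for q in (False, True):
        print(f"P={p!s:5} Q={q!s:5} P->Q={implies(p, q)}")
```

The "probably false" verdict above is really a statement about our uncertainty over the premises, not a third truth value of the conditional itself.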
I would class this as an appeal to emotion. The suggestion being that fearing God and the afterlife, Darwin forswore his heathen teachings. Well, fear is an emotion. It is also the mind-killer. Vranak (talk) 23:23, 23 March 2010 (UTC)[reply]
That could explain why Darwin (or anyone) would hypothetically renounce something they had said earlier. But his renouncing it does not logically lead to the conclusion that his theory is false, unless he provides counterevidence that demolishes his theory. However, I could see where a religionist might jump to that false conclusion. Perhaps that's what you're getting at? ←Baseball Bugs What's up, Doc? carrots23:28, 23 March 2010 (UTC)[reply]
It's not appeal to emotion; that's quite different ("Believe this, if you are a true patriot!" and the like). I think genetic fallacy is probably the best answer yet given. It's equivalent to saying, "if Darwin didn't believe his theory, it must not be true." The fallacy lies in asserting the theory's truth status has solely to do with whether Charles Darwin specifically believed in it. --Mr.98 (talk) 23:43, 23 March 2010 (UTC)[reply]
(Incidentally, in case anyone was curious, Darwin did not recant. See Elizabeth Hope for more information on the facts behind this.) --Mr.98 (talk) 23:43, 23 March 2010 (UTC)[reply]
I'd class it as argument from authority. Despite the fact that Darwin was credited as the discoverer, he does not enjoy any special privilege in arbitrating the truth/falsity of the theory. (e.g. even if Darwin had wanted to forswear the theory of evolution, Alfred Russel Wallace would have likely been willing to take up the role as its champion.) -- 140.142.20.229 (talk) 00:35, 24 March 2010 (UTC)[reply]
To put this in some perspective, consider this assertion, parallel to the OP's question, except it's based on a true fact, albeit leaving some other key facts out: "Galileo recanted; therefore the theory of the heliocentric planetary system must be false." ←Baseball Bugs What's up, Doc? carrots01:24, 24 March 2010 (UTC)[reply]

Indigenous people of the United States

How many indigenous Amerindians are left in the U.S.? I would imagine it's below 1% due to what happened to them. B-Machine (talk) 20:51, 23 March 2010 (UTC)[reply]

It is difficult to define native American. There are probably very few, if any, people whose ancestors are all from America all the way back to the initial colonisation of the continent thousands of years ago. So, how much native American blood do you require someone to have to count as native American? If you count anyone with any native American blood then there are probably more now than ever before just because the total population of the continent has increased so much. --Tango (talk) 21:02, 23 March 2010 (UTC)[reply]
Our Demographics of the United States article gives a number of 2.4 million, or 0.8%. Googlemeister (talk) 21:08, 23 March 2010 (UTC)[reply]
Thank you, Googlemeister. References on the Reference Desk please, rather than guessing. This page from the 2000 U.S. Census says that 0.87% of the US population stated they were "American Indian" or "Alaska Native" alone. The number rose to 1.53% if you count "alone or in combination", meaning people who stated they were one of those and also reported belonging to 2 or more "races". Comet Tuttle (talk) 21:08, 23 March 2010 (UTC)[reply]
This number is within the rather large range of estimates, 1-18 million, of natives living in the current US in Pre-Columbian times. I guess it is hard to count people living in unexplored territory. Googlemeister (talk) 21:10, 23 March 2010 (UTC)[reply]
It looks like the Native American population dropped off sharply, by maybe 80%, after first contact, mainly due to European diseases: See Native_Americans_in_the_United_States#European_explorations. Since then, the population has grown, although not as fast as other portions of the US population, since, unlike other ethnicities, there's very little immigration to the US of Native Americans. :-) StuRat (talk) 21:20, 23 March 2010 (UTC)[reply]
Why not? There were a lot of natives in current day Canada and Mexico as well. Googlemeister (talk) 21:24, 23 March 2010 (UTC)[reply]
If they come from outside the US, then, by definition, they aren't "indigenous people of the United States", they are "indigenous people of Canada and Mexico". StuRat (talk) 21:28, 23 March 2010 (UTC)[reply]
You said Native Americans, which includes those from Canada and Mexico. ScienceApe (talk) 21:37, 23 March 2010 (UTC)[reply]
The OP asked about only Amerindians in the US. What I want to know is whether they have a "Status Indian" type of classification in the US census like in Canada. Note, Métis and Inuit are counted separately in the Canadian census. --Kvasir (talk) 22:40, 23 March 2010 (UTC)[reply]

Longevity of political systems or forms of government

Fascism for example seems to have only existed for about twenty years, which is not long in the history of mankind. Communism was about seventy years in Russia but still continues in one or two countries. Socialism is, I guess, about a hundred years so far. How long have other systems lasted? 78.149.133.100 (talk) 21:11, 23 March 2010 (UTC)[reply]

The United States is generally described as a Constitutional republic, more specifically, as the longest continuous constitutional republic. We have had our current constitution since 1788, so 222 years and counting. Googlemeister (talk) 21:21, 23 March 2010 (UTC)[reply]
Longevity of particular forms of government is difficult to measure, because governments shift form. For instance, the US has technically made one major shift in governance (from pre-civil war confederalism to post-civil war federalism), and some other smaller but significant changes (addition of Judicial review, changes in suffrage, commercialization of the military, the introduction and vastly increased influence of lobbyists and corporate influence). I sincerely doubt that Washington, Jefferson, or Franklin would approve of - or even fully recognize - the thing we credit them with creating. Kingdoms tend to last a long time, but mostly under different dynasties, as one family line is murdered off and replaced (there's a loose 3-4 generation rule for most familial dynasties). Empires have shorter life spans than kingdoms, mostly because they lack the internal cohesion of kingdoms - outlying regions tend to spin off, go into revolt, or get picked off by surrounding opponents. Revolutionary republics tend to be short-lived because one of the inevitable first steps in a revolution is the destruction of the governmental structures that would otherwise hold the nation together (police, judiciary, civil service...). Fascism would likely have lasted a lot longer except that it connected itself with rabid expansionism. "Communism" in Russia actually constituted at least three different systems (pre-WWII, Stalinism, cold-war period), none of which (except perhaps the first) could really be referred to as communist.
Frankly, it's a badly framed question. Are you asking about nations or systems? Finding the life-span of nations is easy; finding the life-span of systems is next to impossible, because the system of governance of a nation is mutable and not easy to define. --Ludwigs2 21:58, 23 March 2010 (UTC)[reply]
The British monarchy hasn't had the crown transfer through direct killing since 1485 (Henry VII killed Richard III). It's jumped around a bit due to people dying without issue, and the name of the ruling house has changed when we've had queens, but it's been fairly peaceful (not entirely peaceful, of course!). --Tango (talk) 00:34, 24 March 2010 (UTC)[reply]
The UK has been a constitutional monarchy since 1688 - that's 322 years and counting! (touche!) --TammyMoet (talk) 22:05, 23 March 2010 (UTC)[reply]
And the royal line of succession was established in 1066, which is a pretty good run also. ←Baseball Bugs What's up, Doc? carrots22:14, 23 March 2010 (UTC)[reply]
Baseball Bugs -- the current root of genealogical legitimacy was established in 1066, but the principle that the crown could pass through a female line of descent wasn't established until the Stephen-Matilda wars, and the principle that the crown could pass to a woman wasn't really established until Mary I in 1553, and the current principles of succession are laid down in an act of Parliament passed in 1701, so it's hard to say how the "line of succession" was established in 1066. AnonMoos (talk) 00:51, 24 March 2010 (UTC)[reply]
The line of Emperors of Japan claims to have been founded in 660 BC. Even if you shave off the 50-odd years since it became a constitutional monarchy after World War II, it's still some 2600 years. The Icelandic Althing is one of the oldest parliamentary traditions. --Kvasir (talk) 22:33, 23 March 2010 (UTC)[reply]
Egypt was ruled by pharaohs for three millennia. See History of Egypt. —D. Monack talk 23:23, 23 March 2010 (UTC)[reply]
Pharoahs of many dynasties, though. AnonMoos (talk) 00:51, 24 March 2010 (UTC)[reply]
European and British feudalism lasted many centuries, maybe more than a millennium--I guess the Norman conquest just switched England from one line of feudal rulers to another, but the system itself arguably started centuries before William the Conqueror and persisted till the 1700s or so. 66.127.52.47 (talk) 23:39, 23 March 2010 (UTC)[reply]
The general trend in recent historical studies has been to very narrowly and precisely define the word "feudalism", so that the period of classic feudalism defined in that way doesn't last all that long... AnonMoos (talk) 00:51, 24 March 2010 (UTC)[reply]

Marahasta

... Nagar-Aveli was given to the Portuguese as a compensation for the sinking of a Portuguese ship by the Maratha navy.

Was this one of the requirements of the treaty? What is the name for "the requirement of a treaty"? What is this treaty? What was this war? 174.3.113.245 (talk) 21:15, 23 March 2010 (UTC)[reply]

What treaty? The Maratha navy sank two ships, according to this source, and Portugal threatened war; the Maratha Empire offered the territory to avoid war. --jpgordon::==( o ) 22:47, 23 March 2010 (UTC)[reply]

Violin Sheet music for "hey soul sister" by Train

Is there any way I can find violin sheet music for this? I've looked everywhere, but they have it just for piano, vocals and guitar. —Preceding unsigned comment added by Iluvgofishband (talkcontribs) 23:08, 23 March 2010 (UTC)[reply]

Have you a particular reason to think that somebody has arranged it for fiddle? It's quite possible that nobody has. --ColinFine (talk) 23:38, 23 March 2010 (UTC)[reply]


March 24

Health care law

If the Supreme Court rules that Congress can't mandate that all individuals buy health insurance, would the entire law be invalidated? --70.250.214.164 (talk) 00:06, 24 March 2010 (UTC)[reply]

I'd expect the legislation has a severability clause, so if the court knocked out part of it, the rest would stay in force. But I haven't checked. 66.127.52.47 (talk) 00:11, 24 March 2010 (UTC)[reply]
The suit alleges that the mandate is central to the legislation and so, unseverable. If this argument carries the day, the legislation would be invalidated in its entirety. (And, of course, if not, not.) - Nunh-huh 01:30, 24 March 2010 (UTC)[reply]

procrastination trick

If I've got to do something and I'm putting it off, I have a really bad habit of purposely avoiding looking at the clock so I don't notice how late it's getting. Sort of a form of denial--if I'm not aware of it, it's not really happening. Is there a name for that trick of not looking at the clock? I'm thinking of programming my computer to make a voice announcement of what time it is (or play chimes or something) every 15 minutes or so through the day, to undo the habit. 66.127.52.47 (talk) 00:17, 24 March 2010 (UTC)[reply]
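The announcement idea in the question can be sketched as a short script. A minimal Python version follows; the macOS `say` command is an assumption here, and any local text-to-speech or sound-playing command could be substituted:

```python
# A small sketch of a periodic time announcer: speak the current time
# every 15 minutes. The "say" command is a macOS assumption; replace it
# with whatever text-to-speech or chime command your system provides.
import subprocess
import time
from datetime import datetime

def announcement(now):
    """Build the sentence to be spoken for a given datetime."""
    return now.strftime("It is now %I:%M %p")

def run(interval_minutes=15):
    while True:
        subprocess.run(["say", announcement(datetime.now())])
        time.sleep(interval_minutes * 60)

if __name__ == "__main__":
    run()
```

On a desktop system, a cron job or scheduled task firing every 15 minutes would do the same job without a long-running loop.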

"Burying ones head in the sand" describes it quite well. --Tango (talk) 00:44, 24 March 2010 (UTC)[reply]

Break Even

Hello. When a firm wants to find an output level and a selling price per unit to break even, the firm would like to maximize profits by finding the intersection of the marginal cost, marginal revenue, and average total cost curves. Why would the average total cost curve be minimal at this point? Thanks in advance. --Mayfare (talk) 00:53, 24 March 2010 (UTC)[reply]

This sounds like mathematics. Well, you have

FC = Fixed Cost
MC = Marginal Cost (per item)
SV(price) = Sales Volume for each price

Profit(price) = price * SV(price) - FC - (MC*SV(price))

Profit(price)/item = (price * SV(price) - FC - (MC*SV(price)) )/ SV(price)

Profit(price)/item = price - ( FC/SV(price) ) - MC

Average Cost per item = ( FC/SV(price) ) + MC

So we have

Profit(price)/item = price - Average Cost per item

So you see it's very easy. All you have to do is find the mathematical function SV(price), the amount sold at each price. 122.107.207.98 (talk) 01:29, 24 March 2010 (UTC)[reply]
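The derivation above can be checked numerically. In the Python sketch below, the linear demand function sv() is a made-up assumption purely for illustration; with it, the break-even price, where profit per item crosses zero and price equals average cost per item, can be found by a simple scan:

```python
# Numeric check of the break-even formulas:
#   Profit(price)/item = price - (FC / SV(price)) - MC
#   Average cost/item  = (FC / SV(price)) + MC
FC = 1000.0   # fixed cost
MC = 2.0      # marginal cost per item

def sv(price):
    """Assumed sales volume at a given price (hypothetical linear demand)."""
    return max(0.0, 500.0 - 40.0 * price)

def avg_cost_per_item(price):
    return FC / sv(price) + MC

def profit_per_item(price):
    return price - avg_cost_per_item(price)

# Scan candidate prices (3.00 to 6.99 in steps of 0.01) and pick the one
# where profit per item is closest to zero, i.e. the break-even price.
break_even = min(
    (p / 100.0 for p in range(300, 700)),
    key=lambda p: abs(profit_per_item(p)),
)
print(f"break-even price is roughly {break_even:.2f}")
```

Note that at the break-even price the identity Profit/item = price - Average cost/item holds by construction, so the scan is just locating where price and average cost per item coincide.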