|WikiProject Mathematics||(Rated B-class, Mid-priority)|
|WikiProject Statistics||(Rated B-class, Mid-importance)|
Contents
- 1 Merging with other articles
- 2 References to terrorism
- 3 Pareto Distribution: reference to Capitalism
- 4 (Comments 2003-2005)
- 5 external links
- 6 Merging Zipf, Power Law, and Pareto?
- 7 Financial market power laws
- 8 A How To Guide?
- 9 Weibull not power-law?
- 10 with k > 1
- 11 "Practical use"
- 12 Category: exponentials
- 13 Major revision of article
- 14 Accessability
- 15 Pictures please!
- 16 Polynomial?
- 17 Technical template: What does Power law mean?
- 18 Estimating the exponent from empirical data
- 19 Gutenberg-Richter example
- 20 clarification needed?
- 21 Error?
- 22 Unsourced variants
- 23 Power laws overused
- 24 Confusion - to a layman
- 25 Removing section 'Two point fitting method'
- 26 Controversy
- 27 Mainly about power law distributions
- 28 Missing citations to this article in the book "Multifractal Financial Markets"?
- 29 Definition of tail exponent different from literature
- 30 Reference "Neukum" probably wrong
- 31 s?
- 32 No average - little sense
- 33 Mistakes in text
- 34 Adding example of power law of CPU cache misses
- 35 Confusing exponent notation switches
Merging with other articles
I propose that our revised power law article become the redirect point for Fat tail and Heavy-tailed distribution. I've placed such a suggestion on the talk pages of each of these articles (which substantially overlap with the revised power law article, and with each other).
Paresnah 20:14, 13 March 2007 (UTC)
- Also, the Extreme value theory article, which mentions 'tail-fitting' but points to a non-existent article on the topic, should be connected to the Power laws subsection on estimating the tail exponent of power-law distributions. Paresnah 20:21, 13 March 2007 (UTC)
References to terrorism
I concur with the removal of the two references on terrorism; if the article isn't going to list all the citations that show power laws in weird systems, then it shouldn't privilege a few while ignoring others. So long as we keep the links to major review articles like Newman's up-to-date, we can let the academics keep track of which systems show power laws -- Paresnah 23:38, 28 August 2006 (UTC)
Pareto Distribution: reference to Capitalism
The Pareto distribution, for example, the distribution of wealth in capitalist economies
There's no mention of capitalism in the article on the Pareto distribution - the word capitalist seems redundant here, since in feudal societies there appears to have been a similar distribution, only with a higher exponent, and even in Romania, the USSR, and most other non-capitalist societies a similar distribution of wealth appears to have occurred. Any thoughts? (especially from someone who knows more about Pareto distributions!) --Dilaudid 08:10, 12 December 2006 (UTC)
- I believe you're right, the Pareto distribution of wealth is pretty universal although the exponent may change. Incidentally, although Pareto proposed a pure power law, nowadays people recognise that the income distribution is more like a Lévy distribution.
- Incidentally I think that the Pareto distribution article is pretty parochial, in the sense that they consider "Pareto distribution" to refer to power law probability distributions in general. That's very much an economics point of view. It would be better to merge the technical power-law material away from that article into the present one, and rewrite the Pareto distribution article to talk just about income, making reference to Mandelbrot's article on Pareto-Levy law, etc. —WebDrake 23:26, 12 December 2006 (UTC)
Should it be y = ax^k, where a is a constant? -- Heron
- Yes, I believe so, since Zipf's law is a power law and has a probability distribution of the form y = ax^(-b). -- Lexor 22:37, 18 Aug 2003 (UTC)
- A more general relationship would be y ~ x^k (i.e. y "goes like" x^k) rather than assuming a literal equation like the above; e.g. consider the equation y = ax^k + c where c is constant. WebDrake 22:31, 7 October 2005 (UTC)
Minor edit: I changed the internal link on self-organized criticality to point to that page, instead of just self-organization; and I updated the xxx.lanl.gov external link so that it points to the abstract of the paper rather than just the PDF (this allows a user to select download file type). WebDrake 22:31, 7 October 2005 (UTC)
"One rule of thumb, however, is if the distribution is straight on a log-log graph over 3 or more orders of magnitude." => This sentence has an "if" part, but it is missing a "then" part. Is it supposed to mean:
"One rule of thumb, however, is if the distribution is straight on a log-log graph over 3 or more orders of magnitude, then it is a power law distribution."
I have deleted some external links that were unrelated to this subject and could not be considered external sources of information on power laws. There are lots of published and unpublished scientific works that can be related in some way to the application of power-law techniques, but do not contribute any knowledge about the generic properties of power laws or about generic applications and methods. 20:31, 21 February 2006 (UTC)
- Sorry but I don't agree with 18.104.22.168 about this. Why does the Wikipedia article have to discuss only generic properties? Referring readers of the article to a variety of sources which illustrate the diversity of cases in which power laws occur seems helpful and constructive to me. -- JimR 03:06, 26 February 2006 (UTC)
Since there's been no further discussion on this, I've now restored the previously removed links. -- JimR 09:19, 9 March 2006 (UTC)
The section on estimating the exponent contains a reference to some article by Goldstein et al. which has apparently been removed from the bibliography section. I am no expert on the subject, but either this part of the article should be changed or the reference put back in, the status quo is rather irritating. 22.214.171.124 09:14, 24 September 2007 (UTC)
Merging Zipf, Power Law, and Pareto?
This article, the Zipf's Law article, and the Pareto distribution article have significant overlap, for obvious reasons. Is it worth trying to create a central article for this set of material? --Experiment123 02:38, 8 March 2006 (UTC)
- I don't think so. Zipf's Law is a particular case from linguistics, and Power laws are not exclusively statistical, but also found in natural sciences. By the way, there is also overlap or a common theme with e.g. 80-20 rule, Lorenz curve and The long tail.--Niels Ø 18:57, 8 March 2006 (UTC)
- I re-read the above, and yes, I do actually think a central article clarifying the relationship between these various topics would be good, but no, I don't think the articles should be merged. In fact, a while ago I created an article listing these articles, and linked each article to the list instead of to all the other articles. I never found a good name for my list, though, and it was afd'ed.--Niels Ø 09:49, 9 March 2006 (UTC)
- Yes, that's the way I was thinking, too: relatively brief articles for Zipf's Law, Power Law, Pareto, etc., which all contain links to a meatier piece that can discuss shared concepts and talk about the relationships between the subjects. I see what you mean about the name, though. "Power Law Distributions" maybe? Do you have your old article around still somewhere? --Experiment123 12:34, 9 March 2006 (UTC)
Financial market power laws
The article refers to Mandelbrot and Taleb as "recently" popularising the presence of power laws in financial markets. Actually Mandelbrot's statistical analysis of price fluctuations, showing power law tails ("fat tails"), dates back to---and was well-known in---the 1960s. There's also the Pareto law for the distribution of income, which is another fat tail distribution.
More generally the article needs to be more clear about distributions/scalings which are power law for only part of the range.
I don't have time to make the change now but will try to add some material in the near future. —WebDrake 09:57, 29 April 2006 (UTC)
A How To Guide?
Thank you all - this is the best clearing house of information on Power Laws I've found. I don't have the most technical of backgrounds, so forgive me if these questions would better be asked elsewhere...
At what point have I confirmed the presence of a power law?
1. I've got my nice straight log-log graph done up in Excel. The y-axis of log of probability runs to -3.5, but my x-axis only carries out to 1.5 (my data set n ~ 1,500).
2. This leads me to my next question, do I have to have a slope of ~-1 for a power law to be shown?
3. My R^2 is 0.9, are there any special considerations for evaluating correlation coefficients for power laws?
How would I compare two power law distributions?
1. My current results give me the equation of f(y) = -0.06x^(-2.4) - What would it show if over time my a and k values were to change?
If I have shown a power law in the data, this means...
1. The presence of a power law shows that there is no mean, and that standard Gaussian or Normal statistics should not be applied to this data set
2. Power laws are evidence of self-reinforcing systems
3. Power laws are evidence of small-world or scale-free networks
4. Power laws imply the presence of random behavior
5. Power laws are simply Cauchy distributions with an asymptote at the mean
6. In a rank order setting the farthest right nodes could be considered to be either / both the oldest and most crucial to a network
Thanks again -- Flybrand 16:43, 30 April 2006 (UTC)
- A quite good how-to relating to power laws is found in M. E. J. Newman, "Power laws, Pareto distributions and Zipf's law", Contemporary Physics 46, 323–351. Very brief reply to your questions:
- Presence of power laws. Accurately checking for power laws basically comes down to one thing: having enough data. Too little may cause you to not see the law, or get the wrong exponent. When you're binning the data for plotting, you will need to put it in logarithmic bins, otherwise the tail of the distribution will get very messy.
- I don't understand what you mean about "a slope of ~-1". Power laws come with many different possible exponents.
- Most importantly, standard regression tests are not good for estimating exponents. There's a method described in the Newman paper which is far preferable. Another method of testing involves scaling for systems of different sizes [see e.g. the two papers by Lise and Paczuski (2001)].
- Comparing two power-law distributions. I would forget about the value of the prefactor a and concentrate on the exponent k. If you're running simulations, you may see that the exponent is lower, or the distribution not really power-law like, if you do not let the system go through a transient period before taking data. Ditto if you have too few data points. If the negative exponent is higher (e.g. if you have an exponent of -2 instead of -1) you will need more data points, as the larger events are more unlikely. This is particularly important with e.g. Lévy distributions where the power law is only the tail and most points are bunched in the initial peak.
- If you have shown a power law in the data, then ... what you can infer tends to be context-dependent. But to go through your cases: (1) whether or not there is a finite mean depends on the exponent. If the exponent is less than -2 a mean exists. If it's greater (e.g. -1.5) the mean is infinite. You can tell this just by basic probability theory. I don't know what you mean by "applying standard Gaussian or Normal statistics". (2) I don't know what you mean by self-reinforcing. (3) Depends on whether you're looking at a network or something else. (4) No, not at all, you can generate power laws by purely deterministic means. (5) I don't know what you mean, but I suspect no. (6) Depends on what system you're looking at. In e.g. the Barabasi-Albert model of scale-free networks, yes, to the "oldest". In other systems, not at all. "Most crucial" is something it's not possible to talk about in general.
- These answers were written in a hurry so hope they make sense. I suspect you need to find a complex-systems physicist to work with who can sit down and supervise your work and lead you through things gently. :-)
- —WebDrake 14:37, 18 July 2006 (UTC)
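A minimal sketch of the logarithmic-binning advice above, in Python. The exponent alpha = 2.5 and lower cutoff xmin = 1 are illustrative choices, not values from the discussion:

```python
import math
import random

random.seed(42)

# Inverse-transform sampling for a density p(x) proportional to x**(-alpha),
# x >= xmin: since P(X > x) = (x/xmin)**(1 - alpha), we can set
# x = xmin * (1 - u)**(-1/(alpha - 1)) with u uniform on [0, 1).
alpha, xmin, n = 2.5, 1.0, 100_000
samples = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha - 1.0)) for _ in range(n)]

# Sanity check: the empirical tail matches the analytic exceedance probability.
frac_above_10 = sum(1 for x in samples if x > 10) / n
analytic = 10.0 ** (1.0 - alpha)  # (10/xmin)**(1 - alpha), about 0.0316

# Logarithmic bins: edges grow geometrically, so the sparse tail still gets
# several samples per bin, instead of the mostly-empty bins linear binning gives.
n_bins = 20
xmax = max(samples)
edges = [xmin * (xmax / xmin) ** (i / n_bins) for i in range(n_bins + 1)]
counts = [0] * n_bins
log_range = math.log(xmax / xmin)
for x in samples:
    i = min(int(n_bins * math.log(x / xmin) / log_range), n_bins - 1)
    counts[i] += 1

# Normalised density per bin: count / (n * bin width).
density = [c / (n * (edges[i + 1] - edges[i])) for i, c in enumerate(counts)]
```

Plotting `density` against the bin midpoints on log-log axes would give the straight line discussed above; with linear bins the tail would be dominated by bins holding zero or one sample.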
Weibull not power-law?
I'm pretty sure the Weibull distribution is not an example of a power-law distribution, as the current article claims. When Weibull parameter k=1, it's the same as the exponential distribution, which is definitely not power-law. I'm fairly certain that Weibull isn't power-law regardless of the value of k, but in case my understanding is muddled, I wanted to give others a chance to comment. —The preceding unsigned comment was added by Agthorr (talk • contribs) 14 June 2006.
- Agreed: it is not a power law (although the formula for the failure or hazard rate given in the Weibull distribution article is a power law). I've removed the Weibull distribtuion from the examples, and put it in See also instead. -- JimR 05:31, 18 June 2006 (UTC)
with k > 1
I have added "with k > 1", as, e.g., with k = 1, dX/dt = aX^k describes just an exponential, with k = 0 we have a linear function, and so on. —The preceding unsigned comment was added by Athkalani (talk • contribs) 2006-07-13.
- Sorry but I don't think this is correct. There are significant cases where k < 1, for example, inverse-square laws and critical exponents. The cases k = 1 (not exponential but simple proportionality) and k = 0 (a constant function) are not very interesting, but they are still degenerate cases of power laws. I've reverted this change. -- JimR 04:24, 15 July 2006 (UTC)
- Athkalani appears to have confused the equation y = ax^k with the equation dX/dt = aX^k. It's not an issue. —WebDrake 21:59, 15 July 2006 (UTC)
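To illustrate the distinction numerically (a sketch with arbitrary values a = 0.5, X0 = 1): integrating dX/dt = aX^k with k = 1 reproduces exponential growth, which is a different object from the power-law function y = ax^k:

```python
import math

# Forward-Euler integration of dX/dt = a * X**k with k = 1.
a, k = 0.5, 1
x0, dt, t_end = 1.0, 1e-4, 2.0
x = x0
for _ in range(int(t_end / dt)):
    x += dt * a * x ** k

exact = x0 * math.exp(a * t_end)  # closed-form solution when k = 1
rel_err = abs(x - exact) / exact  # small: the growth really is exponential
```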
I've removed the comment that "It remains to be seen whether knowing that something follows a power law has a practical use". This seems to me just the sort of very silly POV thing said by people who are cynical about the whole complex-systems-science field. Knowing that something is power-law distributed is as practically useful as any other accurate description of nature. ---WebDrake 10:54, 13 July 2006 (UTC)
The very first sentence mentions that power laws are observed in many fields. No doubt, but some of the listings feel like they need a reference to me. War, terrorism? No doubt there is something in the categories of war and terrorism that does, indeed, follow a power-law distribution, but to juxtapose these with a list of sciences just felt POV to me. --126.96.36.199 11:40, 27 July 2006 (UTC)
- It does seem surprising. But there were already two external links in the article justifying the mention of war and terrorism. I've turned these into explicit reference notes to make the connection clear. -- JimR 05:07, 30 July 2006 (UTC)
- I think this is a style issue. The article jumps too quickly into "where power laws are observed" rather than telling you what they are first. As for the juxtaposition of "war and terrorism" versus sciences, power laws really are very ubiquitous — they are observed in hard sciences, biology, social sciences, the works. It would be better to make a short comment ("found in many situations in nature") initially and elaborate later: the trouble with making a list is that everyone and their mother wants to add their personal pet example (it's happening on the self-organized criticality page too). Will try to rewrite soon. —WebDrake 19:37, 3 August 2006 (UTC)
What's the justification for power laws being in the "exponentials" category? —WebDrake 09:30, 19 July 2006 (UTC)
Major revision of article
I've begun a major revision of the Power law article, which I've placed for the moment at Talk:Power law/Revised article. This is to enable two things:
- A gradual rewrite — it will take time to create a complete article of high quality.
- Collaboration — I do not wish to make such a large-scale change without scrutiny and others' input.
The aim of the rewrite is mainly to bring better structure to the article, and give a better overview of what power laws are. For example the present article's introduction starts by telling you where power laws are found, and only later gets round to describing power laws themselves.
I also propose to clear up a bit the cited references and external links, since it seems that some of the choices are a bit arbitrary.
I'm looking forward to people's thoughts and ideas about this. —WebDrake 18:28, 15 August 2006 (UTC)
- I tweaked a couple of parts of your draft, and added the maximum likelihood estimator for discrete power laws (from the Goldstein article). Generally, I like what you've done with the article. I'm a bit concerned that the "power laws in nature" section could get really lengthy, since power laws basically show up everywhere. -- Paresnah 23:42, 28 August 2006 (UTC)
- I'm delighted by your contributions, they are very helpful, and you really should feel free to add your own material as well as revise what I've written. Regarding power laws in nature, you're right, it could get huge, and there is a tendency for everyone to add their pet topic to a never-ending list (this is happening on the self-organized criticality page as well, which I will probably have to do another rewrite of at some stage). What I plan for that section is to give a general overview, some cautionary words about spurious "not-really-power"-laws, and some prominent examples as per the reviews cited. We might try to craft a policy regarding what qualifies for being added here, and create a "list of power laws observed in nature" page, or something like it, if too many people want to crowd in, so as to reserve the main page for a concise writeup. —WebDrake 09:54, 29 August 2006 (UTC)
- I've done a major revision of the proposed revised article. There are still some things to tidy up, particularly the sections on power-law functions, but I think this new version should be quite close to something we can use to replace the currently public version of the article (which, with every passing day, becomes more and more shameful). Paresnah 11:28, 11 March 2007 (UTC)
- Made another pass through the article to flesh out the sections that needed attention, and to more fully integrate its links with other related wikipedia articles. I think this revised version is now very close to being ready to post. Paresnah 03:31, 14 March 2007 (UTC)
- I replaced the old article with the new one (and made a couple of small adjustments). Paresnah 01:29, 17 March 2007 (UTC)
Following a couple of brief exchanges between Paresnah and me on our talk pages, it's perhaps best to continue the debate here. The discussion is on how to phrase comments relating to universality.
Anyway, here's what Paresnah wrote:
- I think that the section on universality needs to make a couple of things explicit. In some systems in physics, e.g., spin glasses, the universality of power-law behavior rests on firm empirical and theoretical grounds. For other systems, such as complex networks or various sociophysics applications, claims of universality seem to be mere speculation. It seems to me that a lot of statistical physicists who work in these areas have imported the familiar (to them) idea of universality, but are applying it in a way that may not be congruent with its original meaning. For instance, in statistical physics, critical exponents describe the functional behavior of a variable, while in these "complex systems", the exponent of a power law describes the probabilistic behavior of the variable. Even if we accept that the idea of universality is applicable to probability distributions, significant problems remain.
- For instance, universality requires the coincidence of the scaling exponents. But, in most of these "complex systems", the estimates of the scaling exponent's value are extremely coarse (this is at least partially due to the MLEs not being widely used). People often claim universality after seeing a power law with an exponent in the range between 2 and 3, rather than by precisely matching exponents as was done for the more traditional physics systems. Also, the interval from 2 to 3 is a huge range over which to claim a precise matching, and it's expecting a lot to assume that, as the fitting methods improve, the values will continue to match in a precise way among these disparate systems. And, there are a large number of stochastic mechanisms that all produce power law distributions with exponents in exactly the observed range, so there doesn't seem to be any reason that we should a priori expect any universality class, in the traditional sense, to apply to these probabilistic systems (more on this in a moment). One of the silliest claims of universality in a "complex system" is in the "scale-free networks" literature, concerning the structure of the Internet. Walter Willinger, David Alderson and their colleagues have criticized this claim extensively in a series of careful articles, such as this one.
- Finally, there are no validated theories that describe why we should expect to see power laws with certain exponents in these "complex systems". There are lots of interesting ideas, but researchers rarely do the necessary work to validate their models. Without such validation, it seems hazardous to try to generalize two systems into a universality class. Rather, similar behavior could just be a coincidence - perhaps two different mechanisms just happen to produce power law distributions with similar exponents, but, for some other reason, there is no way to put them in the same class of systems. (Or, similar behavior could be a data artifact.) Renormalization provided this deeper connection for many traditional systems, but there is not yet such a theory for any "complex systems". (At least, I'm not aware of one...) Michael Mitzenmacher has commented on some of these issues in a recent editorial for Internet Mathematics (available from his publications page). The problem is basically that people have been getting ahead of themselves (and what their data supports) in pushing into these new "complex systems" areas. I'm pretty sure that all of these issues will get sorted out eventually, but since there's already a lot of confusion in the "complex systems" scientific literature, I think we need to be careful in the wikipedia article not to repeat things as fact that are merely speculation.
- Now... I wonder how we can work this perspective into the article.
I would reply that I think it depends on what you're looking at. For example, in the case of self-organized criticality I don't see any reason to doubt that well-defined universality classes exist (and indeed a lot of work has been done here, although I don't think there is a complete theory at this point).
On the other hand as regards scale-free networks I can agree, there are various ways to tune exponents with these models and there is not really a coherent point of view yet. Part of the problem may be that they are rather abstract and are not "physical" in a conventional sense. But they can emerge from physical models (see e.g. the paper by Paczuski on the self-organized criticality page) which may help to resolve some issues in the future.
The issue of coarse-grained observation I think can be dealt with simply. In some empirical cases exponents are not firmly established so it's difficult to make statements. Some may not even be power laws!
That's a very brief reply that doesn't really do justice to your detailed comments, but I think we can deal with the issue quite simply: we refer to traditional thermodynamics and SOC, and then add some comments about "whether universality can be extended beyond these cases continues to be a matter of debate, and some have said ..."
I'll try to read more deeply on the papers you suggested (any more would also be welcome) and come up with some new text. —WebDrake 09:11, 12 September 2006 (UTC)
- Right, exactly. The problem here is, I think, that there are several different communities that all deal with power laws (and universality), and we need to be careful in the wikipedia article about the generality of its statements about them. That is, the article should probably note important differences between the meaning of power laws in different communities (e.g., functional versus probabilistic, or critical behavior versus other processes). In some places, such as SOC, the evidence of power laws and universality seems pretty clear (NB: I'm not an expert in SOC), while in others, like scale-free networks and terrorism, it's much less clear, and claims of universality are probably wrong (at least, I would say there's no a priori reason to believe them, and the empirical evidence seems poor). Paresnah 21:29, 12 September 2006 (UTC)
As a layman I find this article very inaccessible. The top paragraphs do not explain at any point in plain English what a power law is; they use a mathematical formula to do the explaining. I'd rework it but I just don't understand what you are trying to explain. —The preceding unsigned comment was added by 188.8.131.52 (talk) 01:52, 16 February 2007 (UTC).
- I'll second that. An example of the application might serve to illustrate what it is. Crag (talk) 18:20, 12 May 2008 (UTC)
a graph would be nice. Puddytang 01:38, 16 May 2007 (UTC)
I question the use of the term "polynomial" in the intro. Polynomials have integer exponents, whereas in power laws, as I understand them, the exponent can be any real number. The fact that k need not be an integer should at least be made clear.--agr (talk) 14:00, 4 March 2008 (UTC)
- You are correct. I have edited the article to reflect the proper definition of polynomial, removing, for example, "Typically, power-law functions are polynomials in a single variable..." Harold f (talk) 10:59, 18 August 2013 (UTC)
Technical template: What does Power law mean?
The first line states:
- I found an answer:
- All the phenomena discussed here obey a certain mathematical formula known as a power law. As they increase in scale, so they decrease in frequency. When the size of an earthquake doubles, it becomes four times as rare. When the size of a stock-market fluctuation doubles, it becomes 16 times as rare. The exact fraction does not matter; it is the general law that counts. What this law indicates, translated into English, is that there is no such thing as an average size for an earthquake or a stock-market fluctuation. There is no median point around which they all cluster.
- travb (talk) 01:12, 19 January 2009 (UTC)
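The arithmetic in the quoted passage can be checked directly: if frequency scales as size^(-alpha), doubling the size multiplies the frequency by 2^(-alpha), so "four times as rare" corresponds to alpha = 2 and "16 times as rare" to alpha = 4. A quick sketch:

```python
import math

def exponent_from_doubling(rarity_factor):
    """If doubling the event size makes events `rarity_factor` times rarer
    under frequency proportional to size**(-alpha), then 2**alpha equals
    rarity_factor, so alpha is the base-2 log of the rarity factor."""
    return math.log(rarity_factor, 2)

alpha_quake = exponent_from_doubling(4)   # "four times as rare" -> alpha = 2
alpha_stock = exponent_from_doubling(16)  # "16 times as rare"   -> alpha = 4
```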
The situation has gotten worse. The first line is now
- "A power law is a type of probability distribution"
Estimating the exponent from empirical data
The estimator attributed to Hall only applies to continuous data (not discrete), and this is not made very clear in the discussion ('real' is not necessarily 'continuous'). Also, there is discussion in the 'literature' about how this estimator does not work very well. I haven't been able to get it to work for me either. —Preceding unsigned comment added by 184.108.40.206 (talk) 17:53, 10 February 2009 (UTC)
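For continuous data, the maximum-likelihood estimator usually recommended in this literature (alpha_hat = 1 + n / sum ln(x_i / x_min), sometimes called the Hill estimator) is easy to sketch. Note this is a standard alternative, not the Hall estimator discussed above, and the alpha = 2.5, x_min = 1 below are illustrative choices:

```python
import math
import random

random.seed(0)

def mle_exponent(xs, xmin):
    """Continuous power-law MLE: alpha_hat = 1 + n / sum(ln(x_i / xmin))."""
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic data with a known exponent, via inverse-transform sampling:
# P(X > x) = (x/xmin)**(1 - alpha)  =>  x = xmin * (1 - u)**(-1/(alpha - 1)).
alpha, xmin, n = 2.5, 1.0, 50_000
xs = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha - 1.0)) for _ in range(n)]

alpha_hat = mle_exponent(xs, xmin)  # should land close to 2.5
```

Unlike a least-squares fit on a log-log histogram, this estimator uses every sample directly and has no binning parameters to tune.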
This is given as an example of a power law, when, in fact (as the Wiki link even discusses), it is an exponential law. —Preceding unsigned comment added by 220.127.116.11 (talk) 21:28, 10 February 2009 (UTC)
Please forgive me, as I have never tried to ask a question or add to a discussion before. I am also not a mathematician, so forgive me if I am misunderstanding a technical concept. However, in reading the introduction to this article, I came across the examples of "frequencies of family names" and "frequencies of words in most languages". I have 2 questions with regard to these examples:
1. While the example of the number of cities vs. population size is clear, it is unclear (at least to me) what attribute of the family name or word is being raised to a power to get the frequency. Is it the relationship between the number of family names that share a frequency (e.g. only one name (Brown, for example) has a frequency of a, while 10 different names have a common frequency of b, such that they would be represented by points on the graph as 1 to the a and 10 to the b, or to some constant power, 1 to the c and 10 to the c)? Or is the name or word itself the "attribute"? Or does it have to do with the length of the name or word, or the ordinal value of the placement of the letters in the alphabet?
2. If the name or word itself is the "attribute", is it accurate to say that it is an example of the power law (rather than just something which mimics it)? Because you are not really raising any attribute of the name or word to a power to obtain its frequency; the frequency is merely distributed in a similar fashion. One could not, for example, take a new family name or word ("xzerq", for example) and use the power law to calculate its frequency. If you cannot use the law to calculate the frequency, how can it be said to follow the law?
Again, my apologies if this is a ridiculous question. I just didn't understand, or wasn't able to reconcile, the examples with the explanation when I read it, so I assumed other non-mathematicians might have the same problem.
It is written on the page that Student's t-distribution (continuous), of which the Cauchy distribution is a special case, is an example of a power-law distribution. Is this true? phauly (talk) 13:14, 15 September 2010 (UTC)
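On the question above: the standard Cauchy density 1/(pi(1 + x^2)) is not a pure power law, but its tail falls off like x^(-2), which is why it is commonly listed among power-law-tailed distributions. A quick check against the closed-form CDF:

```python
import math

def cauchy_sf(x):
    """Exact survival function of the standard Cauchy distribution:
    P(X > x) = 1/2 - arctan(x) / pi."""
    return 0.5 - math.atan(x) / math.pi

# For large x the exact tail approaches 1/(pi*x), i.e. the density tail
# falls off like x**(-2) -- a power-law tail, though not a pure power law.
x = 100.0
exact_tail = cauchy_sf(x)
power_law_tail = 1.0 / (math.pi * x)
ratio = exact_tail / power_law_tail  # close to 1 for large x
```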
The editor who added the new section writes "I googled the names from memory and adapted one of the first results. These terms are quite common in astrophysics. I'm not sure which kind of source would be suitable though, so feel free to add whatever you find relevant. Skippy le Grand Gourou (talk) 16:48, 24 September 2010 (UTC)" We should work on that... Dicklyon (talk) 18:17, 24 September 2010 (UTC)
- Note that the power law with an exponential cutoff is already defined under Power law#Power-law distributions (without citation either…), so some merging or reorganization might be wise. Skippy le Grand Gourou (talk) 20:31, 24 September 2010 (UTC)
Power laws overused
Clauset et al. show that their forest fire data can NOT reasonably match a power law. I'd like to see a mention of this recent backlash against the everything's-a-power-law-give-Per-Bak-a-Nobel-Prize trend. —Preceding unsigned comment added by 18.104.22.168 (talk) 23:55, 11 November 2010 (UTC)
Confusion - to a layman
I am trying to determine if Power Laws are applicable to some research I am doing.
I have therefore read this article as part of my education on power laws.
There are a few things that would help me.
1 - If the graph near the top of the article was somehow related to the formula. As far as I can tell, the exponent for the graph in the article is -1. It took me a while to figure this out - since the formulas in the article that define Power Laws do not say that typically the value of the exponent will be approximately -1.
2 - It would be nice to have a worked-out example. An example (preferably one each from physics and statistics) that has a graph, then a log-log graph, and then the actual values of the parameters. The graphs should have labelled axes. I have seen too many articles on Power Laws that do not label the axes.
For the statistics example, maybe also a rank-ordering version of the graph, to connect it to Zipf's law. Size of cities or any of the standard web-usage examples would work well.
3 - The article uses the symbol ~, and I do not know what it means. What does it mean? Is there an article somewhere that explains it?
4 - The more explanation of what are thought to be the mathematical mechanisms that create power-law statistical distributions, the better. I am looking at a boatload of data that may have a power-law distribution to it (the jury is out). Understanding the underlying mechanisms that generate power-law (and other) distributions would be of great help. As an engineer, I may be able to create better products once I understand the underlying mechanisms that are at play. — Preceding unsigned comment added by Ideality (talk • contribs) 07:19, 13 December 2010 (UTC)
- The graph is not an accurate representation for any particular power, but somewhere near -1 might be close; -0.5 might be closer.
- Good idea on an illustration.
- The sim symbol means "is similar to" or "goes sort of like".
- There are probably not "mathematical mechanisms" that create power laws. They are descriptions of data from physical processes, which are probably better explained in terms of the processes than of the math; see Stevens' power law, for example, a description of psychophysical data. And keep in mind that this is about approximating data, not about physical laws, in most cases; the "law" really applies to the mathematical function, not to its being a physical law. If you just search for "power law" in books and articles, you'll find lots of places where it has been found useful.
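A minimal sketch of the worked example requested in point 2, assuming a pure power law y = ax^k with illustrative constants a = 1 and k = -1 (chosen to match the exponent the graph appears to show, not taken from the article): on log-log axes a power law is a straight line, so fitting a line to (log x, log y) recovers the exponent as the slope.

```python
import numpy as np

# Hypothetical worked example: a pure power law y = a * x**k
# with a = 1 and k = -1 (illustrative values, not from the article).
a, k = 1.0, -1.0
x = np.logspace(0, 3, 50)          # x from 1 to 1000
y = a * x**k

# On log-log axes a power law is a straight line:
#   log y = log a + k * log x,
# so a linear fit to (log x, log y) recovers the exponent k as the slope.
slope, intercept = np.polyfit(np.log(x), np.log(y), 1)
print(round(slope, 6))             # -1.0: the fitted slope is the exponent k
```

The same fit applied to real data only gives a rough estimate; the article's section on estimating the exponent discusses better methods.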
Removing section 'Two point fitting method'
The section “Two Point Fitting Method” has been deleted repeatedly by user Derek farn, on the basis of vague considerations. I maintain that it constitutes a constructive contribution to Wikipedia, as it involves new methodologies. I want to point out that reverting a contribution should be done only on the basis of comments concerning the content of the contribution itself, as stated in Wikipedia:Dispute resolution#Focus on content and especially in Wikipedia:Dispute resolution#Discuss with the other party. In this case, the motivation for reverting can be classified as an “ad hominem” comment (which is considered inappropriate). Furthermore, I maintain that terms such as “unknown significance” or “ambiguous quality” are inappropriate when dealing with theoretical topics. A theoretical proof can only be correct or incorrect. So I invite user Derek farn to deal only with article content and, at least, to ask for a third opinion before removing this section again. Structuralgeol (talk) 15:38, 25 May 2011 (UTC)
- The contribution of "new methodologies" would appear to conflict with WP:NOR. Dicklyon (talk) 04:26, 4 June 2011 (UTC)
- Dear Dicklyon, thanks for your interest and suggestion. I cite here a piece of WP:NOR: "...Wikipedia articles must not contain original research. The term 'original research' (OR) is used on Wikipedia to refer to material—such as facts, allegations, ideas, and stories—for which no reliable published source exists...". The contribution in question refers to published sources, available online and (for free) in university libraries. With my regards, Structuralgeol (talk) 16:53, 16 June 2011 (UTC).
- Dear Structuralgeol, the cited articles are generally only freely available to university research staff. Perhaps you could convince Guerriero et al to make them freely available on a web site. I see that the Poisson distribution article is also being used to promote these Guerriero et al research papers.
- I look forward to you responding to my statement on the power law median page Wikipedia:Mediation_Cabal/Cases/2011-05-08/Power_law. Derek farn (talk) 18:06, 16 June 2011 (UTC)
Recently the accuracy of several power law claims has been called into question. This is still an area of active research, and the results are unclear. This leads to several editors adding power law examples only to have others delete them due to controversy (real or not).
To avoid confusion to those who may only want a broad overview, I propose creating a section on controversial power laws. This will inevitably make some people upset - "but my favorite power law is real/was accepted to a prestigious journal, etc." However, in keeping with Wikipedia's goals of accessibility and accuracy, I think it's best to include only the clearest of examples, rather than mixing in the clear power laws with the contested ones.
I'd say an example/method is contested if there are reasonable published academic articles claiming for/against the power law status. If your favorite example falls into this category, tough, unless it's abundantly clear that the criticisms are false. The other option is to just ax all controversial claims entirely, but I don't think that's entirely fair.
Finally, I think a bad criterion in this case would be to ask a single expert, since the experts still vehemently disagree on power laws in many fields, and egos, careers, and dollars are on the line. Octochimps (talk) 06:37, 6 July 2011 (UTC)
Mainly about power law distributions
In its present form, the article is mainly about power law distributions. Scaling law redirects here, but simple scaling law relations are hardly explained, like weight being proportional to the cube of the size of animals. /Pieter Kuiper (talk) 08:13, 15 November 2012 (UTC)
Missing citations to this article in the book "Multifractal Financial Markets"?
How interesting: In the book "Multifractal Financial Markets - An Alternative Approach to Asset and Risk Management" written by Ms Yasmine Hayek Kobeissi (Springer, New York, 2013) we can read the following on page 8:
- "Power laws are interesting because of their scale invariance. That is, given a relation f(x) = ax^k, scaling the argument x by a constant factor c causes only a proportionate scaling of the function itself."
In the Wikipedia article it says:
- The main attribute of power laws that makes them interesting is their scale invariance. Given a relation f(x) = ax^k, scaling the argument x by a constant factor c causes only a proportionate scaling of the function itself. That is, f(cx) = a(cx)^k = c^k f(x) ∝ f(x); scaling by a constant c simply multiplies the original power-law relation by the constant c^k.
There is no reference in the book to Wikipedia.
It might however also quite well be that the author actually inserted these lines herself into Wikipedia before using them also in the book. Especially if some author has studied a certain topic in depth, it is not so unlikely that he/she also has written about it in Wikipedia. And once a good wording is put together in one's mind to explain something, this can repeatedly appear in presentations, articles etc. 22.214.171.124 (talk) 11:49, 27 August 2013 (UTC) e_l_
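The scale-invariance identity quoted in both texts can be checked numerically; this is a sketch with arbitrary illustrative values of a, k, and c (none of them taken from the article or the book).

```python
# Numeric check of the scale-invariance identity
#   f(c*x) = a*(c*x)**k = c**k * f(x),
# i.e. scaling the argument by c only rescales the function by c**k.
a, k, c = 2.0, -1.5, 10.0          # arbitrary illustrative values

def f(x):
    return a * x**k

for x in [1.0, 3.0, 7.0]:
    assert abs(f(c * x) - c**k * f(x)) < 1e-12

print("f(c*x) == c**k * f(x) holds for all tested x")
```

Note that no function of x other than a power law satisfies this identity for every c, which is why scale invariance characterizes power laws.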
Definition of tail exponent different from literature
The article defines the asymptotic exponent α via the density, p(x) ∼ x^(−α), rather than via the survival function, P(X > x) ∼ x^(−α). It is consistent but can be confusing, as it is not customary in the literature. Limit-theorem (talk) 10:32, 1 December 2013 (UTC)
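The gap between the two conventions can be made concrete with a small numerical sketch (illustrative α = 2, not the article's notation): for a Pareto-type law whose survival function is P(X > x) = x^(−α) on x ≥ 1, the density is p(x) = α x^(−α−1), so log-log slope fits of the two functions give exponents that differ by exactly one.

```python
import numpy as np

# Illustrative Pareto law with survival function P(X > x) = x**(-alpha)
# for x >= 1 (alpha = 2 chosen arbitrarily). Its density is
# p(x) = alpha * x**(-alpha - 1), so the two "tail exponent" conventions
# differ by one.
alpha = 2.0
x = np.logspace(0, 2, 40)
survival = x**-alpha
density = alpha * x**(-alpha - 1)

s_slope = np.polyfit(np.log(x), np.log(survival), 1)[0]
d_slope = np.polyfit(np.log(x), np.log(density), 1)[0]
print(round(s_slope, 6), round(d_slope, 6))  # -2.0 -3.0: exponents differ by one
```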
Reference "Neukum" probably wrong
s?
The section on empirical examples "introduces" s. This is unclear, given the grammar of the sentence. Is s a crime type? What is this paragraph trying to say in plain English? 126.96.36.199 (talk) 04:46, 11 July 2014 (UTC)
As an independent reader, I agree with this problem and the need for it to be edited. In general, the sentences on crime as an example are problematic and seem out of place. --188.8.131.52 (talk) 14:24, 7 August 2014 (UTC)
No average - little sense
The section entitled "No average" poses a question that isn't answered [the billionaire enters the room, and the mean skyrockets; the median moves a smidgeon; so what?] and then veers from income to car exhaust. 184.108.40.206 (talk) 04:51, 11 July 2014 (UTC)
I agree that the "thought experiment" makes no sense as it stands. The thought experiment just shows that if the richest person in the world enters a room of people, the average wealth in the room would obviously go way up. This is true for many different distributions of wealth and has nothing to do with the existence of an expected value for the mean or variance (both of which exist in this case because there are only finitely many people) or with power laws. 220.127.116.11 (talk) 14:59, 25 November 2016 (UTC)
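The statistical point the "No average" section is presumably after can be illustrated with a small simulation; this is a sketch with an arbitrary tail exponent of 0.8 (for which the theoretical mean of the classical Pareto distribution is infinite): across sample sizes the sample median stays put while the sample mean is erratic and far larger, so reporting an "average" conveys little.

```python
import numpy as np

# Samples from a classical Pareto distribution with x_min = 1 and tail
# exponent alpha = 0.8 (illustrative choice; the mean is infinite for
# alpha <= 1). NumPy's pareto() draws from the Lomax distribution, so
# adding 1 shifts it to the classical Pareto with support [1, inf).
rng = np.random.default_rng(0)
alpha = 0.8
for n in [10**3, 10**5]:
    sample = rng.pareto(alpha, size=n) + 1.0
    # The median is stable near the theoretical 2**(1/alpha) ~= 2.38,
    # while the mean is dominated by a few huge draws.
    print(n, round(np.median(sample), 2), round(sample.mean(), 1))
```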
Mistakes in text
Someone mistook P(X>x) for p(x), i.e. the survival function for the density. For L(x), the slowly varying function, to apply, one needs to correct this. There is also confusion between the parameter α and the exponent k. Limit-theorem (talk) 13:20, 25 August 2016 (UTC)
Adding example of power law of CPU cache misses
I am adding the power law of CPU cache misses as an example in this article. It has been experimentally shown since the 1970s that the cache misses follow an exponential relationship with the size of the cache.Ysrajwad (talk) 14:12, 18 October 2016 (UTC)
- An exponential relationship is not a power law. Please explain. Limit-theorem (talk) 10:40, 19 October 2016 (UTC)
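The distinction raised here can be made concrete with a quick numerical sketch (illustrative functions, not the cache-miss data): a power law is exactly a straight line on log-log axes, while an exponential is visibly curved there, so a line fit to (log x, log y) leaves near-zero residual only for the power law.

```python
import numpy as np

# Compare a power law and an exponential on log-log axes
# (illustrative functions chosen for the demonstration).
x = np.linspace(1.0, 50.0, 100)
power = x**-2.0               # power law: straight on log-log axes
expo = np.exp(-x / 10.0)      # exponential: curved on log-log axes

def loglog_residual(y):
    """Max deviation of log y from the best straight line in log x."""
    coeffs = np.polyfit(np.log(x), np.log(y), 1)
    fit = np.polyval(coeffs, np.log(x))
    return np.max(np.abs(np.log(y) - fit))

print(loglog_residual(power) < 1e-9)   # True: exact straight line
print(loglog_residual(expo) > 0.1)     # True: visibly curved
```

So data that are linear on semi-log axes (log y vs x) but curved on log-log axes indicate an exponential relationship, not a power law.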
Confusing exponent notation switches
The mathematical representation first encountered in the article defines the exponent as -k and the scaling factor as a. Unfortunately, the very next section (2.2) redefines the exponent as -a and omits any scaling factor, without much ado. For the casual but interested reader, this leads to some confusion when trying to make sense of some following comments about a. This type of switcheroo happens again later in section 3.0, where three definitions in the one section have the exponent as k twice and as α once.
I am going to unify all of these places to k. This might also help correct some of the confusion between a and α mentioned elsewhere on this talk page. In the more mathematically rigorous later sections (3.2 Variants and beyond), I am leaving α as is, as it seems to be used consistently there. --Se7ens (talk) 02:58, 27 March 2017 (UTC)