An example power-law graph, used to demonstrate ranking of popularity. To the right is the long tail, and to the left are the few that dominate (also known as the 80–20 rule).
In statistics, a power law is a functional relationship between two quantities, where one quantity varies as a power of another. For instance, the number of cities having a certain population size is found to vary as a power of the size of the population. Empirical power-law distributions hold only approximately, or over a limited range.
Contents

1 Empirical examples of power laws
2 Properties of power laws
2.1 Scale invariance
2.2 No average
2.3 Universality
3 Power-law functions
3.1 Examples of power-law functions
3.2 Variants
3.2.1 Broken power law
3.2.2 Power law with exponential cutoff
3.2.3 Curved power law
4 Power-law probability distributions
4.1 Graphical methods for identification
4.2 Plotting power-law distributions
4.3 Estimating the exponent from empirical data
4.3.1 Maximum likelihood
4.3.2 Kolmogorov–Smirnov estimation
4.3.3 Two-point fitting method
4.3.4 R function
5 Validating power laws
6 See also
7 Notes
8 Bibliography
9 External links
Empirical examples of power laws
The distributions of a wide variety of physical, biological, and man-made phenomena approximately follow a power law over a wide range of magnitudes.

External links

Zipf's law

Zipf, Power Laws, and Pareto – a ranking tutorial

Gutenberg–Richter law

Stream Morphometry and Horton's Laws

Clay Shirky on Institutions & Collaboration: power laws in relation to internet-based social networks

Clay Shirky on Power Laws, Weblogs, and Inequality

"How the Finance Gurus Get Risk All Wrong" by Benoit Mandelbrot & Nassim Nicholas Taleb. Fortune, July 11, 2005.

"Million-Dollar Murray": power-law distributions in homelessness and other social problems; by Malcolm Gladwell. The New Yorker, February 13, 2006.

Benoit Mandelbrot & Richard Hudson: The Misbehaviour of Markets (2004)

Philip Ball: Critical Mass: How one thing leads to another (2005)

"Tyranny of the Power Law" from The Econophysics Blog

"So You Think You Have a Power Law – Well Isn't That Special?" from Three-Toed Sloth, the blog of Cosma Shalizi, Professor of Statistics at Carnegie Mellon University.

Simple MATLAB script which bins data to illustrate power-law distributions (if any) in the data.

The Erdős Webgraph Server visualizes the distribution of the degrees of the web graph on the download page.

Bibliography

Bak, Per (1997). How Nature Works. Oxford University Press. ISBN 0-19-850164-1.

Clauset, A.; Shalizi, C. R.; Newman, M. E. J. (2009). "Power-Law Distributions in Empirical Data". SIAM Review 51 (4): 661–703.

Laherrère, J.; Sornette, D. (1998). "Stretched exponential distributions in nature and economy: 'fat tails' with characteristic scales".

Mitzenmacher, M. (2004). "A Brief History of Generative Models for Power Law and Lognormal Distributions". Internet Mathematics 1 (2): 226–251.

Saichev, Alexander; Malevergne, Yannick; Sornette, Didier (2009). Theory of Zipf's Law and Beyond. Lecture Notes in Economics and Mathematical Systems, Volume 632. Springer. ISBN 978-3-642-02945-5.

Simon, H. A. (1955). "On a Class of Skew Distribution Functions".

Buchanan, Mark (2000). Ubiquity. Weidenfeld & Nicolson. ISBN 0-297-64376-2.

Stumpf, M. P. H.; Porter, M. A. (2012). "Critical Truths about Power Laws". Science 335: 665–666.

Notes

Newman, M. E. J. (2005). "Power laws, Pareto distributions and Zipf's law".

Humphries NE, Queiroz N, Dyer JR, Pade NG, Musyl MK, Schaefer KM, Fuller DW, Brunnschweiler JM, Doyle TK, Houghton JD, Hays GC, Jones CS, Noble LR, Wearmouth VJ, Southall EJ, Sims DW (2010). "Environmental context explains Lévy and Brownian movement patterns of marine predators". Nature 465 (7301): 1066–1069.

Klaus A, Yu S, Plenz D (2011). Zochowski, Michal, ed. "Statistical Analyses Support Power Law Distributions Found in Neuronal Avalanches". PLoS ONE 6 (5): e19779.

Albert, J. S.; Reis, R. E., eds. (2011). Historical Biogeography of Neotropical Freshwater Fishes. Berkeley: University of California Press.

Newman M. "Power laws, Pareto distributions and Zipf's law". Contemporary Physics 2005, 46, 323.

9na CEPAL Charlas Sobre Sistemas Complejos Sociales (CCSSCS): Leyes de potencias, http://www.youtube.com/watch?v=4uDSEs86xCI

G. Neukum and B. A. Ivanov, "Crater size distributions and impact probabilities on Earth from lunar, terrestrial-planet, and asteroid cratering data". In T. Gehrels (ed.), Hazards Due to Comets and Asteroids, pp. 359–416, University of Arizona Press, Tucson, AZ (1994).

Malcolm Gladwell (2006), "Million-Dollar Murray"; http://gladwell.com/milliondollarmurray/

Andriani, P., & McKelvey, B. (2007). "Beyond Gaussian averages: redirecting international business and management research toward extreme events and power laws". Journal of International Business Studies, 38(7), 1212–1230. doi:10.1057/palgrave.jibs.8400324

Reed W. J.; Hughes B. D. "From gene families and genera to incomes and internet file sizes: Why power laws are so common in nature". Phys. Rev. E 2002, 66, 067103; http://www.math.uvic.ca/faculty/reed/PhysRevPowerLawTwoCol.pdf

Hilbert, M. (2013). "Scale-free power-laws as interaction between progress and diffusion". Complexity, doi:10.1002/cplx.21485; free access to the article through martinhilbert.net/Powerlaw_ProgressDiffusion_Hilbert.pdf

Bolmatov, D.; Brazhkin, V. V.; Trachenko, K. (2013). "Thermodynamic behaviour of supercritical matter". Nature Communications 4.

"Afterglow Light Curves and Broken Power Laws: A Statistical Study". Retrieved 2013-07-07.

"Power-Law Distributions in Empirical Data". Retrieved 2013-07-07.

"Curved-power law". Retrieved 2013-07-07.

Kendal WS & Jørgensen B (2011). "Taylor's power law and fluctuation scaling explained by a central-limit-like convergence". Phys. Rev. E 83, 066115.

Kendal WS & Jørgensen BR (2011). "Tweedie convergence: a mathematical basis for Taylor's power law, 1/f noise and multifractality". Phys. Rev. E 84, 066120.

Beirlant, J., Teugels, J. L., Vynckier, P. (1996a). Practical Analysis of Extreme Values. Leuven: Leuven University Press.

Coles, S. (2001). An Introduction to Statistical Modeling of Extreme Values. Springer-Verlag, London.

Diaz, F. J. (1999). "Identifying Tail Behavior by Means of Residual Quantile Functions". Journal of Computational and Graphical Statistics 8 (3): 493–509.

Resnick, S. I. (1997). "Heavy Tail Modeling and Teletraffic Data". The Annals of Statistics, 25, 1805–1869.

Jeong, H.; Tombor, B.; Albert, R.; Oltvai, Z. N.; Barabási, A.-L. (2000). "The large-scale organization of metabolic networks". Nature 407 (6804): 651–654.

Arnold, B. C., Brockett, P. L. (1983). "When does the βth percentile residual life function determine the distribution?". Operations Research 31 (2), 391–396.

Joe, H., Proschan, F. (1984). "Percentile residual life functions". Operations Research 32 (3), 668–678.

Joe, H. (1985). "Characterizations of life distributions from percentile residual lifetimes". Ann. Inst. Statist. Math. 37, Part A, 165–172.

Csorgo, S., Viharos, L. (1992). "Confidence bands for percentile residual lifetimes". Journal of Statistical Planning and Inference 30, 327–337.

Schmittlein, D. C., Morrison, D. G. (1981). "The median residual lifetime: A characterization theorem and an application". Operations Research 29 (2), 392–399.

Morrison, D. G., Schmittlein, D. C. (1980). "Jobs, strikes, and wars: Probability models for duration". Organizational Behavior and Human Performance 25, 224–251.

Gerchak, Y. (1984). "Decreasing failure rates and related issues in the social sciences". Operations Research 32 (3), 537–546.

Hall, P. (1982). "On Some Simple Estimates of an Exponent of Regular Variation".

Guerriero, V. (2012). "Power Law Distribution: Method of Multi-scale Inferential Statistics". Journal of Modern Mathematics Frontier (JMMF) 1: 21–28.
See also
Validating power laws

Although power-law relations are attractive for many theoretical reasons, demonstrating that data do indeed follow a power-law relation requires more than simply fitting a particular model to the data.^{[14]} For example, log-normal distributions are often mistaken for power-law distributions: Gibrat's law about proportional growth processes produces limiting distributions that are log-normal, although their log-log plots look linear. The reason is that although the logarithm of the log-normal density function is quadratic in log(x), yielding a "bowed" shape in a log-log plot, if the quadratic term is small relative to the linear term the result can appear almost linear. Therefore a log-log plot that is slightly "bowed" downwards can reflect a log-normal distribution, not a power law. In general, many alternative functional forms can appear to follow a power-law form over some range. Researchers also face the problem of deciding whether or not a real-world probability distribution follows a power law. As one solution to this problem, Diaz^{[23]} proposed a graphical methodology based on random samples that allows visually discerning between different types of tail behavior. This methodology uses bundles of residual quantile functions, also called percentile residual life functions, which characterize many different types of distribution tails, including both heavy and non-heavy tails.

One method to validate a power-law relation tests many orthogonal predictions of a particular generative mechanism against data; simply fitting a power-law relation to a particular kind of data is not considered a rational approach. As such, the validation of power-law claims remains a very active field of research in many areas of modern science.
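The near-linearity discussed above can be checked numerically: the log-normal log-density is quadratic in ln x, and when the quadratic coefficient is small (large σ) the local log-log slope drifts only slowly across decades. A minimal sketch with assumed, illustrative parameters:

```python
import math

mu, s = 0.0, 3.0   # assumed log-normal parameters; large s weakens the quadratic term

def log_pdf_lognormal(x):
    # ln p(x) = -ln x - ln(s*sqrt(2*pi)) - (ln x - mu)**2 / (2*s**2)
    lx = math.log(x)
    return -lx - math.log(s * math.sqrt(2 * math.pi)) - (lx - mu) ** 2 / (2 * s ** 2)

# local slope d(ln p)/d(ln x), measured over x = 1, 10, 100
slopes = [(log_pdf_lognormal(x * 1.01) - log_pdf_lognormal(x)) / math.log(1.01)
          for x in (1.0, 10.0, 100.0)]
```

The slope changes by only about 0.5 over two decades, so on a log-log plot the curve is only slightly "bowed" and can be mistaken for a power law.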
R function

The following function estimates the exponent in R, plotting the log-log data and the fitted line.

pwrdist <- function(u, ...) {
  # u is a vector of event counts, e.g. how many
  # crimes a given perpetrator was charged with by the police
  fx <- table(u)
  i <- as.numeric(names(fx))
  y <- rep(0, max(i))
  y[i] <- fx
  m0 <- glm(y ~ log(1:max(i)), family = quasipoisson())
  print(summary(m0))
  sub <- paste("s =", round(m0$coef[2], 2), "lambda =", sum(u), "/", length(u))
  plot(i, fx, log = "xy", xlab = "x", sub = sub, ylab = "counts", ...)
  grid()
  lines(1:max(i), fitted(m0), type = "l")
  return(m0)
}
Two-point fitting method

This criterion can be applied to estimate the power-law exponent in the case of scale-free distributions, and provides a more convergent estimate than the maximum likelihood method.^{[34]} It has been applied to study probability distributions of fracture apertures.^{[34]} In some contexts the probability distribution is described not by the cumulative distribution function but by the cumulative frequency of a property X, defined as the number of elements per meter (or area unit, second, etc.) for which X > x applies, where x is a variable real number. As an example,^{[34]} the cumulative frequency distribution of fracture aperture, X, for a sample of N elements is defined as the number of fractures per meter having aperture greater than x. Use of cumulative frequency has some advantages, e.g. it allows one to put on the same diagram data gathered from sample lines of different lengths at different scales (e.g. from outcrop and from microscope).
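The text describes the two-point method only qualitatively. A minimal sketch of one natural reading — recovering the exponent of a cumulative-frequency law N(x) = C·x^{-a} from two measured points (x₁, N₁) and (x₂, N₂) — is given below; the helper name and the formula's application here are illustrative assumptions, not the cited method's exact recipe:

```python
import math

def two_point_exponent(x1, n1, x2, n2):
    """Exponent a of N(x) = C * x**(-a) passing through the two
    cumulative-frequency points (x1, n1) and (x2, n2).
    Hypothetical helper illustrating a two-point fit."""
    return math.log(n1 / n2) / math.log(x2 / x1)
```

For example, the points (1, 100) and (4, 12.5) lie on N(x) = 100·x^{-1.5}, so the function returns 1.5.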
Kolmogorov–Smirnov estimation

Another method for the estimation of the power-law exponent, which does not assume independent and identically distributed (iid) data, uses the minimization of the Kolmogorov–Smirnov statistic, D, between the cumulative distribution functions of the data and the power law:

\hat{\alpha} = \underset{\alpha}{\operatorname{arg\,min}} \, D_\alpha

with

D_\alpha = \max_x \left| P_\mathrm{emp}(x) - P_\alpha(x) \right|

where P_\mathrm{emp}(x) and P_\alpha(x) denote the cdfs of the data and the power law with exponent \alpha, respectively. As this method does not assume iid data, it provides an alternative way to determine the power-law exponent for data sets in which the temporal correlation can not be ignored.^{[3]}
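The Kolmogorov–Smirnov minimization described in this section can be sketched as a grid search over candidate exponents; the grid, sample size, and synthetic data below are illustrative assumptions:

```python
import math
import random

def ks_statistic(tail, xmin, alpha):
    """D_alpha = max_x |P_emp(x) - P_alpha(x)| against the continuous
    power-law cdf P_alpha(x) = 1 - (x/xmin)**(1 - alpha)."""
    n = len(tail)
    d = 0.0
    for i, x in enumerate(sorted(tail)):
        model = 1.0 - (x / xmin) ** (1.0 - alpha)
        # compare the model cdf to the empirical cdf on both sides of the jump
        d = max(d, abs((i + 1) / n - model), abs(i / n - model))
    return d

def ks_estimate(tail, xmin, alphas):
    # pick the exponent on the grid that minimizes D_alpha
    return min(alphas, key=lambda a: ks_statistic(tail, xmin, a))

# synthetic check with a known exponent
random.seed(1)
xmin, true_alpha = 1.0, 2.5
tail = [xmin * (1.0 - random.random()) ** (-1.0 / (true_alpha - 1.0))
        for _ in range(5000)]
grid = [2.0 + 0.01 * i for i in range(101)]   # candidates 2.00 ... 3.00
alpha_hat = ks_estimate(tail, xmin, grid)
```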
Maximum likelihood

For real-valued, independent and identically distributed data, we fit a power-law distribution of the form

p(x) = \frac{\alpha-1}{x_\min} \left(\frac{x}{x_\min}\right)^{-\alpha}

to the data x\geq x_\min, where the coefficient \frac{\alpha-1}{x_\min} is included to ensure that the distribution is normalized. Given a choice for x_\min, a simple derivation by this method yields the estimator equation

\hat{\alpha} = 1 + n \left[ \sum_{i=1}^n \ln \frac{x_i}{x_\min} \right]^{-1}

where \{x_i\} are the n data points x_{i}\geq x_\min.^{[1]}^{[33]} This estimator exhibits a small finite-sample-size bias of order O(n^{-1}), which is small when n > 100. Further, the uncertainty in the estimation can be derived from the maximum likelihood argument, and has the form \sigma = \frac{\hat\alpha-1}{\sqrt{n}}. This estimator is equivalent to the popular Hill estimator from quantitative finance and extreme value theory.

For a set of n integer-valued data points \{x_i\}, again where each x_i\geq x_\min, the maximum likelihood exponent is the solution to the transcendental equation

\frac{\zeta'(\hat\alpha,x_\min)}{\zeta(\hat{\alpha},x_\min)} = -\frac{1}{n} \sum_{i=1}^n \ln \frac{x_i}{x_\min}

where \zeta(\alpha,x_{\mathrm{min}}) is the incomplete zeta function. The uncertainty in this estimate follows the same formula as for the continuous equation. However, the two equations for \hat{\alpha} are not equivalent, and the continuous version should not be applied to discrete data, nor vice versa.

Further, both of these estimators require the choice of x_\min. For functions with a nontrivial L(x) function, choosing x_\min too small produces a significant bias in \hat\alpha, while choosing it too large increases the uncertainty in \hat{\alpha} and reduces the statistical power of our model. In general, the best choice of x_\min depends strongly on the particular form of the lower tail, represented by L(x) above.

More about these methods, and the conditions under which they can be used, can be found in Clauset, Shalizi and Newman (2009). Further, this comprehensive review article provides usable code (Matlab, R and C++) for estimation and testing routines for power-law distributions.
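The continuous maximum-likelihood (Hill) estimator \hat{\alpha} = 1 + n[\sum_i \ln(x_i/x_\min)]^{-1}, with uncertainty \sigma = (\hat\alpha - 1)/\sqrt{n}, can be sketched as follows; the synthetic sample and its parameters are illustrative assumptions, not data from the text:

```python
import math
import random

def fit_powerlaw_mle(data, xmin):
    """Continuous maximum-likelihood (Hill) estimate of the
    power-law exponent, with its uncertainty sigma."""
    tail = [x for x in data if x >= xmin]
    n = len(tail)
    alpha = 1.0 + n / sum(math.log(x / xmin) for x in tail)
    sigma = (alpha - 1.0) / math.sqrt(n)
    return alpha, sigma

# synthetic check: inverse-transform sampling from a pure power law,
# using P(X > x) = (x/xmin)**(-(alpha-1))
random.seed(0)
true_alpha, xmin = 2.5, 1.0
sample = [xmin * (1.0 - random.random()) ** (-1.0 / (true_alpha - 1.0))
          for _ in range(20000)]
alpha_hat, sigma = fit_powerlaw_mle(sample, xmin)
```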
Estimating the exponent from empirical data

There are many ways of estimating the value of the scaling exponent for a power-law tail; however, not all of them yield unbiased and consistent answers. The most reliable techniques are often based on the method of maximum likelihood. Alternative methods are often based on making a linear regression on either the log-log probability, the log-log cumulative distribution function, or on log-binned data, but these approaches should be avoided, as they can all lead to highly biased estimates of the scaling exponent.
Plotting power-law distributions

In general, power-law distributions are plotted on doubly logarithmic axes, which emphasizes the upper tail region. The most convenient way to do this is via the (complementary) cumulative distribution (cdf), P(x) = \mathrm{Pr}(X > x),

P(x) = \Pr(X > x) = C \int_x^\infty p(X)\,\mathrm{d}X = \frac{\alpha-1}{x_\min^{-\alpha+1}} \int_x^\infty X^{-\alpha}\,\mathrm{d}X = \left(\frac{x}{x_\min} \right)^{-\alpha+1}.

Note that the cdf is also a power-law function, but with a smaller scaling exponent. For data, an equivalent form of the cdf is the rank-frequency approach, in which we first sort the n observed values in ascending order, and plot them against the vector \left[1,\frac{n-1}{n},\frac{n-2}{n},\dots,\frac{1}{n}\right].

Although it can be convenient to log-bin the data, or otherwise smooth the probability density (mass) function directly, these methods introduce an implicit bias in the representation of the data, and thus should be avoided. The cdf, on the other hand, introduces no bias in the data and preserves the linear signature on doubly logarithmic axes.
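The rank-frequency construction described above — sort the n values in ascending order and pair them with [1, (n-1)/n, ..., 1/n] — can be sketched as:

```python
def rank_frequency(data):
    """Points of the empirical complementary cumulative distribution
    for a log-log plot: sorted values paired with [1, (n-1)/n, ..., 1/n]."""
    xs = sorted(data)
    n = len(xs)
    ps = [(n - i) / n for i in range(n)]   # fraction of sample >= xs[i]
    return list(zip(xs, ps))
```

For example, `rank_frequency([3, 1, 2])` returns `[(1, 1.0), (2, 2/3), (3, 1/3)]`; plotting these points on doubly logarithmic axes gives the unbinned cdf estimate recommended in the text.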
Graphical methods for identification

Although more sophisticated and robust methods have been proposed, the most frequently used graphical methods of identifying power-law probability distributions using random samples are Pareto quantile-quantile plots (or Pareto Q-Q plots), mean residual life plots^{[21]}^{[22]} and log-log plots. Another, more robust graphical method uses bundles of residual quantile functions.^{[23]} (Power-law distributions are also called Pareto-type distributions.) It is assumed here that a random sample is obtained from a probability distribution, and that we want to know if the tail of the distribution follows a power law (in other words, we want to know if the distribution has a "Pareto tail"). Here, the random sample is called "the data".

Pareto Q-Q plots compare the quantiles of the log-transformed data to the corresponding quantiles of an exponential distribution with mean 1 (or to the quantiles of a standard Pareto distribution) by plotting the former versus the latter. If the resultant scatterplot suggests that the plotted points asymptotically converge to a straight line, then a power-law distribution should be suspected. A limitation of Pareto Q-Q plots is that they behave poorly when the tail index \alpha (also called the Pareto index) is close to 0, because Pareto Q-Q plots are not designed to identify distributions with slowly varying tails.^{[23]}

On the other hand, in its version for identifying power-law probability distributions, the mean residual life plot consists of first log-transforming the data, and then plotting the average of those log-transformed data that are higher than the i-th order statistic versus the i-th order statistic, for i = 1, ..., n, where n is the size of the random sample. If the resultant scatterplot suggests that the plotted points tend to "stabilize" about a horizontal straight line, then a power-law distribution should be suspected. Since the mean residual life plot is very sensitive to outliers (it is not robust), it usually produces plots that are difficult to interpret; for this reason, such plots are usually called Hill horror plots.^{[24]}

A straight line on a log-log plot is a necessary, but not sufficient, indication of a power law; the slope of the straight line corresponds to the power-law exponent.

Log-log plots are an alternative way of graphically examining the tail of a distribution using a random sample. This method consists of plotting the logarithm of an estimator of the probability that a particular number of the distribution occurs versus the logarithm of that particular number. Usually, this estimator is the proportion of times that the number occurs in the data set. If the points in the plot tend to "converge" to a straight line for large numbers in the x axis, then the researcher concludes that the distribution has a power-law tail. Examples of the application of these types of plot have been published.^{[25]} A disadvantage of these plots is that, in order for them to provide reliable results, they require huge amounts of data. In addition, they are appropriate only for discrete (or grouped) data.

Another graphical method for the identification of power-law probability distributions using random samples has been proposed.^{[23]} This methodology consists of plotting a bundle for the log-transformed sample. Originally proposed as a tool to explore the existence of moments and the moment generating function using random samples, the bundle methodology is based on residual quantile functions (RQFs), also called percentile residual life functions,^{[26]}^{[27]}^{[28]}^{[29]}^{[30]}^{[31]}^{[32]} which provide a full characterization of the tail behavior of many well-known probability distributions, including power-law distributions, distributions with other types of heavy tails, and even non-heavy-tailed distributions. Bundle plots do not have the disadvantages of Pareto Q-Q plots, mean residual life plots and log-log plots mentioned above (they are robust to outliers, allow visually identifying power laws with small values of \alpha, and do not demand the collection of much data). In addition, other types of tail behavior can be identified using bundle plots.
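A Pareto Q-Q plot as described above pairs the sorted log-transformed data with the matching quantiles of an exponential distribution with mean 1. The plotting positions i/(n+1) and the synthetic Pareto sample below are illustrative assumptions:

```python
import math
import random

def pareto_qq_points(data):
    """Pareto Q-Q plot coordinates: sorted log-transformed data versus
    exponential(mean 1) quantiles at plotting positions i/(n+1)."""
    logs = sorted(math.log(x) for x in data)
    n = len(logs)
    theo = [-math.log(1.0 - i / (n + 1.0)) for i in range(1, n + 1)]
    return theo, logs

# for a pure Pareto sample (survival x**(-alpha), xmin = 1) the points
# fall near a straight line of slope 1/alpha
random.seed(2)
alpha = 2.0
sample = [(1.0 - random.random()) ** (-1.0 / alpha) for _ in range(4000)]
theo, logs = pareto_qq_points(sample)
slope = sum(t * l for t, l in zip(theo, logs)) / sum(t * t for t in theo)
```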
Power-law probability distributions

In a looser sense, a power-law probability distribution is a distribution whose density function (or mass function in the discrete case) has the form

p(x) \propto L(x) x^{-\alpha}

where \alpha > 1, and L(x) is a slowly varying function, which is any function that satisfies \lim_{x\rightarrow\infty} L(t\,x) / L(x) = 1 with t constant and t > 0. This property of L(x) follows directly from the requirement that p(x) be asymptotically scale invariant; thus, the form of L(x) only controls the shape and finite extent of the lower tail. For instance, if L(x) is the constant function, then we have a power law that holds for all values of x. In many cases, it is convenient to assume a lower bound x_{\mathrm{min}} from which the law holds. Combining these two cases, and where x is a continuous variable, the power law has the form

p(x) = \frac{\alpha-1}{x_\min} \left(\frac{x}{x_\min}\right)^{-\alpha},

where the prefactor \frac{\alpha-1}{x_\min} is the normalizing constant. We can now consider several properties of this distribution. For instance, its moments are given by

\langle x^{m} \rangle = \int_{x_\min}^\infty x^{m} p(x) \,\mathrm{d}x = \frac{\alpha-1}{\alpha-1-m}x_\min^m

which is only well defined for m < \alpha - 1. That is, all moments m \geq \alpha - 1 diverge: when \alpha<2, the average and all higher-order moments are infinite; when 2<\alpha<3, the mean exists, but the variance and higher-order moments are infinite, etc. For finite-size samples drawn from such a distribution, this behavior implies that the central moment estimators (like the mean and the variance) for diverging moments will never converge – as more data is accumulated, they continue to grow. These power-law probability distributions are also called Pareto-type distributions, distributions with Pareto tails, or distributions with regularly varying tails.

Another kind of power-law distribution, which does not satisfy the general form above, is the power law with an exponential cutoff

p(x) \propto L(x) x^{-\alpha} \mathrm{e}^{-\lambda x}.

In this distribution, the exponential decay term \mathrm{e}^{-\lambda x} eventually overwhelms the power-law behavior at very large values of x. This distribution does not scale and is thus not asymptotically a power law; however, it does approximately scale over a finite region before the cutoff. (The pure form above is a subset of this family, with \lambda=0.) This distribution is a common alternative to the asymptotic power-law distribution because it naturally captures finite-size effects. For instance, although the Gutenberg–Richter law is commonly cited as an example of a power-law distribution, the distribution of earthquake magnitudes cannot scale as a power law in the limit x\rightarrow\infty because there is a finite amount of energy in the Earth's crust and thus there must be some maximum size to an earthquake. As the scaling behavior approaches this size, it must taper off.

The Tweedie distributions are a family of statistical models characterized by closure under additive and reproductive convolution as well as under scale transformation. Consequently, these models all express a power-law relationship between the variance and the mean. These models have a fundamental role as foci of mathematical convergence similar to the role that the normal distribution has as a focus in the central limit theorem. This convergence effect explains why the variance-to-mean power law manifests so widely in natural processes, as with Taylor's law in ecology and with fluctuation scaling^{[19]} in physics. It can also be shown that this variance-to-mean power law, when demonstrated by the method of expanding bins, implies the presence of 1/f noise, and that 1/f noise can arise as a consequence of this Tweedie convergence effect.^{[20]}
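The moment formula \langle x^m \rangle = \frac{\alpha-1}{\alpha-1-m} x_\min^m can be checked against a numerical integral of the normalized density; the parameter values and the truncation point of the integral below are illustrative assumptions:

```python
alpha, xmin, m = 2.5, 1.0, 1.0   # m = 1 is the mean; valid since m < alpha - 1

def p(x):
    # normalized power-law density p(x) = (alpha-1)/xmin * (x/xmin)**(-alpha)
    return (alpha - 1.0) / xmin * (x / xmin) ** (-alpha)

# trapezoidal integral of x**m * p(x) on a log-spaced grid, truncated at 10**6
N = 100000
xs = [10.0 ** (6.0 * i / N) for i in range(N + 1)]
fs = [x ** m * p(x) for x in xs]
moment = sum((xs[i + 1] - xs[i]) * (fs[i] + fs[i + 1]) / 2.0 for i in range(N))

predicted = (alpha - 1.0) / (alpha - 1.0 - m) * xmin ** m   # = 3.0 here
```

The truncated integral comes out just under the predicted value of 3, the small gap being the mass beyond the truncation point; repeating this with m ≥ α - 1 shows the integral growing without bound as the truncation point increases.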

Variants

Broken power law

A broken power law is defined with a threshold:^{[16]}

f(x) \propto x^{-\alpha_1} \text{ for } x < x_\text{th}

f(x) \propto x_\text{th}^{\alpha_2-\alpha_1} x^{-\alpha_2} \text{ for } x > x_\text{th}.

Power law with exponential cutoff

A power law with an exponential cutoff is simply a power law multiplied by an exponential function:^{[17]}

f(x) \propto x^{-\alpha} e^{-\beta x}.

Curved power law

f(x) \propto x^{\alpha + \beta x}^{[18]}
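A minimal sketch of the broken and exponentially cut-off power laws defined above; the prefactor x_th^{α₂-α₁} is what makes the two branches of the broken power law continuous at the threshold (the constant c is an illustrative normalization):

```python
import math

def broken_power_law(x, alpha1, alpha2, x_th, c=1.0):
    """Piecewise power law; the prefactor x_th**(alpha2 - alpha1)
    keeps the two branches continuous at x = x_th."""
    if x < x_th:
        return c * x ** (-alpha1)
    return c * x_th ** (alpha2 - alpha1) * x ** (-alpha2)

def cutoff_power_law(x, alpha, beta, c=1.0):
    # a power law multiplied by an exponential: x**(-alpha) * e**(-beta*x)
    return c * x ** (-alpha) * math.exp(-beta * x)
```

With beta = 0 the cutoff form reduces to the pure power law, matching the lambda = 0 special case noted for the distributional version.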

Examples of power-law functions

More than a hundred power-law distributions have been identified in physics (e.g. sandpile avalanches and earthquakes), biology (e.g. species extinction and body mass), and the social sciences (e.g. city sizes and income).^{[12]} Among them are:

The frequency dependency of acoustic attenuation in complex media

Stevens' power law of psychophysics

The Stefan–Boltzmann law

The input-voltage–output-current curves of field-effect transistors and vacuum tubes, which approximate a square-law relationship, a factor in "tube sound"

The square-cube law (ratio of surface area to volume)

Kleiber's law relating animal metabolism to size, and allometric laws in general

The 3/2-power law found in the plate characteristic curves of triodes

The inverse-square laws of Newtonian gravity and electrostatics, as evidenced by the gravitational potential and electrostatic potential, respectively

The critical point as an attractor

Exponential growth and random observation (or killing)^{[13]}

Progress through exponential growth and exponential diffusion of innovations^{[14]}

Highly optimized tolerance

The model of the van der Waals force

Force and potential in simple harmonic motion

Kepler's third law

The initial mass function of stars

The M–sigma relation

Gamma correction relating light intensity with voltage

The two-thirds power law, relating speed to curvature in the human motor system

Taylor's law relating mean population size and variance of population sizes in ecology

Behaviour near second-order phase transitions involving critical exponents

The proposed form of experience curve effects

The differential energy spectrum of cosmic-ray nuclei

Fractals

The Pareto distribution and the Pareto principle, also called the "80–20 rule"

Zipf's law in corpus analysis and population distributions, among others, where the frequency of an item or event is inversely proportional to its frequency rank (i.e. the second most frequent item/event occurs half as often as the most frequent item, and so on)

The safe operating area relating to maximum simultaneous current and voltage in power semiconductors

The supercritical state of matter and supercritical fluids, such as supercritical exponents of heat capacity and viscosity^{[15]}

The Zeta distribution (discrete)

The Yule–Simon distribution (discrete)

Student's t-distribution (continuous), of which the Cauchy distribution is a special case

Lotka's law

The scale-free network model

Pink noise

Neuronal avalanches^{[3]}

The law of stream numbers and the law of stream lengths (Horton's laws describing river systems)

Populations of cities (Gibrat's law)

Bibliograms, and frequencies of words in a text (Zipf's law)

The 90–9–1 principle on wikis (also referred to as the 1% rule)

Richardson's law for the severity of violent conflicts (wars and terrorism) [Lewis Fry Richardson, The Statistics of Deadly Quarrels, 1950]

The Gutenberg–Richter law of earthquake magnitudes

Social network websites
Power-law functions

Scientific interest in power-law relations stems partly from the ease with which certain general classes of mechanisms generate them. The demonstration of a power-law relation in some data can point to specific kinds of mechanisms that might underlie the natural phenomenon in question, and can indicate a deep connection with other, seemingly unrelated systems; see also universality below. The ubiquity of power-law relations in physics is partly due to dimensional constraints, while in complex systems, power laws are often thought to be signatures of hierarchy or of specific stochastic processes. A few notable examples of power laws are the Gutenberg–Richter law for earthquake sizes, Pareto's law of income distribution, structural self-similarity of fractals, and scaling laws in biological systems. Research on the origins of power-law relations, and efforts to observe and validate them in the real world, is an active topic in many fields of science, including physics, computer science, linguistics, geophysics, neuroscience, sociology and economics.

However, much of the recent interest in power laws comes from the study of probability distributions: the distributions of a wide variety of quantities seem to follow the power-law form, at least in their upper tail (large events). The behavior of these large events connects these quantities to the study of the theory of large deviations (also called extreme value theory), which considers the frequency of extremely rare events like stock market crashes and large natural disasters. It is primarily in the study of statistical distributions that the name "power law" is used; in other areas, such as physics and engineering, a power-law functional form with a single term and a positive integer exponent is typically regarded as a polynomial function.

In empirical contexts, an approximation to a power law often includes a deviation term \varepsilon, which can represent uncertainty in the observed values (perhaps measurement or sampling errors) or provide a simple way for observations to deviate from the power-law function (perhaps for stochastic reasons):

y = ax^k + \varepsilon.\!

Mathematically, a strict power law cannot be a probability distribution, but a distribution that is a truncated power function is possible: p(x) = C x^{-\alpha} for x > x_\text{min}, where the exponent \alpha is greater than 1 (otherwise the tail has infinite area); the minimum value x_\text{min} is needed because otherwise the distribution has infinite area as x approaches 0; and the constant C is a scaling factor ensuring that the total area is 1, as required by a probability distribution. More often one uses an asymptotic power law – one that is only true in the limit; see power-law probability distributions above for details. Typically the exponent falls in the range 2 < \alpha < 3, though not always.
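For a deterministic functional relationship y = ax^k + ε, the exponent is commonly recovered by least squares on log-log axes (as the text notes, this regression approach should be avoided when estimating the exponents of probability distributions). A minimal sketch:

```python
import math

def fit_power_function(xs, ys):
    """Least-squares fit of y ≈ a * x**k in log-log coordinates,
    returning (a, k). Intended for functional data, not for
    estimating distribution exponents."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(xs)
    mx = sum(lx) / n
    my = sum(ly) / n
    k = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = math.exp(my - k * mx)
    return a, k

# exact power-law data y = 3 * x**2 recovers a = 3, k = 2
xs = list(range(1, 11))
ys = [3.0 * x ** 2 for x in xs]
a, k = fit_power_function(xs, ys)
```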
Universality

The equivalence of power laws with a particular scaling exponent can have a deeper origin in the dynamical processes that generate the power-law relation. In physics, for example, diverse systems near a critical point can exhibit identical scaling behaviour, their dynamics being governed by a common attractor. Formally, this sharing of dynamics is referred to as universality, and systems with precisely the same critical exponents are said to belong to the same universality class.
No average

Power laws have a well-defined mean only if the exponent exceeds 2, and have a finite variance only when the exponent exceeds 3; most identified power laws in nature have exponents such that the mean is well-defined but the variance is not, implying they are capable of black swan behaviour.^{[6]} This can be seen in the following thought experiment:^{[7]} imagine a room with your friends and estimate the average monthly income in the room. Now imagine the world's richest person entering the room, with a monthly income of about US$1 billion. What happens to the average income in the room? Income is distributed according to a power law known as the Pareto distribution (for example, the net worth of Americans is distributed according to a power law with an exponent of 2^{[8]}).

On the one hand, this makes it incorrect to apply traditional statistics that are based on variance and standard deviation (such as regression analysis). On the other hand, this also allows for cost-efficient interventions.^{[7]} For example, given that car exhaust is distributed according to a power law among cars (very few cars contribute to most contamination), it would be sufficient to eliminate those very few cars from the road to reduce total exhaust substantially.^{[9]}
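A toy version of the thought experiment above (the incomes are illustrative numbers, not data from the text):

```python
# average monthly income in a room of 20 friends, each earning $3,000
incomes = [3000] * 20
avg_before = sum(incomes) / len(incomes)   # 3000.0

# the world's richest person walks in, earning about $1 billion a month
incomes.append(1_000_000_000)
avg_after = sum(incomes) / len(incomes)    # roughly $47.6 million
```

A single extreme observation moves the sample mean by four orders of magnitude, which is why moment-based statistics are unreliable for heavy-tailed data.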
Properties of power laws

Scale invariance

One attribute of power laws is their scale invariance. Given a relation f(x) = ax^k, scaling the argument x by a constant factor c causes only a proportionate scaling of the function itself. That is,

f(c x) = a(c x)^k = c^k f(x) \propto f(x).\!

That is, scaling by a constant c simply multiplies the original power-law relation by the constant c^k. Thus, it follows that all power laws with a particular scaling exponent are equivalent up to constant factors, since each is simply a scaled version of the others. This behavior is what produces the linear relationship when logarithms are taken of both f(x) and x, and the straight line on the log-log plot is often called the signature of a power law. With real data, such straightness is a necessary, but not sufficient, condition for the data following a power-law relation. In fact, there are many ways to generate finite amounts of data that mimic this signature behavior, but, in their asymptotic limit, are not true power laws (e.g., if the generating process of some data follows a log-normal distribution). Thus, accurately fitting and validating power-law models is an active area of research in statistics.
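The scale-invariance identity f(cx) = c^k f(x) can be verified numerically; the constants a, k and c below are arbitrary illustrative choices:

```python
def f(x, a=2.0, k=1.5):
    # a power-law relation f(x) = a * x**k
    return a * x ** k

c, k = 3.0, 1.5
# scaling the argument by c rescales the function by the constant factor c**k,
# independent of x
ratios = [f(c * x) / f(x) for x in (0.5, 1.0, 2.0, 10.0)]
```

Every ratio equals c**k (here 3**1.5 ≈ 5.196), which is exactly the constancy that produces a straight line on a log-log plot.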
These include the sizes of power outages, wars, criminal charges per convict, and many other quantities.^{[4]} Few empirical distributions fit a power law for all their values; rather, they follow a power law in the tail. Acoustic attenuation follows frequency power laws within wide frequency bands for many complex media. Allometric scaling laws for relationships between biological variables are among the best-known power-law functions in nature.
This article was sourced under the Creative Commons Attribution-ShareAlike License; additional terms may apply.