Friday, January 28, 2011

Earthquakes: Moment magnitude scale

The moment magnitude scale (abbreviated as MMS; denoted as Mw) is used by seismologists to measure the size of earthquakes in terms of the energy released. The magnitude is based on the seismic moment of the earthquake, which is equal to the rigidity of the Earth multiplied by the average amount of slip on the fault and the size of the area that slipped. The scale was developed in the 1970s to succeed the 1930s-era Richter magnitude scale (ML). Even though the formulae are different, the new scale retains the familiar continuum of magnitude values defined by the older one. The MMS is now the scale the United States Geological Survey uses to estimate magnitudes for all modern large earthquakes.
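To make the moment calculation concrete, here is a small Python sketch. All of the input values are hypothetical, chosen only to illustrate the multiplication; a rigidity of about 3×10^10 N/m^2 is a commonly assumed value for crustal rock.

# Seismic moment: M0 = rigidity x average slip x rupture area
rigidity = 3.0e10      # shear modulus of crustal rock in N/m^2 (assumed typical value)
slip = 2.0             # average slip on the fault in m (hypothetical)
area = 50e3 * 15e3     # rupture 50 km long x 15 km deep, in m^2 (hypothetical)

M0 = rigidity * slip * area
print(f"M0 = {M0:.1e} N.m")   # 4.5e+19 N.m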

Definition

The symbol for the moment magnitude scale is Mw, with the subscript w meaning mechanical work accomplished. The moment magnitude Mw is a dimensionless number defined by
Mw = (2/3) log10(M0) − 10.7,
where M0 is the magnitude of the seismic moment in dyne centimetres (1 dyne·cm = 10^−7 N·m). The constant values in the equation are chosen to achieve consistency with the magnitude values produced by earlier scales, most importantly the local magnitude (or "Richter") scale.
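Here is a minimal Python sketch of this formula. The conversion factor and constants follow the definition above; the function name and the input value are just for illustration.

import math

def moment_magnitude(m0_newton_metres):
    # The -10.7 constant expects M0 in dyne.cm, so convert first:
    # 1 N.m = 1e7 dyne.cm
    m0_dyne_cm = m0_newton_metres * 1e7
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

print(round(moment_magnitude(4.5e19), 1))   # about 7.1 for the hypothetical fault above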
As with the Richter scale, an increase of one step on this logarithmic scale corresponds to a 10^1.5 ≈ 32 times increase in the amount of energy released, and an increase of two steps corresponds to a 10^3 = 1000 times increase in energy.
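A quick check of that energy scaling, assuming the released energy grows as 10^(1.5·Mw):

# Energy scales as 10^(1.5 * Mw), so the ratio between two
# magnitudes depends only on their difference.
def energy_ratio(delta_mw):
    return 10 ** (1.5 * delta_mw)

print(energy_ratio(1))   # ~31.6 (one step)
print(energy_ratio(2))   # 1000.0 (two steps)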


