
Richter magnitude scale

The so-called Richter scale – also Richter magnitude or Richter magnitude scale, more accurately but informally Richter's magnitude scale – for measuring the strength ('size') of earthquakes refers to the original magnitude scale developed by Charles F. Richter and presented in his landmark 1935 paper. It was later revised and renamed the local magnitude scale, denoted ML or M_L. Because of various shortcomings of the ML scale, most seismological authorities now use other scales, such as the moment magnitude scale (Mw), to report earthquake magnitudes, but much of the news media still refers to these as 'Richter' magnitudes. All magnitude scales retain the logarithmic character of the original and are scaled to have roughly comparable numeric values.

Prior to the development of the magnitude scale, the only measure of an earthquake's strength or 'size' was a subjective assessment of the intensity of shaking observed near the epicenter, categorized by various seismic intensity scales such as the Rossi–Forel scale. In 1883 John Milne surmised that the shaking of large earthquakes might generate waves detectable around the globe, and in 1899 E. von Rebeur-Paschwitz observed in Germany seismic waves attributable to an earthquake in Tokyo. In the 1920s Harry O. Wood and John A. Anderson developed the Wood–Anderson seismograph, one of the first practical instruments for recording seismic waves. Wood then built, under the auspices of the California Institute of Technology and the Carnegie Institution, a network of seismographs stretching across Southern California. He also recruited the young and unknown Charles Richter to measure the seismograms and locate the earthquakes generating the seismic waves.

In 1931 Kiyoo Wadati showed how he had measured, for several strong earthquakes in Japan, the amplitude of the shaking observed at various distances from the epicenter. He then plotted the logarithm of the amplitude against the distance and found a series of curves that showed a rough correlation with the estimated magnitudes of the earthquakes. Richter resolved some difficulties with this method and then, using data collected by his colleague Beno Gutenberg, produced similar curves, confirming that they could be used to compare the relative magnitudes of different earthquakes.

Producing a practical method of assigning an absolute measure of magnitude required additional developments. First, to span the wide range of possible values, Richter adopted Gutenberg's suggestion of a logarithmic scale, in which each whole-number step of magnitude corresponds to a tenfold increase in measured amplitude, similar to the magnitude scale used by astronomers for star brightness. Second, he wanted a magnitude of zero to be around the limit of human perceptibility. Third, he specified the Wood–Anderson seismograph as the standard instrument for producing seismograms. Magnitude was then defined as 'the logarithm of the maximum trace amplitude, expressed in microns', measured at a distance of 100 km. The scale was calibrated by defining a magnitude 0 shock as one that produces (at a distance of 100 km) a maximum amplitude of 1 micron (1 µm, or 0.001 millimeters) on a seismogram recorded by a Wood–Anderson torsion seismograph. Finally, Richter calculated a table of distance corrections, since for distances less than 200 kilometers the attenuation is strongly affected by the structure and properties of the regional geology.
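This definition is often written compactly as ML = log10(A) − log10(A0(δ)), where A is the maximum Wood–Anderson trace amplitude and −log10(A0) is Richter's empirical distance correction, which is zero at the 100 km reference distance when A is expressed in microns. The short Python sketch below only illustrates that arithmetic; the correction values in the table are hypothetical placeholders for this example, not Richter's actual tabulated corrections.

```python
import math

# Sketch of the local-magnitude arithmetic: M_L = log10(A) + correction(distance),
# where A is the maximum Wood-Anderson trace amplitude in microns and the
# correction stands in for -log10(A0) from Richter's distance table.
# The values below are placeholders, not the published table.
DISTANCE_CORRECTION = {
    50: -0.4,   # closer than the reference distance -> negative correction
    100: 0.0,   # reference distance: 1 micron at 100 km gives M_L = 0
    200: 0.5,
}

def local_magnitude(amplitude_microns: float, distance_km: int) -> float:
    """Return M_L for a trace amplitude (in microns) at a tabulated distance."""
    return math.log10(amplitude_microns) + DISTANCE_CORRECTION[distance_km]

# Calibration checks from the text: 1 micron at 100 km gives M_L = 0,
# and 1 mm (1000 microns) at 100 km gives M_L = 3.
print(local_magnitude(1.0, 100))     # 0.0
print(local_magnitude(1000.0, 100))  # 3.0
```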
When Richter presented the resulting scale in 1935 he called it (at the suggestion of Harry Wood) simply a 'magnitude' scale. 'Richter magnitude' appears to have originated when Perry Byerly told the press that the scale was Richter's, and 'should be referred to as such.' In 1956 Gutenberg and Richter, while still referring to the 'magnitude scale', labelled it 'local magnitude', with the symbol ML, to distinguish it from two other scales they had developed, the surface wave magnitude (MS) and body wave magnitude (MB) scales.

The Richter scale was defined in 1935 for particular circumstances and instruments; the particular circumstances refer to it being defined for Southern California, and it 'implicitly incorporates the attenuative properties of Southern California crust and mantle.' The particular instrument used would become saturated by strong earthquakes and unable to record high values. The scale was replaced in the 1970s by the moment magnitude scale (MMS, symbol Mw); for earthquakes adequately measured by the Richter scale, the numerical values are approximately the same. Although values measured for earthquakes now are Mw (MMS), they are frequently reported by the press as Richter values, even for earthquakes of magnitude over 8, where the Richter scale becomes meaningless.

The Richter and MMS scales measure the energy released by an earthquake; another scale, the Mercalli intensity scale, classifies earthquakes by their effects, from detectable by instruments but not noticeable, to catastrophic. The energy and effects are not necessarily strongly correlated; a shallow earthquake in a populated area with soil of certain types can be far more intense in its effects than a much more energetic deep earthquake in an isolated area.

Several scales have historically been described as the 'Richter scale', especially the local magnitude ML and the surface wave magnitude Ms. In addition, the body wave magnitude mb and the moment magnitude Mw, abbreviated MMS, have been widely used for decades. Seismologists are also developing newer techniques for measuring magnitude.
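Because these scales are logarithmic in amplitude, the difference between two magnitudes maps directly to an amplitude ratio, and, via the standard Gutenberg–Richter energy relation (an assumption here, not stated on this page), to an approximate ratio of radiated energy. A minimal sketch, assuming amplitude ratio = 10^ΔM and energy ratio ≈ 10^(1.5·ΔM):

```python
def amplitude_ratio(m1: float, m2: float) -> float:
    """Ratio of trace amplitudes implied by two magnitudes (10x per whole step)."""
    return 10 ** (m1 - m2)

def energy_ratio(m1: float, m2: float) -> float:
    """Approximate ratio of radiated energy, using the conventional 10^(1.5*dM) relation."""
    return 10 ** (1.5 * (m1 - m2))

# One whole-number step: 10x the amplitude, roughly 32x the energy.
print(amplitude_ratio(7.0, 6.0))        # 10.0
print(round(energy_ratio(7.0, 6.0)))    # ~32
```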

[ "Magnitude (mathematics)", "Surface wave magnitude" ]
Parent Topic
Child Topic
    No Parent Topic