LENGTH MEASUREMENT

Type of applied science: Scientific measurement

Field of study: Metrology

    Length is one of the seven fundamental quantities of physical measurement, and attempts to quantify it have a rich history.  The standardization of length measurement has many applications in science, technology, and commerce; it has facilitated the communication of scientific observations and the trade of goods and services worldwide.

Principal terms:

accuracy: A number that specifies the agreement of the result of measurement with the true value of the measured quantity.

gage: A device for determining the size or shape of an object. Gages are widely used to measure and control the dimensions of objects during manufacture and production.

measure of length: A distance between two points established according to some standard or reference.

metrology: The branch of science that deals with quantifying the measures of physical quantities.

precision: The repeatability of the measurement process. It refers to how well several measurements of the same quantity agree with each other.

primary standard: An unchanging physical construction or realization of a unit of measure. Primary standards provide the absolute basis of reference of the particular unit and are used to pass on the unit to all who need it.

unit: A quantity, value, or dimension adopted as a standard of measurement. Generally, a unit is fixed by definition. For example, the units of pound, bushel, and chain are used to express a fixed weight, capacity or volume, and length respectively.

Overview of the Technology

    The concept of length is intimately connected to the notion of an event. An event is "a happening" at one point in space and one point in time. A complicated phenomenon, such as the collision between two billiard balls, can be analyzed as a series or succession of individual events. The first event in this phenomenon is the initial contact between the objects. The collision process spreads out over many points in space and time as it evolves.

    To determine the position of an event in space, a reference position or origin must be established. The origin is the location of the "zero position" of the coordinate system. A coordinate system may be imagined to consist of a three-dimensional (3-D) grid of lines surrounding this origin. The location in space of a particular event may simply be read off this grid of lines. Many phenomena and events are conveniently represented using a 3-D Cartesian or rectangular coordinate system. This coordinate system is based on three mutually perpendicular lines passing through the origin. Two other coordinate systems are often used in physical measurement problems; these are the spherical and cylindrical coordinate systems.
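    The relationship among these three coordinate systems can be illustrated with a short computational sketch. The function names and the sample point below are purely illustrative and are not part of the article; they simply show the same event location read off the Cartesian, spherical, and cylindrical grids.

import math

def cartesian_to_spherical(x, y, z):
    # Return (r, theta, phi): radial distance, polar angle measured from
    # the +z axis, and azimuthal angle in the x-y plane.
    r = math.sqrt(x**2 + y**2 + z**2)
    theta = math.acos(z / r) if r != 0 else 0.0
    phi = math.atan2(y, x)
    return r, theta, phi

def cartesian_to_cylindrical(x, y, z):
    # Return (rho, phi, z): radial distance in the x-y plane,
    # azimuthal angle, and height.
    return math.hypot(x, y), math.atan2(y, x), z

# The same event location expressed in all three coordinate systems
x, y, z = 1.0, 2.0, 2.0
print(cartesian_to_spherical(x, y, z))     # (3.0, 0.841..., 1.107...) in radians
print(cartesian_to_cylindrical(x, y, z))   # (2.236..., 1.107..., 2.0)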

    It is an observational fact that three coordinates are necessary to locate the position of an event in the space in which we live. In addition, it appears the space around us obeys a Euclidean geometry; that is, the postulates and theorems of Euclid are valid in our world. One important theorem proved by Euclid asserts that the sum of the interior angles of a plane triangle in space is equal to 180 degrees. It is important to emphasize that the assumed three-dimensional character of space and its Euclidean geometry are based only on empirical evidence. Empirical facts are based on observation and experimentation; they do not explain the causes of what is observed, they merely describe it. The validity of Euclid's triangle theorem has been demonstrated for plane triangles on or near the Earth to within an uncertainty of a few tenths of a second of arc. Thus, it appears that the ideas of Euclidean geometry are good to an accuracy of a few parts per million.

    The meter is the fundamental unit of length in the Systeme International (SI), or metric, system of units. The meter was originally defined by the French Academy of Sciences in 1791 as one ten-millionth of the distance from the North Pole to the Equator along the meridian line passing through Paris. As one might imagine, this definition of the meter was difficult to realize because of the arduous nature of the measurement. Nevertheless, the geodetic measurements were completed in 1798, and in 1799 a platinum prototype meter bar was constructed and housed in the Archives of the Republic in Paris, France. Several iron copies of this standard meter were made, and one, the "committee meter," was brought to the United States by Ferdinand Rudolph Hassler, first Superintendent of the Coast Survey and Weights and Measures. This bar served as the metric length standard in the United States throughout most of the 1800s.

    In 1889, a new physical realization of the meter, the International Prototype Meter, was constructed and legalized by the 1st General Conference on Weights and Measures. This new realization, although constructed to agree in length with the 1799 bar, was an arbitrary standard. That is, it was not required to conform to any natural or absolute standard; rather, it was used to define the unit of length called the meter. The 1889 legislation defined the meter as the distance, at 0 degrees Celsius, between the center portions of two lines graduated on the polished surface of a particular bar of platinum-iridium alloy. Platinum-iridium was used because it is hard, resists oxidation, takes a very high mirror polish, and has a low coefficient of thermal expansion. The original International Prototype Meter is housed at the Bureau International des Poids et Mesures (BIPM) in Sevres, France.

    Twenty-nine copies of the International Prototype Meter were also constructed at the BIPM and distributed to other countries. Prototype Meter No. 27 was given to the United States and is now housed at the National Institute of Standards and Technology (NIST) in Gaithersburg, Maryland. Accurate comparisons between the secondary standards and objects of unknown length could be made using a longitudinal optical comparator. The precision of optical comparisons is limited to approximately one part in ten million (0.1 ppm). As science, technology, and commerce advanced in the twentieth century, this level of precision became inadequate.

    The pioneering work of the American scientist Albert Michelson, involving optical interferometric techniques, paved the way for a new and more precise definition of the meter. An interferometer is a device used to measure accurately the wavelength of light. In 1960, the 11th General Conference on Weights and Measures defined the meter to be 1,650,763.73 times the wavelength (in vacuum) of a particular orange light emitted by the isotope of the element krypton with atomic mass 86 (krypton-86).
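    The wavelength implied by the 1960 definition follows from simple arithmetic, as the following short sketch illustrates (an illustration only; the numerical value is simply the reciprocal of the figure quoted above):

# One meter equals 1,650,763.73 Kr-86 wavelengths, so one wavelength is the reciprocal.
wavelengths_per_meter = 1_650_763.73
wavelength_m = 1.0 / wavelengths_per_meter
print(f"Kr-86 wavelength: {wavelength_m * 1e9:.2f} nm")   # about 605.78 nm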

    The advantage of the krypton standard is obvious. Since all Kr-86 atoms are alike, this atomic length standard is universally accessible to any suitably equipped scientific laboratory. It is not necessary to keep a "prototype krypton-86 atom" at the BIPM for reference, as was the case with the 1889 International Prototype Meter. Unknown lengths could be compared with the standard by the use of optical interferometry. However, the wavelength of the emitted light is slightly uncertain because of quantum mechanical effects that occur in the krypton atom during the emission process. These uncertainties limit the absolute precision of the krypton-86 length standard to the 1 to 3 parts per billion level. Still, this is a clear improvement over the 1889 length standard.

    The current standard of length was defined by the General Conference on Weights and Measures in 1983. This standard of length is quite different from the 1889 and 1960 standards, since it is defined in terms of time. The 1983 standard defines the meter to be the distance traveled by light in vacuum in 1/299,792,458 of a second. The precision of this length standard is approximately one part in ten trillion, a factor of one million improvement over the 1889 standard. The basic rationale for this standard is the precision of time interval measurement provided by the current generation of atomic clocks. This level of precision in time measurement, combined with the assumed constancy of the speed of light in vacuum, allows this "natural" standard of length to be defined. The constancy of the speed of light in vacuum was a fundamental postulate of Albert Einstein's theory of special relativity. A consequence of this definition of the meter is that the speed of light in vacuum is defined to have a value of exactly 299,792,458 m/s. This result is in excellent accord with the best experimental determinations of the speed of light in vacuum, (299,792,458 +- 1) m/s.
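    Because the definition fixes the speed of light exactly, converting a light travel time into a distance becomes a one-line calculation, as the following sketch shows (the names here are illustrative, not part of the standard):

C_VACUUM = 299_792_458          # speed of light in vacuum, m/s, exact by definition

def light_travel_distance(seconds):
    # Distance, in meters, that light travels in vacuum in the given time.
    return C_VACUUM * seconds

print(light_travel_distance(1 / 299_792_458))   # 1.0 meter, by construction
print(light_travel_distance(1e-9))              # about 0.3 meter per nanosecond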

    Practical realizations of the 1983 length standard using frequency-stabilized lasers allow the change in position (that is, the displacement) of an object or event in the millimeter range to be measured with an uncertainty of one picometer, the change in position in the meter range to ten nanometers, and the "length" (the distance between the endpoints) of a sub-meter-sized object to an uncertainty of one nanometer. Measurement of the length of objects to a lower uncertainty is precluded by slight deformations of the object being measured by the measuring apparatus.
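    The basic idea behind laser interferometric displacement measurement can be sketched briefly. In a Michelson-type arrangement, moving the measurement mirror by half a wavelength shifts the interference pattern by one fringe; the relation below and the helium-neon wavelength used in it are standard illustrative assumptions, not details given in this article.

WAVELENGTH_HE_NE = 632.8e-9     # meters; typical red helium-neon laser line

def displacement_from_fringes(fringe_count, wavelength=WAVELENGTH_HE_NE):
    # Mirror displacement corresponding to a counted number of fringes:
    # each fringe corresponds to a half-wavelength of mirror travel.
    return fringe_count * wavelength / 2.0

print(displacement_from_fringes(10_000))    # about 3.16e-3 m, roughly 3 mm of travel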

Uses of the Technology

    One of the most important uses of length measurement technology in industry involves the use of measuring instruments and gages to determine and control the dimensions of manufactured parts. Mass production of goods requires complex systems of metrology to evaluate critical dimensions. Components for a complex manufactured object, such as an automobile, may be produced at several locations and then brought together for final assembly. The standardization and control of length measurement is absolutely critical in this context so that all the various parts will mesh with each other as designed.

    As an example of the demands placed on instruments used for length measurement, imagine that it is desired to manufacture a part to some dimension within a tolerance of one thousandth of an inch (.001"). In order that the part conform to this level of tolerance, the instrument or gage that inspects it must be accurate to one ten-thousandth of an inch (.0001"). The precision instrument that checks the gage must be accurate to one hundred-thousandth of an inch (.00001"). The working gage blocks that are used to set the precision instrument must be accurate to four millionths of an inch (.000004"), and the master blocks that calibrate the working blocks must be accurate to one millionth of an inch (.000001"). In addition, if this calibration hierarchy is to have an absolute base of reference, the dimensions of the master gage blocks must be derived from the primary length standard or other standards derived from it. In the United States, master blocks must be certified to length standards ultimately traceable to NIST.
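    The chain of accuracies described above can be restated compactly in a short sketch. The level labels are illustrative; the tolerance values are those from the example in the preceding paragraph.

calibration_chain_inches = [
    ("manufactured part tolerance",   0.001),
    ("inspection gage accuracy",      0.0001),
    ("precision instrument accuracy", 0.00001),
    ("working gage block accuracy",   0.000004),
    ("master gage block accuracy",    0.000001),
]

for level, tolerance in calibration_chain_inches:
    print(f"{level:32s} +/- {tolerance:.6f} inch")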

    Most manual instruments used for length measurement in the manufacturing context are one of three basic types: line graduated measuring instruments, fixed gages, and gage blocks. Line graduated measurement instruments are geometric objects with graduation spacing representing known distances. These instruments may be used to measure distances within their capacity range to some level of sensitivity or discrimination. The discrimination level of a line graduated instrument is related to the smallest increment of the scale graduation. Examples of line graduated measuring instruments include: line graduated rules and tapes, line graduated bar standards, caliper gages, micrometer gages, diffraction gratings, and line graduated angle measuring instruments. Measurement errors that may occur when using line graduated measurement instruments fall into two broad classes: instrument limitations and observational errors. Instrument limitations include geometric deficiencies resulting from flatness or parallelism errors and inaccuracies of scale graduations. Observational errors include alignment deficiencies and parallax errors. Observational errors can in principle be eliminated from the measurement process with proper instrument design and measurement technique.

    The next general class of manual measurement instruments is fixed gages. A fixed gage is either a direct or a reverse physical replica of the object dimension to be measured. A fixed gage may be constructed to represent the part dimension at its desired or nominal size (the master gage), or it may be used to check the limit conditions resulting from the tolerance specified on the dimension (the limit gage). Fixed gages are very useful for inspecting the dimensions of manufactured parts. They are critical to the success of the interchangeable-part manufacturing system. Examples of fixed gages include: limit length gages, adjustable limit snap gages, cylindrical limit gages, taper gages, multiple dimension gages, screw thread gages, and contour gages. Some advantages associated with using fixed gages in the production environment include the following:

i) Fixed gages are free from errors due to the drift of the original adjustment. However, they are not free of errors due to the use and subsequent wear of the gage.

ii) Limit gages provide a definite yes/no answer to the acceptability of the inspected part.

iii) Fixed gages can be transported to the place needed and usually require no additional set-up.

iv) The cost of fixed gages is relatively modest and thus makes this type of inspection economical.

    Limit gages are made to sizes identical to the design limit sizes of the dimension to be inspected; that is, the nominal dimension plus or minus the dimension tolerance. A limit gage that deems a part acceptable for assembly is termed a "GO" gage. If the "GO" limit gage can enter or be entered by the part, then the part is acceptable. A limit gage that deems a part unacceptable for assembly is termed a "NO-GO" gage. If a "NO-GO" gage can enter or be entered by the part, then the inspected dimension is out of tolerance and the part may be rejected.
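    The yes/no logic of limit gaging can be sketched schematically for an external dimension, such as a shaft checked with snap gages. The function, sizes, and sign conventions below are illustrative assumptions rather than a prescription from the text.

def limit_gage_check(part_size, nominal, tolerance):
    # GO gage made to the upper limit; the part must pass it.
    # NO-GO gage made to the lower limit; the part must NOT pass it.
    go_size = nominal + tolerance
    no_go_size = nominal - tolerance
    passes_go = part_size <= go_size
    rejected_by_no_go = part_size <= no_go_size
    return passes_go and not rejected_by_no_go

print(limit_gage_check(1.0005, nominal=1.000, tolerance=0.001))   # True: accept
print(limit_gage_check(1.0020, nominal=1.000, tolerance=0.001))   # False: oversize
print(limit_gage_check(0.9980, nominal=1.000, tolerance=0.001))   # False: undersize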

    The design, construction, and dimensioning of fixed gages is a critical application of length measurement technology. Physical properties of the gage material, such as long-term dimensional stability, thermal stability, and wear resistance, must be considered carefully along with the intended application when designing fixed gages. The tolerances on the dimensions of fixed gages must often be at the working gage block or even the master block level of the calibration hierarchy described above, a task that would be nearly impossible without universal standards of length.

    Gage blocks are the third class of manual measurement instruments. A gage block is a length standard with a rectangular, round, or square cross section having flat, parallel, opposing gaging surfaces. Gage blocks are indeed the "master gages" of the machine shop; they are true secondary standards of length and have a calibration that is often traceable to one of the three primary standards of length described in the first section of this article. Since individual gage blocks are often combined to make a standard of specific length, they must meet the following requirements:

i) The individual blocks must be available in sizes needed to construct a set able to achieve any desired size and graduation.

ii) The accuracy of each individual element of the set must be within a known and accepted tolerance limit.

iii) In the built-up combinations, the individual blocks must be attached so closely that the length of the combination is, for all practical purposes, equal to the sum of the lengths of the individual elements.

iv) The attachment of the individual blocks to each other must be firm enough to allow for a reasonable amount of handling of the combination, but should not harm or prevent the reuse of the blocks in any way.

    The technical requirements for gage block sets that meet all four of the requirements listed above are outlined in the first entry of the bibliography. Gage block sets for length measurement exist in both English unit and metric unit versions and are also manufactured in several tolerance grades. For English unit gage blocks with nominal sizes less than one inch, the length tolerances for grades 0.5, 1, 2, and 3 are, respectively: +-1, +-2, +4 to -2, and +8 to -4 millionths of an inch. Notice that the length uncertainty of a one inch grade 0.5 gage block is two parts per million. This is only a factor of 20 short of the ultimate precision of the 1889 International Prototype Meter; grade 0.5 gage blocks are truly master gages!
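    Requirement (i) above, that blocks be combinable into any desired size, can be illustrated with a toy sketch. The miniature block list and the brute-force search below are hypothetical illustrations only; real sets contain eighty or more blocks and are combined using the digit-by-digit selection procedure described in the ASME standard cited in the bibliography.

from itertools import combinations

def build_length(target, block_sizes, max_blocks=4):
    # Brute-force search for the smallest combination of distinct blocks
    # whose lengths sum (within rounding) to the target size.
    for n in range(1, max_blocks + 1):
        for combo in combinations(block_sizes, n):
            if abs(sum(combo) - target) < 1e-7:
                return combo
    return None

# Hypothetical miniature inch block list, for illustration only
blocks = [0.1008, 0.124, 0.600, 3.000, 0.500, 1.000, 0.1001, 0.107]
print(build_length(3.8248, blocks))   # (0.1008, 0.124, 0.6, 3.0)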

    The bilateral nature of the tolerance for each block usually results in the total tolerance for a stack of blocks being much less than the sum of the tolerances for the individual blocks. For example, if 30 one-half inch grade 0.5 blocks were combined, the cumulative tolerance would amount to .000030 inches (30 micro-inches). However, the actual length of this stack would be much closer to 15.000000" than the 15.000030" one would expect if the individual tolerances were additive.
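    The stacking argument above can be illustrated numerically. The worst-case tolerance adds linearly, while a root-sum-of-squares estimate (a common statistical assumption that is not stated in the article) suggests why the bilateral +-1 micro-inch errors tend to cancel in a 30-block stack.

import math

n_blocks = 30
block_tolerance_uin = 1.0      # grade 0.5 block tolerance, +/- 1 micro-inch

worst_case_uin = n_blocks * block_tolerance_uin               # 30 micro-inches
rss_estimate_uin = math.sqrt(n_blocks) * block_tolerance_uin  # about 5.5 micro-inches

print(f"worst case:   +/- {worst_case_uin:.1f} micro-inch")
print(f"RSS estimate: +/- {rss_estimate_uin:.1f} micro-inch")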

    Some applications of grade 0.5 and grade 1 gage blocks in length measurement technology include: providing a readily accessible length measurement standard to those needing it, providing a reference for gage calibration, and calibrating precision measuring instruments. Grade 2 gage blocks are used to check limit gages, set adjustable limit gages, and measure setting gages. Grade 3 gage blocks are used routinely for measurement tasks including: direct measurement of distances between parallel surfaces, checking and adjusting mechanics' measuring tools, and precision layout of workpieces.

Context

    The early role of man as a creator of physical structures and shaper of his environment created the need for dimensional measurement. Body measurements were probably the most convenient references for early length measurement. The cubit, devised by the Egyptians about 3000 BC, is generally regarded as the most important length standard in the ancient Mediterranean world. It appears that the cubit represented the length of the forearm, from the elbow to the fingertips. The Egyptian cubit was standardized by a master "royal cubit" made of black granite, against which all other cubit sticks used in Egypt were compared. The present system of comparing units of measure with a standard physical realization follows directly from this Egyptian custom. The royal cubit, whose length is 524 millimeters, was subdivided in a complicated way. The smallest division, the digit, represented the width of a finger. There are 28 digits in a royal cubit. Four digits equaled a palm and five digits a hand. Twelve digits equaled a small span and 14 digits a large span. Twenty-four digits was a small cubit. The digit was in turn subdivided into parts, the smallest being 1/16 part of a digit, or 1/448 of a royal cubit. The accuracy of the royal cubit and the Egyptian system of length standards is evident in the Great Pyramids. The lengths of the bases of the sides of the Great Pyramid of Giza vary by less than 0.05 percent from the mean length of 230.364 meters, a truly remarkable feat.

    The historical progression of units, on the European continent at least, has followed a generally westward direction. The units of the ancient nations, such as Egypt, most likely traveled as a result of trade to the Greek and then the Roman empires, then to Britain via the Roman conquest, and finally to America.

    Today, the total standardization of measurement throughout the world still has not been realized. However, in the industrial world, systems of metrology conform to either the inch or the metric system, whose bases are precisely defined primary measurement standards. The standardization of measurement allows individual countries to produce and consume goods and services in the world economy and to exchange important technological and scientific information worldwide, an obvious benefit to all mankind.

Bibliography

The American Society of Mechanical Engineers. Precision Inch Gage Blocks for Length Measurement (Through 20 inches). New York: The American Society of Mechanical Engineers, 1974.

This small book specifies the American National Standard for gage blocks up to and including 20" in length. This standard is of critical importance to industry and commerce since gage blocks are the most widely used transfer standards for length measurement. This book specifies the physical properties, tolerance grades, flatness, parallelism, and surface texture requirements for gage blocks.

American Society of Tool and Manufacturing Engineers. Handbook of Industrial Metrology. Englewood Cliffs, NJ: Prentice-Hall Inc, 1967.

This book is an extensive reference on principles, techniques, and instrumentation design and applications for physical measurement in the manufacturing industries. Topics ranging from mathematical concepts of metrology and general principles of measurement to methods of measuring gear and screw thread forms are discussed. Contains many illustrations and photographs of measuring instruments.

Cochrane, Rexmond C. Measurement for Progress: A History of the National Bureau of Standards. Washington, D.C.: U.S. Department of Commerce, 1966.

A historical account of the role of the National Bureau of Standards in developing and providing measurement standards. Gives many examples of the uses of standards in commerce and industry. This book also includes a very informative appendix that describes the French origin of the metric system and some details of the construction of the 1889 meter and kilogram primary standards.

Farago, Francis T. Handbook of Dimensional Measurement. 2nd ed. New York: Industrial Press Inc, 1982.

A comprehensive and readable guide to advanced dimensional measurement technology. This book contains both a theoretical discussion and practical information relating to the tools and techniques of length measurement.

Kibbe, Richard R., et al. Machine Tool Practices. New York: John Wiley & Sons, 1979.

This is an introductory textbook for beginning machinists. Section C of this text, entitled "Dimensional Measurement", discusses the use of various measuring instruments, such as: steel rules, vernier calipers, micrometer instruments, and gage blocks in a manufacturing or production context.

Sydenham, P.H. Measuring Instruments: Tools of Knowledge and Control. Stevenage, UK: Peter Peregrinus Ltd., 1979.

A largely historical account of the design, development, construction, and use of measuring instruments by man. This book traces this subject from ancient times to the mid-twentieth century.

Zebrowski, E. Jr. Fundamentals of Physical Measurement. North Scituate, MA: Duxbury Press, 1979.

A concise introduction to the fundamental concepts of units and standards in relation to measurement theory.

Copyright 1993 Ben Shaevitz and Salem Press

