Mechanical Testing of Steel
Many tests are available to evaluate the mechanical properties of steel. This section summarizes some laboratory tests commonly used to determine properties required in product specifications. Test specimens can take several shapes, such as bar, tube, wire, flat section, and notched bar, depending on the test purpose and the application. Certain methods of fabrication, such as bending, forming, and welding, or operations involving heating, may affect the properties of the material being tested.
Therefore, the product specifications cover the stage of manufacture at which mechanical testing is performed. The properties shown by testing before the material is fabricated may not necessarily be representative of the product after it has been completely fabricated. In addition, flaws in the specimen or improper machining or preparation of the test specimen will give erroneous results (ASTM A370).
Tension Test of Steel
The tension test (ASTM E8/E8M) on steel is performed to determine the yield strength, yield point, ultimate (tensile) strength, elongation, and reduction of area. Typically, the test is performed at temperatures between 10°C and 35°C.
The test specimen can be either full sized or machined into a shape, as prescribed in the product specifications for the material being tested. It is desirable to use a small cross-sectional area at the center portion of the specimen to ensure fracture within the gauge length. Several cross-sectional shapes are permitted, such as round and rectangular, as shown in Figure 1.
Plate, sheet, round rod, wire, and tube specimens may be used. A 12.5 mm diameter round specimen is used in many cases. The gauge length over which the elongation is measured typically is four times the diameter for most round-rod specimens. Various types of gripping devices may be used to hold the specimen, depending on its shape.
In all cases, the axis of the test specimen should be placed at the center of the testing machine head to ensure axial tensile stresses within the gauge length without bending. An extensometer with a dial gauge or an LVDT is used to measure the deformation of the entire gauge length. The test is performed by applying an axial load to the specimen at a specified rate.
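To make the data reduction concrete, the following is a minimal sketch, with hypothetical measurements, of how the raw readings from a tension test are reduced to the ultimate strength, percent elongation, and percent reduction of area defined above.

```python
# Sketch: reducing raw tension-test measurements to the properties listed
# above. All dimensions and loads below are hypothetical.
import math

d0 = 12.5     # original diameter, mm (standard round specimen)
L0 = 4 * d0   # gauge length, mm (four times the diameter)
df = 8.0      # diameter at the fracture section, mm (assumed)
Lf = 62.0     # gauge length after fracture, mm (assumed)
P_max = 75e3  # maximum axial load, N (assumed)

A0 = math.pi * d0**2 / 4  # original cross-sectional area, mm^2
Af = math.pi * df**2 / 4  # final area at fracture, mm^2

tensile_strength = P_max / A0             # ultimate strength, MPa (N/mm^2)
elongation = (Lf - L0) / L0 * 100         # percent elongation
reduction_of_area = (A0 - Af) / A0 * 100  # percent reduction of area

print(f"Ultimate strength:  {tensile_strength:.0f} MPa")
print(f"Elongation:         {elongation:.1f} %")
print(f"Reduction of area:  {reduction_of_area:.1f} %")
```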
Figure 2 shows a tensile test being performed on a round steel specimen using an LVDT extensometer to measure the deformation. Mild steel has a unique stress–strain relationship (Figure 3).
Here, a linear elastic response is displayed up to the proportional limit. As the stress is increased beyond the proportional limit, the steel will yield, at which time the strain will increase without an increase in stress (actually the stress will decrease slightly). As tension increases past the yield point, strain increases following a nonlinear relation up to the point of failure.
Note that the decrease in stress after the peak does not mean a decrease in strength. In fact, the actual stress continues to increase until failure. The reason for the apparent decrease is that a neck is formed in the steel specimen, causing an appreciable decrease in the cross-sectional area.
The traditional, or engineering, way of calculating the stress and strain uses the original cross-sectional area and gauge length. If the stress and strain are calculated based on the instantaneous cross-sectional area and gauge length, a true stress–strain curve is obtained, which is different from the engineering stress–strain curve (Figure 3).
As shown in Figure 3, for small strain levels the true stress and strain are similar to the engineering stress and strain; the true stress is slightly greater than the engineering stress and the true strain is slightly less than the engineering strain. However, as the strain level increases, especially as the neck is formed, the true stress becomes much larger than the engineering stress, because of the reduced cross-sectional area at the neck.
The necking also causes the true strain to be larger than the engineering strain, since the increase in length in the vicinity of the neck is much larger than the increase in length outside of the neck. This localization increases the true strain greatly because true strain is defined as the accumulation of the change in length measured over an infinitesimal gauge length, εtrue = ∫dL/L = ln(L/L0).
As deformation localizes in the neck, the change in length (the numerator) becomes large while the infinitesimal gauge length (the denominator) stays small, so the ratio, and hence the true strain, increases significantly.
Note that when calculating the true strain, a small gauge length must be used at the neck, since the properties of the material (such as the cross section) at the neck represent the true material properties. For various practical applications, however, the engineering stresses and strains are used, rather than the true stresses and strains.
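Up to the onset of necking, where deformation is uniform and the volume is essentially constant, the two descriptions are related in closed form: σtrue = σeng(1 + εeng) and εtrue = ln(1 + εeng). The following sketch applies these relations to a hypothetical point on an engineering curve; beyond necking they no longer hold, and the true values must be computed from the actual neck geometry.

```python
# Sketch: converting engineering stress and strain to true stress and strain.
# These closed-form relations assume uniform deformation at constant volume,
# so they are valid only up to the onset of necking.
import math

def true_stress(sigma_eng, eps_eng):
    """True stress from engineering values (valid before necking)."""
    return sigma_eng * (1.0 + eps_eng)

def true_strain(eps_eng):
    """True (logarithmic) strain from engineering strain."""
    return math.log(1.0 + eps_eng)

# Hypothetical point on an engineering curve: 450 MPa at 10% strain
s_eng, e_eng = 450.0, 0.10
print(f"true stress = {true_stress(s_eng, e_eng):.0f} MPa")  # ~495 MPa
print(f"true strain = {true_strain(e_eng):.4f}")             # ~0.0953
```

Note how the true stress comes out slightly greater than the engineering stress and the true strain slightly less than the engineering strain, as described above for small strain levels.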
Different carbon-content steels have different stress–strain relationships. Increasing the carbon content in the steel increases the yield stress and reduces the ductility.
Figure 4 shows the tension stress–strain diagram for hot-rolled steel bars with carbon contents ranging from 0.19% to 0.90%. Increasing the carbon content from 0.19% to 0.90% increases the yield stress from 280 to 620 MPa. Also, this increase in carbon content decreases the fracture strain from about 0.27 to 0.09 m/m. Note that the increase in carbon content does not change the modulus of elasticity.
Steel is generally assumed to be a homogeneous and isotropic material. However, in the production of structural members, the final shape may be obtained by cold rolling. This essentially causes the steel to undergo plastic deformations, with the degree of deformation varying throughout the member.
Plastic deformation causes an increase in yield strength and a reduction in ductility. Figure 5 demonstrates that the measured properties vary, depending on the orientation of the sample relative to the axis of rolling. Thus, it is necessary to specify how the sample is collected when evaluating the mechanical properties of steel.
Torsion Test of Steel
The torsion test (ASTM E143) is used to determine the shear modulus of structural materials. The shear modulus is used in the design of members subjected to torsion, such as rotating shafts and helical compression springs. In this test, a cylindrical, or tubular, specimen is loaded either incrementally or continually by applying an external torque to cause a uniform twist within the gauge length.
The amount of applied torque and the corresponding angle of twist are measured throughout the test. Figure 6 shows the shear stress–strain curve. The shear modulus is the ratio of maximum shear stress to the corresponding shear strain below the proportional limit of the material, which is the slope of the straight line between R (a pre-torque stress) and P (the proportional limit).
For a circular cross section, the maximum shear stress (τmax), shear strain (γ), and shear modulus (G) are determined by the equations

τmax = Tr/J
γ = θr/L
G = τmax/γ = TL/(Jθ)

where
T = torque,
r = radius,
J = polar moment of inertia of the specimen about its center, πr⁴/2 for a solid circular cross section,
θ = angle of twist in radians, and
L = gauge length.
The test method is limited to materials and stresses at which creep is negligible compared with the strain produced immediately upon loading. The test specimen should be sound, without imperfections near the surface.
Also, the specimen should be straight and of uniform diameter for a length equal to the gauge length plus two to four diameters. The gauge length should be at least four diameters.
During the test, torque is read from a dial gauge or a readout device attached to the testing machine, while the angle of twist may be measured using a torsiometer fastened to the specimen at the two ends of the gauge length. A curve-fitting procedure can be used to estimate the straight-line portion of the shear stress–strain relationship of Figure 6 (ASTM E143).
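As an illustration of this reduction, the sketch below applies the equations above to a set of hypothetical torque–twist readings for a solid round specimen and fits the straight-line portion through the origin to estimate G.

```python
# Sketch: estimating the shear modulus G from torque-twist readings on a
# solid round specimen, using tau_max = T*r/J and gamma = theta*r/L, with a
# least-squares fit of the linear portion. All readings are hypothetical.
import math

r = 6.25e-3             # specimen radius, m
L = 0.20                # gauge length, m
J = math.pi * r**4 / 2  # polar moment of inertia, m^4

torque = [5.0, 10.0, 15.0, 20.0]          # applied torque, N*m
twist = [0.0053, 0.0106, 0.0158, 0.0211]  # angle of twist, rad

tau = [T * r / J for T in torque]     # maximum shear stress, Pa
gamma = [th * r / L for th in twist]  # shear strain

# Least-squares slope of a line through the origin: G = sum(tau*gamma)/sum(gamma^2)
G = sum(t * g for t, g in zip(tau, gamma)) / sum(g * g for g in gamma)
print(f"G = {G / 1e9:.1f} GPa")  # ~79 GPa for this made-up data
```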
Charpy V Notch Impact Test of Steel
The Charpy V Notch impact test (ASTM E23) is used to measure the toughness of the material, or the energy required to fracture a V-notched, simply supported specimen. The test is used for structural steels in tension members. The standard specimen is 55 × 10 × 10 mm with a V notch at the center of one side, as shown in Figure 7.
Before testing, the specimen is brought to the specified temperature for a minimum of 5 minutes in a liquid bath or 30 minutes in a gas medium. The specimen is inserted into the Charpy V notch impact-testing machine (Figure 8) using centering tongs.
The swinging arm of the machine has a striking tip that impacts the specimen on the side opposite the V notch. The striking head is released from the pretest position, striking and fracturing the specimen.
By fracturing the test specimen, some of the kinetic energy of the striking head is absorbed, thereby reducing the ultimate height the striking head attains. By measuring the height the striking head attains after striking the specimen, the energy required to fracture the specimen is computed. This energy is measured in N·m (joules), as indicated on a gauge attached to the machine.
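The gauge reading follows from a simple energy balance: the energy absorbed in fracture equals the loss in potential energy of the pendulum between its release height and the height it reaches after impact. A minimal sketch, with hypothetical pendulum parameters:

```python
# Sketch: the energy balance behind the Charpy gauge reading. The mass and
# heights below are hypothetical; real machines are also calibrated to
# account for friction and windage losses.
g = 9.81   # gravitational acceleration, m/s^2
m = 27.0   # mass of the swinging arm and striking head, kg (assumed)
h0 = 1.50  # release height of the striking head, m (assumed)
hf = 0.85  # height attained after fracturing the specimen, m (assumed)

absorbed = m * g * (h0 - hf)  # energy absorbed by the specimen, N*m (J)
print(f"absorbed energy = {absorbed:.0f} N*m")  # ~172 N*m
```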
The lateral expansion of the specimen is typically measured after the test using a dial gauge device. The lateral expansion is a measure of the plastic deformation during the test. The higher the toughness of the steel, the larger the lateral expansion.
Figure 9 shows the typical energy (toughness) required to fracture structural steel specimens at different temperatures. The figure shows that the required energy is high at high temperatures and low at low temperatures.
This indicates that the material changes from ductile to brittle as the temperature decreases. The fracture surface typically consists of a dull shear area (ductile) at the edges and a shiny cleavage area (brittle) at the center, as depicted in Figure 10. As the toughness of the steel decreases, due to lowering the temperature, for example, the shear area decreases while the cleavage area increases.
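A common way to summarize a transition curve like Figure 9 is to fit a hyperbolic tangent function to the energy–temperature data and report the mid-transition temperature and shelf energies. The sketch below does this with hypothetical data points; the tanh form is a widely used empirical choice, not the only one.

```python
# Sketch: fitting a hyperbolic tangent curve to Charpy energy vs. temperature
# data to characterize the ductile-to-brittle transition. Data are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def cvn_model(T, A, B, T0, C):
    """A + B*tanh((T - T0)/C): A - B is the lower shelf, A + B the upper
    shelf, T0 the mid-transition temperature, and C the transition width."""
    return A + B * np.tanh((T - T0) / C)

temp = np.array([-60, -40, -20, 0, 20, 40, 60])    # test temperature, deg C
energy = np.array([8, 12, 30, 80, 140, 165, 170])  # absorbed energy, N*m

p0 = [90, 80, 0, 20]  # initial guesses for A, B, T0, C
(A, B, T0, C), _ = curve_fit(cvn_model, temp, energy, p0=p0)
print(f"mid-transition temperature ~ {T0:.0f} deg C")
print(f"upper-shelf energy ~ {A + B:.0f} N*m")
```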
Bend Test of Steel
In many engineering applications, steel is bent to a desired shape, especially in the case of reinforcing steel. The ductility to accommodate bending is checked by performing the semi-guided bend test (ASTM E290). The test evaluates the ability of steel, or a weld, to resist cracking during bending. The test is conducted by bending the specimen through a specified angle and to a specified inside radius of curvature.
When complete fracture does not occur, the criterion for failure is the number and size of cracks found on the tension surface of the specimen after bending. The bend test is made by applying a transverse force to the specimen in the portion that is being bent, usually at mid-length.
Three arrangements can be used, as illustrated in Figure 11. In the first arrangement, the specimen is fixed at one end and bent around a reaction pin or mandrel by applying a force near the free end, as shown in Figure 11(a).
In the second arrangement, the specimen is held at one end and a rotating device is used to bend the specimen around the pin or mandrel, as shown in Figure 11(b).
In the third arrangement, a force is applied in the middle of a specimen simply supported at both ends, Figure 11(c).
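The severity of a bend can be expressed as the tensile strain produced at the outer surface. For a specimen of thickness t bent to an inside radius r, that strain is commonly approximated as t/(2r + t), as in the following sketch with hypothetical dimensions.

```python
# Sketch: approximate maximum (outer-fiber) tensile strain in a bend test,
# taken as t / (2r + t) for thickness t and inside bend radius r.
# Dimensions below are hypothetical.
t = 10.0  # specimen thickness, mm (assumed)
r = 20.0  # inside radius of curvature (pin or mandrel radius), mm (assumed)

strain = t / (2 * r + t)  # strain at the outer (tension) surface
print(f"outer-fiber strain ~ {strain:.2f} mm/mm")  # 0.20
```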
Hardness Test of Steel
Hardness is a measure of a material’s resistance to localized plastic deformation, such as a small dent or scratch on the surface of the material. A certain hardness is required for many machine parts and tools. Several tests are available to evaluate the hardness of materials.
In these tests, an indenter (penetrator) is forced into the surface of the material with a specified load magnitude and rate of application. The depth, or the size, of the indentation is measured and related to a hardness index number. Hard materials result in small impressions, corresponding to high hardness numbers.
Hardness measurements depend on test conditions and are, therefore, relative. Correlations and tables are available to convert the hardness measurements from one test to another and to approximate the tensile strength of the material (ASTM A370). One of the methods commonly used to measure hardness of steel and other metals is the Rockwell hardness test (ASTM E18).
In this test, the depth of penetration of a diamond cone, or a steel ball, into the specimen is determined under fixed conditions. A preliminary load of 10 kg is applied first, followed by an additional load. The Rockwell number, which is proportional to the difference in penetration between the preliminary and total loads, is read from the machine by means of a dial, digital display, pointer, or other device.
Two scales are frequently used, namely, B and C. Scale B uses a 1.588 mm steel ball indenter and a total load of 100 kg, while scale C uses a diamond spheroconical indenter with a 120° angle and a total load of 150 kg. To test very thin steel or thin surface layers, the Rockwell superficial hardness test is used. The procedure is the same as the Rockwell hardness test except that smaller preliminary and total loads are used.
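To show how the reading follows from the depth measurement, the sketch below uses the regular Rockwell relation in which the hardness number is a constant minus the permanent depth increase expressed in scale units of 0.002 mm, with the constant taken as 100 for the diamond (C) scale and 130 for the ball (B) scale; the depths here are hypothetical.

```python
# Sketch: computing a regular Rockwell number from the permanent increase in
# penetration depth between the preliminary and total loads. The scale unit
# is 0.002 mm; depths below are hypothetical.
def rockwell(depth_mm, scale="C"):
    """Rockwell number from permanent depth increase (mm) on scale B or C."""
    n = 100 if scale == "C" else 130  # diamond cone vs. 1.588 mm ball
    return n - depth_mm / 0.002

print(f"{rockwell(0.064, 'C'):.0f} HRC")  # 0.064 mm -> 68 HRC
print(f"{rockwell(0.080, 'B'):.0f} HRB")  # 0.080 mm -> 90 HRB
```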
The Rockwell hardness number is reported as a number, followed by the symbol HR and another symbol representing the indenter and forces used. For example, 68 HRC indicates a Rockwell hardness number of 68 on the Rockwell C scale. Hardness tests are simple, inexpensive, nondestructive, and do not require special specimens.
In addition, other mechanical properties, such as the tensile strength, can be estimated from the hardness numbers. Therefore, hardness tests are very common and are typically performed more frequently than other mechanical tests.
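As an example of such an estimate, a commonly cited rule of thumb for steels relates tensile strength to Brinell hardness at roughly 3.45 MPa per Brinell point; ASTM A370 provides the authoritative conversion tables, so the sketch below is only an approximation.

```python
# Sketch: rule-of-thumb estimate of tensile strength from Brinell hardness
# for steels, TS (MPa) ~ 3.45 * HB. Use the ASTM A370 tables for real work.
def tensile_from_brinell(hb):
    """Approximate tensile strength of steel, in MPa, from Brinell hardness."""
    return 3.45 * hb

print(f"HB 200 -> ~{tensile_from_brinell(200):.0f} MPa")  # ~690 MPa
```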
Ultrasonic Testing of Steel
Ultrasonic testing is a nondestructive method for detecting flaws in materials. It is particularly useful for the evaluation of welds. During the test, a sound wave is directed toward the weld joint and reflected back from a discontinuity.
A sensor captures the energy of the reflected wave and the results are displayed on an oscilloscope. This method is highly sensitive in detecting planar defects, such as incomplete weld fusion, delamination, or cracks.
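In the common pulse-echo arrangement, the location of a reflector follows from the time of flight: the wave travels to the discontinuity and back, so its depth is half the wave speed times the echo time. A minimal sketch, assuming a typical longitudinal wave speed in steel:

```python
# Sketch: locating a reflector by pulse-echo time of flight. A longitudinal
# wave speed of about 5900 m/s is a typical value for steel; the echo time
# below is hypothetical.
v = 5900.0   # longitudinal wave speed in steel, m/s (typical value)
t = 20.0e-6  # measured round-trip (echo) time, s (assumed)

depth = v * t / 2  # distance from the transducer to the discontinuity, m
print(f"reflector depth ~ {depth * 1000:.1f} mm")  # ~59 mm
```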