The retaining ring was invented in Germany nearly 100 years ago and naturally followed the DIN metric standard from its inception. When the retaining ring was first produced in the U.S. a few decades later, it conformed to the inch standard. The U.S. engineers who conceived the inch version considered their design an improvement over the German DIN metric design. However, events in the years that followed proved that there was more value in offering both standards if the U.S. was to meet worldwide demand for retaining rings.
In 1917, Hugo Heiermann worked on cylinders, pistons, and wrist pins used primarily in railroad locomotives for a company in his native Germany. He noticed that securing the wrist pin to the piston to prevent axial movement was a challenge. Traditional methods of accomplishing this often failed, causing scoring and damage to the cylinder wall.
Heiermann believed he had a solution. In 1927, he submitted a patent for a device to be used “…in such a manner that the axial displacement thereof will be permanently prevented.” His solution was a spring ring inserted into a groove “…in such a manner that the said spring ring projects with its end surface to such an extent beyond the annular groove that the wide end surface of the ring forms a sure abutment for the bolt, pin or the like which is to be locked in position.” The patent was granted in May 1930 and the device he proposed became known as a retaining ring, which gained popular acceptance throughout Europe during the next few decades.
Because the retaining ring was virtually unknown in the U.S., it was always manufactured to DIN metric standards. The original metric design called for a thicker ring seated in a shallow groove, the rationale being that an unexpected overload would dislodge the ring from the groove without damaging the shaft or housing.
Then fate intervened and rings underwent a transformation. According to the industry story, the U.S. military had noticed the use of retaining rings in military equipment encountered on the battlefield during World War II. Fascinated by the technology and eager to apply it to their own equipment, the U.S. military sought to set up a producer of retaining rings.
Waldes Kohinoor was already manufacturing bombtail fuses, 20mm anti-aircraft projectiles, zippers, and other equipment for the American war effort at its Long Island City, N.Y., facility. The military persuaded the company to take on the project and in 1942, it successfully produced the tooling needed to manufacture a line of retaining rings under the Waldes Truarc brand.
The company sought to improve on the design, which would now be made to inch standards. Unlike their German counterparts, Waldes engineers sought to make a thinner ring that would fit in a narrower, deeper groove. The logic was that in the event of an overload, the shaft or housing would fail first, thus minimizing damage to the retained components.
Waldes engineers believed they were improving on the original DIN design by making the ring seat into a deeper groove. Obviously, there was contention between the two schools of thought. The issue was brought to a head in the 1970s, when the U.S. was in the middle of an effort to introduce metric standards to align our industries with the rest of the world.
Between 1974 and 1976, three main U.S. producers of retaining rings held a series of meetings in New York City to decide how to meet the competitive challenge of the metric system as it related to retaining rings. The group realized it faced two choices: either support the inch system using “soft conversions” (for example, a 25 mm ring is very close to a 1-inch ring) or capitulate and adopt the metric system.
By going the soft conversion route, the three competitors calculated they would be able to save 75% of the tooling used to produce the rings. So the companies decided to create a line of retaining rings made up of those inch rings that converted satisfactorily to metric dimensions. This became the ANSI Metric line, consisting of the MHO, ME, and MC series, and it seemed to embrace the best of both worlds: a thinner ring seated in a deeper groove that conformed to metric dimensions and tolerances.
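The arithmetic behind a “soft conversion” can be sketched in a few lines. The 1-inch/25 mm pairing comes from the article; the function name and the example call are illustrative assumptions, not actual catalog data:

```python
# Illustrative sketch of a "soft conversion": an inch-series ring size is
# close enough to a round metric size that the same tooling can serve both.

INCH_TO_MM = 25.4

def soft_conversion_error(inch_size, metric_size_mm):
    """Percent difference between an inch ring size and a nearby metric size."""
    exact_mm = inch_size * INCH_TO_MM
    return abs(exact_mm - metric_size_mm) / exact_mm * 100

# A 1-inch ring is exactly 25.4 mm, so calling it a "25 mm" ring
# introduces an error of roughly 1.6%.
print(f"1 in vs 25 mm: {soft_conversion_error(1.0, 25.0):.1f}% difference")
```

Whether that small discrepancy is acceptable depends on the ring's tolerances, which is exactly the judgment the three producers were making.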
Although Robert Slass, President and owner of Rotor Clip at the time, agreed to the ANSI metric line, he didn’t feel it was sufficient to satisfy future customer requirements. The ANSI metric line was limited and only offered three types of retaining rings; worldwide industries were accustomed to DIN metric standards. Plus, Slass knew you couldn’t serve these industries without a DIN metric line of retaining rings.
Slass decided to give customers an alternative and let them choose, and began tooling an entire DIN metric line. It was quite a gamble at the time, but it paid off in the long run. Regardless of the standard, the design engineer now has at his disposal the right ring for the application, whether it needs to conform to DIN metric, ANSI metric, or inch standards.
Guest blog by Joe Cappello, Director of Global Marketing, Rotor Clip.
Roger Davies says
Cut my engineering teeth in the early ’70s. Had we pushed harder to convert to metric then, this discussion would be moot today. No? That said, the part of the “global” thing that really gets me is, why should WE have to comply with Europe when it comes to something like GD&T symbology? I’m a firm believer in, IF IT AIN’T BROKE DON’T FIX IT! There was NOTHING wrong with our symbology! Why did we have to change it!? But, that’s MHO. 🙂
William K. says
The metric system uses units sized to be mathematically convenient, which would possibly be OK if it also allowed for useful ratios of sizes and strengths. But that was never considered in the metric system.
That is why the inch-based system is superior.
And if our country had possessed the courage to forbid the importing of metric-based cars and trucks, we may have been able to win the fight. But a bunch of mechanically ignorant lawmakers just did not understand. They still don’t understand.
Robert Price, C.Mfg.Engr. says
If you read some history on the metric system, you will find that its basic premise was that the meter is one ten-millionth of the distance from the North Pole to the equator along the meridian through Paris, and that the two guys who were commissioned to make the measurements were not exactly the best choice for the job. The point being that the so-called Imperial system has always been criticized by the Europeans as being based on such foolish standards as the length of three barley corns or the middle joint of a King’s thumb. Since the French guys didn’t come anywhere near the correct value, the meter is equally flawed.
And the other argument, that the meter is “decimalized,” is also a red herring. I have been engineering and designing machinery for over 40 years, and all the detail drawings are based on the decimal inch. There are no fractions displayed on the dials of a vertical spindle milling machine or the cross slide of a lathe. And a CNC controller doesn’t care what units you choose; it will convert your units to bits no matter what you use.
Erik says
Wow, I cannot believe the whining after 40 years. The US (and to a lesser extent, the UK) are the only industrialized countries still making heavy use of the Imperial system (well, to be more precise, two significantly different Imperial systems – remember those strange gallons of gas in Canada pre-1970s?). And there are far more people living in an SI world – that’s why the US has to be able to speak SI. That is, unless you want to limit your sales to the USA (and maybe Canada).
There’s more to measurement systems than inches vs mm…
Sure, machine tools use decimal inches, but where is the convenience in describing the height of a 1/4-wave dipole AM radio mast in inches, or the attenuation in a fiber-optic cable in dB/inch? Oh, you have to use sane conversion factors like 12, 36, 5280, or 63360 to get more reasonable units? Sorry, I’ll just work that out in my head. Give me a second…
Don’t get me started on Imperial volume measurements.
What about wire/sheet gauge? Is that British or American gauge?
Quick, what’s the clearance drill for a #3 screw? (hey, I know that’s an oddball size, but I figure enough people have the even sizes memorized)
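The numbered-size arithmetic behind that question can at least be partly automated: the nominal major diameter of a numbered Unified screw follows the standard formula D = 0.060 + 0.013 × N inches. The function name is my own; the clearance drill itself still has to be looked up, since the allowance depends on whether a close or free fit is wanted:

```python
# Nominal major diameter of a numbered (Unified) machine screw, in inches:
# D = 0.060 + 0.013 * N, the standard formula for numbered sizes.
def major_diameter_in(screw_number):
    return 0.060 + 0.013 * screw_number

# A #3 screw is nominally 0.099 in; its clearance drill must be somewhat
# larger, with the exact drill chosen from a chart for the desired fit.
print(f"#3 major diameter: {major_diameter_in(3):.3f} in")
```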
PCB copper thickness described in oz Cu/ft²? See how fast you can compute the current-carrying capacity of a 10 mil, 0.5 oz PCB trace without some reference books and some serious buddy-time with a calculator.
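The oz/ft² complaint is easy to illustrate: even getting from copper weight to physical thickness takes a chain of conversion factors. A minimal sketch, assuming pure copper at 8.96 g/cm³ (the function name is mine; the result lands near the industry nominal of roughly 35 µm, or ~1.4 mil, per 1 oz/ft²):

```python
# Converting PCB copper weight (oz/ft^2) to layer thickness.
OZ_TO_G = 28.3495      # grams per avoirdupois ounce
FT2_TO_CM2 = 929.0304  # cm^2 per square foot
CU_DENSITY = 8.96      # g/cm^3, pure copper

def copper_thickness_um(oz_per_ft2):
    """Thickness in micrometers of a copper layer of the given weight."""
    grams_per_cm2 = oz_per_ft2 * OZ_TO_G / FT2_TO_CM2
    thickness_cm = grams_per_cm2 / CU_DENSITY
    return thickness_cm * 1e4  # cm -> um

print(f"1 oz/ft^2   ~ {copper_thickness_um(1.0):.1f} um")
print(f"0.5 oz/ft^2 ~ {copper_thickness_um(0.5):.1f} um")
```

And that is only the thickness; current-carrying capacity still requires an empirical chart or standard on top of it.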
You cannot completely get away from “fudge factors” irrespective of how you choose your measurement system (anything you choose is just a human construct, after all), but the SI system sure has a lot fewer of them. That’s why it’s more popular. IMHO.