“The use of the Drunkometer assures an honest prosecution and eliminates the chance of error under the Drunk Motor Law,” a Detroit police official stated. Commentators often remarked that science had now entered the field. As a New York Times reporter wrote, “Science is replacing guesswork in obtaining evidence in drunken-driving cases.”
Lerner, Barron H. (2011-09-22). One for the Road: Drunk Driving since 1900 (pp. 24-25).
Since the invention of the automobile, police had searched for a device that could relatively inexpensively and scientifically determine whether the person driving was impaired. In part, as Barron Lerner describes in his book, this was a search designed to introduce objectivity into the question of whether the person was impaired. Throughout the history of drunk driving, police have conducted various tests – “field sobriety tests” – that are designed to determine whether someone is impaired based on their speech patterns, actions, or performance on these tests.
For instance, many police officers would have drivers repeat the phrase “Methodist Episcopal” to listen for a drunken lisp or slurring of the speech. But given the subjectivity of these tests, they have always been subject to vigorous challenge by criminal defense lawyers, who can show that the person didn’t actually perform that poorly, that the instructions were given incorrectly, or that the tests don’t show what they purport to show.
In the 1910s, a Swedish physician developed a method of having police sample blood from a driver’s fingertip to determine impairment levels. In the 1930s, an American biochemist developed the Drunkometer, a device into which the driver would blow. The driver’s breath would fill a balloon that was then connected to chemicals that would change color according to the person’s level of intoxication.
Lerner explains that at an early stage, scientists agreed that relatively low levels of alcohol in the blood could cause impairment, although the exact range was subject to some disagreement. Scandinavian countries established fairly restrictive BAC requirements in the 1930s of between .05 and .08. These were among the first per se laws – meaning that the results of a chemical analysis showing a BAC above a specified threshold were grounds for a finding of guilt.
But the United States took a more lenient approach, establishing three ranges and requiring no national standards until the 1980s. The first range of below .05 was basically a “not guilty” range where there would be presumptively no prosecution for driving while impaired.
The second range, .15 and above, meant that there should be prosecution for DWI.
In the middle range of .05 to .15, prosecution was warranted only if there were other definitive signs of behavior that confirmed the impairment.
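The three-range scheme amounts to a simple threshold rule, which can be sketched as follows (a minimal illustration only; the function name and labels are hypothetical, not drawn from any statute):

```python
# Sketch of the early U.S. three-range BAC scheme described above.
# The .05 and .15 thresholds come from the text; names and labels
# are illustrative, not statutory language.

def classify_bac(bac: float) -> str:
    """Map a blood alcohol concentration to the early three-range scheme."""
    if bac < 0.05:
        # Below .05: presumptively no prosecution for driving while impaired.
        return "presumptively not guilty"
    elif bac < 0.15:
        # Middle range: prosecute only with other definitive signs of impairment.
        return "corroboration required"
    else:
        # .15 and above: prosecution for DWI warranted.
        return "prosecute"


# Example: a driver measured at .10 falls in the middle range.
print(classify_bac(0.10))  # corroboration required
```

The wide middle band is what made the American approach so permissive: a chemical result alone settled nothing between .05 and .15.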
This created a wide band within which impaired drivers might escape prosecution or conviction, reflecting a permissive attitude toward how drunk driving should be prosecuted.