Car insurance began to become mandatory in the United States in the 1920s. Massachusetts passed the first compulsory law in 1925, followed by Connecticut, and most other states enacted similar requirements between the 1930s and the 1970s. Nearly all states now require coverage, except...