Until the last minute, both sides campaigned in Silicon Valley. On Sunday afternoon, September 29, Governor Gavin Newsom made his decision: he vetoed a bill that sought to impose the strictest safeguards on large artificial intelligence models in the United States.
The bill, SB 1047, known as the "Safe and Secure Innovation for Frontier Artificial Intelligence Models Act," had divided the tech world as well as California's political class. Although less demanding than European regulations, it would have made artificial intelligence companies legally responsible for damage caused by their models.
It would have required them to include a "kill switch" to disable their systems if they spun out of control and caused significant harm, such as mass casualties or property damage exceeding $500 million (€448 million). Elon Musk declared himself in favor, unlike Mark Zuckerberg (Meta), Sam Altman (OpenAI) and the pro-Trump venture capitalist Marc Andreessen.
Fear of “a chilling effect on the industry”
Like Musk, several leading figures in AI's history, such as Geoffrey Hinton and Yoshua Bengio, backed the text. "We believe that the most powerful AI models could soon pose serious risks, such as expanded access to biological weapons and cyberattacks on critical infrastructure," Hinton wrote in a letter addressed to the governor and co-signed by dozens of current and former AI industry employees.
During software giant Salesforce's annual conference on September 24 in San Francisco, Newsom had expressed skepticism. Usually quick to praise California's cutting-edge initiatives, particularly on climate or consumer protection, he lamented that the AI safety bill had taken the place of a federal debate on the issue.
He had also voiced fears of "a chilling effect on the industry." "We dominate this field and I don't want to lose that advantage," he said, noting that thirty-two of the world's largest artificial intelligence companies are located in California. On the risks posed by AI, he said he tries to focus on those that are demonstrable rather than hypothetical. The bill targeted systems costing more than $100 million to build. No model has yet reached that threshold, but pessimists believe it could be crossed quickly.