The letter said OpenAI required employees to sign employment agreements that waived their federal rights to whistleblower compensation. The contracts also required employees to obtain the company's prior consent before disclosing information to federal authorities, and OpenAI created no exemption in its non-disparagement clauses for disclosing securities violations to the SEC.
The letter said such an overly broad settlement violated longstanding federal laws and regulations designed to protect whistleblowers who wish to disclose sensitive information about their companies anonymously and without fear of retaliation.
“These contracts send a message that ‘we don’t want our employees talking to federal regulators,’” said one whistleblower, who requested anonymity for fear of retaliation. “I don’t think AI companies can build technology that is safe and in the public interest if they shield themselves from scrutiny and dissent.”
“Our whistleblower policy protects our employees’ protected disclosure rights,” OpenAI spokeswoman Hannah Wong said in a statement. “We also believe that rigorous discussion of this technology is essential, and we have already made significant changes to our termination procedures to remove the anti-disparagement clause.”
The whistleblower's letter comes amid concerns that OpenAI, a nonprofit with an altruistic mission, is putting profits over safety in building the technology. The Post reported Friday that OpenAI rushed the release of its latest AI model, which powers ChatGPT, to meet a May launch date set by company executives, despite employee concerns that the company had failed to adhere to its own security testing protocol, which the company says keeps the AI safe from catastrophic harms such as teaching users how to create biological weapons or helping hackers develop new kinds of cyberattacks. OpenAI spokeswoman Lindsey Held said in a statement that the company "did not shorten the safety process, but we know the launch has been stressful for the team."
Tech companies' strict confidentiality agreements have long troubled workers and regulators. During the #MeToo movement and the nationwide protests over the killing of George Floyd, workers warned that such contracts limited their ability to report sexual misconduct or racial discrimination. Regulators, meanwhile, have worried that the terms could stifle tech employees who might otherwise alert them to misconduct in an opaque industry. The tech sector has been hit particularly hard by allegations that companies' algorithms promote content that harms elections, public health, and child safety.
The rapid development of artificial intelligence has brought those concerns into sharper focus. Policymakers have voiced worries about the power of the tech industry and have called for regulation. In the United States, AI companies operate largely in a legal vacuum, and policymakers say they cannot effectively craft new AI policies without the help of whistleblowers who can explain the potential threats posed by the fast-moving technology.
“OpenAI’s policies and practices appear to have a chilling effect on whistleblowers’ rights and undermine their right to receive adequate compensation for protected disclosures,” Sen. Chuck Grassley (R-Iowa) said in a statement to The Post. “If the federal government is to stay ahead of artificial intelligence, OpenAI’s nondisclosure agreement must change.”
A copy of the letter to SEC Chairman Gary Gensler was also sent to Congress. The Post obtained the whistleblower's letter from Grassley's office.
Formal complaints referenced in the letter were filed with the SEC in June. Stephen Kohn, a lawyer representing the OpenAI whistleblower, said the SEC has responded to the complaint.
It is unclear whether the SEC has launched an investigation. The agency did not respond to a request for comment.
The letter said the SEC must take "swift and aggressive" action to address these illegal agreements, which it said could have implications for the broader AI sector and violate an October White House executive order requiring AI companies to develop the technology safely.
“Central to these enforcement efforts is the recognition that insiders should have the freedom to report concerns to federal authorities,” the letter said. “Employees are best positioned to detect and warn of the types of risks addressed in the Executive Order, and to ensure that AI works to benefit humanity rather than counterproductively.”
Kohn said the agreements threatened employees with criminal prosecution for trade-secret violations if they reported company information to federal authorities. Employees were instructed to keep company information confidential and were threatened with "severe sanctions" for disclosing such information to the government, he said.
“We’re very much in the early stages of AI oversight,” Kohn said. “People need to step up, and OpenAI needs to be open.”
The letter asks the SEC to require OpenAI to produce all employment, termination, and investment agreements containing confidentiality provisions so regulators can ensure they do not violate federal law. Federal regulators should require OpenAI to notify all current and former employees of the violations the company committed and to inform them of their right to report violations of the law to the SEC confidentially and anonymously. The SEC should fine OpenAI for each "improper agreement" under securities law and direct OpenAI to cure the "chilling effect" of its past practices, according to the whistleblower's letter.
A number of tech employees, including Facebook whistleblower Frances Haugen, have filed complaints with the SEC, which created the whistleblower program after the 2008 financial crisis.
San Francisco lawyer Chris Baker said the fight over Silicon Valley's use of NDAs to "monopolize information" has been long-running. In December, he won a $27 million settlement for Google employees over claims that the tech giant used overly strict confidentiality agreements to block whistleblowing and other protected activities. Tech companies, he said, are now increasingly fighting back with clever ways to deter speech.
“Employers are willing to take risks because they realize the cost of a leak can be much higher than the cost of litigation,” Baker said.