Encode, the nonprofit that co-sponsored California’s ill-fated AI safety bill, SB 1047, has requested permission to file an amicus brief in support of Elon Musk’s motion for an injunction to halt OpenAI’s transition to a for-profit company.
In a proposed brief filed in the U.S. District Court for the Northern District of California on Friday afternoon, counsel for Encode said that converting OpenAI to a for-profit would “undermine” the firm’s mission to “develop and deploy … transformative technology in a way that is safe and beneficial to the public.”
“OpenAI and its CEO, Sam Altman, claim to be developing technology to transform society, and these claims should be taken seriously,” the brief reads. “If the world is truly on the cusp of a new era of artificial general intelligence (AGI), then the public has a vested interest in that technology being controlled by a public charity legally bound to prioritize safety and public benefit rather than an organization focused on generating financial returns for a privileged few investors.”
In a statement, Sneha Revanur, founder and president of Encode, accused OpenAI of “internalizing [AI’s] profits but externalizing the consequences for all of humanity.”
Encode’s brief has received support from Geoffrey Hinton, an AI pioneer and 2024 Nobel laureate, and Stuart Russell, a professor of computer science at UC Berkeley and director of the Center for Human Compatible AI.
“OpenAI was founded as a non-profit organization explicitly focused on safety and made a number of safety-related promises in its charter,” Hinton said in a press release. “It received numerous tax and other benefits from its non-profit status. Allowing it to tear all of this up when it becomes inconvenient sends a very bad message to other actors in the ecosystem.”
OpenAI was launched in 2015 as a non-profit research lab. But as its experiments became increasingly capital-intensive, it created its current structure, taking on outside investment from VCs and companies including Microsoft.
Today, OpenAI has a hybrid structure: a for-profit side controlled by a non-profit organization, with a “capped profit” share for investors and employees. But in a blog post Friday morning, the company said it plans to begin transitioning its existing for-profit into a Delaware Public Benefit Corporation (PBC), with common stock and the OpenAI mission as its public benefit interest.
OpenAI’s non-profit organization will remain, but will relinquish control in exchange for shares in the PBC.
Musk, an early contributor to the original nonprofit entity, filed suit in November seeking an injunction to stop the proposed change, which has long been in the works. He accused OpenAI of abandoning its original philanthropic mission to make the fruits of its AI research available to everyone, and of depriving rivals of capital — including his own AI startup, xAI — through anticompetitive means.
OpenAI has called Musk’s complaints “baseless” and just a case of sour grapes.
Facebook’s parent company and AI rival Meta is also backing efforts to block the OpenAI conversion. In December, Meta sent a letter to California Attorney General Rob Bonta arguing that allowing the change would have “seismic implications for Silicon Valley.”
Lawyers for Encode said OpenAI’s plans to transfer control of its operations to a PBC would “convert an organization bound by law to ensure the safety of advanced AI into one bound by law to ‘balance’ consideration of any public benefit against the pecuniary interests of [its] shareholders.”
For example, Encode’s counsel notes in the brief that the OpenAI nonprofit has pledged to stop competing with any “value-aligned, safety-conscious project” that comes close to building AGI before it does, but that OpenAI as a for-profit would have less (if any) incentive to honor that pledge. The brief also notes that the nonprofit OpenAI’s board would no longer be able to cancel investors’ equity if needed for safety once the company’s restructuring is complete.
OpenAI continues to experience an exodus of high-level talent, in part due to concerns that the company is prioritizing commercial products at the expense of safety. One former employee, Miles Brundage, a longtime policy researcher who left OpenAI in October, said in a series of posts on X that he worries about OpenAI’s nonprofit becoming a “side thing” that licenses the PBC to operate as a “normal company” without addressing potentially problematic areas.
“OpenAI’s touted fiduciary duty to humanity would evaporate, as Delaware law is clear that directors of a PBC owe no duty to the public at all,” Encode’s brief continues. “The public interest would be harmed by a safety-focused, mission-constrained non-profit organization relinquishing control of something so transformative, at any price, to a for-profit venture with no binding commitment to safety.”
Encode, founded in July 2020 by Revanur, describes itself as a network of volunteers focused on ensuring that the voices of younger generations are heard in conversations about the impacts of AI. Encode has contributed to various pieces of state and federal AI legislation in addition to SB 1047, including the White House AI Bill of Rights and President Joe Biden’s executive order on AI.
Updated December 30, 2024, with statements from Revanur and Hinton.