Encode, the nonprofit that co-sponsored California’s ill-fated AI safety bill SB 1047, is seeking permission to file an amicus brief in support of Elon Musk’s injunction to block OpenAI’s conversion to a for-profit company.
In a proposed brief submitted Friday afternoon to the U.S. District Court for the Northern District of California, Encode’s counsel said OpenAI’s conversion to a for-profit would undermine the company’s mission to develop and deploy transformative technology in a way that is “safe and beneficial to the public.”
“OpenAI and its CEO, Sam Altman, claim to be developing society-transforming technology, and those claims should be taken seriously,” the brief reads. “If the world truly is on the cusp of a new age of artificial general intelligence (AGI), then the public has a profound interest in having that technology controlled by a public charity legally bound to prioritize safety and the public interest, not by an organization focused on generating financial returns for a small number of privileged investors.”
OpenAI was founded in 2015 as a nonprofit research organization. But as its research became increasingly capital-intensive, it took on outside investment from venture capital firms and companies including Microsoft, arriving at its current structure.
Currently, OpenAI has a hybrid structure: a for-profit arm controlled by a nonprofit, with “capped profits” for investors and employees. But in a blog post this morning, the company announced plans to begin converting its existing for-profit arm into a Delaware Public Benefit Corporation (PBC), with ordinary shares of stock and the OpenAI mission as its public benefit interest.
OpenAI’s nonprofit would remain, but it would cede control in exchange for shares in the PBC.
Musk, an early contributor to the original nonprofit, filed suit in November seeking an injunction to halt the change, which has long been in the works. He accused OpenAI of abandoning its original philanthropic mission of making the fruits of its AI research available to all, and of using anticompetitive means to deprive rivals of capital, including his own AI startup, xAI.
OpenAI said Musk’s complaints were “baseless” and nothing more than sour grapes.
Meta, Facebook’s parent company and an AI rival, is also supporting efforts to block OpenAI’s conversion. In December, Meta sent a letter to California Attorney General Rob Bonta arguing that allowing the transition would have a “seismic impact on Silicon Valley.”
Encode’s lawyers argued that OpenAI’s plan to transfer control of its operations to a PBC would “convert an organization bound by law to ensure the safety of advanced AI into one bound by law to balance” the public interest against “the pecuniary interests of [its] stockholders.”
For example, Encode’s lawyers note in the brief that OpenAI’s nonprofit has committed to stop competing with any “values-driven, safety-focused” project that comes close to building AGI before OpenAI does, but that OpenAI as a for-profit would have less incentive (if any) to do so. The brief also notes that once the company’s restructuring is complete, the nonprofit OpenAI’s board of directors will no longer be able to cancel investors’ equity if needed for safety reasons.
OpenAI continues to lose high-level talent amid concerns that the company is prioritizing commercial products at the expense of safety. One former employee, Miles Brundage, a longtime policy researcher who left OpenAI in October, wrote in a series of posts on X that he worries OpenAI’s nonprofit will become “ancillary,” effectively granting the PBC license to operate as a “normal company” without addressing potentially problematic areas.
“OpenAI’s touted fiduciary duty to humanity would evaporate, as Delaware law makes clear that the directors of a PBC owe no duty to the public,” Encode’s brief continued. “The public interest would be harmed by a safety-focused, mission-bound nonprofit relinquishing control of something so transformative to a for-profit enterprise with no enforceable commitment to safety.”
Founded in July 2020 by high school student Sneha Revanur, Encode describes itself as a network of volunteers focused on ensuring the voices of young people are heard in conversations about AI’s impact. Beyond SB 1047, Encode has contributed to a variety of state and federal AI policy efforts, including the White House’s AI Bill of Rights and President Joe Biden’s executive order on AI.