
Sam Altman’s ouster shows Biden isn’t handling AI properly

What happened with Sam Altman behind the scenes? Neither OpenAI nor Microsoft has provided all the answers. The industry deserves more congressional scrutiny.

ChatGPT developer OpenAI announced last week that it had fired CEO Sam Altman, citing a loss of confidence by the board, only to see him return to the company after 90% of OpenAI staffers threatened to resign. The firing set off a flurry of offers from rival companies willing to match OpenAI salaries in an attempt to lure away top-tier talent.

The debacle, and the lack of transparency surrounding it, highlighted the need to regulate AI development, particularly when it comes to security and privacy. Companies are expanding their artificial intelligence divisions rapidly, and a reshuffling of talent could propel one company ahead of its competitors and beyond the reach of existing laws. While President Joe Biden has taken steps to that effect, he has relied on executive orders, which do not require input from Congress. Instead, they depend on agency bureaucrats to interpret them, and they could be reversed when a new president is inaugurated.

Biden this year signed an executive order on “safe, secure, and trustworthy artificial intelligence.” It directed AI companies to “protect” workers from “harm,” presumably in reference to the potential loss of their jobs. The order also tasked the Office of Management and Budget (OMB) and the Equal Employment Opportunity Commission (EEOC) with, in part, establishing governance structures within federal agencies, and it asked the Federal Trade Commission (FTC) to evaluate whether it has the authority “to ensure fair competition in the AI marketplace and to ensure that consumers and workers are protected from harms that may be enabled by the use of AI.”

Biden’s executive orders are not going to last long

The fundamental problem with an approach driven by executive fiat is its fragility and limited scope. As evidenced by the SEC and CFTC’s (largely unsuccessful) attempts to classify cryptocurrencies as securities, tasking agencies with writing the rules can cause confusion and apprehension among investors, and the resulting regulations remain open to interpretation by the courts.


Policies developed by agencies without legislative backing also lack permanence. While agency rulemaking does include a period for public input, the legislative process gives consumers of artificial intelligence and digital assets a stronger voice and helps produce laws that address the actual problems users face, rather than problems invented by ambitious bureaucrats.

Biden’s failure to confront the complex ethical implications of deploying AI at mass scale is dangerous; concerns such as algorithmic bias, surveillance and privacy invasion are barely being addressed. Those issues should be taken up by Congress, which is made up of officials elected by the people, rather than by agencies composed of appointees.


Without the rigorous debate Congress must undertake to pass a law, there is no guarantee of legislation that promotes security and privacy for everyday users. Specifically, users of artificial intelligence need control over how this automated technology uses and stores their personal data. That concern is particularly acute in the field of AI, where many users do not understand the underlying technology or the serious security risks that come with sharing personal information. We also need laws that ensure companies conduct risk assessments and maintain their automated systems responsibly.

Reliance on regulations enacted by federal agencies will ultimately lead to confusion and to consumers distrusting artificial intelligence. That is precisely what played out with digital assets after the SEC’s lawsuits against Coinbase, Ripple Labs and other crypto firms, which made some investors apprehensive about their involvement with crypto companies. A similar scenario could unfold in AI, with the FTC and other agencies suing AI companies and tying vital issues up in the court system for years to come.

It’s imperative that Biden engage Congress on these issues instead of hiding behind the executive branch. Congress, in turn, must rise to the occasion, crafting legislation that reflects the concerns and aspirations of a diverse set of stakeholders. Without such collaborative efforts, the United States risks repeating the pitfalls it experienced with digital assets, potentially lagging behind other nations and driving innovation elsewhere. More importantly, the security and privacy of American citizens, as well as of many people around the globe, are in jeopardy.

John Cahill is an associate in national law firm Wilson Elser’s White Plains, N.Y., office. John focuses his practice on digital assets, and ensures that clients comply with current and developing laws and regulations. He received a B.A. from St. Louis University and a J.D. from New York Law School.

This article is for general information purposes and is not intended to be and should not be taken as legal or investment advice. The views, thoughts, and opinions expressed here are the author’s alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.
