Current and former employees of artificial intelligence (AI) companies Microsoft-backed OpenAI and Alphabet’s Google DeepMind raised major concerns about the technology on Tuesday (June 4).
In an open letter, the 13 current and former employees of OpenAI and Google DeepMind say the financial motives of AI companies hinder effective oversight.
“We do not believe bespoke structures of corporate governance are sufficient to change this,” the letter added.
The group warns of major risks from unregulated AI and calls for enhanced transparency and accountability measures within the industry to mitigate potential dangers.
“These risks include deepening existing social inequalities, spreading misinformation, and the possibility of losing control over autonomous AI systems, which could have catastrophic consequences, including human extinction,” the letter reads.
The letter points out that AI companies cannot be relied upon to voluntarily share information about the capabilities and limitations of their systems.
“Without effective government oversight, current and former employees are among the few capable of holding these companies accountable,” it continues.
To address these challenges, the coalition calls on AI companies to commit to the following principles:
No retaliation for criticism
AI companies should not enter into or enforce agreements prohibiting criticism or “disparagement” related to risk concerns, nor should they retaliate against such criticism by withholding vested economic benefits.
Anonymous reporting mechanism
Companies should facilitate a verifiably anonymous process for employees to report risk-related concerns to the company’s board, regulators, and an independent organization with relevant expertise.
Culture of open criticism
AI companies should promote an environment where employees can freely raise concerns about technologies without fear of retaliation, provided trade secrets and intellectual property are protected.
Protection for whistleblowers
Companies should not retaliate against employees who share risk-related confidential information publicly, especially if other reporting processes have failed. An adequate anonymous reporting mechanism should be established, but until it exists, employees should retain the right to public disclosure.