
Microsoft whistleblower sounds alarm on offensive, harmful imagery generated with help of OpenAI tool

By Matt O'Brien and The Associated Press
March 6, 2024, 3:08 PM ET
A Copilot page showing the incorporation of AI technology is shown in London, Tuesday, Feb. 13, 2024. (AP Photo/Alastair Grant, File)

A Microsoft engineer is sounding alarms about offensive and harmful imagery he says is too easily made by the company’s artificial intelligence image-generator tool, sending letters on Wednesday to U.S. regulators and the tech giant’s board of directors urging them to take action.

Shane Jones told The Associated Press that he considers himself a whistleblower and that he also met last month with U.S. Senate staffers to share his concerns.

The Federal Trade Commission confirmed it received his letter Wednesday but declined further comment.


Microsoft said it is committed to addressing employee concerns about company policies and that it appreciates Jones’ “effort in studying and testing our latest technology to further enhance its safety.” It said it had recommended he use the company’s own “robust internal reporting channels” to investigate and address the problems. CNBC was first to report about the letters.

Jones, a principal software engineering lead, said he has spent three months trying to address his safety concerns about Microsoft’s Copilot Designer, a tool that can generate novel images from written prompts. The tool is derived from another AI image-generator, DALL-E 3, made by Microsoft’s close business partner OpenAI.

“One of the most concerning risks with Copilot Designer is when the product generates images that add harmful content despite a benign request from the user,” he said in his letter addressed to FTC Chair Lina Khan. “For example, when using just the prompt, ‘car accident’, Copilot Designer has a tendency to randomly include an inappropriate, sexually objectified image of a woman in some of the pictures it creates.”

Other harmful content involves violence as well as “political bias, underaged drinking and drug use, misuse of corporate trademarks and copyrights, conspiracy theories, and religion to name a few,” he told the FTC. His letter to Microsoft urges the company to take the tool off the market until it is safer.

This is not the first time Jones has publicly aired his concerns. He said Microsoft at first advised him to take his findings directly to OpenAI, so he did.

He also publicly posted a letter to OpenAI on Microsoft-owned LinkedIn in December, leading a manager to inform him that Microsoft’s legal team “demanded that I delete the post, which I reluctantly did,” according to his letter to the board.

In addition to the U.S. Senate’s Commerce Committee, Jones has brought his concerns to the state attorney general in Washington, where Microsoft is headquartered.

Jones told the AP that while the “core issue” is with OpenAI’s DALL-E model, those who use OpenAI’s ChatGPT to generate AI images won’t get the same harmful outputs because the two companies overlay their products with different safeguards.

“Many of the issues with Copilot Designer are already addressed with ChatGPT’s own safeguards,” he said via text.

A number of impressive AI image-generators first came on the scene in 2022, including OpenAI’s DALL-E 2, the second generation of its image-generator. That — and the subsequent release of OpenAI’s chatbot ChatGPT — sparked public fascination that put commercial pressure on tech giants such as Microsoft and Google to release their own versions.

But without effective safeguards, the technology poses dangers, including the ease with which users can generate harmful “deepfake” images of political figures, war zones or nonconsensual nudity that falsely appear to show real people with recognizable faces. Google has temporarily suspended its Gemini chatbot’s ability to generate images of people following outrage over how it was depicting race and ethnicity, such as by putting people of color in Nazi-era military uniforms.
