Microsoft Employee Warns AI System Could Generate Harmful Images
A Microsoft employee has sent a letter to the US Federal Trade Commission warning that the company’s AI systems, specifically the Copilot Designer tool, may create offensive or inappropriate images. The employee claims the tool frequently generates sexualized images of women and criticizes the company for marketing it as safe, including for children. Having found over 200 examples of concerning images created by Copilot Designer, the employee urged Microsoft to remove the tool from public use or restrict its marketing to adults. Concerns about AI image generators spreading offensive or misleading images have been growing, as highlighted by recent incidents involving AI-generated pornography and historically inaccurate images.