Nebula XAI

Experience the future of artificial intelligence

Microsoft Uses AI Image Generator to Create an Ad… and Deletes It After People Notice


Microsoft posted an image on X (formerly known as Twitter) to promote indie games. The publisher wrote “walking in an indie wonderlaaand” and asked players to name their favorite indie titles.

The picture showed children in a snowy, wintry landscape. However, careful observers quickly noticed deformed faces and perspective errors. The reason: the image had been generated by artificial intelligence.

AI image generators, mostly based on Stable Diffusion, are all the rage, but they are also heavily criticized for a range of problems. Many legal questions surrounding the technology remain unresolved. Companies like Stability AI, Microsoft, and Midjourney have been sued several times, and the court proceedings are still ongoing. In both Germany and the US, purely AI-generated images are denied the copyright protection that would normally apply to creative works.

Artists worldwide rightfully accuse the companies behind the technology of committing millions of copyright infringements, as the programs are only capable of creating “new” images if they are trained with millions (or in many cases even billions) of copyrighted images. Artists and designers claim, not without justification, that their own works are being used against them, as the AIs can imitate the styles of any artist on request. More and more companies are using inexpensive AI generators to save costs and are laying off human artists as a result. Billion-dollar corporations are becoming even wealthier through the use of AI, while freelance artists are losing their livelihood.

However, the use of AI image generators is also considered an ethical nightmare for other reasons. The world’s largest dataset for training image AIs, LAION-5B (whose creator, the self-described non-profit organization LAION, has already drawn criticism for its practices), was only recently taken offline after Stanford University researchers found more than 3,000 images of child sexual abuse material in it. Previously, images of rape and ISIS executions, as well as private medical records, had been detected. All of these images now serve as training material for AI-generated images with comparable content.


The technology is also not exactly environmentally friendly. According to a report by MIT Technology Review, generating a single AI image requires as much energy as charging an entire smartphone battery.

Users on X reacted to Microsoft’s ad with comments like “My favorite indie game was ‘paying actual artists instead of pushing horrific AI slop you fucking leeches’” (@matt_bii) or “Nothing says ‘we don’t care about indie developers’ like using AI. If you can’t hire an artist to do advertising, I highly doubt you’ll do it with independent developers.” (@NecroKuma3).

And how did Microsoft react? Not at all. However, the image has since been deleted.