AI Images: Friend or Foe for Creatives?

“People assume that computers are smarter than people. They aren’t. They are just larger amounts of data running faster,” said Ada Lovelace, a 19th-century mathematician and writer who is widely considered the first computer programmer for her work on Charles Babbage’s Analytical Engine, a theoretical model that predated the modern computer. 

While new AI models can churn out photorealistic imagery at remarkable speeds, machine learning algorithms often fall short in capturing the essence of human creativity and intuition.


Big tech and social media companies like Meta and Alphabet have introduced generative AI tools to enhance advertising strategies for small businesses. These tools aim to produce more effective and persuasive ads by automating tasks like text variation, background generation, and image cropping. Amazon sought to address the costly and challenging task of creating clickable “lifestyle” ad imagery for sellers by using AI to automate certain aspects of product photography. However, in doing so, they inadvertently created a tool that enables sellers to deceive customers by using AI art generators that mimic professional photo shoots. 

While these AI tools may satisfy the algorithms, and increase ad clicks and sales, they lack the informative value of genuine product photos, potentially misleading customers about the product’s appearance and features. 

Moreover, AI-generated art remains deficient in the execution of high-quality campaigns. Text-to-image generators consistently struggle with rendering complex arrangements, establishing a logical depth of field, maintaining symmetry, and preserving contextual coherence, often producing nonsensical or imbalanced compositions. While this may seem inconsequential, the compounded effect of this uncanny imagery reflects a disregard for transparency and authenticity in advertising, emphasizing the need for ethical considerations in AI-generated content creation.

Faster production, lower costs, and more engagement are the likely reasons companies turn to AI. However, the reality can be quite different, especially when human oversight and creativity are lacking.

Take the three ads below, all found on the first page of Promoted Results on Amazon’s homepage.

Nonsensical Layouts

While the shadows and scale are appropriate in this ad for an at-home nut milk machine, it makes no logical sense for this machine to exist on the coffee table of a velvety-pink living room where the algorithm has placed it. Often, AI technology only considers trendy, aesthetic choices instead of logical, product-specific settings.

Incorrect Scale

Not to bring geometry into this, but if the armrest of this sofa set is 7.9” wide, as it is listed in the description, then it stands to reason that the play place behind it would be 31 inches tall, or the length of the average 11-month-old. Similarly, this would make the pool float only eight inches from end to end, which I guess could support the size of the aforementioned 11-month-old, but not much else. By placing this sectional in the middle of a stock image without considering the scale, the AI image generator is bypassing one of the most important questions when online shopping: What does this product actually look like?

Inability to Process Complex Arrangements

This may look like a soccer team celebrating their recent victory in The Big Game, but it is actually a prime example of artificial intelligence’s inability to create a complex scene (and a very bad advertisement for a blender!). Notable errors include the lack of photorealism, the blurring of bodies and faces in the background, inconsistent lighting across the women in the foreground, and incoherent setting choices (what nacho bar also has multiple crystal chandeliers?). And once again, the scale issue – demonstrated here by the comically large blender, the actual product being sold – cannot be overstated.


Despite being marketed as tools for producing quality photos and illustrations, AI image generators fall short of true creativity, instead creating averaged-out – and often stereotypical – representations of various concepts. Spotting AI-generated art styles isn’t merely about technical flaws; it’s about discerning the absence of human touch. From the lack of skin texture to the uniformity of facial expressions, artificial intelligence struggles to replicate the nuanced imperfections that define human artistry. While it offers efficiency and scalability for some operations, it also risks perpetuating deceptive practices and reinforcing societal stereotypes.

AI is not, in fact, creating net-new ideas or wholly unique images; instead, it assembles an amalgamation of ideas pulled from social media posts and sites like Wikipedia and Reddit, since training datasets lean heavily on publicly scraped sources rather than licensed work. Because of this, the information being churned through these machine learning engines is submitted by the most frequent internet users, who skew wealthier, male, and white. It follows that the output will also skew wealthier, male, and white, eliminating entire audiences from the equation.

Midjourney, an early artificial intelligence text-to-image generator, highlights the issue of perpetuating stereotypes in AI-generated images. Trained on a large dataset of digital art scraped from the web, it tended to produce stereotypical and sometimes offensive images based on generic prompts. For example, using the text prompt for an image of a “woman” without further description often resulted in a pouting, sexualized portrait of a white person. Similarly, requests for underrepresented groups led to cartoonish and exaggerated depictions. As tech giants harness AI for advertising purposes, it’s essential to interrogate whose interests are served and whose voices are amplified.


If you aren’t swayed by wacky proportions or ethical ambiguity, consider the legal implications of using AI image generators. Brown Bag explored adding AI-generated imagery to our creative team’s arsenal when the services first rolled out, intrigued by the allure of offering a quicker output for our clients. Confident in our ability to manage text prompts (and re-prompts) to secure high-quality images from the AI tools, we embarked on a test run, investing time and imagination to create the elements of a campaign.

“Our issue is with licensing through third-party providers,” said Brown Bag Senior Vice President and Creative Director Jerry Lewis. “The issue we ran into is these programs don’t provide a ‘rights managed’ option for controlling the use of these images.” Lewis continued, “We built a campaign which was approved by the client and prepped to launch when we saw another company promoting their services with one of the images from our campaign. BOOM! We were toast! We did a bit of research to find out that Adobe didn’t offer image contracts, and due to that the client pulled the campaign.” 

Brown Bag was able to create a brilliant campaign with the aid of AI. Unfortunately, copyright and licensing restrictions on commercial use meant we had no way to prevent the AI image generators from serving the exact same imagery to our competitors.

Further, last August, a federal judge upheld a decision by the US Copyright Office that AI-generated art is not eligible for copyright protection. The ruling came in response to a challenge by Stephen Thaler, a computer scientist and self-proclaimed AI Artist, against the government’s refusal to register his AI-generated art. The judge emphasized that copyright law does not extend to works produced solely by AI without human involvement, stating that “human authorship is a bedrock requirement.” While AI tools continue to reshape the creative landscape, the indispensability of human creativity remains unequivocal.


The future of digital marketing and advertising lies in finding the perfect blend of human creativity and AI efficiency. At Brown Bag Marketing, we leverage cutting-edge technology while prioritizing human oversight, ensuring authentic, clear, and visually captivating brand messaging, while staying true to every client’s unique brand voice.

Ada Lovelace’s insights remind us that true innovation stems from the human mind, imbued with intuition, empathy, and cultural understanding. As we navigate the complex interplay between AI models and human creativity, let us uphold the principles of authenticity, diversity, and ethical responsibility in shaping the future of brand work and imagery.

Twitter user @JillTwiss said it best: “The whole point is to invent robots to do the work no one wants to do, not to invent robots to do the jobs that people love. How are we messing this up so badly?”