In a move that has stirred debate over authenticity and ethics in digital media, Adobe Inc.’s ADBE stock image service is offering for sale artificially generated images depicting scenes from the Israel-Hamas conflict.
What Happened: Amid growing concerns over AI-generated fake imagery, Adobe Stock is selling artificial images depicting the Israel-Hamas conflict, PetaPixel reported.
A search for “Israel-Palestine conflict” on Adobe Stock returns results that are mostly computer-generated. While some are more convincing than others, many of these images are indistinguishable from real photographs at a glance, raising alarms over their potential to fuel misinformation.
Australian website Crikey, which first reported the issue, discussed the implications with Dr. T.J. Thomson from RMIT University. Dr. Thomson highlighted the risk of these images distorting public perception and truth, expressing concern for the role of photographers who capture these scenes at great personal risk.
Adobe has responded to the controversy, stating that all generative AI content on its platform is clearly labeled upon submission and that the company is actively working to combat misinformation through initiatives such as the Content Authenticity Initiative.
Last year, Adobe announced it would accept AI-generated submissions, emphasizing transparency. However, a recent analysis from Stock Performer revealed that AI images are generating significantly higher revenue than traditional photographs on Adobe Stock.
The original article received an update on November 7th with comments from Adobe addressing the issue.
Why It Matters: Adobe’s decision to allow AI-generated images onto its stock platform, especially those depicting sensitive political conflicts, raises critical questions about the reliability and ethics of visual content in the era of deepfakes and misinformation.
With AI-generated content showing the potential to overshadow the work of traditional photographers – whose jobs often involve taking personal risks to capture the truth – the impact on photojournalism and public trust is profound.
Moreover, the increasing revenue from AI images could incentivize a shift away from human photography, possibly undermining the value of authentic images. This might lead to a broader conversation about the future of photography and the standards of truth in media.
Adobe’s Content Authenticity Initiative represents a step towards addressing these concerns, but as the technology evolves, so too must the measures to ensure transparency and trust in digital media.
Engineered by Benzinga Neuro, Edited by Rounak Jain