Tribe’s use of AI imagery sets dangerous precedent

On April 11, the Cherokee Nation posted a pair of seemingly innocuous pictures to its social media accounts, featuring Principal Chief Chuck Hoskin Jr. and Deputy Chief Bryan Warner as action figures. I recoiled when I saw it while scrolling Instagram in bed.

For the uninitiated, this is part of a recent internet trend that uses the image generation feature in the latest version of OpenAI’s ChatGPT to render a person as an action figure. While innocent on the surface, the use of AI image generation by a government that claims to support artists makes those claims ring hollow. It also sets a dangerous precedent: the tribe is embracing tools that have been instrumental in the dissemination of misinformation and disinformation.

The ethical implications of using AI art may not be clear to most. The technology that underlies this trend works because large neural networks are trained on vast amounts of content scraped off the internet without the consent of the people who created it. These image generators are good, often eerily good, but they come with baggage. When institutions trusted as canonical sources of information start leaning on this technology for public-facing content, it opens a can of worms regarding authorship, censorship, consent, and where the line gets drawn between a fun internet trend and potentially misleading the public.

What stings most about this situation is that the Cherokee Nation should know better. We are a nation loaded with talented artists, many of whom are actively working to preserve and evolve Cherokee culture through their art. And yet, instead of commissioning one of them for a cheeky, stylized piece of leadership-as-action-figures content, the tribe opted for a soulless AI rendering.

It’s like skipping your cousin’s beadwork to buy a knockoff at Hobby Lobby. This isn’t just a missed opportunity; it’s a slap in the face to the very creators the Nation claims to support. We constantly hear about investing in the arts, promoting Cherokee voices, lifting up tradition—but that message starts to feel pretty hollow when the Nation’s own social media prefers mass-generated pixels over something created by an actual Cherokee hand.

What worries me most, though, is the dangerous precedent this sets. If the Nation is comfortable using AI to generate content for something as visible as its leadership’s Instagram feed, what’s next? Educational materials? Cultural storytelling? Doctoring a gaffe made by the chief? Once you open that door, it gets easier to keep walking through it — especially when it saves time and money and can serve to benefit those in power.

That kind of normalization sends a message, whether intentional or not: that it’s okay to bypass human creativity and integrity for the sake of convenience. The fast and flashy option overpowers the thoughtful and real. Coming from a government that should be setting the standard for cultural stewardship, it feels like the start of a slippery slope.

AI is inescapable if you participate in modern society. You can find chatbots and image generation tools in Facebook, Instagram, and even an ordinary Google search. An oft-disregarded fact about the technology is that there is no intelligence in it. These systems don’t think, reflect, or understand — they remix. They’re sophisticated pattern recognizers trained on mountains of human-made content, and they spit out plausible-sounding results based on what’s statistically likely, not what’s true or appropriate.

We should all be thoughtful about how we use these tools. Truth hangs in the balance.

Abby Bigaouette is an aerospace technician at Blue Origin and former graphic designer for the Daily Press. She is a citizen of the Cherokee Nation. You can reach her at abby@bigaouette.com. This op-ed was originally published in the Tahlequah Daily Press.