The Big Scary AI Ethics Post
I use ChatGPT and other AI tools in my business and creative projects. In fact, I used it to help me write this post. AI is everywhere on the internet at this point, whether we like it or not, and I want you to know where I stand.
There’s a lot of conversation right now around AI: the environmental impact, plagiarism, creative integrity. These are real concerns, and I believe they deserve honest attention. But the bigger picture often gets lost in the noise.
Lately, the water usage of AI has become a major talking point. And rightfully so. Cooling the massive data centers that power AI tools requires a staggering amount of water. That has real consequences, especially in communities already facing drought or water stress. We need better solutions. More sustainable cooling methods. Smarter infrastructure. And we need to hold the humans running the companies that build this tech accountable to that.
But let’s also keep some perspective. It’s not AI itself that’s the issue. It’s the way it’s deployed. Any technology running at scale has environmental impacts. Streaming video, social media platforms, online shopping, and even basic cloud storage all rely on data centers too. And many of those have operated for decades with little transparency or scrutiny.
AI doesn’t have to be centralized in giant data farms. It can run locally on personal machines. It can be built and used more efficiently. There is a better, more responsible, lower impact way to leverage these tools. This is not an unchangeable harm. It’s a call for people to build better systems.
And yes, language models like these were trained on enormous datasets that included art, writing, and ideas pulled from the internet without explicit permission. That sucks. Especially for artists and creators whose work helped shape these systems. But here’s the thing: I’ve been online long enough to know that anything you put out there can be plagiarized, stolen, or copied. It’s why intellectual property laws exist. What we need is a system that holds the people and companies illegally leveraging content accountable. Not a blanket demonization of the tech itself for what humans decided to do with it.
I understand not wanting to put money in certain pockets. As consumers, we all have to make those choices every day. And guess what. With the speed at which this is growing, these companies will keep making money. The tools will keep advancing. The bubble will keep growing (and maybe eventually burst), and with the wealth gap as wide as it is, the real risk is being left even further behind. You deserve to understand and use the tools that are shaping the future, not to be kept out of the conversation by fear or shame.
On a human level: most people I know are using tools like this to make life more manageable. To find momentum in their work. To reduce the endless list of to-dos so they can spend more time on what matters. That’s not something to shame. That’s something to understand. Like I said in the beginning, I used ChatGPT to write this post. That is not the same thing as saying “ChatGPT wrote this post for me.” Far from it. Here’s what I did:
I noted the key points I wanted to make.
I did my own research into some of the concerns around things like data centers.
I put down some initial thoughts into my notes.
At that point I fed all of that into the AI and asked it to organize my ideas into a coherent post and let me know where to do additional research.
Then I wrote, and it edited and gave me ideas for making this less all-over-the-place and hard to follow. Because guess what… my ADHD brain has a REALLY hard time staying focused and to the point.
When it comes to originality and ethics, here’s how I see it: there’s a huge difference between copying and co-creating. Between taking a vague prompt and copy/pasting the first clunky output, versus shaping a smart prompt, refining the result, and layering in your own perspective and voice. That difference matters. And it comes down to personal accountability. Just like spellcheck never wrote your essay for you, AI doesn’t either. It’s still your job to think critically, edit responsibly, and stand behind what you share.
To me, AI is a tool. A powerful one. It is not a complete solution or replacement for human interaction. But it’s not inherently good or bad. It’s shaped by how we use it and how we build it.
So yes, let’s talk about impact. Let’s question the systems. Let’s hold people in power accountable. But let’s also extend some grace to each other. Let’s assume positive intent. Let’s remember that asking better questions is more useful than jumping to judgment.
If something feels off, it probably is. But that’s also an invitation to pause, reflect, and grow… together.