The Machines Are Learning, But Are We Listening?

by Zaki Ghassan


“By far, the greatest danger of Artificial Intelligence is that people conclude too early that they understand it.”

Eliezer Yudkowsky

There’s something sneaky happening in the music world right now, and most folks don’t even know it’s taking place.

It’s not your typical label drama or streaming payout scandal (though those still sting, too). It’s quieter. More technical. More dangerous.

Artificial intelligence is learning how to make music. I’m sure this isn’t super new, but I’m waking up to it. That’s not inherently bad. Tools evolve—electric guitars once scandalized traditionalists, and now they’re sacred. But here’s the difference: the AI isn’t just learning to make music in a vacuum. It’s learning from us. That’s a ramification I had not fully considered. It’s learning from the indie kids on Bandcamp. From the bedroom producers uploading to SoundCloud. From artists bleeding real stories into the digital void, hoping someone will care enough to listen.

And the machines are listening. But not to be moved or to appreciate the artistry. They’re listening to imitate.

If you want to take a deeper dive into how this is playing out before our eyes and ears, here’s an alarming article from The Guardian.



That’s where it gets tricky. We all know the current system is stacked—has been for a long time. Big labels and corporations already hold most of the cards; they control access to radio, playlists, major press, tour support, and increasingly, the cursed algorithms. Now, they’re investing heavily in AI that can mimic the very art they’ve refused to fairly support. It’s the musical equivalent of gentrification: take the sound, scrub it clean, and profit—while the originators are left behind, again. What’s to be done? 

Let me be clear: money’s not evil. It helps keep the lights on. It pays for guitar strings and studio time and maybe a burrito or two. But the love of money—the blind pursuit of efficiency and scale at the expense of human souls—that’s where rot takes root. And in this case, that rot looks like replacing messy, vulnerable, glorious human music with AI-generated replicas designed to hit the same dopamine triggers… but with none of the spirit, none of the soul.

Music has always been about more than sound. It’s about communion. A three-minute sanctuary from isolation. A lyric that names your pain when you can’t find the words. A melody that reminds you you’re not the only one still hanging on. When we outsource that to machines—especially ones trained on stolen ideas—we cheapen the connection that makes music matter.

This isn’t just a tech problem. It’s an ethics problem.


If your AI model is trained on the unpaid, uncredited work of struggling musicians—some of whom already feel invisible—and then turns around and sells that music back to the world without acknowledgment, that’s not innovation. That’s exploitation. That’s a use of the tool that’s just plain wrong.

We need transparency. We need clear labels on AI-generated content. We need to know who these models were trained on and how. We need opt-out rights for artists. And maybe more than anything, we need a cultural gut check: Are we okay with replacing human creativity just because it’s cheaper?

Because I’m not. Not when I see the blood, sweat, and tears it takes to put art into the world. Not when I’ve seen firsthand how one honest lyric from an unknown band can change someone’s life. Or the beauty of a musical composition, created in human partnership with the divine, that moves me to tears. Not when the world is starving for connection—and machines, for all their clever mimicry, still don’t know what it means to feel shame, grief, awe, or grace.

I have a heart for the underdog. I always will. And right now, independent artists—the ones who wake up early, stay up late, work two jobs, and still find time to write truth—are getting pushed to the margins again. I think of all these things from the perspective of a long-time CCM fan and critic, too. What are the ramifications of AI use in Christian music? Who decides the ethics there? I don’t know, but it’s far past time for the discussion as far as I’m concerned.


But it doesn’t have to be this way. We can speak up. We can demand a better system. One that honors originality. One that protects the vulnerable. One that doesn’t confuse data with soul.

The machines are learning.
The question is: Are we?

AI is stealing songs from indie artists to teach machines how to fake soul. It’s not innovation—it’s exploitation. Music is meant to heal, not be harvested. We need transparency. We need ethics. We need to protect human creativity before it’s replaced by algorithms. We music listeners must demand more; we have to pay attention. #ProtectTheUnderdogs #HumanMadeMusic #AIregulations

“I’m increasingly inclined to think that there should be some regulatory oversight, maybe at the national and international level, just to make sure that we don’t do something very foolish. I mean with artificial intelligence we’re summoning the demon.”

Elon Musk, at MIT’s AeroAstro Centennial Symposium


Full Disclosure: I’m not opposed to the use of A.I. I’m truly not. I use GPS, spell check, etc. I just believe there’s a good way to use the tool and a very dangerous way to use it that we should pass on.

  • AI as a general-purpose tool: In this context, AI is used to automate tasks, analyze data, and improve decision-making processes. It functions within defined parameters, relying on rules and algorithms to process existing information and achieve specific goals. For instance, AI is employed in fraud detection systems to identify suspicious activity based on established patterns.
  • Creative generative AI: This type of AI goes beyond analysis and prediction, focusing on generating novel and original content. It learns from vast datasets of existing content, identifying patterns and structures to create new outputs like text, images, music, or code. An example is DALL-E, which generates images from text descriptions, demonstrating a level of creativity not achievable with traditional AI. Buuuuut…the data, the creativity it’s learning from, isn’t owned by the companies building these models; it’s someone else’s thought and work. That’s more than a little problematic. (A rough sketch of this difference follows below.)

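To make that distinction concrete, here’s a minimal, purely hypothetical sketch in Python—no real product’s code, just toy functions I’m naming for illustration. It puts a rule-based check that works within fixed parameters next to a tiny generative model whose every output is recombined from the melodies it was trained on.

```python
import random

# Rule-based, "general-purpose tool" AI: applies fixed, human-written rules.
# It analyzes within defined parameters; it creates nothing new.
def flag_suspicious(amount, country, usual_countries, limit=5000):
    return amount > limit or country not in usual_countries

# Toy "generative" model: a first-order Markov chain over notes. Every melody
# it can produce is stitched together from the training melodies it was fed,
# which is exactly the ownership question raised above.
def train_markov(melodies):
    table = {}
    for melody in melodies:
        for current_note, next_note in zip(melody, melody[1:]):
            table.setdefault(current_note, []).append(next_note)
    return table

def generate(table, start, length=8):
    note, output = start, [start]
    for _ in range(length - 1):
        note = random.choice(table.get(note, [start]))
        output.append(note)
    return output

if __name__ == "__main__":
    training_melodies = [["C", "E", "G", "E", "C"], ["A", "C", "E", "C", "A"]]
    print(flag_suspicious(9000, "XX", usual_countries={"US", "CA"}))  # True
    print(generate(train_markov(training_melodies), "C"))  # a "new" melody
```

Even this toy makes the point: the “creative” half has no ideas of its own. Its entire output space is bounded by the work it was trained on, and it never asked permission to learn from it.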
We can do better, and we must push for something better to preserve the human connection.


Part Two: What’s to Be Done? Fighting for the Soul of Music in an Age of AI (Coming soon!)

