Music is in great part about emotions, yet the early days of discussing LLMs have been mostly about text and about what effect conversations with AI have on humans. An AI act topping the charts seems to be an important waypoint on the AI development graph.

Clearly, AI-created music becoming very popular, and probably reaching a much wider distribution among non-tech populations than, say, GPT, is impactful and meaningful.

But what is that doing to humans' relationship with music and the emotions involved? Do we have any idea whatsoever at this point? Do we need to understand it ASAP?

Synthetic soul: AI-generated gospel singer tops Christian chart

Judging from ongoing discussions of AI text relationships with humans, that node of AI seems far from being understood or "digested" into the zeitgeist as yet. Instead there are many polarized takes on "is this a good thing?" or "is this a bad thing?" and little consensus either way.

AI video is beginning to move toward the "commonplace, taken for granted" part of human experience. That has huge implications for what we are "used to" and can adjust to and cope with.

And if Gospel isn't immune from AI in its interactions with humans, that needs to be understood and accounted for too, because if Gospel isn't immune, there are going to be huge changes in "what's human."

In other words, we are still in the early days of understanding what current AI does and how it affects humans, let alone seeing clearly where all this is going. As has often been noted, everything is going to be affected.

For the leaders in innovation in education and healthcare, the way ahead is full of fog and blankness where understanding and foresight need to be.