Quote:
Originally Posted by
elambo
It doesn't matter if AI is cognizant of what it's doing, nor does it matter *how* it does what it does, when the person experiencing the AI output cannot discern a difference between AI and human authorship. This is where we're at. And this is why it's pointless to suggest that it's inarticulate and unintelligent. Is it dumber than 98% of the social media posts we read throughout the day? Do we know graphic designers who can repair massive portions of images with complete transparency in 10 seconds? Visual artists who can extend the oops-we-yelled-cut-too-early scene of a movie by 20 frames at the click of a button? Writers with access to the entire history of literature as a reference?
It's not a time to feel safe because of its lack of human flair; it's a time to recognize where this is headed, and just how quickly it will cross a serious threshold. The only way to do that is to cleanse those murky doors of perception.
Feeling “safe” implies there’s a threat; it puts us in a defensive, inferior position. And I think that mindset is just as mistaken as projecting human traits onto something that is fundamentally not human. Calling AI an "artist" or a "writer" is, quite simply, ontologically wrong.
No one has ever seriously said, “Wow, the jitter in this MIDI drum feels like John Bonham’s groove.” And if they did, they were probably drunk.
At the root of all this, I believe, lies a deeper issue: a kind of collective self-esteem crisis in our civilization. We've absorbed a view of ourselves as mechanical, chemically reactive, and ultimately predictable beings, as if we were just extremely complicated algorithms.
This reduction of the human experience flattens our depth, our mystery, our agency. And perhaps that’s why so many are so quick to grant machines a seat at the human table. Because if we start seeing ourselves as mere machines, or cogs in a bigger machine, it doesn’t feel so strange to mistake a machine for one of us, or to elevate a complex machine into a god or authority above us.