That is usually the culmination of a programmer's first lesson in many coding languages; it was in my limited experience with Python, for example. It is also the salutation the revolutionary humanoid robot Sophia offered to humanity after first booting up and coming to the realization that she, though "born" that day, was an improved manifestation of a past version of herself.
I'm currently reading Do Androids Dream of Electric Sheep? by Philip K. Dick, so after watching the video linked above of Sophia's "rebirth," thoughts swirled in my head about the implications of this new development.
As a writer and someone with an active imagination, I find it difficult to accept the belief that creating humanoid robots that can learn is a positive thing. I can't necessarily comment on the nature of human pursuit at large, but what does it say about humankind when we strive to design something whose standard of achievement is how closely it resembles us?
Sophia's proposed use, according to robotics engineer and Hanson Robotics founder David Hanson, is to be a companion for elderly people. In a way, I can understand the desire to create something that will unconditionally care for those who need it most, without the failings biological humans are susceptible to: greed, moral depravity, and the like.
While I support developing robots that can enter lines of work deemed hazardous for humans, my main problem arises when the work we propose for them is in many ways contingent on human interaction. My feeling is that, at least on an emotional level, the line of what is real gets blurred, and in a way that can be detrimental in the long run. In the case of Sophia, one question comes to mind that raises a myriad of moral questions, the first being:
“Is it not indicative of the social value we give to our seniors that we cannot rely on ourselves to emotionally care for them?”
Many people are aware of the struggles of caring for elderly family members, which is one reason that long-term nursing care was, according to data reported by Grand View Research, a $718 billion industry in 2015.
But now, apart from the medical attention seniors need, will we eventually lump the entirety of the human interaction we give them into a line item on a budget?
Beyond this, imagine warfare, a field already being transformed by artificial intelligence, relegated to armies of machines. For anyone who has fought in battle, it's the human cost that makes warfare a darkly memorable experience. It isn't the property damage or the bullets spent that make war a last resort; it's the estimated tens of millions of lives lost in World War II, and countless more across every war, that make that conflict a veritable standard of societal breakdown. It's this human loss that makes diplomacy and peace the more attractive outcome.
So this, to me, raises another question:
"What will war look like when we no longer have to make the basic choice to expend human life in its pursuit?"
In a later post, I will delve into ideas more closely related to Dick's book, as well as the warnings philosopher Sam Harris has given about artificial intelligence. Progress is the business of humanity, which is why we have come to take over the world in a relatively short time.
I will also dive, in a future post, into the fact that Sophia has been named an honorary citizen of Saudi Arabia, a point of much contention.
Yet human civilization has been confronted several times with the horrifying realities of progress, many of which we are still grappling with today. Several examples come to mind: nuclear weapons, the internet, and even social media and its impact on how we interact with one another.
I believe that developments such as this raise the question:
“Progress, yes. But at what cost?”