Jason Thacker serves as the Creative Director and Associate Research Fellow at The Ethics and Religious Liberty Commission of the Southern Baptist Convention. He is also a founding member of AI and Faith, a consortium of faith communities and academic institutions whose mission is to bring the world’s major religions into discussion regarding the ethics of AI and related technologies. Thacker is the author of The Age of AI: Artificial Intelligence and the Future of Humanity, which will be released in March of 2020.
Synesis was privileged to speak with him, and the following is a revised transcript of our conversation.
To begin, how would you define “artificial intelligence?”
The simplest definition is “non-biological intelligence”: computers that exhibit certain intelligent behaviors, like being able to process data or find connections that humans might otherwise miss. It’s also important to distinguish between narrow AI and general AI, because most of the time, all we see in pop culture is general AI: killer robots, that kind of dystopian future. We’re nowhere near that level of AI, nor do we know whether a general level of intelligence is even possible to attain. There’s only one general intelligence in the universe, and it’s us: human-level intelligence. Narrow AI is very specific to a task and narrow in its application, something like Siri on our phones. We really do use AI every single day, but people often don’t realize that the systems they’re using are powered in large part by AI.
Why do you feel the need to address this topic and get Christians thinking about it?
First of all, it’s already an issue we deal with every day. There’s a scientist named Ray Kurzweil who begins his book How to Create a Mind by saying that if all of our AI systems became conscious and went on strike, our entire economy and society would instantly grind to a halt. Our society is already undergirded by it.
The reason I want to speak into this is that there’s a lot of confusion, fear, and even apathy about what AI is and where we’re heading. From a Christian perspective, we see that AI isn’t really causing us to ask new questions about humanity, about what we’re able to do or where we’re headed. Rather, it’s causing us to ask the same old questions in light of new opportunities. We’re starting to ask the fundamental questions: “What does it mean to be human?”, “Am I defined by my work?”, and “Are we able to create something better than ourselves?”
As Christians, we believe that we have these deep longings because God has created us to know him and to love our neighbor. And that’s the framework that I base my work on: Jesus’ command to love the Lord your God with all your heart, soul, and mind, and to love your neighbor as yourself. Are we utilizing AI to love God and to love our neighbor, or are we using it to love ourselves and treat our neighbor as less than human?
You mentioned Ray Kurzweil. Part of his work has to do with the possibility of immortality and being able to download our minds into computers. What are your thoughts on this possibility?
None of those ultimate desires to overcome brokenness and what we would call sin are unique to our technological age. That’s what I meant by the return to fundamental questions. We know we live in a world that’s not the way it’s supposed to be, and we’ve always wanted to overcome poverty, sickness, brokenness, and death.
I think with such a powerful tool as AI at our fingertips and with the potential for even more power, there’s a kind of dual tension: we can easily see ourselves as less than human while simultaneously seeing ourselves as greater than human, like a god. I’m not super worried about this dystopian robot overlord takeover scenario. The thing I’m most concerned about is whether we see each other as less than human, whether we have a proper understanding of what it means to be created in the image of God and whether we recognize our limitations, because not all of our limitations are bad.
So do you think we can justify the idea of immortality or downloading our minds onto computers from a Christian perspective?
Well, I don’t think we’re going to obtain a fully conscious computer, even if it can fool us into thinking it’s actually conscious. But I would take a step back and say that in order to rightly understand AI, we need to rightly understand who we are as those created in the image of God.
According to the Biblical Scriptures, humanity is distinct from the rest of creation because we were created in the likeness of God. The uniqueness and dignity of humans isn’t based on what we do, how we think, or any intrinsic capacity we possess. In my understanding of the image of God, we were created in God’s likeness to represent him.
I think Kurzweil’s ideas are very dangerous for Christians to buy into, because the idea of downloading our minds assumes that the body isn’t worth anything, that it’s just a vessel or container. Wanting to update our containers for something that will last longer devalues the body; it assumes the body is less important than the mind. I believe that Christ is in a physical, risen body. He didn’t rise from the dead as a spirit or a soul; he rose from the dead as a living man with holes in his hands and a wound in his side. So the idea that our bodies are less valuable than our minds contradicts Christianity, and it is something the Christian tradition rejected from the earliest centuries of the Church.
Furthermore, from the Christian understanding, we are already immortal. We are immortal souls who will be either eternally with God or eternally separated from him.
How would you respond to someone who justifies superhuman modifications with the argument that they’re simply pursuing the eradication of sickness, poverty, and other suffering? At the same time, how can Christians harness these technologies for greater human flourishing?
The idea of upgrading our bodies is not new; it’s called transhumanism. In many ways, these technologies could have good restorative applications. My father is an amputee, and it would be a good advance if he could have a fully functional leg. But then we get to really fine-tuned ethical questions, like “At what point is he no longer human?” and “Is he really just a machine if he has all these upgraded pieces?” Again, we reach that question of “What does it mean to be human?” It’s a fundamental question we have to answer, and once we have that answer, I think we can rightly pursue restorative medical applications of technology.
The idea that we need to be upgraded and that we’re not good enough, though, runs contrary to what the Biblical Scriptures teach about the uniqueness of man made in the image of God.
We should be pursuing innovation, because God’s given us this tool and ability to create. The guarded, fearful idea that we should stifle all innovation is not very Christian. We just need to approach it with wisdom. This is why I think society as a whole is starting to have conversations about ethics. As a Christian, I would argue that the ethic of human dignity comes from the image of God, and others would base that in other faith traditions or understandings of society. Ultimately, though, the wise approach comes from prizing and valuing humanity as distinct and unique and pursuing innovation for the sake of human flourishing.
If biological enhancements became the new norm in the next twenty to thirty years, would you refuse them?
First of all, I wouldn’t say now that I would refuse them, because we’re not there yet, and I don’t want to speak too early. There are a lot of theories about what we could do, like using nanotechnology to cure cancer. That sounds awesome; my mother had lung cancer.
It’s when the use of technology turns from pursuing human flourishing to being an end in itself, or a means of dehumanization, that I would have more issues with it. In twenty or thirty years, we will still be created in the image of God. We will still be unique. And that doesn’t just apply to the people we think are useful. It also applies to the people who have no perceived value to society. I personally believe it applies both to the baby in the womb and to my 94-year-old grandmother as she lay dying in her bed, squeezing my hand for the last time. My grandmother had more value in that moment than any technology we will ever create. Why? It’s not her usefulness. It’s not the baby’s potential usefulness or the elderly person’s past usefulness. It all comes down to being created in the image of God.
So I think rightfully understanding what technology is, who God is, and who we are is the key to a good ethic and moral basis moving forward.
How do you see AI impacting the future of work in our society?
I believe we’re facing a kind of turning point in the history of work, where the fundamental nature of work is going to shift. What we need to say as Christians is that our work doesn’t define us. What happens if we make our work our identity? If we lose our job, we lose our identity. We start to view each other in purely utilitarian terms, so that your dignity and value to society is based on what you can contribute. But that’s completely contradictory to Christianity. I think that as Christians, we need to uphold the dignity and value of work as a way of upholding the dignity and value of each human being created in the image of God.
What advice would you give to people wanting to enter the field of technology?
First of all, I’d say “praise God.” Doing ministry isn’t relegated just to vocational ministry or to thinkers and writers. We need Christian programmers and computer scientists and doctors and lawyers, representing God with the gifts and talents he’s given us.
I think the advice I would give to Christians is that our faith isn’t something we leave at the doorstep when we go into the office. Our faith is integral to who we are because it’s fundamentally shifted and changed us. So it’s important to represent the Lord in our work, keeping central the framework of loving God and loving others. Having the aim of human flourishing and human dignity is also crucial to that.
The biggest thing I would say though is to just have conversations and be open about our faith. I know a number of Christians in big tech who are doing faithful and great work, and they’re leading their teams by saying that they value human dignity. We can be faithful to God as we do the work that he’s called us to do.
If you could have any superpower, what would you choose?
Someone asked me a couple of weeks ago what my favorite AI-related TV show or movie is, and I told him that there are actually two: the book Frankenstein and the film Fantasia. The story of Mickey and the enchanted broom actually has something to do with superhuman killer robots and the future of technology. But in terms of superpowers, I still think flying would be pretty rad: being able to travel really quickly and having that kind of perspective. My specific little annoyance or frustration in the moment kind of loses its power when I see the grandeur of what God has created. It’s just not something you see every day.
Jason Thacker writes regularly on technology, ethics, and faith.