Theology and Technology

Noreen L. Herzfeld

Technology is a multifaceted reality. It is not merely the tools and machines we use but also the skills and processes with which we use them and the attitudes, culture, and social structures they produce or enhance. Given that the purpose of technology is to transform our environment or ourselves, technology raises issues of human identity, our relationship to nature, our relationship to one another, and our ideals and hopes for the future. We see these issues especially in the current transformative tools of genetic engineering, nanotechnology, artificial intelligence and robotics, and energy technologies. Each of these technologies raises new ethical quandaries as well as questions central to Christian theology, such as the nature of the imago Dei, sin, salvation, and the eschaton.

1 A blessing or a curse? Technology and transcendence, dominion, and relationship

Technology comprises the tools and the processes used to control or alter ourselves and our environment. While often thought of as referring to devices or tools – machines, chemicals, or instruments – technology also encompasses the processes and skills with which we manipulate these tools and the structures within which they are used. For example, the technology of the automobile consists not just of the car itself; it also requires the infrastructure of roads and bridges, fuel production, fuel distribution, and the driving skills, customs, and rules of the road.

The term technology is derived from the Greek techne, translated variously as craft, art, or knowledge. As opposed to episteme (passive knowledge of the nature or being of things), techne is the knowledge of doing or making. Techne is always instrumental. It changes and creates. The techne described by the ancient Greeks was focused on altering the physical world to aid human survival. Jacques Ellul describes the purpose of technology as ‘to defend man’ (1964: 45). We use technology to modify our environment, to gain protection from the elements and from predators, to procure the things we need to stay alive, and to make our lives longer and more comfortable. Contemporary technology goes further, significantly increasing the scope and power of our agency and impact on the world. Processes such as genetic engineering or nanotechnology not only modify existing objects but also create things that are entirely new.

Hans Jonas (1984) suggests that this increased scope and manipulative ability requires that we also expand our sense of responsibility beyond local and immediate concerns. Martin Heidegger (1977) views technology as a means of encounter, influencing our relationship to nature, each other, and our very selves, noting that the technological view treats everything as something to be organized, improved, or used, while concealing the true nature of the object itself. Thus, technology can be thought of also as a mindset, one that affects the social as well as the physical world. Technologies both are defined by and define their society: ‘[w]e shape our tools and, thereafter, they shape us’ (Culkin 1967: 70). Social critic Neil Postman calls current Western culture a ‘technopoly’, a culture that engages in ‘the deification of technology, which means that the culture seeks its authorization in technology, finds its satisfactions in technology, and takes its orders from technology’ (1993: 71). Technology is central to our understanding of ourselves and the world, standing sometimes in tension, sometimes in harmony, with our religious traditions, and playing an undeniably greater role in our lives than it has at any previous time in human history.

1.1 Technology and transcendence

The history of that role has various interpretations. David Noble (1999) argues that, whether in tension or harmony, contemporary technology is an outgrowth of the Judeo-Christian worldview. Like Postman, Noble believes technology shares goals typically found in religion. He roots the drive to develop technologies not only in a soteriological desire to alleviate present suffering but also in a religious longing for transcendence. While Benedictine and Cistercian monks pioneered the development of clocks, watermills, windmills, and new agricultural techniques, the Benedictine motto Ora et Labora (‘Pray and Work’) gave these manual arts spiritual significance, thus raising the status of one’s tools. In chapter 31 of his Rule, St Benedict calls on the abbey’s cellarer to regard all utensils and goods of the monastery as if they were the sacred vessels of the altar. Noble finds an even clearer articulation of the nobility of technology in the ninth-century writings of Erigena, who speaks of the importance of the mechanical arts as a part of our endowment with God’s image at creation (1999: 16). Noble views Erigena as the precursor to a long line of Christian philosophers and theologians who express a longing for a new and better world of our own making, a longing that becomes the animating spirit of modern technology. He traces this vision of technology as the means for transcendence from the medieval monks through secular explorers, inventors, and scientists such as Christopher Columbus, Francis Bacon, and Isaac Newton to the Freemasons and Rosicrucians, and, finally, today’s astronauts, virtual reality pioneers, genetic biologists, and transhumanists, each seeking to develop tools that transcend the limitations of our human condition.

1.2 Technology and the imago Dei

According to Genesis 1, humans are created in the image of God. How theologians interpret this image has varied, yet most can be categorized in one of three ways: substantive interpretations view the image as an individually held property that is a part of our nature, most often associated with reason; functional interpretations root the image of God in action, specifically our exercise of dominion over the earth; and relational interpretations find God’s image within the relationships we establish and maintain. While the early Church Fathers tended toward a substantive view, echoing Aristotle’s distinction of humans from animals by our powers of reason, by the twentieth century functional and relational interpretations were dominant (Herzfeld 2002: 10–32).

Functionally, humans can be thought of as God’s deputies on earth. For historian Lynn White (1967), this is a remarkably anthropocentric view. He notes that our technologies have all too often brought not transcendence but ecological destruction. White points to Gen 1:27–28:

So God created humankind in his image, in the image of God he created them, male and female he created them. God blessed them, and God said to them, ‘Be fruitful and multiply, and fill the earth and subdue it; and have dominion over the fish of the sea and over the birds of the air and over every living thing that moves upon the earth.’

These verses underline God’s image, the imago Dei, as specifically human, providing a justification for our dominion over nature. White argues that Christianity has played a major role in fomenting our ecological crisis by promoting unlimited mastery over nature. He views modern technology as the expression of a specifically Western, post-Enlightenment worldview that prioritizes humans, and he roots our current environmental crises in our failure to recognize this worldview and alter it. In place of an instrumental stance toward nature, he suggests a relational view that recognizes our continuity with the natural world.

The dominion prescribed in Genesis 1 can be construed, more positively, as stewardship or deputization. Biblical scholar Gerhard von Rad writes:

Just as powerful earthly kings, to indicate their claim to dominion, erect an image of themselves in the provinces of their empire where they do not personally appear, so man is placed upon earth in God’s image, as God’s sovereign emblem. He is really only God’s representative, summoned to maintain and enforce God’s claim to dominion over the earth. (von Rad 1961: 58)

While White views human dominion over nature with skepticism, rightly pointing out that this dominion has too often been less than benign, von Rad believes that the intent of the writer of Genesis 1 was to promote quite the opposite. In Genesis 1 chaos is ‘the great menace to creation’, and thus to humanity (1962: 144). Humans are called to join God in imposing order on nature and, in so doing, to participate in God’s saving plan. Developing technology is part of this participation.

In this regard, Philip Hefner (1993) has dubbed humans ‘created co-creators’ with God. Like Noble, Hefner views technology as deriving from our innate impulse to transcend the natural world. In answer to White’s criticism, Hefner includes the word ‘created’ to stress that, as created beings, we do not stand above the rest of the natural world. However, as co-creators, he believes we have been given the freedom to participate in and further God’s plan through the tools, processes, and societies that we develop. For Hefner, the discovery of a new vaccine or surgical procedure is emblematic of our co-creation with God. God’s purpose is to heal; he gives us the means through human creativity. Hefner’s vision of technology is, thus, a much rosier one than White’s; however, some consider the designation ‘co-creator’ as dangerously close to claiming equality with God (Lorrimar 2017).

Dominion is not the only way to understand the imago Dei of the Genesis 1 text. Karl Barth (1960) focuses on two very different portions of the text: ‘[l]et us make man in our image’ (Gen 1:26) and ‘male and female he created them’ (1:27). Barth interprets the plural in ‘[l]et us make man’ as referring not to a heavenly court but to the nature of God, justifying this with the observation that, if whomever God is addressing is to join in the act of creation, God must be addressing God’s self. In Christian terms, God as Trinity implies that relationship is at the heart of God’s very being. The second part of the text, ‘male and female’, denotes a plurality of humans. Thus, Barth posits that to be in God’s image means to be in relationship, with God or with other humans. According to Barth, ‘image has double meaning: God lives in togetherness with Himself, then God lives in togetherness with man, then men live in togetherness with one another’ (1960: 57).

1.3 Technology and relationship

New technologies change the societies that wield them. The history of technology is a cultural history, not just a history of inventions (Diamond 1998; McNeil 1990). It is likely that the first technology, the ability to make and use fire, marked the emergence of humans as a species. The ability to smelt iron, evidenced from around 1500 BCE, marked a complete change in social structures with an emerging division of labour. Genesis 4–9 illustrates the tensions created by newly emerging technologies. Conflict between the pastoralist and the new technology of agriculture is explored in the story of Cain and Abel, where Cain, the farmer, sees his offering rejected by God in favour of that of his pastoralist brother, shattering the relationship between them. The story of the tower of Babel explores the rise of cities, with their new technologies, social systems, and diversification. As each city develops its own social constructs and languages, whole societies cease to understand one another. Finally, Noah and his family are saved by the technology of the ark, perhaps because Noah engages in a covenant with God, one that repeats both the claim that humans are in God’s image and that they have dominion over the earth. The writer here grapples with the dual nature of our dominion over animals, simultaneously benign and malignant. Noah saves the animal world by bringing the animals onto the ark yet is also given permission to kill and eat them. These chapters provide a warning: wielding technology is necessary, yet without right relationship to God, one another, and the rest of creation, it can have unforeseen and disastrous results (Herzfeld 2009: 15).

Technology does not have intrinsic power or will, yet it shapes our wielding of power. ‘[Technology] is a power endowed with its own peculiar force’, writes Ellul. ‘It refracts in its own specific sense the wills that make use of it. Indeed, independently of the objectives which man pretends to assign to any technical means, that means always conceals in itself a finality which cannot be evaded’ (Ellul 1964: 140–141). We see this in communication technologies where, as Marshall McLuhan famously noted, ‘the medium is the message’ (1964: 7). For example, the skills needed to gather information in cyberspace have rewired our brains in favour of fragmentation and against the sort of deep concentration encouraged by, for example, the long novels of Tolstoy (Carr 2010: 131). Technology may also override individual choice. The philosopher Herbert Marcuse (1941) blamed technology for sublimating the will of the individual to that of the crowd. Thoreau, writing from his cabin at Walden Pond in 1854, sighed, ‘We do not ride on the railroad; it rides upon us’ (Thoreau 1997: 16). While he could choose not to ride on the railroad himself, he could not turn back the clock to regain his rustic paradise as it was before the tracks were laid. While we can make individual choices regarding certain technologies, others – such as the modes of communication or transportation we use – are impossible to avoid once society has embraced them.

2 Altering the body: technology and healing

Modern medicine is the one technological area most frequently hailed as a blessing. It has considerably lengthened the human lifespan and holds the promise of therapies or cures for a variety of illnesses that have heretofore been intractable. The Christian scriptures urge us to heal. Jesus’ healing of the sick dominates the miracle stories in the New Testament. In Matt 4:23 we read, ‘Jesus went throughout Galilee, teaching in their synagogues and proclaiming the good news of the kingdom and curing every disease and every sickness among the people’. Jesus also exhorts his disciples to heal, sending first the twelve (Luke 9:1) and then seventy-two followers to heal the sick (Luke 10:1). Clearly, none of us wishes to be sick nor to see a loved one suffer. In the past, treatments were limited. They still are, yet we have pushed the boundaries of those limits in such a way that new ethical dilemmas arise, particularly in three areas: the beginning of life, the enhancement of our bodies or our children, and the end of life.

2.1 Beginning of life and genetic engineering

Before modern technologies, children were considered a gift from God, for what minimal control we had over the inception or termination of a pregnancy was often ineffectual or worse. This has changed through a variety of fertilization techniques such as artificial insemination, in vitro fertilization, and surrogacy, as well as birth control and abortion. These technologies give women control over when and how to have a child and offer opportunities to screen for defects or traits using amniocentesis, prenatal screening, and even preimplantation genetic diagnosis.

The first theological concern is the question of what we believe God intends as ‘natural’ for humanity. The Catholic Church gives a clear answer. In the encyclical Humanae vitae, Pope Paul VI sets definite boundaries, declaring the use of technologies such as sterilization and contraceptives to be against natural law, an assertion of our domination and control above our God-given ends (1968: 17). Karl Rahner rejects artificial insemination for similar reasons, namely, that it separates conception from the natural act of coitus (1972: 246).

The Catholic magisterium has consistently taught that procreation is both the goal and the unique province of the conjugal act. Thus, any creation of human life apart from coitus is prohibited, ruling out in vitro fertilization. Further, since the embryo ‘is to be respected and treated as a person from the moment of conception; and therefore from that same moment his rights as a person must be recognized, among which in the first place is the inviolable right of every innocent human being to life’, abortion is also not allowed (Pontifical Academy for Life 2000: 5). Other Christian traditions recognize that if we are ‘co-creators’ with God, what is ‘natural’ is less clear. Most Protestants are comfortable with contraception; abortion is proscribed by the evangelical community on the basis of the sanctity of life, while more mainstream Protestants see it as morally ambiguous.

Genetic engineering or enhancement is another new technology often used at or near the beginning of life. Somatic therapies edit faulty DNA within non-reproductive cells. Few ethicists regard this as ‘unnatural’ or problematic since it corrects a deficiency only in the individual treated. As William May notes, ‘[s]uch therapy is morally warranted as long as the risks posed [...] are not significant when compared with the [...] benefit to the patient’ (2008: 240). Germline therapy alters the DNA in cells that will ultimately produce reproductive cells (often the fertilized egg) allowing corrections to be passed to future generations. Currently, the Catholic Church distinguishes between somatic and germline therapy, prohibiting the latter because of its ability to introduce potential harms into the human gene pool in perpetuity and because it requires in vitro fertilization. However, few consider germline therapy intrinsically wrong. Ron Cole-Turner notes that given their commitment to healing the sick, most Christian churches do not oppose the idea of human germline therapy so long as two conditions are met: the use must be purely therapeutic rather than for enhancement; and the therapy should not rely on in vitro fertilization or any manipulation of the embryo outside the body (2008: 15).

Lutheran theologian Gilbert Meilaender worries that therapy might easily lead to enhancement, suggesting that the pressure to select or engineer – first for enhanced disease-fighting ability and later for other desired traits – could eventually become enormous, turning reproduction into an act of ‘technical manufacture’ rather than ‘embodied self-giving in love’ (2020: 79). Indeed, the first unauthorized use of CRISPR genetic engineering on a human embryo, in a Chinese lab in 2018, was for just such an enhancement – to confer resistance to HIV. This procedure was internationally condemned; however, religious views are mixed. A 2020 poll released by the Pew Research Center found that while a majority considered therapeutic genetic manipulation in the case of serious disease acceptable, only fourteen percent said it would be appropriate to change a baby’s genetic makeup to enhance its intelligence. Those for whom religion was very important tended to be more disapproving of scientific research on gene editing (Pew Research Center, Funk et al. 2020).

While we are capable of human germline therapy, as demonstrated in 2018, governments have pressed for a general moratorium on its use due to a host of troubling ethical questions. Germline therapy could have unintended or unforeseen effects on future generations. Nor can we obtain the consent of our potential children and grandchildren. Germline therapy also raises concerns regarding human diversity. Pew (2020) found that respondents in Malaysia and Singapore feared that people might choose to westernize their children by creating babies with blond hair and blue eyes. The widespread use of germline therapy, along with prenatal screening, concerns the disabled community, who fear further minoritization or unrealistic expectations of perfection by parents or society (Edmonds 2011; Swinton and Brock 2007).

One theological concern is whether enduring genetic alterations threaten human nature and dignity. Scientists have inserted human DNA into pigs to grow cells or organs with a reduced immune rejection response. The implantation of nonhuman organs from such transgenic animals transfers animal genetic material into a human being. How much nonhuman genetic material could we implant and remain human? Catholic doctrine has long argued that we should stay within the confines of God’s creation. In Gaudium et spes, the Second Vatican Council affirms:

All things are endowed with their own stability, truth, goodness, proper laws and order. Man must respect these as he isolates them by the appropriate methods of the individual sciences or arts. (Vatican Council II 1965: 59)

Would we risk losing the image of God in which humans were created? Elaine Graham notes that a ‘self-constituting’ humanity could alienate us from relationship with our Creator, causing us to put our faith in human reason and technological ability, as we ‘seek redemption via technocratic means alone’ (2006: 66). Most Christians would agree with Nigel Cameron and Amy DeBaets that:

technology cannot cure our sinfulness so no genetic manipulation will grant us moral perfection. It is only in Christ that we may have the hope of restoration to who we were created to be, and that will not in this life be perfected. (Cameron and DeBaets 2008: 99)

Michael Sandel worries that germline alterations could lead to a new ‘arms race’ in which we face an ever-rising bar of physical or mental capabilities. Sandel fears we will embark on

a Promethean aspiration to remake nature, including human nature, to serve our purposes and satisfy our desires. The problem is not the drift to mechanism but the drive to mastery. And what the drive to mastery misses and may even destroy is an appreciation of the gifted character of human powers and achievements. (Sandel 2004)

Another related technology is cloning. Scientists have been cloning animals since the 1950s. In 1996, Ian Wilmut successfully produced the sheep Dolly, the first cloned mammal. Since then, scientists have cloned many animals, including mice, cats, pigs, goats, cows, mules, and rabbits. The prospect of human cloning has raised valid safety concerns. Cloned animals tend to be much shorter lived, and many cloned offspring have serious genetic abnormalities. Beyond the technical difficulties, human cloning has also met resistance on theological and ethical grounds (Cole-Turner 1997). Reproductive cloning evokes images of carbon copies of ourselves walking around, of Brave New World scenarios of armies of identical workers or soldiers, or of parents desperately trying to replace a lost child. Once again, it represents a means of reproduction separated from the coital act, which, at its best, is the loving action of two human beings in relationship.

Therapeutic cloning, however, seeks not to create a new human being but simply to produce tissue or cells genetically identical to the patient’s, thus mitigating a rejection response. This technology alleviates a prior ethical issue. Earlier development of human tissue in the lab depended on the pluripotency of embryonic stem cells, harvested from embryos donated by in vitro fertilization clinics. Recent work has moved to cloning a patient’s own cells, which are induced to be pluripotent.

For those who believe human personhood begins at conception, the use, and destruction, of embryos even a few days in age is deeply problematic, constituting at best the preference of one human life over another. In 2001, Pope John Paul II issued a statement equating stem cell research with infanticide. John Paul’s declaration was followed by US President George W. Bush’s order that public funds could be used for research only on human embryonic stem cell cultures established before August 2001. However, the use of adult cells induced to be pluripotent has reduced the moral conflicts regarding stem cell research and therapies, as doctors can now use a patient’s own cells rather than those from an embryo. Similar stem cell technology is also used to create embryonic models, clusters of cells that develop like human embryos but are not derived from fertilization of an egg. Indeed, such stem cells were used to develop mRNA vaccines for the SARS-CoV-2 virus.

2.2 End of life and life support

No one questions the benefits of diagnostic machinery, or the temporary use of mechanical ventilators, heart pumps, or kidney dialysis for acute care. Technology becomes an issue when the use of such machinery becomes a permanent necessity. Most of us wish for a dignified death for ourselves and our loved ones, yet many worry that their decision to terminate life support might make them instrumental in that death. When and how much technology should be used at the end of life?

The end of life raises many of the same existential issues as does the beginning of life. In a 1981 document, the Presbyterian Church notes the similarity between euthanasia and abortion:

There is an accompanying prejudice against the taking of life in both cases, since the conflict between doing no harm and protecting from harm has reference to one and the same individual. (Bay 1995: 41)

Assisted suicide can raise a fundamental conflict between the sanctity of life and the obligation to relieve suffering. Different traditions elevate one or the other; while many Protestant traditions emphasize the patient’s dignity and human freedom, Roman Catholics turn to natural law. The Orthodox Christian tradition focuses on the ambiguity of either choice due to human moral corruption. For all Christians, death is a tragedy that disrupts the unity of body and soul that constitutes the human person (Stempsey 1997: 259). Yet suffering is also a tragedy.

There is a difference between taking a life and allowing a person to die. Patients or families are increasingly faced with decisions about when to end the use of technological life supports. The Catholic Church has a tradition of permitting patients or their families to forgo extraordinary care. But what is extraordinary? We see nothing extraordinary in the use of machines such as x-rays or MRIs for diagnostics. Nor do we question life-sustaining machinery, such as ventilators or dialysis, for acute care. However, their use for the terminally ill can be considered extraordinary. The US Catholic Bishops (2001) have deemed feeding tubes ‘in principle, an ordinary means of preserving life’ – however, they are not obligatory. Pope John Paul II, in the late stages of Parkinson’s disease, refused to have such a tube implanted (a decision that was honoured) as his death was imminent. For most Christians, while life is to be sustained through ‘vigorous medical intervention’ as long as possible, when the hope of self-sustenance is lost, and the patient is clearly in the dying process, all technological interventions can be discontinued (Harakas 1999: 11).

Medical professionals generally use four criteria to determine the legitimacy of discontinuing mechanical life support: the presence of a fatal condition; the autonomy of the patient or an advance directive; whether the therapy is effective; and whether treatment places an excessive burden on the patient, family, or community. Determining efficacy can be problematic. For example, it is unclear whether to remove a feeding tube from a person in a persistent vegetative state when we cannot know if they will ultimately revive or if feeding is merely prolonging the dying process. What constitutes an excessive burden is also problematic. Prolonged treatment can place much stress on the patient’s family, particularly when the patient is a child. It is also a financial burden, raising issues of justice and accessibility.

William Stempsey writes: ‘[e]nd-of-life decisions reflect not only the meaning we find in dying, but also the meaning we have found in living’ (1997: 249). Traditionally, Christianity has considered disease, suffering, and death to be part of the cosmic narrative of sin, forgiveness, and salvation. Just as Jesus bore suffering and death on the cross, Christians are expected to endure suffering with patience and not fear death. Indeed, suffering can be seen as a vehicle of spiritual growth. Yet borderline cases raise existential questions difficult to answer. Before the advent of advanced medical technologies, it was relatively easy to answer the following: when does life begin or end? Who or what is a human person? When is suffering too much to bear? Technology has unsettled these previously clear answers.

3 Computer technology and human identity

Technology changes our self-image. Hydraulics led to an image of human beings as governed by the rise and fall of various humours. Remnants of the industrial revolution can be found in our self-descriptions of being ‘under stress’ or ‘letting off steam’. We came to regard the body as a machine and, as society moved from being predominantly rural to urban, we increasingly found our identity in the machines we surrounded ourselves with, the ‘modern’ world of locomotives, automobiles, radios, and appliances. The machine became the subject of art, architecture, and even poetry (Trachtenberg 1986). Today the technology that most shapes our self-image is the computer. We frequently use computational metaphors to discuss the workings of the human mind, such as saying we are ‘out of bandwidth’ or putting something in ‘long-term memory’ – or even just saying something ‘doesn’t compute’. As Willem Drees notes, while we consider ourselves as being created in the image of God, ‘we speak of ourselves as if we were made in the image of machines’ (2002: 602).

Are our minds essentially computers? The prospect of the development of an artificial general intelligence (AGI), a machine capable of doing anything that a human can do at a similar or superior level, rests on this assumption. If a machine can replicate most, or all, human behaviours, what, if anything, makes humans unique? Returning to the imago Dei described in Gen 1:28, we note that its earliest interpretations located God’s image in our minds. David Cairns, in his historical survey The Image of God in Man, writes: ‘In all the Christian writers up to Aquinas we find the image of God conceived of as man’s power of reason’ (1973: 60). Twentieth-century biblical scholars (such as Johannes Hehn and Gerhard von Rad), noting the juxtaposition of human creation in God’s image with the call to exercise stewardship and dominion, stressed human action, while systematic theologians (e.g. Karl Barth, Dietrich Bonhoeffer, Wolfhart Pannenberg, Catherine LaCugna) located the image of God in our relationships, with God and each other (Herzfeld 2009: 58–64).

The first approach to AI focused on rational thought, attempting to define human rationality as a system of symbols and rules, a basic computer program. Such systems were initially quite successful when applied to limited environments, such as the world of chess or the rules of calculus. They were less successful at navigating the vicissitudes of the real world. Roboticist Hans Moravec noted that, while computers could replicate the higher functions of human cognition, such as playing chess, it turned out to be much more difficult to replicate the skills of a toddler (1988: 15). While this led to an ‘AI winter’ of diminished hopes and a fundamental redirection of the field, the symbolic approach lives on in the philosophical view of ‘patternism’, which posits information, rather than matter, as the fundamental building block of the universe (Davies and Gregersen 2010; Lloyd 2007). John Haught sees this as a positive move, suggesting that a God who sets up the universe as an information system, complete with redundancy and noise, better fits the universe we live in than the prior conception of God as ‘designer’ (2010: 317–318).

Likewise, Niels Henrik Gregersen notes that seeing everything as patterns of information corrects the traditional division between matter and spirit. For Gregersen, information is analogous to John’s Logos (see John 1), the all-pervasive structuring principle of the universe (Gregersen 2010: 319–342). One criticism of this view is that the patternist, like the strict materialist, makes no distinction between humans and animals, nor humans and machines – nor, for that matter, humans and rocks – thus veering into panpsychism. All are simply patterns of information.

This view suggests that artificial intelligence could be quite similar to human intelligence, since we could replicate in silicon the patterns or logic that make up our thoughts. Many, however, disagree. William Clocksin and A. G. J. MacFarlane argue that reason alone cannot account for human intelligence, which they see as impossible without social relationships. While Aristotle elevates rationality as the seat of intelligence and human distinctiveness, Clocksin and MacFarlane note that much of our thinking defies logical consistency:

People do not rely on deductive inference in their everyday thinking. Human experience is marked by incompetence, blunders and acts of misjudgment [...] We sometimes proceed not by what we discover to be true, but by what we want to be true. Yet non-rationality works in everyday situations. (MacFarlane and Clocksin 2003: 1732)

As philosopher Marius Dorobantu points out, ‘[m]ost of our best experiences in life, from falling in love, to enjoying a work of art, and to living a spiritual episode have an important irrational component’ (2021: 34).

By the 1980s, computer scientists had moved from trying to capture human thinking in rule-based systems to designing robots and programs that were functional in a particular domain. Functional AI has given us machines such as the Roomba or the Mars rover, machines that act as our representatives, doing tasks we either cannot or do not wish to do.

However, for AI to be functional in more general settings, it needs to navigate the world of human relationships. Research has moved in this direction. Relational robots now serve as receptionists in Japan, and Siri and Alexa do their best to answer our questions and navigate the internet for us. As computers become both more functional and increasingly relational, they encroach on areas that were previously thought to belong only to humans. One concern is that increasing automation may lead to a crisis in employment as computers take on more of our tasks. While this will likely create new highly skilled jobs, lower skilled occupations will be more easily automated, exacerbating inequality. Loss of work represents a possible loss of an important source of dignity and identity for many. Relational robots are also poised to move from the reception room to the bedroom. Sexbots in brothels or homes raise a host of theological issues, not least of which is a reconsideration of the purpose of sexual intercourse and the sanctity of marriage (Herzfeld 2017).

While such robots dominate the popular imagination of AI, most AI programs work quietly behind the scenes. AI determines our search results and social media feeds, giving us behavioural and interest-based results. It also determines who gets parole, who gets certain medical procedures or prescriptions, and whose resume might be forwarded to a committee. The algorithms behind these choices are proprietary and have frequently been shown to be biased against minorities. They are also opaque, in that it is often unclear to either the user or the developer how the algorithm reached its conclusions. While most of us enjoy the customized ads we see online or the suggestions for what we should read or watch next, these algorithms are criticized for their manipulation and surveillance of users as well as their role in political polarization.

These problems stem from our society’s failure to appreciate the human being as a whole consisting of body, mind, and soul. We have elevated one characteristic – namely intelligence – above all others. Political scientist Victor Ferkiss warns:

While it is untrue that technology determines the future independently of human volition, there is no question that human individuals and human society are increasingly under pressure to conform to the demands of technological efficiency, and there is a real possibility that the essence of humanity will be lost in the process [...] Man must maintain the distinction between himself and the machines of his creation. (Ferkiss 1969: 59–60)

Virtual reality pioneer Jaron Lanier suggests that, inasmuch as we humans are more flexible than computers, we must guard against adapting to them rather than adapting them to us. He notes that:

the Turing test cuts both ways. You can't tell if a machine has gotten smarter or if you've just lowered your own standards of intelligence to such a degree that the machine seems smart. If you can have a conversation with a simulated person presented by an AI program, can you tell how far you've let your sense of personhood degrade in order to make the illusion work for you? (Lanier 2010: 32)

Lanier views our captivation by computer technology as a new religion, but not a spiritual path.

4 Altering nature: technology and the objectification of nature

Throughout the Middle Ages, scientists and philosophers sought a technology that would transmute base metals into gold. The means of this transformation was called the ‘philosopher’s stone’ and variously described as a material, a process, or an elixir that could not only produce gold but also perfect any substance, cure disease, and restore lost youth. While alchemists did come up with useful chemical and pharmacological substances, they failed to find such a universally salvific substance. Their quest, however, illustrates a philosophy of nature as object, to be manipulated by our technologies according to our whims.

The claims and hopes for nanotechnology are strikingly like those of alchemy. Nanotechnology matches the real results of medieval alchemy in that, as a method of materials science, it has already produced a variety of new and improved materials and promises far more in the decades ahead. The aspirations of alchemy are resurrected in claims that nanotechnology will ultimately give us a method through which matter can be changed at the molecular level from one form into any other, a salvific transformation that could lead to unprecedented wealth, restoration of the body, and even eternal life. Ideally, nanotechnology would let us assemble any substance we wish, snapping together molecules like Lego blocks. This requires tools that allow us to manipulate individual atoms and molecules. Physicist Richard Feynman (1960: 31) suggests:

I want to build a billion tiny factories, models of each other, which are manufacturing simultaneously [...] The principles of physics, as far as I can see, do not speak against the possibility of maneuvering things atom by atom. It is not an attempt to violate any laws; it is something, in principle, that can be done; but in practice, it has not been done because we are too big.

For futurists (Eric Drexler, Ray Kurzweil, Robert Freitas), nanotechnology is the new philosopher’s stone that will give us better lives, and, potentially, immortality. New materials, not found in nature, will be produced; nanobots will repair our bodies; nothing will be scarce. On the other hand, nanobots gone wrong could result in what Eric Drexler (1986) has called the ‘gray goo problem’, in which the molecular reassembling of raw materials runs amok and obliterates life on earth. Nanotechnology, like so many of our technologies, rests on two assumptions: that progress is inevitable and inevitably positive, and that nature consists merely of objects to be manipulated as we choose.

4.1 ‘Playing God’?

Such an instrumental use of technology to radically alter either the natural or the human world has frequently been labelled ‘playing God’. Ted Peters notes that the phrase ‘playing God’ (most often understood as a cautionary warning to stop using a particular technology) can also be a reminder of the stewardship conferred on humans in Genesis 1, a suggestion that we ensure any technology ‘contributes to human welfare without creating new injustices’ (2003: 2). He notes that ‘playing God’ can have three overlapping meanings. The first is the goal of science: to reveal the mysteries of nature. The second is to hold power over life and death, a power latent in many of our technologies. The third, as seen in both genetic engineering and nanotechnology, is to alter the essence of nature, fundamentally changing what God has created. Peters argues that human freedom and our call to beneficence turn this last meaning from ‘stop’ to ‘proceed, but with caution’ (2003: 20–27). Grant Macaskill (2019) takes a similar approach, arguing that biotechnological research should be embraced when consistent with God’s goodness, care for the weak, and with regard for our propensity to sin.

In The Nature and Destiny of Man (1962), Reinhold Niebuhr notes that we humans are finite beings who envision, and thus continually strive for, the infinite. Created in the image of God, ‘man can find his true norm only in the character of God but is nevertheless a creature who cannot and must not aspire to be God’ (1996: 163). Willem Drees warns against drawing a line between what is God’s work and what is ours, suggesting that this plays into a ‘God of the gaps’ mentality where we look to God only when our human skills fall short (2002: 645). For John Brooke and Geoffrey Cantor, a ‘God of the gaps’ implies that

it is presumption to attempt anything toward human improvement [...] It is as if arguments for divine wisdom require this to be the best of all possible worlds, with the corollary that attempts at improvement would both be sacrilegious and ineffective. (Brooke and Cantor 1998: 314)

Natural theologians, who seek to understand God through observing the ‘book of nature’, take a skeptical view of any wholesale transformation of God’s creation: playing God is a usurpation of God’s powers.

Others (Sölle and Cloyes 1984; Heyward 1984; Hefner 1993) suggest that human creativity does not necessarily diminish God. Rather, the more we strive to improve and extend our boundaries, the more God becomes God. Our technologies make God present in the world. Peters takes the middle ground, suggesting caution:

To accuse our scientists today of ‘playing god,’ is to accuse them of violating the sacred [...] The commandment against playing god is not biblical. So, what counts as the violated sacred? Here is the answer: nature [...] To violate nature is to risk nature’s retaliation, to risk letting a monster loose on the world. (Peters 2018: 146)

Peters cites the myth of Frankenstein as a call to monitor our creations, to ‘embrace caution, prudence, and sound judgment […] to anticipate problems and to avoid harms’ (2018: 150). In Jewish tradition, creation is incomplete, which gives humans a responsibility to work toward its completion through technology, so long as this is tempered by wisdom and responsibility. The difference between Rabbi Loew of Prague, the most famous creator of a golem, and Victor Frankenstein is that Loew controlled his golem while Frankenstein did not (Sherwin 2004: 53–54, 194–196).

4.2 Energy technologies and climate change

While nanotechnology is a nascent field that has yet to substantially alter nature, we have changed the world around us, wittingly and unwittingly, through the burning of fossil fuels. Energy technologies are foundational since they power most other technologies. They also form the bedrock of our culture, raising political barriers to change. Climate change, and its attendant side effects, has become the defining technological and environmental crisis of the twenty-first century, one that cries out for immediate action.

There is no longer any scientific dispute that the earth is warming. The Intergovernmental Panel on Climate Change (IPCC) 2021 report states that it is ‘unequivocal’ that this warming is in part caused by human agency. There is a direct correlation between the amount of carbon dioxide in the atmosphere and the observed rise in global temperature. The world is already 1.5 degrees Celsius hotter than it was in the pre-industrial nineteenth century. Ice sheets on Greenland and Antarctica are melting, and glaciers in both hemispheres are receding. Ocean levels have risen 8 inches on average over the past century, and the rate of increase has doubled since 2006, threatening the existence of islands in the South Pacific. Storms and floods have become more frequent and fiercer, devastating parts of Europe and the American South. Meanwhile, droughts and their ensuing fires ravage Australia, the American West, Siberia, and the Mediterranean. The IPCC projects that by the end of the twenty-first century the Earth’s surface temperature will be 2 to 3 degrees Celsius warmer. Large parts of the Indian subcontinent, Middle East, and Africa will become uninhabitable, while the melting of glaciers and calving of ice caps raise sea levels 4 to 8 meters, inundating many coastal cities, most of Bangladesh, and several island nations. Such warming would lead not only to a massive loss of human life but also to mass extinctions of both plants and animals.

The theory of ‘deep incarnation’ posits a God that is incarnate in all that is. Wessel Bentley explains:

Deep Incarnation argues that God did not only become human in the person of Jesus, but through the Incarnation, God assumes a human body in the natural world with all its evolutionary progress and processes. (Bentley 2016: 3)

According to Gregersen, this means:

In Christ, God enters into the biological tissue of creation in order to share the fate of biological existence [...] In the incarnate One, God shares the life conditions of foxes and sparrows, grass and trees, soil and moisture. (Gregersen 2013: 389)

Elizabeth Johnson argues that this implies that to love God is to love the world and to care for the suffering of any and all creatures (Johnson 2018: 5.7).

We have alternative technologies to fossil fuels; what we lack is the political and social will to change our patterns of consumption. Should this not change, we may be forced to take a riskier route and deploy geoengineering technologies such as ocean fertilization, injection of reflective particles into the atmosphere, or mechanical sequestration of carbon dioxide. Each geoengineering fix carries the risk of unforeseen consequences, and the potential of these techniques to alter climate either globally or locally remains speculative. In the meantime, environmentalists worry that the possibility of a technological fix might distract policy makers from the politically challenging efforts to reduce greenhouse gas emissions and build societal resilience. In his encyclical Laudato Si' (2015), Pope Francis criticizes the objectification of nature inherent in a ‘technocratic paradigm’ which equates technology with progress and imagines a technological fix for every problem. He chides the West for embracing each new technology without considering its effects on the environment and on the poor, who are the first and most deeply affected. Francis calls for an evaluation of all technologies based on ‘concern for our common home’ and the common good:

When technology disregards the great ethical principles, it ends up considering any practice whatsoever as licit [...] A technology severed from ethics will not easily be able to limit its own power. (Pope Francis 2015: 136)

Our reluctance as a society to move from fossil fuels to renewable energy sources makes it highly unlikely that we will be able to pass the Earth on to future generations in better condition than we received it. While some attribute this reluctance to concern for the economy, jobs, and the welfare of the poor, Francis notes that this is a false dichotomy, for human welfare depends on a functional ecosystem:

We have to realize that a true ecological approach always becomes a social approach; it must integrate questions of justice in debates on the environment, so as to hear both the cry of the earth and the cry of the poor. (Pope Francis 2015: 49)

5 The future of technology: utopian and dystopian visions

In his Gifford Lectures, Ethics in an Age of Technology (1992), Ian Barbour presents three faces of technology: as liberator, as threat, and as instrument of human power. Technology as liberator offers a variety of positive contributions to human lives, taking over difficult or tedious work, presenting cures for all that ails us, facilitating communication and human discovery, entertaining us, and increasing the standard of living around the world. When unexpected consequences or environmental problems arise, technology as liberator suggests these can be resolved by further technologies. The public face of technology is that of a saviour, a path to a better future and the means of transcending the suffering inherent in the human condition. Technology as threat underlies our current environmental crisis and threats such as war conducted with nuclear or autonomous weapons. Ultimately, both visions rest on the third face: technology as an instrument of power, wielded by some humans over nature or other humans.

5.1 Technological millenarianism

In theological terms, the liberating face of technology is a millenarian project, one suited to a modern and scientific world. Religious millenarianism, the belief in an imminent age of blessedness, has typically arisen among groups challenged by demographic change or climatic stressors. It has found purchase in agrarian cultures, such as Papua New Guinea, and in cosmopolitan ones, such as Renaissance Münster. It promises a solution to current problems delivered from outside, rather than from within the society itself. Religious millenarians have envisioned the imminent second coming of Christ, or of cargo, or of aliens. Technologies such as genetic engineering, nanotechnology, or AI likewise present dreams of a salvation that is always a decade away, a salvation at the hands of something or someone other. Such dreams remove the onus for solving society’s ills from its members and soothe those who feel powerless in the face of seemingly insurmountable obstacles. Help is just around the corner. When the corner is reached and God or AI or aliens have failed to deliver, millenarians simply recalibrate their timeline or suggest that we misunderstood the extent of their promises. Theologian Ted Peters sums up the millenarian dream for AI:

All we need do is turn a couple technological corners and, suddenly, the abundant life will be ours. We will be liberated from the vicissitudes of biological restraints such as suffering and death; and we will be freed by enhanced intelligence to enjoy the fulfilling life of a cosmic mind. (Peters 2011)

One could easily substitute genetic engineering or nanotechnology, for the promises are the same.

These promises have galvanized the transhuman and posthuman movements. Theirs is not the philosopher’s goal of coming to terms with suffering and death but a quasi-religious goal of surmounting them, ultimately achieving immortality through technological advances that improve us humans or supplant us with a higher form. Foundational to these movements is the assumption that progress is inherent in evolution, and technology is evolution’s current means. Heidi Campbell and Mark Walker state, ‘[t]ranshumanism is the view that humans should (or should be permitted to) use technology to remake human nature’ (2005: 1) while Max More sees a panoply of transhumanisms

that seek the continuation and acceleration of the evolution of intelligent life beyond its currently human form and human limitations by means of science and technology, guided by life-promoting principles and values. (More 2013: 3)

While transhumanism seeks to use technology to modify or improve the human person, posthumanism envisions the possibility of moving beyond human physicality, into a future where intelligence and agency are wielded by machines.

Computers present both transhumanist and posthumanist possibilities. Transhumanists envision uploading the neural pattern of the human brain to a computer. Ray Kurzweil notes the soteriological possibilities:

Up until now, our mortality was tied to the longevity of our hardware. When the hardware crashed, that was it [...] As we cross the divide to instantiate ourselves into our computational technology, our identity will be based on our evolving mind file. We will be software, not hardware. (Kurzweil 2000: 128)

Like most millenarians, Kurzweil has predicted a series of ever-receding dates by which he expects this capability. Unlike the Christian notion of salvation, transhuman salvation is a ‘do it yourself’ project, something we humans might accomplish through our own efforts and agency, thus coming closer to magic than to religious belief (Waters 2011). It offers more, but not eternal, life (Herzfeld 2009: 67). Brain uploading marks a return to a neo-Cartesian dualism in which the body is disposable. However, as Victoria Lorrimar points out, bodies

are fundamental to the way we make sense of the world in our present experience, and provide much of the scaffolding for a shared understanding and experience that makes meaningful communication with our fellow human beings possible. Religious belief cannot be disentangled from our bodily experiences. (Lorrimar 2019: 203)

Short of immortality, AI might lead to better life conditions by finding technological solutions we cannot envision. Computer scientist Ben Goertzel suggested to participants at Davos,

Our top priority should be the creation of beneficent artificial minds with greater than human general intelligence, which can then work together with us to solve the other hard problems and explore various positive avenues. (Goertzel 2013: 130)

Technology might solve sticky engineering problems in the fields of energy, climate, and medicine, make our legal systems fairer, and even, perhaps, find avenues toward world peace. Freed from the need to work, humans could enjoy goods and services produced by our AI servants or our nanotech machines in both physical and virtual reality. In this scenario, technology becomes a quasi-God, one that meets our needs and desires. However, Albert Borgmann (1984) worries that while life might be easier under the rule of technology, demanding but meaningful experiences could be lost.

For posthumanists, a superintelligent AI is the next step in the long evolutionary progression toward beings with greater complexity, greater mastery over their surroundings, and greater intelligence. Pierre Teilhard de Chardin (1959) posited a mutual attraction in all of creation, one that would evolve toward ever higher levels of consciousness, ultimately reaching an Omega point where all consciousness is interconnected. This unity would not erase individuality but would allow each to flourish within a larger whole, a single thinking envelope. Teilhard does not suggest that we will not die; rather, our intellect will become ‘incorporated into a reflective reality that will endure beyond [the] lifespan of the material body’ (Grumett 2011: 45).

Ilia Delio (2020) sees AI, and the rapid development of communication tools such as the internet, as a means to a mental convergence akin to Teilhard’s. Physicist Frank Tipler (1994) holds a similar vision, identifying Teilhard’s ‘Omega Point’ with God and suggesting that as our computational resources increase, some society in the distant future will use those resources to emulate alternative universes in which each of us will reappear and thus be resurrected from the dead. Computer scientist Hans Moravec (1988) also envisions artificially intelligent computers as the next step in an evolution that maximizes intelligence, positing a posthuman future of purely mental creatures. Moravec predicts diversity rather than unity, imagining cyberspace as teeming with disembodied super-minds, as far above us on the evolutionary chain as we are above bacteria.

5.2 A dystopian singularity

Early philosophers of technology (Mumford 1934; Ellul 1964; Jonas 1984; Wiener 1950) were more pessimistic, as are recent writers such as the Unabomber, Joy (2000), and Morozov (2010). They view technology as a threat, a force in its own right that leaves humans little choice but to follow in its wake. They note the unexpected consequences of technology and its ability to be wielded by good and bad actors. While technologies may be developed in all good will and designed to free humans, they are all too often co-opted to exert power and control. One dystopian technological scenario posits the creation of an AI that outstrips human intelligence and control, one that could turn on its creators or be used as a weapon by those creators. But we do not need conscious or superintelligent AI to produce technologies that are damaging to the fabric of human society – a fact obvious in today’s world of computer viruses, ransomware attacks, autonomous weapon development, social media dispersal of falsehoods, and machine learning programs with inherent biases.

Transhumanist and posthumanist visions imagine a continuing human presence or, at minimum, a continuity with human purposes and ideals, in a technologically dominated future. However, this is not a given. In his seminal paper of 1993, computer scientist Vernor Vinge suggested four future possibilities that could enhance intelligence on earth. First, we could build an AI that is ‘awake’ and able to learn without human direction. Second, a large enough computer network might ‘wake up’ and gain superhuman intelligence. Third, computer/human interfaces might lead to superhuman intelligence. Finally, we might find ways to improve the capacity or use of our own brains. Vinge imagined that any of these would result in a ‘singularity’, a break with past evolution that would end with a ‘throwing-away of all the human rules, perhaps in the blink of an eye – an exponential runaway beyond any hope of control’ (1993). Philosopher Nick Bostrom (2014) sees the outcome of such a singularity as either extremely good or extremely bad. Perhaps superintelligent machines would use their capabilities to solve our problems and make life easier. Or they might decide we are messy and superfluous and do away with humans entirely.

Warnings of the singularity have acquired a new resonance with the advent of programs such as DeepMind’s AlphaGo Zero and AutoML-Zero. The ‘Zero’ in these names means that these programs are designed to learn with zero human input. AlphaGo Zero, playing against itself, mastered Go in three days. In forty days, it exceeded all human-aided Go programs and developed strategies human players neither used nor understood. This would seem to represent AI passing Vinge’s initial benchmark for a superhuman intelligence no longer constrained by the limits of human knowledge and tuition. Berkeley computer scientist Stuart Russell believes that a machine able to understand human language would have an even greater impact, since it could read everything ever written:

Once we have that capability, you could then query all of human knowledge and it would be able to synthesize and integrate and answer questions that no human being has ever been able to answer because they haven't read and been able to put together and join the dots between things that have remained separate throughout history. (Thomas 2021)

There remains the question of whether the AI would understand what it read. Our lack of knowledge of how human intelligence and consciousness work has led many to believe the singularity is a long way down the road. According to MIT’s Rodney Brooks:

Building human level intelligence and human level physical capability is really, really hard. There has been a little tiny burst of progress [… but] we are less than 1% of the way there, with no real intellectual ideas yet on how to get to 5%. (Brooks 2018)

5.3 Technology and sin

Norbert Wiener (1950) warned that the greatest danger is posed not by a singularity or other runaway technology but by our commercial and military exploitation of technology’s power and our possible loss of control – not because the computer has outwitted us, but because we fail to comprehend the true nature of what we have asked it mindlessly to do. Like King Midas, or the finder of a djinni’s lamp, our will to power may give us precisely what we wish for but not what we want.

Technology obviously amplifies our abilities to feed the hungry, heal the sick, provide food, clothing, and adequate shelter, and generally enhance our physical and mental wellbeing. But can technology amplify human goodness per se? Thomas Douglas (2008) notes that we already have pharmaceuticals that lower aggression. Ingmar Persson and Julian Savulescu suggest that we should expand these and look for other forms of bio-enhancement to eliminate racial aversions and to enhance ‘sympathy and justice as regards future generations and non-human animals’ (2012: 112). Celia Deane-Drummond counters that infusing rather than teaching morality seems likely to fail; but even should it succeed, it would impair human freedom and obviate the theological goal of growing in wisdom and virtue throughout a lifetime (2017: 184–185). Todd Daly believes moral bio-enhancement could backfire and enhance human pridefulness: ‘[p]ride encourages us to combat our helplessness by turning to our own devices, schemes, philosophies, and technologies in an attempt to secure our salvation’ (2017: 219).

Technology amplifies both human sin and human goodness. While many scientists and developers assume, and some philosophers have argued, that technology is morally neutral (Pitt 2014), others have shown how moral values or political beliefs are embedded in the design of artifacts (Winner 1980). Our technologies serve values that transcend them, though we may not be cognizant of the link. Big data, for example, is imbued with moral issues (often unwittingly), from the ways in which the data is gathered and aggregated, through the preservation or loss of confidentiality, to the uses to which the data is put (Fuller 2017). Training sets for neural networks have exhibited unwitting bias, such as when facial recognition systems are trained using only white or male faces. Jonas writes that ‘the predictive knowledge falls behind the technical knowledge which nourishes our power to act’ (1973: 39).

Jonas warns against the ‘inherently utopian’ goals of modern technology and our reluctance to take responsibility for future unknowns (1973: 50–51). Peters points to the computer virus as emblematic of the ways in which human sinfulness creeps into most technologies. He cautions that:

[W]e should move forward, but we should not presume that progress in every respect is inevitable or guaranteed. Technological advance does not belong to an underlying or inevitable advance in human goodness or human achievement. (Peters 2011: 80–81)

The Catholic Church has long regarded technologies as morally ambiguous – good, neutral, or bad – depending on the outcome of their use (Green 2017: 7). Pope Francis, in Laudato Si', notes that some technologies (such as our arsenal of deadly weapons) are clearly evil, in that their sole intent is destruction; while other technologies, such as most medical advances, remain benign. While a particular technology might be morally neutral, Francis is clear that technology as a whole is not:

Science and technology are not neutral; from the beginning to the end of a process, various intentions and possibilities are in play [...we need] to appropriate the positive and sustainable progress which has been made, but also to recover the values and the great goals swept away by our unrestrained delusions of grandeur. (Pope Francis 2015: 104–114)

One reason we evade the responsibility Jonas and Francis call for is rooted in the way technology distances us, in both time and space, from our actions. Theologian Alfred Schutze notes:

Whereas only a few centuries ago evil, so-called, had to be considered pertinent to moral behaviour, more specifically the backsliding or weakness of the individual, today it also appears in a manner detached from the individual. It shows up impersonally in arrangements and conditions of social, industrial, technical, and general life which, admittedly, are created and tolerated by man. (Kimbrell 2000)

Andrew Kimbrell believes such a distancing results in a ‘cold evil’:

Obviously, few of us relish the thought that our automobile is causing pollution and global warming or laugh fiendishly because refrigerants in our air conditioners are depleting the ozone layer. I have been in many corporate law firms and boardrooms and have yet to see any ‘high fives’ or hear shouts of satisfaction at the deaths, injuries, or crimes against nature these organizations often perpetrate. (Kimbrell 2000)

Ronald Cole-Turner writes, ‘[w]hen we juxtapose technology and morals, as we must, we should learn to expect that technology will stretch, and often break, the categories for thought that previously defined our moral view’ (2011: 11). In effect, technology has altered our definition of sin. In an interview with the Vatican’s daily newspaper, L’Osservatore Romano, Monsignor Gianfranco Girotti (2008) notes the individuality of the traditional seven deadly sins (anger, pride, lust, gluttony, greed, sloth, and envy). The new deadly sins he enumerates are communal, including pollution, genetic manipulation, drug use, and economic disparity between the wealthy and the poor. Enabled by technology, these sins destroy the social fabric on a global scale. The World Council of Churches has come to a similar conclusion and urges a new sense of responsibility and stewardship on issues such as human genetic research, biotechnology and agriculture, climate change, and the economics of globalization.

Yet communal life is, in the end, made up of the actions of myriad individuals. Muslim theologian Seyyed Hossein Nasr notes:

All these elements are tied together—new technologies, political systems, economic systems, and social structures—to affect the way things are changing [...] Having forgotten their vice-regency today men are trying to act as gods, and they will be punished in the most severe way for this sin. I have always said that however powerful we may appear to be as we try to destroy nature, nature will have the final say. (Hossein 2007: 93)

Borgmann believes that the powers we accrue through technology have played a part in the decline of religious faith:

A promise of salvation seems to have no purchase in a situation where health is secure, food and shelter are unfailingly available, where boredom and unease are countered by sophisticated diversion. (Borgmann 2003: 85)

Technology so far has not overcome disease, dissatisfaction, and ultimately, death. Brian Brock notes that we cannot ultimately place our hopes in technology:

Facing the reality that all we can do is intervene in processes we did not create and that are far bigger and more complex than we grasp raises more theological questions about what really sustains human life on earth, and how human life is related to the whole of creation. (Brock 2014: 114)

As we have already seen, each technological fix brings new problems. The problems lie not in the technology itself, but in our wielding of it in what Niebuhr considered a ‘willful refusal to acknowledge our own temporality’. Niebuhr calls this our ‘sin of insecurity’, a sin ‘of those, who knowing themselves to be insecure, seek sufficient power to guarantee their security, inevitably, of course, at the expense of other life’ (1996: 190).

Every technology so far developed has been directed toward a limited goal and thus fails, by itself, to sustain a long-range and integral view of human life and human needs (Borgmann 2003: 83). Technology has brought us incredible advances and advantages. Through it we live, in the aggregate, longer, healthier, and in many ways happier lives. But to look at technology in a clear-eyed way, we must let go of the simplistic instrumental view and recognize that every technology involves a delicate interplay between our tools, our selves, and our environment. Each has both short- and long-term consequences. We yearn to both transcend and control the natural world, but we share its finiteness, its limitations, and ‘the condition of finiteness [...] is a problem for which there is no solution by any human power. Only God can solve this problem’ (Niebuhr 1996: 295).


Copyright Noreen L. Herzfeld (CC BY-NC)


  • Further reading

    • Cole-Turner, Ronald (ed.). 2008a. Design and Destiny: Jewish and Christian Perspectives on Human Germline Modification. Cambridge, MA: MIT Press.
    • Cole-Turner, Ronald (ed.). 2011b. Transhumanism and Transcendence: Christian Hope in an Age of Technological Enhancement. Washington, DC: Georgetown University Press.
    • Deane-Drummond, Celia, Sigurd Bergmann, and Bronislaw Szerszynski (eds). 2015. Technofutures, Nature and the Sacred: Transdisciplinary Perspectives. Farnham: Ashgate.
    • Delio, Ilia. 2020. Re-Enchanting the Earth: Why AI Needs Religion. Maryknoll, NY: Orbis Books.
    • Herzfeld, Noreen. 2012. Technology and Religion: Remaining Human in a Co-Created World. Conshohocken, PA: Templeton.
    • Johnson, Elizabeth. 2018. Creation and the Cross: The Mercy of God for a Planet in Peril. Maryknoll, NY: Orbis.
    • Midson, Scott. 2018. Cyborg Theology: Humans, Technology, and God. London/New York: I. B. Tauris.
    • Murphy, Nancey, and Christopher Knight. 2010. Human Identity at the Intersection of Science, Technology and Religion. Farnham: Ashgate.
    • Pope Francis. 2015. ‘Laudato Si’: On Care for Our Common Home’,
  • Works Cited

    • Barbour, Ian G. 1992. Ethics in an Age of Technology: Gifford Lectures. Volume 2. New York: Harper Collins.
    • Barth, Karl. 1960. Church Dogmatics: The Doctrine of Creation, Part 2. Volume 3. Edited by G. W. Bromiley and T. F. Torrance. Translated by J. W. Edwards. Edinburgh: T&T Clark.
    • Bay, Eugene. 1995. ‘The Christian Faith and Euthanasia’, in In Life and Death We Belong to God. Louisville, KY: Presbyterian Distribution Services.
    • Bentley, Wessel. 2016. ‘Re-Visiting the Notion of Deep Incarnation in Light of 1 Corinthians 15:28 and Emergence Theory’, Theological Studies 72, no. 4: 1–8.
    • Borgmann, Albert. 1984. Technology and the Character of Contemporary Life: A Philosophical Inquiry. Chicago: University of Chicago Press.
    • Borgmann, Albert. 2003. Power Failure: Christianity in the Culture of Technology. Grand Rapids: Brazos.
    • Bostrom, Nick. 2014. Superintelligence: Paths, Dangers, Strategies. Oxford: Oxford University Press.
    • Brock, Brian. 2014. Captive to Christ, Open to the World: On Doing Christian Ethics in Public. Edited by Kenneth Oakes. Cambridge: Lutterworth Press.
    • Brooke, John, and Geoffrey Cantor. 1998. Reconstructing Nature: The Engagement of Science and Religion. Edinburgh: T&T Clark.
    • Brooks, Rodney. 2018. ‘My Dated Predictions’,
    • Cairns, David. 1973. The Image of God in Man. New York: Collins.
    • Cameron, Nigel, and Amy DeBaets. 2008. ‘Germline Gene Modification and the Human Condition Before God’, in Design and Destiny: Jewish and Christian Perspectives on Human Germline Modification. Edited by Ronald Cole-Turner. Cambridge, MA: MIT Press, 93–118.
    • Campbell, Heidi, and Mark Walker. 2005. ‘Religion and Transhumanism: Introducing a Conversation’, Journal of Evolution and Technology 14, no. 2: 1–15.
    • Carr, Nicholas G. 2010. The Shallows: What the Internet Is Doing to Our Brains. New York: W. W. Norton & Company.
    • Catholic News Agency. 2008. ‘Vatican Bishop Points to Modern Social Sins’,
    • Cole-Turner, Ronald. 1997. Human Cloning: Religious Perspectives. Louisville, KY: Westminster John Knox.
    • Cole-Turner, Ronald. 2008b. ‘Religion and the Question of Human Germline Modification’, in Design and Destiny: Jewish and Christian Perspectives on Human Germline Modification. Edited by Ronald Cole-Turner. Cambridge, MA: MIT Press, 1–28.
    • Cole-Turner, Ronald. 2011a. ‘The Transhumanist Challenge’, in Transhumanism and Transcendence: Christian Hope in an Age of Technological Enhancement. Edited by Ronald Cole-Turner. Washington, DC: Georgetown University Press.
    • Committee on Doctrine of the United States Conference of Catholic Bishops. 2001. ‘Ethical and Religious Directives for Catholic Health Care Service, Sixth Edition’, in
    • Culkin, John M. 1967. ‘A Schoolman’s Guide to Marshall McLuhan’, Saturday Review 71: 51–53, 71–72.
    • Daly, Todd. 2017. ‘A Transhumanist Moral Bioenhancement Program: A Critique from Barth and Bonhoeffer’, in Religion and Human Enhancement. Edited by Tracy Trothen and Calvin Mercer. Cham: Palgrave MacMillan, 213–228.
    • Davies, Paul, and Niels Henrik Gregersen (eds). 2010. Information and the Nature of Reality: From Physics to Metaphysics. Cambridge: Cambridge University Press.
    • Deane-Drummond, Celia. 2017. ‘The Myth of Moral Bio-Enhancement: An Evolutionary Anthropology and Theological Critique’, in Religion and Human Enhancement. Edited by Tracy Trothen and Calvin Mercer. Cham: Palgrave MacMillan, 175–190.
    • Delio, Ilia. 2020. Re-Enchanting the Earth: Why AI Needs Religion. Maryknoll, NY: Orbis Books.
    • Diamond, Jared M. 1998. Guns, Germs, and Steel: The Fates of Human Societies. London: Random House.
    • Dorobantu, Marius. 2021. ‘Cognitive Vulnerability, Artificial Intelligence, and the Image of God in Humans’, Journal of Disability & Religion 25, no. 1: 27–40.
    • Douglas, Thomas. 2008. ‘Moral Enhancement’, Journal of Applied Philosophy 25, no. 3: 228–245.
    • Drees, Willem B. 2002a. ‘Human Meaning in a Technological Culture: Religion in an Age of Technology’, Zygon 37, no. 3: 597–604.
    • Drees, Willem B. 2002b. ‘“Playing God? Yes!” Religion in the Light of Technology’, Zygon: Journal of Religion and Science 37, no. 3: 643–654.
    • Drexler, K. Eric. 1986. Engines of Creation. Anchor Press/Doubleday.
    • Edmonds, Matt. 2011. A Theological Diagnosis: A New Direction on Genetic Therapy, ‘Disability’ and the Ethics of Healing. London: Jessica Kingsley.
    • Ellul, Jacques. 1964. The Technological Society. New York, NY: Vintage Books.
    • Ferkiss, Victor C. 1969. Technological Man: The Myth and the Reality. New York: Mentor Books.
    • Feynman, Richard P. 1960. ‘There’s Plenty of Room at the Bottom’, Caltech Magazine. 22–36.
    • Fuller, Michael. 2017. ‘Big Data, Ethics and Religion: New Questions from a New Science’, Religions 8, no. 5: 86–96.
    • Funk, Cary, Alec Tyson, Brian Kennedy, and Courtney Johnson. 2020. ‘Biotechnology Research Viewed with Caution Globally, but Most Support Gene Editing for Babies to Treat Disease’, Pew Research Centre.
    • Goertzel, Ben. 2013. ‘Artificial General Intelligence and the Future of Humanity’, in The Transhumanist Reader. Edited by Max More and Natasha Vita-More. Wiley & Sons, 128–137.
    • Graham, Elaine. 2006. ‘In Whose Image? Representations of Technology and the “Ends” of Humanity’, in Future Perfect: God, Medicine, and Human Identity. Edited by Celia Deane-Drummond. New York: T&T Clark.
    • Green, Brian. 2017. ‘The Catholic Church and Technological Progress: Past, Present, and Future’, Religions 8, no. 6: 15–31.
    • Gregersen, Niels Henrik. 2010. ‘God, Matter, and Information: Towards a Stoicizing Logos Christology’, in Information and the Nature of Reality: From Physics to Metaphysics. Edited by Niels Henrik Gregersen and Paul Davies. Cambridge: Cambridge University Press, 319–348.
    • Gregersen, Niels Henrik. 2013. ‘Cur deus caro: Jesus and the Cosmos Story’, Theology and Science 11, no. 4: 370–393.
    • Grumett, David. 2011. ‘Transformation and the End of Enhancement: Insights from Pierre Teilhard de Chardin’, in Transhumanism and Transcendence: Christian Hope in an Age of Technological Enhancement. Edited by Ron Cole-Turner. Washington, DC: Georgetown University Press.
    • Harakas, Stanley. 1999. The Orthodox Tradition: Religious Beliefs and Healthcare Decisions. Park Ridge Center for the Study of Health, Faith, and Ethics.
    • Haught, John. 2010. ‘Information, Theology, and the Universe’, in Information and the Nature of Reality: From Physics to Metaphysics. Edited by Niels Henrik Gregersen and Paul Davies. Cambridge: Cambridge University Press, 301–318.
    • Hefner, Philip J. 1993. The Human Factor: Evolution, Culture, and Religion. Minneapolis: Fortress Press.
    • Heidegger, Martin. 1977. The Question Concerning Technology, and Other Essays. Translated by William Lovitt. New York: Garland. First published 1954.
    • Herzfeld, Noreen L. 2002. In Our Image: Artificial Intelligence and the Human Spirit. Minneapolis, MN: Fortress Press.
    • Herzfeld, Noreen L. 2009. Technology and Religion: Remaining Human in a Co-Created World. Bryn Mawr, PA: Templeton Press.
    • Herzfeld, Noreen L. 2017. ‘Religious Perspectives on Sex with Robots’, in Robot Sex: Social and Ethical Implications. Edited by John Danaher and Neil McArthur. Cambridge, MA: MIT Press.
    • Heyward, Isabel Carter. 1984. The Redemption of God. Lanham, MD: University Press of America.
    • Hossein, Seyyed. 2007. ‘The Islamic Perspective on the Environmental Crisis: Seyyed Hossein Nasr in Conversation with Muzaffar Iqbal’, Journal of Islam & Science 5, no. 1: 75–96.
    • Intergovernmental Panel on Climate Change. 2021. ‘Climate Change 2021: The Physical Science Basis’, in
    • Johnson, Elizabeth. 2018. Creation and the Cross: The Mercy of God for a Planet in Peril. Maryknoll, NY: Orbis.
    • Jonas, Hans. 1973. ‘Technology and Responsibility: Reflections on the New Tasks of Ethics’, Social Research 40, no. 1: 31–54.
    • Jonas, Hans. 1984. The Imperative of Responsibility: In Search of an Ethics for the Technological Age. Chicago: University of Chicago Press.
    • Joy, William. 2000. ‘Why the Future Does Not Need Us’, Wired.
    • Kimbrell, Andrew. 2000. ‘Cold Evil: Technology and Modern Ethics’, Twentieth Annual E. F. Schumacher Lecture.
    • Kurzweil, Ray. 2000. The Age of Spiritual Machines: When Computers Exceed Human Intelligence. New York: Penguin.
    • Lanier, Jaron. 2010. You Are Not a Gadget: A Manifesto. New York: Alfred A. Knopf.
    • Lloyd, Seth. 2007. Programming the Universe: A Quantum Computer Scientist Takes on the Cosmos. New York: Vintage.
    • Lorrimar, Victoria. 2017. ‘The Scientific Character of Philip Hefner’s “Created Co-Creator”’, Zygon 52, no. 3: 726–746.
    • Lorrimar, Victoria. 2019. ‘Mind Uploading and Embodied Cognition: A Theological Response’, Zygon: Journal of Religion and Science 54, no. 1: 191–206.
    • Macaskill, Grant. 2019. ‘Playing God or Participating in God? What Considerations Might the New Testament Bring to the Ethics of the Biotechnological Future?’, Studies in Christian Ethics 32, no. 2: 152–164.
    • MacFarlane, A. G. J., and William F. Clocksin. 2003. ‘Artificial Intelligence and the Future’, Philosophical Transactions of the Royal Society of London Series A: Mathematical, Physical and Engineering Sciences 361: 1721–1748.
    • Marcuse, Herbert. 1941. Some Social Implications of Modern Technology. Institute of Social Research 9. New York: Institute of Social Research.
    • May, William. 2008. Catholic Bioethics and the Gift of Human Life. Huntington, IN: Our Sunday Visitor. 2nd edition.
    • McLuhan, Marshall. 1964. Understanding Media: The Extensions of Man. Cambridge, MA: MIT Press.
    • McNeil, Ian (ed.). 1990. An Encyclopedia of the History of Technology. London: Routledge.
    • Meilaender, Gilbert. 2020. Bioethics and the Character of Human Life: Essays and Reflections. Eugene, OR: Wipf and Stock.
    • Moravec, Hans. 1988. Mind Children: The Future of Robot and Human Intelligence. Cambridge, MA: Harvard University Press.
    • More, Max. 2013. ‘The Philosophy of Transhumanism’, in The Transhumanist Reader. Edited by Max More and Natasha Vita-More. Chichester: Wiley-Blackwell, 3–17.
    • Morozov, Evgeny. 2010. The Net Delusion: The Dark Side of Internet Freedom. Cambridge, MA: Public Affairs.
    • Mumford, Lewis. 1934. Technics and Civilization. New York: Harcourt, Brace & Co.
    • Niebuhr, Reinhold. 1996. The Nature and Destiny of Man. Volume 1. Louisville, KY: Westminster John Knox. First published 1962.
    • Noble, David F. 1999. The Religion of Technology: The Divinity of Man and the Spirit of Invention. New York: Penguin Books.
    • Persson, Ingmar, and Julian Savulescu. 2012. Unfit for the Future: The Need for Moral Enhancement. Oxford: Oxford University Press.
    • Peters, Ted. 2003. Playing God? Genetic Determinism and Human Freedom. New York: Routledge.
    • Peters, Ted. 2011. ‘Transhumanism and the Posthuman Future: Will Technological Progress Get Us There?’, Metanexus.
    • Peters, Ted. 2018. ‘Playing God with Frankenstein’, Theology and Science 16, no. 2: 145–150.
    • Pitt, Joseph. 2014. ‘“Guns Don’t Kill, People Kill”; Values in and/or Around Technologies’, in The Moral Status of Technical Artifacts. Edited by Peter Kroes and Peter-Paul Verbeek. Dordrecht: Springer Publishing, 89–101.
    • Pontifical Academy for Life. 2000. Declaration on the Production and the Scientific and Therapeutic Use of Human Embryonic Stem Cells. Vatican City: Vatican Press.
    • Pope Francis. 2015. ‘Laudato Si’: On Care for Our Common Home’,
    • Pope Paul VI. 1968. ‘Humanae Vitae’,
    • Postman, Neil. 1993. Technopoly: The Surrender of Culture to Technology. New York: Vintage.
    • Rahner, Karl. 1972. ‘The Problem of Genetic Manipulation’, in Theological Investigations, Vol. 9, Writings of 1965–67. Translated by G. Harrison. New York: Herder and Herder, 225–252.
    • Sandel, Michael J. 2004. ‘The Case Against Perfection’, The Atlantic.
    • Sherwin, Byron L. 2004. Golems Among Us: How a Jewish Legend Can Help Us Navigate the Biotech Century. Chicago: Ivan R. Dee.
    • Sölle, Dorothee, and Shirley A. Cloyes. 1984. To Work and to Love: A Theology of Creation. Minneapolis: Fortress Press.
    • Stempsey, William. 1997. ‘End-of-Life Decisions: Christian Perspectives’, Christian Bioethics 3, no. 3: 249–261.
    • Swinton, John, and Brian Brock. 2007. Theology, Disability, and the New Genetics: Why Science Needs the Church. New York: T&T Clark.
    • Thomas, Mike. 2021. ‘The Future of AI: How Artificial Intelligence Will Change the World’, Builtin.
    • Thoreau, Henry David. 1997. Walden. Edited by Bill McKibben. Boston: Beacon Press.
    • Teilhard de Chardin, Pierre. 1959. The Phenomenon of Man. New York: Harper.
    • Tipler, Frank J. 1994. The Physics of Immortality: Modern Cosmology, God and the Resurrection of the Dead. New York: Doubleday.
    • Trachtenberg, Alan. 1986. ‘The Art and Design of the Machine Age’, New York Times.
    • Vatican Council II. 1965. ‘Gaudium et Spes’,
    • Vinge, Vernor. 1993. ‘The Coming Technological Singularity: How to Survive in the Post-Human Era’, NASA Lewis Centre and Ohio Aerospace Institute VISION-21 Symposium
    • von Rad, Gerhard. 1961. Genesis: A Commentary. Translated by John H. Marks. Philadelphia: Westminster.
    • von Rad, Gerhard. 1962. Old Testament Theology. Translated by D. M. G. Stalker. New York: Harper and Row.
    • Waters, Brent. 2011. ‘Whose Salvation? Which Eschatology? Transhumanism and Christianity as Contending Salvific Religions’, in Transhumanism and Transcendence: Christian Hope in an Age of Technological Enhancement. Edited by Ronald Cole-Turner. Washington, DC: Georgetown University Press.
    • White, Lynn Townsend Jr. 1967. ‘The Historical Roots of Our Ecological Crisis’, Science 155: 1203–1207.
    • Wiener, Norbert. 1950. The Human Use of Human Beings: Cybernetics and Society. New York: Houghton Mifflin.
    • Winner, Langdon. 1980. ‘Do Artifacts Have Politics?’, Daedalus 109, no. 1: 121–136.
