Social Issues

What we make makes us: the impact of tools on our lives

Anton Balint offers a philosophical-theological reflection on our use of tools in a bid to assess the morality of artificial intelligence.

Crucifixion was a brutal act reflecting the cruelty of the Romans who used it as a means of terror and subjugation, an instrument of their imperial strategy. 

Part of the savagery was the use of iron nails, driven into the victim’s hands to fix him to the wood. As the Norwegian bishop Erik Varden writes in his powerful work Healing Wounds, considering specifically the wounds of Christ on the Cross: 

“[Nails] inflict harsh wounds. Metal is just not supposed to penetrate human flesh.

There is a crying contrast between the suppleness of flesh, each tiniest fragment of which is alive, and the cruel, dead, inflexibility of iron.” 

He then recalls that it was a descendant of Cain – history’s first murderer, according to the Bible – who made ‘all kinds of bronze and iron tools’ (Gen 4:22). 

Indeed, this is the second reference in the Bible to manufactured objects after the fall into sin. The first is found in the previous chapter of the book of Genesis, where we read that God made Adam and Eve ‘garments of skin’ (Gen 3:21) to cover the shame they felt at their nakedness. 

It is interesting to note that the contexts in which objects appear explicitly for the first time in salvation history are negative ones, not due to what these items are, but because of their proximity to sin. 

I say ‘explicitly’ because the Bible is not against instruments per se and the existence of certain objects is implied before man’s fall. 

We read, for instance, in the second chapter of the book of Genesis that God ‘took the man and settled him in the garden of Eden, to cultivate and care for it’ (Gen 2:15), which would imply the use of some tools – at least based on our understanding of what this activity entails. This is a positive context in which objects appear in the Bible, for man had not yet sinned (this only happens in chapter three). 

In fact, there are plenty of objects in Sacred Scriptures which are used for good, including Moses’ bronze serpent (Num 21:8-9), David’s lyre and harp (Ps 33:2), and, of course, Christ’s Cross whereby an instrument of brutality is converted into the instrument par excellence of divine love and our salvation. Here a tool made for a bad intention is turned to the good. 

One also sees tools made with a good intention used for good ends, such as Moses’ bronze serpent which was used to cure the Israelites bitten by venomous snakes.

But instruments made with a good intention can also be turned to bad ends. We see this with the same bronze serpent: the Israelites eventually began worshipping the lifeless reptile rather than thanking God for his help in curing them (2 Kgs 18:4), thereby falling into the sin of idolatry. As a result, the pious king Hezekiah felt obliged to order its destruction.

This biblical excursus shows us how tools can be used for good or bad, whatever might have been the original purpose of their fabrication. 

And we see this all the time in our day: a knife might be manufactured with the sole intent to injure, but someone else uses it to cut free an unlawfully detained captive, or simply to prepare food for cooking. Music can be used to delight listeners and praise God or to invoke demons and accompany sensuous forms of dancing.

All this can help us to assess artificial intelligence which, while certainly a tool, is even more certainly an ambiguous one.

Is it made with a good intention and, even if so, how often do people use it with a good one? 

But in order to evaluate its morality, perhaps we need to ask ourselves not so much why it was made as what it actually is and does, and whether this is good or evil – what moral theologians call its ‘moral object’.

This concept above all determines the moral environment of that tool, which encompasses not only the intention of its maker or that of its user but also the capability of that object to positively or negatively affect other persons beyond the maker and/or user. A nuclear bomb is always potentially destructive, in and of itself, whatever the intention of its maker and possible owner (which might be as a deterrent).

We should also bear in mind that there is no tool without a person because only people make them and only people use them. It is precisely this personal dimension which gives tools the moral dimension which they don’t have in their use by animals. A beaver building a dam with sticks and branches can never be blamed or praised for it.

But a human person makes a tool and a person uses it, with moral responsibility for their actions. The human person, made in the image and likeness of God, as the Bible says (Gen 1:26-27), thereby has a rational, spiritual dimension, but also a material one, which also somehow reflects God (as we see in Gen 2:7). Tools reflect our bodily-spiritual nature and are ways for both these dimensions to reflect God’s action.

God leaves his fingerprints on all that he has made, as St John of the Cross so beautifully expounded in his Spiritual Canticle: “A thousand graces diffusing, he passed through the groves in haste, and merely regarding them, as he passed, clothed them in his beauty”. All creation sings of the existence of God (Ps 148:1-14) and these ‘thousand graces’, like his words of revelation, carry God’s presence. 

In an imperfect, but real, manner man too leaves his fingerprints on what he makes, marks which also carry his presence: a craftsman is known by a trademark, a painter by a method, an architect by a style and so on. What we make bears witness to who we are as creatures, just like what God makes speaks of who he is as Creator.

This means that what we make not only has a moral dimension but also a personalistic one; when we make something, not only are our intentions (morally good or bad) involved, but our entire being, our full personhood as body and soul, is implicated in the process. 

When we pick up a katana, for example, we engage not only with the sword but also with its maker and, indeed, with a whole cultural world in which the katana came to be, with all its customary use as a sword, as well as with the more abstract and symbolic aspects related to the object. 

Similarly, when we wear a shirt or a dress crafted by a certain artisan, we carry on our bodies the presence – be it the memory or the living recognition – of that tailor or dressmaker embedded in the model, trademark, materials used, the colours, and so on. 

This is also true when it comes to objects that are made primarily to be interacted with mentally (like a piece of code) or emotionally (such as a piece of art), rather than physically.

Why is this important? Because it sheds light on the fact that morality is not a blind force or dead law, but something alive and personal. Our tools morally impact us and others at a personal level and, depending on what the tool is, its impact is more or less profound. What we make, and how we make it, in many ways make us.

Moreover, the tools we make affect our entire personhood: body and soul.

Man is not a duality, matter and non-matter. He or she is a whole. It is the intensity of this personal impact that an object can have, coupled with its ability to reach a large or small number of people, that imposes different levels of moral responsibility on its maker and/or its users. 

So, what kind of object is AI? 

AI is code – software – that in part imitates and in part augments certain aspects of human intelligence, such as patterns and structures of language. This piece of code is digital and requires physical infrastructure (a computer, a data centre, etc.) to operate. 

This tool assimilates information and optimises it in order to carry out certain actions normally performed by man’s mental capabilities, such as calculating, memorising, and writing. It has been developed out of both intellectual curiosity and broader business, military, and socio-political interests. 

Moreover, the physical infrastructure in which various AIs are embedded consists of objects (phones, computers, vehicles, etc.) which make up an increasingly indispensable part of the environment in which we live. 

Finally, AIs are built by a very small number of people but, whether for good or for bad, AI personally impacts billions of men and women, old and young, worldwide.

Although we cannot peer into the hearts of those making AI to see whether their intentions are good or bad, something which only God, as fashioner of the heart, can do (see Ps 33:15), we can deduce the degree of moral responsibility of those building and/or using AI based on how this tool impacts or can impact others, the force and moral nature of this impact, and the number of persons affected. 

Take for example the AIs that are used for facial recognition. Many smartphones today allow their users to lock or unlock them using their facial features. The same technology however can be (and, in some cases, already is) employed by governments to monitor and control millions of people. 

Therefore, the engineers developing these AIs must think carefully about the potential use of these technologies. 

The moral risk far outweighs the technical benefits of adding another layer of security to one’s mobile phone. 

There are however AIs that can have far more dreadful consequences. Consider for example those capable of producing fake worlds or believable images of known persons, such as political leaders. 

Or the AIs that can re-write thousands of books to reflect the ideological preferences of certain periods of time. Simply put, AIs carry a giant moral cost alongside the technical or economic benefits that they promise.

Therefore, it must be stressed that the moral dimension of AI also has its source in the intention of those who make this tool. Just as one person might manufacture a particular drug and another might use it either to heal or to kill, so there are coders creating this tool and others employing it. 

In both cases, coders or creators, there might be good or bad intention, or even a certain amorality. Someone might create a piece of code out of mere technical interest in doing so, though their very lack of moral consideration might be at least partially culpable, as they should perhaps have considered the ethical consequences of their creation.

St John Paul II argued in his Love and Responsibility that the artist and those who engage with a piece of artwork have a responsibility to ensure that the sanctity of the human person is not denigrated, especially when the piece of art can be accessed by a great number of people. 

Similarly with AI. Both the coder and those using the code must realise that, because of the large-scale impact of this tool, the degree of moral responsibility, stemming both from their intention and from the personalistic nature of everything that we make, is colossal. Our actions never impact a mere ‘other’, but always another person who is fundamentally just like ourselves, uniquely loved by God.

Anton Balint was born and grew up in Romania. In 2011 he moved to England to study law. Since graduation, Anton has been working in the financial services sector. His intellectual interests are literature, theology, and philosophy, and some of his favourite books include The Brothers Karamazov, The Spiritual Canticle, and Love and Responsibility. When time permits, Anton likes to spend it in the mountains, somewhere in Europe.