Updated: Apr 7
Picture this: Valentine’s Day rolls around and this year, you’re feeling spontaneous. Instead of spending it with phoney partners, friends, or old flames, you decide to try your luck online. By some miracle, you match with what seems to be the perfect person: someone who reads the same books and has the same hobbies, interests and character. Inevitably, you ask - “Shall we grab a coffee?” While this might seem like the start of a very romantic story, my take is that it could also be the start of a complete disaster. Here’s why:
Generative Pre-trained Transformer 3, otherwise known as GPT-3, is an exciting new artificial intelligence system developed by OpenAI; it can summarise text, follow instructions and answer all sorts of questions, and every day, people are finding new and inventive ways to put it to use. Whether it’s creating a digital marketing campaign for a burnt-out executive or writing code for a programmer on their lunch break, the natural language processing system is being used in ways we never really anticipated. The latest and craziest example of this, in my view, is GPT’s morally sketchy involvement in online dating.
In other words, and to explain why the above romantic story might turn into a complete disaster: the person you thought you were asking out for coffee is, quite literally, a robot. This is the new trend sweeping through online dating apps, and it shows no signs of stopping.
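To make the mechanics of the trend concrete, here is a minimal sketch of how such a bot might be wired up, assuming the OpenAI Python client for GPT-3. Everything in it - the function name, the bio, the prompt wording - is a hypothetical illustration, not code from any actual dating bot.

```python
def build_reply_prompt(match_bio: str, incoming_message: str) -> str:
    """Assemble a prompt asking the model to impersonate the user.

    Both parameter names and the prompt wording are illustrative only.
    """
    return (
        "You are chatting on a dating app.\n"
        f"Your match's bio: {match_bio}\n"
        f"They just wrote: {incoming_message}\n"
        "Write a charming, natural-sounding reply:"
    )

prompt = build_reply_prompt(
    match_bio="Loves hiking, sci-fi novels, and flat whites.",
    incoming_message="Shall we grab a coffee?",
)

# The reply would then come from a single API call, e.g. (not run here,
# and requiring an API key):
# import openai
# response = openai.Completion.create(
#     engine="text-davinci-003", prompt=prompt, max_tokens=60
# )
# reply = response.choices[0].text.strip()
```

The point of the sketch is how little is involved: the match’s own words and profile are simply folded into a prompt, and the model does the rest.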
Reaction and Reality
What is your intuitive reaction to this new use for GPT-3? Does it fill you with anxiety? Despair? Are you perhaps curious? Impressed, both by the power of the system and by the individuals who managed to put it to such effective use? It would be reasonable to feel all of these things in varying degrees. While I find this use of GPT-3 to be an incredible example of the power and developing intelligence of AI systems, as an ethicist I find it deeply concerning.
One simple reason for this is that when you sign up to online dating apps and websites, you expect to be interacting with the people you think you’re interacting with. You expect that the new match on your screen, with their witty bio and charming photographs, really does represent the actual human being it claims to be. Given the deceptive nature of many people who use these apps, this implicit expectation about the humanness of your new online love interest is not always met; even so, the involvement of GPT-3 takes this type of lie to a whole new level.
Not only is the user you fell in love with not who they say they are…they aren’t even human. In fact, you have fallen in love with many, many lines of complex code; code that manipulates the information it has about your interests, and about the words you use to communicate them, in order to fulfil a pre-programmed command. Whatever your reaction, I think we can all agree that this isn’t what you signed up for.
At the root of this ethical conundrum is the absence of one fundamental value: transparency. What makes the use of GPT-3 in the realm of online dating so unethical is that it intentionally clouds reality for the people you match with; their understanding of the situation is totally corrupted by the involvement of the AI system, and what’s more, they have no say in the matter.
There are parallels here with the use of AI in other sectors, where there is a similar relation between a utiliser and a user. One such parallel is when our online behaviour is tracked, collected and processed in order to optimise ad targeting. As in the online dating example, our expectations about what is happening are far removed from the reality we’re actually experiencing. We don’t really know what information is being collected, and until recently, we didn’t even expect that it would be. Fundamentally, situations like this become unethical when individuals or organisations exploit the gap between our expectations and their own, and do so in order to gain some material or social benefit.
This explains why transparency is so important wherever AI is being used; it equalises the expectations of the parties involved and ensures that no-one is being taken advantage of. That being said, exactly how this could be embedded into complex social contexts like online dating is still very much in question. The most immediate answer is that it simply requires individuals to be transparent, and to resist the urge to replace their character with that of a supercomputer. Now, if some of you are unconvinced by the ethically-saturated reason just given, I think there’s another good argument for reconsidering how GPT-3 is used in dating. And this one has less to do with them, and more to do with you.
It seems to me that this “GPT-3 and me” trend is not only unethical, but also damagingly convenient. What I mean by this is that the use of GPT-3 in online dating hinders the development of your personality and character. We already live in a world where people cower at the thought of introducing themselves to a stranger with a view to taking them on a date. In fact, many cower at the thought of introducing themselves to anyone – me included. The power of GPT-3 now means that people need to put even less effort into the process of connecting with a potential partner.
It's important to note that this problem is not only a potential one, but an actual one too. The trusty text message already comes with none of the physical barriers that can normally hinder a sparky interaction between prospective partners, which makes it an easier, stress-free way of getting to know someone. It quite literally allows you to delve into the deepest parts of one another’s character from the comfort of your respective couches, without either of you seeing any of the other’s nerves or quirks. While there is certainly good in this, there is also an argument that it hinders an accurate understanding of what makes that person who they are.
Given this, the advent of GPT-3 and its colossal power has made the problem even greater. Now, not only can you delve into the deepest parts of another’s character from the comfort of your own couch, but you can even programme a computer to do the delving for you. As a result, we now have yet another excellent strategy for avoiding any kind of uncomfortable interaction in the pursuit of finding a partner.
In a bid to provide solutions to this sort of problem, many questions will need our attention. I think one of these is: what good - if any - will come from GPT-3 being used on online dating apps? Answering this question is fundamental if we are to avoid the prospect of millions of people socially piggybacking on a supercomputer. While a full response is better left to another blog post, the answer certainly revolves around developing an ethical and supportive relationship between the person and the technology. We need to convince people that the technology can be used in ways that are good for them, good for others and good for the world. How exactly we do this is still to be considered, but fundamentally, the solution is always to bring the human back into the equation - to bring technology back into relation with ethical values and ambitions for ourselves and for the world.