Would you take relationship advice from a bot? A recent Institute for Family Studies blog post by data science consultant Bradford Tuckfield suggests that, with recent advances in AI, you may have to ask yourself that question sooner rather than later. GPT-3 (Generative Pre-trained Transformer 3) is an “autoregressive language model that uses deep learning to produce human-like text.” In other words, GPT-3 uses a small amount of input text to produce everything from articles and poems to news reports and dialogue. As Tuckfield notes, the technology could even be used to produce educational materials such as textbooks. On another front, according to Screenshot Media, dating apps could begin employing AI to optimize their matchmaking, or to advise users on when to end a relationship. Tinder CEO Sean Rad has even called it the “future of the dating industry.”
Say Hello to Chatbot Therapists
It’s likely that you’ve already encountered GPT-3 in the form of customer service chatbots. You know the drill: you open the chat window, type your question, and the bot prompts you with a series of follow-up questions to get you the right information. But this can’t compare with what some researchers aspire to create – chatbots that can offer “psychological and behavioral therapy.” Noting that young people in particular are “willing to ask deep questions to chatbots and pay attention to the replies,” Tuckfield drives home that developing a healthy skepticism toward the real limits of AI is becoming ever more crucial. It is tempting to treat AI as a sort of 21st-century oracle, and some argue that there is nothing AI cannot do, but we should not lose sight of the fact that machines, however realistic they feel, can never truly attain human sentience. AI’s astounding ability to churn through data is the product of extensive conditioning – and that conditioning is not unbiased.
Persons Can’t Be Captured By Parameters
Gosia Szaniawska-Schiavo, the author of a study titled Love in the Age of AI Dating Apps, explains that machine learning could be used to make relationship recommendations, such as whether to break up, using “a set of parameters” based on personal information and app usage, including “in-app user behavior, historical matches,” and conversation patterns. The caveat is that “classifying and predicting people’s behavior is based on the assumption that love–which is a strong emotion in itself–could be found based on rules or logic […] AI will be 100 percent trustworthy only in its own paradigm, which is purely based on ‘if this, then that’ logic.” Beyond the truism that love involves strong emotion (which machines obviously lack), love involves taking risks and making sacrifices for another. AI may be of limited use in the context of dating apps, but it can never be the final authority on loving another person. Often in love there is a “both, and,” a give and take that evades logic and would seem to defy it. Tuckfield’s example of GPT-3 answering the question “what are things to consider when deciding how many children to have?” reflects a bias toward “a certain highly-cautious, effectively anti-natalist approach to family planning,” where the “majority of considerations it recommends are limiting factors in family planning: prioritizing career, financial, and lifestyle goals, as the model recommends, leads to having fewer children.”
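To see how thin this “if this, then that” paradigm really is, consider a minimal sketch in Python. This is a hypothetical illustration, not any real dating app’s algorithm: the function name, the signals (shared interests, activity levels, mutual likes), and the weights are all invented for the example.

```python
def match_score(user_a: dict, user_b: dict) -> float:
    """Toy rule-based compatibility score between 0.0 and 1.0.

    Every rule is an explicit if-this-then-that condition over a fixed
    set of parameters -- exactly the kind of logic the quoted study
    describes, and exactly what it cannot capture about love.
    """
    score = 0.0

    # Rule 1: each shared interest adds a fixed increment.
    shared = set(user_a["interests"]) & set(user_b["interests"])
    score += 0.1 * len(shared)

    # Rule 2: similar in-app activity (messages per week) adds a bonus.
    if abs(user_a["msgs_per_week"] - user_b["msgs_per_week"]) < 10:
        score += 0.3

    # Rule 3: a history of mutual likes adds the largest bonus.
    if user_b["id"] in user_a["liked"] and user_a["id"] in user_b["liked"]:
        score += 0.4

    return min(score, 1.0)


a = {"id": 1, "interests": ["hiking", "film"], "msgs_per_week": 20, "liked": {2}}
b = {"id": 2, "interests": ["film", "cooking"], "msgs_per_week": 25, "liked": {1}}
print(match_score(a, b))
```

Notice what the rules leave out: nothing in the parameter set models risk-taking, sacrifice, or the “both, and” of a real relationship – the score is only as meaningful as the handful of data points someone chose to encode.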
Impersonal and Biased
How many personal decisions have we made solely on the basis of logical criteria? How often have happy relationships, marriages, and families taken a course the rest of the world deemed unrealistic or foolish? Ultimately, relationships are between persons, and personal information often cannot be collapsed into neat little data points. Some data points matter more for some people than for others – and no collection of them can adequately convey a person’s qualities or story. In the end, the subjective advice of friends and family, based on their personal knowledge of you, however limited or humanly biased, is a far better resource for deciding whether to begin or end a relationship than an impersonal and pre-conditioned (systematically biased) chatbot. The cues we receive from others in conversation, and the ways we interact with one another, indicate more about us and our needs than any chatbot could ever detect. As AI applications in dating and relationships emerge, let’s remind ourselves and each other of the messy yet irreplaceable role of people in our lives – and resist myopically overestimating a machine’s ability to give us tidy answers.