For just $129, you too can make a friend. Technology startup Friend has blanketed the New York City subway system with walls of minimalist, dystopian posters in a new ad campaign, one of the largest the city has seen. These white posters present supportive commitments and reassurances, such as “I’ll never bail on our dinner plans,” or “I’ll binge the entire series with you.”
The campaign feels as though it’s designed to instill dissatisfaction with real-life friends, presenting Friend as an alternative that’s always by your side. However, this supposed selling point is also its key point of controversy.
Friend is an artificial intelligence (AI) chatbot housed within a pendant worn around the user’s neck. The pendant includes a microphone that listens in on the user’s daily life, gathering information about them and offering moral support, just like a friend would. Friend is meant to be highly personalized, shaped both by its direct interactions with the user and by the user’s everyday conversations with others in real life.
The eeriness of the technology creeps in when one considers the amount of data it collects. The device detects and stores every interaction and tone shift, allowing it to better understand the user and predict their behavior. One could argue that this isn’t very different from a person eavesdropping; however, unlike a regular person, Friend is present 24/7 and remembers everything.
These distinctions between a regular person and Friend are what the device’s developers are most proud of. Avi Schiffmann, web developer and founder of Friend, compared speaking to the device to speaking to God because each is an omnipresent, judgment-free and super-intelligent entity — one which is with you always.
In some ways, this is understandably appealing because human relationships are based on the needs of two people. Hence, a friend or partner can never fully accommodate one’s needs at all times. Having an AI friend who’s always present when you need it and always up-to-date with the developments in your life undoubtedly provides convenience and support. On the other hand, the effectiveness of this support is questionable to say the least.
Although Schiffmann has touted Friend’s priceless ability to support users through difficult situations, in practice the device has proven shallow and lackluster when faced with exactly the kind of situation Schiffmann himself invoked. Given the amount of data Friend collects, one would expect it to provide highly specific, tailored responses for every user and every situation. However, when journalist Eva Roytburg tested Friend and went through a breakup during the testing, Friend had nothing to offer beyond muttering, “Sounds like it’s been pretty active around you. Everything all good on your end right now?”
The promise of having an AI friend that’s always perceptive, understanding and supportive seems unfulfilled. What is fulfilled, though, is the promise of Friend always being there: It is always gathering information about its users.
Surveillance capitalism — the marketization of user attention and information, similar to the extraction of raw materials — is not new. However, it has become increasingly pervasive and invasive. More traditional platforms that practice surveillance capitalism — such as e-commerce sites, browsers and search engines — mainly operate as an exchange of sorts. Users give up their information privacy in exchange for well-developed, convenient and useful services. This exchange has become so normalized that most don’t even consider it.
Now, however, actors beyond these traditional platforms are gathering and accumulating unprecedented amounts of user data. Products like Friend promise a utopian future built upon groundbreaking AI technologies, but behind the scenes they are simply gathering more and more information. Friend’s terms and conditions — which require users to waive their right to participate in class or representative action or to resolve disputes before a jury or court of law — make its data collection practices all the more chilling.
People aren’t happy about Friend, especially New Yorkers, who have vandalized its advertisements, voicing resolute disapproval by spraying “AI wouldn’t care if you lived or died,” and “AI is not your friend,” onto the posters. In a world of increasing surveillance, consumers must make the deliberate effort to understand the products they interact with in order to protect themselves, their data and their privacy.