GadgetBond


The false promise of AI friendship

What the “Friend” subway backlash tells us about our hunger for connection and the limits of artificial intimacy.

By Shubham Sawarkar, Editor-in-Chief
Oct 22, 2025, 2:15 PM EDT

Image: A woman rests in bed wearing a small circular Friend AI device on her chest while holding a smartphone. Credit: Friend

If you’ve ridden the New York City subway in the last few weeks, you’ve likely seen the ads: stark white posters, a single gnomic line or two, and an image of a small, round object that could pass for a minimalist AirTag or a designer pendant. The billboard says, in tones meant to be comforting, that this object will “ride on the subway with you.” The company behind it, Friend, paid to plaster more than 11,000 car cards and hundreds of platform posters across the system in what’s being called one of the city’s largest out-of-home buys in years. The splash reportedly cost north of $1 million.

Behind that campaign is a device and an argument. Friend is a small, wearable pendant meant to be hung from your neck; it listens to the world around you, ingests a steady stream of data about how you speak and act, and funnels that information into an app that promises personalized responses and emotional companionship. The pitch is simple and modern: loneliness is painful, and if humans can’t reliably supply companionship, maybe a listening, talking machine can. That premise — and the marketing that accompanies it — has not gone over well in New York. Many of the posters were defaced within hours, with one subway-goer crossing out “friend” and writing, bluntly, “AI would not care if you lived or died.”

Friend’s founder, Avi Schiffmann — who first made headlines as a teenager for a Covid-tracking website — has leaned into the controversy. He’s framed the subway buy as both an experiment and an aggressive act of brand theater; in a Fortune profile, he quipped that his “plans are measured in centuries.” Whether the campaign was intended as provocation or pure marketing, it has succeeded in making the product visible and the debate about it louder.

Why the outrage? Part of it is a visceral privacy anxiety: a device that’s always listening and hoovering up your conversations is, by design, a surveillance product. But the deeper worry is philosophical and ethical. The company’s core promise — that an algorithm can be your friend — collapses two very different things into one tidy, marketable package. Friendship between humans is reciprocal. It is tangled in obligations, flaws, mutuality, boredom, argument, and care. A friend is not only someone who listens and responds; a friend is someone whose life matters to you as much as your life matters to them. That basic asymmetry is missing from a machine that is designed to reflect you back to yourself in the most flattering way possible.

The market for digitally mediated companionship, however, is not some niche idea hatched in a venture studio; it is responding to real social conditions. Loneliness is a persistent problem in modern America. Surveys have repeatedly shown elevated levels of loneliness across demographic groups, and certain populations — including many racial minorities and LGBTQ people — report weaker social support networks than others. A Cigna/Morning Consult series found striking differences in loneliness among racial and ethnic groups, and a 2023 KFF survey tied experiences of discrimination to worse health and social isolation. These are not abstract data points; they are the lived backdrop that makes an app promising attention and non-judgmental replies an attractive proposition.

Teens, in particular, have been quick to experiment with the idea. A large recent survey found that roughly 72 percent of teenagers reported using AI companions at least once, and more than half said they use them regularly; a significant minority even said conversations with these bots were as satisfying as real friendships. That trend helps explain why companies pour resources into polished hardware, slick branding and subway takeovers: there is an enormous, eager market.

But there is mounting evidence that these relationships can do harm. When an app’s incentives are to maximize attention, engagement, or the length of a session, it will naturally optimize for responses that keep you coming back. For a lonely person, that can mean reinforcement of fragile beliefs or encouragement of unhealthy rumination. Clinical and ethical experts warn that algorithmic companions may blunt people’s motivation to seek human support, and could, in extreme cases, substitute for intervention when a human would have recognized danger. The problem is not only what the AI says; it’s what it doesn’t — the inability to suffer, to ask for help, to reciprocate care, or to hold a friend accountable when they’re making bad choices.

The commercial logic of these products is also difficult to ignore. Friend isn’t the only player flirting with intimacy as a growth channel: companies like Replika, Character.AI, Soul Machines and a string of startups have pitched products as companions, confidantes or relationship simulators. Some large language models originally built as productivity tools have been quickly repurposed to do emotional labor. The technology stacks are impressive and fast-moving, but the business models aim ultimately at scale and monetization — not at building durable, mutual human bonds. That mismatch invites a necessary skepticism: are we buying consolation or being sold something that will extract ever more personal data in the name of “care”?

There is a political and systemic answer to the loneliness problem — one that most of these startups do not want to sell. Real social repair requires long, structural work: economic policies that reduce precarity and give people time and space for social life; investments in public institutions, arts and community spaces where people can meet without surveillance; better support for parents and caregivers; and policies that address the discrimination and marginalization that make loneliness worse for some groups. Technology can be part of that ecosystem, but it can’t be a substitute for it.

This is not to deny the human pain that drives people toward machines. The descriptions of loneliness in literature and memoir are accurate and devastating: the sense of being shut out while the world bustles around you is real and corrosive. But a machine that mirrors a person’s desires back to them and calls that “friendship” risks teaching us to prefer blunt reflection to the messy rewards of human reciprocity.

What the New York subway protests around Friend show, more than anything, is that millions of urban residents still care about what it means to be seen by other people rather than by a mirror. Defacing an ad with the message “AI would not care if you lived or died” is, in its blunt way, an argument about what constitutes a life worth protecting: one that requires mutual regard, accountability, and the possibility of being known not only for our wants but for our obligations to others.

So what do we do? We should be demanding better design and stronger regulation: transparent data policies, limits on how companies can monetize emotional data, and stricter protections for minors who are disproportionately experimenting with AI companions. We should also resist the seductive promise that a single device, tuned to flatter us, can replace the slow, awkward, difficult work of making community.

Friend’s campaign may be loud enough to get a pendant in your Instagram feed; it’s not loud enough to remake the social infrastructure that could actually help people be less lonely. If friendship demands something of us — vulnerability, effort, uneven reciprocity — then no algorithm that’s built to serve itself will ever be a true friend. The device hanging from your neck might listen. What it cannot do is answer the one question the subway tags point to, in their blank, impossible optimism: who, exactly, is keeping watch over you when the machine looks away?

