
GadgetBond


Meet the experts powering Perplexity’s new Health Advisory Board

Perplexity is bringing frontline clinicians into the room so its health answers feel less like guesses and more like grounded, evidence‑based guidance.

By Shubham Sawarkar, Editor-in-Chief
Mar 19, 2026, 1:50 PM EDT
[Illustration: a figure in a white lab coat beneath the words "Health Advisory Board." Image: Perplexity]

Perplexity is stepping more deliberately into health care, and it is doing something a lot of AI products have been criticized for not doing: putting clinicians directly into the decision-making loop. With its new Health Advisory Board, the company is trying to answer the core question anyone has when they type a health query into an AI box: “Can this thing actually be trusted?”

The move comes at a moment when AI is rapidly spreading through medicine and consumer health search, but worries about accuracy, bias, and outright hallucinations remain loud and justified. Studies in recent years have shown that large language models can misinterpret clinical scenarios, confidently repeat fabricated diseases, or miss crucial nuances that a trained physician would catch. At the same time, clinicians and health researchers increasingly see AI as a powerful assistant for synthesizing evidence, triaging information, and cutting through the noise of an overwhelming medical literature. Perplexity is trying to thread that needle: lean into AI’s strengths while hard‑wiring more medical rigor and guardrails into how health information is generated and presented.

At the heart of the announcement is a small but heavyweight roster of experts—practicing doctors, researchers, and health-tech leaders—who will advise on how Perplexity’s health-related experiences are designed. The board’s mandate spans product decisions, content quality, patient safety, and clinical workflows, with an explicit emphasis on evidence‑based medicine rather than vague wellness content. Perplexity is essentially saying that decisions about how its system searches, ranks, and phrases health information should be grounded in the same standards clinicians use when they evaluate a new drug, device, or guideline.

The initial lineup sets the tone. Cardiologist and researcher Eric Topol, one of the most cited physician-scientists in the world and a prominent voice on AI in medicine, is among the first members. Topol has spent years arguing that AI, done right, could actually make medicine more human by catching diagnostic errors, reading complex scans more reliably, and freeing doctors from administrative overload so they can spend more time with patients. His involvement signals that this is not just a branding exercise but an attempt to square AI’s potential with the realities of clinical practice, where mistakes are measured not in engagement metrics but in lives.

Joining him is Dr. Devin Mann, a professor of population health and medicine at NYU Grossman School of Medicine and strategic director of digital health innovation at NYU Langone Health. Mann’s work sits right where AI is already starting to have an impact: chronic disease management, remote patient monitoring, and AI-assisted clinical workflows that help busy hospital systems keep track of complex patients. If Perplexity wants its tools to feel useful to clinicians instead of like yet another dashboard, having someone who lives inside those workflows matters.

On the pediatric and genomics front, Dr. Wendy Chung brings a different but crucial perspective. She is the Mary Ellen Avery Professor of Pediatrics at Harvard Medical School and the Chief of Pediatrics at Boston Children’s Hospital, overseeing care for some of the most complex and vulnerable patients in the system. Chung has led NIH-funded research in human genetics and rare diseases, where evidence is often fragmented and decisions depend on stitching together data from small trials, registry studies, and evolving guidelines. That is exactly the kind of landscape where an AI system that can rapidly aggregate and compare evidence could shine—if it is carefully tuned and transparently sourced.

Rounding out the first cohort is Tim Dybvig, a health‑technology founder and operator with experience building patient-facing tools and health infrastructure at scale. While the physicians set the bar for clinical rigor, someone still has to translate those principles into actual product architecture, data pipelines, and everyday user experience. Dybvig’s inclusion is a nod to the reality that “safe” in health tech is not just what the model knows, but how the whole system—from APIs to UI—handles data, edge cases, and failure modes.

Perplexity is also tying this advisory effort directly to new capabilities. Alongside the board, the company is rolling out connectors that let users bring in their own health data and build personalized dashboards or applications within Perplexity Health, running on Perplexity Computer. In practice, that could mean everything from patients exploring trends in their lab results to developers building tools that sit on top of Perplexity’s search and reasoning engine to help clinicians compare therapies or summarize complex records. The board is supposed to guide how those features evolve, so they end up augmenting clinical conversations rather than trying to replace them.
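To make the "trends in your lab results" idea concrete, here is a minimal sketch of the kind of analysis such a dashboard might surface. Everything in it is an assumption for illustration: the readings are fabricated, and Perplexity's actual connector APIs are not described in the announcement, so this uses only plain Python.

```python
# Hypothetical illustration: fitting a simple trend line to personal lab
# results, the sort of summary a personal health dashboard might show.
# The readings below are made up; this is not Perplexity's API.
from datetime import date

# (date, LDL cholesterol in mg/dL) - fabricated example readings
ldl_readings = [
    (date(2025, 3, 1), 128),
    (date(2025, 6, 1), 122),
    (date(2025, 9, 1), 117),
    (date(2025, 12, 1), 111),
]

def trend_per_month(readings):
    """Least-squares slope of the readings, in units per 30 days."""
    xs = [(d - readings[0][0]).days / 30.0 for d, _ in readings]
    ys = [v for _, v in readings]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

slope = trend_per_month(ldl_readings)
print(f"LDL trend: {slope:+.1f} mg/dL per month")  # prints "LDL trend: -1.8 mg/dL per month"
```

A real product would of course pull structured records (e.g., FHIR-style observations) through its connectors rather than hand-entered tuples, and, as the article notes, surface the trend as conversation-prep material rather than a diagnosis.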

All of this sits against a larger backdrop: AI is already a go-to starting point for many people’s health research, but trust is fragile. Surveys show that a growing share of patients use AI-generated summaries to orient themselves on medical topics, yet only a minority actually trust those summaries to be accurate or consistently check the cited sources. At the same time, safety analyses by health-system researchers and watchdogs warn that AI tools can easily propagate misinformation or reflect biased training data if not carefully designed and monitored. That tension—heavy usage, limited trust—is exactly the gap Perplexity is betting an advisory board of front-line clinicians can help close.

One thing the company is careful to underline is what Perplexity Health is not. It is framed as an educational tool that helps people understand their data and prepare for better conversations with their doctors, not as a standalone diagnostic engine or treatment-planning system. The standard disclaimers are explicit: it is not intended to diagnose, treat, or prevent diseases, and it is not a substitute for professional medical advice—especially for people who are pregnant, nursing, managing an eating disorder, or living with other significant medical conditions. That positioning aligns with what many medical ethicists and digital‑health researchers have called for: AI as a first‑pass explainer and evidence organizer, with clinicians staying firmly in charge of actual care decisions.

For Perplexity, which has built its reputation around live, citation-rich answers that combine multiple AI models with web search, health care is both a natural extension and a high-stakes test. In clinical and research settings, the same ability to synthesize trial data, guidelines, and real-world evidence that helps a journalist or lawyer can be transformative—if the sourcing is transparent and the limitations are clear. By bringing in high‑profile medical voices early and giving them a formal role in governance, the company appears to be signaling that it understands the difference between answering a trivia question and weighing in on someone’s treatment options.

Of course, an advisory board is not a magic shield. The real test will be whether the board’s recommendations translate into measurable safeguards: how the system handles ambiguous symptoms, how it flags uncertainty, how it deals with outdated or conflicting studies, and how easy it is for both patients and clinicians to see where an answer came from. It will also hinge on how Perplexity treats privacy and data governance as it starts ingesting more sensitive health information—something patients consistently say is their top concern when they use digital tools to manage care. Those are hard, slow, unglamorous problems, but they are the ones that will determine whether AI in health ends up being a passing novelty or an infrastructure layer people actually trust.

For now, Perplexity’s Health Advisory Board looks like a statement of intent: a public commitment to let clinicians set the bar for what “responsible” health information from AI should look like. As more members are added in the coming weeks, spanning other specialties and perspectives, the experiment will be worth watching—not just as another product feature, but as a possible template for how AI companies and health systems might share responsibility for the information patients increasingly turn to first.



Topic: Health


Disclosure: We love the products we feature and hope you’ll love them too. If you purchase through a link on our site, we may receive compensation at no additional cost to you. Read our ethics statement. Please note that pricing and availability are subject to change.

Copyright © 2026 GadgetBond. All Rights Reserved. Use of this site constitutes acceptance of our Terms of Use and Privacy Policy | Do Not Sell/Share My Personal Information.