
GadgetBond


Tesla starts unsupervised robotaxi testing on public streets in Austin

Elon Musk showcases Tesla robotaxi progress with driverless Austin ride.

By Shubham Sawarkar, Editor-in-Chief
Dec 26, 2025, 7:55 AM EST
We may get a commission from retail offers.
Image: Interior of a driverless Tesla on an Austin street, showing an empty steering wheel and a touchscreen displaying Full Self-Driving navigation. Credit: @aelluswamy / X

Tesla’s long-promised robotaxi future has quietly moved from slideware to street level in Austin — and this week the company’s top two public faces for the project rode along to prove the point. Video shared on X and posts from Tesla executives show vehicles navigating city streets with no human behind the wheel, and Elon Musk and Tesla’s head of AI, Ashok Elluswamy, have publicly framed those runs as a new chapter in the company’s autonomy push.

That change is not an overnight pivot so much as the latest step in a program that has been unfolding in stages. Earlier this year, Tesla opened a small, geo-fenced robotaxi program in parts of Austin that initially carried paying passengers with a human safety monitor seated inside; the company described the service as a pilot designed to gather real-world data. Over recent months, however, some of those monitors have been removed for specific test runs and a subset of cars has been seen circulating entirely empty inside the mapped area. The shift from supervised rides to empty-seat tests reflects Tesla’s confidence in its camera-first, neural-network approach — and it marks a major escalation in how the company validates its software on public roads.

The optics of the rollout matter. Musk’s social posts — including a now widely circulated clip in which he says a robotaxi “drove him around” Austin — and Elluswamy’s “And so it begins!” reaction are meant as both proof points and PR. For Tesla, executive rides serve two purposes: they signal faith in the system to investors and customers, and they create a public narrative that the company is moving from supervised beta to unsupervised operation. Those posts, however, are only one part of a much more cautious reality on the ground: the cars are still confined to a map, a small fleet, and controlled conditions that engineers can rewind and analyze.

Austin is an obvious laboratory for Tesla. The company’s Gigafactory outside of town places engineering and operations within easy reach, and Texas’ legislative framework has been friendlier to AV testing than many other states — permitting vehicles to operate without a licensed driver under certain conditions and creating a regulatory path for data logging, insurance and testing. Those legal guardrails have helped Tesla scale experiments in the area, while allowing the company to keep a teleoperation layer available so remote operators can assist a car that needs help. Still, permissive law does not preclude scrutiny; regulators and lawmakers retain the ability to demand safety evidence before a broader commercial rollout is approved.

That scrutiny is already active. Federal investigators have been monitoring Tesla’s automated driving programs for months, and public records include multiple crash reports tied to vehicles operating under Tesla’s Full Self-Driving or Autopilot systems. The National Highway Traffic Safety Administration has pressed Tesla for information about how it classifies and reports crashes, and local filings from the Austin pilot show a string of incidents that, while often resulting in only minor damage or no reported injuries, keep the spotlight trained on validation and transparency. For a company promising unsupervised operation, those records are an inescapable part of the story.

From a technical perspective, Tesla’s approach diverges sharply from many of its rivals. Where Waymo and Cruise rely on lidar, high-definition maps and carefully constrained geofences, Tesla has pursued a camera-centric, end-to-end neural-network strategy: feed raw visual data and large amounts of driving video into a single model and let it learn the driving task directly. That architecture promises flexibility and massive scale if it works, because cameras are cheaper and the same software can theoretically be distributed to millions of cars. It also invites skepticism: some engineers worry that vision-only systems can be brittle in unusual conditions and that end-to-end models are harder to audit and verify than traditional, modular stacks. The industry comparison is real — and visible to customers and investors watching which technical bets will scale first.

On the ground in Austin, the tests have a distinctly local character. The fleet is still small — analysts and reporters count a few dozen active robotaxis in the pilot rather than a city-wide service — and many residents say they see white Model Ys with extra sensors more as a curiosity than an everyday option. Video clips of empty cars have gone viral, prompting debate among Austinites: some celebrate the sight as proof that autonomous mobility is arriving, others see it as a risky experiment on public streets. For now, those who want a paying ride often still see a human in the car; unsupervised loops are limited and selectively deployed as part of Tesla’s data gathering.

The legal and financial stakes are steep. Tesla has faced multimillion-dollar jury awards in cases tied to its Autopilot system in the past, and liability concerns only grow when vehicles operate with no one aboard. Regulators can demand lower disengagement rates, clearer crash disclosures, and tighter validation protocols before green-lighting broader commercial operations. Analysts say that while Musk has repeatedly framed end-to-end unsupervised operation as imminent, regulatory delays — and the need to demonstrate a rock-solid safety record — are now among the highest-probability risks to any fast timeline.

Tesla’s competitors in Arizona and California have shown that driverless fleets can work in limited settings; Waymo, for example, has operated fully driverless vehicles in a handful of cities under carefully managed safety regimes. Tesla’s bet is that a software-first, camera-driven stack can leapfrog those approaches economically. If Tesla’s tests in Austin scale without a major incident, the company will have powerful validation for the robotaxi thesis — both for investors who value recurring-revenue ridesharing and for Musk’s long-standing argument that autonomy is Tesla’s real growth engine. If they don’t, the public backlash and regulatory fallout could slow or reshape the project for years.

What comes next will be shaped by three simple but unforgiving tests: can Tesla expand the fleet while keeping crash rates low and disclosures transparent; can the company convince regulators that its validation methods are rigorous enough for unsupervised commercial service; and can the software maintain reliability across the messy, edge-case world of urban driving? For Musk and Elluswamy, the unsupervised loops around Austin are both an experiment and a proof: a live demonstration of an audacious technical thesis and, simultaneously, a public trial whose outcome will determine whether Tesla’s robotaxis become a template or a cautionary tale.


Copyright © 2026 GadgetBond. All Rights Reserved. Use of this site constitutes acceptance of our Terms of Use and Privacy Policy | Do Not Sell/Share My Personal Information.