
Microsoft PowerToys advanced paste now runs AI locally on Windows 11

Microsoft's AI-powered copy and paste just went local.

By Shubham Sawarkar, Editor-in-Chief
Nov 21, 2025, 8:02 AM EST
PowerToys 0.96 announcement screenshot showing Advanced Paste’s clipboard history, a field for specifying the paste format, and a configuration panel with selectable AI models including Foundry Local and Azure OpenAI.
Image: Microsoft

The quiet revolution in productivity software just accelerated. In November 2025, Microsoft silently dropped something that could fundamentally change how millions of Windows users interact with their machines—and how much they’ll pay for AI features they thought required expensive cloud subscriptions.

Microsoft’s PowerToys Advanced Paste tool, the clipboard-enhancement utility tucked away in PowerToys’ settings, just got smarter in version 0.96. But here’s the kicker: it no longer needs to phone home to OpenAI’s servers to handle your copy-paste tasks. Instead, it can now process everything directly on your device, using dedicated neural processing hardware that’s already sitting dormant in most modern Windows laptops.

For years, cloud AI has been the default assumption—the idea that smart processing always requires connecting to someone else’s servers. Want to translate text? Send it to the cloud. Need to summarize an article? Ship it off somewhere. The tradeoff seemed inevitable: convenience and capability for a price (often measured in API credits you have to purchase) and a privacy compromise (your clipboard contents briefly exist on remote servers).

Microsoft’s Advanced Paste tool operated exactly this way until now. Every time you copied text and wanted AI to enhance it—translating, summarizing, formatting—the entire operation went through OpenAI’s infrastructure. You needed API credits. Your data traveled over the internet. The process took longer. And every interaction left traces in the cloud.

The 0.96 update changes the entire equation.

With the new capabilities, Advanced Paste can route requests through two pathways. The first is Microsoft’s Foundry Local, a new infrastructure that runs AI directly on your device’s Neural Processing Unit (NPU). The second is Ollama, an open-source platform that lets you run AI models on your local hardware without any Microsoft intermediary.​

PowerToys Advanced Paste settings in Windows 11, showing the option to enable AI-powered clipboard formatting and a list of selectable model providers including OpenAI, Foundry Local, Google, Mistral, Azure AI, and Ollama.
Image: Microsoft

Both options share a critical advantage: they process everything on your machine, using hardware specifically designed for AI tasks. No cloud round-trip. No API keys. No billing surprises.
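To make that concrete, here is a minimal sketch of what a single local round trip could look like with Ollama as the backend. This is not PowerToys’ internal code; it assumes Ollama is running on its default port (11434) with a small model such as llama3.2 already pulled, and it uses a plain string in place of the clipboard contents.

```python
# Minimal sketch: send "clipboard" text to a local Ollama model for summarization.
# Assumes Ollama is running locally on its default port and a small model such as
# "llama3.2" has already been pulled. Not PowerToys' actual implementation.
import requests

clipboard_text = (
    "Paste a long paragraph here; in Advanced Paste this would be whatever "
    "is currently on the clipboard."
)

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",  # illustrative choice of local model
        "prompt": f"Summarize the following text in two sentences:\n\n{clipboard_text}",
        "stream": False,      # return a single JSON object instead of a token stream
    },
    timeout=120,
)
response.raise_for_status()

# The generated text comes back in the "response" field; nothing ever left localhost.
print(response.json()["response"])
```

The detail worth noticing is the hostname: the request never leaves localhost, which is the entire point.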

The hardware that makes local AI possible

If you’ve bought a Windows laptop in the past year—especially if you paid attention to marketing about “Copilot+ PCs”—you’ve already got the hardware for this. NPUs (Neural Processing Units) are specialized chips that manufacturers like Intel, AMD, and Qualcomm have been embedding into their latest processors.

These aren’t the same thing as your computer’s CPU or its GPU. NPUs are dedicated silicon specifically optimized for the mathematical operations that power AI inference—running pre-trained models to make predictions or generate outputs. Unlike general-purpose processors that drain battery life when handling intensive workloads, NPUs are engineered to perform these AI tasks with minimal power consumption.

The numbers are staggering. Qualcomm’s Snapdragon X Elite chip, which powers many of the newest Windows laptops, can perform more than 40 trillion operations per second using its NPU alone. That’s enough compute power to run sophisticated language models entirely locally—all while keeping your device cool and your battery drain modest.

Intel’s Core Ultra processors and AMD’s latest Ryzen chips include similar NPU capabilities. For the first time, this hardware exists in millions of devices, yet it has been largely underutilized. Advanced Paste’s update represents one of the first mainstream applications genuinely designed to take advantage of it.

The economics: no more API credits

Open a PowerToys discussion forum, and you’ll find a recurring complaint: “I’d love to use Advanced Paste’s AI features, but I don’t want to pay for OpenAI API credits.” It’s a reasonable concern. Cloud AI pricing scales with usage. Even modest adoption adds up. A professional who translates fifty snippets a day, or a writer who summarizes dozens of web clippings, could easily justify the expense—but most users balk at the subscription-like uncertainty.​

Local AI flips the economic model. Once you’ve downloaded the local models (typically between 500MB and a few GB, depending on complexity), you’re paying zero per inference. The hardware cost is already sunk—it’s embedded in your laptop or desktop. The electricity cost is negligible. The bandwidth cost is zero.

This isn’t just a cost savings. It’s a psychological shift. Cloud API pricing creates friction for the undecided user. Local AI eliminates that friction entirely, making AI-enhanced productivity features feel like they’re “already included” in your device—because, technically, they are.

For teams and enterprises, the implications are even more significant. A company deploying Advanced Paste across thousands of machines no longer needs to provision cloud infrastructure, manage API budgets, or forecast usage costs.​
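Here is a deliberately rough back-of-the-envelope sketch of that math. Every number in it is a hypothetical placeholder rather than any provider’s actual price list, but it shows why both per-inference and fleet-level costs simply disappear with local processing.

```python
# Back-of-the-envelope cost comparison. All figures are hypothetical placeholders,
# not any provider's actual pricing.
snippets_per_day = 50            # e.g. a translator processing 50 clipboard snippets
tokens_per_snippet = 600         # prompt plus response, rough guess
working_days_per_year = 250
cloud_price_per_million = 2.00   # hypothetical cloud API price in USD per 1M tokens
fleet_size = 5_000               # seats in a hypothetical enterprise rollout

yearly_tokens = snippets_per_day * tokens_per_snippet * working_days_per_year
cloud_cost_per_user = yearly_tokens / 1_000_000 * cloud_price_per_million

print(f"Cloud cost per user per year: ${cloud_cost_per_user:,.2f}")                # $15.00
print(f"Cloud cost across the fleet:  ${cloud_cost_per_user * fleet_size:,.2f}")   # $75,000.00
print("Local cost per inference:     $0 (model download and electricity aside)")
```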

The privacy dimension: your clipboard stays yours

The technical advantage is obvious. The economic advantage is compelling. But the privacy story might be the most important reason Microsoft is pushing this direction.

Every time you use clipboard-based AI features in the cloud, you’re transmitting the contents of your clipboard—whatever that happens to be—to a remote server. You might be copying sensitive financial data, medical records, personal notes, client communications, or passwords before they go into a password manager. All of that briefly exists on someone else’s infrastructure.​

Companies have compliance obligations around this. Healthcare organizations operate under HIPAA. European companies must comply with GDPR. Financial services firms answer to SEC rules. Any data that touches a remote server creates a compliance headache. Some organizations simply can’t use cloud AI features, not for technical reasons, but for regulatory ones.

Local processing eliminates this entire category of risk. When Advanced Paste processes your clipboard content on your device’s NPU, the data never leaves your hardware. It’s not transmitted. It’s not logged to a server. It’s not processed by anyone else’s infrastructure. That’s not just a marketing talking point for privacy-conscious users—it’s a genuine requirement for some professional environments.​

Microsoft’s emphasis on Foundry Local and Ollama support here sends a signal. The company understands that privacy and compliance aren’t nice-to-haves for productivity software. They’re deal-breakers. By making local AI the default path of least resistance, Microsoft is positioning itself as privacy-forward at a moment when enterprise customers increasingly demand it.

Breaking free from OpenAI lock-in

For years, Advanced Paste had a single connection to the cloud: OpenAI. Want to use Claude or Mistral? Tough. Want to use Google’s Gemini? Impossible. You got OpenAI, and that was your only choice.

Version 0.96 smashes that limitation into pieces.

Advanced Paste now supports multiple cloud providers simultaneously: Azure OpenAI (Microsoft’s managed version of OpenAI’s models), OpenAI directly, Google Gemini, and Mistral. Users can also connect to Hugging Face models or build custom integrations.​

This is not accidental. This reflects a broader realization in the AI industry that the early assumption—that a single provider would dominate all use cases—was always misguided. Different models excel at different tasks. OpenAI’s GPT models are powerful general-purpose systems. Google’s Gemini has particular strengths in multimodal tasks. Mistral excels in specific languages and domains. Specialized medical AI models exist on platforms like Hugging Face. The best tool for any given job depends on the job.

By supporting multiple providers, Advanced Paste lets users match their AI provider to their actual needs rather than forcing a compromise. A translator might prefer Gemini’s multilingual capabilities. A developer might use different models for code generation versus documentation writing. A researcher might access specialized models for domain-specific tasks.

This choice architecture—supporting local models, multiple cloud providers, and specialized platforms—represents a fundamental shift in how Microsoft thinks about its AI strategy. It’s no longer about lock-in. It’s about integration.
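As a rough sketch of what that choice architecture implies for an application (not PowerToys’ actual code), a local Ollama backend and an OpenAI-compatible cloud backend can sit behind the same operation, with the dropdown in the UI effectively choosing a function. The endpoint URLs, model names, and environment variable below are illustrative assumptions.

```python
# Illustrative dispatch over two kinds of providers: a local Ollama server and an
# OpenAI-compatible chat-completions endpoint. Not PowerToys' actual code; model
# names, URLs, and the API-key environment variable are placeholder assumptions.
import os
import requests

def rewrite_with_ollama(text: str, instruction: str, model: str = "llama3.2") -> str:
    r = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": f"{instruction}\n\n{text}", "stream": False},
        timeout=120,
    )
    r.raise_for_status()
    return r.json()["response"]

def rewrite_with_openai_compatible(text: str, instruction: str, model: str = "gpt-4o-mini") -> str:
    r = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": model,
            "messages": [
                {"role": "system", "content": instruction},
                {"role": "user", "content": text},
            ],
        },
        timeout=120,
    )
    r.raise_for_status()
    return r.json()["choices"][0]["message"]["content"]

# The model picker in a UI reduces to choosing one of these callables.
PROVIDERS = {
    "Ollama (local)": rewrite_with_ollama,
    "OpenAI (cloud)": rewrite_with_openai_compatible,
}

def paste_as(text: str, instruction: str, provider: str) -> str:
    return PROVIDERS[provider](text, instruction)
```

Swapping in Gemini, Mistral, or a Hugging Face endpoint is, structurally, just another adapter in that table.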

The interface gets smarter too

The technical improvements matter most, but Microsoft didn’t skip the user experience. Advanced Paste now displays your current clipboard content directly in the interface, alongside a dropdown menu for selecting which AI model to use.​

This seems like a small thing until you’re actually using it. It means you can see exactly what you’re about to process. It means switching between local and cloud models is as simple as clicking a dropdown. It means context is visible before you commit to an action.

The redesign reflects the broader philosophy: as tools become more powerful, the interface that controls them needs to be more transparent, not less. Showing your clipboard contents and making model selection explicit acknowledges that these are consequential operations. You’re not just pasting; you’re choosing how your data gets processed and by which system.

What this means for PowerToys’ growing ecosystem

PowerToys has quietly become one of the most important productivity suites in Windows. Most people have never heard of it. Power users consider it essential.​

The suite includes utilities like FancyZones (custom window layouts), PowerRename (batch file renaming), Command Palette (quick access to system functions), and dozens of others. PowerToys consistently ranks among the most-starred open-source projects on GitHub, developed by Microsoft and maintained collaboratively with the community.

Advanced Paste’s evolution signals that PowerToys is becoming an incubator for mainstream Windows features. What starts as a power-user utility often becomes a standard feature. Snap Layouts, the Windows 11 feature for organizing multiple windows on your screen, originated in PowerToys’ FancyZones tool. The pattern repeats: innovate in PowerToys, validate with users, and integrate into Windows itself if successful.

Advanced Paste’s on-device AI support might follow the same trajectory. If these features resonate with users—and the privacy and cost advantages suggest they will—expect to see local AI processing baked into core Windows features within a few release cycles.

The broader context: Windows betting on AI hardware

None of this happens in a vacuum. Microsoft’s push for local AI processing is part of a larger strategy to differentiate Windows at a moment when the operating system seems mature and stable (which is both good and bad for market positioning).

The Copilot+ PC initiative represents Microsoft’s attempt to make AI a first-class feature of Windows hardware, not an afterthought. NPUs are becoming as standard as graphics cards. Foundry Local represents the infrastructure to actually use that hardware. Advanced Paste represents one of the first genuine consumer-facing applications of that infrastructure.

This matters because it establishes proof of concept. If Advanced Paste works smoothly, if users see the benefits, and if no major privacy incidents emerge, it creates demand for other local AI features. Over the next 12 to 18 months, expect to see:

  • More Windows utilities shipping with local AI capabilities
  • Integration of local processing into Microsoft Office applications
  • Competitive pressure on Apple and Google to emphasize their own on-device AI stories
  • Standardization around how different applications access local models

The commodity AI models that power most of these features—like Meta’s Llama, Mistral’s open models, or Microsoft’s own Phi family—are becoming genuinely good. They’re not Claude-level or GPT-level for all tasks, but they’re perfectly adequate for common operations like translation, summarization, formatting, and categorization. For these commodity tasks, local is starting to beat cloud on every dimension: cost, latency, privacy and, increasingly, quality.

The practical implications

Let’s translate this from technology into actual user experience. What changes in your day if you update PowerToys to 0.96 and enable local AI processing?

If you’re a researcher, you can now copy a dense paragraph, select “Summarize,” and get a condensed version processed entirely locally in about 2 seconds, with zero cost per operation. You can batch-process dozens of summaries per day without worrying about accumulating API charges.

If you’re a translator, you can use an on-device model to translate clipboard snippets between dozens of language pairs, instantly, without round-tripping to cloud servers. No waiting for APIs. No API key management.

If you’re someone with sensitive data (healthcare professional, lawyer, financial advisor), you can now use AI-enhanced clipboard operations without ever compromising data residency or compliance posture.

If you’re a systems administrator rolling out PowerToys across an enterprise, you can choose to disable cloud API features and mandate local-only processing if your security policy requires it. No exceptions for cloud dependencies.

For most users, the change is subtle but meaningful: some of your existing workflows get faster, some get cheaper, and some become possible that weren’t before.

The limitations worth mentioning

If this sounds like local AI is universally better than cloud AI, pump the brakes. There are genuine tradeoffs.

Local models are good, but they’re not state-of-the-art for every task. If you need the absolute cutting-edge performance on complex reasoning, code generation, or nuanced language tasks, cloud models (particularly OpenAI’s o-series reasoners or Claude) remain superior. Local models are optimized for speed and efficiency, not maximum capability.​

Hardware constraints matter too. Your device’s NPU has finite capacity. The most advanced models require more VRAM than all but high-end devices provide. There are practical limits to how sophisticated a local model can be on ordinary consumer hardware.​

Maintenance and updates are different. Cloud AI automatically improves as providers release new versions. Local models require active management—keeping them updated, swapping between versions, and troubleshooting compatibility. It’s not difficult, but it’s not zero-friction either.
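What that management looks like depends on the backend. As one illustration, here is a small sketch that inventories locally installed Ollama models and flags older ones for a re-pull; the /api/tags endpoint is specific to Ollama, and the 90-day staleness rule is an arbitrary example, not a recommendation.

```python
# Sketch of local-model housekeeping against a local Ollama server. Assumes Ollama
# is running on its default port; the 90-day "stale" threshold is arbitrary.
from datetime import datetime, timedelta
import requests

resp = requests.get("http://localhost:11434/api/tags", timeout=10)
resp.raise_for_status()

stale_cutoff = datetime.now() - timedelta(days=90)

for model in resp.json().get("models", []):
    name = model["name"]
    size_gb = model["size"] / 1e9
    # modified_at is a full ISO timestamp; the date prefix is enough for this check
    modified = datetime.strptime(model["modified_at"][:10], "%Y-%m-%d")
    status = "consider re-pulling" if modified < stale_cutoff else "ok"
    print(f"{name:30s} {size_gb:5.1f} GB  last updated {modified:%Y-%m-%d}  [{status}]")
```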

And for applications that legitimately need to scale across thousands of concurrent users or process truly massive datasets, cloud infrastructure remains essential.​

These limitations are real. They’re also mostly irrelevant for Advanced Paste’s use cases. For clipboard-based tasks—translation, summarization, formatting—local models deliver 95% of the capability of cutting-edge cloud systems, with vastly better economics and privacy properties.

What comes next

The bigger picture suggests this is the beginning, not the endpoint.

Expect to see:

  1. Expanded model support: As Hugging Face and open-source communities release better specialized models, Advanced Paste and similar tools will add support for them.
  2. Tighter Windows integration: Microsoft will likely integrate Foundry Local more deeply into Windows itself, making it easier for third-party applications to access local AI capabilities without reinventing infrastructure.
  3. Hardware competition: Seeing the utility of on-device AI, chip manufacturers will invest more heavily in NPU design. The next generation of NPUs will be significantly more capable than today’s.
  4. Enterprise adoption: The privacy and compliance advantages will drive enterprise adoption faster than consumer adoption. Expect corporate rollouts of Advanced Paste and similar tools before mainstream consumer adoption.
  5. Competitive responses: Apple has already invested heavily in on-device AI with Apple Intelligence. Google’s on-device AI initiatives are accelerating. The industry is moving toward on-device processing being the default, not the exception.

This represents a genuine inflection point in how AI services will be delivered. For five years, the assumption was that AI belonged in the cloud, accessed via APIs, managed by centralized providers. That assumption is being challenged at a hardware level, at an economic level, and at a policy level.

When Microsoft starts baking support for local AI into productivity tools, it signals that the assumption is shifting. The future of AI won’t be exclusively in the cloud. It won’t be exclusively on-device either. It’ll be hybrid—with users and organizations making deliberate choices about what gets processed where, guided by cost, privacy, latency, and capability requirements.

Advanced Paste’s evolution from cloud-only to local-first represents that shift in miniature. It’s a small feature in a utility that many Windows users have never heard of. But it’s a small feature that represents a genuinely different approach to how computing is done. And in technology, small features that represent fundamental shifts have a way of becoming important in hindsight.

Update your PowerToys. Try the local models. See how they feel. The future of how AI integrates into everyday tools is being shaped right now, and it looks different from what we assumed five years ago.

