Figma’s latest move isn’t another flashy feature buried in a menu — it’s plumbing. This week, the company widened the reach of its Model Context Protocol (MCP) server so that AI coding agents can stop making educated guesses from screenshots and instead read the actual code behind a prototype built in Figma Make. That shift — from “what the app looks like” to “how the app is built” — is a small change in wording but a big one for anyone who’s ever watched an LLM try to reconstruct a design from a flat image.
What changed
Previously, tools trying to turn a Figma design into working UI had to infer structure and behavior from rendered artboards or screenshots. The MCP server acts like a translator between Figma files and external tools: it exposes structured design data and, now, the code that Figma Make generates from prompts. In short, AI agents that speak MCP can ask Figma, “Show me the code that makes this button behave like this,” instead of guessing from pixels. That code is indexed by the MCP server, so clients only request and receive exactly what they need.
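To make the "translator" role concrete: MCP rides on JSON-RPC 2.0, and a client invokes a server capability with a `tools/call` message. The sketch below shows roughly what such a request looks like on the wire — the tool name `get_code` and its `nodeId` argument are illustrative assumptions, not Figma's documented schema.

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build a minimal MCP tool-invocation message (JSON-RPC 2.0)."""
    request = {
        "jsonrpc": "2.0",        # MCP messages are JSON-RPC 2.0
        "id": request_id,
        "method": "tools/call",  # standard MCP method for invoking a server tool
        "params": {
            "name": tool_name,        # hypothetical tool exposed by the server
            "arguments": arguments,   # tool-specific arguments
        },
    }
    return json.dumps(request)

# e.g. an agent asking the server for the code behind one design node
msg = make_tool_call(1, "get_code", {"nodeId": "123:456"})
```

The server replies with a matching JSON-RPC result, which is how an editor agent receives "exactly what it asked for" rather than a screenshot.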
Kris Rasmussen, Figma’s chief technologist, summed it up: “By using a Figma Make file via an MCP client, AI models can see the underlying code instead of a rendered prototype or image.” It’s the company’s way of saying: don’t guess the construction — look at the blueprint.
Who can use it today?
Figma says the Make → MCP path is available to a handful of tools and editors right away: Anthropic, Cursor, Windsurf and VS Code are listed as supported clients. That means you don’t need the desktop Figma app to let an AI assistant inspect your Make files — code editors and browser-based agents can now query the MCP server remotely. Figma also plans to open the door wider later so third-party MCP servers can plug into Make.
Community projects and open-source MCP implementations already exist — you’ll find adapters on GitHub that show how editor agents can be wired into Figma’s MCP story — which suggests this won’t be a single-vendor ecosystem for long.
Why this matters (and why it might feel like magic)
If you’ve ever used a prompt-to-app tool, you know the friction: the model describes a layout, the design team polishes it, then an engineer ports it to code — often reworking visual decisions to fit a codebase’s patterns. With MCP indexing Make’s code, an AI agent can generate (or regenerate) components that match both the visual design and the implementation details (naming, variables, layout constraints) that your app relies on.
For designers, this could mean faster prototypes that are also more faithful to production. For engineers, it can reduce the tedious translation work. For product teams, it could shorten the loop between idea and working demo. That’s the promise Figma is selling: tighter design-to-code continuity where AI is a collaborator that references the source of truth rather than guessing from screenshots.
New features that tag along
Figma also flagged two concrete features that ride alongside the MCP expansion:
- Design Snapshot — converts Figma Make snapshots into editable layers inside Figma Design, meaning a snapshot of a generated app becomes material you can edit directly. That feature was slated to land this week, per Figma’s update.
- In-canvas AI editing — a tool, currently in testing, that lets users manipulate designs with AI prompts without leaving the Design canvas. That's the more visible, designer-facing side of the updates: little prompts that tweak a component right where you're already working.
A few caveats (because plumbing needs valves)
This is powerful, but not risk-free. Exposing design and code via an API-style protocol raises obvious questions:
- Access control & privacy. You don't want arbitrary models or cloud agents scraping internal product code or design tokens. Figma's docs and MCP guides emphasize that integrations must be explicitly configured and that connections run through supported clients; even so, teams should treat MCP endpoints like any other sensitive dev resource and enforce permissions.
- Dependence on generated code. Figma Make creates code that’s useful for prototypes and iteration; whether that code is production-grade or matches an org’s architecture is still design- and team-dependent. The MCP server makes it easier for agents to use that code, but engineers still need to validate and integrate it thoughtfully.
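One practical way to act on the access-control caveat is to allowlist which MCP endpoints local agents may connect to before a client config is honored. This is a hedged sketch, not Figma's API or any client's built-in feature; the host name and the `{"servers": {...}}` config shape are assumptions for illustration.

```python
from urllib.parse import urlparse

# Hypothetical set of MCP hosts your team has approved.
ALLOWED_MCP_HOSTS = {"mcp.example-internal.dev"}

def endpoint_allowed(url: str) -> bool:
    """Accept only HTTPS endpoints on an approved host."""
    parts = urlparse(url)
    return parts.scheme == "https" and parts.hostname in ALLOWED_MCP_HOSTS

def filter_servers(config: dict) -> dict:
    """Drop any configured MCP server whose endpoint fails the allowlist check."""
    return {
        name: entry
        for name, entry in config.get("servers", {}).items()
        if endpoint_allowed(entry.get("url", ""))
    }
```

Running a check like this (in a pre-commit hook or a wrapper around the agent launcher) keeps a stray config entry from pointing an AI agent at an unvetted server.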
What this does to workflows
Think of three plausible short-term workflows:
- Designer + AI assistant in the editor. A designer prompts Make to generate an app screen, snapshots it into editable layers, and then asks an AI in VS Code to scaffold a React component that matches naming conventions and existing style tokens. Result: fewer handoffs.
- Agent-led bug fixes. An AI agent that understands both your Figma design and your codebase could propose a fix when a component visually drifts from the spec, and even suggest the code change needed to bring implementation back in line.
- Prototyping at speed. Product teams can iterate on feature ideas by generating live prototypes via Make and letting MCP-backed agents convert them into working demos that engineers can refine.
The bigger picture
Figma has been leaning into AI for a while — from asset search to prompt-driven features — but this feels less like feature-creep and more like infrastructure. Making design context available “everywhere you build” reframes Figma: not just a canvas, but a live source of truth that tools can query programmatically. That’s the direction lots of platform vendors have been hinting at, and Figma is trying to make it practical for teams that want AI to help implement rather than just imagine.
Bottom line
If you care about faster, less error-prone design-to-code workflows, Figma’s MCP expansion for Make is a notable step. It doesn’t replace engineers or rigorous review, but it hands AI agents better blueprints to work from — and in the near future, that could shave hours or days off iteration cycles. For teams that treat design files as living artifacts, not static pictures, this is the sort of behind-the-scenes upgrade that quietly changes how work gets done.