Razer showed up at CES 2026 like a company that’s decided “AI gaming ecosystem” isn’t just a slide in a keynote but the next few years of its product roadmap. Instead of a single flashy concept, the booth is built around one idea: AI that lives on your desk, in your headset, in your dev rig, even in a plug‑in accelerator, all tied back to how you play and build games.
At the center of it all is Project AVA, which has gone from “AI esports coach” to something much more tangible and, frankly, a lot more fun. The new AVA is a 5.5‑inch animated desk companion inside a cylindrical, jar‑like enclosure, complete with a transparent shell, glowing Razer green base, and a camera perched on top so it can actually see what you’re doing on your PC. Under the glass, a full‑body avatar animates in real time, reacting to your games, your work, and whatever’s on‑screen via what Razer calls PC Vision Mode, which pipes your display into AVA’s local computer vision stack. It’s not just game coaching anymore: AVA can analyze your UI, flag loadouts or builds, help with quick summaries or reminders, and lean into that “slightly judgy backseat gamer who also helps you get your life together” persona that early hands‑ons have described. Razer is positioning it less like a faceless chatbot and more like a persistent NPC that lives on your desk, adapts its personality over time, and plugs into multiple AI backends, with support today for engines like Grok and promises of broader compatibility later. Reservations are already live in the US with a small, refundable deposit, and Razer is openly targeting a launch in the second half of 2026, which suggests this is one of the rare CES “projects” that’s meant to actually ship.

If AVA is the friendly face of Razer’s AI ambitions, Project Motoko is the company’s attempt to fuse AI wearables and gaming headsets into a single category. On paper, it’s a wireless headset concept powered by Snapdragon platforms, but the pitch is closer to “smart glasses without the glasses” – a head‑worn device with dual first‑person cameras at roughly eye level, always watching the world in front of you so the assistant can see what you see. That setup lets Motoko handle real‑time translation, text recognition, contextual prompts, and computer‑vision‑heavy tasks like tracking gym reps or summarizing documents, with stereo depth and a wide field of attention that extends beyond what you’d naturally notice in your peripheral vision. Audio is just as layered: multiple microphones in far‑field and near‑field configurations listen for your voice, nearby conversations, and environmental sound, feeding into an AI stack that can whisper directions, warnings, or quick answers without forcing you to juggle a phone. Razer is leaning on broad compatibility here too – think integration with mainstream AI platforms such as Gemini or OpenAI, plus Razer’s own services – and framing Motoko as something that should feel as natural on a commute or in a meeting as it does in a late‑night Valorant session. It’s still very much a concept, with no hard release window, but thematically it fits Razer’s message: AI won’t just sit on your desk; it wants to sit on your head as well.

Behind the glossy concepts, Razer is laying down infrastructure for people who actually build AI models and tools, not just use them. The Razer Forge AI Developer Workstation is a full‑fat tower (with a path to rackmount deployment) designed for local training and inference, stuffing in multiple professional‑grade GPUs, workstation‑class CPUs, and hefty memory bandwidth so devs can fine‑tune large language models, run simulations, or iterate on game AI entirely on‑prem. The messaging is very “no subscription, low latency”: keep your data local, ditch cloud‑compute wait times, and still get cloud‑like performance when you’re testing NPC behavior trees or a generative tools pipeline for level design.

Sitting above that is Razer AIKit, an open‑source framework that tries to make the messy bits of local AI less painful by automatically discovering GPUs, forming compute clusters, and tying everything back into Razer’s hardware so you don’t spend half your day editing config files. Because it’s hosted on GitHub and explicitly open for contributions, Razer is clearly hoping AIKit becomes a community‑driven stack for low‑latency, local‑first workflows – whether that’s building plug‑ins for AVA, experimenting with game‑adjacent assistants, or just running your own LLM playground on a Forge box.

To make that story more portable, Razer is working with Tenstorrent on a compact AI accelerator you can literally drop next to a laptop. It’s a small, modular device built around Tenstorrent’s Wormhole technology, connecting over Thunderbolt 5 to any compatible system and essentially acting as an external AI co‑processor for heavier workloads like LLM inference, image generation, or robotics‑adjacent model testing. Developers can daisy‑chain up to four units for more headroom, which is the sort of spec that matters if you’re traveling with a thin‑and‑light notebook but still need to demo your AI tools in a hotel suite or on the CES show floor. Tenstorrent is supplying an open‑source software stack, while Razer is clearly putting its stamp on enclosure design and the broader ecosystem fit, treating this as another piece of its “AI everywhere” puzzle for creators and coders rather than a consumer toy.

Not everything at Razer’s CES booth is about invisible models and accelerators; some of it is built to be felt, literally. Project Madison is a concept gaming chair that pulls together three of Razer’s favorite ingredients – Sensa HD Haptics, THX Spatial Audio, and Chroma RGB – into a single multi‑sensory seat designed to make you feel explosions, footsteps, or engine rumble across your back and shoulders. The idea is that Madison syncs with supported games and media so you aren’t just hearing directional audio through your headset or speakers; you’re getting synchronized tactile feedback in your spine and seat while the lighting around you matches the action on‑screen. It’s the same “future of immersion” pitch Razer has made before, but in a more integrated, furniture‑grade form that imagines the gaming chair as a full‑body output device rather than just a place to sit.

For something you can actually pre‑order, Razer is also refreshing its mainstream seating line with the Iskur V2 NewGen. This version doubles down on ergonomics with a HyperFlex lumbar system that shifts with your posture, plus a perforated dual‑density cold‑cured foam seat that’s meant to stay cooler over marathon sessions. Razer is also pushing a second‑generation EPU leather with its CoolTouch treatment, aiming to fix one of the longest‑running complaints about faux‑leather gaming chairs: heat build‑up and stickiness during longer play. The chair is up for global pre‑orders, which makes it one of the more concrete, buy‑this‑year products in a CES lineup otherwise heavy on “Project” branding.

On the living‑room side, Razer’s pitch is clear: cloud gaming needs a better controller, not another dongle. The Razer Wolverine V3 Bluetooth is being billed – very specifically – as the world’s fastest wireless controller when paired with compatible ultra‑low‑latency devices, and it launches under LG’s new “Designed for LG Gaming Portal” program. The controller talks to LG TVs over ultra‑low latency Bluetooth, handles TV navigation with integrated controls, and layers in the usual Razer pro‑grade flourishes (responsive buttons, competitive‑leaning ergonomics) to make couch gaming actually feel close to a wired console setup. In practice, this is Razer admitting that streaming services and console‑style cloud catalogs living on smart TVs are big enough now that they deserve bespoke hardware, rather than leaving users to muddle through with whatever generic controller came bundled in.

Taken together, Razer’s CES 2026 lineup sketches a fairly cohesive vision of where the company thinks gaming is going. Project AVA wants to become the AI character that sits permanently in your peripheral vision, watching your games and your desktop; Project Motoko is a bet that head‑worn AI will outgrow “just” smart glasses; Forge, AIKit, and the Tenstorrent accelerator are a quiet but important push into tools for people building the next generation of AI‑powered experiences. Madison and Iskur V2 NewGen remind everyone that Razer still cares a lot about the physical experience of gaming, while Wolverine V3 Bluetooth plants a flag in the rapidly growing cloud‑on‑TV space. Strip away the buzzwords and you’re left with a simple throughline: Razer doesn’t just want to sell you a mouse or a keyboard anymore – it wants to be the connective tissue between your games, your hardware, and the AI that increasingly sits in the middle.
