The tech giant’s new “moonshot” is a wild bet on solar-powered satellites to solve AI’s massive energy problem. But is it science fiction or the inevitable future of computing?
If you’ve paid any attention to the artificial intelligence boom, you know two things: the technology is incredibly powerful, and it is incredibly hungry.
Not just for data, but for energy.
The generative AI models that power everything from chatbots to drug discovery run on massive, warehouse-sized buildings called data centers. These “hyperscale” facilities are so power-intensive that they’re straining energy grids across the globe. We’re not talking about plugging in a few more servers; a single, large AI data center can consume hundreds of megawatts, enough to power a small city.
This is quickly becoming an unsustainable problem. The energy and water required to cool these facilities are immense. By some estimates, the planned wave of new data centers in the U.S. alone could demand an extra 81 gigawatts of power by 2030, roughly the record peak demand of the entire Texas grid.
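How big is 81 gigawatts? A quick back-of-envelope in Python puts it next to the Texas grid; the ~85.5 GW figure for ERCOT’s record summer peak is an approximate public number I’m assuming for scale, not anything from Google:

```python
# Back-of-envelope: compare projected U.S. data-center demand growth
# to the Texas grid. ERCOT's record summer peak is roughly 85.5 GW
# (approximate public figure, used here only for scale).
projected_dc_demand_gw = 81
ercot_record_peak_gw = 85.5   # assumed reference value

print(f"Projected new demand: {projected_dc_demand_gw} GW")
print(f"ERCOT record peak:    {ercot_record_peak_gw} GW")
print(f"Ratio: {projected_dc_demand_gw / ercot_record_peak_gw:.0%}")
# -> Ratio: 95%, nearly the whole Texas grid at its busiest hour
```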
The industry is facing a genuine crisis: how do you power the future of computing when you’re running out of power on Earth?
Google’s answer, unveiled in early November, is perhaps the most “Google” answer possible: just move the data centers off the planet.
The sun is the ultimate power source
The project, fittingly codenamed Project Suncatcher, is the latest “moonshot” from Google’s secretive labs. The core idea, laid out in an official blog post and a detailed technical paper, is to build a data center that plugs directly into the sun.
The vision is a radical fix: a constellation of satellites orbiting the Earth, equipped with Google’s custom AI chips (its Trillium Tensor Processing Units, or TPUs), all powered by vast solar arrays that are never shadowed by clouds, atmosphere, or night.
“The sun is the ultimate energy source in our solar system, emitting more power than 100 trillion times humanity’s total electricity production,” wrote Travis Beals, senior director of Google’s Paradigms of Intelligence.
He has a point. On the ground, solar panels are limited by weather and the 24-hour day/night cycle. But as Beals noted, in the right orbit, solar panels can be “up to 8 times more productive than on Earth,” soaking up near-continuous, unfiltered sunlight.
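The “8 times” figure is easy to sanity-check with rough numbers. In the sketch below, the irradiance and capacity-factor inputs are common reference values I’m assuming, not figures from Google’s paper:

```python
# Rough check on the "up to 8x" productivity claim for space solar.
SPACE_IRRADIANCE = 1361        # W/m^2 above the atmosphere (solar constant)
ORBIT_SUN_FRACTION = 0.99      # dawn-dusk orbits see almost no eclipse

GROUND_PEAK = 1000             # W/m^2, clear-sky noon at sea level
GROUND_CAPACITY_FACTOR = 0.20  # typical utility-scale solar (assumed)

space_avg = SPACE_IRRADIANCE * ORBIT_SUN_FRACTION
ground_avg = GROUND_PEAK * GROUND_CAPACITY_FACTOR

print(f"Average in orbit: {space_avg:.0f} W/m^2")   # -> ~1347
print(f"Average on Earth: {ground_avg:.0f} W/m^2")  # -> 200
print(f"Advantage: ~{space_avg / ground_avg:.1f}x") # -> ~6.7x
# A cloudier site with a 15% capacity factor pushes this toward 9x,
# bracketing the "up to 8x" figure.
```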
“In the future,” he concluded, “space may be the best place to scale AI compute.”
A supercomputer in the sky
This isn’t just a plan to launch a few powerful satellites. Google is proposing a single, distributed supercomputer floating in the vacuum of space.
The Suncatcher architecture describes “clusters” of these satellites flying in precise formation in a dawn-dusk sun-synchronous low-Earth orbit. Riding the boundary between day and night, they would sit in near-constant sunlight while crossing any given spot on Earth at the same local time each day, ideal for both power collection and predictable communication.
The satellites within a cluster would be separated by 100 to 200 meters, forming a combined “virtual data center” roughly a kilometer wide.
But how do you connect them? You can’t use fiber optic cables. Instead, Google plans to use ultra-high-bandwidth free-space optical links: lasers shooting data between the satellites. According to the technical paper, Google’s team has already built and tested a “bench-scale” version of this link, achieving a mind-boggling 1.6 terabits per second (Tbps) of bandwidth.
To put that in perspective, that’s enough bandwidth to move roughly 50 HD movies’ worth of data every second. It’s the kind of speed you need to make dozens of separate satellites “talk” to each other so quickly that they function as one, unified computer brain.
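Here’s that conversion spelled out. The 1.6 Tbps figure comes from Google’s paper; the ~4 GB movie size is an illustrative assumption:

```python
# Convert the demonstrated inter-satellite link speed into everyday terms.
link_tbps = 1.6                            # from Google's technical paper
bytes_per_second = link_tbps * 1e12 / 8    # bits -> bytes

movie_gb = 4  # rough size of a compressed HD film (assumed)
movies_per_second = bytes_per_second / (movie_gb * 1e9)

print(f"Throughput: {bytes_per_second / 1e9:.0f} GB/s")   # -> 200 GB/s
print(f"~{movies_per_second:.0f} HD movies per second")   # -> ~50
```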
The “astronomical” challenges
While the vision is grand, Google’s own researchers are the first to admit that the technical and economic hurdles are, well, astronomical.
The paper outlines a daunting checklist of problems they need to solve.
1. The radiation problem: Space is a hostile, high-radiation environment. Cosmic rays can blast through a chip and flip a ‘0’ to a ‘1’, creating a “bit flip” that can corrupt data or crash a system. This is bad for any computer, but it could be catastrophic for an AI model during training, where millions of tiny calculations are happening every nanosecond.
To their credit, Google tested this. They took their Trillium TPU chips to a lab and blasted them with a proton beam. The good news? The chips were “surprisingly resilient.” The memory systems only started to falter at nearly three times the expected radiation dose for a five-year mission.
Still, the authors caution that while the chips might handle “inference” tasks (running a pre-trained model), the effect of radiation on full-scale “training” (teaching a new model from scratch) is still a huge unknown.
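To make the bit-flip threat concrete, here’s a tiny Python illustration (my own sketch, not Google’s code): flipping a single exponent bit in an ordinary 32-bit float, the kind of number a model weight is stored as, turns it into something absurd.

```python
import struct

def flip_bit(value: float, bit: int) -> float:
    """Flip one bit in the IEEE-754 float32 encoding of `value`."""
    (as_int,) = struct.unpack("<I", struct.pack("<f", value))
    (corrupted,) = struct.unpack("<f", struct.pack("<I", as_int ^ (1 << bit)))
    return corrupted

weight = 0.75                  # an ordinary model weight
zapped = flip_bit(weight, 30)  # bit 30: the top exponent bit
print(weight, "->", zapped)    # -> 0.75 becomes ~2.6e+38
```

One stray proton, and a well-behaved number becomes large enough to derail a calculation; multiply that risk across thousands of chips running around the clock, and the case for error detection makes itself.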
2. The launch cost problem: This is the big one. Project Suncatcher is only feasible if the cost of launching cargo into space collapses.
Google’s entire analysis hinges on launch prices falling to “less than $200/kg by the mid-2030s.”
That figure is, to put it mildly, aspirational.
Right now, commercial launch costs are far higher. A 2023 McKinsey estimate places the cost for a heavy-launch rocket to low-Earth orbit at around $1,500/kg. Google is betting that companies like SpaceX, with its reusable Starship, will be so successful that they cut the cost of getting to orbit by nearly 90% from today’s best prices.
If that doesn’t happen, the economics of Suncatcher fall apart.
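The bet is easy to put in numbers. A minimal sketch, using the two cost figures above plus a one-tonne satellite mass assumed purely for illustration:

```python
# The launch-cost bet, in numbers.
current_cost_per_kg = 1500  # $/kg, McKinsey's 2023 LEO estimate
target_cost_per_kg = 200    # $/kg, Google's mid-2030s assumption

reduction = 1 - target_cost_per_kg / current_cost_per_kg
print(f"Required price drop: {reduction:.0%}")  # -> 87%

sat_mass_kg = 1000  # illustrative spacecraft mass (assumed)
print(f"Launch cost today:  ${sat_mass_kg * current_cost_per_kg:,}")  # -> $1,500,000
print(f"Launch cost target: ${sat_mass_kg * target_cost_per_kg:,}")  # -> $200,000
```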
3. The orbital dance problem: Keeping a single satellite in place is hard enough. Keeping a kilometer-wide formation of satellites perfectly stable to counteract gravitational pulls and orbital decay is a feat of aerospace engineering that has never been attempted on this scale.
From whitepaper to orbit
This isn’t just a thought experiment. Google is reportedly already moving from theory to testing.
The company announced a partnership with the Earth-imaging firm Planet to launch two prototype satellites by early 2027. This isn’t the full data center—it’s a “learning mission” to test the core technologies in the unforgiving environment of space. They need to see if the hardware works, if the optical links are stable, and if the whole thing can survive its first cosmic-ray bombardment.
Project Suncatcher is clearly not a solution to tomorrow’s energy crunch. A single 100-MW data center on Earth is a behemoth; replicating that capacity in orbit would require a constellation of staggering scale and cost.
But what this project really shows is the industry’s rising desperation for sustainable compute power. The quest to scale AI has become so energy-intensive that it’s forcing one of the world’s biggest companies to look for answers beyond Earth itself.
Google is essentially asking: If we can’t power the future of AI on our own planet, maybe we just need to plug into a bigger battery.