Smartphone photography has come a long way, often packing in features that rival standalone cameras. But if you’ve ever felt that your phone’s camera sometimes goes too far—over-sharpening, aggressive tone mapping, or limited manual controls—Adobe’s freshly announced Project Indigo might just be the breath of fresh air you’ve been waiting for. Launched June 13, 2025, as a free Adobe Labs app for iPhone, Project Indigo is the handiwork of Marc Levoy and Florian Kainz, veterans of Google’s Pixel camera team, now aiming to blend the best of computational photography with a more natural, SLR-like aesthetic and robust pro controls.
Marc Levoy and Florian Kainz helped pioneer many of the computational photography techniques that made Google’s Pixel phones renowned for their camera performance. After leaving Google in 2020, both joined Adobe’s Nextcam team, charged with exploring how mobile cameras could evolve beyond the status quo. Their vision: an app that addresses common photographer gripes—over-processing, lack of manual controls, and limited dynamic range—while still harnessing the power of multi-frame HDR and on-device AI.
One frequent complaint about smartphone images is the “smartphone look”: overly bright scenes, boosted saturation, heavy smoothing, and aggressive sharpening that yield images easier to glance at on small screens but less appealing on larger displays. Some apps have gone the “zero-process” route, promising minimal intervention—but in reality, some processing (white balance, demosaicing, basic corrections) is always required. Through conversations with photographers, Levoy and Kainz concluded that what many truly want is not zero-process but a subtler, more natural look akin to SLR output. Hence, Indigo applies only mild tone mapping, modest saturation boosts, and gentle sharpening, preserving textures and avoiding the “HDR-ish” over-processed feel.
At its core, Project Indigo leans heavily on tried-and-true computational photography strategies:
- Stronger under-exposure per frame: By under-exposing individual frames more than typical camera apps, Indigo preserves highlights, reducing the risk of blown-out bright areas.
- Multi-frame merging of up to 32 frames: Whereas many implementations combine fewer frames (e.g., Google’s HDR+ originally used around 15 frames), Indigo can capture and align up to 32 images per shot, dramatically lowering noise in shadows and retaining detail in challenging lighting (a rough sketch of the idea follows this list).
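Adobe hasn’t published the exact math behind Indigo’s merge, but the general burst-photography recipe is well documented: deliberately under-expose every frame so highlights survive clipping, then average many aligned frames so shadow noise drops roughly with the square root of the frame count. The Python sketch below is a toy illustration of that idea under simplified assumptions; the noise model, the 0.25× exposure scale, and the Reinhard-style tone curve are placeholders rather than Indigo’s actual pipeline, and a real implementation would also align tiles and reject pixels that moved between frames.

```python
# Toy illustration of burst under-exposure + merging (not Adobe's code).
import numpy as np

def simulate_underexposed_frame(scene_linear, exposure_scale=0.25, read_noise=0.01):
    """Simulate one deliberately under-exposed raw frame with shot and read noise."""
    exposed = scene_linear * exposure_scale             # under-expose to protect highlights
    noisy = np.random.poisson(exposed * 1000) / 1000.0  # shot noise on scaled photon counts
    noisy += np.random.normal(0.0, read_noise, scene_linear.shape)
    return np.clip(noisy, 0.0, 1.0)                     # 1.0 represents sensor full well (clipping)

def merge_burst(frames):
    """Average aligned frames: noise falls roughly with sqrt(number of frames)."""
    return np.mean(np.stack(frames, axis=0), axis=0)

def gentle_tone_map(linear, gain=4.0):
    """Bring the under-exposed merge back to a pleasing brightness with a mild roll-off."""
    boosted = linear * gain
    return boosted / (1.0 + boosted)                     # simple Reinhard-style highlight roll-off

# Demo: a flat deep-shadow region plus a bright region that would clip at "normal" exposure.
scene = np.full((16, 16), 0.05)
scene[:, 8:] = 1.8
burst = [simulate_underexposed_frame(scene) for _ in range(32)]
single = gentle_tone_map(burst[0])
merged = gentle_tone_map(merge_burst(burst))
print("shadow noise (std), single frame :", float(np.std(single[:, :8])))
print("shadow noise (std), 32-frame merge:", float(np.std(merged[:, :8])))
```

Running the demo shows the shadow noise shrinking sharply after the 32-frame merge, while the under-exposure keeps the bright region below the clipping point.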
The result: images with fewer blown highlights and cleaner shadows, at the cost of a slightly longer capture time. But as the team notes, “after a few seconds you’ll be rewarded with a better picture.” Because Indigo biases toward minimal spatial denoising, it retains more natural textures even if some noise remains—an intentional trade-off to avoid plastic-like smoothing.
Many smartphone camera apps offer some degree of manual control, but they often fall back to single-frame captures or limited interfaces. Project Indigo, by contrast, provides full manual adjustments for shutter speed, ISO, and focus while still applying computational merging behind the scenes. Significantly, it supports both JPEG and RAW (DNG) output, with both formats benefiting equally from multi-frame processing, something uncommon even among high-end cameras. In Photo mode, Indigo achieves zero shutter lag by continuously buffering raw frames; when you press the shutter, it selects the last good frame as a reference and merges it with prior captures, ensuring you nail fast-moving scenes.
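Adobe describes the behavior (a rolling raw buffer, a reference frame chosen at shutter press, merging with earlier frames) without publishing implementation details. The short Python sketch below shows one common way such a zero-shutter-lag buffer can be structured; the RawFrame fields, the sharpness score, and the ZslBuffer class are hypothetical stand-ins for illustration only.

```python
# Toy sketch of a zero-shutter-lag capture loop (illustration, not Indigo's code).
from collections import deque
from dataclasses import dataclass, field
import time

@dataclass
class RawFrame:
    timestamp: float
    sharpness: float                            # hypothetical per-frame quality score
    data: list = field(default_factory=list)    # stand-in for raw pixel data

class ZslBuffer:
    """Keeps a rolling buffer of recent raw frames while the viewfinder runs."""
    def __init__(self, depth=32):
        self.frames = deque(maxlen=depth)       # oldest frames fall off automatically

    def on_new_frame(self, frame):
        self.frames.append(frame)               # called continuously, before any shutter press

    def capture(self):
        """On shutter press: pick a recent sharp frame as reference, merge the rest into it."""
        candidates = list(self.frames)
        if not candidates:
            return None
        reference = max(candidates[-4:], key=lambda f: f.sharpness)  # prefer recent + sharp
        others = [f for f in candidates if f is not reference]
        return {"reference_ts": reference.timestamp, "frames_merged": len(others) + 1}

# Usage: simulate a running viewfinder at ~30 fps, then a shutter press.
buf = ZslBuffer(depth=32)
start = time.time()
for i in range(40):
    buf.on_new_frame(RawFrame(timestamp=start + i / 30, sharpness=0.5 + (i % 5) * 0.1))
print(buf.capture())
```

Because the reference frame already exists in the buffer at the moment you press the shutter, the captured instant is the one you saw in the viewfinder rather than one taken after a processing delay.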
Night mode in Indigo takes things further: in low light, it extends exposure times and captures up to 32 frames (up to 1 second per frame when the phone is on a tripod), relying on careful alignment to reduce blur from hand shake. Handheld use is supported, though the best results come from stabilizing the phone, either with two hands or by bracing it against something solid.
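Adobe hasn’t said how Night mode chooses frame counts or per-frame exposure times; the small sketch below only illustrates the kind of budgeting such a mode typically performs, allowing longer frames when the phone is stable, capping them when handheld, and spreading the total exposure across up to 32 frames. Every number in it (the 1 s and 1/8 s caps, the lux-based budget) is an invented placeholder.

```python
# Illustrative exposure budgeting for a night mode (invented numbers, not Adobe's logic).
def plan_night_burst(scene_lux, on_tripod, max_frames=32):
    """Split a total exposure budget into many shorter frames to limit motion blur."""
    per_frame_cap_s = 1.0 if on_tripod else 1.0 / 8           # a stable phone tolerates ~1 s frames
    total_exposure_s = min(32.0, 80.0 / max(scene_lux, 1.0))  # darker scene -> more total light wanted
    frames = min(max_frames, max(1, round(total_exposure_s / per_frame_cap_s)))
    per_frame_s = min(per_frame_cap_s, total_exposure_s / frames)
    return frames, per_frame_s

for lux, tripod in [(50, False), (10, True), (2, True)]:
    n, t = plan_night_burst(lux, tripod)
    print(f"lux={lux:>2}, tripod={tripod}: {n} frames x {t:.3f} s each")
```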
Beyond core HDR merging, Project Indigo surfaces experimental AI-driven features under a “Technology Previews” section. A standout is “Remove Reflections,” which uses AI to strip unwanted window or glass reflections in-camera immediately after capture. This proves useful when shooting through car, train, or plane windows, delivering a clean image shareable on the spot without needing post-processing software. Other previews hint at deeper integration with Adobe’s editing ecosystem: profiles compatible with Lightroom/Camera Raw allow toggling between SDR and HDR looks during editing, and future previews may include portrait mode, panorama, computational video features, and more.
Indigo’s interface balances simplicity and depth: it launches into Photo mode but prompts you to switch to Night mode in darker scenes. The viewfinder is intended to eventually offer real-time “what you see is what you get” rendering, though that capability is still in development. Once installed, Indigo can be set as the default camera from within Lightroom mobile, streamlining capture-to-edit workflows. Because of its heavy computational demands, the app requires an iPhone with at least 6GB of RAM (iPhone 12 Pro onward, or iPhone 14 and later for non-Pro models); it requires no Adobe sign-in for now, which lowers the barrier to trying it out.
Because Indigo combines many frames and applies AI features, capture and processing take a bit longer than in default camera apps. You may notice a pause after pressing the shutter, but the payoff is higher-quality images with better dynamic range and cleaner shadows. The heavy lifting also means newer iPhones deliver smoother performance, and Adobe recommends recent models for the best experience. The app is free on the Apple App Store and runs on Pro/Pro Max models from the iPhone 12 series onward, and on non-Pro models from the iPhone 14 onward.
Project Indigo enters a competitive field: third-party apps like Halide and others have long offered manual controls, and several tout “natural” or “minimal processing” modes. Yet Indigo’s pedigree—driven by the minds behind Pixel’s computational breakthroughs—and its deep integration with Adobe’s editing suite set it apart. The inclusion of AI features like reflection removal further extends its utility beyond basic capture.
As with any camera app, users may wonder about data handling. Adobe’s announcement indicates on-device processing for core features, minimizing the need to upload images for computation. No Adobe sign-in is required at launch, suggesting local processing by default. Users should still review the app’s permissions and privacy statements, but early signs point toward respecting user data by keeping processing on the phone.
Adobe confirms that an Android version is “for sure” on the horizon, though no specific timeline is given. Planned features include alternative “looks,” a more controlled portrait mode with higher image quality, panorama capture, computational video experiments, exposure and focus bracketing for HDR or all-in-focus stacks, and deeper Lightroom integration (e.g., passing bracketed bursts for stack-based editing). This positions Indigo not just as a standalone app but as a platform for prototyping and testing next-gen camera and editing experiences.