If your Pixel just shrugged when you told Google Photos to “make this brighter and remove that passerby,” you might not be alone — at least not if you live in Texas or Illinois. Google confirmed to Engadget that the Gemini-powered Ask Photos feature (and reportedly the related Conversational Editing tools) simply aren’t available to users in those two states right now. The company’s statement was blunt and short on detail: the editing tools “are not available to users in Texas and Illinois at this time,” and Google is “working to determine how to make Ask Photos available to more users.”
That silence left reporters and privacy lawyers filling in the blanks. The Houston Chronicle, which flagged the absence, pointed to two big settlements in Google’s recent history as likely culprits: Illinois litigation tied to Google Photos’ face-grouping feature (a multi-million dollar settlement), and a much larger Texas settlement this year tied to the company’s collection of biometric identifiers. Those cases — which turn on how companies collect and use facial data — line up neatly with the one technical thread both Ask Photos and Conversational Editing pull on: face grouping / facial recognition.
Ask Photos and many of Google Photos’ conversational tricks lean on the app knowing which faces are which: grouping photos of the same person together, identifying recurring subjects across your library, surfacing “best” shots from your travels. That convenience depends on automated facial recognition — the same mechanism that drew legal fire in Illinois and, more recently, in Texas. When state law treats biometric checks as something that requires separate, informed consent, the practicality of a global rollout gets complicated fast.
The legal snag isn’t just theoretical. Illinois’ Biometric Information Privacy Act (BIPA) has been the legal backbone for claims against big tech companies over face grouping for years; Google settled a class action tied to Photos’ face grouping features in 2022. Texas more recently exacted a much larger settlement related to the alleged collection of biometric identifiers without permission. With those precedents, companies like Google face a choice: build local legal workarounds, require extra consent flows, or simply block risky features in states where the legal picture is uncertain.
A cautious rollout — or an inconsistent one?
What’s odd is the inconsistency across Google’s own ecosystem. Reporters noted that similar editing capabilities may still be accessible through other Google products — for example, the Gemini app’s photo tools — which suggests Google is applying state-level restrictions to specific apps or code paths rather than to the underlying models. That raises two practical problems for users: first, you might buy a Pixel expecting the full suite of features only to find some of them disabled; second, Google’s marketing materials and feature notes don’t clearly flag the state-level restrictions, so some buyers are getting surprised.
If you live in Texas or Illinois, the immediate takeaway is simple: you can still use Google Photos for storage, basic edits, and search, but the natural-language "Ask Photos" editing and conversational tweak features may not appear. That doesn't mean your data is being discarded or handled differently (Google says it is working to expand availability), but the convenience of "tell Photos what to change and it does it" may not arrive in your state until Google either adjusts the feature or the legal risk profile changes.
This is a neat, modern example of how privacy law is shaping not only what AI can do, but where it can do it. Tech companies often build tools that assume broad, uniform access and then run into a patchwork of state laws that demand separate consent, data handling, or outright bans on certain techniques. The result is either a spaghetti-work of conditional UI flows that demand extra consent screens, or a simpler — if clunkier — approach: pull the feature in the risky jurisdictions until the legal fog clears. Expect more of both as generative AI lands in everyday consumer apps.
So what should users do?
If the feature is important to you and you live in Texas or Illinois, you have a few options: use alternative apps or the Gemini app (where available), wait for Google to expand access, or keep an eye on account settings for a face-grouping toggle and any new consent prompts. For the privacy-minded, this might actually be welcome: it’s another reminder to check what recognition features are enabled and whether you — or the people in your photos — have knowingly consented to that use of biometric data.
Ask Photos’ absence in Texas and Illinois is a small symptom of a much larger tension: the race to ship generative features and the reality of state-level biometric and privacy law. Google’s brief statement confirms the block; the chain of lawsuits and settlements helps explain it. For now, users in those states lose a neat convenience; regulators and privacy advocates see a proof point that law can still shape how — and where — AI features roll out.
