Yesterday, Microsoft hosted a major celebration to mark 50 years of technological dominance. Balloons, speeches, the whole shebang. The stage is set, the livestream’s rolling, and Microsoft AI CEO Mustafa Suleyman is mid-sentence when—bam—a voice cuts through the crowd like a knife. “Shame on you,” it rings out, sharp and unapologetic. “You are a war profiteer. Stop using AI for genocide.” The room freezes. The voice belongs to Ibtihal Aboussad, a Microsoft software engineer who’s just turned the company’s milestone party into a viral moment of reckoning.
It’s April 4, 2025, and what was supposed to be a feel-good corporate celebration in Redmond, Washington, has morphed into something else entirely—a raw, public confrontation over Microsoft’s role in the messy intersection of tech and geopolitics. Aboussad didn’t stop at “shame on you.” As security ushered her out, she kept going: “Stop using AI for genocide in our region. You have blood on your hands. All of Microsoft has blood on its hands. How dare you all celebrate when Microsoft is killing children?” Heavy stuff, right? And she wasn’t done.
Minutes later, Aboussad fired off an email to hundreds—maybe thousands—of her colleagues (reproduced in full at the end of this article), laying out exactly why she’d just crashed the party. It’s a gut-punch of a message, and the gist is this: she’s been with Microsoft’s AI Platform team for 3.5 years, and she’s horrified to have learned her work’s been tangled up in what she calls a genocide in Palestine. She’s not just mad at Suleyman—she’s calling out the whole company, from the C-suite to the rank-and-file, for cashing checks she says are stained with blood.
Let’s rewind a bit. Microsoft’s 50th anniversary was a big deal. We’re talking Bill Gates, Satya Nadella, and Steve Ballmer sharing the stage, reminiscing about the good ol’ days while hyping up the future of Copilot, their shiny AI assistant. Suleyman, the guy who’s been steering Microsoft’s AI ship since joining from DeepMind, was mid-pitch when Aboussad made her move. She didn’t just heckle from the back—she strode toward the stage, clutching a keffiyeh (a scarf that’s become a symbol of Palestinian solidarity), and let loose.
Aboussad wasn’t alone, either. Later in the day, another employee, Vaniya Agrawal, pulled a similar stunt during a panel with Nadella, Gates, and Ballmer. “Shame on you all. You’re all hypocrites,” she shouted, accusing Microsoft of enabling the deaths of 50,000 Palestinians in Gaza. She was hustled out fast, but the one-two punch left no doubt: some of Microsoft’s own are fed up, and they’re not afraid to ruin a party over it.
What’s she so mad about?
So, what’s got Aboussad and Agrawal this riled up? It’s all about Microsoft’s tech—specifically its AI and cloud services—being sold to the Israeli military. According to Aboussad’s email, she joined the AI Platform team dreaming of building tools for good: think accessibility apps, translation services, stuff to “empower every human,” as Microsoft’s slogan goes. Instead, she says she found out her code’s been weaponized—literally. She points to a $133 million contract with Israel’s Ministry of Defense, claiming Microsoft’s AI is powering surveillance, transcription, and targeting systems that make Israel’s military “more lethal and destructive” in Gaza.
She’s not pulling this out of thin air. Back in February 2025, the Associated Press dropped a bombshell report saying Israel’s use of Microsoft and OpenAI tech “skyrocketed” after the October 2023 Hamas attacks—by March 2024, usage had spiked to nearly 200 times pre-conflict levels. By July 2024, the Israeli military was stashing over 13.6 petabytes of data on Microsoft servers. That’s a lot of zeros—enough to store millions of hours of audio or video. Aboussad says this data comes from mass surveillance—phone calls, texts, intercepted messages—fed into Israel’s “target bank” and population registry. The result? More precise, deadlier strikes.
The numbers she throws out are staggering: 300,000 Gazans killed in the last 18 months, by “some estimates.” (For context, official counts from groups like the UN put the death toll closer to 50,000 as of early 2025, though underreporting is a known issue in war zones.) She cites recent horrors—like Israel allegedly executing 15 paramedics and burying them in sand—as proof of war crimes enabled by Microsoft’s tech. The UN and International Court of Justice have called it genocide, she notes, and the International Criminal Court’s issued arrest warrants for Israeli leaders. Whether you buy her framing or not, it’s clear she’s haunted by what she’s seen—and what she thinks her work’s helped make possible.
Microsoft’s ties to Israel
Microsoft’s no stranger to big defense contracts—its $22 billion deal to supply HoloLens headsets to the U.S. Army made headlines a few years back. But the Israel connection’s been flying under the radar until recently. Aboussad name-drops a site called “An Introduction to Microsoft’s Complicity in Apartheid and Genocide” (a real page, hosted by activist groups), which lists contracts for software, cloud services, and consulting raking in millions. Israeli PM Benjamin Netanyahu’s even bragged about his “strong ties” to Microsoft, she says. And just a day before her protest, the BDS (Boycott, Divestment, Sanctions) campaign—a global push to pressure Israel over its treatment of Palestinians—tagged Microsoft as a “priority boycott target.” Coincidence? Maybe not.
This isn’t the first time Microsoft employees have kicked up a fuss over this. In 2024, the company fired two staffers, Abdelrahman Mohamed and Hossam Nasr, for holding a vigil at its Washington HQ for Gaza victims. They were part of a group called “No Azure for Apartheid,” which Aboussad’s now rallying behind. That group’s been begging Microsoft to ditch its Israeli military deals for years, arguing Azure—Microsoft’s cloud platform—is a cog in the war machine. Aboussad’s petition, linked in her email, echoes that call: “We will not write code that kills.”
The fallout: what happens next?
So, where does this leave Microsoft? The company’s stayed tight-lipped so far, only saying it offers “many avenues for all voices to be heard” without “business interruption.” But Aboussad and Agrawal might’ve already paid a price—both reportedly lost access to their work accounts post-protest, a sign they could be out the door.
For Aboussad, it’s personal. She writes about “images of innocent children covered in ash and blood” breaking her, about feeling tricked into coding for carnage. She’s betting her job—and maybe her career—on waking up her colleagues. “Silence is complicity,” she says, urging them to sign petitions, bug leadership, and spread the word. She’s banking on Microsoft’s past—like when it ditched apartheid South Africa or an Israeli facial recognition firm after employee pushback—to prove it can change course.
But will it? Tech giants like Microsoft thrive on government contracts—defense included. The Israel deal’s a drop in the bucket compared to, say, its Pentagon ties, but it’s a lightning rod. And with AI’s role in warfare growing (think drones, targeting algorithms, surveillance nets), this debate’s not going away. Just last year, Google faced its own employee revolt over Project Nimbus, a $1.2 billion cloud deal with Israel. The pattern’s clear: coders are starting to ask what their lines of code are really building.
Aboussad’s outburst isn’t just about Microsoft—it’s a flare-up in a bigger fight over tech’s soul. Can you build AI for “good” while selling it to militaries? Is it enough to say “I’m just a software engineer” when your work helps pick targets? She’s not wrong that Microsoft’s got a legacy to protect—50 years of innovation could get a dark footnote if this sticks. “Is working on deadly AI weapons something you can tell your children about?” she asks. It’s a question that stings.
For now, the party’s over, and the hangover’s setting in. Aboussad’s gone from cubicle coder to corporate rebel, and whether she’s a hero or a headache depends on who’s watching. One thing’s for sure: she’s made it impossible for Microsoft—or its employees—to look away.
Here’s the full email:
Hi all,
As you might have just seen on the livestream or witnessed in person, I disrupted the speech of Microsoft AI CEO Mustafa Suleyman during the highly-anticipated 50th anniversary celebration. Here’s why.
My name is Ibtihal, and for the past 3.5 years, I’ve been a software engineer on Microsoft’s AI Platform org. I spoke up today because after learning that my org was powering the genocide of my people in Palestine, I saw no other moral choice. This is especially true when I’ve witnessed how Microsoft has tried to quell and suppress any dissent from my coworkers who tried to raise this issue. For the past year and a half, our Arab, Palestinian, and Muslim community at Microsoft has been silenced, intimidated, harassed, and doxxed, with impunity from Microsoft. Attempts at speaking up at best fell on deaf ears, and at worst, led to the firing of two employees for simply holding a vigil. There was simply no other way to make our voices heard.
We are witnessing a genocide
For the past 1.5 years, I’ve witnessed the ongoing genocide of the Palestinian people by Israel. I’ve seen unspeakable suffering amidst Israel’s mass human rights violations – indiscriminate carpet bombings, the targeting of hospitals and schools, and the continuation of an apartheid state – all of which have been condemned globally by the UN, ICC, and ICJ, and numerous human rights organizations. The images of innocent children covered in ash and blood, the wails of mourning parents, and the destruction of entire families and communities have forever fractured me.
At the time of writing, Israel has resumed its full-scale genocide in Gaza, which has so far killed by some estimates over 300,000 Gazans in the past 1.5 years alone. Just days ago, it was revealed that Israel killed fifteen paramedics and rescue workers in Gaza, executing them “one by one,” before burying them in the sand — yet another horrific war crime. All the while, our “responsible” AI work powers this surveillance and murder. The United Nations and the International Court of Justice have concluded that this is a genocide, with the International Criminal Court issuing arrest warrants for Israeli leaders.
We are Complicit
When I moved to AI Platform, I was excited to contribute to cutting-edge AI technology and its applications for the good of humanity: accessibility products, translation services, and tools to “empower every human and organization to achieve more.” I was not informed that Microsoft would sell my work to the Israeli military and government, with the purpose of spying on and murdering journalists, doctors, aid workers, and entire civilian families. If I knew my work on transcription scenarios would help spy on and transcribe phone calls to better target Palestinians (source), I would not have joined this organization and contributed to genocide. I did not sign up to write code that violates human rights.
According to AP news, there is “a $133 million contract between Microsoft and Israel’s Ministry of Defense.”
“The Israeli military’s usage of Microsoft and OpenAI artificial intelligence spiked last March to nearly 200 times higher than before the week leading up to the Oct. 7 attack. The amount of data it stored on Microsoft servers doubled between that time and July 2024 to more than 13.6 petabytes.”
“The Israeli military uses Microsoft Azure to compile information gathered through mass surveillance, which it transcribes and translates, including phone calls, texts and audio messages, according to an Israeli intelligence officer who works with the systems. That data can then be cross-checked with Israel’s in-house targeting systems.”
Microsoft AI also powers the most “sensitive and highly classified projects” for the Israeli military, including its “target bank” and the Palestinian population registry. Microsoft cloud and AI enabled the Israeli military to be more lethal and destructive in Gaza than they otherwise could.
Microsoft has also been providing software, cloud services, and consulting services to the Israeli military and government, totaling millions in profit. War Criminal Benjamin Netanyahu has explicitly mentioned his strong ties to Microsoft. A list of these contracts with the Israeli military and government can be found here: An Introduction to Microsoft’s Complicity in Apartheid and Genocide
In fact, Microsoft is so deeply connected to the Israeli military that it was just yesterday designated one of the priority boycott targets of the BDS (Boycott, Divest, Sanctions) campaign.
Regardless of your political stances, is this the legacy we want to leave behind? Is working on deadly AI weapons something you can tell your children about? Do we want to be on the wrong side of history?
Even though your work could be unrelated to the cloud that the military uses, your work benefits the company and allows it to take on the contract. Regardless of your team, you serve a company that is arming the Israeli occupation. It is undeniable that part of your compensation, no matter how small, is being paid by genocide.
Whether you work on AI or not, you will be complicit if you do nothing. It is now OUR job to take a vocal stand against Microsoft AI’s involvement in crimes against humanity.
This is why I decided to speak up today, and why I signed this important petition to demand Microsoft cut ties with genocide. And I urge you all to do the same.
Call to Action
Silence is complicity. But action always has a reaction, no matter how big or small. As workers for this company, we must make our voices heard, and demand that Microsoft does the right thing: stop selling technology to the Israeli military.
If you are also concerned about this news, and you also want your work to be used ethically, I urge you to take action:
Sign the No Azure for Apartheid petition: We will not write code that kills. And join the campaign to add your voice to the growing number of concerned Microsoft employees.
Join me in showing our discontent in this thread. If you also feel tricked into deploying weapons which target children and civilians, urge leadership (CC’ed) to drop these contracts.
Don’t stop speaking up. Urge SLT to drop these contracts at every opportunity.
Start conversations with your co-workers about the points above – so many employees may not know!
Microsoft’s human rights statement prohibits retaliation against anyone who raises a human rights-related concern: Human rights statement | Microsoft CSR
Our company has precedents in supporting human rights, including divestment from apartheid South Africa and dropping contracts with AnyVision (Israeli facial recognition startup), after Microsoft employee and community protests. My hope is that our collective voices will motivate our AI leaders to do the same, and correct Microsoft’s actions regarding these human rights violations, to avoid a stained legacy. Microsoft Cloud and AI should stop being the bombs and bullets of the 21st century.
Sincerely,
A concerned Microsoft employee