
**“Where’s Daddy?”: How AI Turned Family Homes into Target Practice**

Welcome to the future of warfare — where algorithms do the thinking, lawyers do the justifying, and families do the dying.

In this brave new age, bombs no longer fall randomly. No, that would be barbaric. Today, death is data-driven, AI-assisted, and ethically laundered through buzzwords like precision, efficiency, and security.

And the star of this technological circus?

A system chillingly nicknamed: “Where’s Daddy?”

Yes. That’s real.


Step 1: Let the Algorithm Decide Who Looks Killable

According to investigative reporting by Israeli outlets +972 Magazine and Local Call, the Israeli military has relied on an AI system known as Lavender to generate massive lists of Palestinians flagged as “suspected militants.”

How does Lavender decide?

Not through trials.
Not through warrants.
Not through verified intelligence.

But through pattern recognition, metadata, phone usage, social connections, and behavioral assumptions — the kind of data science Silicon Valley uses to recommend shoes, now repurposed to recommend death.

Human Rights Watch confirms that such digital systems assign suspicion scores based on vast but unreliable datasets — a process that is inherently error-prone and biased, especially in a besieged population under surveillance.

Thousands of names. Minimal human review. Efficiency achieved.

Progress!


Step 2: Track the “Suspect” — Preferably When He’s Home

Enter “Where’s Daddy?”

Once Lavender flags a person, this second system allegedly tracks their mobile phone location and alerts operators when the person enters their family home.

Why the home?

Because, according to intelligence sources quoted in the investigation, it’s easier to strike someone when they are not on the battlefield.

Translation:
They’re asleep.
They’re unarmed.
They’re surrounded by civilians.

Human Rights Watch notes that mobile phone location data is imprecise and unsuitable for life-and-death targeting decisions, yet it has reportedly been used to cue strikes anyway.

But hey — who needs accuracy when you have confidence?


Step 3: Call It “Precision” and Press Launch

This is where the magic happens.

A machine says: Target is home.
A human glances at the screen.
A bomb is dropped.

The family disappears.

And the press release later explains it all with comforting phrases like:

  • “High-value target”
  • “Operational necessity”
  • “Collateral damage”

Because when an algorithm helps choose the timing, suddenly mass civilian death becomes a technical side effect, not a moral catastrophe.

Human Rights Watch has warned that such systems risk violating international humanitarian law, particularly the principles of distinction and proportionality — the basic rules meant to prevent exactly this scenario.

But rules are old-fashioned. AI is modern.


Why This Isn’t Just “Technology Gone Wrong”

Let’s be very clear:

This is not a bug.
This is not an accident.
This is not “unfortunate misuse.”

This is design logic.

A system that:

  • Mass-produces targets
  • Tracks them digitally
  • Prefers moments of domestic presence
  • Accepts civilian death as tolerable

is not a defensive tool.

It is automation of moral abdication.

UN experts have warned that using AI in this way risks turning warfare into algorithmic extermination, where accountability dissolves into code and responsibility is outsourced to machines.


The Most Honest Part Is the Name

Let’s pause on the name again: “Where’s Daddy?”

Not Where’s the combatant?
Not Where’s the threat?
Not Where’s the weapon?

Daddy.

A word that assumes a home.
A family.
Children nearby.

It is perhaps the most truthful label of this entire operation — accidentally honest in a war otherwise drowning in euphemisms.


Conclusion: The Future We’re Being Sold

We are told this is the future of war:

  • Smarter
  • Cleaner
  • More precise

But Gaza shows us the truth.

AI doesn’t make war humane.
It makes killing faster, easier, and psychologically distant.

It allows people to die not because someone chose to kill them — but because a system suggested it, a screen confirmed it, and a bureaucracy approved it.

And the world watches.

Livestreamed.
Documented.
Scrolled past.

Because nothing says civilization like witnessing a massacre in real time — and calling it innovation.


Key Sources

  • +972 Magazine / Local Call investigations into Israeli AI targeting systems
  • Human Rights Watch, Questions and Answers: Israeli Military’s Use of Digital Tools in Gaza
  • UN Special Rapporteurs on AI, warfare, and civilian protection

