
**“Where Is Daddy?”: How AI Turned Family Homes into Target Practice**

Welcome to the future of warfare — where algorithms do the thinking, lawyers do the justifying, and families do the dying.

In this brave new age, bombs no longer fall randomly. No, that would be barbaric. Today, death is data-driven, AI-assisted, and ethically laundered through buzzwords like precision, efficiency, and security.

And the star of this technological circus?

A system chillingly nicknamed: “Where’s Daddy?”

Yes. That’s real.


Step 1: Let the Algorithm Decide Who Looks Killable

According to investigative reporting by Israeli outlets +972 Magazine and Local Call, the Israeli military has relied on an AI system known as Lavender to generate massive lists of Palestinians flagged as “suspected militants.”

How does Lavender decide?

Not through trials.
Not through warrants.
Not through verified intelligence.

But through pattern recognition, metadata, phone usage, social connections, and behavioral assumptions — the kind of data science Silicon Valley uses to recommend shoes, now repurposed to recommend death.

Human Rights Watch has documented how such digital systems assign suspicion scores based on vast but unreliable datasets — a process that is inherently error-prone and biased, especially when applied to a besieged population under blanket surveillance.

Thousands of names. Minimal human review. Efficiency achieved.

Progress!


Step 2: Track the “Suspect” — Preferably When He’s Home

Enter “Where’s Daddy?”

Once Lavender flags a person, this second system reportedly tracks their mobile phone location and alerts operators when the person enters their family home.

Why the home?

Because, according to intelligence sources quoted in the investigation, it’s easier to strike someone when they are not on the battlefield.

Translation:
They’re asleep.
They’re unarmed.
They’re surrounded by civilians.

Human Rights Watch notes that mobile phone location data is imprecise and unsuitable for life-and-death targeting decisions, yet it has reportedly been used to cue strikes anyway.

But hey — who needs accuracy when you have confidence?


Step 3: Call It “Precision” and Press Launch

This is where the magic happens.

A machine says: Target is home.
A human glances at the screen.
A bomb is dropped.

The family disappears.

And the press release later explains it all with comforting phrases like:

  • “High-value target”
  • “Operational necessity”
  • “Collateral damage”

Because when an algorithm helps choose the timing, suddenly mass civilian death becomes a technical side effect, not a moral catastrophe.

Human Rights Watch has warned that such systems risk violating international humanitarian law, particularly the principles of distinction and proportionality — the basic rules meant to prevent exactly this scenario.

But rules are old-fashioned. AI is modern.


Why This Isn’t Just “Technology Gone Wrong”

Let’s be very clear:

This is not a bug.
This is not an accident.
This is not “unfortunate misuse.”

This is design logic.

A system that:

  • Mass-produces targets
  • Tracks them digitally
  • Prefers moments of domestic presence
  • Accepts civilian death as tolerable

is not a defensive tool.

It is automation of moral abdication.

UN experts have warned that using AI in this way risks turning warfare into algorithmic extermination, where accountability dissolves into code and responsibility is outsourced to machines.


The Most Honest Part Is the Name

Let’s pause on the name again: “Where’s Daddy?”

Not Where’s the combatant?
Not Where’s the threat?
Not Where’s the weapon?

Daddy.

A word that assumes a home.
A family.
Children nearby.

It is perhaps the most truthful label of this entire operation — accidentally honest in a war otherwise drowning in euphemisms.


Conclusion: The Future We’re Being Sold

We are told this is the future of war:

  • Smarter
  • Cleaner
  • More precise

But Gaza shows us the truth.

AI doesn’t make war humane.
It makes killing faster, easier, and psychologically distant.

It allows people to die not because someone chose to kill them — but because a system suggested it, a screen confirmed it, and a bureaucracy approved it.

And the world watches.

Livestreamed.
Documented.
Scrolled past.

Because nothing says civilization like witnessing a massacre in real time — and calling it innovation.


Key Sources

  • +972 Magazine / Local Call investigations into Israeli AI targeting systems
  • Human Rights Watch, Questions and Answers: Israeli Military’s Use of Digital Tools in Gaza
  • UN Special Rapporteurs on AI, warfare, and civilian protection

