
**“Where Is Daddy?”: How AI Turned Family Homes into Target Practice**

Welcome to the future of warfare — where algorithms do the thinking, lawyers do the justifying, and families do the dying.

In this brave new age, bombs no longer fall randomly. No, that would be barbaric. Today, death is data-driven, AI-assisted, and ethically laundered through buzzwords like precision, efficiency, and security.

And the star of this technological circus?

A system chillingly nicknamed: “Where’s Daddy?”

Yes. That’s real.


Step 1: Let the Algorithm Decide Who Looks Killable

According to investigative reporting by Israeli outlets +972 Magazine and Local Call, the Israeli military has relied on an AI system known as Lavender to generate massive lists of Palestinians flagged as “suspected militants.”

How does Lavender decide?

Not through trials.
Not through warrants.
Not through verified intelligence.

But through pattern recognition, metadata, phone usage, social connections, and behavioral assumptions — the kind of data science Silicon Valley uses to recommend shoes, now repurposed to recommend death.

Human Rights Watch has documented that such digital systems assign suspicion scores based on vast but unreliable datasets — a process that is inherently error-prone and biased, especially in a besieged population under surveillance.

Thousands of names. Minimal human review. Efficiency achieved.

Progress!


Step 2: Track the “Suspect” — Preferably When He’s Home

Enter “Where’s Daddy?”

Once Lavender flags a person, this second system allegedly tracks their mobile phone location and alerts operators when the person enters their family home.

Why the home?

Because, according to intelligence sources quoted in the investigation, it’s easier to strike someone when they are not on the battlefield.

Translation:
They’re asleep.
They’re unarmed.
They’re surrounded by civilians.

Human Rights Watch notes that mobile phone location data is imprecise and unsuitable for life-and-death targeting decisions, yet it has reportedly been used to cue strikes anyway.

But hey — who needs accuracy when you have confidence?


Step 3: Call It “Precision” and Press Launch

This is where the magic happens.

A machine says: Target is home.
A human glances at the screen.
A bomb is dropped.

The family disappears.

And the press release later explains it all with comforting phrases like:

  • “High-value target”
  • “Operational necessity”
  • “Collateral damage”

Because when an algorithm helps choose the timing, suddenly mass civilian death becomes a technical side effect, not a moral catastrophe.

Human Rights Watch has warned that such systems risk violating international humanitarian law, particularly the principles of distinction and proportionality — the basic rules meant to prevent exactly this scenario.

But rules are old-fashioned. AI is modern.


Why This Isn’t Just “Technology Gone Wrong”

Let’s be very clear:

This is not a bug.
This is not an accident.
This is not “unfortunate misuse.”

This is design logic.

A system that:

  • Mass-produces targets
  • Tracks them digitally
  • Prefers moments of domestic presence
  • Accepts civilian death as tolerable

is not a defensive tool.

It is the automation of moral abdication.

UN experts have warned that using AI in this way risks turning warfare into algorithmic extermination, where accountability dissolves into code and responsibility is outsourced to machines.


The Most Honest Part Is the Name

Let’s pause on the name again: “Where’s Daddy?”

Not Where’s the combatant?
Not Where’s the threat?
Not Where’s the weapon?

Daddy.

A word that assumes a home.
A family.
Children nearby.

It is perhaps the most truthful label of this entire operation — accidentally honest in a war otherwise drowning in euphemisms.


Conclusion: The Future We’re Being Sold

We are told this is the future of war:

  • Smarter
  • Cleaner
  • More precise

But Gaza shows us the truth.

AI doesn’t make war humane.
It makes killing faster, easier, and psychologically distant.

It allows people to die not because someone chose to kill them — but because a system suggested it, a screen confirmed it, and a bureaucracy approved it.

And the world watches.

Livestreamed.
Documented.
Scrolled past.

Because nothing says civilization like witnessing a massacre in real time — and calling it innovation.


Key Sources

  • Yuval Abraham, “‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza,” +972 Magazine and Local Call, April 2024
  • Human Rights Watch, “Questions and Answers: Israeli Military’s Use of Digital Tools in Gaza,” September 2024
  • Statements by UN Special Rapporteurs on AI, warfare, and civilian protection

