
Is WhatsApp putting Palestinians at risk of being killed in Gaza?

April 30, 2024 at 5:23 pm

In this photo illustration, logos of WhatsApp are displayed on mobile phone screen and computer screen in Ankara, Turkiye on May 23, 2023 [İsmail Kaplan – Anadolu Agency]

For decades, humans have lived with the prospect of a future where wars would be fought with so-called killer robots driven by technology that always seemed otherworldly.

That is now a terrifying reality, given Israel’s use of artificial intelligence in its ongoing deadly assault on the Gaza Strip, documented in reports and investigations by various outlets, none more important than the Israeli publications +972 Magazine and the Hebrew-language Local Call.

They revealed the use of AI programs such as “The Gospel”, “Lavender” and “Where’s Daddy?”, all of which were used to identify tens of thousands of Gazans as targets, track and strike people specifically in their homes, and essentially run a “mass assassination factory” with minimal human oversight.

A critical detail in their early April report about “Lavender” and “Where’s Daddy?” related to how the software was purportedly gathering data from WhatsApp — the communication behemoth owned by tech giant Meta.

WATCH: Palestine This Week: Israel’s AI war machine and the targeting of civilians

That particular piece of information piqued the interest of Paul Biggar, a software engineer, innovator and founder of Tech For Palestine, a coalition of technology experts working to benefit Palestinians.

He put out a blog raising concerns over Meta’s possible involvement in Israel’s devastating AI-powered war on Gaza.

Speaking to Anadolu, Biggar said he views “Lavender” as one of the tools that Israel is using as a “way of automating the genocide (in Gaza)”.

“It allows them to target individuals and create a layer of plausible deniability, where they say that these individuals were identified by AI as being valid targets, which is not true,” he said.

He said there is no “real reason to believe that any of these targets are valid” and the Israeli military does “no due diligence in identifying or investigating the targets suggested by the AI system”.

WhatsApp and Meta’s alleged involvement

Biggar said his blog was “specifically about Meta’s involvement” because the reporting on “Lavender” suggested that one of the ways the system identifies targets is “through what WhatsApp groups people are part of”.

What he was referring to was a part of the +972 and Local Call report about “a short guide to building a ‘target machine’, similar in description to ‘Lavender’, based on AI and machine-learning algorithms”.

That guide, according to the report, was in a book — titled “The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World” — released in English in 2021 under the pen name “Brigadier General YS”.

The report said the +972 and Local Call investigation had confirmed the author “to be the current commander of the elite Israeli intelligence unit 8200.”

According to the report, the guide to creating such an AI system offered several examples of the “hundreds and thousands” of features that can increase an individual’s rating (the likelihood of being identified as a target), such as being in a WhatsApp group with a known militant, changing cell phone every few months and changing addresses frequently.

Biggar termed that a “ludicrous” suggestion.
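The kind of weighted “feature” scoring the book reportedly describes can be sketched in a few lines. This is purely an illustration: every feature name and weight below is hypothetical, and, as Biggar argues, such behavioural proxies are not evidence of anything.

```python
# Illustrative toy sketch of a weighted-feature "rating" of the kind the
# book reportedly describes. All feature names and weights are invented.

WEIGHTS = {
    "in_group_with_flagged_contact": 0.5,  # e.g. shared WhatsApp group
    "frequent_phone_change": 0.3,
    "frequent_address_change": 0.2,
}

def rating(features: dict[str, bool]) -> float:
    """Sum the weights of whichever features are flagged as present."""
    return sum(w for name, w in WEIGHTS.items() if features.get(name))

score = rating({"in_group_with_flagged_contact": True,
                "frequent_phone_change": True})
```

The sketch makes the critics’ point concrete: the score rises from circumstantial correlations (group membership, phone habits), not from any verified act.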

READ: Israel military using AI system to target militants, bomb civilians

“We know from other sources that Hamas does not coordinate attacks on any sort of mobile phone-based things, WhatsApp or anything like that,” he said.

“So, what they’re really suggesting is who do people know? Who are they friends with? … The membership of a WhatsApp group is in no way incriminating and it is ludicrous to suggest it.”

Is Meta giving information to Israel?

For Biggar, it is a “fact” that Israel is getting WhatsApp data, but it is still unclear whether it is being directly provided by Meta.

“Perhaps the IDF (Israel Defence Forces) has other ways of accessing this data … Perhaps they don’t get it from the front door through Meta,” he said.

One possibility is that they are “getting it through the many Unit 8200 members” now working at Meta, he said, referring to the same Israeli intelligence unit mentioned in the +972 and Local Call report about “Lavender”.

“Lots of people at Meta used to work at the IDF, used to be in Unit 8200, including their chief information security officer,” Biggar claimed.

“There is also Sheryl Sandberg, their former COO and one of the major people who built Facebook to be what it is today and who remains on their board. She has also been on a propaganda tour for Israel,” he added.

As for Meta founder Mark Zuckerberg, Biggar pointed out that he has “given donations to (Israeli NGO) Zaka, who are some of the people who created the false propaganda used to justify the genocide to Israelis and to the Western world”.

So, Israel could be getting the data directly from Meta through information requests, through the backdoor or a third undisclosed way, he said.

In any of these scenarios, the main issue becomes that “Meta is pitching WhatsApp as being this end-to-end secure thing when they, it seems, should know that is not true,” he said.

READ: Israel using Meta’s WhatsApp to kill Palestinians in Gaza through AI system

‘Vulnerable to abuse and intrusive external surveillance’

Bahraini blogger and activist, Esra’a Al Shafei, believes the reports about WhatsApp data possibly being used by Israel should “definitely be taken very seriously”.

“If the claims in the report are true, it shows that by using WhatsApp, people are risking their lives,” Al Shafei, a board member of the Tor Project, a digital privacy and freedom group, told Anadolu.

She pointed to metadata — “data around the data itself” — as an area of vulnerability, saying privacy advocates strongly oppose its collection and storage “particularly for apps like WhatsApp which falsely advertise their product as fully private.”

While WhatsApp encrypts the content of messages, it still collects various information, including “app activity … location, financial information (if the user has ever connected that number to a payment portal managed by or accessible to Meta), contacts, groups,” she explained.

“Even though WhatsApp is end-to-end encrypted, and claims to not have any backdoors to any government, the metadata alone is sufficient to expose detailed information about users, especially if the user’s phone number is attached to other Meta products and related activities,” she said.

“This is why the IDF could plausibly utilise metadata to track and locate WhatsApp users,” she added.
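What metadata alone can reveal is easy to illustrate. In the toy sketch below (all records invented), no message content appears at all, yet contact patterns fall straight out of simple (sender, recipient, timestamp) logs of the kind Al Shafei describes.

```python
# Illustrative only: metadata records with no message content are enough
# to reconstruct who talks to whom, and how often. All values invented.
from collections import Counter

metadata = [  # (sender, recipient, unix_timestamp) -- content never stored
    ("A", "B", 1700000000),
    ("A", "B", 1700000300),
    ("A", "C", 1700000600),
]

# Frequency of contact between each pair, derivable from metadata alone
contact_counts = Counter((sender, recipient) for sender, recipient, _ in metadata)
```

Scaled up to timestamps, locations and group rosters, the same counting exercise yields exactly the kind of social graph that privacy advocates warn about.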

However, Al Shafei emphasised that all of this does not mean that Meta or WhatsApp are necessarily collaborating with Israel, but “by the very act of collecting this information, they’re making themselves vulnerable to abuse and intrusive external surveillance.”

More hacking than handoff

Researcher and journalist, Sophia Goodfriend, believes the case of Israel using WhatsApp data has more to do with hacking than deliberate cooperation.

“With WhatsApp, in particular, that’s a case of the military simply hacking into phones and going through WhatsApp data,” she told Anadolu.

“WhatsApp isn’t necessarily providing all of that information to the Israeli military,” said Goodfriend, who has written for various outlets, including +972 Magazine, on warfare, automation and digital rights.

This is less about “collaboration and more the Israeli military, like many militaries around the world, (being) able to hack into these technologies,” she said.

READ: US looking at report that Israel used AI to identify bombing targets in Gaza

Also, she explained, there are many instances where militaries use technologies in ways that are against the policies of the companies themselves.

“There was a case in March, a report by the New York Times, about the (Israeli) military using a Google image database to build up biometric surveillance of Gazans who were fleeing from the north into the south in the first few months of the war,” she said.

“That’s a case of the Israeli military using an open-source database against its own stipulations.”

In such cases, she stressed that tech firms “who find their technologies being used to build up military surveillance systems or inform military operations have a responsibility to ensure that their own technologies are not going directly against their use policies.”

While direct collaboration between major corporations and militaries on such operations is “not happening as much”, there are other kinds of collaboration with private civilian technology firms, she said.

“You have plenty of examples of start-ups contracted to build up different surveillance technologies that could be informing lethal targeting systems,” she said.

“It’s a messy kind of technological ecosystem and plenty of civilian firms are implicated just by virtue of contracts and sub-contracting, and even just having their technology out there and the Israeli military being able to use it.”

Regarding what Israel uses to feed the AI systems it has employed in Gaza, she pointed to its “pretty advanced network of surveillance technologies … across the West Bank, Gaza, as well as Israel proper.”

“This includes biometric surveillance, cyber-hacking technologies, drone reconnaissance, GPS tracking and social media monitoring as well,” she said.

“All of these different sources feed into systems like the ones we’re seeing rolled out in Gaza, including ‘Lavender’ and ‘Where’s Daddy?’,” she added.

On the actual AI systems and their role in the destruction of Gaza, Goodfriend said it was another example of “how militaries can use these systems to carry out their own agendas.”

“We saw Israel’s military really emphasising destruction rather than accuracy … (and) we saw these systems really abetting this military campaign,” she said.

READ: Israel aims to be ‘AI superpower’, advance autonomous warfare

However, she emphasised that these systems are “quite rudimentary” and really “are not autonomous weapon systems acting without human oversight.”

“It was actually the decision-makers who are calling the shots. If all the reporting is confirmed, it’s really the decision-makers who are … directing the military to bomb all these targets and to rely on AI to produce an endless stream of targets to be bombed,” she said.

Call for answers

In a statement to Anadolu, a WhatsApp spokesperson said the company has “no information that these reports are accurate.”

“WhatsApp has no backdoors and we do not provide bulk information to any government. For over a decade, Meta has provided consistent transparency reports and those include the limited circumstances when WhatsApp information has been requested,” read the statement.

Such requests are reviewed and evaluated “based on applicable law and consistent with internationally recognised standards, including human rights,” it said.

“Our next report will come next month, on time. We do agree there is much more to privacy than end-to-end encryption, which is why we work hard to protect the limited information available to us and we continue to build more features to protect people’s information,” the statement added.

For Biggar, the Tech For Palestine founder, that is not enough, and he wants answers from Meta.

“Meta should be issuing a public report that states exactly what they know. They should be doing an investigation beyond what they know. They need to do an investigation both internally and externally … to discover, was the IDF accessing this information … (or) who were they getting it from internally?” he said.

“If it was a hack that was not revealed, did Meta know about this hack or flaws in their encryption? They have been marketing it as end-to-end … (so) were they aware that it was not end-to-end for people in Palestine?”

Al Shafei concurred, saying the “responsible thing for Meta to do is fully investigate the … investigation in the reports on ‘Lavender’ regarding how they’re utilising WhatsApp’s metadata to track, harm or kill its users throughout Palestine.”

“Simply claiming that they’re absolved of liability because they don’t actively play a role in providing a backdoor is insufficient. WhatsApp is used by billions of people and these users have a right to know what the dangers are in using the app, or what WhatsApp and Meta will do to proactively protect them from such misuse,” she said.

READ: UN experts denounce use of purported AI to commit ‘domicide’ in Gaza

The views expressed in this article belong to the author and do not necessarily reflect the editorial policy of Middle East Monitor.