WASHINGTON: The UN has no way to verify Israel's reported use of an artificial intelligence-powered program to help identify bombing targets in the Gaza Strip, a UN spokesman said Thursday.
“We’ve read the press reports, we have no way of verifying them, but upon reading them, I think they are very clear illustrations of the kinds of concerns that the Secretary-General (Antonio Guterres) raised directly,” Stephane Dujarric told reporters when asked about the recent reports.
Dujarric said it could be a “real world example” of how the technology is being used.
A report found that the Israeli army is using an artificial intelligence-powered program that marks thousands of Palestinian men, along with their homes, as potential targets for military strikes in Gaza.
The AI system, called Lavender, is designed to mark all suspected operatives in the armed wings of the Palestinian groups Hamas and Islamic Jihad as potential targets.
“I think it is yet another reason and motivation for member states to coalesce and agree on safeguards, frameworks, and protections on the use of such technology,” said Dujarric.
He added that the UN chief pushed for global agreements:
“I think he’s made his position known, but it is for those who control that technology, both at the state level and the private-sector level, to agree on a way forward that doesn’t endanger the human race,” he said.