The Guardian is reporting that Israeli intelligence officers say the I.D.F. used an A.I. called Lavender to target over 37,000 people in Gaza with connections to Hamas.
According to the report, the A.I. ‘did it coldly,’ i.e. without emotion, and one operator said he spent only about twenty seconds approving each target. From the article:
Another Lavender user questioned whether humans’ role in the selection process was meaningful. “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time.”
Here is the link: www.theguardian.com/…
I know we crack jokes about Skynet, but there is real worry about the accountability of A.I. in many areas, particularly warfare. It will be interesting to see how nations and communities respond to this news, and whether any valuable moral lessons are learned.