When Your Feature Gets Someone Hurt

An old photo album containing pictures of ruins. (Generated with Gemini)

A story making the rounds recently carried some disturbing implications. The New York Times ran it as "Israel Deploys Expansive Facial Recognition Program in Gaza," which says in part:

To supplement Corsight’s technology, Israeli officers used Google Photos, the free photo sharing and storage service from Google, three intelligence officers said. By uploading a database of known persons to Google Photos, Israeli officers could use the service’s photo search function to identify people.

Whether this particular use of Google Photos complies with the usage policy is beside the point. That is a concern for PR firms and lawyers. Google Photos is a consumer product. People use it to tag family and friends in their vacation photos, and it works pretty well for that purpose.

The human gulf between the product's developers and those who are unexpectedly and unwittingly blindfolded and led away to some unknown prison is an entirely different matter.

The intended usage of a product, especially a software product, matters greatly. The people responsible for a feature judge whether it is fit for purpose against that intended usage. From their point of view, a mislabelled photo is an inconvenience; possibly a funny one. The error thresholds were never chosen with the possibility in mind that someone could lose their life over a mistake.

Some developer wrote a conditional branch without giving it a second thought. And now it decides the fate of a human life. That’s quite the turn of events.
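To make the point concrete, here is a minimal sketch of the kind of branch in question. None of this is Google's actual code; the names (match_confidence, SAME_PERSON_THRESHOLD) and the threshold value are entirely hypothetical, chosen only to illustrate how a tolerance tuned for consumer convenience knows nothing about the stakes riding on its answer:

```python
# Hypothetical illustration only; not Google's code.
# The threshold is tuned so that vacation photos cluster nicely.
SAME_PERSON_THRESHOLD = 0.8

def is_same_person(match_confidence: float) -> bool:
    # A false positive here was imagined as a mislabelled photo,
    # an inconvenience. Nothing in this branch knows what else
    # might one day depend on its answer.
    return match_confidence >= SAME_PERSON_THRESHOLD
```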

There’s no moral here. People are going to use your products in ways that you never intended. I bet the people who designed coins had no idea what fates those coins would one day be flipped to decide. And so it goes.