Consent Optional Tech Genius

An AI company deleted 3 million OkCupid photos and the facial-recognition models trained on them after FTC scrutiny, because apparently "we found a huge pile of dating photos" was not a real consent framework.

Reuters reports Clarifai said it deleted millions of OkCupid user photos and related facial-recognition models after an FTC settlement over data that had originally been shared in 2014.

What Happened

Reuters reports AI company Clarifai said it deleted 3 million OkCupid user photos and facial-recognition models trained on them after the Federal Trade Commission settled with OkCupid over privacy violations tied to a 2014 data transfer. According to Reuters, court documents showed Clarifai's founder wrote that OkCupid must have a "HUGE amount of awesome data" while the company was collecting images to train facial-recognition systems.

That is already the whole genre in one sentence. A dating site had intimate user photos. An AI company looked at them like a bulk discount warehouse for machine vision. Years later, regulators had to step in and ask whether maybe people uploading pictures for romance were not actually volunteering for faceprint experiments.

Why This Belongs Here

This is internet nonsense at industrial scale: take one context, strip out the human expectations attached to it, and feed the leftovers into a totally different business because the database looked useful. The underlying logic is always the same. If the data exists, someone in tech will eventually convince themselves it is ethically available enough.

The especially stupid part is that none of this required a complicated moral puzzle. Dating-profile photos are among the easiest examples of context-dependent personal data imaginable. People shared them to meet other humans, not to help train systems that classify faces by identity, age, race, or gender. Yet the industry keeps acting stunned whenever users object to discovering that their private-seeming digital life was actually raw material.

The Bigger Joke

We are supposed to be impressed that the company deleted the models after scrutiny. Fine. Better than not deleting them. But it also means those models existed long enough to become a regulatory afterthought instead of an obvious nonstarter from day one. Somewhere in corporate America, adults looked at a pile of dating-app photos and said yes, this seems like robust machine-learning nutrition.

The AI boom loves telling us the future is inevitable. Funny how the inevitable part always seems to involve harvesting first, apologizing later, and calling the cleanup a sign that the system worked.

Source

Reuters: AI company deleted OKCupid user photos, data after FTC scrutiny
