

We count on machine learning systems for everything from creating playlists to driving cars, but like any tool, they can be bent toward dangerous and unethical purposes as well. Today’s illustration of this fact is a new paper from Stanford researchers, who have created a machine learning system that they claim can tell from a few pictures whether a person is gay or straight.

The research is as surprising as it is disconcerting. In addition to exposing an already vulnerable population to a new form of systematized abuse, it strikes directly at the egalitarian notion that we can’t (and shouldn’t) judge a person by their appearance, nor guess at something as private as sexual orientation from something as simple as a snapshot or two. And it demonstrates, as it is intended to, a class of threat to privacy that is entirely unique to the imminent era of ubiquitous computer vision.

Before discussing the system itself, it should be made clear that this research was by all indications done with good intentions. In an extensive set of authors’ notes that anyone commenting on the topic ought to read, Michal Kosinski and Yilun Wang address a variety of objections and questions.

The system relies on cues apparently more subtle than most can perceive - cues many would suggest do not exist. But the accuracy reported in the paper seems to leave no room for mistake: this is not only possible, it has been achieved.
