Cynthia Dwork on AI Guidelines

EPIC President Marc Rotenberg requested input from several groups – including the EPIC Advisory Board – for a one-page document outlining principles for the ethical use of AI. The resulting “Universal Guidelines for Artificial Intelligence” sets out twelve principles, including a “Fairness Obligation” calling for institutions to ensure that “AI systems do not reflect bias or make impermissibly discriminatory decisions.” But as EPIC Advisor Cynthia Dwork points out, algorithmic “fairness” is hard to define.

Dwork stopped by to engage EPIC staff in an open discussion of fairness in the development of algorithms and to share a recent lecture she gave on the topic. As a scientist for Microsoft and over many years of practice, she has worked on private data analysis, cryptography, combating spam, complexity theory, web search, voting theory, distributed computing, interconnection networks, and algorithm design. “When I see something that says algorithms should be fair, what does that mean?” Dwork said. “When you say you want an algorithm to be fair, on its face it is hard because there is no source of ground truth.”

Rotenberg suggested this may be a reason to keep AI out of such decisions and leave decision-making to individuals. But according to Dwork, AI systems are little worse than what humans are already doing. Since “algorithms can only learn what it sees,” she said, the solution might be to find ways to apply algorithms objectively. “Something can be useful without being perfect,” Dwork said. While she sees transparency as a concern, she also considers it something of a red herring. Instead, Dwork has spent a lot of time thinking about interpretability – essentially, trying to understand why an algorithm interprets data one way versus another. For instance, why might the results for two similarly situated people differ so sharply when one minor factor, like region, is changed? For Dwork this is the better question, and one where greater understanding may lessen resistance to machine learning. “I’m in favor of testing the system,” she said. “And I am in favor of the code being available, but I don’t think it [transparency] will solve the problem.”
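The kind of test Dwork alludes to can be made concrete by scoring near-identical inputs that differ in only one attribute and inspecting the gap. The sketch below is a minimal illustration of that idea, not Dwork’s formal individual-fairness framework; the toy_score function and the “income,” “credit_history,” and “region” fields are hypothetical stand-ins for a real model and its inputs.

```python
# Minimal sketch of a "similarly situated" consistency check: alter one
# attribute (here, a hypothetical "region" field) and compare the scores
# for otherwise identical applicants. toy_score is an illustrative
# stand-in, not any real system's model.

from typing import Callable, Dict

Record = Dict[str, float]


def toy_score(record: Record) -> float:
    """Illustrative scoring rule; a real system would be a trained model."""
    return (0.6 * record["income"]
            + 0.4 * record["credit_history"]
            - 0.2 * record["region"])  # region arguably should not matter


def consistency_gap(score: Callable[[Record], float],
                    record: Record,
                    attribute: str,
                    alt_value: float) -> float:
    """Return how much the score changes when only `attribute` is altered."""
    counterfactual = dict(record, **{attribute: alt_value})
    return abs(score(record) - score(counterfactual))


if __name__ == "__main__":
    applicant = {"income": 0.7, "credit_history": 0.8, "region": 0.0}
    gap = consistency_gap(toy_score, applicant, "region", alt_value=1.0)
    print(f"Score gap from changing region alone: {gap:.2f}")
    # A large gap flags a case worth explaining: why does one minor
    # factor move the outcome for otherwise similar individuals?
```

Run over many records, a check like this points to the cases where an interpretability question is worth asking, without requiring that the code itself be published.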

For more information, visit www.EPIC.org. Defend Privacy. Support EPIC.