Algorithms That Affect People’s Lives Should Be Transparent, Says Cyber Law Expert

Nimrod Kozlovski, partner at Israel-based law firm Herzog Fox & Neeman and lecturer at Tel Aviv University, spoke at Calcalist’s Mind the Tech conference in Tel Aviv

Lior Gutman | 17:46, 25.11.19
Recently, U.S. legislators admitted for the first time that they were wrong to let commercial companies rate people to determine who gets credit or who stays in prison, Nimrod Kozlovski, partner at Israel-based law firm Herzog Fox & Neeman and lecturer at Tel Aviv University, said Monday.

Kozlovski spoke at the Mind the Tech conference, held in Tel Aviv by Calcalist and Israel’s Bank Leumi. From now on, every artificial intelligence project in the U.S. must have its underlying database evaluated to determine whether it creates consistent bias, Kozlovski said.

Nimrod Kozlovski. Photo: Yariv Katz

Kozlovski presented three examples in which databases allegedly introduced bias into decision-making processes, directly affecting people’s lives.

In the U.S., the number of prisoners per capita is among the highest in the world, Kozlovski said. This led authorities to examine house arrest as an alternative to incarceration, he explained. The next logical step was to build an artificial intelligence-based system that evaluates each prisoner and ranks their likelihood of reoffending, he added. “The ranking effectively determined who was released to house arrest and who remained in prison, but when the results were examined, the databases were found to be outdated or otherwise inaccurate.”

The second example Kozlovski gave was U.S. social services, where algorithms attempt to determine whether a child is at risk. According to Kozlovski, examination of this system also revealed bias, as it was based on decades-old assumptions.


The last example was credit ratings, where women tended to score lower than men based on outdated assumptions about their careers.
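For readers unfamiliar with what evaluating a database for bias entails, the following minimal sketch illustrates one common type of check, a disparate-impact ratio comparing favorable-outcome rates between groups. It is not drawn from Kozlovski’s remarks; the column names, data, and the 0.8 threshold (the so-called “four-fifths rule” used in some U.S. employment contexts) are purely illustrative:

```python
# Minimal sketch of a disparate-impact check on hypothetical credit-approval
# records. All names, data, and thresholds are illustrative assumptions,
# not the audit method referenced in the talk.
import pandas as pd

def disparate_impact_ratio(df: pd.DataFrame, group_col: str, outcome_col: str,
                           protected: str, reference: str) -> float:
    """Ratio of favorable-outcome rates: protected group vs. reference group."""
    rates = df.groupby(group_col)[outcome_col].mean()
    return rates[protected] / rates[reference]

# Hypothetical records: 1 = credit approved, 0 = denied.
records = pd.DataFrame({
    "gender":   ["F", "F", "F", "F", "M", "M", "M", "M"],
    "approved": [0,   1,   0,   1,   1,   1,   0,   1],
})

ratio = disparate_impact_ratio(records, "gender", "approved",
                               protected="F", reference="M")
# A common rule of thumb flags ratios below 0.8 for further review.
print(f"Disparate impact ratio: {ratio:.2f}",
      "- flagged for review" if ratio < 0.8 else "- within threshold")
```

On this toy data the approval rate is 50% for women versus 75% for men, giving a ratio of about 0.67, which such a check would flag. A real evaluation of the kind Kozlovski described would also have to account for outdated or inaccurate records, not just unequal outcomes.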

Companies and public agencies should be required to reveal the data on which their algorithms base decisions, giving people a chance to appeal or sue when a faulty algorithm affects their lives, Kozlovski said.