Paid for by patrons
Episode 17 - Diurnal High Variance
Hilary and Roger talk about Amazon's Echo and Alexa as AI-as-a-service, the COMPAS algorithm, criminal justice forecasts, and whether algorithms can introduce or remove bias (or both).

Show Notes:


In Two Moves, AlphaGo and Lee Sedol Redefined the Future (http://www.wired.com/2016/03/two-moves-alphago-lee-sedol-redefined-future/)


Google’s AI won the game Go by defying millennia of basic human instinct (http://qz.com/639952/googles-ai-won-the-game-go-by-defying-millennia-of-basic-human-instinct/)


Machine Bias: There’s Software Used Across the Country to Predict Future Criminals. And it’s Biased Against Blacks (https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing)


ProPublica analysis of COMPAS (https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm)


Richard Berk’s Criminal Justice Forecasts of Risk (http://www.amazon.com/Criminal-Justice-Forecasts-Risk-SpringerBriefs/dp/1461430844?ie=UTF8&*Version*=1&*entries*=0)


Cathy O’Neil: Weapons of Math Destruction (http://www.amazon.com/Weapons-Math-Destruction-Increases-Inequality/dp/0553418815)


Cathy O’Neil: I’ll stop calling algorithms racist when you stop anthropomorphizing AI (https://mathbabe.org/2016/04/07/ill-stop-calling-algorithms-racist-when-you-stop-anthropomorphizing-ai/)


The rmsfact package (https://cran.r-project.org/web/packages/rmsfact/index.html)


useR! 2016 (http://user2016.org)