Sociotechnical systems approach to developing AI Ethics skills.
The rapid adoption of 4IR tools into our human systems has alerted us to an urgent need to enhance the ethical acuity of all graduates (Gleason, 2018) and foster their ability to understand systemic consequences of new technologies (Seldon, 2018). The OECD (2019) stresses the need to consider social policy and protection when organisations are adapting to Industry 4.0 to avoid dystopian futures of widespread unemployment.
There have been too many cases of 4IR tools being poorly implemented in human systems, at deep ethical cost (O’Neil, 2016; Pasquale, 2015; ProPublica, 2019). This has sometimes been ascribed to a lack of liberal-education skill-sets among technology designers, engineers, and business managers. Ethical acuity about the systemic social impact of 4IR technologies is needed not only by graduates working in tech engineering and biotech, but by all research graduates planning to work in fields that intersect with technology and society. Which means just about everyone!
An often-cited example of an ethically irresponsible application of new technologies to social systems is a recidivism prediction software platform implemented in several states in the USA. In 2016 ProPublica, a non-profit investigative journalism organisation, uncovered deeply flawed bias in the application of the software.
“Overall, Northpointe’s assessment tool correctly predicts recidivism 61 percent of the time. But blacks are almost twice as likely as whites to be labeled a higher risk but not actually re-offend. It makes the opposite mistake among whites: They are much more likely than blacks to be labeled lower risk but go on to commit other crimes.” (ProPublica, retrieved February 2020)
| | White | African American |
|---|---|---|
| Labeled Higher Risk, But Didn’t Re-Offend | 23.5% | 44.9% |
| Labeled Lower Risk, Yet Did Re-Offend | 47.7% | 28.0% |

Source: ProPublica analysis of data from Broward County, Fla.
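The asymmetry in the table above amounts to two standard error rates computed separately for each group: the false positive rate (labeled higher risk but did not re-offend) and the false negative rate (labeled lower risk yet did re-offend). A minimal sketch of how such group-wise rates are computed, using small hypothetical groups rather than the actual Broward County records:

```python
def error_rates(records):
    """Return (false positive rate, false negative rate) for a list of
    (predicted_high_risk, reoffended) boolean pairs.

    FPR: share of non-reoffenders labeled higher risk.
    FNR: share of reoffenders labeled lower risk.
    """
    non_reoffenders = [r for r in records if not r[1]]
    reoffenders = [r for r in records if r[1]]
    fpr = sum(1 for predicted, _ in non_reoffenders if predicted) / len(non_reoffenders)
    fnr = sum(1 for predicted, _ in reoffenders if not predicted) / len(reoffenders)
    return fpr, fnr

# Hypothetical groups illustrating the pattern in the table:
# group_a is more often wrongly labeled low risk, group_b high risk.
group_a = [(True, False)] + [(False, False)] * 3 + [(True, True)] * 2 + [(False, True)] * 2
group_b = [(True, False)] * 2 + [(False, False)] * 2 + [(True, True)] * 3 + [(False, True)]

print(error_rates(group_a))  # (0.25, 0.5)
print(error_rates(group_b))  # (0.5, 0.25)
```

A tool can be equally "accurate" overall for two groups while distributing its mistakes very differently between them, which is precisely the disparity at issue in the debate below.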
There has been substantial debate on the reported bias of the Northpointe software. Overall, the evidence indicates in-built bias in the software (Chouldechova, 2017), likely due to sociological flaws in the risk assessment frameworks (Werth, 2019). Flaws in algorithmic risk assessment have even more critical impact when we consider the development of widespread social applications such as health insurance ratings, social credit, and friend-networks. These are clear cases of 4IR technologies being implemented where engineers need to collaborate with social scientists and consider systemic ethical impact on society. These types of skills are best developed in education environments.
As 4IR technology becomes more pervasive and prevalent in our STS, it should be incumbent upon HEIs to better prepare all graduates to ethically assess the impact of new technologies before implementation. To address this, we must ask: what qualities help future employees design, implement, and interact with ethically responsible technologies? Are there capabilities that need to be explicitly added to lists of graduate qualities to address ethically responsible leadership in 4IR?
Our PhD graduates receive the highest level of education in our societies and as such should be equipped to be leaders in their organisations and communities beyond economic considerations (Molla & Cuthbert, 2019). The need for deeper ethical understanding of cyber-physical systems amongst higher education graduates has been widely discussed (e.g. Davidson, 2017; Gleason, 2018). As PhDs are often expected to be leaders in new ways of thought (Walker & Fongwa, 2017), these types of skills should be given high priority in GRE.