Dignity in Tech: An origin story for how my PhD topic on dignity-centred AI development came to be

Lorenn Ruster
7 min read · Mar 31, 2023

Adapted from my Thesis Proposal Review (submitted June 2022)

(Some of) the backstory

In September 2015, I was sitting in a room with ten other Acumen Global Fellows from all over the world and Acumen’s CEO, Jacqueline Novogratz. We were in the United States countryside, a few hours outside of New York City, for a weekend discussing the Good Society Readings — a diverse set of readings from Martin Luther King, Plato, Amartya Sen, Amin Maalouf, Ursula Le Guin and many others. Acumen is a global non-profit changing the way the world tackles poverty through ‘patient capital’ — taking money in as philanthropy and investing it as venture capital, patiently, over long timeframes. It also invests in people and ideas, for example through Acumen Academy — the world’s school for social change. Somehow, I was part of this amazing fellowship of change-makers ‘in-training’ before setting off for a year to work in an entrepreneurial venture Acumen had recently invested in; my fellowship would take me to Uganda, where I would work at a solar energy company.

Sitting there, devouring these readings and the conversation that ensued, I felt in equal measure confronted and inspired. It was, I dare say, a nurturing of the ‘hard-edged hope’ that is at the core of Acumen’s philosophy.

Acumen Global Fellows during Good Society Readings, 2015. Photo credit: Lujain Al Ubaid

One of Jacqueline’s core messages (at least to me) was that the opposite of poverty is not wealth, but dignity. And this notion, core to Acumen, of building a world based on dignity really stuck.

The opposite of poverty is dignity.

It also took me back a year or so to my experiences at Singularity University in 2014 as part of their Global Solutions Program — a 10-week immersive program with around 80 global innovators, many of them technology entrepreneurs, high on the hope and the promise of technology to solve the world’s most pressing problems. We were given one task: to imagine and prototype a solution that could leverage the latest technology to solve a problem for a billion people in a decade. No small feat.

I often felt like an outsider there, preoccupied with questions of why? and who for? that remained foreign territory at the time. Although the program has since been adjusted, discussions around ‘tech ethics’ or ‘being responsible’ were minimal. Of the 21 projects that ensued, our all-women group found ourselves in a lane all of our own. We moved away from the latest IoT gadget or designing for space and instead decided to leverage the humanitarian experience of our teammate, Selene Biffi, to apply the latest technologies to communities generally ‘left out’, putting power into the hands of people in post-conflict areas who are too scared to use their recently landmine-cleared land. Despite a lot of initial interest, a few awards, and considerable efforts after the Singularity University program to pursue the idea further, we ultimately failed to raise capital and, to this day, the problem persists.

The Bibak Team (Left to Right) — Selene Biffi, Me and Shirley Andrade

A broken ankle cut short my time at the Ugandan solar energy enterprise, yet Acumen’s call to moral imagination — “the humility to see the world as it is, and the audacity to imagine the world as it could be” (Novogratz 2020) — stayed percolating.

It percolated whilst I was part of the establishment and growth of a majority-Indigenous owned, staffed and managed consulting company. Grounded in values of truth, respect and self-determination, we worked out ways to build deep, trusted relationships to collaborate across sectors for change.

Little by little, I began to realise that what I really wanted to see was ‘a world based on dignity’ in the tech sector, particularly among entrepreneurs who are imagining, enacting and shaping our futures. Imagine the future we could create if we were able to merge the best of the Acumen philosophy with the depth of relationship-building at the heart of Indigenous ways of working and the promise of technology shared by enthusiastic tech entrepreneurs.

Figuring out what this reimagination could mean in practice led me to the Australian National University’s School of Cybernetics (SoCy) where I was able to explore technology from technical, social and ecological perspectives and dive into notions of dignity and their intersection with technology.

I did this through my Masters maker project, which explored what dignity-enabling technology could look like for my 91-year-old grandmother through the development of a tech-assisted prayer prototype and an online course for technology designers and developers, and through my capstone project, which began to develop notions of dignity as an ecosystem and explored the extent to which AI ethics instruments in the governments of Australia, the UK and Canada protect and/or promote dignity.

Left: Online shortcourse for technology designers and developers (2020) | Right: Whitepaper exploring the role of dignity in government AI ethics instruments (2021)

These experiences ultimately led to my PhD research today. This research aims to investigate leverage points that enable and thwart dignity-centred Artificial Intelligence (AI) development in entrepreneurial ecosystems. It hopes to play a small part in such a reimagination and to contribute to SoCy’s wider mission of creating a new branch of engineering and a new generation of practitioners who help others navigate major societal transformations, including those involving new and emerging technologies.

My PhD in a nutshell

What does my PhD focus on?

  1. It focuses on AI development, as a specific part of broader technology development and as a building block of cyber-physical systems, because of the simultaneous potential of AI models to dehumanise, discriminate and disempower at scale (see, for example, Benjamin 2019; Eubanks 2017; Noble 2018; O’Neil 2016) and, equally, to humanise, validate and empower.
  2. It focuses on entrepreneurs and the entrepreneurial ecosystem because of what Friedman & Hendry (2019) refer to as the interactional stance of technology — where humans shape technologies and technologies shape humans. Accordingly, there is much power and much responsibility associated with technology development, and entrepreneurs are often at the forefront.
  3. It focuses on dignity at the centre because of a belief that this may shape technologies in a ‘better direction’, in ways that put what it means to be human at the centre. This contrasts with approaches currently called ‘human-centred’, which put human wants at the centre. Putting human wants at the centre is not the same as putting what it means to be human at the centre. There is a subtle, yet powerful difference.
  4. It focuses on moving from principles to practice. This is intervention research; it starts from the stance that dignity-centred AI development is needed and valuable. It focuses on the ‘how’.

Who cares? Why is my PhD important?

  • We need demonstration cases of how to move from principles to practice if responsible AI is to become a reality
  • Dignity in the context of the design and governance of AI development is currently understudied, yet potentially very valuable
  • Discourse around “Responsible AI” usually focuses on the ‘big end of town’. Undoubtedly this is important. However, working out how to turn principles into practice is also relevant for (responsibility-aware) entrepreneurs. And what they work out may provide pathways for larger players too!

Around halfway into the PhD, I look forward to sharing more of its fruits in due course!

Lorenn Ruster is a social-justice-driven professional and systems change consultant. Currently, Lorenn is a PhD candidate at the Australian National University’s School of Cybernetics and a Responsible Tech Collaborator at the Centre for Public Impact. Previously, Lorenn was a Director at PwC’s Indigenous Consulting and a Director of Marketing & Innovation at a Ugandan solar energy company whilst a Global Fellow with impact investor Acumen. She also co-founded a social enterprise leveraging sensor technology for community-led landmine detection whilst part of Singularity University’s Global Solutions Program. Her research investigates conditions for dignity-centred AI development, with a particular focus on entrepreneurial ecosystems.

References:

Benjamin, R, 2019, Race after technology: abolitionist tools for the new Jim code, Polity, Medford, MA.

Eubanks, V, 2017, Automating inequality: how high-tech tools profile, police, and punish the poor, First edition, St. Martin’s Press, New York, NY.

Friedman, B & Hendry, DG, 2019, Value sensitive design: shaping technology with moral imagination, MIT Press, Cambridge, MA.

Noble, SU, 2018, Algorithms of oppression: how search engines reinforce racism, New York University Press, New York, NY.

O’Neil, C, 2016, Weapons of math destruction: how big data increases inequality and threatens democracy, First edition, Crown, New York, NY.

Novogratz, J, 2020, Manifesto for a moral revolution: practices to build a better world, First edition, Henry Holt and Company, New York, NY.

Ruster, LP & Snow, T, 2021, Exploring the role of dignity in government AI ethics instruments, Centre for Public Impact, viewed 19 March 2021, https://www.centreforpublicimpact.org/partnering-for-learning/cultivating-a-dignity-ecosystem-in-government-ai-ethics-instruments.
