How you learned to read might help you learn how AI works too!

Lorenn Ruster
Apr 28, 2024


You think you understand something, until you have to explain it to others. And this is why I am always so happy for opportunities to share some of my PhD work (on how startups that build AI-enabled products and care about ‘being responsible’ do that in practice) with diverse audiences. In the process, though, you realise how confusing it can all be.

This post shares how a little poll on how you learned to read might help with understanding the difference between machine learning and deep learning!

Are you #team_phonics or #team_wholelanguage?

Over the past few months, I’ve received a few emails, texts and calls, from people who I used to work with (mainly in consulting). The conversation goes a bit like this:

“this AI thing…
we need to be having a conversation about it…
but we’re not.
That’s your thing now, right?
Can you come and do some sort of session with us on what it is, please, so we can meaningfully engage in how it’s impacting us and our clients?”

Of course, you can’t cover all of that in a one-hour lunch’n’learn (surprise!). But what has been emerging is a real need to demystify what AI is and to talk about it in ways that are far more accessible to organisations that don’t work directly in AI.

As someone who is in this space day after day and has been in some ways since my Masters in Applied Cybernetics in 2020, you’d think that I’d understand AI by now. Working out a way to communicate it is a completely different ballgame.

What does asking how you learned to read have to do with AI?

When I was a kid, learning to read with phonics was not considered best practice. Many (many) years later, my Mum still tells stories about this: waiting for cards with ca-, da-, pa- sounds written on them to come home, to no avail. Instead, she was told that phonics was out and “the whole language approach” was in; over time, I’d work out that a long word starting with ‘b’ might be ‘beautiful’. There is a long-running debate between these two approaches (Dr. Jon Reyhmer refers to it as “The Reading Wars”), and since then, kids have been learning with one of these methods or a combination of both.

As I was trying to get my head around how I would explain the difference between machine learning and deep learning to an audience with a wide range of familiarity with AI, it occurred to me that these approaches may help!

A group of people answering how they learned to read (at this stage, confused as to the relevance of this question to demystifying AI).

#team_phonics ~ machine learning

If you’re #team_phonics, your experience is most aligned with how machine learning works. In the same way that you were shown loads of examples of different phonics in order to work out that a specific combination reads as “c-a-t = cat”, machine learning trains a model by showing it lots of examples of specific, human-chosen features of a cat (whiskers, pointy ears, tail, etc.) so it can predict “cat”.

#team_wholelanguage ~ deep learning

If you’re #team_wholelanguage, then this is more aligned with deep learning, which uses a neural network to solve a problem. When learning to read with this approach, no specific directions (or phonics) are given; rather, it is thought that over time, the child will recognise patterns in the text and the context around particular words to read them. Similarly, deep learning doesn’t tell the model to look for anything specific; rather, it learns patterns of features in the data and associates these with “cat” over time. For example, it might recognise that a cat is often next to a dog in a picture.

And of course, in practice, AI can also use a bit of both (as do many kids now who learn to read!).
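For readers who like code, the contrast can be sketched in a few lines of Python. Everything here is invented for illustration: the “images” are just four numbers each, the feature names are made up, and a single trained neuron stands in for a full deep neural network. The point is only where the features come from: in the first approach a human specifies them; in the second, the model works them out from raw data.

```python
# Toy illustration only: data, feature names, and models are all invented.
# Each "animal" is four raw pixel-like numbers; label 1 = cat, 0 = not cat.
raw_data = [
    ([0.9, 0.8, 0.7, 0.1], 1),  # cat
    ([0.8, 0.9, 0.6, 0.2], 1),  # cat
    ([0.1, 0.2, 0.3, 0.9], 0),  # not cat
    ([0.2, 0.1, 0.4, 0.8], 0),  # not cat
]

# #team_phonics ~ machine learning:
# a human decides up front which features matter (here, pretend the first
# two numbers measure "whisker-ness" and "pointy-ear-ness"), and the model
# is a simple rule over those explicit features.
def hand_crafted_features(pixels):
    return pixels[0], pixels[1]

def ml_predict(pixels):
    whiskers, ears = hand_crafted_features(pixels)
    return 1 if whiskers + ears > 1.0 else 0

# #team_wholelanguage ~ deep learning:
# the model is never told what to look for. It sees only the raw numbers
# and adjusts its own weights from examples. (One neuron stands in here
# for the many stacked layers of a real deep network.)
def train_perceptron(data, epochs=50, lr=0.1):
    w, b = [0.0, 0.0, 0.0, 0.0], 0.0
    for _ in range(epochs):
        for pixels, label in data:
            pred = 1 if sum(wi * x for wi, x in zip(w, pixels)) + b > 0 else 0
            err = label - pred  # learn only from mistakes
            w = [wi + lr * err * x for wi, x in zip(w, pixels)]
            b += lr * err
    return w, b

def dl_predict(pixels, w, b):
    return 1 if sum(wi * x for wi, x in zip(w, pixels)) + b > 0 else 0
```

Both approaches end up saying “cat” for the cat-like rows. The difference is that the first model was handed its cues, while the second discovered which of the raw numbers mattered, which is what lets real deep networks pick up unexpected cues (like a cat often appearing near a dog).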

A summary used in one of the lunch’n’learns is below.

Explaining the difference between Machine Learning and Deep Learning by connecting it to ways in which we learn how to read. Source: Lorenn Ruster presentation

What do you think? Could this be a helpful analogy for having a high-level understanding of the difference between machine learning and deep learning?

Currently, Lorenn is a PhD candidate at the Australian National University’s School of Cybernetics, a Responsible Tech Collaborator at the Centre for Public Impact, and a freelance consultant on responsible AI, governance, stakeholder engagement and strategy. Previously, Lorenn was a Director at PwC’s Indigenous Consulting and a Director of Marketing & Innovation at a Ugandan solar energy company whilst a Global Fellow with impact investor Acumen. She also co-founded a social enterprise leveraging sensor technology for community-led landmine detection whilst part of Singularity University’s Global Solutions Program. Her research investigates conditions for dignity-centred AI development, with a particular focus on entrepreneurial ecosystems.
