Monday, April 27, 2026

Lifelong Unlearning

I was talking to a colleague the other day, and we were discussing how our formal systems always assume that learning is additive. We are always talking about more courses, more skills, more knowledge. But in practice, growth also comes from subtracting. Anyone who has worked hard to drop a bad habit knows how tough it is to let go. Whether it is a habit, our assumptions and biases, or even poor work practices and ways of working, "unlearning" is hard to do but leaves nothing to show for it. There is no concrete evidence of unlearning, no artifacts or portfolio to demonstrate what and how much we have unlearned. We don't give out certificates for what people have stopped doing!

Unlearning is subtle, but it is powerful. After unlearning my fair share, I know that it has made space for better judgment, greater adaptability, and more thoughtful practice.


I wrote about this previously, in 2012, in my blog post titled Emptying Your Cup: Unlearning to Learn.

The point I am making is that not all growth comes from learning something new. Sometimes, it comes from letting something go. And we have to be intentional about it.

What did you let go of recently that contributed to your growth? How do you encourage others to unlearn? How do we design "learning" systems that also enable and support unlearning?

Thoughts?

----------

Additional reading:

  • Chapter 15: Learning, relearning and unlearning
    Open Books and Proceedings, November 2024. DOI: 10.38140/obp1-2024-15. License: CC BY 4.0.
    Authors: Rosemary Akinyi Menya-Olendo and Lucy Mawang, Kenyatta University.


In this chapter, the authors describe the following key terms:

Learning is contextualised as the continuous ability to acquire competencies relevant for the 21st century. This form of learning enables people to continuously improve their performance, expand their horizons, and equip themselves for the future. It further allows individuals to grow and develop personally and professionally throughout their lifespan. Turk (2023) observes that this type of learning occurs in various contexts, such as formal education, informal learning, and experiential learning. 

Unlearning means leaving behind old, outdated, and obsolete knowledge that is deemed inefficient in addressing current challenges. It therefore entails questioning one’s assumptions and beliefs, and opening up to new perspectives that can help solve present problems. No wonder Turk (2023) postulates that the unlearning process may be challenging, as it necessitates confronting personal biases and preconceptions.

Relearning is the process of learning something again, often in a new or different way. It involves building on previous knowledge and experiences to gain a deeper understanding of a subject or skill. Relearning is important because it allows individuals to update their knowledge and skills in response to new information and changing circumstances (Turk, 2023).



"Multimodal large language models (MLLMs) are trained on massive multimodal data, making data unlearning increasingly important as data owners may request the removal of specific content. In practice, these requests often arrive sequentially over time, creating the problem of MLLM Lifelong Unlearning."


Tuesday, April 14, 2026

Lessons from the Far Side of the Moon

“human in the loop”

I am sure you have seen and heard this phrase that has become almost a default in conversations about AI and work. I have been thinking about it for months. It is supposed to be a reassuring phrase. That, in everything that AI is taking over, humans are still somewhere in the system overseeing, validating, and intervening when needed. But in the last few weeks, the more I have mulled over this phrase, the less reassured I have felt.

But something shifted last week.

Image credit: NASA
Seen during Artemis II’s lunar flyby on April 6, 2026, the Moon and Earth align in the same frame, each partially illuminated by the Sun.

On April 10, 2026, four astronauts completed a mission to fly around the far side of the Moon. They went further from Earth than any humans in over 50 years. On their way back, for about 40 minutes, they were completely cut off from NASA mission control. 

There was no signal and no loop; it was all human.

While I am not 100% sure, I bet the team at NASA focused a lot on building and nurturing the competencies of the four astronauts for this 40-minute blackout, when the Orion module would be cut off entirely. All these efforts would go into preparing the humans to respond to unknown and unpredictable situations, and trusting that the humans inside the spacecraft would know what to do.

Image credit: NASA
The Artemis II crew – (clockwise from left) Mission Specialist Christina Koch, Mission Specialist Jeremy Hansen, Commander Reid Wiseman, and Pilot Victor Glover – pause for a group photo inside the Orion spacecraft on their way home.

NASA did not send AI to the Moon; they sent humans, because fundamentally, there are many things where humans are not just meant to be in the loop, they are meant to be in the lead. I was reassured by a mission where humans were not just a checkpoint or validation point in the system; instead, they were the ones leading the design and the decisions. And if you have seen the pictures and the beautiful, poetic expressions of the astronauts reflecting on what they were seeing and feeling and learning, you will agree that humans are not just validators; we are value creators.

It was Accenture CEO Julie Sweet who said, "AI future should be human in the lead", and I love this reframing. The difference between loop and lead is not a subtle one. It is a powerful way to think about how organizations can use AI without losing their own ability to think, learn, grow, and become more intelligent.

When an organization like NASA sends humans to space through missions like Artemis II, it is not because machines are incapable. It is because, in environments where there is ambiguity and uncertainty and the consequences are as real as it gets, we can't afford to outsource our judgment.

So, as AI reshapes our work and our world, instead of asking whether we keep humans in the loop, we have to think more intentionally about how to keep humans in the lead! This means designing roles, systems, training, and work so that humans continue to build their capability and thinking, and continue to feel, express, question, and engage with each other.

I have said this before. I don't think the real risk of AI is replacement. I think it is the erasure of our unique fingerprints: the breadcrumbs each of us leaves behind that reveal who we are through our choices, decisions, and connections with other humans.

In the race to AGI, if we focus only on making AI more intelligent, we may end up designing systems and organizations that make us less intelligent. Which also means that perhaps the real work ahead is not just building smarter AI, but redefining human intelligence itself.

Tuesday, March 3, 2026

Learning and Development in the New World

I finally read Donald H Taylor's L&D Global Sentiment Survey (GSS) 2026 report last week, and I haven't been able to put it down. No, it didn't have all the answers, but what it did say confirmed something that I have been watching build for years.

Donald calls it the “New World”. 

For me, the most striking data point in the entire report was that the word "human" appeared in 64 challenge responses this year, for example in comments like: “how to properly utilize AI while not removing the human element”. The word "human" didn't appear at all in 2022 or 2023.

Donald's read on this: "On the surface, this comment is about design, but it could also be about something more profound: about L&D's very sense of identity coming under threat from AI." (p.21).

I think that's exactly right.

Donald notes that "The biggest riser this year is 'Showing value', up from #7 last year to #5 this year, with its highest-ever share of the vote. Anecdotally, it seems L&D is feeling the pressure to justify its existence."

Yes. We don't have a technology or AI problem; we have an existential problem about who L&D is and what our role is in this New World.

Donald frames L&D's new role as "a shift from content creation to performance consulting, from training delivery to capability ecosystems, and from reactive service to proactive partnership" (p.17)

I agree. But I don't think this shift will happen through better technology or more AI. Instead, it will happen through bringing even more humanity into our work.
I think the shift is contingent on L&D practitioners who know their craft deeply enough to make it human, visible, and connected to outcomes that matter to the business.

The survey's conclusion lands on a note of cautious optimism: "while we have no map, we do have a direction." (p.22)

Yes, we may have a New World, and the directions to get there may look intriguing. But after 25+ years in this field, working at the intersection of workforce development, competency assessment, and learning design across industry sectors, I can say that the destination for L&D has always been the same.

Does the worker know more, perform better, and contribute more fully as a result of what L&D did? Everything else is just the method we chose to get there.

The GSS 2026 report is available at https://donaldhtaylor.co.uk/research_base/global-sentiment-survey-2026/

Well worth your time.

#LearningAndDevelopment #WorkplaceLearning #TalentDevelopment #GSS2026 #HumanFirst #PerformanceConsulting #FutureOfWork