I’ve been working on this essay for a while.
It started as a comment in my journal after a discussion with my students, and it grew into one of the longest pieces I’ve written. Teaching a class on Emerging Technologies makes it almost impossible to avoid talking about one of the biggest technological shifts this generation is watching unfold in real time: the wide use of AI and Large Language Models in our lives.
Nothing against programmable matter, quantum computing, or gene editing, but my students seem to have a deep interest in everything that relates to Artificial Intelligence and its impact on our society. That interest isn’t limited to my students, though; it spreads across every circle I inhabit. My friends, coworkers, and family members are just as curious about the changes this new technology will bring.
The most common theme in these conversations concerns the relationship between AI and the future of work, specifically, having workers replaced by Artificial Intelligence.
I’m not familiar with most industries in the world, but I’ve spent my entire professional life in education and music, and I believe I have a solid sense of how AI is going to unfold in at least one of those fields. And it’s not music.
Artificial Education
The Starting Point:
The original prompt for my discussion about AI with my students was this post by David Perell, where he talks about his relationship with AI as a writer and educator.
Although I’ve never spoken to David, he’s one of the most important individuals in my career. Trying to be part of his “Liftoff” project, a high-schooler-based spinoff of his “Write of Passage” course, was the canon event that freed me from the shackles of self-doubt around pivoting my whole career to English, and it set me on a wonderful journey that has brought me to where I am today: teaching, writing, and podcasting in English. It goes without saying that I pay attention to what David writes.
In his post, he makes a series of predictions about AI and its integration with writing and teaching. Besides the one in the image, the other part that really resonated was this:
"The common thread here is humanity. People are also interested in people. Their stories, their struggles, their emotions, their drama, their unique insights into how the world works."
I’m fully on board with David here: leaning into your personal story is one of the most powerful moves you can make as a knowledge worker or content creator. When it comes to education, though, I have a few extra thoughts to add.
You can Automate Logic, but not Presence.
When ChatGPT started to become “a thing” years ago, a coworker at Entrepreneurial Gym messaged me saying something along the lines of “ChatGPT is coming for your position. We don’t need you anymore!”
We joked about it, and eventually he asked me if I was worried it could take my job, since it could design learning experiences at will. I laughed out loud and told him that, even if it could, we would still need real people to implement those workshops. Not only that, but back then, designing an excellent learning experience with ChatGPT demanded substantial prior knowledge of the topic; the alternative was relying on basic prompts and getting mediocre outcomes.
I was recently telling this story to a group of friends, and someone pointed out that I probably wasn’t laughing anymore.
You know what? I am.
I don’t believe AI will replace me, even if we reach a point where students attend school through Augmented or Virtual Reality instead of in-person or online classes.
Why am I so certain of this? Because Artificial Intelligence can’t be me.
I know that this might seem like the most obnoxious thing to say, but let me explain why I think it matters.
In 1988, Hans Moravec wrote a book called “Mind Children: The Future of Robot and Human Intelligence”. In it, he observed a notable pattern within AI. He noticed that tasks involving logical reasoning and calculation, which humans usually find more challenging, were relatively straightforward for machines. In contrast, tasks that humans considered intuitively easy, such as sensory perception or interpreting emotions, proved unexpectedly hard for AI systems. This is what we now call Moravec’s Paradox.
“It has become clear that it is comparatively easy to make computers exhibit adult-level performance in solving problems on intelligence tests or playing checkers, and difficult or impossible to give them the skills of a one-year-old when it comes to perception and mobility.”
Moravec
So, mathematical and logical problems, which most humans find challenging, are relatively simple for machines. Conversely, tasks that humans handle effortlessly, such as physical movement or interpreting emotions, pose significant difficulties for these systems.
And if you think about it, from an anthropological perspective, the human need for abstract reasoning is relatively recent in evolutionary terms.
“We are all prodigious olympians in perceptual and motor areas, so good that we make the difficult look easy. Abstract thought, though, is a new trick, perhaps less than 100 thousand years old. We have not yet mastered it. It is not all that intrinsically difficult; it just seems so when we do it.”
Moravec
The world has changed a lot since 1988, and the ability to think deeply has been growing in importance.
The different LLMs are of great use in assisting deep, human thinking. Are they coming up with new ideas by themselves? Numerous people have been prompting different models, trying to extract original ideas, and the results aren’t particularly impressive. But maybe they will be, soon enough.
So, if your goal is to build a career that is as antifragile as possible, it makes sense to leverage areas where machines still struggle, right? To learn how to do this, we must first understand the narratives that permeate societal views on work and careers.
The world is built around narratives.
For a long time, the prevailing narrative was: “Be a good student, go to a good college, get a good degree, and then you’ll get a good job”. For a lot of people, this is no longer true. Higher education is no longer a guaranteed pathway to professional life, so narratives were adjusted.
While some people still lived their lives believing that higher education would unlock infinite possibilities, others realized that the key was to become extremely good at a technical skill, joining the top 10% of a particular field.
While this is true, the strength of this narrative is also slowly eroding. For most fields, especially if you’re a knowledge worker, your technical skills no longer provide the edge they used to. Artificial Intelligence can outperform most people in technical tasks. Large Language Models give you access to an enormous amount of knowledge at a very low cost, democratizing information. People can use them to reach the previously hidden knowledge you held by being technically gifted at something. This goes back to David’s original post.
If AI can give you “80th percentile feedback on your writing”, why would people ask David or his team for feedback? They no longer need that. Even many programmers are seeing their “throne” taken away by things like “vibe coding” (turning plain-language prompts into code).
Once people understand that technical skills no longer hold the advantage they did over the past few years, the narrative that “AI will put us all out of work” is born and spreads.
So what can we do?
You can’t code Consciousness.
My thoughts about this started when I listened to “Life as No One Knows It: The Physics of Life’s Emergence”, an audiobook by Sara Imari Walker, where she mentions the 3 most profound challenges in Science:
The Hard Problem of Matter - how everything observable arises from the interaction of particles and forces;
The Hard Problem of Life - what fundamentally distinguishes living systems from non-living matter;
The Hard Problem of Consciousness - understanding the nature of subjective experience, or what it feels like to be oneself.
I want to focus on the last one.
Researchers from all kinds of fields (from computer science to philosophy) have been intrigued by this notion of “Consciousness” and how we can fully understand it.
A useful concept is “Qualia”, a term coined by C.I. Lewis, an American philosopher, referring to the subjective, individual aspects of perception, or the “what it’s like” quality of your personal experience.
What is it like for you to drink coffee? Or to experience the color orange? How can you know whether that experience is similar to or different from my own? We can both recognize coffee, or a certain color, but do we perceive them the same way?
These seemingly minor details are so deeply embedded in our personal experience, in our notion of who we are, that we tend to overlook how challenging they can be to convey to someone else.
If it’s hard for you to explain this, even though you get to experience it, how hard can it be for a machine?
So, bringing antifragility into your career in a world filled with Artificial Intelligence means leaning further into your “Real Experience”: your unique vision, your story. Not just through your experiences and writing, as David points out in his article, but also through what makes you unique when interacting with other people, regardless of the medium.
If you’re an educator, you get to surpass AI by being “more you”. It’s not technical, it’s personal. That makes all the difference.
No AI can be the Only Person In Portugal.
The growth of LLMs has been a big part of my journey at The Socratic Experience. It has provided the perfect context in which I get to try different ways to surpass artificial intelligence in the classroom.
What I’ve realized is that many of the things I do by embracing my personality and personal story are nearly impossible to replicate. This isn’t just because they depend heavily on my unique context (after all, I’m the one living this life), but also because, from a learning perspective, they don’t follow a logical pattern. Still, they work beautifully precisely because there’s no manual involved, nothing that could simply be handed over to a machine.
Let me walk you through some of these.
When I joined TSE, students were extremely intrigued by my age, since I was younger than most of the other guides. I played into that mystery and curiosity, keeping my age private and letting them guess how old I was. Eventually, as a joke, the class decided that I was 140 years old, which became the basis of many jokes throughout the year.
Don’t get me wrong, I’m not complaining. Quite the opposite, I leaned into it fully. Anytime we talked about something from the early 20th century, I’d chime in with a remark about how I remembered it clearly, having lived through it myself. This became a meme that spread outside of my cohort, making other students wonder about my age.
In another recurring gag, I was the only person alive in Portugal and every person that I knew was actually an imaginary friend. They would even “correct me” when I talked about anyone, by saying “your imaginary friend” or “your imaginary girlfriend”. Once again, I leaned into that and started to proactively frame it that way.
From a purely educational standpoint, this might seem pointless. It’s not something any LLM would recommend you do as a guide in a class, because it seems to offer no clear benefit. But it does.
By making these jokes and embracing them, I’ve achieved one of the most important elements in any learning process: connection. All of them knew that I wasn’t really 140 or the only person alive in Portugal. But the fact that I ran with it made them feel more engaged, the environment got lighter, and that facilitated connection.
Humor fosters an emotional state of playfulness, which, in my experience, enhances learning.
However, it’s important to notice how much these stories rely on unique, personal elements (being from Portugal and being “young”) and how much they’re focused on the relationship with students. That relationship is key to providing what students both want and need.
There isn’t much that AI can’t know 10x better than I do. But I have stories of things that have happened to me and those around me, experiences that have shaped me, and, more than anything else, human emotions that allow me to create an emotional landscape that can be used to design the best learning experience.
Unfortunately, this framework isn’t widely used by educators.
You should be leaning heavily into your own personal life, as a way to breed connection and complement whatever knowledge you’re sharing with an emotional landscape that starts in your relationship with the student.
That’s something no machine can copy.
The Machine Waits, Your Agency makes it move.
A couple of weeks ago, I hosted a salon on George Mack’s essay “High Agency”.
At a certain point, the conversation shifted toward the connection between what Mack called “the most important skill in the 21st century” and the current state of technology.
These tools (from LLMs to any other product built on artificial intelligence) are widely accessible. However, they demand curiosity and agency from those who use them. Unprompted, there’s nothing an LLM can do. The secret, then, is to proactively engage with it, whether you’re a student, educator, worker, or manager.
Part of what can distinguish an educator is the understanding of which ideas to connect. Even if the final construction is done through LLMs, the building blocks are given by educators. An educator might not have the capacity to connect all the different ideas alone, but they have enough breadth of understanding to properly prompt the machine toward original results.

In an age of AI-enhanced education, educators must value their own learning journey, read across fields, and maintain an open mind, so they can collect sources that can then be connected in novel ways, providing insights for their students.
Your role is no longer to provide the information, but to help students turn it into knowledge and, hopefully, wisdom.
What’s your sample?
The internet has popularized information silos: echo chambers where an algorithm delivers opinions that already match our views, creating the illusion of informational diversity.
Now, don’t get me wrong. I love Artificial Intelligence and, as I’ve said before, I think it has the potential to transform education. But if you run in certain circles on X or Substack, it’s easy to think that AI will shift education for everyone, and self-directed learning is the best thing that happened to the field. I can’t agree with those generalizations.
My challenge, if you hold this belief, is to speak with 10 students spread across 3 different countries and a variety of socio-economic backgrounds. I truly believe that, if you do this, your bubble will burst. A lot of students are far from having that kind of relationship with AI.
Now, don’t get me wrong. I do believe AI will become as widely used as the internet, and students from all over the world, in time, will adopt it and transform their lives. I just don’t think we’re there yet. Especially if we’re talking about a transformative experience, a bigger sense of ownership over what they learn, not just the colloquial “write me an essay on Crime and Punishment”.
Just because a small group of highly motivated students successfully uses AI to achieve their best learning experiences, with the only human element being a coach rather than a teacher, it doesn’t mean the same thing will necessarily happen at a larger scale. Access to AI alone won’t produce a wave of modern Einsteins and Hemingways. It can facilitate people’s learning experiences, as I’ve mentioned before, but amongst students you’ll still find different levels of focus, commitment, and curiosity.
Gathering a group of students that has all of those traits, giving them AI, and watching them change the world is a great idea. To assume that that’s replicable for everyone makes no sense to me.
This is why I believe that using AI to help students discern what impact they want to make is far more valuable than using it just to learn. For now, we still need humans to guide students through their learning journey, inspiring them to become more agentic and curious. But the interconnection between these 2 elements is, in my view, where the perfect harmony lies.
In “The Man Who Solved The Market”, Gregory Zuckerman explores the life of Jim Simons, founder of Renaissance Technologies and its highly successful Medallion Fund, both known for transforming financial markets through algorithm-driven trading. Toward the end of the book, Zuckerman recounts an ironic moment during the stock market collapse of 2018. Despite Simons spending three decades relying entirely on algorithms and artificial intelligence to make trading decisions, in a moment of crisis, he ignored his computers and instead called another investor. Both men sought comfort and reassurance from each other’s insights, rather than the rational but impersonal response of a machine.
At the end of the day, human connection wins.
But not everyone working in education is interested in doing that.
Are you a Teacher or an Educator?
Before I begin, I want to acknowledge that I’ve had some phenomenal teachers throughout my life. Unfortunately, I’ve also had a few bad ones—and in conversations with friends and family, I’ve learned that this mix is all too common. So when I refer to “teachers” in this section, I’m speaking in general terms. It’s simply for the sake of clarity and flow, rather than constantly saying “most” or “some” teachers.
Identity and Behavior
I never thought of myself as a teacher, even though I’ve been teaching for a long time. Excluding my history summaries and explanations for my colleagues in the 5th and 6th grades, I taught my first class when I was 15. It was as weird being called a teacher then as it is now.
One of the things that I really appreciate at The Socratic Experience is that every single person teaching there is actually referred to as “a guide” instead of a teacher. It might seem like a small, semantic difference, but I think it carries a deeply meaningful distinction.
While doing my first course on Neuro-Linguistic Programming, years ago, I got obsessed with the concept of “The Neurological Levels” by Robert Dilts. In it, he laid out a system to facilitate change in human beings by addressing different levels of human perception.
The pyramid starts with the external world, things outside of you that still shape your perception. From there, it moves through behaviors, abilities, beliefs and values, your sense of identity, and, finally, your mission. Interestingly, both the base and peak of the pyramid are connected to things beyond the self, with the former being an influence from the outside, and the latter a transcendence on a mental or spiritual level.
During my own practice and research, I found a shortcut within the pyramid: it’s possible to move between non-adjacent levels, specifically between identity and behavior. Who you believe you are can shape what you do, and vice versa: your actions can redefine your sense of self.
In 2018, James Clear popularized this idea of identity and behavior being linked with his famous book “Atomic Habits”, even though the notion is much older, with Jung and the Stoics writing about it long before.
Who we are dictates what we do. What we do influences who we are.
The identities we choose to adopt have a profound impact on our behaviors.
So, a question arises.
Why don’t I identify as a teacher?
Here’s my problem. A lot of what I do, most of my behaviors, are associated with the figure of a teacher.
I teach a couple of classes, I design learning experiences and curricula, I give out assignments and provide feedback to students afterwards. And yet, I cringe at the thought of being called a “teacher”.
As I mentioned earlier, every identity comes with a set of associated behaviors. While many of my actions do align with the identity of a teacher, I have to be very deliberate in choosing which behaviors to adopt. Without this careful selection, I risk falling into patterns I’ve seen other teachers follow, which may contradict my own beliefs about what education should be.
When I was a student, I often clashed with teachers. I constantly questioned their authority. Who gave them the right to decide what I should learn, why it mattered, and based on what reasoning? That rebellious instinct got me into trouble more than once. A few teachers took the time to explain the purpose behind their lessons, and I respected that. But most relied on the “I’m the teacher, I know best” attitude, something I was never able to tolerate.
Because of this, when I found myself in a teaching position, I was very wary of my own approach, trying my best not to be hypocritical and actively resisting that identity.
Students are not the audience. They are actors.
I remember reading Paulo Freire’s “Pedagogy of the Oppressed” and being surprised by how well he articulated this phenomenon. He called it “the banking concept”: we perceive teachers as holders of precious knowledge, and students’ only job is to “save” that knowledge in their minds.
Unfortunately, a lot of teachers still see themselves this way: a special kind of human whose word must be taken at face value, without any kind of questioning.
One of the (many) reasons to love alternative education is the active challenge to this particular stigma. There’s no sense of superiority, but a genuine effort to collaborate with students. “Teachers” actually share meaningful knowledge while also listening to the students’ perspective and way of understanding the world.
Regular classes, with regular teachers, don’t happen that way.
The focus is often placed on the teacher, while students are expected to behave like what a colleague at The Socratic Experience aptly calls “audience members.”
The best educators, instead, meet students where they are, with a genuine sense of curiosity towards the student, rather than putting themselves in a position of superiority.
Can you come down from your pedestal?
In Portuguese, there’s an expression (“descer do teu pedestal”) which is similar to “get off your high horse.” It captures the kind of shift that AI will demand from teachers: a move away from rigid authority toward greater adaptability.
The changes I’ve been discussing in this essay require educators to be flexible and open to transformation, which are qualities that often clash with the traditional image of the teacher as a figure of unquestioned superiority.
If, as a teacher, you have convinced yourself that “you know best”, you’re in for a treat. Just not a very sweet one.
Knowledge, as we’ve seen, no longer carries the same weight it once did. LLMs make information widely accessible, permanently shifting the role of the teacher. It’s no longer about WHAT to teach the student, but how to help them process the information and apply it in a meaningful way.
Education is no longer about delivering information but guiding transformation.
We can access new knowledge in so many different ways. I can read a book, attend a class, or listen to a podcast, to name some obvious ones. And yet, true learning experiences demand a deeper emotional dimension that can only be activated through interaction with other humans. For the longest time, in the education industry, teachers had complete control over the learning environment, with no real competing sources of knowledge. As a result, many lost sight of this deeper, emotional aspect of a class, promoting a decline in the quality of the learning experiences.
Josh Wolfe, from Lux Capital, talks about this often, claiming that publicly funded schools offer such a high degree of safety for teachers and such an absence of competition that the quality of the offering can only go down. Suddenly, teachers no longer need to enjoy what they’re doing, they just need to make sure they’re doing the minimum work in order to preserve their job. Fortunately, some teachers love their students or, at least, the subject that they’re teaching. But that’s far from being the norm in a public setting.
This is yet another difference between teachers and educators: how much do they actually love what they do and the people in front of them?
Can you take a moment to care about your students?
I’ve always been passionate about people. When I co-founded Academia do Sim, our corporate training company, some teachers said my enthusiasm for education came from working only with adults. Later, at What Drives Youth, they suggested my energy came from not repeating cohorts. At Entrepreneurial Gym, they thought it was because the work wasn’t daily. Now, at The Socratic Experience, I work with the same cohort every single day—and I genuinely love it. They’ve run out of excuses.
I often stay up late (2 or 3 a.m.) just to chat with a few students on the other side of the world after their school day ends. And I genuinely love these conversations. I’m deeply interested in their perspectives and fully committed to helping them push past their limitations through learning.
A lot of people might think this is counterproductive, but I wholeheartedly believe that the relationship I build with each student is what will make a difference in their journey with me.
Maybe because I don’t come from a traditional teaching background, I tend to see students as both learners and customers in all the projects I’ve worked on. I want them to grow and feel genuinely delighted, and the only way to achieve that is by building a strong relationship with them.
This is because those two identities sometimes come into conflict: in order for a student to grow, the customer may not always be delighted in the moment. The key to making sure they are in the long run? The relationship.
Some teachers are highly resistant to this idea of “delighting” the customer. They think that this is the equivalent of letting students do whatever they want to. It’s not. It’s about challenging them as learners (what they need), while still trying to find opportunities to provide a great experience (what they want).
Can you do that? Can you help a student go through the emotional frustration of not understanding something even after asking AI to explain it in 10 different ways? Can you think of yet another alternative way to explain it? Do you care enough to do it?
Can you go out of your way to understand the cultural landscape that your students inhabit, so you can articulate ideas in terms that resonate with what they know, while never compromising the original thought? Can you become a knowledge tailor, making sure that each idea fits the mind of the student in front of you?
If you do, AI will never replace you.
An attempt at tying everything together
I feel like there are 2 different parts to this essay, both connected by a burning personal desire to promote education as the key element in achieving a Modern Golden Age.
Artificial Intelligence and Large Language Models are powerful tools, there’s no denying it. Their integration is imminent. But they won’t replace you as an educator, assuming you’re not positioning yourself as superior to your students.
Progress demands adaptation from all of the stakeholders in education, which implies raising the standard of what an educator should be.
If this is already part of the way you think about your own work, then there’s nothing to worry about. If you see your own learning journey as something deeply important, and you’re eager to understand this integration and strive for better things, this is just the next step in a sequence of everlasting improvement.
The stage is already set. We already have every single tool to transform education and build a Modern Golden Age.
It’s time to get to work.