
Maybe ELM Can Help You Learn – the LEARN Framework.

As someone who is really interested in developmental psychology, I often find myself thinking about how different tools shape our habits and influence our cognitive behaviours, particularly when it comes to thinking and learning.

The kind of thinking I’m referring to is the kind that happens when you’re halfway through solving a problem or trying to understand a concept, and your first instinct is to open ELM and ask it for help. For those unfamiliar with it, ELM is the University of Edinburgh’s platform that provides staff and students with access to generative AI models in a secure environment. You can find more information about it here: https://elm.edina.ac.uk.

Sometimes that instinct feels completely reasonable. After all, tools like ELM are designed to help us make sense of things. They can explain complex ideas, summarise long readings, and even help us spot mistakes in our reasoning. But recently I’ve found myself pausing and asking some slightly uncomfortable questions:

  • When does asking for help become asking something else to do the thinking for us?
  • At what point does this habit start hindering learning?
  • And at what point am I no longer learning something new from the work I am doing?

The temptation of instant answers

One of the most remarkable things about generative AI is how quickly it can produce an answer.

  • Stuck on a piece of code? Paste it in and ask ELM to fix it.
  • Not sure why a concept isn’t making sense? Ask for a simplified explanation.
  • Struggling to structure an argument? Ask it to suggest an outline.

In many ways, this is genuinely useful. It lowers barriers and can make difficult ideas more accessible. But it also creates a new habit, one where the moment we encounter friction, we try to remove it. And learning, as inconvenient as it may sound, often involves a bit of friction.

Learning rarely feels smooth

Think about the last time you properly understood something difficult. It might have involved rereading a paragraph several times, trying different approaches to a problem, or slowly piecing together why something wasn’t working. Those moments can feel frustrating at the time. But they are also often the moments where understanding begins to form. If we immediately hand that moment of struggle over to AI, something interesting happens: the problem gets solved, but the thinking process that leads to understanding may quietly disappear.

This is what psychologists sometimes refer to as cognitive offloading: letting tools take over mental tasks that we might otherwise do ourselves. And with tools like ELM, cognitive offloading can happen very easily.

The responsibility still sits with us

It’s tempting to talk about Generative AI as if it’s something that simply acts upon us, a powerful technology that changes how we learn. But the reality is slightly less dramatic and slightly more complicated. Tools like ELM don’t decide how we use them. We do. We choose whether we ask for the final answer, or whether we ask for help understanding the process behind it.

For example, there is a big difference between asking “Fix this answer” and asking “Explain where my reasoning might be incorrect.” Both prompts involve the same tool, but they lead to very different kinds of learning. In one case, the problem disappears; in the other, the thinking continues.
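To make that contrast concrete, here is a hypothetical example (the `average` function and its bug are my own invention, not something from ELM or the coursework). A learner might paste a snippet like this and then choose between the two kinds of prompt:

```python
def average(numbers):
    """Intended to return the arithmetic mean of a list of numbers."""
    total = 0
    for n in numbers:
        total += n
    # Bug: divides by len(numbers) - 1 instead of len(numbers)
    return total / (len(numbers) - 1)

# average([2, 4, 6]) returns 6.0 rather than the expected 4.0.
#
# Prompt A: "Fix this function."
#   -> the problem disappears, but so does the thinking.
# Prompt B: "average([2, 4, 6]) gives 6.0, not 4.0. Where might my
#   reasoning about the denominator be going wrong?"
#   -> the debugging, and the learning, stay with you.
```

The tool and the broken code are identical in both cases; only the question changes what you take away from the exchange.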

A small framework for learning with Generative AI

While thinking about this, I found myself sketching a simple way of approaching Generative AI tools when learning something new. It’s not a prompting framework, and not a strict method, but more of a reminder of how we might keep the learning process intact when using tools like ELM.
I’ve been calling it the LEARN framework.

The idea is fairly simple: Generative AI can support learning, but the learner should still remain actively involved in the thinking process. The framework provides a way to use Generative AI while making sure that you are still developing your own skills, understanding, and knowledge.

L — Learn first

Start by attempting the problem yourself. This does not mean you have to solve the entire problem on your own, but it does mean beginning the thinking process. Consider how you might approach the task based on your existing knowledge. What do you already know about the problem? What might be a possible starting point? At this stage, try to avoid using Generative AI and spend a few minutes engaging with the problem independently.

E — Explain your thinking

Now provide ELM with the problem and explain your thinking. Share the reasoning, ideas, or partial solutions you have come up with so far. If the problem is completely new to you, you can simply explain what you understand about the question itself and ask whether your interpretation is correct. This step encourages reflection and ensures that you are actively engaging with the material rather than immediately outsourcing the task.

A — Ask for guidance

Once you have explained your thinking, ask Generative AI for guidance. Questions at this stage might include:

  • Is my understanding of the problem correct?
  • Am I approaching this the right way?
  • Is there another way to think about this?

The key here is not to ask for the final answer, but to ask for the next step in the process.

R — Reflect on feedback

When you receive a response, take time to understand it. Compare it with your original reasoning. Does the explanation give you new insights? Does it change how you think about the problem?

If something is still unclear, you can repeat the E and A steps: explain what you understood from the response and ask for further guidance. You might ask ELM to break the explanation down into smaller steps or clarify a specific part of the process.

N — Now check your solution

After working through the cycle of Explain, Ask, and Reflect, you can use ELM to check your solution. At this stage, ask yourself a few questions:

  • Did I complete the task successfully?
  • Do I understand the reasoning behind the solution?
  • Could I explain this concept to someone else?
  • Would I be able to approach a similar problem on my own in the future?

The goal is not to slow down the process unnecessarily, but to ensure that Generative AI remains a learning partner rather than a replacement for thinking.

Using Generative AI in this way helps ensure that you are still developing the skills needed to solve problems independently. Getting to the final answer is useful, but understanding how to arrive at that answer yourself remains just as important.

Learning rarely happens without a little friction. The challenge when using Generative AI tools like ELM is making sure we don’t remove that friction entirely.

What this looks like in practice

Of course, frameworks are often much neater on paper than they are in real learning situations. In practice, learning rarely follows a clean sequence of steps, and using Generative AI while learning is often far more experimental and fluid than any framework might suggest.

Most of the time, using Generative AI for learning looks much less structured and much more exploratory. For me, it often means using Generative AI to reshape information so that it connects more naturally to how I think about things.

For example, when I encounter a concept that feels too abstract, I sometimes ask ELM to translate it into something more visual or concrete. At other times, I try to connect new information to things I already care about or understand. If I’m learning something technical, I might ask ELM to explain the idea through the lens of education, learning behaviour, or psychology, because those are areas I naturally think about. This helps turn unfamiliar ideas into something that feels more relatable and easier to reason about.

Another approach I often use is asking ELM to help me check my reasoning rather than give me the answer. For example, I might outline how I think a concept works, or offer an analogy of my own, and then ask, “Does this interpretation make sense, or am I misunderstanding something?”


Keeping the friction in learning

What I’ve realised over time is that the most useful role Generative AI can play in learning is not to remove all difficulty, but to help us navigate it more effectively. Some friction is not just unavoidable in learning; it’s necessary. Moments where something doesn’t quite make sense, where we pause, rethink, and try again, are often the moments where understanding begins to develop.

Generative AI can easily smooth over those moments by providing immediate answers. But if we rely on it that way too often, we risk skipping the very process that allows knowledge to form. In other words, knowledge is rarely built without a little friction along the way.

The challenge when using tools like ELM isn’t simply deciding whether to use them. It’s making sure that, when we do, we’re still engaging with the thinking that learning requires.
