In praise of friction: Why the future of AI needs more resistance
In an AI-driven world obsessed with seamless efficiency, businesses must intentionally design “positive friction” into systems to preserve human judgment and accuracy
Many firms are directing a significant portion of their Artificial Intelligence (AI) and technology investment toward tools that improve customer experience, productivity, and automation.
Australian firms surveyed by the Reserve Bank of Australia (RBA) report that technology investment has grown significantly, with many planning to increase investment in AI and related tools. However, adoption is still largely at a pilot or early stage. These firms expect AI to boost productivity in the long term, although there is considerable uncertainty about the timing and magnitude of those gains. Barriers such as skills shortages and regulation have so far limited the productivity improvements realised.
Where would investment and improvements generate the most value for businesses and their customers? AI promises a world without friction – one that is faster, easier, and seamless. But MIT behavioural scientist Professor Renée Richardson Gosline says our obsession with frictionless systems might actually be slowing us down.

In conversation with Professor Paul Andon, UNSW Business School Interim Dean and Senior Deputy Dean (Education & Student Experience), at the World Business Forum held in Sydney this year, Prof. Gosline argued that a degree of productive resistance – what she calls “positive friction” – is essential to keep humans thinking critically and organisations steering in the right direction.
“Friction is a force. It’s neither good nor bad. It’s a force to be harnessed – and the objective is to use it for optimal good,” Prof. Gosline said.
The hidden cost of a frictionless world
Prof. Gosline’s research began at home. As a new mother trying to juggle work and parenting, she turned to technology for help. Facial recognition unlocked her phone without any effort, and a smart monitor listened for her baby’s cries. “I wanted to get it right,” she said. “And as I realised I was adopting more of these tools, I started thinking, hang on – this is really changing the way I behave.”
Her personal reliance on convenience sparked a decade-long inquiry into how removing “pain points” from daily life might come with hidden costs. “We’ve built a culture that equates friction with pain – and sees its removal as progress. But that assumption hasn’t been proven,” Prof. Gosline said.
In business, this mindset translates into the pursuit of seamless workflows, instant insights, and one-click decisions, often powered by AI. Yet, Prof. Gosline cautions that what appears to be efficiency could be eroding deep and critical thought. “AI removes effort,” she said. “But when we remove too much friction, we may also remove the very cognitive processes that make us human – deliberation, judgment, creativity.”
Professor Andon said this challenge extends beyond individual decision-making to how organisations learn and grow. He asked: if junior staff rely too heavily on AI to think for them, what happens to the future of leadership and problem-solving? “If people no longer engage in effortful thinking, they lose the mental muscle needed to handle situations not found in an AI’s training data,” Prof. Gosline said.
In her upcoming book In Praise of Friction, Prof. Gosline defines friction as any deliberate pause or cognitive resistance that helps balance a system, particularly one increasingly governed by automation (and therefore AI). “When you remove friction from one touchpoint, there are effects elsewhere in the system. Everything is connected,” she said. “Think about a runner. Friction allows acceleration and pivot. Without it, the runner slides out of control. You can move quickly, but not strategically.”
For businesses, this means designing workflows that balance automation and human reasoning. “We want organisations that can accelerate and change direction. So yes, remove pain points – but also place friction intentionally where it helps you pivot,” she added.
Testing friction in the field
Through her research with global firms like Accenture, Prof. Gosline has experimentally introduced friction into AI-assisted workflows to identify what she calls the “Goldilocks zone” – not too much, not too little.
In one study, participants used generative AI to create content. Some received results instantly, while others encountered small delays or prompts encouraging review. The outcome? “Adding beneficial friction at the point of AI output significantly improved accuracy. People caught more errors and produced better work – without meaningfully increasing task time,” Prof. Gosline said.
In another experiment, teams informed that humans had been involved in the AI process rated the outcomes as higher quality and more persuasive. “Human involvement creates a sense of authenticity. It reassures customers that care and accountability are still part of the system,” Prof. Gosline said.
Her research has also found that friction doesn’t have to come from humans at all. “AI itself can prompt friction. We’ve programmed systems where one AI checks another’s work – flagging potential errors and telling the user, ‘You might want to take a second look.’ That nudge is friction, and it works.”
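The pattern Prof. Gosline describes – one model reviewing another’s output and nudging the user to look again before accepting it – can be sketched in a few lines of code. The example below is a minimal, hypothetical illustration: the `generate_draft` and `review_draft` functions are stand-ins for whatever models or services an organisation actually uses, and none of it describes her team’s implementation.

```python
# Hypothetical sketch of "AI checks AI" friction: a second model reviews a
# draft and, if it flags concerns, the user is nudged to take a second look
# before accepting the output. Names and checks are illustrative only.

def generate_draft(prompt: str) -> str:
    """Stand-in for a generative model producing a first draft."""
    return f"Draft response to: {prompt}"

def review_draft(draft: str) -> list[str]:
    """Stand-in for a reviewer model that flags potential issues."""
    flags = []
    if "2019" in draft:          # e.g. a possibly outdated figure
        flags.append("Check whether this statistic is still current.")
    if len(draft.split()) < 20:  # e.g. a thin, unsupported answer
        flags.append("The draft may be missing supporting detail.")
    return flags

def respond_with_friction(prompt: str) -> str:
    draft = generate_draft(prompt)
    flags = review_draft(draft)
    if flags:
        # The nudge itself is the friction: a deliberate pause before acceptance.
        print("You might want to take a second look:")
        for flag in flags:
            print(f" - {flag}")
        if input("Accept the draft anyway? (y/n) ").lower() != "y":
            return ""  # send the user back to revise or regenerate
    return draft

if __name__ == "__main__":
    print(respond_with_friction("Summarise our Q3 performance"))
```

The design choice to block acceptance, rather than silently correct the draft, is the point: the user is pulled back into the loop at exactly the moment the output looks finished.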
However, friction must evolve dynamically. “Humans acclimate quickly. What feels like friction today can fade into the background tomorrow. So organisations need to adapt their friction points over time,” she said.
Cognitive bias and the need for guardrails
Beyond workflow design, Prof. Gosline’s research has exposed two major cognitive biases that complicate human-AI collaboration: overconfidence bias and anchoring bias.
First, users are poor at gauging how much benefit they actually get from AI. “People aren’t very good at knowing how much benefit they get from AI. Even when friction improves their performance, they often don’t recognise it,” she said.
That’s why behavioural data is crucial. “What people say and what they do are often worlds apart. You can’t rely on individuals to safeguard against these biases. You need systemic friction baked into the design,” Prof. Gosline said.
Anchoring bias, meanwhile, reveals how easily humans defer to AI. “When people receive AI-generated content, they anchor on it, and about 60 to 80% of their final work mirrors the AI’s suggestion. Once they’ve seen it, you can’t un-ring the bell.”
The solution is to insert friction early. “You have to intervene before or immediately after AI output. That’s when humans are most receptive to thinking critically,” she said.
Importantly, Prof. Gosline also highlighted that AI’s trajectory is not predetermined: humans remain in control of how it develops and is used. Since most advanced AI development today is driven by private companies, there are risks that must be carefully managed. “It’s not academics shaping the world – it’s the people deploying and using AI every day. That’s a tremendous amount of power,” she said.
Therefore, business leaders, designers, and policymakers must recognise that seamlessness should not be the ultimate goal. “When we design for frictionless experiences, we risk designing out humanity. We need to design for thoughtful engagement instead. AI can do many things for us, but it shouldn’t do our thinking for us. Friction gives us space to think – and that’s where our real advantage lies.”