
“The patient arrives with an MRI report in one hand and a smartphone in the other. Before you’ve even performed a single provocation test, they present you with a neatly bulleted ‘recovery road map’.
This isn’t a collection of disparate Google searches. It is the output of ChatGPT: a conversational artificial intelligence (AI) chatbot that allows users to upload their medical documents and receive a bespoke, albeit automated, management plan.
As the Royal Australian College of General Practitioners recently highlighted, we have reached a crossroads where innovation meets significant clinical risk. For the physiotherapist, this shift represents a fundamental challenge to our role as the primary architects of physical rehabilitation.
The algorithm knows the average; we know the exception.
The Royal Australian College of General Practitioners’ recent analysis of ChatGPT sparked a necessary debate: is this a digital assistant or a ‘Dr Google 2.0’? While our GP colleagues are primarily concerned with diagnostic errors, the threat to physiotherapy is more subtle and existential.
It is the threat of de-contextualised prescription.
Consider Sarah, a 34-year-old office worker with chronic lower back pain. Her ChatGPT-generated plan recommends a progressive loading program based on her uploaded MRI showing a ‘mild L4-L5 disc bulge’ and her self-reported pain scores. The protocol is evidence-based and the exercise selection is textbook. The periodisation follows current best practice.
It is also completely wrong for Sarah.
The algorithm cannot see that Sarah guards into extension because of a previous pregnancy-related diastasis. She catastrophises every twinge because her father was disabled by back surgery. Her workstation set-up loads her spine asymmetrically for nine hours daily. Most critically, she is three weeks away from a work deadline and psychologically incapable of adhering to anything that requires more than ten minutes.
ChatGPT knows that, statistically, loading is the gold standard for disc pathology. But it cannot palpate the protective muscle spasm, observe the kinetic chain compensation at the hip or read the anxiety in her face when the word ‘deadlift’ is mentioned.
The algorithm has read thousands of papers. We have read thousands of bodies.
Physiotherapists spend years mastering the art of therapeutic language: explaining that a disc bulge is as normal at 45 as grey hair, that degeneration is correlation rather than causation and that pain rarely maps neatly onto structural findings.
ChatGPT, by contrast, inadvertently practises the opposite. It validates the client’s worst fears by treating every radiological finding as a structural problem requiring a structural solution. The AI-generated plan for that degenerative disc often includes phrases like ‘to address the severe degeneration’ or ‘targeting the damaged area’, language that embeds a pathoanatomical narrative we’ve spent two decades trying to dismantle.
We are now treating not just the injury, but the AI’s nocebo effect.
Clients return to physiotherapists because they need someone who can answer the question the algorithm cannot: ‘Will this work for me?’
That question requires pattern recognition across ten thousand previous clients. It requires reading facial expressions during movement. It demands knowing when to push and when to reassure, when evidence supports aggression and when clinical experience counsels patience.
A physiotherapist’s value is no longer in knowing what to prescribe. It is in the irreducible judgement of knowing when, how and why to prescribe it for the specific person in front of us.
In an era where information is infinite and free, context is the scarcity.”
Barry Nguyen – Physiotherapist & software engineer



