AI-Assisted Pathology Is Improving Diagnostic Accuracy — But the Implementation Story Is Complicated


Pathology sits at the centre of clinical decision-making. By one widely cited estimate, roughly 70% of clinical decisions are influenced by pathology results, from cancer diagnosis to infectious disease management. When a pathology result is wrong — a missed malignancy, a false positive that leads to unnecessary surgery — the consequences cascade through the entire treatment pathway.

So when AI tools demonstrate the ability to match or exceed human pathologists in specific diagnostic tasks, it’s worth paying close attention. Not with breathless enthusiasm, but with the careful scrutiny that a technology this consequential deserves.

What AI Pathology Tools Actually Do

The most mature AI pathology applications focus on histopathology — the analysis of tissue samples under a microscope. Digital pathology scanners convert glass slides into high-resolution digital images, and AI algorithms analyse these images to identify patterns associated with specific diseases.
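As a rough sketch of the mechanism: a whole-slide image is far too large to feed to a model in one pass, so analysis tools typically break it into tiles, score each tile, and surface suspicious regions for review. The snippet below illustrates the idea using the open-source OpenSlide library; classify_tile is a hypothetical stand-in for a trained model, not any vendor's actual product.

```python
# Sketch of whole-slide image (WSI) analysis: tile the slide, score each tile.
# Assumes the open-source OpenSlide library; classify_tile is a hypothetical
# placeholder for a trained model, not any vendor's product.
import numpy as np
import openslide

TILE = 512  # pixels per tile edge at full resolution

def classify_tile(rgb_tile: np.ndarray) -> float:
    """Placeholder: return a tumour-probability score for one tile."""
    raise NotImplementedError("a trained model would go here")

def score_slide(path: str) -> list[tuple[int, int, float]]:
    slide = openslide.OpenSlide(path)
    width, height = slide.dimensions          # full-resolution (level 0) size
    flagged = []
    for y in range(0, height - TILE, TILE):
        for x in range(0, width - TILE, TILE):
            region = slide.read_region((x, y), 0, (TILE, TILE)).convert("RGB")
            score = classify_tile(np.asarray(region))
            if score > 0.5:                   # flag suspicious tiles for review
                flagged.append((x, y, score))
    return flagged  # heat-map coordinates a pathologist can inspect
```

Real pipelines work across the slide's resolution pyramid and aggregate tile scores into a slide-level call, but the triage idea is the same: point the pathologist at the regions worth their attention.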

Harrison.ai, an Australian company that’s been one of the more serious players in this space, has developed AI models for analysing pathology slides across several cancer types. Their approach is notable because they’ve partnered directly with pathology providers rather than trying to replace them — the AI provides a second opinion, flagging areas of concern that the pathologist then reviews.

This “AI as assistant” model is the dominant paradigm, and for good reason. Pathology diagnosis isn’t just pattern recognition — it requires clinical context, knowledge of the patient’s history, and the kind of integrative reasoning that current AI models can support but not replicate independently.

The Accuracy Numbers

Let’s look at what the evidence actually shows.

For breast cancer detection in histopathology, several studies have demonstrated that AI models can identify invasive carcinoma with sensitivity above 95% — comparable to experienced pathologists and significantly better than less experienced ones. A study published in The Lancet Digital Health found that AI-assisted pathologists reduced their error rate by 23% compared to unassisted reading.
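To make the sensitivity figure concrete, here is a back-of-envelope calculation with invented counts, not drawn from any of the studies above:

```python
# Illustrative only: what "95% sensitivity" means in missed cases.
# The counts are invented for the worked example, not taken from any study.
true_positives = 190    # cancer-containing slides the model flags
false_negatives = 10    # cancer-containing slides the model misses

sensitivity = true_positives / (true_positives + false_negatives)
print(f"sensitivity = {sensitivity:.0%}")                                        # 95%
print(f"missed cancers per 1,000 true cases = {1000 * (1 - sensitivity):.0f}")   # 50
```

Even at 95% sensitivity, roughly one cancer in twenty slips through, which is exactly why the second-reader framing matters more than the headline number.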

For cervical cytology — Pap smear analysis — AI screening tools have shown even more striking results. These are high-volume, repetitive screening tasks where human fatigue is a real factor. AI pre-screening can reduce the workload on cytotechnicians by 60-70% while maintaining or improving detection rates.
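The workload claim follows from simple triage arithmetic: if only the slides the AI flags go to a human, the review burden is roughly the fraction flagged. The prevalence, sensitivity and specificity below are assumptions chosen for illustration, not published figures.

```python
# Illustrative triage arithmetic for AI pre-screening (all inputs assumed).
prevalence = 0.02     # fraction of screened slides with a true abnormality
sensitivity = 0.97    # fraction of abnormal slides the AI flags
specificity = 0.70    # fraction of normal slides the AI clears

flagged = prevalence * sensitivity + (1 - prevalence) * (1 - specificity)
print(f"slides still needing human review: {flagged:.0%}")   # ~31%
print(f"workload reduction: {1 - flagged:.0%}")              # ~69%
print(f"abnormal slides missed per 100,000 screens: "
      f"{100_000 * prevalence * (1 - sensitivity):.0f}")     # 60
```

The same arithmetic also exposes the trade-off: every point of sensitivity given up in exchange for clearing more slides translates directly into missed abnormalities downstream.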

In prostate cancer grading — assigning Gleason scores to biopsy samples — AI tools have demonstrated strong concordance with expert pathologists. This is significant because Gleason scoring has historically been one of the more subjective areas of pathology, with inter-observer disagreement rates of 30-40%. AI provides a more consistent baseline.
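Concordance on grading tasks is usually reported as a quadratically weighted Cohen's kappa rather than raw percentage agreement, because grade groups are ordinal and near-misses should count for less than gross disagreements. A minimal sketch, with invented grade assignments:

```python
# Quadratic-weighted kappa: the usual concordance metric for ordinal grading.
# The grade-group assignments below are invented for illustration.
from sklearn.metrics import cohen_kappa_score

pathologist = [1, 2, 2, 3, 4, 5, 3, 2, 1, 4]   # ISUP grade groups 1-5
model       = [1, 2, 3, 3, 4, 5, 2, 2, 1, 4]

kappa = cohen_kappa_score(pathologist, model, weights="quadratic")
print(f"quadratic-weighted kappa = {kappa:.2f}")   # 1.0 would be perfect agreement
```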

But these numbers come from controlled studies, often using curated datasets from leading academic institutions. The performance in routine clinical practice — with variable slide quality, unusual presentations, and the full diversity of patient populations — is typically somewhat lower.

The Australian Implementation Landscape

Several Australian pathology providers now have AI tools at various stages of implementation.

The large private pathology companies — Sonic Healthcare, Healius, Australian Clinical Labs — have all invested in digital pathology infrastructure. This is the critical prerequisite: you can’t run AI on glass slides. The entire workflow needs to be digital, which means scanner hardware, massive storage capacity (a single histopathology slide generates 2-5 GB of data), and network infrastructure to move these files around.
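A quick back-of-envelope calculation shows why storage dominates these budgets. The annual slide volume is an assumed figure for illustration; the per-slide size comes from the paragraph above.

```python
# Back-of-envelope storage estimate for a fully digitised histopathology service.
# Annual slide volume is an assumption for illustration only.
slides_per_year = 500_000                      # hypothetical lab throughput
gb_per_slide_low, gb_per_slide_high = 2, 5     # per-slide sizes cited above

low_pb = slides_per_year * gb_per_slide_low / 1_000_000
high_pb = slides_per_year * gb_per_slide_high / 1_000_000
print(f"annual storage: {low_pb:.1f}-{high_pb:.1f} PB")   # 1.0-2.5 PB per year
```

At that scale, archiving policy, retention periods and network bandwidth stop being IT details and become core clinical infrastructure decisions.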

The public hospital system is further behind. Digital pathology adoption in Australian public hospitals is patchy, with some major teaching hospitals well advanced and many regional and smaller metropolitan hospitals still working primarily with glass slides and optical microscopes.

This digital divide creates an equity concern. If AI-assisted pathology improves diagnostic accuracy — and the evidence suggests it does — then patients whose samples are analysed with AI assistance get better care than those whose samples aren’t. Given that regional and rural Australians already face significant healthcare access challenges, adding a diagnostic accuracy gap is troubling.

The Workforce Question

Australia has a well-documented shortage of pathologists, particularly in regional areas. The Royal College of Pathologists of Australasia has been flagging workforce concerns for years, and the situation isn’t improving. Training a pathologist takes well over a decade: a medical degree, prevocational hospital years, then around five years of specialist training. There’s no quick fix.

AI doesn’t solve this shortage directly — you still need qualified pathologists to make final diagnostic decisions. But it can amplify existing capacity. If AI pre-screening reduces the time a pathologist spends on each case by 30-40%, the same number of pathologists can handle more cases. And if AI catches potential misses that would otherwise require re-review or additional testing, it reduces downstream workload.
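The capacity effect is larger than it first appears, because a saving in time per case converts into a more-than-proportional increase in cases per pathologist. Taking the 30-40% figures from the paragraph above:

```python
# How a per-case time saving converts into extra caseload capacity.
# The 30-40% figures come from the paragraph above; the rest is arithmetic.
for time_saving in (0.30, 0.40):
    capacity_gain = 1 / (1 - time_saving) - 1
    print(f"{time_saving:.0%} less time per case -> "
          f"{capacity_gain:.0%} more cases per pathologist")
# 30% -> 43% more cases; 40% -> 67% more cases
```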

There’s a more nuanced workforce effect too. Junior pathologists — registrars in training — could benefit significantly from AI second opinions during their learning curve. Rather than waiting for a consultant pathologist to review their work (which might happen hours or days later), they get immediate algorithmic feedback. This doesn’t replace mentorship, but it supplements it.

The Risks That Don’t Get Enough Attention

It’s not all upside, and responsible discussion requires acknowledging the risks.

Automation bias. When pathologists know an AI has already flagged or cleared a slide, it changes how they look at it. Studies in radiology — which is further along in AI adoption — have shown that clinicians tend to agree with AI assessments even when the AI is wrong. This is a real risk in pathology too. If a pathologist subconsciously trusts the AI screen and spends less time on cases the AI has cleared, misses that the AI made become misses that the human makes too.

Data bias. AI models trained predominantly on tissue samples from one population may perform differently on samples from other populations. Skin-cancer detection algorithms trained largely on images of lighter skin tones are a well-documented example from the neighbouring field of dermatology. In Australia’s multicultural population, this matters.

Regulatory uncertainty. The TGA’s framework for regulating AI-based diagnostic tools is still evolving. Most current AI pathology tools are classified as clinical decision support rather than diagnostic devices, which means they’re subject to less rigorous regulatory scrutiny. As these tools become more integral to diagnostic workflows, the regulatory framework will need to mature.

Where This Goes

I think AI-assisted pathology will be standard practice in Australian hospitals within five years. Not because it’s perfect, but because the combination of workforce pressures, demonstrable accuracy improvements, and increasing digital pathology infrastructure makes it inevitable.

The question isn’t whether AI will be part of pathology — it’s how well we implement it. That means investing in digital infrastructure across the public hospital system, not just private labs. It means training pathologists to work with AI tools effectively, including understanding their limitations. And it means building feedback systems that continuously monitor AI performance in real clinical settings, not just in curated research datasets.
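What continuous monitoring could look like in practice is conceptually simple: compare the AI's calls against final sign-out diagnoses over a rolling window and alert when sensitivity drifts below an agreed floor. A minimal sketch, with the data source, window size and threshold all as assumptions:

```python
# Minimal sketch of post-deployment monitoring: rolling sensitivity measured
# against final sign-out diagnoses. Window size and alert floor are assumptions.
from collections import deque

class SensitivityMonitor:
    def __init__(self, window: int = 500, floor: float = 0.93):
        self.recent = deque(maxlen=window)   # (ai_flagged, signout_positive) pairs
        self.floor = floor

    def record(self, ai_flagged: bool, signout_positive: bool) -> None:
        """Log one case once the pathologist's final diagnosis is known."""
        self.recent.append((ai_flagged, signout_positive))

    def sensitivity(self) -> float | None:
        positives = [ai for ai, truth in self.recent if truth]
        if not positives:
            return None
        return sum(positives) / len(positives)

    def drifting(self) -> bool:
        s = self.sensitivity()
        return s is not None and s < self.floor
```

The specific numbers matter less than the plumbing: the comparison against sign-out diagnoses has to be designed into the workflow from day one, not reconstructed later from incomplete records.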

The technology is promising. But technology alone doesn’t improve healthcare. Implementation, governance, and equity of access are what determine whether AI in pathology helps everyone or just the patients who happen to be treated in well-resourced urban hospitals.