Australia’s TGA has released new guidance for Software as a Medical Device (SaMD), with a strong focus on AI and adaptive algorithms. The document outlines requirements for documentation, validation, and cybersecurity. Essential reading for companies aiming to expand globally in digital health.

On 29 August 2025, the Therapeutic Goods Administration (TGA) published a comprehensive guide clarifying the regulatory framework for Software as a Medical Device (SaMD). The document addresses the growing role of digital health tools, from mobile apps to AI-driven diagnostic systems, and responds to the challenges posed by their rapid adoption in clinical practice.

Qualification and classification criteria for SaMD under the TGA

The guidance confirms that intended purpose remains the decisive factor for SaMD regulation. Examples include:

  • mobile apps interpreting diagnostic images,
  • clinical decision support systems (CDSS),
  • AI models predicting patient risk for certain diseases.

The platform or delivery method (cloud, mobile, or embedded) is irrelevant — what matters is whether the software is intended for a medical purpose.

Exemptions

In line with the EU MDR/IVDR, the TGA excludes software designed solely for:

  • lifestyle or wellness purposes (e.g. fitness trackers),
  • educational content,
  • administrative healthcare support tools.

Yet, borderline cases require careful consideration. For instance, a sleep monitoring app may qualify as SaMD if it generates data relevant to diagnosing sleep disorders.

Artificial intelligence and adaptive algorithms

The TGA highlights three priority areas:

  1. Validation: algorithms must be tested for accuracy and reproducibility across populations.
  2. Change management: retraining an algorithm on new datasets may trigger a new conformity assessment.
  3. Cybersecurity: AI systems must be resilient against adversarial attacks such as data poisoning or manipulation.
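To make the first point concrete, here is a minimal sketch of the kind of cross-population validation check the guidance describes: computing accuracy per demographic subgroup on a held-out test set and flagging when subgroups diverge. The function names, the record format, and the 0.05 disparity threshold are illustrative assumptions, not anything prescribed by the TGA.

```python
# Minimal sketch: per-subgroup accuracy check for an SaMD classifier.
# Illustrative only; the record format and the 0.05 threshold are assumptions.
from collections import defaultdict

def subgroup_accuracy(records):
    """records: iterable of (subgroup, y_true, y_pred) tuples."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, y_true, y_pred in records:
        totals[group] += 1
        hits[group] += int(y_true == y_pred)
    return {g: hits[g] / totals[g] for g in totals}

def max_disparity(acc_by_group):
    """Largest accuracy gap between any two subgroups."""
    vals = list(acc_by_group.values())
    return max(vals) - min(vals)

# Example: predictions for two demographic subgroups from a held-out test set.
records = [
    ("A", 1, 1), ("A", 0, 0), ("A", 1, 1), ("A", 0, 1),
    ("B", 1, 1), ("B", 0, 0), ("B", 1, 0), ("B", 0, 0),
]
acc = subgroup_accuracy(records)
print(acc)                           # accuracy per subgroup
print(max_disparity(acc) <= 0.05)    # does performance generalise across groups?
```

In practice a manufacturer would run this kind of check over clinically meaningful strata (age, sex, ethnicity, comorbidities) and document the results as part of the validation evidence.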

Documentation and conformity assessment

Manufacturers are expected to maintain complete technical documentation covering:

  • detailed intended purpose and target users,
  • dataset description and training methodology,
  • risk analysis (aligned with ISO 14971),
  • post-market surveillance (PMS) plan with continuous data collection,
  • usability and human factors evaluation.

Global alignment

The guidance aligns with EU MDR/IVDR and the upcoming AI Act, as well as FDA’s frameworks for SaMD. This convergence enables manufacturers to prepare one consistent body of evidence to support multi-market approvals.

Summary

TGA’s guidance is both a challenge and an opportunity. It makes clear that regulatory responsibility lies with the manufacturer, regardless of whether AI models are developed in-house or sourced externally. Companies that integrate regulatory requirements into the full lifecycle of their digital products — from design to PMS — will be best positioned to ensure safe adoption of AI-driven healthcare technologies.