Continuous Improvement
We actively seek to improve what we do and can demonstrate progress over time.
Version 1.0 - First Edition
Why This Domain Matters
Quality is not a state that a practice achieves and then holds. It is a discipline - something that has to be actively maintained, regularly tested, and periodically reset against a changing external environment. A practice that completed an excellent self-assessment two years ago and has not revisited it since is not a quality practice. It is a practice that was organised two years ago.
Domain 8 is concerned with the mechanisms that keep quality alive over time: the processes by which a practice identifies what needs to improve, sets goals, takes action, reviews whether the action worked, and starts again. It covers internal audit, data use, peer review, and regulatory currency - and it covers the culture that makes all of those things possible.
This is also the domain that gives this framework its longitudinal value. A single self-assessment tells you where you are. A repeated self-assessment, tracked against an improvement plan with documented outcomes, tells a story of progress. That story is what distinguishes a practice that is genuinely committed to quality from one that treats frameworks as compliance exercises.
A practice that scores strongly in Domain 8 is one where improvement is expected, structured, and demonstrable - not because someone external is watching, but because the people who work there believe it matters.
Quality Statements
8.1 We have a structured, documented approach to identifying priorities and tracking improvement.
Indicators
- 8.1.1 The practice has a documented quality improvement plan that is reviewed at least annually
- 8.1.2 The improvement plan identifies specific goals, responsible persons, and target timeframes
- 8.1.3 The improvement plan is informed by the outcomes of the practice's self-assessment against this framework
- 8.1.4 Progress against improvement goals is reviewed at defined intervals (at least quarterly)
- 8.1.5 Completed improvement actions are documented and the outcomes recorded
- 8.1.6 Where an improvement action did not achieve its intended outcome, this is reviewed and a revised approach is documented
- 8.1.7 The improvement plan is accessible to relevant staff and is not held solely by a single individual
- 8.1.8 Quality improvement goals are prioritised by patient safety impact, not organisational convenience
- 8.1.9 The practice revisits its full self-assessment against this framework at least once every two years
8.2 We regularly review our own processes against defined standards and act on what we find.
Indicators
- 8.2.1 The practice conducts at least two internal audits per year
- 8.2.2 Internal audits are planned in advance and cover a mix of clinical and operational topics
- 8.2.3 Internal audit findings are documented and shared with relevant staff
- 8.2.4 Internal audits result in documented actions where gaps are identified
- 8.2.5 Audit topics are selected based on risk, prior incidents, or known areas of variability
- 8.2.6 At least one internal audit per year addresses a clinical process or outcome (not solely administrative topics)
- 8.2.7 Previous audit findings are reviewed in subsequent cycles to assess whether improvements have been sustained
- 8.2.8 The practice has conducted a medication management audit in the past two years (where medications are held or administered)
- 8.2.9 The practice has conducted an infection prevention and control audit in the past two years
- 8.2.10 The practice has conducted a health records audit in the past two years
8.3 We collect, review, and act on data about our practice's performance.
Indicators
- 8.3.1 The practice has identified a small set of key performance indicators (KPIs) relevant to its operations
- 8.3.2 KPIs are reviewed at defined intervals by the principal practitioner(s) and practice manager
- 8.3.3 Waiting time data (time from referral to appointment) is monitored and reviewed
- 8.3.4 Did-not-attend (DNA) and cancellation rates are monitored and reviewed
- 8.3.5 Patient feedback data (see Domain 4) is reviewed in the context of quality improvement
- 8.3.6 Incident and near-miss data is reviewed in aggregate at least annually to identify patterns
- 8.3.7 Complaint data is reviewed in aggregate at least annually to identify patterns
- 8.3.8 Where available, clinical outcome data is reviewed by the principal practitioner(s) at least annually
- 8.3.9 Data review findings are connected to the improvement plan where action is indicated
- 8.3.10 The practice does not rely solely on absence of complaints as evidence of quality
8.4 We treat things that go wrong as opportunities to improve, not just problems to resolve.
Indicators
- 8.4.1 A documented process exists for reviewing incidents and near misses for learning, separate from the initial response process
- 8.4.2 Significant incidents are subject to a structured review (e.g. case discussion, root cause analysis) within a defined timeframe
- 8.4.3 Learning identified from incident reviews is translated into documented changes to policy, process, or training
- 8.4.4 Changes made in response to incidents are communicated to relevant staff
- 8.4.5 Complaints are reviewed for themes and patterns at least annually (see also Domain 1)
- 8.4.6 Learning from complaints is connected to the improvement plan where systemic issues are identified
- 8.4.7 The practice monitors external safety alerts, recalls, and clinical advisories relevant to its specialty and acts on them promptly
- 8.4.8 Near misses are treated with the same learning intent as incidents that caused harm
- 8.4.9 Staff feel safe to report incidents and near misses without fear of blame (see also Domain 6)
- 8.4.10 The practice can demonstrate at least one documented improvement that originated from an incident or complaint in the past two years
8.5 We look outside our own practice to test and calibrate the quality of what we do.
Indicators
- 8.5.1 The principal practitioner(s) participate in at least one form of structured peer review, case discussion, or clinical audit activity annually
- 8.5.2 Peer review activity is documented (including the nature of the activity and any learning identified)
- 8.5.3 Where the practice's specialty college or professional association offers clinical audit or benchmarking programs, the practice has considered participation
- 8.5.4 The practice is aware of relevant national or state-based clinical registries in its specialty and has considered participation where applicable
- 8.5.5 Findings from peer review or benchmarking activities are connected to the practice improvement plan where relevant
- 8.5.6 Where the practice participates in clinical training (students, registrars, or fellows), feedback from supervisory bodies is reviewed and acted on
8.6 We stay current with our legal obligations, relevant standards, and evolving clinical guidance.
Indicators
- 8.6.1 A named person in the practice is responsible for monitoring changes to regulation, standards, and guidelines relevant to the practice
- 8.6.2 The practice has a defined process for reviewing and acting on updates to relevant legislation (including the Privacy Act, Work Health and Safety Act, and applicable state health legislation)
- 8.6.3 The practice monitors updates from AHPRA and relevant specialist college(s) and acts on guidance relevant to its operations
- 8.6.4 Clinical policies and procedures are reviewed at defined intervals (at least every two years) and updated to reflect current standards
- 8.6.5 The practice has reviewed its obligations under the Australian Privacy Principles in the past two years
- 8.6.6 The practice is aware of current mandatory reporting obligations (AHPRA, child protection) and reviews these at staff induction and at least biennially thereafter
- 8.6.7 The practice monitors safety alerts and product recalls from the TGA and other relevant bodies
- 8.6.8 Relevant changes to regulation or standards are communicated to affected staff promptly
- 8.6.9 Policy updates triggered by regulatory change are documented with the reason for the update and the effective date
8.7 Our leadership actively creates the conditions for improvement to happen.
Indicators
- 8.7.1 The principal practitioner(s) visibly champion quality improvement within the practice
- 8.7.2 Quality improvement is a standing agenda item at practice meetings
- 8.7.3 Staff at all levels are encouraged to identify improvement opportunities and their suggestions are taken seriously
- 8.7.4 Staff who identify problems or raise concerns are thanked, not managed
- 8.7.5 The practice celebrates and communicates improvements that have been achieved
- 8.7.6 Time and resource are allocated (however modestly) to quality improvement activity - it is not expected to happen in staff members' personal time
- 8.7.7 New staff are introduced to the practice's quality framework and improvement approach during induction
- 8.7.8 The practice manager has access to professional development relevant to healthcare quality and governance
- 8.7.9 The practice does not treat this framework as a compliance exercise - it can articulate at least two specific improvements made as a result of using it
Not sure what counts as evidence?
The evidence guide provides concrete examples of what evidence looks like at each maturity level for every indicator in this domain.
View Evidence Guide

Ready to assess your practice?
Rate your practice against every indicator in this domain using the self-assessment tables.
Self-Assessment Guide

Notes for Practice Managers
8.1.8 (Prioritisation by patient safety impact) is the most important indicator in this domain. Improvement plans fail most commonly not because practices lack goodwill but because they prioritise the easy and the visible over the risky and the complex. Repainting the waiting room is an improvement; so is a new coffee machine. Neither is a quality improvement. The test for any improvement action is: what is the impact on patient safety or care quality if we do not do this? Actions that cannot be connected to that question belong somewhere else.
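For practices that keep their improvement plan in a spreadsheet or a simple script, the test above can be made mechanical. The sketch below is illustrative only, with invented actions and a hypothetical 0-5 rating for the patient-safety impact of not acting:

```python
# Hypothetical improvement-plan entries. Each action carries an invented 0-5
# rating for the patient-safety or care-quality impact of NOT doing it.
# Ease of implementation is recorded but deliberately ignored when ordering.
actions = [
    {"action": "Repaint the waiting room", "impact_if_not_done": 0, "easy": True},
    {"action": "Add an allergy check before every procedure", "impact_if_not_done": 5, "easy": False},
    {"action": "Audit referral turnaround times", "impact_if_not_done": 3, "easy": True},
]

# Order by impact alone; items scoring zero arguably belong on a maintenance
# list rather than the quality improvement plan.
for entry in sorted(actions, key=lambda a: a["impact_if_not_done"], reverse=True):
    destination = "improvement plan" if entry["impact_if_not_done"] > 0 else "somewhere else"
    print(f"{entry['impact_if_not_done']}  {entry['action']} -> {destination}")
```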
8.2 (Internal Audit) does not require specialist expertise or expensive software. A clinical audit can be as simple as pulling 20 random records and checking whether they contain a current medication list, an allergy record, and a documented consent note. A referral turnaround audit involves counting how many specialist letters were sent within five business days last month. The discipline is not in the sophistication of the methodology - it is in doing it, documenting what you find, and acting on the gaps. Practices that have never conducted a formal internal audit almost always find something material in the first one.
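The record-pull audit described above is straightforward to script if records can be exported. The following is a minimal sketch, assuming a CSV export with one row per record and yes/no columns for each audited item; the file name and column names are hypothetical:

```python
import csv
import random

# Hypothetical CSV export: one row per patient record, with yes/no flags for
# each audited item. File name and column names are invented for this example.
AUDIT_ITEMS = ["current_medication_list", "allergy_record", "consent_note"]

def audit_records(path: str, sample_size: int = 20) -> None:
    """Pull a random sample of records and tally which audit items are missing."""
    with open(path, newline="") as f:
        records = list(csv.DictReader(f))
    if not records:
        print("No records found in export")
        return

    sample = random.sample(records, min(sample_size, len(records)))
    gaps = {item: 0 for item in AUDIT_ITEMS}
    for record in sample:
        for item in AUDIT_ITEMS:
            if (record.get(item) or "").strip().lower() != "yes":
                gaps[item] += 1

    print(f"Audited {len(sample)} of {len(records)} records")
    for item, missing in gaps.items():
        print(f"  {item}: missing in {missing} of {len(sample)} ({missing / len(sample):.0%})")

if __name__ == "__main__":
    audit_records("record_audit_export.csv")
```

The counts it prints are the audit finding; the documented action (8.2.4) is whatever the practice decides to do about the gaps.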
8.3.10 (Absence of complaints is not evidence of quality) is the most commonly violated principle in specialist practice. Low complaint rates in specialist settings often reflect barriers to complaining - patients who feel dependent on their specialist, who don't know how to raise concerns, or who assume that if something seems wrong it must be normal - rather than absence of problems. A practice that has received no complaints in five years and has collected no other quality data cannot meaningfully claim that its care is good. It can only claim that no one has told it otherwise.
8.4.7 (External safety alerts and recalls) is frequently overlooked. The TGA regularly issues product recalls and safety alerts for medical devices, medications, and consumables that affect specialist practices. AHPRA and specialist colleges issue safety advisories. Medical indemnity insurers publish risk alerts. None of these automatically land in a practice manager's inbox in a way that ensures action. A named person with a defined process for monitoring these channels - even monthly - makes a measurable difference.
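A defined process here can be as simple as a register of channels with the date each was last checked. The sketch below is illustrative, with invented channel names and dates rather than a definitive source list for any specialty:

```python
from datetime import date, timedelta

# Illustrative register: channel name -> date last reviewed. Names and dates
# are examples only, not a definitive source list for any specialty.
REVIEW_INTERVAL = timedelta(days=31)  # roughly monthly, per the note above

register = {
    "TGA safety alerts and recalls": date(2025, 5, 2),
    "AHPRA and specialist college advisories": date(2025, 4, 3),
    "Indemnity insurer risk alerts": date(2025, 2, 14),
}

def is_overdue(last_reviewed: date, today: date) -> bool:
    """A channel is overdue if it has not been checked within the interval."""
    return (today - last_reviewed) > REVIEW_INTERVAL

today = date.today()
for channel, last in sorted(register.items(), key=lambda item: item[1]):
    status = "OVERDUE" if is_overdue(last, today) else "ok"
    print(f"{status:<8}{channel} (last reviewed {last.isoformat()})")
```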
8.5 (Peer Review) connects the practice to the broader professional community. Peer review does not have to mean a formal college program. It can mean a quarterly case discussion with a colleague, participation in a morbidity and mortality meeting, or a shared clinical audit with a practice in the same specialty. What matters is that the principal practitioner's clinical performance is subject to some form of external scrutiny beyond their own self-assessment. Practices in which the principal practitioner is never accountable to a peer for their clinical decisions carry a qualitatively different risk profile to those where that accountability exists.
8.7.9 is both the hardest and the most honest indicator in the framework. A practice that has worked through this self-assessment, built an improvement plan, implemented changes, and can name two things that are now better than they were - that is what quality looks like. Everything else is preparation.