The Patient Satisfaction Questionnaire (PSQ)
Because your patients' opinions matter — and not just the ones your trainer overhears through the wall.
🌐Web Resources
A hand-picked mix of official guidance and real-world GP training resources. Because sometimes the best pearls are not hiding in the official documents.
⚡Quick Summary — One-Minute Recall
The Numbers at a Glance
✅ What you must know
- Done once, in ST3, after the midpoint
- Measures empathy and relationship skills — not clinical knowledge
- You must not hand out or collect the forms yourself
- You complete a self-assessment first (before seeing results)
- Results go to your ES first, then are released to you
- Below-average scores → discuss with your ES and consider repeating the PSQ
- Works for telephone and face-to-face consultations
📋 The 9 Domains (summary)
- Feeling relaxed and welcome
- Being listened to
- Clear explanation
- Involvement in decisions
- Confidence in the management plan
- Knowing what happens next
- Knowing what to do if things worsen
- Respect and dignity
- Trust in the doctor
The PSQ is essentially asking: "Did this doctor treat me like a human being?" If you genuinely listen, explain things clearly, involve patients in decisions, and safety-net well — good scores tend to follow naturally. It is a mirror held up to your consultation skills, not a trick exam.
📋What is the PSQ?
The Patient Satisfaction Questionnaire (PSQ) is one component of the MRCGP Workplace Based Assessment (WPBA). It gives you direct, structured feedback from your patients about how they experienced their consultation with you.
Unlike a CbD or COT — where a trainer or supervisor assesses your clinical reasoning — the PSQ captures the patient's perspective. It focuses specifically on your empathy, communication, and relationship-building skills. This is patient voice, not educational voice.
The PSQ matters because it captures something that no supervisor can fully observe: what it actually feels like to be your patient.
What the PSQ is designed to measure
- Whether patients feel heard and respected
- Whether explanations were clear and understandable
- Whether patients felt involved in their care
- Whether they understood their safety-net and next steps
- Overall trust and confidence in you as their doctor
Where it sits in WPBA
The PSQ is one of several WPBA tools. Others include CbD, COT, MiniCEX, MSF, CEPS, QIP/QIA, Prescribing Assessment, and Clinical/Educational Supervisor Reports. Each tool captures a different dimension of competence. The PSQ is unique in that patients — not clinicians — provide the feedback.
When you write your ePortfolio reflection on your PSQ, link it to the relevant WPBA capabilities — especially Communication & Consultation Skills.
🗂️The PSQ in the Wider WPBA Family
The PSQ is one of eight WPBA tools. Understanding how it compares to the others helps you see what it is — and isn't — designed to assess.
| WPBA Tool | Who provides feedback? | What does it measure? | When? |
|---|---|---|---|
| PSQ ← you are here | Patients | Empathy, communication, relationship-building, safety-netting | Once in ST3 (after midpoint) |
| MSF (Multi-Source Feedback) | Colleagues (doctors, nurses, admin, other HCPs) | Professionalism, clinical performance, teamworking | Once per training year |
| COT (Consultation Observation Tool) | Trainer / supervisor observing live or recorded consultations | Consultation skills: history, diagnosis, management, communication | Throughout training |
| CbD (Case-based Discussion) | Trainer / supervisor in structured case discussion | Clinical reasoning, decision-making, professional judgement | Throughout training |
| MiniCEX | Trainer / supervisor observing a focused clinical encounter | Clinical and communication skills in brief encounters | Throughout training |
| CEPS (Clinical Examination & Procedural Skills) | Trained observer | Physical examination and procedural competence | Throughout training; some mandatory in ST3 |
| QIP / QIA (Quality Improvement) | Supervisor assessment of project/activity | Quality improvement, systems thinking, service development | QIP in ST1/2 primary care; QIA each training year |
| Prescribing Assessment | Trainer review of 20 of your prescriptions | Safe prescribing, error identification, reflection on prescribing habits | ST3 (mandatory) |
The PSQ is the only WPBA tool where patients — not clinicians — provide the feedback. Every other tool relies on a colleague, trainer, or supervisor as the assessor. This means the PSQ captures something fundamentally different: the lived experience of being your patient, which no supervisor can fully observe or replicate. It is less a test of what you know, and more a test of how you make people feel.
If your COTs show a recurring theme — say, that you sometimes rush the explanation or don't always explore patient concerns — and your PSQ results show lower scores on Q3 or Q4, those two pieces of evidence are telling the same story from different angles. Your ES will look for these alignments. When WPBA tools converge on the same theme, that is the most reliable signal about a genuine learning need.
❓The 9 PSQ Questions
Read these before you start your PSQ. Understanding the domains means you can actively work on the relevant skills beforehand, and better interpret your scores afterwards. (Updated 2017 RCGP form.)
An older PSQ form had 11 questions and used a scale running from "Poor to fair" through to "Outstanding". The current (2017) form has 9 questions with a graded response scale. Always use the current version through the FourteenFish ePortfolio — the 2017 form is the one that counts for WPBA.
The response scale
- Not at all
- Not really
- Not fully
- Completely
Questions 5, 6, and 7 have an additional "Not applicable" or "Not relevant to this consultation" option. Patients also have space to write free-text comments at the end.
Questions 1–4 cover the relationship and process of the consultation (rapport, listening, explaining, involving). Questions 5–7 cover the outcome (confidence, next steps, safety-netting). Questions 8–9 capture the overall professional relationship (dignity, trust). If you tend to score lower on Q6 or Q7, the solution is usually better safety-netting and clearer follow-up — not personality change!
💬Developing the Communication Skills That Drive PSQ Scores
The best way to get good PSQ scores is not to worry about the PSQ — it is to genuinely develop your consultation communication skills. These practical tips map directly onto the 9 PSQ domains.
Q1 & Q8 — Making patients feel welcome and respected
- Stand or lean slightly forward when the patient enters — small gestures of welcome matter
- Make eye contact and use the patient's name early in the consultation
- Avoid starting by immediately staring at the computer
- Knock before entering (home visits); gesture to a seat; introduce yourself clearly
- Dignity applies to how you talk about the patient's body, their condition, and their choices — not just how you physically treat them
- Avoid dismissive or minimising language ("Oh, that's nothing to worry about") — even well-meaning, it undermines patients
Q2 — Feeling listened to
- The most powerful listening technique is also the simplest: look at the patient, not the screen, when they are speaking
- Use non-verbal listening cues — nodding, brief verbal affirmations ("I see", "go on")
- Don't interrupt early — let patients tell their story before you start gathering data
- Briefly summarise back what you've heard: "So if I've understood correctly, the pain started about three weeks ago and it's been getting worse?" — patients feel genuinely heard when you reflect accurately
- Silence is a communication skill — a brief pause after a patient finishes often invites more of the story
Q3 — Explaining clearly
- Chunk information — give one piece at a time, then check understanding before moving on
- Use the patient's own language and metaphors wherever possible
- Avoid medical abbreviations and jargon — "there's a small infection" rather than "you have a localised suppurative focus"
- Use the "teach-back" technique: "Just to make sure I've explained that clearly, could you tell me in your own words what you'll do if things don't improve?"
- Written information — leaflets, practice website links, or a brief written summary — can supplement verbal explanations enormously
Q4 — Involving patients in decisions
- Present options, not just recommendations: "We have two main options — would you like me to run through both?"
- Ask about preferences: "Is there anything that would make one option suit you better than the other?"
- Explicitly check: "How does that plan sound to you?"
- Avoid rescuing the patient from the decision — some patients prefer clinician guidance, others want full autonomy; calibrate to the individual
- Shared decision-making does not mean giving the patient anything they want — it means involving them meaningfully in a plan that is clinically appropriate
Q5 — Confidence in the management plan
- Confidence comes from clarity — if the patient understands why you're recommending something, they are more likely to trust it
- Name your reasoning briefly: "I'm recommending this because…"
- Acknowledge uncertainty honestly when it exists — paradoxically, honesty about uncertainty often increases trust, not decreases it
- Don't apologise for your clinical decision — be warm but confident in your recommendation
Q6 — Knowing what happens next
- Make next steps explicit: "You should receive a letter from the hospital within about four weeks"
- Avoid vague closings: "We'll keep an eye on it" — what does that actually mean to the patient?
- Signpost the pathway: "I'm going to refer you to the dermatology team. They'll contact you directly. In the meantime, carry on with the cream."
- Brief written summaries at the end (or SMS via the practice system) dramatically improve patient understanding of next steps
Q7 — Knowing what to do if things worsen (Safety-netting)
- Safety-netting is not just a legal obligation — patients who understand their safety-net feel more confident and more cared for
- Be specific: "If the pain spreads to your chest, your left arm, or your jaw — call 999 immediately" is far better than "come back if you're worried"
- Give time frames: "If this hasn't improved within five days, please come back or call us"
- Signpost clearly — when to call the practice, when to call 111, when to go to A&E, when to call 999
- End with: "Do you know what to watch out for and what to do?" — checking they've understood is as important as saying it
Q9 — Trust in the doctor
- Trust is built across every other domain — it is the cumulative outcome of everything else in the consultation
- Honesty when you don't know something builds trust: "I'm not sure — let me find out for you"
- Consistency matters — patients who see the same doctor repeatedly tend to report higher trust scores
- Patient trust in trainees can actually be higher than expected — patients often appreciate the time and attention trainees give, especially early in training
Warmth without clarity scores lower than clarity with warmth. Many trainees assume PSQ scores are mainly about being friendly. Forum discussions and trainer feedback consistently identify a different pattern: trainees who are warm and personable but give unclear, poorly structured explanations still score surprisingly low — especially on Q3 (explanation) and Q5 (confidence in the plan). The consultations that score highest combine warm rapport and a clear, logical structure that patients can follow.
🗣️ Signposting — underused but high-impact
Signposting tells patients where the consultation is going — which makes them feel safe, organised, and respected. UK GP training scheme resources consistently identify it as one of the most reliable ways to raise PSQ scores on structure and confidence-related questions.
- "First I'll ask you a few questions to understand what's going on, then we'll talk about what I think and what we can do."
- "I'm going to examine you briefly, and then I'd like to explain what I think is happening."
- "There are two things I want to cover today — the main problem, and then I'll touch on your blood results."
Patients perceive signposting as a sign of competence, organisation, and safety — all of which feed directly into Q5 (confidence) and Q9 (trust).
⏱️ Managing expectations early
High-scoring trainees often set the scene at the start of the consultation with a brief, friendly expectation-setter. This prevents patients feeling rushed or blindsided at the end, and makes the whole consultation feel structured rather than reactive.
- "We've got about 10 minutes today — let's make sure we focus on the most important thing first."
- "I'd like to hear everything that's been going on, so just start from the beginning."
Patients who feel their time was respected and the consultation was well-managed score higher on Q1 (welcome), Q4 (involvement), and Q9 (trust).
💬 Empathy that actually registers — avoid the generic
Generic empathy statements — "I understand", "that must be difficult", "I see" — are so commonly used that patients often barely register them. Trainee forum discussions and GP communication teaching both identify a consistent pattern: specific empathy that reflects the patient's actual situation scores significantly higher on Q1, Q2, and Q6 than general empathy phrases.
❌ Generic:
- "I understand how you feel."
- "That must be difficult."
- "I can see this is hard."
These sound scripted. Patients hear them constantly and often feel they weren't truly listened to.
✅ Specific:
- "It sounds like this has really been affecting your sleep."
- "You've been dealing with this for a long time without answers — that must be exhausting."
- "I can hear how worried you've been about this."
Specific empathy mirrors the patient's individual experience — they feel genuinely heard, not processed.
💛 Sympathy vs Empathy — A Critical Distinction
UK GP training resources consistently draw a distinction between these two responses — because patients experience them very differently, and PSQ scores reflect the difference. Empathy is rated more highly because it shows the doctor has understood the meaning of the experience, not just heard the facts.
Sympathy: "I'm sorry to hear about your dad."
This registers that something happened. Polite and kind — but it does not show the doctor has understood the emotional weight of the experience.
Empathy: "I can really appreciate that losing your dad has had a massive impact on you and your family."
This shows the doctor has understood what the experience means to this particular person. Patients rate this significantly higher on warmth, trust, and feeling respected.
Rushing through an empathic statement under time pressure — "I'm very sorry about your dad, now let's talk about your back pain" — is often worse than saying nothing. It signals that the empathy was performative rather than genuine. If you acknowledge something emotional, give it a moment to land before moving on.
🔍 Picking Up Cues — Being Interested in the Patient as a Whole Person
The PSQ question about being treated as a whole person rather than just a number is directly addressed by cue-handling skills. Trainees frequently report that when they started picking up and acknowledging cues, their consultations "unlocked" — patients volunteered more, the consultation became more efficient, and PSQ feedback improved.
💬 Verbal cues
Emotive words, repetition of a theme — a patient who mentions their mum twice is telling you something matters.
"You mentioned your mum just then — can you tell me a bit more about that?"
👁️ Non-verbal cues
Tearfulness, closed body posture, poor eye contact. When you notice these, name what you see — then pause.
"I can't help but notice you seem really down today."
🗺️ Situational cues
An elderly parent brought in by an adult child; a patient attending multiple times for vague symptoms. These often signal something beyond the presenting complaint.
Ask about context, not just symptoms.
A principle from UK GP training educators that trainees consistently describe as transformative: in the first half of the consultation, your job is primarily to help the patient get their story out, not to fill silence with medical questions. This means tolerating brief pauses, using open questions, and following the patient's lead. Trainees who understand this principle tend to pick up far more cues — and patients feel far more heard.
⏱️ The Micro-Summary — One of the Highest-Impact Consultation Habits
Pennine GP Training tutorials highlight the micro-summary as one of the most reliable ways to improve both consultation quality and patient satisfaction scores. It differs from a full closing summary: it is a brief, mid-consultation check at around minutes 4–5.
Briefly recap the patient's key issues — especially their worries, context, and expectations — and check:
"So from what you've told me, the pain has been there for three weeks, it's been getting worse, and your main worry is that it might be something more serious — is that about right?"
Trainees often list every symptom rather than reflecting the patient's priorities — worries, hopes, and context. Keep the micro-summary to under 30 seconds, focused on what matters most to the patient, not what you have gathered clinically.
Why it matters — 4 effects
- Demonstrates genuine listening — patients feel heard before the management phase begins
- Prevents the "minute-nine reveal" — the devastating moment when a patient says "Actually, the main thing I wanted to discuss was..." with 60 seconds left
- Builds rapport and trust before you move into your plan — directly improving Q9
- Allows the patient to correct you — if you've misunderstood something, this is the moment to catch it rather than after you've explained a plan they didn't need
Practise the micro-summary in real consultations, not just in role-play — it needs to feel natural before patients are rating you on it. It will feel slightly awkward the first few times; it stops feeling awkward very quickly.
After outlining your management plan, add this single question before finalising the decision:
"Is there anything about that plan you'd like to change, or anything that wouldn't work for you?"
This question dramatically increases patients' sense of participation in decisions (Q4) because it gives explicit permission to push back — and patients feel involved even when they have nothing to change. It also protects against the common failure of developing a clinically excellent plan that doesn't fit the patient's actual situation or preferences.
📚 The recommended communication skills resource
The Bradford VTS has long recommended Skills for Communicating with Patients by Silverman, Kurtz, and Draper (also known as the "yellow book") as the gold standard text for developing the consultation skills that drive PSQ scores. It covers:
- The Calgary-Cambridge consultation model
- Gathering information effectively
- Building the relationship and rapport
- Providing structure and explanation
- Shared decision-making in practice
The biggest gains in PSQ scores come from deliberate practice in real consultations — not from reading about it. The most effective combination is:
- Understand the skill (reading, tutorials)
- Observe a skilled clinician demonstrate it
- Practise it yourself in consultations
- Get feedback (COT, debrief with trainer)
- Reflect and adjust
🆚What Good Looks Like — Domain by Domain
For each PSQ domain, here is the practical difference between a consultation that would score poorly and one that would score well. These are based on real consultation patterns — not theoretical ideals.
Read across each row and ask honestly: which column better describes what you typically do? The right column is not about perfection — it is about consistent, deliberate habits that make a real difference to how patients experience the consultation.
| PSQ Domain | ❌ Likely to score poorly | ✅ Likely to score well |
|---|---|---|
| Q1 — Relaxed & welcome | Calling in without greeting; immediate attention to the screen; no eye contact on entry; abrupt or rushed manner from the first moment | Standing or turning to greet the patient; using their name; warm and unhurried manner; brief human connection before diving into the problem |
| Q2 — Listened to | Interrupting within 20–30 seconds; typing while the patient speaks; asking closed questions before the patient has told their story; visibly distracted | Allowing the patient to speak without interruption; eye contact while they talk; brief reflecting back of what was said; comfortable silence when appropriate |
| Q3 — Clear explanation | "Your LFTs are mildly deranged — we'll monitor" (to a non-medical patient); long technical monologue; no checking of understanding; information dumped at the end in a rush | "The blood tests show your liver is slightly inflamed — nothing alarming, but worth keeping an eye on. Let me explain what that means…"; chunk, explain, check — then repeat as needed |
| Q4 — Involved in decisions | "I'm going to start you on metformin" (no discussion of options or preferences); or conversely, "I'll leave it entirely up to you" (unhelpful abandonment of clinical guidance) | "We have a couple of options here. I'd suggest X because… but your thoughts matter — what feels right to you, and is there anything that would put you off either option?" |
| Q5 — Confidence in the plan | Vague or hesitant recommendation with no rationale; excessive hedging that leaves the patient uncertain; plan changes three times in one consultation | Clear, confident recommendation with brief reasoning: "I'd recommend X because [rationale]. The evidence for this is good." Acknowledge uncertainty where it genuinely exists — but be specific about what is and isn't known |
| Q6 — Knows what happens next | "I'll refer you" (no further detail); "We'll see how it goes" (no timeline); consultation ends without a clear next step; patient has to ask "so what do I do now?" | "I'm referring you to the dermatology team — you'll receive a letter within around four weeks. In the meantime, continue the cream. If things change significantly before that, ring us." |
| Q7 — Knows what to do if worse | "Come back if worried" (patient doesn't know what 'worried' should feel like); no specific symptoms mentioned; no escalation pathway given; safety-netting bolted on as an afterthought in the final five seconds | "If you develop severe pain, or difficulty breathing, or the rash spreads rapidly — go to A&E straight away. If it's not improving but not those things — call us in five days. Do you have a sense of what to watch out for?" |
| Q8 — Respect & dignity | Referring to the patient as "the diabetic in Room 2"; making assumptions about lifestyle without asking; being dismissive about a concern the patient clearly found significant; not knocking before entering during an examination | Using the patient's name and preferred pronouns; treating their concerns as real even when they are not clinically serious; asking before examining; acknowledging the human dimension of a diagnosis or change in management |
| Q9 — Trust | Any combination of the above — trust is the cumulative product of everything else; a single consultation that scores poorly across multiple domains will tend to score low on trust too | Honesty when you don't know: "I'm not certain about that — let me find out for you"; consistency between what you say and what you do; genuine interest in the patient as a person, not just their problem |
📞The PSQ & Telephone / Remote Consultations
The PSQ explicitly applies to telephone and online consultations as well as face-to-face ones. With a significant proportion of GP consultations now being remote, this is an increasingly important area — and one where the communication challenges are subtly different.
In a telephone consultation, you lose the most powerful rapport-building tool you have: non-verbal communication. No eye contact, no posture, no facial expression, no nodding — just voice. This places an even greater premium on tone, pace, phrasing, and active listening cues. Patients are highly attuned to whether a doctor sounds hurried, distracted, or genuinely present — even without seeing their face.
Per-domain telephone tips
Q1 — Making patients feel relaxed and welcome (phone)
- Your opening words set the tone for the entire call — "Hello, is that [name]? This is Dr [name] calling from [practice]. Is now still a good time to talk?"
- Acknowledge that phone consultations are different: "I know it's not quite the same as being face to face, so please take your time"
- Check they can talk freely: "Are you somewhere private where you can talk?"
- Avoid starting with closed questions — invite the patient to tell their story
Q2 — Listening (phone)
- Verbal listening cues become critical: "I see", "go on", "mm-hmm", brief reflections
- Without visual cues, patients cannot tell you are listening unless you signal it verbally
- Don't type loudly on the keyboard while the patient is speaking — it is audible and feels dismissive
- Brief summaries matter even more: "So from what you've said, the main things are X and Y — is that right?"
Q3 — Explaining clearly (phone)
- You cannot use diagrams, hand gestures, or pointing at things — explanations must be purely verbal
- Pace your explanations more slowly than face-to-face — without visual anchoring, patients process spoken information more carefully
- Signpost more explicitly: "I'm going to explain three things. First…" — structure helps on the phone where there are no visual cues to follow
- Follow up with written information where possible — "I'll send a message through the practice system with the key points"
- Check understanding more explicitly — on the phone you cannot see a confused expression
Q7 — Safety-netting (phone)
- Safety-netting is even more important on the phone — you cannot visually assess the patient and the stakes of a missed escalation are higher
- Be very specific about named symptoms and exactly what to do: "If you develop [X], call 999. If it's [Y], call 111. If it's just not improving, call us in three days."
- End by checking: "Just to make sure I've been clear — what would you do if [X]?"
- Offer a safety net for the uncertainty of remote consulting itself: "And if there's anything you felt you couldn't describe clearly on the phone, please don't hesitate to ask for a face-to-face review"
Logistics: PSQ for telephone patients
- Electronic PSQ links sent via the practice messaging system work well for telephone patients — they can complete it at their convenience after the call
- The message must come from the practice team, not from your personal account
- Consider your practice's messaging platform — some patients may not engage with certain systems; know what your practice uses
- You can combine telephone and face-to-face patient responses in the same PSQ cycle
- Be aware that response rates for electronically sent PSQ links can be lower than for paper forms given in person — plan accordingly
Trainees who do a significant proportion of telephone PSQ consultations often find scores on Q2 (feeling listened to) and Q3 (clear explanation) are slightly lower than for face-to-face — simply because the communication demands are higher without visual cues. If you notice this in your results, it is a telephone consultation skill issue, not a relationship problem. Targeted practice with verbal listening cues is the fix.
| Face-to-face | Telephone equivalent |
|---|---|
| Eye contact | Verbal affirmations ("I see", "go on", "mm") |
| Nodding | Brief reflections ("That sounds really difficult") |
| Open body posture | Warm, unhurried tone of voice |
| Pointing at a diagram | Clear verbal signposting ("First… then… finally…") |
| Facial expression of concern | Explicit empathy in words ("That must have been worrying for you") |
| Checking patient's expression for confusion | Actively asking "Does that make sense? Is there anything I can explain more clearly?" |
| Written leaflet handed over | Message sent via practice system with key points in writing |
🧠The Psychology of Patient Satisfaction — What the Evidence Says
Patient satisfaction is not just a subjective feeling — it is a measurable construct with a well-developed evidence base. Understanding what actually drives it (and what doesn't) is genuinely useful for any doctor, not just trainees completing a PSQ.
What the evidence shows drives patient satisfaction
- Being listened to — consistently the most important single factor in patient experience research. More important than clinical outcome in many studies.
- Clear, understandable explanations — patients who understand their diagnosis and treatment plan are more satisfied, more adherent, and have better health outcomes
- Feeling involved in decisions — patients who feel involved report higher satisfaction even when the clinical outcome is the same or worse
- Time perception, not actual time — the sensation of being hurried (whether or not the consultation was genuinely short) significantly reduces satisfaction scores
- Empathy and genuine warmth — patients distinguish authentic connection from performed warmth; the former increases trust, the latter can actually reduce it
What does NOT predict satisfaction as much as doctors assume
- Consultation length alone — a well-structured 8-minute consultation can score higher than an unfocused 15-minute one
- Clinical accuracy — patients often cannot assess whether a diagnosis is technically correct; they assess how they were treated in the process of reaching it
- Giving patients what they asked for — patients are often satisfied when you decline a request (e.g. antibiotics) if you explain clearly and empathically why, and offer an alternative plan
- Years of experience — trainees often score comparably to established GPs in patient satisfaction; the extra time and attention trainees give can compensate for experience gaps
- Demographics of the doctor — research consistently shows no significant overall differences in satisfaction by the doctor's age, gender, or ethnic background
Patients who are more satisfied with their consultation are more likely to follow the agreed plan. This is not just an administrative metric — it has direct clinical consequences. A patient who felt heard and understood is more likely to take the medication, attend the follow-up, and return early if things deteriorate. Good PSQ scores and better patient outcomes are not coincidental: they flow from the same consultation behaviours.
Research shows a strong "halo effect" in patient satisfaction surveys: patients who feel warmly about you overall tend to rate you highly across most domains, even if specific aspects were imperfect. Conversely, patients who feel unheard or dismissed at any point in the consultation tend to give lower scores across the board, even for things that actually went well. This is why Q1 and Q2 — the first impressions and listening questions — disproportionately influence the overall pattern of results.
Michael Balint described the doctor themselves as "the most powerful drug in the pharmacopoeia." The therapeutic relationship — the experience of feeling cared for by a trusted, warm, competent clinician — has measurable clinical benefit independent of any prescription. The PSQ is, in essence, a patient-reported measure of the potency of that relationship. High PSQ scores are not vanity metrics: they are evidence that the therapeutic relationship is working.
🔄Step-by-Step Process
The PSQ involves careful logistics. Getting the process wrong can invalidate the results or cause unnecessary anxiety. Here is exactly what to do, in order.
📅 Timing: When to do it
- Completed once during your GP training career (in ST3)
- Recommended after the midway point of your ST3 placement
- Not too early (you need to have settled in) and not so late you run out of time before your final ARCP
- Agree the timing with your Educational Supervisor at the start of ST3
- Factor in time for your admin to enter data and your ES to review results before your next ES meeting
Plan backwards from your final ARCP date. You need time for: data collection → admin entry → ES review → ES discussion. Allow at least 4–6 weeks from starting to sitting down with your ES to discuss. Red Whale's WPBA programme directors recommend launching around month 7 of ST3 (February for August starters) and closing the PSQ at least six weeks before ARCP.
If you use paper forms, an administrator enters the data using a ticket code from FourteenFish. That ticket code has an expiry date. If you start the PSQ too late and distribution spills into the final weeks of ST3, the code may expire before your admin has finished data entry — and that means starting the entire PSQ again. Always check the expiry date of your admin ticket code when you set up the PSQ, and give it to your admin in writing with the expiry date clearly highlighted. This problem is entirely avoidable with early planning — and entirely disruptive when it isn't.
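The backward-planning rule can be sketched as a simple date calculation. The ARCP date and buffer sizes below are illustrative assumptions, not official deadlines; agree real dates with your Educational Supervisor:

```python
from datetime import date, timedelta

# Hypothetical ARCP panel date -- substitute your own.
arcp = date(2025, 6, 16)

# Rule of thumb from the guidance: PSQ closed and reflected on at least
# 6 weeks before ARCP, and roughly 6 weeks from launch to the ES
# discussion (collection, admin entry, ES review). Illustrative only.
reflect_by = arcp - timedelta(weeks=6)        # reflection uploaded by this date
launch_by = reflect_by - timedelta(weeks=6)   # latest sensible launch date

print(f"Launch PSQ by:      {launch_by}")
print(f"Close & reflect by: {reflect_by}")
print(f"ARCP panel:         {arcp}")
```

For an August-start ST3 with a mid-June ARCP, this lands the launch in late March at the very latest, which is why the month-7 (February) recommendation leaves a comfortable margin for ticket-code expiry and slow paper returns.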
🔁 Electronic vs Paper
| Method | How | Best for |
|---|---|---|
| Electronic (preferred) | Send patient a link via practice messaging system (NOT from your personal account) | Telephone consultations; modern practices with online messaging. No ticket code. No data-entry burden. |
| Paper | Receptionist hands form to patient before appointment; collected in sealed box at reception | Practices with older systems; patients less comfortable online. Note ticket code expiry risk. |
| Combined | Both paper and electronic can be combined in the same PSQ cycle | Getting to 34 faster |
📅 Recommended ST3 PSQ Timeline — Month by Month
Based on Red Whale WPBA guidance and programme director advice from multiple deaneries. Adjust start and end months for your specific rotation — this assumes an August start to ST3.
| ST3 Month | Approximate month | Recommended PSQ action |
|---|---|---|
| Months 1–4 | Aug – Nov | Focus on COTs and CbDs; actively incorporate consultation skills feedback; build consultation habits that will drive good PSQ scores |
| Months 5–6 | Dec – Jan | Review COT feedback with trainer — identify any domains consistently scoring below expected; these are exactly the PSQ domains to work on before distributing forms |
| Month 7 ⭐ | February | Ideal time to launch PSQ — per Red Whale and programme director guidance. Agree the date with your ES. Set up FourteenFish, complete self-assessment, brief receptionist, note ticket code expiry if using paper |
| Month 8 | March | Check response count periodically; follow up with admin/reception if behind target; switch to electronic if paper uptake is low |
| Month 9 | April | Close PSQ once 34+ fully completed responses reached; notify ES; confirm all paper forms entered before closing |
| Month 10 | May | ES reviews results and adds comments; arrange dedicated PSQ feedback meeting with ES; discuss results using the 6-comparison framework |
| Month 11 | June | Upload structured reflection to FourteenFish ePortfolio; link to Communication & Consultation Skills capability; complete any PDP actions agreed with ES before final ARCP |
⭐ If your ARCP is earlier than August, adjust all months back accordingly. The key rule: PSQ should be closed and fully reflected upon at least 6 weeks before your ARCP panel date.
Step 1 — Log into FourteenFish and click on "Patient Feedback"
Navigate to the PSQ section and complete your self-assessment first — scoring yourself on the same 9 domains before you see any patient data. This is mandatory and must be completed before any patient responses are collected.
Step 2 — Brief your receptionist or practice manager
You should not hand out questionnaires yourself. Give the forms (or instructions for the electronic link) to your receptionist, who should hand them to consecutive patients irrespective of their likelihood of responding — no selection bias. Suggest preparing around 50 paper copies so that, allowing for non-returns, you comfortably reach the minimum of 34.
Step 3 — Patients complete and return forms — not to you
Completed paper forms should be handed to reception or left in a sealed box at reception. For electronic, patients follow the link and submit directly to the system. Either way, responses must remain anonymous — you must not be able to identify individual patients or see individual responses.
Step 4 — A practice administrator enters paper data — not you
Invite an administrator via FourteenFish (scroll to bottom of the PSQ survey page for instructions). They can enter paper form data using a provided ID and password. Electronic forms are already captured automatically. You may remind patients to complete the form before leaving but keep it brief and neutral.
Step 5 — Close the PSQ once 34+ responses are entered
You'll receive an email notification when the minimum is reached. Go to your survey in FourteenFish and click "Close survey" in the "Your progress" section. This sends results to your Educational Supervisor. Before closing, confirm with your admin that all paper forms have been entered — a closed PSQ cannot be reopened.
Step 6 — ES reviews results and adds their comments
Your ES receives a notification, reviews the scores and free-text comments, and adds their observations to the Portfolio. They will compare your scores against your self-assessment and against peer norms before you meet.
Step 7 — Arrange a dedicated meeting to discuss results with your ES
This is a protected feedback conversation — not a rushed five-minute add-on to another meeting. Come prepared with your own reflections. Your ES will release the results to you, and you discuss together.
Step 8 — Write a reflective learning log entry in FourteenFish
Upload your structured reflection on the feedback. Use the Bradford VTS reflection template (in the downloads section above). This evidences the Communication & Consultation Skills capability and contributes to your ARCP evidence. A good reflection goes beyond describing what the scores were — it should show what you will start, stop, continue, or change.
- You must not hand questionnaires directly to patients
- You must not collect completed forms from patients yourself
- You must not enter the data into the ePortfolio yourself
- Electronic messages with the PSQ link must not be signed by you personally — send from the team or the practice
- These rules exist to protect the anonymity of patients and ensure valid, unbiased feedback
📋 Admin & Logistics — Practical Tips That Make the Difference
Getting good PSQ results is 50% consultation skill and 50% organising the process well. These practical logistics points are consistently highlighted by trainees who wish they had known them earlier:
- Avoid bank holiday weeks — response rates drop significantly when surgery is disrupted and reception is stretched
- Avoid periods of staff shortage — if reception is understaffed, briefing gets forgotten and forms don't get handed out consistently
- Aim for a stable, routine clinic week where your lists are representative and the practice is running normally
- Starting during a high-pressure period (post-bank holiday backlog, locum cover, etc.) often results in lower response rates and less representative results
Don't assume the responses are accumulating — check your FourteenFish survey page every few days. Trainees who leave tracking to the end sometimes discover they only have 18 responses with two weeks to go. The FourteenFish portal shows a live count for electronic responses. For paper, ask your admin contact for a regular update.
A brief, in-person conversation with the person handling your forms makes a significant difference. Key things to say explicitly:
- "Please give a form to every patient coming to see me — not just the ones you think will respond well."
- "It doesn't matter if they seem busy or rushed — just hand the form over."
- "If you're ever unsure, the answer is always: give them one."
Before clicking "close survey", confirm with your admin that all paper forms have been entered and that the count looks right. Last-minute data entry errors are easier to fix before you close than after. A closed PSQ cannot be reopened — so a 5-minute check is worth it. Also verify that incomplete paper forms (questions left blank) have been excluded from the count.
✅ What Good PSQ Preparation Actually Looks Like — A Practical Checklist
Synthesised from UK GP training resources, Pennine training tutorials, and community insights. Trainees who prepare this way consistently describe better scores and richer reflections than those who simply launch the PSQ and hope for the best.
Before you launch the PSQ
- Review your COT feedback first — your COT scores are the best available proxy for how patients experience your consultations. If any COT domain consistently scores below expected (e.g. listening, explaining, involving the patient), those are the exact PSQ domains to work on before distributing forms
- Ask your trainer to focus COT observations on PSQ-relevant domains — specifically: warmth and welcome, listening, involvement in decisions, safety-netting. This gives you targeted, actionable feedback in the weeks before you start
- Record yourself consulting — watch the recording, even without sound, to notice body language, screen time, and pacing you weren't aware of during the consultation. Trainees are consistently surprised by how much they look at the screen
- Ask your trainer to specifically observe your opening 90 seconds — this is the period when patients form their primary impression of you. Most COT observations focus on the history and management; few specifically focus on the opening
During the PSQ collection period
- Do a "cue audit" with your trainer — after reviewing a recorded consultation, ask together: "How many cues did the patient give that I didn't acknowledge?" This is one of the most revealing exercises for trainees who are technically competent but missing the emotional layer of consultations
- Practise the micro-summary in real consultations — not just in role-play; it needs to feel natural before patients are rating you. It will feel slightly awkward the first few times and then stop feeling awkward entirely
- Focus on your consultation endings — the closing two minutes disproportionately influence PSQ scores (Q6 and Q7). Consciously slow down for explanation and safety-netting
Read the 9 PSQ questions. Put them somewhere you'll see them. Consult with them in mind. This takes 2 minutes and is still the most commonly overlooked preparation step.
🔍The Self-Assessment — Why It Matters
Before any patient data is collected, you score yourself on the same 9 domains — rating how well you think you perform in each area. This is the self-assessment, and it is a mandatory first step.
It might feel like an afterthought, but the self-assessment is actually one of the most educational parts of the whole PSQ process. The gap between how you rate yourself and how patients rate you is often where the richest learning happens.
Your ES will compare your self-scores to patient scores for each domain. A large gap in either direction is informative: it may reveal limited self-insight or a specific skill gap — and both are worth exploring in your ES meeting.
Three possible patterns to be aware of
- Well-calibrated — you see yourself as patients see you. Good self-insight.
- You rate yourself higher than patients rate you — possible blind spots in how you come across. Don't be defensive — this is useful data.
- Patients rate you more highly than you rate yourself — you may be overly self-critical. Confidence building may be a learning need.
It's tempting to rate yourself highly to appear confident, or to rate yourself low to look humble. Neither serves you. Give your genuine, considered opinion of your own performance in each area. That honest self-appraisal is what makes the comparison with patient scores genuinely educational.
📊Understanding Your Results
Once your ES has reviewed and released your results, you will see a statistical summary for each of the 9 questions. Here is how to make sense of what you are looking at.
What the results show you
- Mean — the average score across all responses for each question
- Median — the middle value; less affected by extreme responses (often most useful)
- Range — spread between the lowest and highest scores
- Peer comparison — how your scores compare to other GP trainees nationally
- Self vs patient — your self-assessment compared to patient scores
- Free text comments — anonymous written feedback from patients
How to interpret what you see
- Focus on the median — it is the most robust measure
- A narrow range = consistent responses; a wide range = mixed opinions
- Compare each domain to the peer median — this contextualises your scores
- Specific low domains are more actionable than an overall average
- Read free-text comments carefully — patterns in the text often explain the numbers
- One very negative comment doesn't define you — look for recurring themes
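The advice to focus on the median rather than the mean is easy to see in a quick sketch (hypothetical scores, not real PSQ data): a couple of outlier responses pull the mean down while leaving the median untouched.

```python
from statistics import mean, median

# Hypothetical scores for one PSQ domain (1-5 scale): mostly 5s,
# a few 4s, and two outlier 1s from difficult consultations.
scores = [5] * 30 + [4] * 8 + [1, 1]

print(f"mean:   {mean(scores):.2f}")           # 4.60, dragged down by outliers
print(f"median: {median(scores)}")             # 5.0, unaffected by outliers
print(f"range:  {min(scores)}-{max(scores)}")  # 1-5, wide range flags outliers
```

This is why a wide range alongside a healthy median usually signals one or two difficult consultations rather than a consistent problem.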
What your ES is looking at — the 6-point framework
| # | Comparison | Why it matters |
|---|---|---|
| a | Your score vs your self-assessment | Measures self-insight — are you aware of how you come across? |
| b | Your score vs peer scores nationally | Context — are you performing at, above, or below the expected level? |
| c | Your score vs any previous PSQs | Trajectory — are you improving over time? |
| d | Free text comments — themes | Often the richest source of specific, actionable feedback |
| e | Scores in specific domains + other evidence | Does a low score on "listening" align with COT or CBD observations? |
| f | What you will stop, start, continue, or change | The action plan that makes this educational, not just administrative |
Free text comments are anonymous and can sometimes feel raw to read. Resist the urge to dismiss critical comments or over-identify with positive ones. The most useful approach is to look for patterns — if three separate patients independently mention the same thing, that is signal worth acting on. One isolated harsh comment, however, may just reflect a difficult consultation dynamic and nothing more.
⚠️ Three score interpretation pitfalls to avoid
Fixating on whether your mean is 4.2 vs the peer mean of 4.5 misses the point. What matters is relative weakness within your own profile.
✅ Better: Which domain scored lowest for you? That is where your development lies.
Numbers tell you what scored lower. Free-text comments tell you why. Trainees who skip the comments miss the most actionable part of the whole report.
✅ Better: Read every comment. Look for repeated themes. Comments > numbers for learning.
A single outlier response — from an angry patient, a difficult consultation, or just someone having a bad day — does not define your practice.
✅ Better: Look for patterns across multiple responses before drawing conclusions.
🪞Receiving Feedback Well — The Emotional Dimension
Nobody warns you quite how strange it feels to read patient feedback about yourself for the first time. This section is here precisely because almost no training resource addresses this directly — and yet it genuinely matters.
What trainees typically feel on first reading PSQ results
- Relief — scores are better than feared; patients actually like you
- Surprise — both positive and negative; "I didn't realise patients noticed that"
- Defensiveness — especially on specific comments that feel unfair or contextually misrepresented
- Vulnerability — reading free text comments can feel very personal, especially early in training
- Curiosity — what does this tell me that I couldn't see from the inside?
- Imposter syndrome amplification — lower scores can briefly intensify existing self-doubt
All of these are normal. They do not mean you are struggling — they mean you are human.
A framework for receiving feedback constructively
- Pause before responding — don't react to results immediately; let the initial emotional response settle before trying to analyse
- Look for patterns, not outliers — one strongly positive or negative comment is less meaningful than a theme that appears in multiple responses
- Separate your identity from your scores — your PSQ scores describe behaviours in specific consultations, not your value as a person or a doctor
- Ask "what can I learn?" — even from feedback that feels unfair, there is usually something worth exploring
- Discuss it — don't sit with it alone — share your reaction with your trainer, ES, or a trusted peer before writing your formal reflection
When to seek extra support
- You find yourself thinking about specific patient comments repeatedly and cannot set them aside
- Lower-than-expected scores significantly increase anxiety about clinic or about the PSQ process itself
- You are reluctant to discuss results with your trainer or ES
- You are modifying your practice in ways that feel inauthentic or defensive rather than genuinely developmental
If any of these apply, speak to your ES, your VTS pastoral lead, or your occupational health service. These responses are more common than trainees realise — and support is available.
The most educational PSQ feedback is almost always the feedback that initially feels hardest to receive. If every score were perfect and every comment glowing, there would be nothing to learn. The specific areas where patients rate you lower — even if it stings — are the areas where your development will have the most real-world impact on the people you are caring for. That is not a platitude: it is the whole point of the exercise.
The PSQ and the long game — revalidation
As a qualified GP, you will continue to seek patient feedback as part of GMC revalidation — typically as part of your five-yearly revalidation cycle through the GP appraisal process. The NHS GP Patient Survey, practice-level feedback tools, and other patient experience instruments will be part of your professional life indefinitely.
The habits you build now — genuinely listening, explaining clearly, safety-netting explicitly, involving patients in decisions — are not just PSQ preparation. They are the habits that will define your practice for your entire career. The PSQ is a useful early milestone, but the real goal is the consultation style that follows from taking it seriously.
💭Reflecting on Your PSQ Feedback
Reflection on PSQ results is where the real learning happens. A good reflection moves beyond "my scores were X" and asks: what does this tell me, and what will I do differently?
The Stop/Start/Continue/Change model
This is one of the most practical reflection frameworks for PSQ feedback. For each domain where scores stand out — positively or negatively — ask:
- e.g. "Stop checking the computer while patients are talking to me"
- e.g. "Start explicitly asking 'Does that all make sense?' before closing"
- e.g. "Continue my warm greeting — patients consistently rate this highly"
- e.g. "Change my safety-netting approach — be more specific about what symptoms to look out for"
Writing your ePortfolio reflection
A meaningful PSQ reflection in FourteenFish should address all of the following:
- Brief description — when you did it, how many responses, overall picture
- Key findings — highlight the areas of strength and the areas scoring below your expectations or the peer median
- Self-assessment comparison — where did your view align with patients? Where did it differ?
- Themes from free text — what did written comments reveal?
- Learning needs identified — be specific; "improve communication" is too vague
- Actions taken or planned — what will you actually do? (Reading, a tutorial, practising a skill in clinic, a COT focused on listening)
- RCGP capability links — Communication & Consultation Skills at minimum; consider Person-Centred Care and Professionalism
Writing "I will do more reading" as your action plan is not sufficient and will frustrate your ES. Instead: "I discussed my PSQ results with my trainer and we agreed to focus on safety-netting in my next three COTs. I also attended the Bradford VTS communication skills session and identified two phrases I will use to check patient understanding before closing." Specific and evidenced wins every time.
| Stage | Question | In the PSQ context |
|---|---|---|
| Description | What happened? | Describe the PSQ process, timing, and response numbers |
| Feelings | What were you thinking and feeling? | Honest reaction to your results — surprise, reassurance, discomfort? |
| Evaluation | What was good and bad? | Where were scores strong? Where were they weaker? |
| Analysis | What sense can you make of it? | Why might scores be lower in specific areas? What context is relevant? |
| Conclusion | What else could you have done? | What specific consultation behaviours would have improved scores? |
| Action Plan | What will you do next? | Concrete next steps — be specific and achievable |
🔬 The 4-Step Reflection Framework — What Trainers Actually Want to See
Alongside Gibbs, trainers consistently describe a sharper, more clinical reflection model that is particularly effective for PSQ feedback. This four-step approach moves directly from data to behaviour change — exactly what ARCP panels are looking for:
| Step | Question | Example |
|---|---|---|
| 1. Identify the pattern | What did the data show? | "Scores were lower on 'explaining treatment clearly' compared to my other domains" |
| 2. Interpret the meaning | What does this suggest about my practice? | "I may be overestimating how well patients understand my explanations" |
| 3. Link to behaviour | Which specific behaviour is driving this? | "I tend to give long explanations without pausing to check understanding" |
| 4. Action plan | What will I do differently, specifically? | "I will use chunk-and-check and teach-back in every consultation for the next four weeks" |
By contrast, weak reflections look like this:
- "My scores were good" — no analysis
- "Patients were satisfied" — no learning
- "I will read more about communication" — no specificity
These score poorly in supervision and carry no evidential weight for ARCP.
Strong reflections move through all 4 steps and end with a concrete, named, time-bound action. Reflection = evidence of behaviour change — not description of a score. The ARCP panel cannot see your scores improving — but they can see you demonstrating insight and a commitment to developing.
📉What If My Scores Are Low?
Lower-than-expected PSQ scores are not a disciplinary matter, and they do not fail you outright. They are educational data — a learning opportunity that the WPBA process is specifically designed to surface. Most trainees who receive lower scores feel anxious at first, but most also find the subsequent discussion and targeted skill development genuinely helpful for their practice.
What happens next
- Your ES will discuss the results with you in detail — this is a supportive conversation, not a tribunal
- Together, you'll identify the specific domains where scores were low and why
- A learning plan and specific actions will be agreed — potentially including a focused tutorial on communication skills, targeted COTs, or additional reading
- If scores are significantly below the peer average, the RCGP recommends repeating the PSQ at a later date to demonstrate progression
- Improvement over time is viewed positively — it shows insight and a commitment to developing
Common causes of lower scores
- Running very late — patients feel rushed before they've even come in
- Spending too much time on the computer or notes during the consultation
- Not exploring patient concerns (ICE) adequately
- Explanations that are technically accurate but incomprehensible to the patient
- Inadequate or vague safety-netting at the end of consultations
- Not explicitly involving the patient in decisions — being subtly paternalistic
- Language barriers (where relevant) without adequate adjustments
- Certain consultation types (e.g. very brief admin reviews) that make questions like Q6/Q7 "not applicable"
The PSQ captures a snapshot in time. If you are going through a particularly difficult period personally, or if you have recently changed practice or adjusted to a new clinical environment, this may affect your scores. Mention this to your ES as context — it does not excuse scores, but it helps interpret them fairly. Most ESs are human beings who understand that doctors are human beings too.
⚠️Common Pitfalls & Trainee Traps
These are the mistakes that come up most often. Read them once and you'll probably never make them.
- Leaving it too late in ST3 — not allowing enough time for data collection, admin entry, ES review, and discussion before your final ARCP
- Handing forms out yourself — even if it's faster, this breaches anonymity rules
- Forgetting to do your self-assessment first — it must be completed before any patient responses arrive, or the self-vs-patient comparison loses its value
- Not briefing your receptionist properly — resulting in inconsistent or selective distribution
- Sending the electronic link signed in your own name — patients may feel they're giving direct personal feedback, undermining anonymity
- Selecting "easier" patients — subconsciously avoiding rushed, complex, or difficult consultations creates biased feedback; trainers and ARCP panels can often spot an unrealistically clean score distribution. Always use true consecutive sampling — include late patients, language barrier consultations, and complex multi-morbidity. A credible PSQ includes imperfect consultations.
- Treating the PSQ as a tick-box — completing it, uploading it, and moving on without meaningful reflection is the most common way trainees undermine its educational value. The PSQ is not about your score. It is about what you learn from it and what you change as a result.
- Dismissing low scores without exploring why — "patients just don't understand what I was trying to do"
- Over-identifying with isolated negative comments — one harsh comment in 40 responses may not represent a pattern
- Writing a surface-level ePortfolio entry — describing scores without analysis or action plan
- Focusing only on positive scores — good scores are encouraging but the learning is in the lower ones
- Not linking to the RCGP capabilities — your reflection counts as capability evidence; don't miss this opportunity
- Using generic action plans — "I will read more about communication skills" is not enough
Trainees most frequently score lower than expected on questions 6 and 7 — "Do you know what will happen next?" and "Do you know what to do if things get worse?" This usually reflects insufficiently explicit safety-netting and follow-up planning, rather than a fundamental relationship problem. It is also the easiest area to improve: build a deliberate habit of explicitly asking "Do you know what to watch out for?" at the end of every consultation.
💎Insider Pearls — Real-World Wisdom
Trainees who look at the 9 PSQ questions before starting their PSQ cycle — and genuinely think about each domain in their consultations for the weeks leading up to it — tend to score significantly better than those who just run the PSQ without preparation. Familiarity with the domains is not gaming the system; it is exactly the purpose of the exercise.
The question that catches trainees out most reliably is Q7 — safety-netting. Not because trainees don't safety-net, but because their safety-netting is often generic ("come back if it gets worse") rather than specific enough for patients to actually follow. Specific, named symptoms and clear escalation pathways are what patients remember and report positively.
The single most impactful thing a trainee can do to improve PSQ scores is: look at the patient, not the screen, while they are speaking. This one habit shifts scores on Q1, Q2, Q4, and Q9 simultaneously. It is remarkably hard to do consistently when you're also trying to gather a history efficiently — but when you get it right, patients notice.
You might assume longer consultations = higher PSQ scores. In reality, patients are often more satisfied with shorter, well-structured consultations where they felt heard and left with a clear plan — than with longer consultations that were vague and unfocused. It is the quality of attention, not the duration, that patients respond to.
Free text comments in the PSQ can feel very personal, and some trainees find them emotionally difficult to read — especially early in training. It can help to read them with your trainer or ES present for the first time, so you are not alone with the initial reaction. The learning value in the comments is often highest when you approach them with curiosity rather than defensiveness.
Research suggests that IMGs and trainees from non-White British backgrounds are actually more positively disposed towards WPBA, including the PSQ — and many are pleasantly surprised by their results. If you are an IMG and are anxious about patient responses, remember that patients generally respond to warmth, listening, and clarity — which transcend cultural background. Language barriers in consultations, where relevant, are worth noting in your reflection as contextual factors.
💎More Insider Pearls — Hidden Gems From the Consultation Room
The first 30 seconds of a consultation have a disproportionate influence on patient satisfaction scores. Patients form rapid impressions — warmth, attentiveness, whether the doctor seems genuinely present. Trainees who consciously invest in those opening moments — even just by turning to face the patient, using their name, and not immediately diving into the problem — consistently see this reflected in Q1, Q2, and Q9 results.
One practical technique that makes an immediate difference: when a patient finishes describing their main concern, turn slightly away from the screen, put down the keyboard if you've been typing, and make deliberate eye contact before asking your next question. This one physical gesture communicates more attentiveness than any amount of nodding while typing. Patients notice — and it shows in Q2 and Q9.
Closing with "Just to make sure I've explained that well — could you tell me in your own words what you're going to do?" feels slightly exposing the first few times, but patients almost universally respond positively. It shows you care whether they understood, not just whether you explained. Trainees who adopt this habit routinely see improvement in Q3 and Q6 scores — and find that it also dramatically reduces the "rang back with questions" callbacks that erode efficiency.
The most consistently effective safety-net takes two sentences: one for the "when to worry" symptoms, one for the "what to do" action. "If you develop [specific symptom], [specific action]. If it's just not improving, come back within [timeframe]." Patients can remember two sentences. A list of seven symptoms followed by "call us, call 111, or go to A&E as appropriate" is remembered by nobody — and scores accordingly on Q7.
Exploring a patient's Ideas, Concerns, and Expectations is taught as a consultation model — but it also directly maps onto PSQ questions 4 and 9. Patients who had their concerns explicitly explored tend to rate their involvement in decisions much higher, and their overall trust in the doctor much higher, even if the clinical plan is identical to what it would have been without exploring ICE. The act of asking "what was your main concern about this?" is itself therapeutic.
Asking "is there anything else you wanted to bring up today?" — with genuine openness, before closing — affects PSQ scores on Q4 (involvement) and Q9 (trust) noticeably. Patients who feel they had an opportunity to raise additional concerns, even if they didn't take it, feel more involved and more respected than those who felt the consultation ended when the doctor decided it was over. Timing matters: this question has to come before the clinical closing, not as the patient is leaving.
Running significantly behind clinic does not automatically tank your PSQ scores — but it changes the dynamic of every consultation that follows. Patients who have been waiting can sense a hurried or stressed doctor. A brief acknowledgement — "I'm sorry I'm running a little late — please don't feel rushed now you're here" — resets the emotional tone remarkably effectively. It also directly addresses Q1, often before the main consultation has even started.
Trainees sometimes, consciously or not, time their PSQ collection for periods when they are on their best form, which is entirely reasonable. What makes less sense is running it during your absolute worst (post-nights, illness, personal crisis). The PSQ is meant to capture representative practice. If significant contextual factors might have distorted your results, mention this to your ES; it is legitimate context for interpreting scores.
Many trainees are surprised — and pleased — to discover that patients often trust and rate trainees highly, sometimes more highly than they expected. There is evidence that patients value the additional time and attentiveness that trainees typically give. A trainee who spends twelve minutes in a genuine, well-structured consultation often outscores a busy established GP who spends seven minutes efficiently but without warmth. The PSQ is measuring something that time in the job does not automatically confer. That should be genuinely encouraging.
💡 "What I Wish I Knew" — Five Insights From Trainees Who've Been Through It
Even when the diagnosis is uncertain — which in GP is much of the time — a clear and honest plan scores well. Patients do not penalise uncertainty. They penalise vagueness. "I'm not entirely sure yet, but here's what we're going to do" consistently outscores a confident but confusing explanation of a definite diagnosis.
The closing two minutes of a consultation have a disproportionate influence on how patients score the whole encounter. Trainees who consult warmly and thoroughly for eight minutes, then rush the explanation and close abruptly, often see this reflected in Q3, Q6, and Q7. The end of the consultation is when patients form their lasting impression. Slow down, especially for the explanation and safety-net.
Rather than just telling patients what to do, adding a brief rationale makes a significant difference to Q5 (confidence) and Q9 (trust). Compare: "Take these tablets" vs "These tablets reduce inflammation, which should ease your symptoms within a few days." The second takes five extra seconds and dramatically improves how competent and caring the doctor seems. Patients feel more involved in their care when they understand the reasoning behind it.
Feeling genuinely heard matters most in chronic disease, medically unexplained symptoms, and mental health consultations. Patients who leave feeling properly understood — even if the clinical outcome is the same — score significantly higher than those who feel efficiently managed but not truly heard. Slowing down when the patient is describing what they're worried about, and reflecting it back specifically, is the single most reliable driver of the "listened to" domain.
A pattern consistently described across trainee experience: a consultation that is otherwise excellent is remembered negatively if the doctor spent significant time looking at the screen while the patient was speaking. Even one noticeable screen-focused moment during history-taking can reduce scores on Q2 (listening) and Q1 (welcome). The fix is simple: when the patient is speaking, turn to them. The notes can wait thirty seconds.
🔧 Three Practical GP Shortcuts That Reliably Improve PSQ Scores
🔧 The "30-Second Summary"
Before explaining your thinking or plan, summarise back what the patient has told you:
"So from what you've told me, you've had this pain for about three weeks, it's been getting worse, and you're most worried it might be something serious..."
Boosts Q2 (listening), Q3 (explanation), and Q9 (trust). Takes 30 seconds. Worth every one of them.
🔧 The "2-Option Explanation"
When presenting a management plan, frame it as options where appropriate:
"We can either monitor this carefully and see how it develops, or treat it now — let me explain what both involve and what I'd recommend."
Directly boosts Q4 (involvement in decisions) and Q5 (confidence in plan) — two of the domains trainees most often score lower on.
🔧 The "Closing Loop"
Close every consultation by explicitly looping back to the plan and safety-net:
"So to summarise what we've agreed today... and if things don't improve — what would you do?"
Strongly improves Q6 (next steps) and Q7 (what to do if worse) — two of the questions trainees most consistently score lower on.
🗣️What Trainees & Trainers Actually Say — Community Wisdom
This section gathers insights from UK GP training community discussions, deanery trainer-facing resources, regional VTS guidance, and published research involving trainees and trainers. These represent recurring themes and honest reflections from those who have been through the PSQ process — curated and verified against official guidance so that nothing here contradicts RCGP or GP educator advice.
💬 Recurring themes from trainees who have been through the PSQ
One of the most consistently reported experiences is the surprise trainees feel when they actually read the 9 PSQ questions for the first time — especially after they've already started collecting responses. Trainer-developed resources from multiple UK schemes now specifically advise trainees to familiarise themselves with all 9 questions before starting — not as coaching, but because knowing what patients are being asked fundamentally shapes how you consult in the weeks leading up to it. You cannot improve your safety-netting if you don't know patients are being asked about it.
A repeatedly mentioned practical difficulty across regional training scheme guidance is the challenge of actually getting sufficient completed responses within the window. Trainees report that electronic responses have lower return rates than paper (patients click away), that busy reception staff can forget or deprioritise the forms, and that certain session types (very brief admin reviews) yield patients less willing to complete a full questionnaire. The practical advice that consistently emerges: start earlier than you think necessary, brief your receptionist properly, and if possible run a mix of electronic and paper simultaneously.
Published research involving trainees consistently shows that many are genuinely surprised by how positively patients rate them. This reflects something well-documented in GP training: anxiety about the PSQ process is often worse than the results. Trainer-facing resources from multiple UK deaneries specifically note that trainees tend to be harder on themselves in their self-assessment than patients are in their ratings — making the self vs patient comparison one of the most instructive parts of the whole exercise.
Across trainer-facilitated feedback discussions and educational supervisor guidance, the consistent finding is that written comments — even when mostly positive — carry disproportionate emotional weight. Trainers are specifically advised to be present when trainees read their free text for the first time, or to create a protected space to discuss them. The educational value in written comments is high; the emotional impact can be significant. Knowing this in advance takes some of the sting out.
Timing pressure is one of the most frequently cited practical problems in PSQ completion. Regional training schemes specifically flag that trainees who start the PSQ in the final weeks of ST3 risk running out of time for the full cycle: data collection → admin entry → ES review → ES meeting → reflection → log entry. The guidance from multiple deaneries is consistent: agree the timing with your ES at your first ST3 supervision meeting, plan for at least 6–8 weeks from start to completion, and treat the PSQ as a planned clinical event, not a box to tick at the last moment.
Writing a descriptive rather than genuinely reflective log entry is a repeatedly documented learning failure across WPBA generally, and the PSQ reflection is no exception. Educational supervisors and ARCP panels consistently distinguish between descriptive entries ("my scores were X, the median was Y") and genuinely reflective ones that show analysis, insight, and a specific action plan. Trainer resources explicitly note that simply reporting the numbers, without engaging with what they mean, why they came out that way, and what concrete changes will follow, is considered an inadequate reflection — regardless of how good the scores were.
📊 What published research tells us (relevant to trainees)
Patients in training practices rate their doctors more highly
A large national study using GP Patient Survey data (published in the British Journal of General Practice) found that patients registered with GP training practices rated the doctor-care questions significantly higher than patients at non-training practices — and this effect held specifically for trainees, not other staff. The patient satisfaction advantage associated with training practices was consistent across multiple metrics.
What this means for you: patients in training environments are often positively predisposed to the extra care and time they receive from trainees. Your PSQ is not being completed by patients who are disappointed to see a trainee instead of a "real" GP.
IMGs are more positive about WPBA — including PSQ
A 2024 national survey of 1,176 GP trainees and 912 trainers (published in Education for Primary Care) found that international medical graduates (IMGs) trained outside the EEA were significantly more positive towards WPBA overall than UK graduates. Contrary to commonly expressed concerns, the research found no significant differences in outcomes by sex or ethnicity. Both trainees and trainers were generally positive about WPBA assessments including the PSQ.
What this means: if you are an IMG and have been anxious about whether the PSQ process is fair to you — the evidence says it is.
🌍 IMG-Specific Insights — The Consultation Model Adjustment
Several themes worth addressing directly emerge from IMG discussions in UK GP training communities — not because IMGs perform worse (the research says they don't), but because the adjustment involved is real and specific.
IMGs frequently describe the most significant adjustment as shifting from a doctor-centred, directive consultation model — where the doctor leads, asks structured questions, and presents a plan — to the patient-centred, conversational model expected in UK general practice.
In UK GP, patient satisfaction is closely tied to patients' sense of being heard, respected, and included — not just to clinical accuracy. The two models are not incompatible, but the adjustment is real.
"I used to wait for the patient to finish and then give the plan. But in UK GP, the consultation is a dialogue. The patient's story has to come out fully, and the doctor's job in the first five minutes is mostly to listen, not to ask questions."
— IMG trainee reflection from UK GP training community discussion
A widely shared observation in UK GP training communities: patients recall how you made them feel more readily than what you actually said or prescribed. This aligns precisely with the PSQ domains around warmth, trust, and respect — all of which are emotional impressions rather than clinical assessments.
Trainees who focus heavily on getting the management plan clinically correct but neglect the emotional tone of the consultation often receive lower PSQ scores than their clinical accuracy would suggest. The PSQ is not testing your medical knowledge. It is testing your ability to make patients feel cared for.
The GP training community consistently flags keyboard/screen use as one of the most common sources of lower PSQ scores. Specific techniques that work:
- Type key words only during data-gathering — write up the full consultation note after the patient leaves, or during natural pauses
- Position your monitor so you can maintain eye contact more naturally without turning away fully
- Signal explicitly when you need to look something up — this single habit converts a trust-reducing moment into a trust-building one (see below)
Patients notice when a doctor looks something up — and the unspoken subtext matters. Searching silently while a patient watches can feel dismissive or alarming.
The transparency technique: be explicit about what you're doing.
"I want to double-check the latest guidelines on this for you — bear with me one moment."
This converts a potential loss of confidence into a demonstration of thoroughness and honesty — both of which directly support Q5 (confidence in the plan) and Q9 (trust).
⚠️ Lessons from developmental ARCP outcomes
A 2024/2025 qualitative study involving 20 GP trainees who had received a developmental ARCP outcome (Outcome 2 or 3) — meaning they had not progressed as expected — identified several patterns relevant to the PSQ specifically. Themes emerging from their experiences included:
- Insufficient engagement with feedback — acknowledging the data but not genuinely interrogating it or producing a meaningful action plan
- Delay in addressing identified difficulties — receiving feedback that revealed a development need and not acting on it promptly, leading to the same issue appearing in subsequent assessments
- Inadequate documentation of learning — developing genuinely as a result of feedback but failing to document that development in the ePortfolio in a way the ARCP panel could see
The learning: the PSQ does not fail trainees. What leads to difficulty is when trainees receive developmental feedback and do not visibly act on it. Engaging genuinely with PSQ results — and documenting that engagement — is what protects you at ARCP.
🗺️ Practical nuances from regional training schemes
Several regional schemes (including Barnet VTS) flag an important practical detail: a PSQ report will not be generated unless the minimum number of responses have all questions answered. Partially completed forms do not count towards the minimum. This means that even if you have 34 responses entered, if several are incomplete (patients skipping questions), you may not reach the threshold for a valid report. The practical advice from these schemes: aim to collect around 40–45 responses to allow for incomplete forms — giving you a buffer over the minimum. This is why some regional guidance still quotes 40 rather than 34: it is a safe practical target, not a different minimum.
Multiple regional VTS schemes specifically advise trainees to agree a date for both the start of the PSQ process AND the feedback discussion with their ES at the very beginning of the PSQ cycle — not after the results are in. This serves two purposes: it forces early planning (preventing the common "left it too late" mistake), and it creates a protected conversation slot that doesn't get squeezed out by the rest of a busy ES agenda. Several schemes also recommend setting this date at the first ST3 ES meeting, before the PSQ process has even started.
Following the PSQ feedback discussion with your ES, the conversation itself can be recorded using the Professional Conversation Log in the Education section of the FourteenFish ePortfolio. This is separate from your main reflective log entry on the PSQ results. Using both — a conversation record and a personal reflection — creates richer evidence for ARCP panels and demonstrates that you engaged substantively with the feedback process, not just the numbers.
NHS England WPBA guidance explicitly notes that PSQ responses via FourteenFish follow the same standard as the GMC's requirements for revalidation-grade patient feedback. This means the skills you develop for PSQ — seeking patient feedback, reflecting on it, acting on it, documenting that action — are exactly the same skills you will use for GMC revalidation every five years as a qualified GP. Making this connection explicit in your reflection is genuinely impressive to ARCP panels: it demonstrates awareness that your development doesn't stop at CCT.
📖 What the RCGP curriculum says about the consultation — mapped to PSQ domains
The RCGP's own curriculum guidance on Consulting in General Practice contains several principles that map directly onto PSQ domains. These are not just educational ideals — they are the standard the curriculum expects you to demonstrate:
| RCGP Curriculum Principle | PSQ Domain(s) it underpins | What it means in practice |
|---|---|---|
| "Discovering the reason for attendance — including a specific worry or anticipated outcome — is important to properly address concerns and improve satisfaction" | Q2 (listening), Q4 (involvement), Q5 (confidence) | Actively exploring ICE is not optional — it is curriculum-level expectation, and patients notice when it doesn't happen |
| "Patient-centred consulting involves being attentive to what people are communicating both verbally and non-verbally" | Q1 (welcome), Q2 (listening), Q9 (trust) | Non-verbal attentiveness — eye contact, posture, facial expression — is curriculum content, not personality |
| "Longer consultations have been linked to better health outcomes, increased patient satisfaction and enablement scores — balanced against competing demands" | Q3 (explanation), Q4 (involvement), Q7 (knowing what to do) | The evidence supports longer consultations but recognises the reality of NHS demand — structured, efficient warmth beats rushed thoroughness |
| "Patients sometimes prefer to delegate autonomy, particularly in times of illness or distress — being willing to take responsibility when appropriate is patient-centred care" | Q4 (involvement), Q5 (confidence) | Shared decision-making is not about forcing choices on reluctant patients — calibrate to what this patient wants right now |
| "Constructive feedback on your consultation — from both patients and colleagues — can help improve consulting skills" | All domains | The curriculum explicitly values patient feedback as a learning tool, not just a compliance exercise — frame your PSQ reflection with this in mind |
Several UK training schemes (including the Pennine GP Training Scheme) run structured consultation video teaching sessions where trainees watch demonstrations of both good and poor consultations — specifically to identify what communication skills look like in practice, not just in theory. If your scheme offers this, the PSQ collection period is an ideal time to engage with it: watch how other doctors handle the opening of a consultation, how they explore concerns, how they explain and safety-net. These are transferable skills, and watching them modelled by experienced GP educators accelerates development more reliably than description alone. Your trainer can set up COTs in the same period to triangulate what you're learning from PSQ feedback against directly observed consultation behaviour.
🧠Memory Aids & Cheat Sheets
PSQ quick-check: Before closing every consultation
- ☑ Did I look at the patient while they spoke?
- ☑ Did I summarise what I'd heard?
- ☑ Did I explain in plain language?
- ☑ Did I ask what the patient thought or wanted?
- ☑ Did I explain what happens next specifically?
- ☑ Did I safety-net with named symptoms and clear escalation?
- ☑ Did I check the patient understood? ("Does that make sense?")
Run this mental checklist in the last 60 seconds of every consultation during the PSQ collection period — and beyond.
For Trainers & Educational Supervisors — Teaching Pearls
Facilitating the PSQ feedback discussion
The ES discussion about PSQ results is one of the most valuable — and potentially sensitive — conversations in GP training. Handle it well and it can unlock significant reflection and growth.
- Review results yourself first — be familiar with scores, range, and free text before sitting down with the trainee
- Note whether scores align with your own observations of the trainee's consultation skills
- Prepare a few open questions based on specific domains where scores stand out
The 6 comparisons to structure your discussion
- a) Their score vs their self-assessment — explore gaps with curiosity, not judgement
- b) Their score vs peer scores — frame as context, not competition
- c) Their score vs previous PSQs — look for trajectory over time
- d) Free text themes — spend time here; often the richest learning
- e) Domain scores + other WPBA evidence — does a low "listening" score align with COT observations?
- f) Stop/Start/Continue/Change — always end with a clear, specific action plan
Useful tutorial prompts
- "Which scores surprised you most — positively or negatively?"
- "Where do you think the gap between your self-assessment and patient scores comes from?"
- "What do patients mean when they mention [specific free text theme]?"
- "Which of the 9 areas feels least natural for you right now?"
- "If you could change one consultation habit tomorrow, what would it be?"
- "How does this fit with what you've noticed in your COTs?"
Common trainee blind spots
- Screen time — trainees often don't realise how long they look at the computer while patients speak
- Generic safety-netting — "come back if worried" feels like safety-netting to the doctor, not the patient
- Assuming explanation = understanding — saying the right words is not the same as checking comprehension
- Invisible paternalism — making decisions without meaningfully involving the patient
- Over-medicalising emotional distress — patients with emotional concerns sometimes feel "processed"
Responding to below-average scores
- Acknowledge the data clearly and compassionately — don't minimise
- Explore context — new practice, illness, complex patient mix?
- Co-construct a targeted, specific, time-limited learning plan
- Use focused COTs to observe the specific domains that scored lower
- Agree to repeat the PSQ and document this in the ePortfolio with rationale
- Where concerns are significant or persistent, consider referral for additional deanery support
❓Frequently Asked Questions
How many times do I need to complete the PSQ?
Just once — during ST3, recommended after the midway point. If scores are significantly below the peer average, your ES may recommend repeating it to demonstrate progress, but this is not automatic.
Why is the minimum 34 responses? I've seen 40 mentioned elsewhere.
The current RCGP requirement (2017 PSQ revision) is a minimum of 34 fully completed responses. Some legacy resources still reference 40 — partly a hangover from the older form, and partly because regional schemes quote it as a practical target that builds in a buffer for incomplete forms. Print around 50 copies for a sensible buffer. The current minimum is 34.
Can I use telephone consultation patients for the PSQ?
Yes — the RCGP explicitly confirms the PSQ is acceptable following telephone as well as face-to-face consultations. The electronic link sent via practice messaging is the most practical approach for telephone patients. Ensure the message comes from the practice team, not from you personally.
What if I can't get a receptionist to help?
Speak to your practice manager and ES. The anonymity rules are clear — you must not hand out or collect forms yourself. Options include electronic-only distribution via practice messaging, or asking another team member. Do not default to handing forms out yourself.
A patient has written something really hurtful. What do I do?
This is more common than people admit. One harsh comment in 34+ responses may not be a pattern. Acknowledge your emotional reaction honestly — with your trainer or ES. Then ask: is there a kernel of truth worth acting on? If you are significantly distressed, speak to your ES or VTS pastoral support. Your wellbeing matters too.
My scores are lower than my peers — will this affect my ARCP?
Low scores alone do not automatically lead to an adverse ARCP outcome. What matters is: have you reflected meaningfully? Identified learning needs? Documented an action plan? Trajectory over time counts. If scores improve on a repeat PSQ alongside developing WPBA evidence, ARCP panels are designed to see that as positive progress. Talk to your ES early if concerned.
Can I see which patient wrote which response?
No. All responses are anonymous. You see aggregate scores (mean, median, range) per question and free-text comments, but not which patient said what. This anonymity protects honesty and is non-negotiable.
Does the PSQ measure my clinical knowledge?
No — it measures patient experience of the consultation. It captures communication, empathy, and relationship-building — not diagnostic accuracy. It sits alongside, not above, the clinical WPBA tools. It specifically evidences Communication & Consultation Skills, Person-Centred Care, and Professionalism.
✅Final Take-Home Points
The 10 Things to Remember
The PSQ is not a hurdle to clear. It is one of the few moments in GP training where you receive unfiltered, structured feedback directly from the people you are training to serve. That is a privilege — and a remarkably useful mirror. Use it as one.
Bradford VTS — The universal GP training resource for everyone, everywhere. 🌱