Confirmation bias is our tendency to search for, interpret, and remember information that confirms our existing beliefs/opinions while overlooking contradictory evidence. It’s one of the most pervasive cognitive shortcuts (heuristics) affecting most people’s daily decisions.
Confirmation bias has deep evolutionary roots. Our ancestors didn’t have time to carefully weigh every piece of evidence; quick decisions often meant survival. The brain evolved to use heuristics that conserve cognitive energy and calories. Because processing information requires significant mental effort, our minds developed efficient strategies: stick with what you already “know” works, and filter out conflicting information. This usually generates “good enough” decisions quickly, which helped us navigate dangers and opportunities without ‘analysis paralysis’.
Imagine you’re convinced that a particular route to work is fastest. One morning, you hit unexpected traffic and arrive late. Instead of reconsidering your route choice, you think, ‘This is unusual, there must have been an accident’. The next day, you arrive on time and think, ‘See, this route really is the best’. You remember the smooth commutes vividly while dismissing the delays as anomalies. Meanwhile, you’ve never actually tested alternative routes because you’re already ‘certain’ yours is quickest.
We follow news sources that align with our politics, interpret ambiguous feedback from our boss based on our existing opinion of them, and notice evidence supporting our favourite sports team’s superiority while forgetting their losses.
Understanding confirmation bias doesn’t eliminate it, but recognizing when we might be cherry-picking evidence can help us make more balanced decisions, even if it costs us a few extra mental calories and a few extra seconds.
Here’s a PowerPoint to help teach this topic to your IB Diploma Psychology class…
In IB Psychology, mastering subject-specific vocabulary isn’t just about definitions, descriptions and explanations; it’s about communicating with precision. The assessment criteria highlight this directly: examiners will be looking for accurate and consistent use of the language of psychology. Using the correct terminology makes students’ exam answers clearer, more credible, and much more likely to score in the higher markbands.
And how to learn vocabulary? Copy it, copy its meaning, and then use it in your writing at every possible opportunity.
Take, for example, the difference between saying “the study was good” and “the study had high internal validity because confounding variables were controlled.” The second statement not only shows that you know the correct term but also demonstrates deeper conceptual understanding. Words like validity, reliability, quasi-experiment, overt, covert, mean, median, determinism and many others allow students to describe research and arguments with the level of precision that examiners expect.
But the importance of vocabulary extends beyond exams. In class discussions, using appropriate and relevant terminology sharpens students’ arguments and makes their reasoning clearer to others. Even beyond the classroom, being able to apply psychological language helps students engage in more nuanced conversations about behaviour, mental health, and social issues.
So how do students achieve this? The glossary at the back of the Subject Guide is a solid starting point, but it’s very limited. That’s why we’ve compiled a resource with nearly 1,000 psychology words and phrases, organised into three sections: Concepts, Content, and Contexts. Available both in print (perfect for highlighting and making margin notes) and as an eBook (ideal for quick reference on the phone), this book gives students the tools to expand their psychological vocabulary and, with it, their confidence in exams.
As DP Coordinators and teachers pore over this year’s IB Diploma results, now is the time to think not just about content delivery, but about how your students are learning. The key to improving your school’s performance probably isn’t more revision sessions, tougher mock exam marking, greater intensity with internal assessments or even more teacher training. The key to success is much more likely to be a shift in student mindset, habits, and engagement.
IB Diploma Psychology – Success at high school or college, written by a teacher with 30+ years of experience, offers exactly that: a practical, common-sense guide to helping students become more disciplined, proactive, and resilient—qualities that consistently lead to stronger academic outcomes in school and beyond.
Unlike quick-fix test strategies, this book provides a blueprint for long-term success: attending every class, managing time effectively, building positive relationships with teachers, and understanding how learning actually works. It’s simple, clear, and designed to be implemented now—not in theory, but in everyday student life. This book is practical – and it’s written for and to your students.
If you’re serious about lifting your students’ IB Diploma results in 2026 and beyond, make this book part of your school’s toolkit. It may be the most powerful change you make this year.
At this stage of the two-year IB Diploma course, many teachers are wondering about their students’ final grades, perhaps trying to reconcile what they considered a near-perfect mock exam result and a near-perfect Internal Assessment result with the middling grade that the student eventually received following the actual exam session. Some teachers question their ability to interpret the Subject Guide and the assessment criteria descriptors, but they shouldn’t, especially if they’ve put time and effort into reading and understanding the Guide, attending training workshops and engaging with their MyIB subject community.
During the exam session, the students’ exam scripts are sent to the markers via a scanning centre. The students’ answer papers are scanned and uploaded to the online marking database. Markers, who have received mark schemes and undergone training, then access the database and read and mark the exam scripts. They do this quickly to meet deadlines and quotas.
About every 10th exam script is a ‘seed’: it has already been marked, and the marker’s marks are compared with the established marks. If the marker’s marks are within an acceptable tolerance range, the marker continues to access the database of exam scripts. If the marker’s marks are too different from the seed’s marks, the marker is diverted for more training, and may return to the database of scripts if/when their marking becomes more accurate and aligned with that of the chief marker (sounds Orwellian, right?).
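The seed-check routine can be sketched in a few lines of code. This is purely illustrative: the IB’s quality-assurance software is not public, so the function names and the tolerance value below are assumptions, not the real system.

```python
# Illustrative sketch only: the real tolerance rules are not published,
# so the tolerance value and routing labels here are invented.

def within_tolerance(marker_marks, seed_marks, tolerance=2):
    """True if every mark the marker gave on a pre-marked 'seed' script
    is within the allowed tolerance of the established mark."""
    return all(abs(m - s) <= tolerance
               for m, s in zip(marker_marks, seed_marks))

def route_marker(marker_marks, seed_marks):
    """Keep an accurate marker on the live database; divert an
    out-of-tolerance marker to retraining."""
    if within_tolerance(marker_marks, seed_marks):
        return "continue marking"
    return "diverted for retraining"
```

Under these made-up numbers, a marker who gives 10 and 12 where the seed marks were 11 and 12 would carry on; one who gives 10 and 20 would be diverted.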
Each marker is assigned to mark only Paper 1, 2 or 3.
Now the computer takes over. The marker’s marks are moderated to be consistent with the chief marker’s marks. This is an attempt to standardise the marker’s marks through the whole marking session.
A combination of people and computers confirms the grade boundaries. People pull out papers at the boundaries, read the answers and ask whether this set of answers is consistent with the Grade 7, 6, 5, etc. descriptors. The computer then adjusts grades to ensure a certain percentage of students achieve a 7, 6, 5… This is called scaling. It can be controversial, especially since assessment is supposedly done against assessment criteria descriptors, which are objective and in theory either achieved or not achieved. Scaling, though, protects against grade inflation: with each set of results, teachers and students learn more about what is required to achieve a 7, so a greater percentage of students would otherwise achieve the higher grades.
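A minimal sketch of what distribution-based scaling could look like, assuming a simple percentile rule. The IB does not publish its actual method; the function, its name, and the target fractions below are invented for illustration only.

```python
def scaled_boundaries(marks, targets):
    """Set a minimum mark for each grade so that roughly the target
    fraction of the cohort reaches that grade or above.

    marks:   all candidates' final marks for a component
    targets: {grade: fraction of cohort expected at that grade or above}
    """
    ordered = sorted(marks, reverse=True)
    n = len(ordered)
    boundaries = {}
    for grade, fraction in sorted(targets.items(), reverse=True):
        # The mark of the candidate sitting at the target percentile
        # becomes that grade's boundary.
        index = min(n - 1, max(0, int(n * fraction) - 1))
        boundaries[grade] = ordered[index]
    return boundaries
```

For a toy cohort with marks 1 to 100 and 10% of candidates expected at grade 7 or above, this rule would put the grade 7 boundary at 91.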
The internal assessment is marked by teachers and the marks for each of the four criteria are entered into IBIS. IBIS then selects a sample of high-, middle- and low-scoring IAs, and the DP Coordinator uploads digital copies of the selected sample, which are then check-marked by an experienced, trained and supported moderator. The moderator enters a mark for each criterion, and the software then adjusts the teacher’s full set of results (not just the sampled IAs’ results): for example, Criterion A marks may be moderated up by a small percentage, Criterion C marks might be moderated down by a lot, and Criteria B and D may not change. These moderations are applied to the school’s full cohort, pro rata, i.e. taking into account the unmoderated marks awarded by the teacher. It’s an odd procedure based on dubious logic. (It’s really odd when a moderator’s own students’ IA marks are moderated by a different moderator and go down by a lot.)
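The pro-rata adjustment can be illustrated with a toy function. Again, this is a guess at the shape of the calculation, not the IB’s published algorithm: here the average shift the moderator applied to the sampled IAs is simply applied to every student’s mark on that criterion.

```python
def moderate_criterion(teacher_marks, sample_teacher, sample_moderator, max_mark):
    """Adjust a whole cohort's marks on one IA criterion, pro rata,
    from the difference observed on the moderated sample."""
    # Average shift the moderator applied to the sampled IAs
    shift = (sum(m - t for t, m in zip(sample_teacher, sample_moderator))
             / len(sample_teacher))
    # Apply that shift to every student, clamped to the valid mark range
    return [min(max_mark, max(0, round(t + shift))) for t in teacher_marks]
```

So if a moderator marked two sampled IAs one mark lower than the teacher did, every student in the cohort would lose one mark on that criterion under this toy rule.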
These moderated marks are then scaled to match an expected % of grades. And yes, that’s also contrary to the philosophy of criteria-based assessment.
There’s a lot to like about this assessment system. Human markers’ marks are checked frequently to ensure they match the chief marker’s standard for each component (Papers 1, 2, 3, and the IA). The papers at the grade boundaries are checked against the grade descriptors. IA moderators’ marks are moderated by senior moderators, and then the computer applies grade boundary checks and scales marks to meet grade distribution expectations.
And when grades are received, students can submit an EUR – an enquiry upon results – at several levels: a check that the component marks were correctly calculated, a re-mark of the papers, or even a re-moderation of the IA. The last is hard to understand, because a student whose IA was not part of the uploaded sample selected for moderation may have to find and then submit it themselves. That doesn’t bear thinking about for too long, though. If the student’s grade (not mark) changes after an EUR, the (hefty) fee is refunded; if not, it isn’t, so… that doesn’t bear thinking about for too long either.
The big question that many teachers ask is, ‘Why are the actual grades not as good as the grades I think my students should get?’
We tend to mark our students higher than real examiners do, giving them ‘benefit of the doubt’ marks.
Markers are less patient with difficult-to-read handwriting, while teachers come to learn the students’ handwriting.
Mark schemes tend to be written with more detail than the subject guide’s assessment descriptors.
Teachers sometimes base marking on ‘knowledge’ gathered from unofficial, for-profit subscription-based sites and really unofficial sources such as Facebook groups. Psychology has one FB group that is well known for distributing incorrect information – you’ve now been warned!
Scaling. It’s quite likely that experienced teachers’ marks are similar to the actual marks, but after papers and IAs have been marked/moderated, the marks are scaled so that the grades are distributed as per IB’s grade distribution ‘formula’ for each subject. The difference (and the cause of so much angst) is likely due to the scaling factor.
In the end though, the grades are just one pillar of what students, teachers and schools achieve. In many ways, that final grade can be distorted by, for example, which subjects were chosen to construct the student’s Diploma, which were taken at HL and SL, whether Language B was really a student’s second language or a second first language, how much support was given with the IAs, EE and TOK assessments, how much time went into CAS, and so on. What we do know is that most IB Diploma students develop critical thinking skills, an appreciation of internationalism, and an appreciation of the value of creativity, activity and service. And they all develop, to some extent, in alignment with the Learner Profile – becoming better communicators, more open-minded, more thoughtful, more caring… and the IB doesn’t scale these (probably). So… don’t stress too much about the number on the results page. Our students’ and teachers’ efforts will always be more than that Diploma score.
If you liked reading this, please subscribe to our blog.
We’ve just added a new free worksheet designed to support DP Psychology HL students as they prepare for Paper 3, particularly Question 4 on the role of technology in the health and wellbeing context.
Based on a recent Guardian article exploring how young people are taking control of their smartphone use to manage mental health, the worksheet guides students through a critical reading and reflection process. It encourages them to consider how media shapes public understanding of digital wellbeing.
A key feature of this activity is a close look at the reference to the Netflix series Adolescence, a dramatized portrayal of online misogyny. The worksheet prompts students to explore the responsibility of including fictionalised content within an otherwise fact-based article. Is it appropriate? Does it blur the line between evidence and entertainment? What are the ethical considerations?
The aim is to help students build their own informed responses, drawing on both media literacy and psychological concepts relevant to wellbeing in the digital age.
We’re excited to share a new (and free) resource created to support teachers and students in the new IB Diploma Psychology course: “How to answer exam questions.”
This document demystifies the exam requirements across Papers 1, 2, and 3, providing clear, structured guidance on how to approach every question, including:
Step-by-step structures for each question
Tips for content selection and writing focus
The document is provided in Word format so you can copy and edit it to suit your purposes or share it directly with your students. A suggested use: Give a copy to students when they are doing practice exams so they can see exactly how to structure their responses.
We hope this document helps make exam preparation a little smoother for everyone. Feel free to share it with colleagues.
If you’re looking for even more comprehensive support, check out our book IB Diploma Psychology – Ten Mock Examinations with Model Answers. It includes TEN full mock exams (Papers 1, 2, and 3) complete with high-scoring sample responses for every question. Use it to plan your mocks, guide student revision, and sharpen your understanding of exactly what to teach for exam success.
As teachers, all we really want to know is: Did they learn what I think I’ve taught them? The challenge is finding a simple, fast way to answer that question.
Oral checks at the end of class are great — but with 20 students, I’d need another lesson just to ask them all. Exams have their place, but in IB Diploma Psychology, written exams often only test a small slice of the syllabus. We know Social Identity Theory might not even show up. So how do I know if my students truly understand it?
Over the years, I’ve had a love-hate relationship with multiple choice tests. They take time to write, but once they’re built, they become one of the most efficient formative tools we have. Students actually like them — they can focus purely on content without the pressure of writing structure, command terms, or phrasing. I simply get a clear snapshot: do they know it or not?
That’s why I created a book of 70 multiple choice tests for the new IB Psychology syllabus. No, MCQs aren’t part of the official exam. But as quick, focused checks of cumulative knowledge, they’re one of the best tools I’ve found to guide my teaching.
The global Psychology community mourns the loss of a pioneer. Dr Phil Silva, founder of the world-renowned Dunedin Multidisciplinary Health and Development Study – better known simply as the Dunedin Study – passed away on Thursday at the age of 84.
For IB Diploma students, the Dunedin Study is the longitudinal study on which Caspi et al. (2003) is based.
Dr Silva’s legacy is nothing short of extraordinary. In 1972, he began following the lives of 1037 babies born at Queen Mary Maternity Hospital in Dunedin. Over 50 years later, the study continues, with a participation rate of around 90% – an unmatched achievement in longitudinal research worldwide.
As psychology teachers, we often search for meaningful case studies that demonstrate the real-world impact of psychological research. The Dunedin Study is one of those rare, gold-standard examples. It has helped reshape our understanding of child development, health, education, and mental wellbeing – and much of its success is owed to Dr Silva’s vision, energy, and compassion.
Before becoming a psychologist and researcher, Phil Silva was a primary school teacher. Teaching rural children in the 1960s deeply influenced his later work – a foundation built on empathy, curiosity, and a commitment to supporting young people and their families. That passion carried through into his academic career, where he completed a master’s and PhD under Otago University’s Dr Patricia Buckfield, who herself had a keen interest in neonatology.
Together, their early work collecting data on babies born in Dunedin from 1967 to 1973 laid the groundwork for the creation of the Dunedin Study. But it was Silva’s leadership, drive, and charisma that propelled the project forward. Against the odds – with minimal funding and few formal resources – he rallied hundreds of volunteers who believed in his mission to improve children’s lives.
Silva’s research didn’t just stay in academic journals. It influenced public health policy, helped normalise routine check-ups for preschoolers, and highlighted the impact of conditions like glue ear on child development. He spoke passionately about the need for society to prioritise children’s wellbeing. In one memorable critique, he noted that New Zealanders were more likely to service their cars than check on the health of their children – a powerful call for compassion and systemic care.
His work was internationally recognised. In 1993, the Dunedin Study made the cover of Time magazine under the headline: “All You Need is Love”. A year later, Silva was awarded an OBE for services to health and education.
Dr Silva’s influence extended far beyond data collection. He was a mentor to the late Professor Richie Poulton, who succeeded him as study director, and to the current director, Professor Moana Theodore, who first joined the team as an interviewer during the age-26 assessment phase.
Theodore describes him as “an energetic mentor” with a unique ability to bring people together in service of a bigger purpose: improving lives. That ability is reflected in the enduring loyalty of study participants – many of whom have stayed involved for over five decades.
She beautifully summed up his contribution:
“Dr Phil has left this legacy and a taonga [prized treasure] for New Zealand… the best childhood foundation guarder in the world – and the most studied group of people anywhere in the world.”
For those of us teaching psychology, the Dunedin Study is a model of longitudinal research excellence. It’s a case study we can use not only to explain developmental psychology, biopsychosocial models, or research methods—but also to inspire our students to see how psychological science can serve real people, communities, and policy.
Dr Phil Silva didn’t just collect data—he created change. He showed us what psychology could be at its best: compassionate, evidence-based, and relentlessly committed to human wellbeing.
Let us honour his legacy by continuing to teach with the same curiosity, purpose, and care.
Rest in peace, Dr Silva. Your work lives on—in policy, in classrooms, and in the lives of 1037 individuals who helped the world better understand what it means to grow, change, and thrive.
Newsflash: Estonia’s Minister of Education and Research, Kristina Kallas, emphasized Estonia’s proactive and open approach to digital tools in education during a speech at the Education World Forum in London. Unlike many European countries that are cautious about screen time and mobile phone use in schools, Estonia encourages the use of smartphones for learning. Schools set their own rules, and students—particularly those aged 16 and up who are eligible to vote online—are expected to use their phones as tools for both civic participation and education. Kallas notes the absence of problems related to mobile use, crediting Estonia’s digitally fluent society and schools’ autonomy.
Kallas highlighted Estonia’s long-standing digital investment, starting with the 1997 Tiigrihüpe (Tiger Leap) programme, which brought internet access to all schools. Now, the country is embracing AI and smartphone technology as the next evolution in education. Kallas predicts the decline of traditional homework essays and rote learning, pushing instead for a focus on oral assessment and the development of high-level cognitive skills. She frames this shift as essential in keeping pace with the capabilities of AI, warning that if humans don’t evolve cognitively, technology may overtake them.
Yes. Smartphones are the most powerful learning tools ever, so let’s teach students how to make the best use of them. Teaching… it’s what we do.
As teachers, we’ve all been part of the debate: are mobile phones a distraction or a tool in the classroom? Estonia’s Minister of Education, Kristina Kallas, offered a refreshing perspective this week that challenges many of our assumptions. In Estonia, mobile phones are not banned in schools—they’re embraced. Why? Because they reflect the real world students are living in, and Estonia sees them as integral to learning and civic life. Sixteen-year-olds vote online using their phones. It would be illogical, Kallas argues, to deny them the same access in a classroom setting.
This approach got me thinking. In IB Psychology, we ask students to critically evaluate, think metacognitively, and link psychological theory to the real world. What better way to model that than by integrating the very tools students already use to explore and interact with that world? Estonia isn’t ignoring the risks—phones aren’t used during breaks, and younger students face tighter limits—but they are trusting teachers and schools to manage these decisions locally.
The most provocative idea Kallas raised is that AI may render essays and rote learning obsolete. That’s a bold claim. But if AI can generate knowledge quickly and accurately, then the role of education must shift towards helping students think better—to question, synthesise, communicate, and reflect. Isn’t that what we’re already trying to do in the IB?
Rather than fight the tide, maybe we should, like Estonia, ride it. But there’s no maybe about it. We should. We must! Our job, our responsibility, is to help students use these tools correctly, with integrity, responsibility and respect.
One of the most exciting features of the new IB Diploma Psychology course is the emphasis on class practicals, which give both Standard Level (SL) and Higher Level (HL) students a chance to engage directly with psychological research methods. These activities are more than just experiments; they are designed to support critical discussion of each research approach, helping students build practical understanding while making connections to real-world contexts.
Class practicals are embedded into each of the course’s four contexts: Health and Well-being, Learning and Cognition, Human Development, and Human Relationships. Each context is linked with a specific research method:
In Health and Well-being: Interview.
In Learning and Cognition: Experiment.
In Human Development: Observation.
In Human Relationships: Survey/Questionnaire.
Interview
Each Context in the Subject Guide includes a list of suggested class practicals, but these are examples only and teachers are encouraged to tailor the activities to their own students and their local setting. For instance, under the Health and Well-being context, students could conduct focus group interviews on how peers manage stress, or investigate student perceptions of mindfulness practices after a class activity. Other examples include semi-structured interviews with professionals such as school counsellors or fitness coaches, or interviews exploring links between social media use and self-esteem, or exercise habits and mood.
Importantly, Tom Coster’s textbook includes dedicated guidance for the In-Class Practicals in each context, helping both teachers and students make the most of these learning opportunities.
The In-Class Practical is not just a learning activity: it is also formally assessed in Paper 2, Section A, where students respond to four structured questions related to one of the four practicals they completed. These questions test students’ understanding of methodology, concepts like bias or ethics, and their ability to design or compare research approaches.
In short, In-Class Practicals bring psychology to life, giving students the opportunity to be researchers themselves and fostering deeper, more critical engagement with the subject.
Here is a free downloadable document describing how to complete the In-Class Practical for an interview. It includes summary notes that students would use to prepare for Paper 2 Section A.