The tests have been conducted and the observations compiled. For some practicing psychologists, the most daunting step still awaits: how best to convey the complexities of what’s been learned about an individual in a single report.
Developing and honing psychological assessment report writing skills is not easy, says Hadas Pade, PsyD, an assistant professor at Alliant International University’s California School of Professional Psychology in San Francisco, who co-leads workshops on writing meaningful reports. Report writing is sometimes given short shrift in psychology training programs, which focus more on teaching test administration, scoring and interpretation, she says.
To be useful to a broad mix of potential readers—patients, their families, school officials, other clinicians and even possibly a judge—every report must focus on quality and clarity, says A. Jordan Wright, PhD, a clinical faculty member at New York University and editor of “Essentials of Psychological Assessment Supervision” (Wiley, 2019).
That means the report must rest on empirically solid data, synthesized and explained at a level that a nonpsychologist can understand, and answer the central question at hand—for instance, “What’s underlying the patient’s problems with attention?”
That final component might seem more than a little obvious, Wright says, “but clinical assessment reports can meander, and we can get lost. So, I look for, ‘Did we answer the question clearly?’”
Pade, Wright and other clinicians provide more guidance on how to take your reports to the next level.
■ Verify that your report relies on solid data. Before the report writing even begins, make sure that the tests you will be citing in the report are backed by the latest research and were used appropriately, Wright says. That process includes keeping up with the literature regarding whether a test produces different results for different racial/ethnic groups. “We need to know, ‘Is that test biased, or is it reflecting real population differences?’” Wright says.
A good rule of thumb, Wright says, is to consider whether the report’s underlying assessment would meet a forensic standard. “We tend to have very high standards for forensic evaluations because they have to be defensible in court,” he says.
■ But don’t hide behind the test results. W. Joel Schneider, PhD, an associate professor in the department of psychological studies in education at Temple University in Philadelphia, says that when writing reports early in his career, he focused too much on the tests he used and the underlying data, almost “like my audience was a bunch of skeptical reviewers. But my sense is that most report readers are not looking to be convinced by the evidence.” Their main concern, he says, is understanding how to solve the problems that prompted the evaluation.
Schneider lists the test results in an appendix, and he often doesn’t even name the administered tests in the report’s written section. In short, he advises, don’t hide behind the data; instead, write about what those findings reveal about the individual.
“What I’m writing is my final judgment, and I’m taking responsibility and I’m not going to slough it off on the test,” says Schneider, who co-authored “Essentials of Assessment Report Writing” (Wiley, 2018). “If I’m not confident enough that it’s true, then my assessment isn’t finished.”
■ Consider incorporating diversity and cultural context. Alea Holman, PhD, frequently notices that some key context is missing from the reports she’s reviewed from students and other clinicians. A report might detail the patient’s family background, romantic relationships and educational or developmental history, she says, but lack a section that explores other formative experiences, including those that involve race, sexuality or socioeconomic status.
Including such a section in assessment reports as a matter of routine would encourage more psychologists to ask related open-ended questions about diversity and cultural context, says Holman, an assistant professor at Fordham University in New York City. She says that psychologists need to “humble ourselves enough to be able to at least try to feel what it’s like to be that client in our social-political world, and to understand how their thoughts and behaviors may very well be adaptive to the environment and time and place that they’re living in.”
■ Synthesize and conceptualize the findings. For many patients—such as children with attention difficulties—the psychologist will have gathered a bevy of data, including test results and collateral information such as teacher and parent reports, Wright says. But the findings from those different sources shouldn’t be written up in their own separate sections. “Because then the reader has to go and search out the data on hyperactivity or inattention in each of those sections, and make a determination about what that means.”
Instead, the psychologist should integrate findings from multiple sources into a single section on inattentiveness or another issue, Wright says.
Along similar lines, he advises against writing up lists of patient strengths and weaknesses, noting that such attributes are difficult for individuals to remember about themselves if they aren’t explained within a larger psychological framework. Instead, the report should tie them to a model of personality functioning, such as explaining those traits through the lens of attachment theory, he says.
“The idea is that, especially in clinical evaluations, we want [patients] to take our recommendations,” Wright says. “And they’re much more likely to do that if they understand in a very coherent, narrative way how we are conceptualizing them.”
■ Address discordant results. It’s not uncommon for different tests to produce divergent or discordant results, says Robert Bornstein, PhD, professor of psychology at Adelphi University in Garden City, New York. When writing your report, resist the temptation to play up the test you favor and downplay the one with divergent results, he advises.
For example, someone might score high on a performance-based measure of interpersonal dependence, like the Rorschach test, but low on a self-report measure of interpersonal dependence, says Bornstein, one of the editors of “Multimethod Clinical Assessment” (Guilford Press, 2014). These differing results, he says, need to be addressed in the report.
“This now helps in treatment planning,” Bornstein says, “because you know that for this person there will be extra steps in getting them to understand the role of dependency in their personality and behavior.”
■ Strip out the jargon. Because numerous individuals may read the report, Pade says, look at each paragraph and consider: Will a nonpsychologist understand what I’m saying? “If it’s meaningful and palatable to a nontrained reader, it will be for a trained reader as well,” she says.
Pade says this point was driven home early in her career when she was working with parents who sometimes struggled to understand the school reports clinicians wrote about their children. They expressed confusion, she recalls, asking her, “What do these numbers mean? What do these technical or jargony terms mean? What is the overall broader implication for my kid?”
■ Consider the patient’s perspective. Along with outlining patients’ vulnerabilities, it’s also important to detail their strengths, both for the patients themselves, who will likely read the report, and for treatment planning, Bornstein says.
Also, check that your wording won’t seem overly blunt from the patient’s perspective, Bornstein says. For example, if you were writing just for a psychologist, you might say, “Patient is highly narcissistic with poor impulse control.”
How can that same observation be expressed more sensitively? Perhaps, Bornstein suggests, a more delicate approach is in order, such as, “Patient often overestimates his/her skills and abilities and may have difficulty modulating anger and other forms of negative affect.”
When Holman teaches report writing, she advises her students to frame guidance as recommendations rather than dictates. Writing “client might benefit from family therapy” might be better received than “client needs family therapy to improve her relationships,” she says. “It’s important for continued rapport building with the client, and for the client to be more likely to follow through with your suggestions.”
■ Cull the report to its essence. Schneider typically keeps his reports to between six and 10 pages. “Most of the time when you get a really long report, it’s because someone was doing a data dump rather than an integrated, well-thought-out, thematically organized report,” he says.
Bornstein agrees, noting that reports can be as short as several pages and often run between five and 15. To assist a busy clinician who might need to reference a report’s contents quickly, it’s helpful to write a summary of the referral question and primary conclusion at the beginning of the report and follow with a more detailed explanation further on, he says.
■ Don’t lose sight of the narrative. Holman likes to incorporate quotes from patients in her reports, or metaphors they’ve used to describe themselves, as a way to bring the patients to life on paper. “That’s how you can write a really strong report, when you’re able to paint a compassionate picture of a person,” she says.
Pade advises psychologists to check that their reports haven’t simply broken down individuals into pieces based on their scores on tests in various domains such as attention, verbal abilities or emotional functioning. You can end up with “all of these bits and pieces, because that’s what our tests measure,” she says.
To be most beneficial, assessment reports must in the end put patients back together into a cohesive psychological whole, so they can best be helped moving forward, Pade says.
“What it comes down to in a report is telling a narrative about the person, and how all these pieces fit together,” Pade says. “And that directly leads to your recommendations, and what they might be able to do about it.”