Last November at ASHA we focused our talk, “Documenting Language Therapy Progress in School Settings,” on documenting progress. It was exciting (and a bit nerve-wracking) to present to a packed room, and we were so grateful to everyone who attended! Judging by the number of SLPs who showed up, it’s fair to say that the demand to demonstrate accountability weighs heavily on school-based SLPs’ minds.
Our therapy has to be educationally relevant and significantly contribute to students’ education. We are accountable to administrators, parents, teachers, and probably most importantly, the students we serve. Part of accountability is monitoring progress and making adjustments to our therapy plan after analyzing and reflecting on our efficacy.
Language goals can be rather elusive. Language is abstract and ever evolving as children age. Language growth can be slow, with small, incremental change over time. So this year we decided to address the unique challenges of monitoring language therapy progress by using formative assessments during intervention. This is somewhat new territory for SALT, as we typically think of language sample analysis as a summative assessment or as a diagnostic tool when completing evaluations. However, we can expand our use of language sample analysis and use SALT weekly, monthly, or perhaps once per semester or trimester as part of a formative assessment tool.
One of our colleagues, Claudia Dunaway (San Diego Unified School District, retired; Dunaway Consulting), shared an example of using language samples as a formative assessment to closely monitor the specific skill of inferential thinking and vocabulary using a book discussion format. In her case study the goals were to increase use of precise inferential vocabulary and indirectly increase inferential thinking.
Specifically, she measured how many emotion (mental state) words were used during group discussion of a text and calculated a ratio: the number of emotional state words divided by the total number of comments. In this task it was easy to use a consistent language sample elicitation protocol (group discussion about a shared text) and monitor gains toward inferential emotional state goals over several sessions (formative assessment). Ideally, with specific targeted instructional strategies and scaffolding, both the overall amount of language and the use of mental state words increase. So, in this case study, targeted (specific) language skills can be monitored closely and reported to administrators or included in progress notes. In addition, therapy strategies can be altered if students are not showing adequate progress on the targeted goal.
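To make the ratio concrete, here is a minimal sketch of the calculation in Python. The word list and sample comments are hypothetical illustrations, not SALT's built-in lists (which are more extensive), and SALT computes this kind of measure for you from a transcript.

```python
# Sketch of the case-study measure: number of emotional (mental state)
# word tokens divided by the total number of comments in the discussion.
# MENTAL_STATE_WORDS is a hypothetical example list, not SALT's.

MENTAL_STATE_WORDS = {"think", "know", "believe", "feel", "felt", "wonder", "remember"}

def mental_state_ratio(comments):
    """Return (mental state word count, total comments, ratio)."""
    word_count = sum(
        1
        for comment in comments
        for token in comment.lower().split()
        if token.strip(".,!?") in MENTAL_STATE_WORDS
    )
    total = len(comments)
    return word_count, total, (word_count / total if total else 0.0)

session = [
    "I think she was scared.",
    "She ran away.",
    "Maybe she felt lonely and I wonder why.",
]
words, comments, ratio = mental_state_ratio(session)
print(words, comments, round(ratio, 2))  # prints: 3 3 1.0
```

Running the same calculation on each session's sample gives a number that can be tracked over time, which is exactly the formative-assessment use described above.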
One question asked during our talk was, “Can you use a different story [than what was presented] and look at the same types of words?” This was a great question and the answer is “YES!” Since we are thinking in terms of formative assessments, we are not using the SALT reference databases for sample comparison, so really any elicitation protocol could be designed to fit specific language goals or instruction. If measuring progress over multiple sessions, it is advisable to keep the protocol consistent, whether a book discussion, personal narrative, picture description, or other elicitation task. The advantage of language sampling is that it measures functional language: students use their whole language system during a language sample, so it is a strong indicator of the language performance students will actually show in the classroom.
The general process for monitoring progress using this method looks something like this:
- Determine the therapy target.
- Choose a language sample elicitation protocol that will demonstrate the skill.
- Repeat the elicitation protocol over multiple sessions.
- Measure the same skill each session using SALT’s reports (or design your own specific word list).
- Generate SALT analysis reports.
Let’s look at how we could use SALT for measuring specific vocabulary in the example regarding mental state words. There are word-list search routines built into the software that include a number of pre-programmed lists. These lists can be edited to suit specific vocabulary targets or a custom list can be created. The transcript can then be searched using the list and a report can be generated that shows the words used and their frequency. Short, targeted samples could be transcribed and then analyzed using SALT. Repeated samples can then be compared and progress can easily be measured by looking at the Word List report (which provides data on the preset word list).
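Outside of SALT, the word-list search idea can be sketched in a few lines of Python. The function below is a hypothetical stand-in, not SALT's actual search routine, and the transcript and word list are invented examples.

```python
# Hypothetical stand-in for a word-list search: count how often each word
# from a custom list occurs in a transcript and keep the utterances that
# contain a match, mirroring the kind of Word List report described above.
from collections import Counter

def word_list_report(utterances, word_list):
    """Return (frequency of each matched word, utterances containing a match)."""
    freq = Counter()
    matches = []
    for utterance in utterances:
        tokens = [t.strip(".,!?").lower() for t in utterance.split()]
        hits = [t for t in tokens if t in word_list]
        if hits:
            freq.update(hits)
            matches.append(utterance)
    return freq, matches

transcript = [
    "I know he was worried.",
    "He looked everywhere.",
    "I think he was worried about his dog.",
]
freq, matches = word_list_report(transcript, {"know", "think", "worried"})
print(dict(freq))  # prints: {'know': 1, 'worried': 2, 'think': 1}
```

Comparing the frequency counts from repeated samples is what lets progress on targeted vocabulary be measured session to session.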
In the figures below the “Explore” menu was used to call up all mental state words used in the language sample. SALT then generated a report with all mental state words used, including the utterances that contained the mental state vocabulary. This report can easily be generated with multiple samples to measure production of mental state words.
ASHA can serve as an inspiration for many SLPs out there! It was exciting for us at SALT to connect with colleagues and hear about the challenges faced in the trenches of daily therapy. We strive to make it easier for SLPs to be accountable and to use good data to monitor progress. Consider using SALT to report on all the great things that are happening in therapy! Think of SALT not just as a diagnostic tool, but as a tool to help with formative assessments as well. Get creative in your elicitation of language samples and let SALT do the work of consistent, thorough analysis and easy reporting of results. See you next year at ASHA!
We’ve been asked to post the slides and handouts from our ASHA talk. So here they are.