
From Observation to Evidence: The Case for Video in Clinical Supervision

Dr. Carlos Zalaquett on what video observation makes possible in counselor training — and why the evidence from the Herr Clinic should matter to every program coordinator. 


Dr. Carlos P. Zalaquett

Former Professor of Educational Psychology, Counseling & Special Education, Penn State University  ·  Former Co-Coordinator, Clinical Mental Health Counseling Program  ·  President, Pennsylvania Mental Health Counselors Association  ·  President-Elect, APA Division 52 (Society for Global Psychology)  ·  Past President, Sociedad Interamericana de Psicología  ·  Author or co-author of 50+ scholarly publications and five books, including Intentional Interviewing and Counseling: Facilitating Client Development in a Multicultural Society  ·  Recipient: USF Latinos Association Faculty of the Year, Tampa Hispanic Heritage Man of Education Award, SMHCA Emeritus Award.

 

Ask any counselor educator what separates a good practicum supervisor from an exceptional one, and the answer often comes back to specificity. The supervisor who can point to a moment — a missed reflection, a well-timed confrontation, an empathic response that shifted the session — gives the trainee something to hold onto. Generalized feedback after the fact, however well-intentioned, rarely produces the same result. 

Dr. Carlos Zalaquett has spent decades building the infrastructure for that kind of specificity. A former professor at Penn State and co-coordinator of the Clinical Mental Health Counseling Program, he has used video observation in supervision, training, and clinical outcomes research throughout his career — beginning with shoulder-mounted cameras and VHS tapes and continuing today with the VALT platform from Intelligent Video Solutions. His argument for video in counselor education is not merely practical. It is grounded in outcome data, and it has implications for how programs are designed, evaluated, and funded.

 

SUPERVISION

Memory is not a supervision tool

The Herr Clinic at Penn State records every session across every room, with footage stored in a centralized hub accessible to students, supervisors, and faculty. That architecture changes what supervision can do. Rather than relying on a trainee’s account of what happened in session — shaped by anxiety, selective attention, and the limits of recall — supervisors and students can return to the footage together. 

“It’s not just a self-reflection based on memory. They can revisit the session and actually make an even better — quote unquote — self-reflection, a more accurate analysis of what they need.” 
— Dr. Carlos Zalaquett 

  

This is the distinction Dr. Zalaquett returns to consistently: the difference between what trainees think they did and what the session record shows. Supervisors can watch sessions in advance, identify specific timestamps, and bring trainees directly to the moments that matter — whether to affirm a competency or to examine a missed opportunity. For externship and practicum supervision in particular, where developmental feedback is most consequential, that precision matters. 

He also describes using recorded sessions to demonstrate skill in his own courses — building a searchable library of examples that can be deployed across semesters and cohorts. A strong demonstration of a particular micro-skill, captured once, becomes a teaching resource that outlasts any single class session. 

 

ACCOUNTABILITY 

When documentation protects the trainee 

One of the more pointed examples Dr. Zalaquett shares involves a colleague who raised concerns about an international student counselor — suggesting the student’s accent was interfering with client progress and warranting removal from the program. Dr. Zalaquett reviewed the VALT recordings before responding.

“I went through VALT. I went through the recordings, and I realized that this student happens to have a large number of clients and they all keep coming back. The sessions seem to be constructive, and when they look at the self-report of the client, they were responding in a way that demonstrated that they were improving. So I sat down with my colleague and I said, I really value your concern, but I have to say this — I don’t have a way to support any actions because here is the data.” 
— Dr. Carlos Zalaquett 

For program coordinators and supervisors, this is a significant point. Video documentation does not only support developmental feedback — it creates an evidentiary record that can protect trainees from decisions driven by bias rather than performance. “You wouldn’t be able to do that if you don’t have all these recordings,” Dr. Zalaquett notes. The footage replaced assumption with evidence — a function of video observation that rarely surfaces in the literature but operates in training clinics every day.

 

RESEARCH

Clinical significance, not just statistical significance

Dr. Zalaquett’s use of video extends well beyond supervision into a sustained program of outcomes research. At the Herr Clinic, session recordings are combined with client self-report data to track change across the arc of treatment — from intake through termination. His team analyzes this material both qualitatively, examining how therapeutic themes develop within and across sessions, and quantitatively, measuring whether client outcomes meet the threshold not just of statistical but of clinical significance.

“My research with my team not only shows that there is change, but it shows that the change is clinically significant.” 
— Dr. Carlos Zalaquett 

That distinction carries weight in a field that has spent decades defending its effectiveness against methodologically driven critiques. Dr. Zalaquett recounts the historical challenge posed by Hans Eysenck’s early meta-analytic work on psychotherapy outcomes and the long reckoning that followed — a period that made counseling researchers among the most rigorous critics of their own evidence base. The current consensus, he argues, is clear: “Research today shows that we are effective to the tune of 60 to 80 percent across disorders and across therapies.” 

Video is what makes that research possible at the program level. Recordings allow his team to conduct mid-semester analyses of trainee performance, identifying specific areas for growth while there is still time to address them — an approach he frames explicitly as formative rather than evaluative.

“You are doing quite well, but here are areas of improvement. We still have the other half of the semester to make improvement. So it’s not to just chastise or criticize, but to find ways to get a more rounded and competent student.” 
— Dr. Carlos Zalaquett 

His research has also demonstrated equivalent success rates between domestic and international trainees — with nuanced differences in the types of presenting concerns each group addresses most effectively. That finding, made visible through video-supported analysis, allows supervisors to help students identify areas of strength and areas for continued development, rather than applying a single evaluative standard to a diverse trainee population. 

 

THE CASE FOR INVESTMENT

What program coordinators can take to their provosts

For ACES members who work within institutions where video infrastructure requires budget justification, Dr. Zalaquett’s argument is worth studying carefully. His case to program administrators is built on outcome data — the same data the recordings make possible. 

The Herr Clinic is a true training clinic, staffed entirely by students in preparation for licensure. When its aggregate outcomes are compared against university clinics staffed primarily by licensed professionals, the results hold. “If somebody asked me, is the Herr Clinic effective? I can give you a thirty-minute rundown that will have you walking out thinking, these guys are pretty darn effective.” That comparison is possible only because the recordings exist and the outcomes have been measured.

He goes further, arguing that training clinics should position themselves not merely as service providers but as applied research settings — generating the kind of evidence that supports grant applications and demonstrates value to institutional stakeholders.

“The more opportunities we have to review all the work we do, the higher the likelihood of success, improvement of techniques, advancement of the profession, publication of outcomes — so we can even show what works and for how long.” 
— Dr. Carlos Zalaquett 

What can a video system for a training clinic cost? Get an estimate based on your needs and environment. 

 

ETHICS AND CONSENT 

Recording within a clinical and ethical framework

Dr. Zalaquett is direct about the conditions under which video observation is appropriate. Clients must be fully informed that they are working with trainees in a training clinic. Recordings must be stored on secure, encrypted platforms accessible only to authorized users. And for clients whose history or presentation makes observation particularly sensitive, the consent process requires additional care and time. 


“When I say use video recording, I’m not saying use it in a dictatorial way.” For clients with heightened concerns about being observed, he describes a gradual process of education and trust-building — one that takes the therapeutic relationship seriously as the frame within which all other decisions are made. 

He is equally direct about the resistance that exists within training programs themselves — among supervisors and trainees who are uncomfortable being recorded or who worry that footage will be used punitively. His response reflects both conviction and patience: “I’ll be happy to sit down with anybody who disagrees and share with them the amount of information we have that demonstrates how relevant using these systems is. None of these systems are intended to be punishing.”
 

 

WHAT COMES NEXT 

AI, nonverbal analysis, and the expanding research horizon

Dr. Zalaquett anticipates that artificial intelligence will substantially accelerate the kind of session analysis that currently demands significant faculty time — transcription, coding, pattern identification across large sets of recordings. He points to research from a colleague in Germany who developed software capable of analyzing nonverbal synchrony between counselor and client, demonstrating that movement synchrony across a session predicts outcome ratings for that same session. 

The implication for counselor education programs is significant: the recordings being made today are not only supervision tools. They are a longitudinal data asset. As analytical methods grow more sophisticated, what programs can learn from their own session libraries — about trainee development, client outcomes, and the mechanisms of therapeutic change — will only expand. 

“The future, I believe, is for companies like yours — because in order to apply all of these platforms, we need to have the recording.” 
— Dr. Carlos Zalaquett 

The evidence Dr. Zalaquett has built over his career makes a straightforward case: counselor education programs that record, review, and systematically analyze session footage produce better-prepared clinicians, more defensible supervision decisions, and outcomes data that can speak to administrators, grant reviewers, and accreditation bodies alike. Video observation is not a supplement to rigorous training — it is what makes rigorous training verifiable. For ACES members working to strengthen their programs, the question is less whether video belongs in the training clinic and more what the program is doing with the footage it already has.

Download our Audio/Video Capture Guide for Healthcare and Higher Education to understand the features and capabilities a video software platform should have, as well as the benefits it can provide when implemented well.

Download Guide

ABOUT INTELLIGENT VIDEO SOLUTIONS 

Intelligent Video Solutions (IVS) develops the VALT video observation and recording platform, built for counselor education programs, clinical training facilities, and health science institutions. IVS is a partner of the Association for Counselor Education and Supervision and co-sponsor of the IVS–ACES Research Grant, which supports peer-reviewed research on the role of technology in clinical skills acquisition and supervision. Learn more at ipivs.com 
