In each of four webinars, partners from the 100Kin10 Networked Improvement Community will share work exploring a different facet of improving K-12 engineering education and what they’ve learned in the process. A collaborative approach to running experiments, the Networked Improvement Community enables organizations to develop and test practical solutions to problems within a shared topic area.
Here we introduce partners Tom Jenkins from Teaching Channel and David Wisniewski from WNET, whose work investigated the impact of video-based professional development on teachers’ understanding.
Why did you join the Networked Improvement Community?
Tom, Teaching Channel:
At Teaching Channel we’re huge believers in professional development for educators through video-based resources, but something we struggle with is showing that our tools and our library are effective. That steered me toward making sure we could develop a process that shows impact: we know viewership is going up, but are these tools effective in changing a teacher’s practice?
David, WNET:
Exactly. I thought being part of a Networked Improvement Community might help us become better at evaluating the impact of our programs.
Tom, Teaching Channel:
Our biggest goal was to measure growth in understanding through video-based learning: specifically, to see whether watching a video about the engineering design process in addition to a video of engineering in action improved K-12 educators’ understanding of engineering content more than watching the engineering-in-action video alone.
What process did you develop to measure that impact?
David, WNET:
To start, we screened engineering videos in our collection with help from the Teaching Channel’s Next Generation Science Squad of teachers. Those teachers offered feedback on how interested their students would be and how much of the engineering design process was truly being shown.
Their feedback gave us a better sense of what teachers want: videos that really show the parts of the engineering design process, not just the end product. It showed us that we need to talk about constraints, brainstorming, and prototypes. It also enabled us to select exemplar videos from our collection to use in our second survey.
We then recruited a new set of K-12 science teachers to do the second survey, which involved a pre-test measuring the participants’ knowledge, skills, and values around teaching engineering. Half the group watched a video targeted at teachers that explained the engineering design process (Using Engineering Design in the Classroom), while the other half did not receive any instruction.
Then both groups watched an engineering-in-action video meant for classroom use (Sitara the Swan) and discussed how they would use it with their students. All participants completed a post-test measuring whether their knowledge, skills, and values had changed.
Tom, Teaching Channel:
We wanted to see if the two-video combination was more effective; the group that didn’t get the Teaching Channel video never received an explanation of the engineering design process. And we did see an increase in scores for the group that watched both videos.
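The interview doesn’t describe how the pre- and post-test scores were compared, so as a purely illustrative aside, here is a minimal sketch of one common way to analyze a design like this: compute each participant’s gain score and compare the two groups with a t-test. The file name, column names, and group labels are hypothetical, assuming the Google Forms responses were exported to CSV.

```python
# Illustrative sketch only: the file name, column names, and group labels
# below are hypothetical; the actual analysis is not described in the interview.
import pandas as pd
from scipy import stats

# Load survey responses, e.g. a CSV exported from Google Forms.
df = pd.read_csv("survey_results.csv")

# Gain score: each participant's change from pre-test to post-test.
df["gain"] = df["post_score"] - df["pre_score"]

both_videos = df.loc[df["group"] == "two_videos", "gain"]
one_video = df.loc[df["group"] == "one_video", "gain"]

# Welch's t-test: did the two-video group improve more, on average?
t_stat, p_value = stats.ttest_ind(both_videos, one_video, equal_var=False)

print(f"Mean gain (two videos): {both_videos.mean():.2f}")
print(f"Mean gain (one video):  {one_video.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```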
What’s next?
Tom, Teaching Channel:
It would be interesting to go further and get a larger sample size. We would also break the results out by grade band, which could give us interesting insight into how teachers at different levels approach this topic. We want to make sure we’re doing everything we can to make our materials useful.
What advice do you have for partners looking to conduct impact evaluation surveys in their own work?
* Neither of us had teachers that we could sit down and interview, but that doesn’t have to stop you from doing something like this. We actually used Google Forms as a pretty effective way to gather data.
* Dissemination is easier if you have networks you can tap into, especially on a short turnaround; for example, having a listserv would have helped us reach a larger sample.
* We could have reached more teachers by trying paid placements on Facebook, where you can be very specific about who you’re targeting.
* Play around with incentives. We offered $10 gift cards to the participants, but maybe a higher value would have been more effective.
* Timing is important. I would want to do this at a different time of year, rather than over the summer. Even with our sample size, the results encouraged us and make us want to expand on this and try it with a larger audience.
Are you a partner interested in learning more? Register here to join the webinar on October 27th.