Bringing the work of the CIS Network back to campus
This post is the third in a three-part series by Common Indicators System Network participants, who are sharing their experiences with the Network and how it’s supporting their efforts to improve through evidence. Read the first two posts in the series here and here.
In August 2017, I headed to Austin, Texas, with four of my colleagues from the University of North Carolina at Charlotte to help launch the Common Indicators System (CIS) Network, a national effort to gather credible evidence of candidate knowledge and skill and program performance using common measures and data collection protocols.
There, we joined more than 40 data and program leads from 12 teacher-preparation programs across the nation. We covered a lot of ground during the launch event, but one of the most powerful activities we did started before we even got to Austin when we were asked, as pre-work, to meet as an institution team to complete a data diagnostic tool developed by Deans for Impact.
We each rated our institution on the tool’s four criteria: developing shared understanding; collecting, organizing, and analyzing data; organizing people to learn; and using data for program improvement. It was illuminating to see how our individual responses differed, and this work gave us the opportunity to discuss our thinking, arrive at consensus on how we rated each of the four criteria, and identify our rationale.
Once in Austin, each team displayed the results of its pre-work. It was fascinating to see how the different institutions rated themselves, and a rich discussion emerged. We heard the rationale behind each team’s ratings and how those ratings reflected program structure. For institutions that rated themselves highly on various criteria, it was valuable to learn about the specific practices that earned them that top rating. As a team, we began to consider how we could take this exercise back to our institution to support program improvement. And we did just that!
Back at UNC Charlotte, we began with our leadership team: deans, department chairs, and directors. Next, we moved to program coordinators, who completed the diagnostic with program faculty. We wanted every faculty member in the college to participate in this work.
We shared the results at our college-wide faculty meeting, with each program posting its results. In groups, faculty did a gallery walk, discussing what they noticed. We also asked them to brainstorm how the college could get to “sustaining,” the diagnostic’s top level. Faculty were very receptive to this activity. They found it powerful to see how programs outside their own were thinking, and to realize that programs within the same college often rated themselves very differently.
From this exercise, we identified two areas, “organizing people to learn” and “using data for continuous improvement,” as growth opportunities; in particular, we needed to set aside time to discuss our assessment data and to involve our school partners in that discussion. Based on these results, we decided to host a “School Partners Data Day,” bringing together faculty from our teacher- and principal-preparation programs with principals and teachers from 14 partner PreK-12 schools.
At the event, we gave an overview of the college’s current initiatives, including a teacher-preparation redesign, and then broke into groups that included both faculty and school partners. These groups completed two activities. First, they reviewed programmatic assessment data and answered questions about the data, such as: What do you find interesting, and why? What are the notable strengths for the program? Second, faculty and school partners brainstormed the “non-negotiables” of a teacher-preparation program: What do candidates need to know and be able to do to be ready on day one in a classroom? Then, each program came to consensus on its top three to four “non-negotiables.”
During this day, we accomplished several things. We learned valuable information from our school partners to help guide our redesign efforts, while strengthening our relationships with them. Participants told us they found it helpful to have time to talk about data, to hear perspectives on the data from both faculty and school partners, and then to collaborate on the trends they saw. The “Data Day” helped move the redesign forward, and we have invited school partners to help with this work. We are also planning our next School Partners Data Day.
Ultimately, our involvement in the CIS Network has stretched our thinking and helped us make improvements in how we discuss our data, gain feedback from stakeholders, and use what we have learned for program improvement.
Dr. Teresa Petty is a professor and associate dean at UNC Charlotte’s Cato College of Education, and a member of UNC Charlotte’s Common Indicators System Network team.