Blog
Our blog offers top insights and analysis of the education sector from our staff, member deans, and guest authors.
Data, when interpreted in context and used appropriately, is a powerful tool for surfacing important trends and issues in education. For example,…
This event launches a new year of learning for programs participating in the Common Indicators System Network.
In 2017, Deans for Impact partnered with 14 educator-preparation programs in 12 states to build the Common Indicators System (CIS) Network, a…
This post is the third in a three-part series by Common Indicators System Network participants, who are sharing their experiences with the Network and…
This post is the second in a three-part series by Common Indicators System Network participants, who are sharing their experiences with the Network…
For years, Temple University’s College of Education has collected comprehensive and accurate information about what’s working and what could improve. Participation in the Common Indicators System Network is a way to amplify and extend its previous work, offering new and deeper ways to explore the pressing questions that we, and the field more broadly, need to answer.
For the last two years, more than 50 faculty and program leaders from 12 educator-preparation programs have worked in partnership with Deans for Impact to collectively investigate how to better prepare aspiring teachers through what we’re calling the Common Indicators System (CIS) Network. Participants in the CIS Network are now gearing up for our first-ever Inquiry Institute, where they’ll engage in cross-institutional inquiry into data on more than 3,500 teacher candidates, 500 program graduates, and 100 employers across four common indicators, and develop plans to inform improvement when they return to their institutions.
Texas recently announced it would make educator data available to the public to meet the requirements of a bill passed during the 85th Regular Session, which concluded on May 29, 2017.
“You can find evidence to support whatever you want.” It’s a common refrain heard in policy circles – but what happens when the evidence appears to say nothing at all? While one recent study appears to suggest that educator-preparation programs do not vary meaningfully in their performance – and thus data on performance should not be used for policy decisions – a more nuanced understanding of the interrelationship between research, policy, and practice may point to a different conclusion.
Read about using shared data to promote program improvement.