Ohio is in the midst of implementing its new educator evaluation system. Documenting student growth is a key component of that system and is accomplished via a range of measures, depending on what data are available for a given educator. For example, because value-added data are available only for a subset of Ohio's educators, those teaching reading and mathematics in grades 4 through 8, the state's LEAs are developing and incorporating student learning objectives (SLOs) as a means of documenting student growth in non-tested subjects and grades. Likewise, selected vendor assessments are a viable alternative for some grades and subjects. However, these non-value-added measures of growth remain unknown quantities: we do not know how reliable or valid they are, nor whether they match, underperform, or outperform available value-added measures. We also do not know how time- and resource-intensive local deployment of these student growth measures is likely to be, or whether and how they may change, for better or worse, the local culture of schools and districts. To that end, our study is designed to answer key policy and practice questions and, in doing so, to inform ongoing implementation of student growth measures. Specifically, our study will shed light on how student growth measures are being implemented in Ohio, how they affect efforts to improve educator quality, and how they contribute to establishing a robust statewide educational accountability system.