Characteristics of Professional Development Survey

Background

Not all professional development is equally effective in improving teacher quality. Researchers confirm what teachers already know: teachers in the United States are not participating in well-designed professional development opportunities (Darling-Hammond, Chung Wei, Andree, Richardson, & Orphanos, 2009). To impact teacher effectiveness, professional development must be “high-quality, sustained, intensive, classroom focused” (No Child Left Behind Act of 2001) and “foster collective responsibility for improved student performance” (National Staff Development Council, 2010).

It is critical that teachers have access to professional development designed with the adult learner in mind. Professional development should create opportunities for teachers to take control of their own learning, deepen their subject knowledge, construct knowledge from previous knowledge and experiences, become comfortable with their role as a learner, and develop intellectual camaraderie with colleagues (Bransford, Brown, & Cocking, 2000; Knowles, Holton, & Swanson, 2011). To that end, professional development must be scrutinized for its alignment with known attributes of effective professional development and ultimately judged based on its impact on teacher behaviors, as well as student learning outcomes (Guskey, 2000).

Five empirical indicators of effective professional development surfaced during a review of the literature: (a) duration, (b) active, engaged learning, (c) focus on content knowledge, (d) coherence with teachers’ needs and circumstances, and (e) collective participation (Desimone, 2009; Garet, Porter, Desimone, Birman, & Yoon, 2001). The review also found no existing instruments designed to measure these research-based components of teacher professional development. This lack of available tools limits the ability of educational systems to capture professional development needs and track growth over time. The Characteristics of Professional Development Survey (CPDS) was developed to meet these needs.

Establishing Validity and Reliability

The CPDS was developed in the context of a large teacher professional development program. Initial items were written to match the indicators of effective professional development and pilot tested. After administration to a large group of teachers, an exploratory factor analysis was conducted to provide evidence of construct validity. Principal components analysis (PCA) and principal axis factoring (PAF), each with oblimin rotation, were used to determine the number of factors underlying the set of measured variables. The factor analyses revealed five characteristics of professional development with moderate to strong internal consistency: (a) Collective Participation, (b) Focus on Teachers’ Content Knowledge and How Students Learn Content, (c) Coherence with Teachers’ Needs and Circumstances, (d) Active Learning in the Classroom, and (e) Active Learning Beyond the Classroom. All five factors were found to be reliable, with alpha values ranging from .77 to .94. In two separate studies, scales of the CPDS were found to correlate with measures of student achievement in mathematics and English Language Arts (Soine & Lumpe, 2014; Bishop, Lumpe, Henrikson, & Crane, 2016).
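For readers who want to run a similar check on their own survey data, the sketch below shows a generic exploratory factor analysis with oblique (oblimin) rotation and a Cronbach's alpha calculation. It is not the original CPDS analysis: the file name, the column names, and the use of the open-source pandas and factor_analyzer packages are assumptions for illustration only.

```python
# Minimal sketch of an exploratory factor analysis and reliability check of
# survey items. Assumes item-level responses in a CSV with one column per item.
# NOTE: the original CPDS study used PCA and principal axis factoring; this
# sketch uses factor_analyzer's default extraction with an oblimin rotation.
import pandas as pd
from factor_analyzer import FactorAnalyzer


def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of items (rows = respondents, columns = items)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)


# "cpds_responses.csv" and the column names below are hypothetical placeholders.
responses = pd.read_csv("cpds_responses.csv").dropna()

# Extract five factors (matching the five CPDS subscales) with oblique rotation.
efa = FactorAnalyzer(n_factors=5, rotation="oblimin")
efa.fit(responses)
print(pd.DataFrame(efa.loadings_, index=responses.columns).round(2))

# Internal consistency for one hypothetical subscale (its item columns).
collective_items = responses[["cp_item1", "cp_item2", "cp_item3"]]
print(f"Collective Participation alpha: {cronbach_alpha(collective_items):.2f}")
```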

Below are sample items from each of the CPDS subscales.

Active Learning in the Classroom - In reflecting upon my combined professional development experiences, I reflected on the effectiveness of a lesson.

Active Learning Beyond the Classroom - In reflecting upon my combined professional development experiences, I participated in a coaching cycle (planning, observation, feedback).

Focus on Teachers' Content Knowledge and How Students Learn Content - In reflecting upon my combined professional development experiences, I became more confident in my ability to know the next step I needed to take to deepen students’ conceptual understanding.

Coherence with Teachers' Needs and Circumstances - In reflecting upon my combined professional development experiences, they were aligned with our school’s mission and vision.

Collective Participation - In reflecting upon my combined professional development experiences, I spent time building trusting relationships with my colleagues.

Duration - How frequently did you participate in professional development?

The CPDS is a viable tool for capturing teacher perceptions about characteristics of professional development. The instrument has been used to provide information to state, district, and school leadership about the quality of teachers’ professional development (see this report for an example). The instrument is versatile and cost-effective because it was designed to measure professional development experiences in the broadest sense, and it can be completed online in 8-10 minutes.

Using the Assessments

The CPDS is available for use for a fee of $5 per teacher per administration. Washington School Research Associates staff will score the survey and send instructors and professional development providers a detailed summary of teachers' professional development that includes scores on individual items and on each subscale. The scoring summary can be used to analyze performance on specific items, on individual subscales, or on the overall CPDS average.

Coordinated Consulting Services

Washington School Research Associates staff are available to provide additional consulting services for using the CPDS or for planning professional development activities related to the characteristics of effective professional development.

Ordering the Assessments

Send an email to Washington School Research Associates staff indicating your interest, with a brief description of your intended use (e.g., for a teacher professional learning grant, for a research study, or for other professional development purposes). Also include the following information to help us plan and schedule our scorers:

  • approximate dates of administration
  • approximate number of teachers completing assessments
  • contact information (including an email address) for the person to whom completed scoring reports and fee invoices should be returned

Frequently Asked Questions (FAQ)

1. Is training required to administer the measurement tool?
Training is not required. We provide a short document with straightforward administration instructions.

2. What are the costs involved?
The cost is $5 per teacher per administration. An invoice for that amount will be sent along with the electronic scoring summary.

3. How long does it take teachers to complete one assessment?
Teachers generally complete the assessment in 8-12 minutes.

4. How are the assessments delivered to us, and what is the process to have them scored by you?
The process for administering and scoring the assessments is as follows:

  • You will be sent a web link to the survey, which you share with the teachers who will complete it.

  • After administration, we will prepare a summary report of the results.

  • The score summary report is sent back electronically along with the fee invoice.

5. What are your recommendations for using these assessments?
Our position is to leave it up to clients to decide how these data will best serve them. We do not provide national norms for the assessment scores because the samples of teachers taking the assessment may or may not be representative of teachers as a whole. The assessments are intended for diagnostic purposes, and we suggest they are best used to measure growth or to identify individual and system-wide professional development needs of teachers rather than to compare results to established benchmark scores.

Below are some examples of how others have used these assessments:

  • Some project directors have administered the CPDS as a pretest or formative assessment of professional development needs and used the results to determine specific areas of focus for upcoming professional development activities.

  • Some project directors have used these assessments in a pretest/posttest design to look for gains; some have used gains on individual subscales as well as overall gains to identify areas of specific need or to determine the effectiveness of professional development initiatives.

  • Results could also be correlated with student achievement scores to examine relationships between professional learning and student learning, as sketched below.
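The sketch below illustrates one way such a correlation could be computed. It assumes a teacher-level table pairing a CPDS subscale mean with a class achievement score; the file name and column names are hypothetical placeholders, not part of the CPDS materials.

```python
# Minimal sketch: correlate a CPDS subscale mean with a class achievement score.
# "teacher_outcomes.csv" and its column names are placeholders for illustration.
import pandas as pd
from scipy.stats import pearsonr

data = pd.read_csv("teacher_outcomes.csv").dropna(
    subset=["cpds_coherence_mean", "class_math_achievement"]
)

# Pearson correlation between professional learning and student learning measures.
r, p = pearsonr(data["cpds_coherence_mean"], data["class_math_achievement"])
print(f"r = {r:.2f}, p = {p:.3f} (n = {len(data)})")
```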

 

© Copyright 2016