2c. Use of Data for Program Improvement
2c1. What are assessment data indicating about candidate performance on the main campus, at off-campus sites, and in distance learning programs?
Off-campus and distance learning programs are just emerging on our campus; as a result, we do not yet have data comparing these candidates with candidates in traditional programs. As these programs grow, candidate data will be compiled and comparisons will be made.
2c2. How are data used by candidates and faculty to improve their performance?
Candidates are made aware of their performance through immediate feedback from portfolio assessment, test scores, field supervision, and cooperating teacher evaluations. If a candidate receives a score of 0 or 1 on any portfolio entry or on the philosophical/pedagogical narrative statement, the candidate may not continue in the program without immediate remediation. Faculty are made aware of candidate performance through the CoPRA and departmental assessment committee process. Evidence of faculty discussion of candidate data used for program improvement can be found in department meeting minutes; for example, a summary of the data-driven changes the reading faculty used to move the MSE Reading program into an MSE-PD Reading emphasis program is available in the electronic documents room. Faculty regularly distribute course evaluations for students to complete at the end of each course; the evaluations are returned to faculty after grading is completed. Course evaluations are factored into the promotion and tenure process and into post-tenure review.
2c3. How are data used to discuss or initiate program or unit changes on a regular basis?
In addition to the ongoing CoPRA and department assessment committee process described previously, every program on campus participates in a five-year “Audit and Review” process. The purpose of the audit is to improve the quality of programs; identify needs for additional study and/or planning; help set priorities for reallocating resources; ensure appropriate standards of program quality; identify the needs and unique circumstances of specific programs; identify non-functional or unnecessarily duplicative programs; and identify needs for structural changes in programs or administrative units. The process requires completion of a rigorous, data-driven report that is submitted to a campus-elected body of faculty representing all colleges in the university. The committee reads the submissions, works with the Associate Vice Chancellor to suggest changes, and then meets with program chairs to discuss strengths and weaknesses. Examples of audits and reviews conducted on our programs since the last NCATE visit are included in the electronic documents room.
2c4. What data-driven changes have occurred over the past three years?
There have been several data-driven program changes in the past three years. One example is the creation of a new master’s degree based on analysis of enrollment trends, audit and review reports, and alumni surveys. The new program combines the former MS C&I degree and the MSE Reading degree into a 30-credit MSE-PD with multiple emphasis areas, several of which include licensure at the advanced level (alternative education, reading, ESL/bilingual education). Another data-driven change has been the adoption of a new master’s degree in special education with initial licensure. In addition, the Department of Special Education has revised its portfolio artifact process, moving to standardized products that represent key competencies for the field; in the past, students chose their own artifacts for portfolio submission. This change is expected to improve assessment of the alignment between course content and expected outcomes.
2c5. How are assessment data shared with candidates, faculty, and other stakeholders?
Assessment data are shared in multiple ways, on and off campus. As stated above, a process exists to share data with program representatives (CoPRA) and department assessment committees as part of the overall Unit Assessment Plan. Data are also regularly shared with the Dean’s Advisory Council, constituency groups and advisory boards, and field supervisors and cooperating teachers. Assessment information is shared with prospective students during our “On Campus Days” and in the Phase 1 and Phase 2 meetings, which helps students plan majors, minors, and licensure-area concentrations. Assessment data are also disseminated through our websites, print materials, and alumni magazines. Beyond the college, our assessment data are included in campus-wide assessment and in our 2006 Higher Learning Commission Self-Study. We have also shared information about our surveys nationally through presentations about our program at AERA and AACTE.
1. What does your unit do particularly well related to Standard 2?
The Unit has consistently implemented and monitored the progress of its candidates, both directly through the assessment of all candidates and indirectly through the extensive audit and review process at multiple levels. A thorough, ongoing promotion, tenure, and post-tenure review system is in place on campus. Each department has a designated liaison to the Unit Assessment Committee, and the State Department of Public Instruction liaison has a close working relationship with the Associate Dean, keeping the Unit abreast of changes in policy and procedures. Portfolio rater reliability studies are conducted throughout the college. Faculty supervision places faculty and staff in direct contact with schools and children within our sphere of influence.
2. What research related to Standard 2 is being conducted by the unit?
Please see faculty research in the support section.