
A Journey of Design: Developing a Technology-Based Teacher Education Unit Assessment System

 Terence Cavanaugh, Curriculum and Instruction, University of North Florida, USA  tcavanau@unf.edu
Cathy Cavanaugh, Curriculum and Instruction, University of North Florida, USA  ccavanau@unf.edu
Larry G. Daniel, College of Education and Human Services, University of North Florida, USA  ldaniel@unf.edu

 

 Abstract: The National Council for Accreditation of Teacher Education (NCATE) has developed professional standards for accreditation of academic units offering professional education programs. NCATE requires that each unit have an assessment system for collecting and analyzing data on teacher education candidates and unit operations and programs. This paper summarizes efforts to date in creating and utilizing an electronic unit assessment system in the College of Education and Human Services at the University of North Florida.

 

NCATE Standard 2: Assessment System and Unit Evaluation

The unit has an assessment system (technology-based) that collects and analyzes data on applicant qualifications, candidate and graduate performance, and unit operations to evaluate and improve the unit and its programs.

 

The purposes of this paper are to describe procedures used in developing and implementing the teacher education unit assessment system at one institution and to share observations regarding creation of assessment tools, design of the electronic assessment system, and utilization of system data. We illustrate how a teacher education unit assessment system can be effectively designed and utilized to (a) inform teacher educators about the quality of teacher education candidates, (b) develop plans for remediation of candidates and improvement of programs, and (c) make decisions about the operation of a teacher education unit. The assessment system described here was deployed during a successful 2004 NCATE accreditation review.

 Review of the Literature

 

Teacher education programs and curricula have become increasingly aligned with state and professional standards and benchmarks for teacher and student performance (Ambach, 1996; Weisenbach, 2002). Focused heavily on program and candidate outcomes (as opposed to inputs or processes), the new standards require teacher education programs to develop an assessment system based on teacher candidate products (Denner, Salzman, & Harris, 2002). Teacher education programs must develop meticulous record-keeping systems to document the progress of candidates toward mastery of professional standards, with emphasis placed on evaluation of teacher candidate work samples (Fredman, 2002; Tomei, 2002).

 Professional accrediting bodies have raised standards and implemented assessment procedures for assuring teacher candidate proficiency vis-à-vis these standards. With the release of its NCATE 2000 Standards, the National Council for Accreditation of Teacher Education imposed the expectation (Standard 2) that institutions seeking initial accreditation or wishing to maintain continuing accreditation develop a unit assessment system that "collects and analyzes data on applicant qualifications, candidate and graduate performance, and unit operations to evaluate and improve the unit and its programs" (NCATE, 2002, p. 21).

 As outlined by NCATE (2002), a unit assessment system should reflect the unit's conceptual framework, incorporate candidate proficiencies per professional and state standards, and utilize appropriate information technology in housing, storing, and accessing unit data. One such unit assessment system, developed and implemented at the authors' institution, is described in this paper. This system includes timelines for data collection and analysis related to candidate performance and unit operation.

 

The University of North Florida Unit Assessment System

 

The College of Education and Human Services (COEHS) at the University of North Florida (UNF) has developed a versatile assessment system linking the performance of its candidates to the unit's conceptual framework, national and state standards, professional organizational standards and directives, and K-12 student learning. The candidate assessment system was designed to track student progress through the required program tasks, not to act as an electronic portfolio system hosting student work. The system includes a comprehensive and integrated set of evaluative measures useful in monitoring candidate performance and managing the unit's operations and programs. Our system is by no means unique; it resembles systems developed by teacher education programs at other institutions (e.g., Harris, Salzman, Frantz, Newsome, & Martin, 2000). Nevertheless, we describe it as one means of operationalizing standards-based assessment, in the hope that our experiences may be useful to others in the field (see Figure 1).

 
Figure 1. Unit Assessment Design

 

System Description

 The system developed and currently being implemented at UNF allows for (a) tracking of the progress of individual candidates throughout their program of study in terms of their ability to meet professional, state, and program standards related to effective teaching and learning; (b) storage and recall of data for each candidate on a host of measures and artifacts, including pre-admission assessments, critical performance task assessments, candidate portfolios, and end-of-program summative measures; (c) development of summary reports on aggregated strengths and weaknesses of candidates in each of the unit’s teacher education programs; and (d) unit-wide evaluation to determine the progress of the unit in meeting its intended purposes and to provide program faculty and administrators information needed in making changes to improve the unit’s performance.
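
Although the unit's actual database schema is not published here, the tracking, storage, and reporting functions described above presuppose a relational design along the following lines. The sketch below, written for the MySQL/PHP environment described later in this paper, uses hypothetical table and column names (candidate, critical_task, task_score) purely for illustration.

    <?php
    // Minimal sketch of a candidate-tracking schema (hypothetical names).
    // Assumes a MySQL server reachable via PDO; adjust DSN and credentials.
    $pdo = new PDO('mysql:host=localhost;dbname=assessment', 'user', 'password',
                   [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

    $pdo->exec("CREATE TABLE IF NOT EXISTS candidate (
        candidate_id INT AUTO_INCREMENT PRIMARY KEY,
        name         VARCHAR(100) NOT NULL,
        program      VARCHAR(50)  NOT NULL   -- program of study
    )");

    $pdo->exec("CREATE TABLE IF NOT EXISTS critical_task (
        task_id   INT AUTO_INCREMENT PRIMARY KEY,
        course    VARCHAR(20) NOT NULL,      -- course in which the task is assigned
        standard  VARCHAR(50) NOT NULL       -- e.g., an Accomplished Practice code
    )");

    $pdo->exec("CREATE TABLE IF NOT EXISTS task_score (
        candidate_id INT NOT NULL,
        task_id      INT NOT NULL,
        score        TINYINT NOT NULL,       -- rubric score reported by faculty
        term         VARCHAR(12) NOT NULL,   -- e.g., 'Fall 2003'
        PRIMARY KEY (candidate_id, task_id),
        FOREIGN KEY (candidate_id) REFERENCES candidate(candidate_id),
        FOREIGN KEY (task_id)      REFERENCES critical_task(task_id)
    )");
    ?>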

 Candidate data are gathered prior to admission, during each course and clinical experience included in the program of study, at specific transition points during the program, and at the time of program completion. During courses and clinical experiences, candidates are assessed on critical performance tasks identified by faculty within candidates' programs of study and designed to support decisions about candidates' level of proficiency in the knowledge, skills, and dispositions necessary to help students learn. These critical tasks are used to assess the most significant outcomes of each course, and they are linked to several sets of professional standards, including the Florida Educator Accomplished Practices (a set of 12 standards developed by the Florida Department of Education for assuring that teacher candidates, upon graduation, will be prepared to enter a classroom with the minimum skills essential to succeed as a teacher) and the Florida ESOL standards (a set of standards developed by the Florida Department of Education to assure that teachers in the state's schools are adequately prepared to work with students whose first language is not English).

 UNF utilizes a standard database protocol for entering results of the critical task assessments and other candidate data into the system. As candidates complete critical task assignments within courses, course faculty members assess each task using a rubric designed for the assignment and then report a score to the database clerk. Once sufficient data are entered on multiple tasks across many candidates, the data may be compiled, sorted, and printed by program, by candidate, or by the critical task being assessed.
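
Continuing the hypothetical schema sketched earlier, the two basic operations this protocol implies are recording a rubric score for one candidate on one task, and compiling scores by program, candidate, or task once sufficient data accumulate. The values shown are illustrative.

    <?php
    // Record one rubric score (assumes the $pdo connection and the
    // hypothetical tables from the earlier sketch).
    $stmt = $pdo->prepare(
        'INSERT INTO task_score (candidate_id, task_id, score, term)
         VALUES (:cand, :task, :score, :term)');
    $stmt->execute([':cand' => 1042, ':task' => 7, ':score' => 3, ':term' => 'Fall 2003']);

    // Compile mean rubric scores by program and critical task, the kind of
    // sorted summary the unit can print for program-level review.
    $report = $pdo->query(
        'SELECT c.program, t.course, t.standard,
                AVG(s.score) AS mean_score, COUNT(*) AS n
         FROM task_score s
         JOIN candidate     c ON c.candidate_id = s.candidate_id
         JOIN critical_task t ON t.task_id      = s.task_id
         GROUP BY c.program, t.course, t.standard
         ORDER BY c.program, t.course');
    foreach ($report as $row) {
        printf("%s / %s / %s: mean %.2f (n=%d)\n",
               $row['program'], $row['course'], $row['standard'],
               $row['mean_score'], $row['n']);
    }
    ?>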

 

Developing the System

The system is the result of a comprehensive, multi-year effort by a number of COEHS committees and initiatives. Two special task forces worked on refinement of the undergraduate core curriculum common to all of the teacher education programs. The resulting undergraduate core included five broad areas of content: instructional planning, classroom management, human development and learning, assessment, and learners with special needs. The task force efforts resulted in a set of competency statements for each of these five core curriculum areas that expressed what candidates in each program were expected to master. Core courses were redesigned to address these competency areas, and unit goals were consulted in designing a set of common assessment tasks to address the core competencies.

 Prior to the in-house development of the unit assessment system/database, members of the faculty evaluated several commercial tracking systems. The decision was made to develop our own system because the commercial systems available at the time were judged unacceptable on several grounds: cost; failure to meet the college's needs sufficiently; limited flexibility and data access for faculty and students; questions concerning long-term access to data; and poor ease of use for faculty and students.

 Faculty groups then designed critical task assessments for assuring competence of their candidates per the Florida Educator Accomplished Practices, the Florida ESOL standards, and other relevant sets of professional standards, with educational practitioners serving in an advisory capacity, as appropriate, and with the unit’s Continuing Accreditation Team (CAT) providing oversight of these efforts. The COEHS Technology Committee worked diligently during the 2001-2002 and 2002-2003 academic years to create and provide procedures for implementing the electronic database for tracking candidate outcomes. Finally, the COEHS Teacher Education Advisory Council, a unit advisory panel composed of professionals from both within and outside of the University, provided feedback on the development and scope of the unit assessment system at its regular meetings (see Figure 2).


Figure 2. Database Development Timeline

During 2002-2003, efforts were devoted to implementing the program-specific aspects of the assessment system, with attention given to identifying program transition point assessments and/or course-based critical tasks within each program of study to be consistently used to assess the performance of all program candidates. This process included attention to (a) scoring procedures and rubrics for documenting the performance of candidates on each task and (b) planning for the development of a computer-based system for tracking these assessments at the unit level. The 2003-2004 academic year saw refinement of the program-based transition point and critical task assessments. Data from these assessments are currently being used to make decisions about candidate progression through programs and to reflect on the appropriateness and fairness of the assessment measures being employed. Further, the unit's computer-based data tracking system for monitoring task data from these assessments is now fully operational for faculty access. A component of the future design of the assessment database is to tie together the newly developed critical task assessment system with the other unit databases to provide better access to information (see Figure 3).


Figure 3. Database Compilation

 

Candidate Assessment

 At the individual candidate level, the system supports decisions about candidate performance based on multiple assessments made at admission into programs, at appropriate transition points (gateways), and at program completion. A graphic presentation of the candidate assessment procedures used by the unit is provided in Figure 4. Program faculty assess candidates' knowledge, skills, and dispositions through course-based assessments and at various decision-point program gateways. Data from these assessments are used to make decisions about candidate performance at the pre-admission, developmental, and program completion stages. As candidates progress through the educator preparation programs, they are expected to demonstrate increasingly higher levels of knowledge, skills, and dispositions as identified in the unit's conceptual framework and program knowledge bases. As candidates receive feedback following assessments, growth is expected in their planning and delivery of instruction. The feedback given to candidates includes a review of strengths observed, concerns, and specific suggestions for developing knowledge, skills, and dispositions relative to professional and unit standards.



Figure 4. Candidate Program Assessment Plan

 

Course-Based Assessments

 Once candidates are admitted to a program of study, the first level of assessment occurs at the individual course level. Faculty in each program identify course objectives and assess the extent to which candidates accomplish these objectives. A wide variety of assessment types is used within courses to evaluate candidate knowledge, skills, and dispositions. Examples of these assessments are traditional tests and examinations, portfolios, group and individual presentations, reflective essays, lesson and unit planning activities, practicum observations, case studies, and videotape-based skill evaluations. Rubrics, checklists, and other scoring tools are used to assess candidate performance on these activities and to provide feedback to candidates. Course grades serve as one means for assuring that candidates have demonstrated competence in important course-based outcomes. Students in undergraduate programs must obtain grades of C or higher in all courses, and graduate students are typically expected to earn grades of B or higher.

At the undergraduate level, a primary feature of the unit's course-based assessment procedures is the use of "critical task" assessments required of all candidates completing a given course, regardless of the instructor teaching the course or the program of study in which the candidate is matriculating. These critical task assessments are linked directly to the Florida Educator Accomplished Practices, and multiple critical tasks are used for each Accomplished Practice throughout the candidate's program of study, with the goal of thoroughly documenting candidate performance consistent with the depth, breadth, and intent of each practice. Success on the critical tasks is essential to candidate performance in each program course: the tasks are weighted heavily in the course grading system, and in many cases a candidate must successfully complete all critical tasks in a course to earn a passing grade (see Figure 5).

 


Figure 5. Course-Based Assessments

 

Program Assessment

 To thoroughly review each program on an annual basis, program faculty and department chairs examine findings developed through curriculum alignment audits, as well as aggregated internal data on candidate competencies and information from external sources, such as follow-up studies, candidate performance on licensure examinations, employer reports, and state program reviews. Aggregated candidate data collected at the pre-admission stage (number and qualifications of applicants by admission status) and at the intermediate and completion stages (including number of program graduates and graduation rates) are examined. Results of this program evaluation process are used for revising the program curriculum (see curriculum alignment audit below), for improving instruction, for revising field experiences, and for redesigning other components of the program to promote high levels of performance by all candidates.

  

Curriculum Alignment Audit

 The College utilizes database technology to facilitate the program evaluation process. During the fall of 2002, the COEHS Technology Committee, in cooperation with department chairs and the Office of the Dean, used the audit criteria specified by program groups and the Florida Educator Accomplished Practices to develop a database to track and house candidate data. The Technology Committee provided feedback on results of this curriculum audit to program coordinators, who worked with program faculty to clarify program curricula. The electronic database is used to compile data from all critical tasks and program transition assessments as gathered by unit faculty. Each critical task is keyed to the Educator Accomplished Practice(s) and/or Florida ESOL standard(s) to which the task is most directly related. The electronic database also includes fields showing the type of learning addressed by each objective (knowledge, skill, or disposition), the specialized professional association standard associated with the objective (if relevant), whether the objective entails candidate reflection (an underlying theme throughout the unit's conceptual framework), and the general content and form of the assessment, including reference to the scoring tool.
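
The audit fields described above map naturally onto an additional relational table. The sketch below extends the hypothetical schema introduced earlier; the names and types are illustrative, not the unit's actual design.

    <?php
    // Hypothetical alignment-audit table keyed to the critical tasks
    // (assumes the $pdo connection from the earlier sketch).
    $pdo->exec("CREATE TABLE IF NOT EXISTS task_alignment (
        task_id         INT NOT NULL,
        standard_code   VARCHAR(20) NOT NULL,  -- Accomplished Practice or ESOL standard
        learning_type   ENUM('knowledge','skill','disposition') NOT NULL,
        spa_standard    VARCHAR(50) NULL,      -- specialized professional association standard, if any
        reflection      BOOLEAN NOT NULL DEFAULT FALSE,  -- does the task entail candidate reflection?
        assessment_form VARCHAR(100) NOT NULL, -- content/form of the assessment, incl. scoring tool
        PRIMARY KEY (task_id, standard_code),
        FOREIGN KEY (task_id) REFERENCES critical_task(task_id)
    )");
    ?>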

 The electronic database generates reports to assist faculty members within each program in examining the alignment of their curriculum with the Florida Educator Accomplished Practices, the Florida ESOL standards, and other relevant sets of professional standards. Faculty are able to analyze the curriculum holistically by examining the program's focus on appropriate knowledge, skills, and dispositions; by reviewing the various types of critical tasks and other assessments used in the curriculum; and by examining the extent to which candidates as a whole are experiencing success or difficulty in meeting any relevant standard.
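
Against the hypothetical alignment table above, such a report reduces to a simple grouped query; a real report would also break results out by program and join in candidate performance data.

    <?php
    // Count how many critical tasks address each standard, by type of learning
    // (hypothetical schema; assumes the $pdo connection from the earlier sketches).
    $rows = $pdo->query(
        'SELECT standard_code, learning_type, COUNT(*) AS n_tasks
         FROM task_alignment
         GROUP BY standard_code, learning_type
         ORDER BY standard_code, learning_type');
    foreach ($rows as $r) {
        printf("%-12s %-12s %d task(s)\n",
               $r['standard_code'], $r['learning_type'], $r['n_tasks']);
    }
    ?>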

 

Pilot Testing

 The electronic tracking database was pilot tested in summer 2003 in preparation for full implementation of the data collection and data entry process in fall 2003, with faculty in a select number of programs participating in the pilot. During the 2003-2004 academic year, data were gathered for candidates in all unit programs. All critical tasks and/or program transition points associated with all programs of study were formatted for entry into the database by the end of the fall 2003 semester. Program faculty use these data in making decisions on candidate continuation and completion for candidates entering programs in fall 2004 and thereafter. Data on candidate performance were aggregated for review at the end of spring 2004. Programs use these aggregated candidate decision point data in preparing their internal review evaluations during the 2004-2005 academic year.

 

Implementation

 During the first year of implementing the assessment database, faculty recorded their critical task assessment results on a paper form, and the assessment clerk then transferred that information into the database. During the second year of full implementation of the assessment system, the database was moved to a secure server, providing access through a secure internet connection so that faculty could input their data directly. The future design includes plans for candidate access to their personal data through the same secure internet connection.
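
A faculty score-entry page on such a secure server could be as simple as the following sketch. It is written in present-day PHP against the hypothetical schema used throughout this paper; the production interface would sit behind the unit's authentication and differ in detail.

    <?php
    // Minimal faculty score-entry handler (hypothetical; assumes HTTPS, an
    // authenticated session, and the $pdo connection from the earlier sketches).
    if ($_SERVER['REQUEST_METHOD'] === 'POST') {
        $cand  = filter_input(INPUT_POST, 'candidate_id', FILTER_VALIDATE_INT);
        $task  = filter_input(INPUT_POST, 'task_id', FILTER_VALIDATE_INT);
        $score = filter_input(INPUT_POST, 'score', FILTER_VALIDATE_INT,
                              ['options' => ['min_range' => 0, 'max_range' => 4]]);
        if ($cand > 0 && $task > 0 && $score !== false && $score !== null) {
            // REPLACE lets a faculty member resubmit a corrected score for the
            // same candidate/task pair (the primary key) without duplicates.
            $stmt = $pdo->prepare(
                'REPLACE INTO task_score (candidate_id, task_id, score, term)
                 VALUES (?, ?, ?, ?)');
            $stmt->execute([$cand, $task, $score, 'Fall 2004']);
            echo 'Score recorded.';
        } else {
            http_response_code(400);
            echo 'Invalid input.';
        }
    }
    ?>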

 

Personnel and Costs

 The costs for developing, pilot testing, and implementing the unit assessment system during the first two years consisted primarily of personnel costs. Hardware and software are not included in the cost estimates because the college used existing resources. The personnel costs included the time of administrators, faculty, staff, and student assistants. A dedicated assessment clerk was hired to act as a liaison between the assessment system and the faculty and also to perform data entry. A portion of one faculty member's load was designated for overseeing the design, development, and testing of the system. The college's technical support person dedicated a portion of his time to development and programming of the database, the secure server, the web functionality, and the online interface. The college's associate dean and student assistants also contributed to the project. The estimated value of this professional time totaled $80,000 in the first year and $65,000 in the second year.

 

Hardware and Software

 The assessment system was initially designed using Access relational tables to build the elements of the task tracking database. The constructed database tables were then transferred to our current server-housed database. The hardware currently in use is a Pentium III desktop computer running the Windows 2000 operating system, with Internet Information Server 5 for secure web server applications and MySQL server software. The database and web-interface software are open source, and the only additional cost was $50 for a secure certificate. This system allows the unit to access and use the database through a PHP web interface. The college has now budgeted the purchase of a dedicated server to host the database in the university's server farm.

 

Accreditation

 During the 2003-2004 academic year, the College had its NCATE unit accreditation visit. A demonstration of the assessment database system was requested for the entire assessment team at the beginning of the review. A sub-group of the team investigated the assessment databases, meeting with the design and construction group as well as the supervising committees. At the end of the successful review, the NCATE Board of Examiners team members specifically mentioned the college's assessment system in their exit interview. They stated that the "candidate assessment protocol and the correlated data management system is worthy of imitation" and that it "will be a significant contribution to the profession" (personal communication, NCATE Board of Examiners, 2004).

 

Conclusion

 In addition to providing information on individual candidate performance, the design of the electronic tracking database also supports the creation of standard and custom reports for use in evaluation at the program and unit levels. The candidate assessment database permits aggregation of these data for use in identifying program strengths and weaknesses. The aggregated candidate performance data may be combined with other internal unit data (e.g., summaries of candidate complaints and their resolution) and external data (e.g., first-year principal evaluations) for purposes of making decisions about program outcomes and the overall improvement of the college's education design. By tracking task results and associating each task with its courses, the database design provides an effective and easy-to-use system.

  

References

 Ambach, G. (1996). Standards for teachers: Potential for improving practice. Phi Delta Kappan, 78(3), 207-210.

Denner, P. R., Salzman, S. A., & Harris, L. B. (2002, February). Teacher work sample assessment: An accountability method that moves beyond teacher testing to the impact of teacher performance on student learning. Paper presented at the annual meeting of the American Association of Colleges for Teacher Education, New York. (ERIC Document Reproduction Service No. ED463285)

Fredman, T. (2002, February). The TWSM: An essential component in the assessment of teacher performance and student learning. Paper presented at the annual meeting of the American Association of Colleges for Teacher Education, New York. (ERIC Document Reproduction Service No. ED464046)

Harris, L. B., Salzman, S., Frantz, A., Newsome, J., & Martin, M. (2000, February). Using accountability measures in the preparation of preservice teachers to make a difference in the learning of all students. Paper presented at the annual meeting of the American Association of Colleges for Teacher Education, Chicago, IL. (ERIC Document Reproduction Service No. ED440926)

National Council for Accreditation of Teacher Education. (2002). Professional standards for the accreditation of schools, colleges, and departments of education (2002 ed.). Washington, DC: Author.

Tomei, L. J. (2002, February). Negotiating the standards maze: A model for teacher education programs. White paper. Paper presented at the annual meeting of the American Association of Colleges for Teacher Education, New York. (ERIC Document Reproduction Service No. ED463263)

Weisenbach, E. L. (2002). Myth 2: There is no connection between standards and the assessment of beginning teachers. In G. Morine-Dershimer & G. Huffman-Joley (Eds.), Dispelling myths about teacher education (pp. 25-32). Washington, DC: American Association of Colleges for Teacher Education.