2.8 Data Analysis
Candidates model and facilitate the effective use of digital tools and resources to systematically collect and analyze student achievement data, interpret results,
communicate findings, and implement appropriate interventions to improve instructional practice and maximize student learning. (PSC 2.8/ISTE 2h)
Artifact: School Net (Instructional Data Management System, IDMS)
Reflection
I began using IDMS at the classroom teacher level in spring 2014 and extended its use to my department in fall 2014. I use the IDMS tool to monitor assigned students' performance. This county-wide tool provides me with district assessments, such as benchmarks used to evaluate and gauge Student Learning Objectives (SLOs) for the curriculum content taught in the period leading up to the test. SLO assessments are mandated by the GaDOE and reported twice per year. As a teacher team leader, I use IDMS to create classroom-level pre- and post-assessments for each science content unit.
These science assessments are part of a larger effort to advance student performance. At the classroom level, the assessments I create are shared as hard copies and entered into IDMS by each science teacher. Analysis of this data provides a triangulation of evidence for student learning: test scores, observations, and coursework are all part of the science team's investigation efforts, and these components are discussed weekly at our department meetings. Close monitoring of student growth enables science team teachers to determine student grouping. Small student groups are required for differentiation of instruction and are part of our school improvement plan (SIP). For science instruction, student groups are based on comprehension and reading skills, student content interests, or student technology abilities. Our overall building-level goal is to achieve a 10% increase in post-assessment scores.
As the department chair, I reflect on this data with a data coach and my administrators during weekly leadership meetings. The IDMS system's numerical analysis of science unit assessments allows me to present the findings in several formats; however, administrators have access to this same data at a higher level and can better identify department trends in student achievement. At most meetings the data coach and administrators already have the IDMS data I present, which is a convenience for administrative purposes. In the future I would like access at a higher IDMS level as well. Access beyond my current level would allow me to ensure our science department maintains a tightly scheduled online assessment practice, to better coach my science content team, and to turn our focus toward more curriculum intervention options.