
Wednesday, May 20, 2015

Chemistry EOC and CCI Results


I think I've earned a drink (or three) after this semester.  It was dreadful.  But I want to reiterate: it was not the chemistry modeling curriculum I found dreadful.  Rather, it was the combination of a lousy, especially unmotivated group of students, attempting a new curriculum, and a number of other factors outside my control.

I'm not going to lie, I was truly worried about my students' EOC scores.  As I have said before, a large portion of our annual teaching evaluation comes from student growth on their state exams.  Across the board, my students were not showing the strong mastery I usually see.

The EOC results:

Spring 2015 - Modeling Curriculum
58 total students (all standard)

Average Score:  83.5
Median Score:  84
Lowest Score:  66
Highest Score:  96
Failures:  3

2013-2014 - Traditional Instruction
79 total students (all standard)

Average Score:  84.0
Median Score:  83
Lowest Score:  66
Highest Score:  94
Failures:  3

As you can see, the differences are minuscule.  My class average was half a point lower, but my median was a point higher than last year.  My lowest scores were the same, but my highest scores were higher.  It's worth noting that last year's junior class was collectively one of the highest-performing groups of students we've ever had at this school.  Overall, they were a bright and motivated bunch.  This year's juniors are collectively considered the polar opposite of their preceding class.  I only say that to illustrate why I'm not kicking myself over the half-point drop in class average.

What I am kicking myself about is the failures.  I had the same number of failures this year, but with fewer students.  It works out to a 5% failure rate this school year vs. a 4% failure rate last school year.  If that weren't bad enough, 2 of those 3 failures were complete surprises to me.  I had targeted a list of about 8 students I was seriously concerned about.  Of those 8, only 1 failed, which I had unfortunately anticipated.  But my other 2 failures were students with B and C averages in class.  They performed well on my assessments and the practice EOC.  I am dumbfounded as to why neither passed.  While neither was a child prodigy, I had zero indication that they were at risk of failure.  One student actually scored higher on her practice EOC than the actual EOC, and she's not one I would even suspect of cheating.  Consider me stumped.

I gave the Chemistry Concepts Inventory as my final exam.  This is a hard test geared towards college-level students, so I don't base their "grade" on their score.  Instead, I give it as a pre-test/post-test, and they earn a 100 on the final if they show improvement (I don't tell them their pre-test scores).

I plugged their data into another homemade Excel data tracker, hoping to identify some trends in where my students are weakest and strongest:

Pre-test data, with the unit/topics listed at the top for each question.  Blue indicates content I felt was strongly covered.  Yellow marks students I had no data on because they transferred into my class late.

Post-test data, including the difference in score.  The student with a -8 was absent and hasn't taken the post-test yet.  The -3 difference on another student is correct, though.  The #VALUE errors indicate students I didn't have data on or who were exempt from the post-test due to school activities.
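For the curious, here's roughly what the tracker computes, condensed into a short Python/pandas sketch.  My real tracker is just an Excel sheet, so the names, scores, and missing-data handling below are made up for illustration:

```python
# A minimal sketch of the CCI pre/post gain tracking, assuming scores out
# of 22 questions. Students with no pre-test (late transfers) or no
# post-test (absent/exempt) are left blank instead of producing #VALUE.
import pandas as pd

cci = pd.DataFrame({
    "student": ["A", "B", "C", "D"],
    "pre":  [5, 8, None, 4],     # None = transferred in late, no pre-test
    "post": [7, 8, 6, None],     # None = absent or exempt from post-test
})

cci["gain"] = cci["post"] - cci["pre"]   # stays blank where data is missing
cci["earned_100"] = cci["gain"] > 0      # any improvement = 100 on the final

print(cci)
print("Average gain:", round(cci["gain"].mean(), 1))
print("Gains:",     int((cci["gain"] > 0).sum()))
print("No change:", int((cci["gain"] == 0).sum()))
print("Regressed:", int((cci["gain"] < 0).sum()))
```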

The data:

Pre-Test
54 students
Average Score:  5.6 out of 22 questions
Highest Score:  11 out of 22
Lowest Score:  0 out of 22

Post-Test
52 students
Average Score:  6.8 out of 22 questions
Highest Score:  12 out of 22 (not the same student who had the highest pre-test score)
Lowest Score:  1 out of 22
Average Gain:  1.2
Highest Gain:  6
Number of Students with Gains:  35
Number of Students with No Change:  10
Number of Students with Regression:  7

One student (whom I've had severe behavioral/disciplinary issues with all year) actually scored 6 points worse on the post-test than the pre-test.  Lovely.

I tried to track the questions for strongest/weakest topics and patterns on what students answered correctly and incorrectly, but the data was all over the place.  I couldn't identify many real patterns.  Overall, I think the majority of my students were completely guessing both times they took this test.  The students who had no change didn't even answer the same questions correctly both times.  

The only two questions where I saw a noticeable change in results were questions #7 and #8, which are a linked pair.  Question #7 is a true/false question about whether matter is destroyed when a match burns.  Question #8 asks for the reasoning behind the answer to #7.  75% of students answered those questions correctly on the post-test, compared to 46% and 52% on the pre-test.  Nice gains, although I have to shake my head at the 25% who still managed to answer them wrong...






Tuesday, May 5, 2015

EOC Prep

This Thursday is my students' EOC.  They are not even remotely ready.

Friday I had my students take the only EOC practice test our state makes available for chemistry.  You can see the test HERE.



In the few years we have been implementing the chemistry EOC (it began in 2013), I have always found the actual EOC to be much easier than the practice test.  That's a good thing, because my students' scores on the practice test are always depressingly dreadful.  Their scores end up being okay on the actual EOC.

I really liked the Biology EOC data tracker I created last semester, so I made the same thing for chemistry.  I had the students complete the test on Scantron, then entered their scores into an Excel spreadsheet:


Obviously I left the student names out of the screenshot.  Each question is color-coded by unit (1-8).  A "0" indicates an incorrect answer, a "1" a correct one.

Entering the data this way lets me see not only the students' raw scores, but also their strongest and weakest units:


A group of students' raw scores are on the left-hand side of the photo above.  The color coding indicates below basic, basic, proficient, or advanced, based on the state's criteria for number of questions correct.  Yellow indicates the student did not complete the entire practice test.  The "curved" column is my own curve: I factored out the questions covering material I felt I did not teach well enough for students to be able to answer them.  The red text is so I can see who was still failing.  To the right, you can see the students' performance by unit, along with the number of questions I classified under each unit.
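For anyone who wants to build something similar outside of Excel, here's a rough Python/pandas sketch of the same idea.  The 0/1 grid, the question-to-unit map, the "not covered" list, and especially the band cut-offs are all made-up placeholders (the real cut-offs come from the state's criteria):

```python
# Sketch of the raw/curved scores and unit breakdown from a 0/1 answer grid
# (rows = students, columns = questions). All data here is hypothetical.
import pandas as pd

answers = pd.DataFrame(
    [[1, 0, 1, 1],
     [0, 0, 1, 0],
     [1, 1, 1, 1]],
    index=["A", "B", "C"],
    columns=["Q1", "Q2", "Q3", "Q4"],
)
unit_of = {"Q1": "Unit 1", "Q2": "Unit 1", "Q3": "Unit 2", "Q4": "Unit 2"}

raw = answers.sum(axis=1)                             # raw score per student
by_unit = answers.T.groupby(unit_of).mean().T * 100   # % correct per unit

# "Curved" score: drop questions on material I didn't sufficiently cover
not_covered = ["Q2"]
curved = answers.drop(columns=not_covered).sum(axis=1)

def band(score, total=answers.shape[1]):
    # Placeholder cut-offs, NOT the state's actual criteria
    pct = score / total
    if pct >= 0.85: return "advanced"
    if pct >= 0.70: return "proficient"
    if pct >= 0.50: return "basic"
    return "below basic"

print(pd.DataFrame({"raw": raw, "curved": curved, "band": raw.map(band)}))
print(by_unit.round(0))
```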

Tracking the test in this manner also allows me to see the percent of students who answered each question correctly, so I can see where students are collectively making mistakes:


For example, in the questions shown above, only 13 students (22%) answered question #12 correctly.  Question #12 had students predicting the products of the following reaction:

Al + Cl₂ → _____

Almost all of the students chose "AlCl₂" as the answer, because that would make the equation balanced as written.  So we revisited ionic compounds and how the subscripts come from canceling the charges (Al³⁺ and three Cl⁻ give AlCl₃); it's the coefficients, not the subscripts, that balance the equation.

The breakdown of scores is terrifying.  If I weren't so burned out from this group of students and this semester, I'd be in panic mode.  But I've hit the point where I just can't care anymore.

The results show that only 9 of my students scored in the "advanced" range.  18 scored in the "proficient" range.  26 were in the "basic" range and while 5 were "below basic," only 2 of those 5 actually finished the test.

I have serious concerns that 3 of my students will not pass the EOC.  Those 3 students all happen to be minorities, which makes me kick myself even harder.  I hate to think that I'm contributing to the "education gap" on paper.  Unfortunately, there is pretty much nothing I can do now.  These students have struggled ALL year.  I've tried to get them to come in for extra help.  I've tried to pull them aside, or group them with a peer tutor.  I've emailed their parents and coaches.  No luck.

The nice thing about tracking student data in this manner is that it has let me tailor the review material over the past few days.  Instead of making my students review EVERYTHING, we go over the most-missed questions in class... then they each have their own set of review material to complete.  In reality, I just make a review packet for each unit, and students complete the packet for every unit where they scored below 75% (a sketch of that rule is below).
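The assignment rule itself boils down to a single comparison.  A tiny sketch, again with hypothetical percentages:

```python
# Sketch of the review packet rule: one packet per unit scored below 75%.
import pandas as pd

# Hypothetical percent-correct-per-unit table (rows = students)
by_unit = pd.DataFrame(
    {"Unit 1": [50, 80, 100], "Unit 2": [90, 60, 100]},
    index=["A", "B", "C"],
)

for student, flags in (by_unit < 75).iterrows():
    units = [unit for unit, needs_packet in flags.items() if needs_packet]
    print(student, "->", ", ".join(units) if units else "no packets needed")
```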

The test is Thursday.  Hopefully I'll get scores back before the end of the year!







Monday, January 5, 2015

Biology EOC Results

Where's this stuff again?

Right before lunch on our first day of the spring semester, one of the guidance counselors knocked on my classroom door with my students' EOC quick scores from last semester.

Initially, I was elated.  At first glance, I noticed TONS of scores in the 90s!  Last year I could count the students who scored in the 90s on my fingers.  However, the elation faded when I noticed two failures, and very low failures at that.  It only got worse as I began to crunch the numbers.

2014 - Modeling Curriculum
58 total students (standard and honors combined)

Average Score:  88.8
Median Score:  91
Lowest Score:  63 (two failures)
Highest Score:  98


2013 - Traditional Instruction
37 total students (all standard)

Average Score:  81.6
Median Score:  82
Lowest Score:  68 (the only failure)
Highest Score:  92

Those numbers initially look MUCH improved, until you pull out the data from only my standard students this year:

2014 - Modeling Curriculum, Standard Biology Only
12 total students

Average Score:  82.2
Median Score:  85
Lowest Score:  63 (2 failures, both scoring 63)
Highest Score:  94

I'm not about to run a statistical analysis on the data.  While there was a slight improvement in average and median score for standard biology students from 2013 to 2014, I expected to see more.  Plus, I had two extremely bad failures: 63 is the lowest score I have ever had a student earn on an EOC in any subject.  To add insult to injury, while one of the students who scored a 63 is failing the class, the other is passing with a C.  That student had been on my "watch list" all year, but she was very good about completing assignments, which offset her low quiz and test grades.

So why didn't I see more improvement?  Why was I unable to reach some of these students, especially the two failures?  Why was my failure rate so much higher than usual?  Why did my students think the exam was so hard this year, when I thought I was seeing strong mastery in the classroom?  And most importantly-- what do I need to do differently next time around?

The honors scores look okay on their own; I just hope they are "enough" that they won't negatively impact my evaluation at the end of the year.  We are evaluated on growth:  the state has some magic algorithm that predicts what a student should score on their biology EOC based on their prior state science exams.  If a student scores below that number, it impacts you extremely negatively.  If a student scores the expected number or only slightly above it, you receive a mediocre evaluation.  The majority of your students need to score significantly better than their anticipated scores for you to receive a strong evaluation.  When you teach honors students, it's difficult to get the growth needed for a good evaluation.
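Spelled out as a toy sketch, the incentive looks something like this (the state's prediction algorithm is a black box to me, so both the predicted score and the margin below are pure placeholders):

```python
# Toy encoding of the growth evaluation described above. The "predicted"
# score comes from the state's undisclosed algorithm; margin is made up.
def growth_rating(actual: float, predicted: float, margin: float = 2.0) -> str:
    if actual < predicted:
        return "negative"        # scoring below prediction hurts badly
    if actual <= predicted + margin:
        return "mediocre"        # at or only slightly above prediction
    return "strong"              # significantly above prediction

print(growth_rating(actual=91, predicted=88))   # -> "strong" (margin = 2)
```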

On a side note, it's a sad state of education when I'm sitting here stressing about my students' growth scores instead of whether or not they learned enough biology to be successful in the future.  When did my job become more about numbers than education?

And where's that wine anyway?




Wednesday, December 10, 2014

EOC Prep: Data, data everywhere...

What have I been doing?  Data tracking.  Tomorrow is my students' state end of course exam for biology.  It's hard to believe that it's already that time...

Since last Friday, we have been working on practice tests and review.  I have been struggling over the years to find an EOC review method that works best.  Let's just say that the struggle continues.

Some teachers say they start at the beginning of the curriculum, revisit every topic, and then take a practice test.  I've found that approach ineffective for me and my style of teaching:  the students tune out, and none of us get a very good assessment of what we know and don't know until we're done.

Lately, I have been trying to tailor review activities to individual students' needs:  I usually give a practice test as a diagnostic and use that to guide the review material.  Sometimes I have done "whole class" reviews, other years I have done individual review packets.  In a perfect world, I would give a post-practice test afterwards, but we always seem to run out of time.

This year, I sort of blended my methods.  I somehow actually had time to create a real, live data tracker for their EOC practice test results.  Data tracking is something that was stressed heavily in my alternative teacher certification program, but I've never had the time to utilize it in my own classes.  The schools I've taught at haven't mandated it, and it's time-consuming to create all of the necessary tools from scratch.  But somehow this year I had a chance to sit down and make a simple Excel spreadsheet to track my students' performance on the EOC practice test.

Raw data for each student per question, color-coded by unit:  1 for a correct answer, 0 for incorrect.

Raw scores for each student and color-coded performance for each unit.  Missing data from a handful of students, hence the reds.

I also calculated the % of students who answered each question correctly.  I had several students absent, so no question has 100% correct.
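If you want to replicate the per-question percentages, the one wrinkle is keeping absent students from dragging a question down.  A minimal sketch, assuming absences are entered as blanks rather than zeros (my Excel sheet may not handle them exactly this way):

```python
# Sketch of per-question percent correct, skipping absent students (NaN).
# All data here is made up for illustration.
import numpy as np
import pandas as pd

answers = pd.DataFrame(
    [[1, 0, 1],
     [0, 0, 1],
     [np.nan, np.nan, np.nan],   # absent student: blank, not zero
     [1, 1, 0]],
    index=["A", "B", "absent", "D"],
    columns=["Q1", "Q2", "Q3"],
)

pct_correct = answers.mean() * 100   # NaN entries are skipped automatically
print(pct_correct.round(0))
print("Review as a class:", list(pct_correct[pct_correct < 50].index))
```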

The results have been really interesting.  I also have successfully gotten myself worked up into a tizzy.  You can see the numbers for some of my standard biology students in the 2nd picture above, and they are NOT pretty.  But in years past, I do recall students doing significantly better on the actual EOC vs. the practice test.  Last year, I believe only my absolute brightest standard students even passed the same practice test, yet they all had strong scores on the real exam.  Thanks to this data tracker, I should eventually have facts to support my recollection...

Based on the data, students then had to complete a combined class/independent review activity.


There were six questions that fewer than 50% of students answered correctly.  We reviewed those six topics as a class (boring lecture style).  Of these six questions:  
  • One was a cell diagram where it was unclear if the arrow was pointing to the ER specifically or a ribosome on the ER.  That was an easy fix.
  • One involved a coral reef food web.  I was really surprised that this question was missed so frequently, but when I asked the class about it, apparently it was the way that the diagram was drawn that threw a lot of students off.  They misinterpreted the diagram and thought it was a trick question.
  • One (well, technically two, but we combined them) was on testing for macromolecules in food.  We just needed to review the topic, since most students had forgotten which indicator to use for each macromolecule.
  • One (technically two) was on enzymes.  I did not cover enzymes well; we did them the week my classes were disrupted by students taking the ACT Plan test.  So we revisited the topic.
  • One was on the percentage of chromosomes in gametes formed by meiosis.  Now, every single student in my classes can repeat that gametes are haploid and have half the chromosomes of the parent cell.  But for some reason, seeing the number as a percentage threw them for a loop.
  • The last was a tricky pedigree chart for an x-linked recessive disorder.  It began with an affected father and carrier mother who produced affected sons and daughters.  Seeing both males and females affected made many students assume it was an autosomal disorder.
After our whole class review, I gave students the breakdown of how they scored for each unit.  Based on their score, they had to attend stations with review materials for their weakest units.  The idea for the activity was good, but it was difficult to execute.  If I can find a way to streamline and organize the stations, it may work better in the future.

So... that's that.  Tomorrow is the big test.  I should find out their quick scores after the new year and their full results sometime in July.  I am extraordinarily curious to see how my standard biology class will fare and if the class average will be higher than last year's average (I didn't have any honors students last year to compare).  I really want feedback on how effective the modeling curriculum has been for my teaching!!!!