
September 10, 2013

Comments

Mission-driven? Having worked as a mentor for perhaps a dozen TFA corps members in Oakland, I would say this is an apt term. And the mission, as far as TFA is concerned, is higher test scores.

Most of the TFA teachers with whom I worked were instructed to put large posters on their wall that exhorted their students to higher test performance. Student data was often tracked on wall posters as well.

I saw a first-year teacher whose students had low test scores take her TFA data coach's advice to change her daily instruction so that it more closely resembled the tests. Soon she was giving daily worksheets of multiple-choice and short-answer questions. Her students' test scores rose, but after a month or two they were bored out of their skulls and climbing the walls. In her second year she shifted to a project-based learning approach and got much more engagement. Unfortunately, the year after that, she left to go to medical school.

That is the key problem with TFA. As you point out, experience matters. People become more effective over time on a variety of levels, many of which are not measured by test scores. But so long as test scores are the only yardstick we are using, TFA may look good. If we were to choose basketball players on the basis of height alone, I might make the team alongside Arne Duncan. But there are many more aspects to being a winning ball player, just as there are many more aspects to teaching.

Honest question, not snarky -
On a multiple-choice test, how many more questions did students of TFA teachers get correct compared with students of traditionally trained teachers? What is 2.6 months of learning in multiple-choice-question terms?

I think my question was a bit confusing. What I'm trying to get at is: for students with TFA teachers, how many more multiple-choice questions did they answer correctly compared with students of traditional teachers? Did they get, let's say, 55 of 65 multiple-choice questions correct while the other students got 47 correct? What are we talking about here with months of learning?

Great overview. But I do think the fact that TFA recruits generally had stronger math backgrounds in college could be one of the determining factors in this study. Previous studies have shown strong correlations between math teachers' content knowledge and their students' achievement gains in math. For example:
http://sitemaker.umich.edu/lmt/files/hillrowanball.pdf
http://www.jstor.org/discover/10.2307/3516044?uid=3739832&uid=4578595737&uid=2&uid=3&uid=3739256&uid=60&sid=21102628180727

If TFA focused on providing *secondary math teachers* it would be a lot less controversial. Finding good *math* teachers is a nightmare.

This study poses questions about gender as well. Why was the comparison group so heavily female (79%, as compared with a 59% average for teachers nationwide and 60% in the TFA group)? When so many at-risk students are male, isn't it widely believed that they may perform better in general with a male teacher, for reasons that have nothing to do with teaching technique? Why, then, focus on students with female math teachers in the comparison group? When the results are so minuscule, could this gender discrepancy have had an effect?

Above all, what does a slight bump in math scores on a single test even prove? Does a child's ability to bubble in answers at the command of a teacher in any way prepare him or her for college or for life? Does a 6- or 7-answer difference in test scores represent actual learning, or just the teacher's enthusiasm for the test?

I think it's important here to raise the substantially higher turnover of TFA teachers. If only 15% of TFA teachers remain in the low-income school where they started, whether because they burn out or because they never intended teaching to be a career, then there are significant long-term negatives that far outweigh these small short-term gains in "student achievement."

The report says: "This difference in math scores was equivalent to an increase in student achievement from the 27th to the 30th percentile. This difference also translated into an additional 2.6 months of school for the average student nationwide."

On a 65-question test (assuming one point per correct answer, and loosely reading the 27th and 30th percentiles as 27% and 30% of the questions answered correctly), the ordinary teacher's kids would have scored about 17.6 (0.27 × 65) and the TFA teacher's kids about 19.5 (0.30 × 65).
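For what it's worth, here is a quick back-of-envelope sketch, in Python, of two ways to read that percentile shift in raw questions. The mean and SD below are made-up numbers, since the quoted summary doesn't give the actual score distribution: the naive reading treats percentile rank as percent correct, while the second assumes roughly normal scores.

    # Two ways to turn "27th -> 30th percentile" into raw questions on a
    # 65-item test. The mean and SD are hypothetical placeholders; the
    # real answer depends on the actual score distribution.
    from scipy.stats import norm

    N_ITEMS = 65

    # Naive reading: treat percentile rank as percent of items correct.
    naive_lo = 0.27 * N_ITEMS   # ~17.6 items
    naive_hi = 0.30 * N_ITEMS   # ~19.5 items

    # Normal-model reading: assume scores ~ Normal(mean, sd) in raw items.
    mean, sd = 30.0, 9.0        # made-up values, for illustration only
    model_lo = mean + sd * norm.ppf(0.27)   # ~24.5 items
    model_hi = mean + sd * norm.ppf(0.30)   # ~25.3 items

    print(f"naive reading: {naive_lo:.1f} -> {naive_hi:.1f} "
          f"(gain {naive_hi - naive_lo:.1f} items)")
    print(f"normal model:  {model_lo:.1f} -> {model_hi:.1f} "
          f"(gain {model_hi - model_lo:.1f} items)")

Under the naive reading the gain is about 1.9 questions; under the normal model with those made-up parameters it is under one question. So the honest answer to the question asked earlier is: it depends on the test's score distribution, which the quoted summary doesn't pin down.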

There is an interesting study done at a military college that had complete control over student assignment to math classes. Students were randomized to either new or experienced teachers, then randomized again to different teachers in the second year. Students with a new teacher in the first year got better scores that first year, but students with an experienced teacher in the first year got better scores in the second year; the assumption was that the experienced teachers gave them deeper learning that paid off as they advanced.
http://www.nber.org/papers/w14081.pdf?new_window=1


It is important to note that TFA teachers do not teach courses beyond Algebra 2. To suggest that they are better overall math teachers is misleading.

If a TFAer didn't major in math, it's unlikely that they have the range of knowledge a real teacher has; beyond teaching formulas (perfect for multiple-choice testing!), TFAers do not encourage higher-order thinking skills. Their sole goal is to generate high test scores, not understanding of underlying concepts or the practical applications of advanced math.

Value as instructors? OK for the first and second years of high school, as long as they stick to fundamentals. If a student asks questions beyond what is supplied in the script, however, he or she will be out of luck.
