Thursday, March 20, 2014

Wrapping up Rap Research

**Action research is a process used by teachers, administrators, and anyone else in the education field to evaluate their own teaching practices and how students learn, with the goal of making adjustments to improve instruction and maximize student learning (Duganzic, Durrant, Finau, Firth, & Frank, 2009; Mertler, 2012; Pappas & Tucker-Raymond, 2011). **

Over the past few months I've shared with you my research journey. There have been ups, downs, and some slow spots, or should I say "snow" spots. :) The whole process has been a big learning experience. I have learned more about my school, my students, and the writing process, just as the information above suggests I should.  I have learned a lot about writing in our school, although the data didn't show what I expected to find. This piece of research was actually quite surprising and even a little exciting. I now know what I need to study more in order to help our students. 

At this point in the process, I am continuing to work through all of the data I have collected.  As Holliday (2002) said, "Many novice and experienced researchers find out once they have got their data, deciding what to do with it and how to talk about it on paper can become equally crucial and even more problematic."  I am certainly a novice researcher and feel completely overwhelmed as I look at how to share the data I've collected.  I feel like I have a good direction to go, but putting the findings into words will be a challenge.  It is a challenge I am up for, and I am ready to complete this action research project!



Monday, March 17, 2014

Presentation...eek!

I am presenting my research project to my PLC and principal this Wednesday.  I have been working on a presentation to share what results I have at this point.  I feel a little worried that I don't have enough information to call it a successful project.  Snow days have really put a damper on this whole thing.  That being said, I feel like the qualitative data I collected with the interviews, as well as the information gleaned from the ERQs, will be helpful in identifying students' strengths and areas of growth so they can be addressed.  I am in what Holliday (2002) referred to as the "dark night of the soul," where "the process of analysis, sorting and organizing has the potential to take the argument in many directions."  I believe I can make some good projections, but they will not be exactly what I was thinking I would find.  I strongly feel that if I had more time I could draw better conclusions about the effectiveness of the RAP method. 

So, what am I thinking at this point?  Based on the coding of the ERQs, I believe that teachers need to address the importance of underlining the questions, boxing key words, and circling important numbers so that students are aware of what the question is asking.  I also think that when students learn to restate the question in their answer, they will be much more aware of what the question is.

In addition to the ERQ data, the interviews were very insightful as well.  Test anxiety seems to be a big struggle for students, and I believe we, as teachers, need to find some strategies to help students realize that testing is important without making it a fearful experience. I found it quite fascinating that students didn't really worry about tests with multiple choice questions.  It was mainly the writing prompts that had them in a tizzy. 

As far as my question, "What are the effects of the RAP method on how novice students respond to extended-response questions?", I don't really know. I only have about three samples for each student and do not feel they have had enough experience with the whole process to draw conclusions.  Maybe I can get in another data point this week, but with yet another snow day and novels being read in every reading class, I am not hopeful. 


Reference: Holliday, A. (2002). Doing and writing qualitative research. London: Sage.

Tuesday, March 11, 2014

Qualitative Data...so far

As I stated in a previous post, I struggled with coding student interviews.  I only included one page of the interview below, but every page is similar.  In addition to the first page, you can see a page with post-its of the final themes that ran throughout the interviews.  A more in depth description of the coding follows the interviews.


 
 
 
The pages above display the beginning coding for each interview conducted in February.  Two interviews were conducted: one with four fourth grade students and the other with four fifth grade students.  Both interviews were recorded on my cell phone.  I then transcribed the interviews and went back through to code.  As you can see in the pictures above, I color coded each student. All names are pseudonyms created by the students.  After color coding, I counted the number of speaking parts for each student.  The numbers also indicate times when more than one student was talking. 
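The count of speaking parts per student can be sketched in a few lines of Python. This is only an illustration: the transcript lines and pseudonyms below are invented stand-ins, not the actual interview data.

```python
from collections import Counter

# Hypothetical transcript lines in "Pseudonym: utterance" form; the names
# are made-up stand-ins, not the students' actual pseudonyms.
transcript = [
    "Blaze: Kind of nervous.",
    "Rocky: Yeah and scared.",
    "Blaze: I like it more when you have like choices.",
    "Skye: No way!",
]

# Tally speaking parts per student, mirroring the hand count on the coded pages.
turns = Counter(line.split(":", 1)[0] for line in transcript)
print(turns.most_common())  # [('Blaze', 2), ('Rocky', 1), ('Skye', 1)]
```

The same tally would also flag quiet students who might need more direct prompting in a future interview.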

                       

Observations

             Although the atmosphere of each interview was different, the information gleaned from them was very similar.  I used “constant comparative methods” while coding to see how the interviews related to each other (Glaser & Strauss, 1967, as cited in Charmaz, 2006, p. 54).  All eight students indicated negative emotion relating to testing and/or writing.  Concerns expressed included extended-response questions with multiple parts, knowing the answer but struggling when it was time to write it on the page, time, and spelling.  Students were also less concerned about tests with multiple choice questions than about tests with writing activities or writing assignments in general.

Indications of fear or worry

When asked about taking the KPREP –

“Kind of nervous.” 

 “Yeah and scared.”

 “Cause, it’s like, what happens if you get all the answers wrong?”

“Sometimes you might worry that you might not finish like the whole test in time.”

When asked about answering extended-response questions-

“Nervous”

When talking about taking tests-

“I study all the time and then, like, whenever it goes to the test, like, I worry too much and like I forget all the answers.”

“I study all the time, but when I’m, when I’m actually doing the reading test I lock up.”

“Scary. Scary. Scary.”

 

Concerns about multiple parts

“Like you have to write A and B and you have to read through it.”

“You know when you see and you have lots of parts to answer for it…”

 

Concerns about knowing the answer, but not being able to write it correctly

“With open response you have to like write it all out.”

“I know it in my head, but when I put it down on paper it comes out wrong.”

“….I’ll write about it and I’ll get it wrong because I don’t remember about it when we read it…”

“I know the answers. I study all the time, but when I’m, when I’m actually doing the reading test I lock up.”

 

Comments about multiple choice questions

Teacher: Why is it harder to do that than to do multiple choice?

Male student: Because you have four answers to pick out.

 

Teacher: Do you feel nervous about…multiple choice questions, too?

3 Students in unison: No way!

 

“I like it more when you have like choices.”

Table 2

Code Mapping (to be read from the bottom up)

 

THIRD ITERATION: Data Categories

Concerns (CO)                    Emotion (EM)                    Multiple Choice (MC)

SECOND ITERATION: Focused Codes

Spelling (S)                     Emotion (EM)                    Multiple Choice (MC)
Know answer, but forget
     while writing (FW)
Multiple Parts (MP)
Time (T)

FIRST ITERATION: Initial Codes

Spelling                         Emotion                         Multiple Choice
Forgetting while writing         Clarifying negative feelings    Explaining why multiple choice is easier
Worried about time               Scary
Writing
Testing
 

 

 

Resource: Charmaz, K. (2006). Constructing grounded theory. Thousand Oaks: Sage. (42-71).

Quantitative Data...so far

The coding process for the quantitative data is below.  You can see a student example and how it was coded.  The second image is a more in-depth look at the restating category and the all parts category. After that you see the tallies for each category of RAP as well as one for graphic organizers.  The description of all the data is in the paragraphs below.




Table 1

              Restating (R)   Missing Parts (MP)   Proving Answer (PA)   Graphic Organizer (GO)
Total              37                23                   23                      4
4th Grade          21                11                    8                      0
5th Grade          16                12                   15                      4
 

Observations

             Restating the question and/or marking the question (underlining the question, boxing key words, and circling important numbers) was the most missed section of RAP.  After initially marking only whether a sample had missed one part or the other, I decided I needed to specify exactly which part of the “R” students were missing, so I went back through each example to clarify.  I found that only five examples displayed both parts completed correctly.  Two fourth graders and one fifth grader underlined, boxed, and circled, but did not restate the question in their answer.  Two fourth graders and two fifth graders did not underline, box, or circle, but successfully restated the question in each part of their answers.  Eighteen fourth graders and twelve fifth graders did neither.  The question must be asked: if more students had underlined, boxed, and circled important words, numbers, and questions, would they have been more aware of the task at hand and been more likely to restate the question? 

            The next code indicated the number of examples where students did not answer all parts of the question or gave incorrect answers to some or all parts.  Again, I did an initial quick marking to indicate either of these options, then returned to the examples to specify which.  I found that only one student answered every attempted part correctly but left a part blank.  Five ERQs, two fourth grade and three fifth grade examples, had unanswered parts as well as incorrect answers in the parts that were completed.  Nine fourth grade examples and eight fifth grade examples answered all parts, but one or more of the answers given were incorrect. 

            Next, I looked for examples where students were unable to prove their answer using either evidence from the story or their own personal experience.  Twenty-three examples total received the “PA” code.  Fifteen of those samples were from fifth graders while just eight were from fourth graders. 

            Finally, I used the code “GO” to indicate the use of a graphic organizer.  Of the forty-two samples, only four contained a graphic organizer, and all four came from fifth graders.  The question asked was a compare and contrast question, and students are “trained” to use a Venn diagram to display similarities and differences.  Two of those students were reminded that they need to write their answers in sentence form from now on.  The other two students are special education students who had a scribe.  I am not sure whether the scribe drew the Venn diagram on her own or the students prompted her to.  The students should have prompted her, but she was not their usual scribe, so she may not have known.  It is unlikely that a graphic organizer will be used on another ERQ this year.
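The per-grade tallying described in these observations can be reproduced with a quick script. This is a minimal sketch that assumes each sample's codes are stored as a simple list; the sample data below is made up for illustration, and the real tallies are the ones in Table 1.

```python
from collections import Counter

# Hypothetical coded samples: each inner list holds the codes (R, MP, PA, GO)
# written on one student's ERQ. These are invented examples, not the real data.
samples = {
    "4th": [["R", "MP"], ["R"], ["R", "PA"], []],
    "5th": [["R", "MP", "GO"], ["PA"], ["R"]],
}

# Per-grade tallies, then a combined total, mirroring the layout of Table 1.
per_grade = {grade: Counter(code for sample in codes for code in sample)
             for grade, codes in samples.items()}
total = sum(per_grade.values(), Counter())

print(per_grade["4th"]["R"], per_grade["5th"]["R"], total["R"])  # 3 2 5
```

Keeping the codes per sample (rather than only running tallies) also makes it easy to go back later and split a category, the way the “R” code was split into marking versus restating.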

 

Thursday, March 6, 2014

Past Overwhelmed

To say that I feel overwhelmed would be a complete understatement today.  I worked on coding my interviews last night and have to admit that I think I totally messed up.  I pulled out important details, but I did not do it the way that Charmaz (2006) suggested.  I found very interesting insight into the major fears that are associated with writing and testing.  Students were honest about their worries of getting incorrect answers, running out of time, understanding the answer but not being able to verbalize it, and spelling.  I think the information provided was great, but I am really struggling with how to put it down on paper. I will share images of my findings from both the interviews and the ERQ coding in my next post.

Tuesday, March 4, 2014

Snow Days = Some Progress

Thanks to Old Man Winter, I have had a couple of days to feel like I'm getting caught up...a little.  I spent some time this weekend finishing the coding and tallying of the student work.  I did total tallies for each category and then decided I wanted to see grade level specific data.  I think this step falls under "grounded theory" (Charmaz, 2006).   As I stated in a previous post, I also added in the "Graphic Organizer" category.  I will continue to add to this information as I get student work.   Due to technical difficulties, I cannot yet post the photo of the data so far.  It will be part of my next post later in the week, which will include all of my coding of examples as well as interviews and the proposed theories at this time.

This morning, I spent about an hour and a half just transcribing the interviews I conducted in February. Whew! That is a JOB! I plan to spend this evening coding those.  I'm excited to see what I find.

Resource: Charmaz, K. (2006). Constructing grounded theory. Thousand Oaks: Sage. (42-71).




Thursday, February 27, 2014

Little Progress...

      This week has been overwhelming.  I honestly haven't had the chance to put in the time that I truly want to with my research.  Health issues, being mommy, and a full time job have really taken over my schedule.  All that being said, I have thought A LOT about my project and my next steps.  I need to continue coding each ERQ.  Hopefully, I can get caught up as soon as possible, and then the incoming ERQs should be pretty easy to code.  In addition to that, I need to finish transcribing my interviews.  I did a partial transcript of each for a previous post, but I need to get the rest down.  After following my classmates' blogs, I am a little disappointed that I did not videotape them.  I don't think I would have had the personnel to get that done anyway, but I do wish I could see the body language again. Some students were holding a basket in front of them, others played with pencils, while some were completely open and willing to express their true feelings.  I may choose to change this method next time. 
      As I continue this work, I am finding that it is actually fun.  I am enjoying digging deeper into different situations and learning more about my students and students in general. Hopefully we will have full weeks from now on so that I can have a good amount of data to really see how effective the RAP method is. 

Monday, February 24, 2014

Coding

Below is the beginning of my coding efforts that I explained in the last post.  After meeting with Dr. Bowers-Campbell Saturday, I think I am going to add another category for graphic organizers.   You can see that both students below used a Venn diagram to compare and contrast stories.  I know teachers use graphic organizers to teach students to organize their thoughts, but we are wanting students to learn to put their final answers into sentence form.  As you can see, I am tracking when they do not restate the question with an "R" and when they do not answer all parts with an "MP" for missing parts.  When they do not prove their answer, I am marking a "PA."   One interesting observation in the example by Kelsey below is that she got a "4," which is full credit, but she was still missing part of RAP.  She also chose to use a graphic organizer instead of sentence form.  Since we are still in the learning stages of RAP, she was able to earn full credit, but after the method has been taught more, we will expect students to use sentence form to receive full credit for their work.

  Kelsey earned a 4, but still had missing parts.

 This student gets a writer as part of her IEP.

 Tallying...the beginning stages.


   I'm sure I will continue tweaking this process, but this is a start.  I am surprisingly excited to dig into this.  I just wish my schedule allowed me the time to get it done without stressing out. 

Thursday, February 20, 2014

Let the coding begin...

I sat down today to begin coding (Bogdan & Biklen, 1982) writing samples only to realize that I left most of my samples at school. So, I used the 10 or so I had with me to get a start. I am very glad I thought to do this. I am using the code "R" for restating if a student fails to restate the question in their answer. "MP" stands for missing parts if students did not answer all parts of the question. Finally, I am using "PA" when students fail to prove their answer using information from the text or their personal connections. I am also keeping tallies for each category as I write the codes on each piece. So far, all three categories are equal. One interesting observation I've made is that students who earned full credit for their answers still missed parts of RAP. I look forward to diving into the other samples and seeing the trends that emerge.

Resource: Bogdan, R.C. & Biklen, S.K. (1982). Qualitative research for education: An introduction to theory and methods. Boston: Allyn and Bacon.

Tuesday, February 18, 2014

Questions, Questions, Questions

As I sit to examine the data, I have many questions running through my mind.  A few of them I have mentioned in the past and a few more are new.  Bogdan and Biklen (1982) recommend that researchers "force [themselves] to make decisions that narrow the study" (p. 146).  I understand my primary focus and what I ultimately want to know, but I am having trouble narrowing exactly the data I need to accomplish this goal.  Bogdan and Biklen also say that researchers often collect more data than they need.  Although I believe I have a lot of data, I am wondering if I need more qualitative data.  I conducted the interviews with my focus groups.  Other than that, I don't really have any other qualitative data. I have kept every ERQ from the participants.  I believe I need to go back to those and start coding for trends (Bogdan & Biklen, 1982; Rossman & Rallis, 2003).  I would like to see whether students are struggling with restating, leaving out parts, or failing to prove their answers.  What exactly is missing?  I initially thought that I could just use the scores from the RAP rubric, but now that I am thinking about it, I could gain more data by finding specifics within the ERQs.  I will be honest and say that the task seems scary and totally overwhelming to me, but in order for this study to work, I must just do it.

In addition to qualitative data, I am still struggling with the best way to record averages.  Since students are completing the ERQs in their classrooms and two grade levels are involved, not every student will complete an ERQ every week.  I need to decide the best way to document the average info.  I will continue to ponder this. Any ideas will be welcomed!

Resource: Bogdan, R.C. & Biklen, S.K. (1982). Qualitative research for education: An introduction to theory and methods. Boston: Allyn and Bacon.

Thursday, February 13, 2014

Benchmarking Data


1. Benchmark scores using “Kentucky Extended-Response Scoring Guide”

·         4th grade: 3 2 4 4 1 4 1 2 4 4
Ø  Mean: 2.9 (Range 1-4)
Ø  Median: 3.5
Ø  Mode: 4
·         5th grade: 4 1 3 3 2 3 4 3
Ø  Mean: 2.9 (Range 1-4)
Ø  Median: 3
Ø  Mode: 3
·         4th and 5th Combined: 3 2 4 4 1 4 1 2 4 4 4 1 3 3 2 3 4 3
Ø  Mean: 2.9 (Range 1-4)
Ø  Median: 3
Ø  Mode: 4
            Benchmark scores were found using the “Kentucky Extended-Response Scoring Guide,” the guide used to score extended-response questions on the KPREP, to evaluate extended-response answers written before students were taught an “attack” strategy (Kentucky Department of Education, 2012).  Students receive a score from 0 to 4 based on completeness and accuracy.  Students in both fourth grade and fifth grade scored an average of 2.9 on their benchmark extended-response question. 
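The descriptive statistics above can be recomputed from the listed scores with Python's standard library. This is just a sanity-check sketch using the benchmark scores exactly as listed; the grade labels are only variable names.

```python
from statistics import mean, median, mode

# Benchmark scores as listed above (0-4 scale).
fourth = [3, 2, 4, 4, 1, 4, 1, 2, 4, 4]
fifth = [4, 1, 3, 3, 2, 3, 4, 3]

print(mean(fourth), median(fourth), mode(fourth))  # 2.9 3.5 4
print(mean(fifth), median(fifth), mode(fifth))     # 2.875 3.0 3
```

With an even number of scores, `median` averages the two middle values, which is why it can land between two whole-number scores.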
              

2. Average Benchmark Scores using the RAP Rubric

·         4th grade

§  Read/Restate the Question: 0 0 4 4 2 3 2 0 3 4
Ø  Mean: 2.2 (Range 0-4)
Ø  Median: 2.5
Ø  Mode: 0 and 4
§    Answer All Parts: 3 2 4 4 1 4 2 2 4 4
Ø  Mean: 3 (Range 1-4)
Ø  Median: 3.5
Ø  Mode: 4
§  Prove Your Answer: 1 1 4 4 1 4 1 1 4 4
Ø  Mean: 2.5 (Range 1-4)
Ø  Median: 2.5
Ø  Mode: 1 and 4

·         5th grade

§  Read/Restate the Question: 3 0 4 2 4 1 4 3
Ø  Mean: 2.6 (Range 0-4)
Ø  Median: 3
Ø  Mode: 4
§    Answer All Parts: 4 1 3 3 2 3 4 3
Ø  Mean: 2.9 (Range 1-4)
Ø  Median: 3
Ø  Mode: 3
§  Prove Your Answer: 2 0 3 1 2 1 4 1
Ø  Mean: 1.8 (Range 0-4)
Ø  Median: 1.5
Ø  Mode: 1

            Each benchmark extended-response question was also scored using the RAP rubric, which assigns a number from 0 to 4 for each part of RAP based on evidence of that part as well as accuracy.  The data above indicate that fifth grade students had a better grasp on restating the question in their answers than fourth grade students prior to being taught the skill.  Both grade levels displayed almost the same ability to answer every part of the question, with fourth graders scoring a 3 and fifth graders a 2.9.  The fourth graders, however, displayed a stronger ability to prove their answer, with a mean score of 2.5 compared to 1.8 for the fifth graders.

Interesting findings...
I have to admit, I was surprised by the number of 4s students earned on the benchmarking assessments.  I am thrilled that our "novice" students are displaying such knowledge; it just wasn't what I was expecting to see.  I am very excited to see how they continue to work with the RAP method in hand. 

New questions...
I am still struggling with exactly how to handle all of the data.  I know that some teachers (myself included) have been doing some extended-response questions together as we teach the RAP method. Should I include those scores in my data?  I know that the instruction must be scaffolded and doing some together is an important step.  If I use those scores, what is the best way to document that?  I guess those are all questions I need to continue pondering. 

A little more info about my participants...

Here is a little more "formal" information about my participants.  

1. 15 participants (N=15)

·         7 fourth graders

§  1 female

§  6 males

·         8 fifth graders

§  5 females

§  3 males

Fifteen fourth and fifth graders (N=15) participated in the RAP research group.  Of the 7 fourth graders, 1 was female and 6 were male.   Of the 8 fifth graders, 5 were female and 3 were male.  All participants earned a novice on the Kentucky Performance Rating for Educational Progress (KPREP) during the 2012-2013 school year.    

2.    Gap Groups

·         Free or reduced lunch: 48.5% of the school

·         46% of the participants have an IEP for speech, reading, and/or math

       Within this group of novice students, every student was identified in one or more gap groups.  Forty-six percent of the participants were identified as special education students and have an individualized education plan in speech, reading, writing, and/or mathematics.  One participant was of Asian ethnicity but was not considered part of the gap group for ethnicity.  All other students were white/Caucasian.  48.5% of the school received free or reduced lunch. Nine of the fifteen students were male and six were female.

3. Students in tier 3 Response to Intervention Groups

·         4th Grade: 3 out of 7

·         5th Grade: 4 out of 8

            Of the 15 participants, 47% are in a tier 3 RTI group for reading.
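The percentages in this section come from simple participant counts, so they are easy to sanity-check. A quick sketch using only the counts given above:

```python
# Participant counts taken from the sections above.
fourth_graders, fifth_graders = 7, 8
tier3_rti = 3 + 4  # tier 3 RTI readers in 4th plus 5th grade

n = fourth_graders + fifth_graders
print(n, f"{tier3_rti / n:.0%}")  # 15 47%
```

The `:.0%` format rounds 7/15 (46.7%) to the whole-number 47% reported above.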