The Results, Outcomes and My Interpretation
At the beginning of the Information Learning Activity (ILA), the students were asked to select a material to study and to research its living and non-living components. Using the School Library Impact Measure (SLIM) Toolkit (Todd, Kuhlthau & Heinstrom, 2005) as a measurement tool, I was able to assess and track the guided inquiry process in great detail. I conducted two data collection surveys using the SLIM questionnaires, making a change to the suggested time frame. The outcomes of this unit were tied to the ACARA Science Curriculum. For ease of interpretation, results for Questions 1, 2 and 3 in Survey 1 are coloured green and all Survey 2 results are coloured red.
Question 1 – What do you know about your topic?
In these surveys students were asked to think about what they were learning in order to focus their thoughts, and then to write down what they knew about the topic. Responses were analysed individually, and responses to Question 1 from both surveys were categorised into fact, explanation and conclusion statements, as can be seen below in Table 1.
This figure clearly demonstrates an increase in all three areas. However, one difficulty when scoring facts in Question 1 was that some students defined the general topic of living and non-living things rather than writing verb statements to describe “what a concept is or how it is performed” (Todd, Kuhlthau & Heinstrom, 2005, p. 7). Whilst recording results for Questionnaire 1, I noticed there was only one explanation and no conclusions. I was worried that I had not explained the questionnaire correctly or that the students had not understood me. Some students wrote “we don’t know” even after the initial topic launch, voiding their responses. In the image below you will see the response from Student 6, whom I decided to track more closely.
For the purposes of this data analysis, Students 1 to 6 were tracked individually. The educational, social and emotional needs of these students vary greatly, as will be seen in the detailed analysis. Student 1 made one very deep explanation in Questionnaire 1, and as this student is involved in the gifted and talented program, I wondered whether this would make a difference to the overall outcomes of the survey. Looking at the individuals inspired me to consider the total change in learning: I added up all fact, explanation and conclusion statements from Survey 2 and subtracted the corresponding total from Survey 1, giving each student a change score that was then averaged within the two groups of students. Again, Student 1 ranked highest across the cohort; this student is a confident and optimistic person. Interestingly, however, the scores did not match the preconceived learning needs and styles of others.
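The "total change in learning" calculation described above can be sketched as a short script. This is a minimal illustration only; the student names and statement counts below are hypothetical, not the actual study data.

```python
# Sketch of the "total change in learning" score: sum each student's
# fact, explanation and conclusion statements per survey, subtract the
# Survey 1 total from the Survey 2 total, then average within a group.

def learning_change(survey1_counts, survey2_counts):
    """Total statements in Survey 2 minus total statements in Survey 1."""
    return sum(survey2_counts) - sum(survey1_counts)

# Hypothetical per-student counts: (facts, explanations, conclusions)
survey1 = {"Student 1": (3, 1, 0), "Student 2": (2, 0, 0)}
survey2 = {"Student 1": (6, 2, 1), "Student 2": (4, 1, 0)}

changes = {s: learning_change(survey1[s], survey2[s]) for s in survey1}
group_average = sum(changes.values()) / len(changes)
```

Averaging the change scores within each group, rather than comparing raw totals, allows the two groups to be compared even when they contain different numbers of students.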
Overall there is a general increase in all areas between surveys, but interestingly the greatest improvement was in factual knowledge, as demonstrated in Table 3.
Question 2 – How interested are you in the topic?
The SLIM Toolkit questionnaire asks students to consider how interested they are in their topic, and below in Table 4 you can clearly see the increase in student interest between Survey 1 and Survey 2.
The pie graphs below show the changes between each survey and, finally, a comparison of grouped individual interest levels based upon those changes. The grey pie graph, despite demonstrating a 14% decline, represents only three students.
Question 3 – How much do you know about the topic?
This question gauges students' self-awareness of their topic knowledge. Table 8 depicts the quantitative data of students' estimated knowledge of the topic. When surveyed at the end of the ILA, some students wanted to add a fifth box entitled “everything”. This question served as formative assessment and helped me to prioritise whom to track more closely and support. Despite the graph showing an overall increase in knowledge, six students scored the same in both surveys.
Question 4 – What did you find easiest to do?
Results in this area were less detailed than I expected, possibly due to the students' age and their cognitive ability to answer these questions. However, the themes of information searching and internet usage were common across this research project. Students commonly responded to these questions in dot points and in similar ways, as seen in this image. Six questionnaires could not be included due to student absence during the Survey 2 data collection phase; had those students been present, results could have been significantly higher.
The results in Table 9 depict the collated student perceptions of the information search process, broken into six common themes as seen below:
Question 5 – What did you find difficult to do?
The common themes surrounding the information search process continued to be mentioned in the surveys. However, as can be seen in Table 10, there were some significant changes that can be attributed to formative assessment strategies; discussion of these can be found in the analysis. Again, four questionnaires were omitted from this survey due to absence, and their common area of difficulty was internet usage and the strategies used to elicit answers from Google. This table does not capture the difficulty some groups had in staying on task: two particular groups changed their topics in each session, making information selection difficult because the topic questions kept changing. The inclusion of the work environment will be addressed in the analysis.
This pie graph does not reflect three students' decline in interest levels, but the raw data revealed interesting answers to Questions 5 and 6. Student 19 was not interested at all and found “nuthing (sic) hard to do” (SLIM Questionnaire 3), interestingly stating that this was because they “knew efreing (sic)” (SLIM Questionnaire 3). This student's topic was a common material that most Victorian students study in depth, and, with parents who find and make holiday links to learning, they had selected something comfortable.
Question 6 – What did you learn in doing this research project? (Questionnaire 3 Survey 2)
Survey 2 used the SLIM Toolkit Questionnaire 3, which contains two additional questions. This second survey was conducted in a rush at term's end and yielded 25 responses, although two students omitted this question. The question was designed to “generate a student-based summary of their learning” (Todd et al., 2005, p. 17); the data collected did not enable me to analyse information-seeking strategies, only students' main self-perceived learning. Answers to this question were brief and tended to fall into the following three categories.
Table 12 below depicts a clear ring chart with percentages; again, the strength in factual information, based on the evidence above, can be seen.
Question 7 – How do you feel about your research?
The ILA ended with a particularly high sense of achievement, with the presentation of findings becoming a whole-class celebration. There was a wide variety of materials and an even broader assortment of presentation formats, ranging from models, PowerPoint presentations, role plays and television news reports to brochures and experiments. Adding the confident and happy responses together, an overwhelming 95% of students were able to demonstrate their new knowledge with passion and assurance. There was one confused student who consistently selected broad questions and reported ongoing difficulty using the internet, and who could have benefited from closer tracking.
Outcomes and Interpretations
As seen in the gallery below, students were able to “share the product they have created to show what they have learned with the other students in their inquiry community” (Kuhlthau, Maniotes & Caspari, 2012, p. 5). A clear ability to distinguish between living and non-living things was achieved. Students were able to report confidently on their selected materials, describing a range of uses and their specific properties. This ILA was considered successful, as we were able to achieve numerous ACARA content descriptors and the inquiry skills of questioning, predicting and communicating. Wilson accurately states that “assessment data should be used for planning and to ensure that students are actively involved in all aspects of the learning process” (2013, p. 72), and all teachers were part of the process, making the learning journey a success.
Kuhlthau, C. C., Maniotes, L. K. & Caspari, A. K. (2012). Chapter 1: Guided Inquiry Design: The Process, the Learning, and the Team. In Kuhlthau, C. C., Maniotes, L. K. & Caspari, A. K., Guided inquiry design: A framework for inquiry in your school. Santa Barbara: Libraries Unlimited.
Kuhlthau, C. C., Maniotes, L. K. & Caspari, A. K. (2007). Chapter 2: The Theory and Research Basis for Guided Inquiry. In Kuhlthau, C. C., Maniotes, L. K. & Caspari, A. K., Guided inquiry: Learning in the 21st century. Westport, Conn.: Libraries Unlimited.
Todd, R., Kuhlthau, C. C. & Heinstrom, J. E. (2005). School Library Impact Measure: A toolkit and handbook for tracking and assessing student learning outcomes of guided inquiry through the school library. Center for International Scholarship in School Libraries, Rutgers University. Retrieved August 6, 2013, from http://cissl.rutgers.edu/joomla-license/impact-studies?start=6
Wilson, J. (2013). Activate inquiry: The what ifs and the why nots. Carlton South: Education Services Australia.