Adapting Learning with Entry Tests
Over the course of the term, I have been working to focus my students' work, narrowing what they need to study to prepare for the Official GED Examinations. I have been using the entry tests provided in the online course to create study plans. The first step is to examine the results for lessons where the student was not successful. Using the performance analysis charts provided, we create a list of these lessons. Then we enter these lessons onto a 12-week chart to create a plan.
With the first group of students, we completed this task, from marking to lesson listing to study planning, using the textbook and a blank study plan. I sat down with each student (or discussed their results on the phone), working from the provided performance analysis charts. First, we looked at the overall score to determine how much they would need to focus on the unit, based on the cut score provided. Next, we examined each item. We found the Writing Unit Entry Test straightforward, with the chart pointing to the lesson each item centred on. However, we soon discovered that several lessons were not addressed by the entry test, as it was to be completed in half the time of the Official GED Examination. For the moment, the approach became to complete the lessons identified. As well, the student would complete lesson sets in which at least a couple of lessons had already been flagged as needing work. In a couple of instances, I added related lessons that have typically posed difficulty for students in the course and on the Official GED Exams in the past.
When we moved to the Reading Unit Entry Test, we had more fun. The items pointed to large lesson sets rather than individual lessons. The Reading Unit focuses on reading skills applied across various genres. The performance analysis chart listed the genres as rows and a cognitive taxonomy scale as columns; no reference to the skills or to particular lessons was made. At this point, I started creating my own performance charts in an Excel spreadsheet, going through the questions and connecting them with the lessons. As with the Writing Unit Entry Test, many lessons were not covered.
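A custom chart like this boils down to a mapping from item numbers to the lessons they draw on, which also makes the coverage gaps easy to see. A minimal sketch, with invented item numbers and lesson names standing in for the actual course content:

```python
# Hypothetical mapping from Reading entry-test items to the lessons
# (or lesson sets) they draw on; numbers and names are illustrative.
READING_ITEM_LESSONS = {
    1: ["Finding the Main Idea"],
    2: ["Finding the Main Idea", "Drawing Conclusions"],
    3: ["Drawing Conclusions"],
    4: ["Identifying Tone"],
}

# Lessons no item touches are the gaps the entry test leaves uncovered.
ALL_LESSONS = {
    "Finding the Main Idea",
    "Drawing Conclusions",
    "Identifying Tone",
    "Comparing Texts",  # not tested by any item above
}
covered = {lesson for lessons in READING_ITEM_LESSONS.values() for lesson in lessons}
uncovered = ALL_LESSONS - covered
print(sorted(uncovered))
```

With a chart in this form, the same mapping serves both purposes: scoring items against lessons and flagging lessons the shortened test never reaches.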
From this point, I needed to automate the process to make it easier for students to take the tests and make study plans. Without this ease of use, several students opted to skip the study plans and attempt all 116 lessons. To this end, I put the entry tests online in the learning management system. Upon completion, I pulled the results into the same spreadsheet with the custom performance analysis charts.
The next step was to build a report comparing these results with the custom performance analysis charts. Using Excel formulae no human being should ever have to write, I created a summary sheet. On this sheet, a score was calculated for each lesson: the number of correct responses divided by the number of items for that lesson. For example, if two items applied to a lesson and the student answered one correctly, the score was 50%, and the lesson would be added to the study plan.
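The logic behind those formulae can be sketched outside of Excel. The following is a minimal illustration of the calculation described above, with hypothetical item numbers, lesson names, and responses:

```python
from collections import defaultdict

# Hypothetical data: which lessons each test item targets, and the
# student's responses (True = correct). Names are illustrative only.
ITEM_LESSONS = {
    1: ["Fragments"],
    2: ["Fragments"],
    3: ["Commas"],
    4: ["Commas", "Run-ons"],
}
responses = {1: True, 2: False, 3: True, 4: True}

def lesson_scores(item_lessons, responses):
    """Percent correct per lesson: correct responses over items for that lesson."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for item, lessons in item_lessons.items():
        for lesson in lessons:
            total[lesson] += 1
            if responses.get(item):
                correct[lesson] += 1
    return {lesson: 100 * correct[lesson] // total[lesson] for lesson in total}

scores = lesson_scores(ITEM_LESSONS, responses)
# "Fragments": one of two items correct -> 50, so it goes on the study plan.
needs_study = [lesson for lesson, score in scores.items() if score < 100]
```

Here any lesson below 100% is flagged; the actual cut could of course be set differently per unit.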
Several things started to happen. More students completed the entry tests, many without any prompting. More study plans were created, and I was spending less time creating them. Students were more easily able to list the lessons on the study plan sheet. A large amount of judgement and guesswork was eliminated.
The next step will be to filter lessons in the learning management system. Before this can happen, however, the entry tests will need to be rounded out: items will need to be written for each lesson, and items that touch multiple lessons would work well. An adaptive testing delivery, coming next, would allow the test to include more items while keeping the exam time and length reasonable. More work will also be needed to capture objectives tied to the skills and lessons identified in the text and materials. Items tied to objectives would better inform instructional activities in terms of scope and sequence.
My hope is to replace the spreadsheets with a database to further automate the process. This database would be able to print individual study plans for students and provide statistics for item analysis of the entry exams. Post-tests would then spin off from here and be added to the assessment discussion.
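One piece of that item analysis can be sketched now: a classical difficulty index, the proportion of students answering each item correctly. The student names and response data below are invented for illustration:

```python
# Hypothetical response data: one dict of item -> correct? per student.
results = {
    "student_a": {1: True, 2: False, 3: True},
    "student_b": {1: True, 2: True, 3: False},
    "student_c": {1: True, 2: False, 3: False},
}

def item_difficulty(results):
    """Proportion of students answering each item correctly (0.0 to 1.0)."""
    items = sorted({item for responses in results.values() for item in responses})
    return {
        item: sum(responses.get(item, False) for responses in results.values())
        / len(results)
        for item in items
    }

difficulty = item_difficulty(results)
# Item 1 was answered correctly by all three students -> 1.0.
```

Items near 1.0 or 0.0 tell the test little about who needs which lesson, so they would be the first candidates for revision as the item bank grows.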
I can see so many points of extension and collaboration.