This case study describes an introductory programming course where a very limited time frame for redevelopment allowed only small 'tweaks' to be made to the presentation of content. These changes were in keeping with Cognitive Load Theory, with the express purpose of reducing extraneous load and increasing germane load on working memory.
The assessments (quizzes, programming assignments and exam) from the previous year, before the redevelopment, were re-used to determine the effects of the changes. The improvements were substantial.
Participation (as measured by completion of assessments) improved dramatically, as shown in the graph below:
Even though many students who previously would not have participated handed in assignments, the mean marks for the assessments improved.
The result was a 19% reduction in the failure rate for this course.
The "troublesome course" in this study is a first-year, introductory computer programming course, taken as a compulsory part of an Information Technology degree at an Australian regional university. The students are a mixture of novices with no programming skills, school-leavers with limited programming skills, and students who are self-taught or whose programming skills may be outdated. This mixed cohort brings its own challenges in class management.
Attrition and Ghosts
At the end of 2015, the course had high attrition and many 'ghost' students who enrolled but then disappeared during the semester. This was evidenced by the participation rates in assessments. The course had three quizzes, a minor and a major programming assignment, and an exam. In 2015, the highest participation rate was for Quiz 1 in Week 3, where 70% of students participated - and this was a compulsory assessment! Only 55% of students handed in the major assignment, which was worth 30% of the total grade for the course.
High Failure Rate
Given the non-participation above, it is not surprising that the course also had a high failure rate: 52% of students who remained enrolled past the census date failed the unit.
Performance
The actual performance of the students who completed the assessments was acceptable. The mean mark for each assessment was above the pass mark, and the final exam had a mean mark of 58%.
Changing degree structure
The course and another introductory programming course were about to be superseded by two new courses with a pre-requisite relationship - for the sake of this case study we will call them Programming 1 and Programming 2. These would be introduced in late 2017, so any redevelopment work would not be used after early 2017, and a full redevelopment would be a waste of resources. Despite this, we wanted to improve the course for the students who would take it from the beginning of 2016 to mid-2017.
Time frame
Perhaps the most significant challenge in this redevelopment was the limited time frame in which to change the course at all. The course was handed to us to redevelop a few short weeks before it needed to be delivered. All that was possible were some changes to the presentation of content, without restructuring, resequencing or adding major content.
When there are major problems with a course (high failure rates, attrition, unhappy students and a lack of learning), the whole course is often re-designed and re-developed from the top down. Using Cognitive Load Theory, the content is examined, resequenced, segmented and re-written, using what we know about human cognition and the limitations of working memory. Unfortunately this is an intensive process, and there is sometimes not enough time to complete it before the course needs to be delivered. This case study describes just such a case, where a very limited time frame resulted in only small 'tweaks' being made to the presentation of content in an introductory programming course. These changes were in keeping with Cognitive Load Theory, with the express purpose of reducing extraneous load and increasing germane load on working memory.
With limited hours at our disposal, we targeted three content sources:
The study guide was a topic-by-topic PDF file given to all students to help with self-study. It provided content, directed students to readings, and included activities with answers, as well as some worked examples. The first step was to review the study guide and work out how it could be improved using Cognitive Load Theory.
The study guide was written by someone experienced with programming. It contained objectives at the beginning of each topic (good!), followed by a large amount of information about the topic, with interesting 'asides' about the history of programming interspersed in the midst of the content. Headings were used throughout, but without a hierarchical structure, and there were very few diagrams. In addition, there were substantial sections where students were assumed to have prior knowledge - whether this was deliberate or due to the expertise of the writer, we do not know.
The Changes
Firstly, the "interesting asides" were identified, pulled out of the main content and visually separated. This helped students who might be overloaded to concentrate on the most important aspects of the content, and to look at the 'for interest' sections later if desired. A heading hierarchy was established, with sizes and numbering, to help students mentally structure the content and build schemas in long-term memory. Several diagrams were added in sections that had proven difficult for students to comprehend, such as looping. Sections where knowledge was assumed were reworded so that they directed students back to previous work first, ensuring that students were not left guessing about what was meant. In addition, whitespace was added to visually divide sections and help with segmentation of the content. The two screenshots below show before (left) and after the changes (right). As you can see, the changes are minimal.
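To illustrate the kind of material involved, looping is the sort of topic where a short worked example supports schema-building. The following sketch is purely illustrative - the case study does not specify the course's programming language or its actual examples:

```python
# Illustrative worked example for a looping topic (hypothetical;
# not taken from the course materials).
# Goal: sum the numbers 1 to 5 with a while loop.

total = 0          # accumulator starts at zero
n = 1              # loop counter starts at 1
while n <= 5:      # condition is checked before each pass
    total += n     # add the current counter value
    n += 1         # advance the counter so the loop terminates

print(total)       # 1 + 2 + 3 + 4 + 5 = 15
```

Pairing a diagram of the loop's flow with a step-by-step trace like this is one common way to make the repetition structure visible to novices.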
The second set of changes was made to the slides used in the online lectures. These lectures were delivered live and also recorded for later viewing. An examination of the slides showed that bullet points were used without much differentiation between different types of content. There was little use of whitespace and no use of colour. There were also problems that would lead to split attention - for example, a section of code on one slide with the text that explained it on the next slide.
The Changes
Again, only minor changes were made. Colour was used to differentiate keywords, and whitespace was added to help with segmentation. Sections that would cause split attention were resolved. There were no significant changes to content. The results are shown below - old slides on the left, new slides on the right.
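One way to resolve split attention on a code slide is to integrate the explanation alongside the code itself, rather than on a separate slide. The fragment below is a hypothetical sketch of this idea (the function and its contents are invented for illustration, not drawn from the actual slides):

```python
# Hypothetical slide content: the explanation sits beside the code
# it describes, so learners need not hold one slide in working
# memory while reading the next (avoiding split attention).

def fahrenheit_to_celsius(f):
    # subtract the freezing-point offset first...
    offset = f - 32
    # ...then scale by 5/9 to convert the interval
    return offset * 5 / 9

print(fahrenheit_to_celsius(212))   # boiling point: prints 100.0
```

With code and commentary physically integrated, the learner does not have to split attention between two separated sources of information.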
There were no screencasts in the course prior to 2016; the only multimedia provided for students were recorded online lectures. An essential part of the course is learning to use a development environment for programming. We believed that the modality effect could be used in conjunction with worked examples to help students new to the environment learn the basics of programming, and screencasts were a quick and easy way to do this. Several screencasts were added - between two and four per topic.
To measure the effect of the changes made, we used the same assessments in two consecutive years. The assessments that were used were:
There were several possible confounding factors. In 2015, the course was delivered to distance students and students at two on-campus locations; 88 students completed the course (i.e. stayed enrolled throughout the semester). In 2016, another two on-campus locations were added, and 175 students stayed enrolled throughout the semester. The two additional locations had a large proportion of international students. To ensure that the comparison between years was equitable, only the distance students and students from the original two on-campus locations were considered for 2016 - a total of 98 students.
A control group
It is possible that the 2016 cohort were simply better students than the 2015 cohort. To control for this, we compared two deliveries of a similar technical IT course in the same semesters. The same students were enrolled in both the case study course and the control course, so if the students were more capable in 2016 than in 2015, this would show in the control course as well.
Participation increased significantly in all assessments apart from the exam. The exam is always better attended than the earlier assessments, as some students believe they need to complete the exam to be classed as a student for student-allowance purposes. The number of students participating in each assessment is shown in the table below:
Considering that more students (and perhaps less able students) participated in the assessments, we were unsure whether the marks would stay the same, increase or decrease. The results show that performance increased significantly on three of the assessments; in the remainder, the mean increased but the change was not statistically significant. As a result, the failure rate for the course decreased by 18.6% - a significant amount.
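As a rough check on a drop of this size, a two-proportion z-test on the failure rates can be sketched as follows. The 2016 failure rate is inferred from the reported 18.6-point decrease from 52%, and the per-group counts are approximated from the cohort sizes, so this is an illustrative approximation rather than the study's actual analysis:

```python
import math

# Approximate two-proportion z-test on the failure rates.
# Reported: 52% of 88 students failed in 2015; an 18.6-point drop
# implies roughly 33.4% of 98 students in 2016 (inferred, not given).

p1, n1 = 0.52, 88    # 2015 failure rate and cohort size
p2, n2 = 0.334, 98   # 2016 failure rate (inferred) and cohort size

pooled = (p1 * n1 + p2 * n2) / (n1 + n2)          # pooled failure rate
se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se

print(round(z, 2))   # |z| > 1.96 indicates significance at the 5% level
```

Under these assumed figures the statistic comfortably exceeds the 1.96 threshold, consistent with the study's description of the drop as significant.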
For comparison, here are the results for the control course, held in the same semesters and with the same students. Both were technical IT courses.
The only significant differences in the control course were in performance on Assignment 1 and the exam, and both were in a negative direction. The changes made to the case study course therefore appear responsible for the higher performance and participation of these same students.
In any case study there are limitations. There were fewer than 100 students in each delivery of this course - although many introductory programming courses in Australian universities have this number of students, or fewer, per year.
The course has also been delivered only once in its redeveloped form. A series of deliveries and analyses would make a stronger case study, but it is not practical to re-use the same assessments again.
Copyright © 2017, CAFÉ Toolkit