A quick guide to AoL
The Assurance of Learning (AoL) process we have developed follows the guidelines provided by AACSB, the purpose of which is to facilitate continuous improvement at the programme level.
Assurance of Learning process
For a quick overview, download the Wellington School of Business and Government Assurance of Learning Process Workshop (PDF 585 KB).
The Victoria University of Wellington strategic plan emphasises Research, but also Teaching and Learning.
There have been noticeable changes in recent years:
- Increase in emphasis on student engagement, active learning and the student experience.
- Our points structure has changed—and we need to adjust workloads to suit students and teachers.
- Our Business School faculty teaching performance indicators are good, but we could do better.
- In tertiary education worldwide, most academic staff are not trained as teachers, so there is boundless potential for improving Teaching and Learning by upskilling staff in teaching methods and practices (Banta, 2008).
- Last but not least, as a business school we are keen to improve our standing internationally, through benchmarking, process improvements, and recognition through accreditation.
The four key questions of AoL align closely with current educational thinking (e.g. Biggs and Tang, 2007) that: “We should be quite clear about what we want our students to know and provide learning opportunities that build progressively, throughout the students’ programme of study.”
A key difference between AACSB's Assurance of Learning (AoL) system and typical educational continuous-improvement practices is its emphasis on continuous improvement at the programme level, rather than at the individual student level.
The cycle diagram depicts the process we are following.
Mission, vision and values
- First, we define the mission, vision and values.
- Then we determine the desired Learning Goals (LG) and Learning Objectives (LO) for each programme.
- Then we design curricula and map course Learning Objectives to the programme level Learning Objectives/Goals.
- We then deliver courses, providing students opportunities to learn the knowledge, skills and values we have laid out in course LOs and programme Learning Goals.
- We then assess whether students have learnt the desired Learning Objectives.
- We check whether there are gaps, and if so close the loop by revising and going back through the cycle.
- Periodically we revisit the Mission and so on but, for the most part, the cycle repeats from one of its later stages.
Programme Learning Goals and Learning Objectives
The Bachelor of Commerce (BCom) is our main bachelor programme, and we decided on five Learning Goals. Three of these, LG1, LG2, and LG4, follow the University's generic Graduate Attributes:
- critical and creative thinking skills
- communication skills
- leadership skills
Download the BCom Learning Goals and Learning Objectives (PDF 185 KB).
LG3 is important for our faculty, as a business school operating in an international context.
LG5 relates to Major Attributes specific to each discipline area or Major.
For each Learning Goal we have defined more specific, measurable Learning Objectives.
The slide below shows where we finished with LG1 after discussion (which naturally involved much critical thinking, reflection and debate!).
Communication skills (LG2) are a key graduate attribute—they are skills sought by employers, and graduates have confirmed in Graduate Destinations Surveys that communication skills and critical thinking skills are a “very significant factor in getting a job”.
Download Major Student Attributes (PDF 80 KB).
Curriculum cohesion and AoL plans
Next we come to curriculum design and mapping.
Below is a section from one of our curriculum maps—this one is for the BCom (Bachelor of Commerce Degree).
Along the top are graduate attributes, BCom LGs, and below those are the LOs. The courses are listed down the left, with core courses for the BCom first and then the individual courses for a particular major (just the compulsory courses are shown here).
In each cell we record curriculum coverage as H, M or L (high, medium or low); G shows where the objective is currently assessed as counting towards a student’s grade.
The red areas were designed to show where we had decided to assess for AoL. However, some of the places we were going to assess are no longer being graded, so this needs to be changed.
This illustrates the iterative nature of the process. If course coordinators make changes to the course, then there can be repercussions elsewhere. Other staff may need to adapt their courses to fit in with the change, either taking up a graduate attribute that has been dropped or making use of the fact it is now covered elsewhere.
This needs to be a process of ongoing conversation, of give and take, of collaboration, within our teaching groups. No one teacher can be responsible for delivering the entire set of graduate attributes in his/her course, so the design of the curriculum is a team effort, a collective pursuit.
How did we derive these maps? The photo shows two of our colleagues, with the map developed collaboratively for their Major. It's fair to say it involved vigorous, sometimes heated, debate and finally, an agreement to share the coverage of Learning Outcomes and AoL assessments.
One major benefit was discovering that a lot of courses were doing a marketing plan—and they didn’t all need to! Having found out so much new and useful information about the courses, it was decided to make the mapping exercise an annual event.
Assessment of student learning
In this part of the cycle, we are trying to assess what proportion of students have demonstrated that they have achieved the Learning Objectives. Generally there will be three categories: not good enough, good enough, and more than good enough.
To do this we assess the students' work, focusing on the particular Learning Objective. An overall assignment grade can’t be used to measure achievement of Learning Objectives, as each assignment grade will usually reflect a combination of LOs such as critical thinking skills, communication skills and subject knowledge.
While indirect measures such as surveys, opinions, focus groups, employer feedback, and student feedback can be used, direct measures are seen to be more reliable. This generally involves directly assessing a sample of student work using a rubric.
A typical rubric might be for a student presentation to the class.
Download a Rubric for Presentation Skills (PDF 57 KB).
The aspect covering visual aids is stated in a very general way to allow for different styles of presentation in different subjects, from slick multi-media presentations through to writing on a white board. Individual subjects are encouraged to be more specific with students about what they judge to be Exemplary, Satisfactory and Unsatisfactory.
Things to note:
- The traits are listed down the left, with just three categories across: Exemplary, Satisfactory and Unsatisfactory. Teachers sometimes add more boxes when the rubric is also used for marking.
- The order has been the subject of debate. To me, as a scientist, it seems much more logical to go from poor to excellent, left to right. But some of our early users and the SLSS persuaded us to put the Exemplary category on the left-hand side, to signal more clearly the standard to which we want the students to aspire.
- Naming the categories has also been debated—we started off with meets expectations, exceeds and does not meet, but ran into semantic difficulties. Our current categories were introduced in 2009 and seem to work well.
Note some key differences from NCEA:
- no implicit or explicit numerical weighting of criteria (though some staff have added this for their own grading schemes)
- holistic score provides the marker with some discretion, whereas with NCEA the worst criterion score dictates the final overall grade. Here the criteria can be compensatory.
- criteria can include a pre-emptive element—e.g. if plagiarism found, this overrides everything else, automatic fail.
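The difference between the compensatory and worst-criterion approaches, plus the pre-emptive override, can be sketched in code. This is purely illustrative: the numeric stand-ins (2/1/0 for Exemplary/Satisfactory/Unsatisfactory) and the use of a simple average are assumptions for demonstration, not part of the actual rubrics.

```python
# Illustrative comparison of grading rules (hypothetical numeric scores:
# 2 = Exemplary, 1 = Satisfactory, 0 = Unsatisfactory).

def ncea_style(criteria):
    """Worst-criterion rule: the lowest score dictates the overall grade."""
    return min(criteria)

def compensatory(criteria, plagiarism=False):
    """Compensatory rule: criteria offset one another (here via a mean),
    with a pre-emptive override: plagiarism means automatic fail."""
    if plagiarism:
        return 0
    return sum(criteria) / len(criteria)

scores = [2, 2, 0]  # strong on two criteria, weak on one

print(ncea_style(scores))                     # 0: the weak criterion fails the student
print(compensatory(scores))                   # ~1.33: the strengths compensate
print(compensatory(scores, plagiarism=True))  # 0: the override trumps everything
```

The same three criterion scores can thus lead to a fail or a comfortable pass depending on which combination rule the marker applies.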
Check results and design improvements
What must we change if our students haven’t learnt what they needed to?
The results of the assessments are used to identify deficiencies in student achievement of the intended Learning Outcomes.
These then become the focus of discussions on how/where to rectify such gaps. We have identified many gaps and are finding the process useful. We have also had some surprises which are perhaps the best trigger of learning.
We also review our “Assurance of Learning” processes, using a process of continuous improvement. As a result we have improved and streamlined the process and will continue to seek improvements.
As Eileen Peacock (Senior Vice President and Chief Officer, Asia, AACSB International) says, it has to be simple and automated...and we need to be as rigorous with our teaching as we are with our research!
We couldn't have done all this without the support and collaboration of colleagues within the Faculty and around the University.
One needs to establish the governance and infrastructure to support such systems—we have set up a Learning and Teaching Committee that operates at Faculty and School level, and we have developed standard processes for conducting assessments.
Download Flowchart—Procedures and Reporting (PDF 54 KB).
We started with the aim of improving student Learning Outcomes and are confident that the process is helping us to move toward that, step by step, based on the direct evidence of levels of achievement of the intended Learning Outcomes, including graduate attributes.
To comment or to follow up on any of the ideas here, please do feel free to get in touch with the Centre for Academic Development, or contact us directly. We are more than happy to receive feedback, and you are welcome to browse these additional resources.
Programme directors’ guide to Assurance of Learning
1. Familiarise yourself with our AoL system structure and rationale:
For a description of our AoL system and the rationale behind it, see the quick intro on the web and the Beyond assessment paper. These lay out all the nomenclature.
2. Define your programme Learning Goals
AACSB currently recommends 4–8 Learning Goals, with 4–10 Learning Objectives in total, as useful and manageable; any more will likely be unmanageable. LGs should answer the question: what is it that we want our students to learn? Make sure they encapsulate the special features and focus of your programme: why it exists, what market it serves, and what skills you want the graduates to come out with.
The BCom LGs and LOs are available on the Teaching Matters website. They are stated in a way that was consistent with the old Vic Graduate Attributes; we are currently realigning them with the new Victoria Graduate Profile, and in the process cutting them down to a more manageable number. We used to have 16, plus however many MAs each major had, though there was some overlap between the MAs and LGs 1–4. We are endeavouring to reduce them to 4–10 LOs while still covering the Victoria Graduate Profile and the learning skills that our faculty deems important.
3. Develop your curriculum map
Where are you developing each of the LG’s in the programme? Complete a curriculum map, with Courses in the left hand column, against the Learning Goals/Objectives along the top row.
Use a code, e.g. H, M or L for coverage, G if graded, and A if the assessment could be used for AoL.
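The coding scheme just described (H/M/L for coverage, G for graded, A for AoL) lends itself to a simple machine-readable map, which can then be checked mechanically, for example for cells nominated for AoL that are not currently graded. A minimal sketch with hypothetical course codes and objective labels:

```python
# A minimal machine-readable curriculum map (hypothetical data).
# Each cell records coverage (H/M/L) plus flags:
# "G" = assessed for grade, "A" = could be used for AoL.

curriculum_map = {
    ("MGMT101", "LO1.1"): {"coverage": "H", "flags": {"G", "A"}},
    ("MGMT101", "LO2.1"): {"coverage": "M", "flags": {"G"}},
    ("MGMT206", "LO1.1"): {"coverage": "H", "flags": {"A"}},  # AoL planned, not graded
    ("MGMT206", "LO3.2"): {"coverage": "L", "flags": set()},
}

def aol_without_grading(cmap):
    """Flag cells nominated for AoL (A) that are not currently graded (G)."""
    return sorted(
        cell for cell, entry in cmap.items()
        if "A" in entry["flags"] and "G" not in entry["flags"]
    )

print(aol_without_grading(curriculum_map))  # [('MGMT206', 'LO1.1')]
```

A non-empty result signals the kind of mismatch noted earlier, where a planned AoL assessment point has drifted out of the graded curriculum.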
4. Decide where you will do AoL
Make a plan for the next 5–10 years and update it on a rolling basis; each LG needs to be assessed for AoL twice in a five-year period. Execute the AoL as planned and analyse the results. You will need to develop rubrics to suit. Use just three levels: Exemplary, Satisfactory and Unsatisfactory (E, S, U).
Make sure the staff asked to do the AoL understand the task, carry it out as planned, and send the data to the ADLT office. Please refer to the AoL FAQ, or contact Marina or ADLT.
Note: rubrics at post graduate level should be different from undergraduate level (according to AACSB). While the traits may well be the same, the descriptors for E, S and U should represent a higher level of mastery. Put another way, lower performance in the range considered to be Satisfactory at Undergraduate level is likely to be Unsatisfactory at Postgraduate level.
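The twice-in-five-years requirement in the plan above can also be checked mechanically. The sketch below uses hypothetical LGs and assessment years:

```python
# Rolling-plan check (hypothetical data): each Learning Goal should be
# assessed for AoL at least twice in any five-year window.

plan = {  # LG -> years in which it has been (or is scheduled to be) assessed
    "LG1": [2011, 2014],
    "LG2": [2012],
    "LG3": [2010, 2011, 2015],
}

def under_assessed(plan, window_start, window_end):
    """Return LGs assessed fewer than twice within the given window."""
    return sorted(
        lg for lg, years in plan.items()
        if sum(window_start <= y <= window_end for y in years) < 2
    )

print(under_assessed(plan, 2011, 2015))  # ['LG2']
```

Running such a check each year makes it easy to see which LGs most need attention when choosing where to do the next AoL exercise.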
5. Close the loop
Based on the results of step 4, decide what changes need to be made to the programme, and carry out those changes. Record findings and share any lessons learnt. Go back to step 3.
Where masters level programmes have common themes, you are welcome to share learning goals and rubrics, with appropriate tailoring for individual programmes.
Q: I’ve been told I have to do AoL. What does AoL stand for?
A: AoL stands for Assurance of Learning. It’s the way we check whether the students are learning what we want them to learn. In particular we’re looking at the generic skills they develop during the course of the degree programme, like communications, critical and creative thinking and so on.
AoL is required for our accreditation with AACSB, but it also teaches us a lot about which skills our students are good at and which they are not, things we would probably never discover if we relied simply on grades. This is because grades are typically based on a composite of these generic skills and content knowledge.
Q: I’ve been told I have to do AoL. What do I need to do?
A: If you have been told which Learning Goal(s) or Learning Objective(s) to assess, then you need to decide where the best place to assess them would be. Which piece of assessment already covers this? The AoL can then simply be added on, so you assess it at the same time as you do the marking.
If not, then we can look at the 8 year plan to see where the gaps are for your Major, and select the Learning Goal/Objective that most needs doing (one that has never been done, or was done a long time ago). If it’s one that you will be covering anyway, then there shouldn’t be any extra work. If it’s something that fits within your course objectives, then you can consider how to assess it within your course, easily and reliably.
Q: Where is the 8 year plan?
A: On the M: Drive. Ask the Learning and Teaching Administrator, Nicole Green.
Q: Where do I find rubrics?
A: On the Rubrics and Scoresheet Templates page. Look for a rubric that relates to the Learning Goal/Objective you have chosen, or one or more rubrics that cover traits you think the students should demonstrate in the assessment.
If you can’t find a rubric that fits, we can develop one, or if one on the website needs adapting to fit your context, talk to your School Learning and Teaching coordinator or the Associate Dean, Chris Eichbaum.
Q: How do I do the actual scoring?
A: There are a number of ways to complete the actual scoring, ranging from marking a separate rubric sheet for each student, through to collecting scores for all students on a single grid or spreadsheet, by indicating E, S or U, or writing a 1 or a tick in the relevant column. We have a variety of examples for view, contact the Learning and Teaching Administrator, Nicole Green.
If you score on the actual rubric, then students can get a copy so they can see where they did well and where they need to improve. This is especially useful for oral presentations. If you’re doing an exam, or an assignment where the students won’t be getting their own marked rubric back, then it’s easier to just collect the scores on a spreadsheet or grid. All you need to do is assign and record how each student (in the sample) scored on each of the traits being measured, using just the 3 categories E/S/U as per the rubric. Marina will set you up with all the rubrics and scoresheets for your particular assessment exercise.
The Teamwork Preparation rubric is set up for students to assess themselves and each of their teammates on aspects that are considered important when working in teams or groups, like attendance at meetings, preparation before meetings, contributions and how they helped the group as a whole achieve its task. Each student should fill out the form independently and confidentially. Teachers can decide whether to ask students to provide names and feed back the results, or whether the results will be completely anonymous. Encourage students to assess themselves and their peers honestly and independently, and to refer to the rubric descriptors for each trait. When students collude, they are less likely to give useful scores.
Q: How many do I need to sample?
A: See sample size guidelines. Usually 20-25% of the class for large classes. If there are fewer than 120 students in the class, score at least 30, or the whole class if the class is smaller than 30 students.
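The rule of thumb above can be expressed as a small function. This is a sketch of one reading of the guideline (taking 25% for large classes), not an official formula:

```python
def aol_sample_size(class_size):
    """Sample size under the stated rule of thumb: the whole class if
    under 30 students; otherwise at least 30, or roughly 25% of the
    class for large classes, whichever is greater."""
    if class_size < 30:
        return class_size
    return max(30, round(0.25 * class_size))

print(aol_sample_size(25))   # 25: whole class
print(aol_sample_size(100))  # 30: minimum sample kicks in
print(aol_sample_size(200))  # 50: 25% of a large class
```

Note that 25% of 120 is exactly 30, which is why the "at least 30" floor applies to classes smaller than about 120.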
Q: How many assignments (or tests or exams) do we need to copy?
A: Just 15–20 pieces of student work should be scanned (not copied) and saved into a single file.
Collection of AoL data
When assessing student work to collect data for AoL, the sample needs to be representative, at least 20–25% of the class, with a minimum of 30, or the entire class if fewer than 30 students. If the entire class is marked on a rubric, then the marked rubrics may be provided to students to provide comprehensive feedback efficiently.
This is needed only for nominated assignments in nominated courses according to the AoL plan. Send copies of marked rubrics, or a completed scoresheet, showing student identifiers (name or number), to the ADLT office.
Copies of student work for AoL archives
For each AoL exercise, 15–20 examples of marked student work should be scanned, saved in a single file and forwarded to the AoL Administrator. Marked rubrics should accompany each piece of marked work. The sample should include one from the top, one from the bottom and four from around the pass mark; the rest should be spread throughout the range, covering Exemplary, Satisfactory and Unsatisfactory work, as per the AoL rubric.
The file name should follow the format: AoL_Course code_Year_Tri_Assessment description eg AoL_MGMT206_2012_Tri1_Asst1
Scanned file to be forwarded to the AoL Administrator.
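The naming convention above is mechanical enough to generate automatically. A small helper (a sketch, not an official tool):

```python
def aol_filename(course_code, year, trimester, assessment):
    """Build an archive file name following the stated convention:
    AoL_<Course code>_<Year>_Tri<n>_<Assessment description>."""
    return f"AoL_{course_code}_{year}_Tri{trimester}_{assessment}"

print(aol_filename("MGMT206", 2012, 1, "Asst1"))
# AoL_MGMT206_2012_Tri1_Asst1
```

Generating the name rather than typing it keeps the archive consistent and easy to search by course, year and trimester.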
Q: Can I do AoL on groupwork?
A: If you’re assessing how well students can work in groups, then yes; there are several Learning Objectives and rubrics designed for groupwork. However, if you’re assessing an attribute like critical thinking, then no, we can’t use a group report to determine what percentage of students can think critically. (All we can tell from assessing group reports is what percentage of groups had at least one student who could think critically!)
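The parenthetical point can be quantified. If each student independently demonstrates the attribute with probability p, a group of n shows it with probability 1 − (1 − p)^n, so group-level results badly overstate individual attainment. A quick illustration (the independence assumption and the numbers are hypothetical):

```python
# Why group reports overstate individual attainment (illustrative).
# Assume each student independently demonstrates critical thinking
# with probability p; a group "shows" it if at least one member does.

def group_rate(p, n):
    """P(at least one of n independent students demonstrates the skill)."""
    return 1 - (1 - p) ** n

p = 0.4  # hypothetical: 40% of individual students demonstrate the skill
print(group_rate(p, 4))  # about 0.87 for groups of four
```

So even if well under half the students can think critically, nearly nine out of ten group reports would look fine.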
Q: Where can I find out about all this?!
A: See the handy flowcharts on the Teaching Matters website, which show the process to follow when doing an AoL exercise: the Business School Assurance of Learning Checklist, and scanning work before handing marked work back to students.
Q: What happens to the scores?
A: You can tally the results yourself so you can see straight away how your students have done, or you can pass the scoresheets or marked rubrics to us and we will process them. Either way, we will calculate the statistics and send them back to you via your School T&L Rep for comment. It’s important to realise that AoL provides a useful learning opportunity for us too; we usually find out something that we can use to improve the teaching/learning experience next time, either in this course or elsewhere in the degree programme.
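Tallying the results is straightforward. A sketch of the kind of per-trait statistics involved, using hypothetical scores:

```python
from collections import Counter

# Hypothetical AoL scores for one trait, one per sampled student.
scores = ["E", "S", "S", "U", "S", "E", "S", "U", "S", "S"]

def proportions(scores):
    """Proportion of students scoring E, S and U on a trait."""
    counts = Counter(scores)
    total = len(scores)
    return {cat: counts.get(cat, 0) / total for cat in ("E", "S", "U")}

print(proportions(scores))  # {'E': 0.2, 'S': 0.6, 'U': 0.2}
```

The proportion scoring U on each trait is usually the figure that prompts the "close the loop" discussion.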
Q: So what can I learn from AoL? Why should I bother? How can it benefit me?
A: Perhaps some stories from fellow teachers will help to indicate some of the insights that come about through doing AoL.
“I didn’t know anything about AoL, but found myself having to do one on an exam. We compiled a rubric to fit my exam question, based on the rubrics on the website, and I had a copy of this rubric to pin up to refer to while marking. I did the marking of the questions, and after the particular question in the exam, I scored the students on a simple grid. It was interesting to discover that the students didn’t do as well as I’d expected on the analytical aspects, and I had to abandon ‘coming up with results’ because I realised I hadn’t really asked them to do that. I’m wondering why they didn’t do so well on the calculations—maybe because they ran out of time. I’ll think about that more for next time.” Contract lecturer, 3rd year class, 120 students.
“I teach a method I think is really good for generating creative solutions to difficult situations. So I used the ‘Creative thinking’ rubric to assess this Learning Objective when I was marking the exam. I was surprised by the findings. I discovered that the students who came up with the most creative answers actually knew very little and were forced to really think out of the box, whereas the students who followed the methods well came up with sound, well-reasoned solutions that, I suppose, were creative, but because I saw the same ideas from many students, they didn’t seem so creative. I also got the feeling that a lot of the solutions that seemed really unusual might have been due to the student’s different cultural background. It’s something I’d now like to check with a research project to see whether that’s generally true, and maybe the method can be improved if we take that into account. Incidentally, I’d intended to mark just every 4th student to get a 25% sample, but found myself marking virtually the whole class, as it wasn’t any extra effort, and I was finding out more about how my students had done as a cohort.” Associate Professor, 2nd year class, 280 students.
“I run a debate with my students in a 3rd year class. I worked with the ADTL to adapt the Oral presentation skills rubric to cover this type of activity, and the rubric for ‘Debates’ is now on the TM website. As the students were engaged in the debate, I scored each of the groups on the rubrics and later transferred the results to a grid for analysis. The rubric for oral presentations worked well. During the debates, I was impressed with the students’ preparation, organisation, and creativity in the way they dug up info, developed arguments, and conjured up examples. But my main concerns were that they relied too heavily on their notes, put too much detail in their slides, and didn’t address the audience.
One key ingredient in effective delivery is to look up from your notes and make eye contact, with both opponents and the audience, and to speak to different people around the room. But this was exactly what was missing, something that really only became clear once we calculated the proportions of students getting E, S and U on these elements. The following year I added elements relating specifically to a debate, such as:
- ability to critique arguments of opponents
- ability to respond confidently and pertinently to issues raised
- ability to respond by adding new material.
By making these explicit, both I and the students were able to give adequate attention to these and the quality of the debates in subsequent years has been much improved.” Professor, 3rd year class, 40–80 students.
For any questions or feedback, please contact the Learning and Teaching Administrator, Nicole Green.