Learning through Reflective Dialogue: Assessing the Effectiveness of Feedback Vivas

Discourse Volume 11 Number 2 Summer 2012

       Benjamin Franks and Stuart Hanscomb
       School of Interdisciplinary Studies (Dumfries Campus)
       University of Glasgow

       This article is based on a mini-project funded by the Subject Centre for Philosophical and
       Religious Studies of the Higher Education Academy. We would like to thank the Subject
       Centre for their support, and we would also like to thank the anonymous students at
       University of Glasgow’s School of Interdisciplinary Studies who participated in the research.

Introduction

Academics from a range of higher education disciplines frequently complain that written
feedback is not being acted upon by students (e.g. Glover and Brown, 2006; Higgins et al,
2001; Williams, 2003). David Carless’ anecdote about a student for whom ‘it was not his
practice to collect the assignment if the mark awarded was low!’ is indicative of the problem
(2002: 355). One obstacle can be the quality of the feedback provided (Carless, ibid), and
accordingly a body of research and recommendations has been generated in recent years
concerning such issues as the time it takes to return work, provision of clear advice that will
help students to improve their performance in future assignments, the tone in which it is
written, and transparency in the assessment process (e.g. Juwah et al, 2004; Lizzio and
Wilson, 2004; Price, Carroll et al, 2011). Another problem is encouraging students to
actively engage with the comments they receive. If students are not reading feedback – or at
least not reading it in such a way that it will make a difference to the work they subsequently
produce – then its quality is largely irrelevant. It is to this latter issue that this article is
primarily addressed: the task of motivating students to attend and respond effectively to their
tutors’ comments. It is also hoped, since the pivotal feature of the method under discussion is
a discursive conversation between student and teacher, that this study will contribute to the
current debates concerning the value of dialogue in higher education teaching (e.g.
Brockbank and McGill, 1998; Juwah et al, 2004; Carless, 2006; Larvor, 2006; Nicol and
MacFarlane-Dick, 2006; Golding 2011; Wass et al 2011). Within philosophy dialogue has
traditionally been highly regarded, both as a method of learning and as a topic of study in its

own right (e.g. Walton, 1989; Gilbert, 1997; Tindale, 2004). For this reason the authors are
optimistic that the feedback viva will have amplified relevance and appeal to those teaching
in this area.

The feedback viva
Context

The ‘feedback viva’ has been trialled with students from a range of subject specialisms, and
with a range of class sizes that place varying degrees of pressure on staff time. For the past
three years it has formed part of the assessment on an applied moral and political philosophy
course called Issues in Contemporary Society. ICS is a Level 2 compulsory course on two
degree pathways (MA in Primary Education and Liberal Arts), and a recommended but non-
compulsory course for two other degrees (BSc Environmental Stewardship and MA Health
and Social Studies). Class sizes vary from the low 20s to the high 60s.

The process

The viva itself is a 10-minute, formally assessed dialogue between a student and two
academic members of staff. The grade awarded is worth 10% of the course total. Prior to the
meeting, the student has submitted their course essay (1500 words, and worth 30% of the
total course mark) and received it back, several days in advance of the viva, with written
comments. (The grade for the essay is withheld until after the viva.) In the viva the student
and academics (at least one of whom has marked the essay) discuss, in a semi-structured
format, the written feedback from the assignment.

        The aims of the feedback viva are six-fold:

             •   To ensure that students read the comments. The minimum requirement to
                  ensure a grade is to have read and understood (or at least to demonstrate an
                  attempt to have understood) the tutor’s written feedback.
             •   To provide an opportunity for the student to respond to the comments and
                 highlight areas of ambiguity in, or disagreement with them.

            •   To prompt the student to reflect upon strengths and weaknesses in their work,
               both with respect to the essay in question, and with respect to their
               performance on other courses.
            •   To provide advice on, and prompt the student to reflect upon, means of
               overcoming weaknesses and/or enhancing strengths when working on future
               assignments. This aim extends to work habits (such as time management) as
               well as issues concerning content and structure.
           •   To allow the students to reflect upon the skills learnt on this course and their
               wider application.
            •   To provide an opportunity to practise constructive dialogue.

The expected outcomes tied to these aims are demonstrations:

           •   that the student has read and understood the feedback
            •   of the student’s ability to identify strengths and weaknesses in the essay (form
               and content), in part through their articulation of the main points in the
               feedback
            •   of the student’s ability to reflect on the relationship between this essay’s
                strengths and weaknesses and their academic performance and output in general
           •   of how the ideas and advice in the feedback, and as generated by the viva
               discussion, could be directed towards improving this essay, and/or future
               academic work
           •   of the student’s ability to reflect on the skills learnt on this course and their
               wider application

The viva itself is based around the following questions:

           1. Do you understand the feedback?
                  a. Are there points you need clarification on?
                  b. Do you disagree with any of the comments?

           2. What would you say are the essay’s main strengths?

             3. What would you say are its weaknesses, and how would you improve these if
                 you could write the essay again?
             4. Are these strengths and weaknesses specific to this essay, or do they represent
                 more general tendencies in your assignments?
             5. What skills (if any) have you learnt on the course?

These broad questions typically lead to more specific ones: for instance, asking the student to
clarify the meaning of a particular sentence, to articulate a corrected account of a
misunderstood concept, or, if their conclusion was unclear in the essay, to say whether they
have now reached a more identifiable position. Applying the criteria from the intended outcomes
to responses to these questions provides the basis for a formal assessment of the viva.
Comprehensive guidance on the process and marking criteria is provided in advance via the
course handbook and the virtual learning environment. Because this assessment technique is
unusual, additional time is made available in lectures and tutorials to answer any questions
that might arise.

Precursors

The feedback viva as described is not wholly original, although it is distinct from mini-vivas
and other reflective feedback methods, such as those instituted by Carless (2002) at the Hong
Kong Institute of Education, Steve Prowse et al’s (2007) viva system at Wolverhampton
University, and the ‘gold standard’ of the Oxbridge tutorial system (e.g. Palfreyman, 2008). To
take the latter first, the usual format is for a student to write an essay once a week, read it
aloud and discuss it orally with the tutor who provides feedback. This is a formative
assessment quite separate from the summative examination at the end of session. As Graham
Gibbs and Claire Simpson (2004, 8) indicate, few institutions can match the Oxbridge
Colleges in this manner, largely for reasons of resources. The feedback viva, by contrast,
occurs just once in the course, is semi-structured and summatively assessed, and lasts just 10
minutes.
        The structure of the Wolverhampton system involves a first submission of work by
the student, which the tutor then returns with written comments. The student then has a viva
based on the written comments, and following this she rewrites the essay, and only this
second draft is summatively assessed (Prowse et al, op cit). The main differences between
this system and the feedback viva are that there is no re-submission of the essay, and a mark
is awarded for viva performance. In so far as it does not require the additional second
marking of the revised essay, the feedback viva is more resource-friendly, and it avoids some
grade-related unfairness. Those Wolverhampton courses which allow for double submissions
are likely to have higher grades than those which do not.
       Carless’ ‘mini-viva’ is more complicated than the one utilised here, with feedback on
a draft of the assignment, a pre-submission tutorial and then submission of the assignment.
Like the feedback viva, the mini-viva takes place after the submission of the essay, but since
it is carried out in groups of three the cost in time is less (Carless, 2002, 357). The feedback
vivas take approximately 12-13 hours for an assessment of 50 students, as opposed to the 4-5
hours cited for Carless’ method (ibid, 360), but when the reading and feedback on drafts and
the pre-submission tutorials are taken into consideration the respective time commitments are
likely to be similar. There are otherwise two main differences between these systems: the first
is that the feedback viva prompts reflection on patterns in the students’ approach to work and
academic performance in general, not just on the essay that has been marked; and the second
is that the feedback viva, unlike the mini-viva, awards a grade for viva performance, meaning
that the student has potentially more incentive to engage with the dialogue. Also, as the
feedback viva is conducted with individual students, it has the additional advantage of being
a more personal, flexible and confidential encounter.

Research

Whilst the authors’ experiences of running feedback vivas have been very positive, and
anecdotally it has been largely welcomed by students and spoken of as a valuable method of
learning and assessment, more objective evidence of its worth is required in order to verify,
and shed more light upon, these impressions. If such evidence is found, a case can be made with
confidence for its introduction beyond the one course on which it is currently employed. The
aim of the research study was to assess students’ views on the viva: its precise advantages
and disadvantages, whether they felt it should be introduced on other courses, and their
suggestions for improvement.

Methods

The two methods of data collection were questionnaire and semi-structured interview. In the
former case two items were added to the standard end of semester course evaluation form for
academic sessions 2010-11 and 2011-12. The first asked students to indicate their level of
agreement, on a Likert scale (from 1 (‘strongly disagree’) to 5 (‘strongly agree’)), with the
statement ‘The feedback viva was helpful’. The second asked them to ‘comment on the
feedback viva as a form of assessment’.
        The anonymised, semi-structured interviews were carried out with eight participants,
all of whom were students who had completed the course, and thus participated in the
feedback viva. The interview was conducted by a research assistant trained in semi-structured
interview techniques. She had detailed knowledge of the aims and content of the course and
of the viva process (including observation of a mock viva), but otherwise no connection to it
in terms of teaching or assessment. Interviews lasted up to 45 minutes, and the questions
asked were guided by the aims of the research; for example:

             •   In comparison to other forms of assessment, what would you say have been its
                 benefits for you as a learner (if any)?

             •   In comparison to other forms of assessment, what would you say are its
                 limitations (if any)?

             •   What (if anything) could be changed to improve the viva?

        (For the full list of questions see Appendix 1.)

The interviews were then transcribed, and an analysis of themes using the ‘general inductive
approach’ (Thomas, 2006) was carried out separately by the research assistant and by the
authors of this article.
        One weakness with the interview method is the representativeness of the interviewees.
Unlike less time-consuming (and less cognitively taxing) methods such as questionnaires, the
semi-structured interview is liable to attract volunteers who either have a strong
commitment to the course (and to their academic performance as a whole) or who have an
axe to grind, causing the data to be somewhat polarised. This effect is, however, mitigated by
the triangulation of results with quantitative and qualitative data gathered from the
questionnaire (Mays and Pope, 2000). Not only did these involve a greater number of
participants, but the user-friendly and ‘captive’ nature of end of semester evaluations (i.e.
relatively short forms handed out and collected in one of the final lectures) means that the
information obtained is more widely representative. That a section of disengaged students is
typically missing from these final lectures is a further source of bias. This effect was diluted
in the case of the 2011-12 cohort by the need for shortened questionnaires (that included only
the questions pertaining to the viva and overall satisfaction rating for the course) to be
distributed in classes at the beginning of the following semester when attendance levels are
typically higher.1
        A further weakness concerns the reliability of the thematic analysis of the interview
transcripts. The presuppositions of the researchers will inevitably bias the interpretation of
data to some degree, but this effect is lessened by performing ‘independent parallel
coding’ (Thomas, 2006, 244), and the requirement for all of the themes reported to be
unanimously agreed upon by the three researchers.

Results

The mean response to the statement ‘The feedback viva was helpful’ was 4.2 (n = 22), and
comments from the 2010-11 and 2011-12 end of semester course evaluation forms (n = 35) have
been combined with the data collected from the semi-structured interviews. The vast majority
of respondents and participants viewed the viva as a valuable method of assessment, one that
should be included on other courses, and one that should be introduced in first year to help
encourage the habit of reading and reflecting on feedback. They did, however, have different
reasons for concluding this and, as will also be discussed below, this general positivity did
not preclude specific criticisms and suggestions for improvement. In what follows the
dominant themes that emerged from content analysis are outlined under the general headings
of ‘advantages’ and ‘improvements’.

Advantages
1. The value of dialogue (over written feedback alone)
The discursive method was seen as advantageous in terms of helping to clarify points that
were, for example, vague or ambiguous in the written feedback. This aspect connects to a
further point concerning the value of dialogue for embedding understanding. Being able to
ask about and discuss feedback that is unclear is of course important, but written feedback
that is clear is also better understood as a result of the assessment method itself. This is partly

1
 This was due to the vivas being postponed because of staff absence. Normally they are carried out
before the end of the teaching period, allowing the opportunity for feedback in the end of semester
course evaluation, but this was not possible for this particular group.

due to the preparation involved, and partly due to the deepening of comprehension that
dialogue facilitates. As one student said:

        I think if you had to talk about every essay you did you’d understand it a lot more … I think it
        embeds it much more in your brain if you have to actually say it out loud. (Student 2)

The opportunity for dialogue also allows students to explain and defend themselves to the
tutors. Occasionally they felt they had been misunderstood, but mostly it was a matter of
acknowledging that they hadn’t expressed themselves as well as they would have liked, and
thus appreciating the chance, as one participant put it, to ‘reaffirm what you were saying or
reword an argument to try and make it more succinct’ (Student 6). The same student goes on
to describe dialogue as a ‘more natural way of expressing your understanding.’
        Also, many valued the opportunity to develop aspects of their essay that had been
restricted by the confines of the word limit.

2. Understanding students as individuals
A second commonly cited reason for students valuing the feedback viva was the
individualised attention it affords. At a practical level there were comments like:

        they have a class of like thirty, forty students in the lecture and it’s hard to correspond with
        each one on a personal level. But then you get the viva and it’s just all about you and your
        essay, and you get that you time, and any questions about the course in general they, they,
        you know, they let me ask (Student 3).

And in another (‘personal’) sense the contact is appreciated because:

        it was also nice that you felt your lecturers actually cared about your improvement ... a sort of
        social and emotional dimension ... it made it feel like somebody’s actually read the essay and
        had a think about it and, you know, that it’s valuable. I think it’s valuable because you get to
        know the lecturers if you have that chance to talk to them (Student 4).

One student, who would typically be proactive about seeking out more feedback, appreciated
how the viva legitimised this enthusiasm:

        [I] suppose it made me reassured that it was okay to do that. Because I always felt that …the
        staff always seem so … busy that you feel a little bit bad disturbing them and asking for more.
        So useful, yeah, just to be reassured that it was okay to do that and so take up somebody’s
        time (Student 8).

3. Outcomes
The transferable value of oral communication in the form of dialogue was highlighted by
several of the interviewees; one pointing out that:

       you need to be able to talk about things when you’ve qualified and you’ve got a job;
       communication by voice is how you do a job, you don’t do a job by writing essays (Student 2).

The most common reference in the questionnaires, however, was to improvements in their
approach to, and results from, subsequent assessments. For example:

       [It] helped to make me think about ways of improving future work.

       It enabled me … to comment on the feedback but also engage with it and consider how this
       may be improved and crossover into other subjects.

       It allowed me to better apply what I’d learned from the feedback to other essays, which
       earned higher grades than the original essay (the subject of the viva).

       [It] made me much more confident about my work.

        As well as getting feedback on my work, I also got advice on learning how to learn.

Improvements
1. Viva length
By far the most common response to the question ‘What (if anything) could be changed to
improve the viva?’ was that it would benefit from being longer. In some cases students said
they would have preferred more time to express themselves fully, one describing how the
feedback they received on the viva performance itself had:

       said [I] actually got stake-holder theory wrong. And in the time itself there wasn’t really the
       time to thrash out the definition of stake-holder theory. ... I felt I’d understood it but hadn’t
       conveyed that understanding in the essay. And it would have taken longer than the ten minute
       allotted slot to really step down and say: look this is how I see it (Student 7).

In other instances it was suggested that the structure attempts to cover too much ground in the
10-15 minutes (i.e. essay feedback, the wider relevance of this feedback, plus skills learnt on
the course), and others simply wanted more of an assessment they found themselves enjoying
and learning from.

2. Preparation

A number of participants raised the issue of whether they had had the opportunity to be
adequately prepared for the viva (and sometimes this was accompanied by a questioning of
whether it should carry a summative grade). This surprised the authors because clear
instructions, both in writing (on the on-line learning environment) and verbally in lectures,
were provided on what they would be expected to discuss, and how it would be graded. It
would appear though that the unusual nature of the assessment means that this kind of
preparation is not always enough, and indeed several comments to the effect that it was a
‘scary’ prospect support this view. In the next section a couple of methods for enhancing
preparation for unfamiliar forms of assessment will be discussed.
        In some cases different and more subtle points about preparation were being made. In
one especially interesting example – one that could be highly instructive for future
approaches to preparation – it was suggested that we:

        have some kind of guiding questions on what to look for in the feedback. Something like, um,
        sort of meta-cognition: look at your own sort of response to the feedback, because sometimes
        you overlook the stuff you don’t like and you just focus on the stuff that you do like and end up
        just talking about that. And you don’t end up treating the problem. So, maybe some kind of
        guidance about how to look at feedback before the viva so that when you actually went into it
        you could have talked about it in more detail (Student 4).

This is not to say that there were no instructions (there were); the point is more about the
sophistication of the instructions. Instead of just encouraging students to read feedback, they
should be challenged to learn how to read feedback (which shares territory with effective
self-reflection); what to look for other than the obvious, and how to deal with their emotional
(e.g. defensive) responses to it.2
        Students who are advanced in terms of self-reflection, and who pay close attention to
the written feedback, voiced a related point of criticism. The viva could, they felt, offer
greater opportunities for deeper critical reflection. One way of addressing this would be
the kind of prior instruction in self-assessment indicated above. This in turn provides an
answer to the additional concern that those who produce high quality essays have ostensibly
less to reflect on with respect to the important question ‘how would you go about improving
it?’

2
 For an interesting discussion of the importance of having generalized discussions about the nature
of assessments with students see Carless (2006).

Implications

In this final section three issues will be discussed that have significant bearing on the wider
application of the feedback viva on other courses and in other institutions. Its adoption at
level 1 (first year); the demands it makes on staff time; and its value in terms of developing
graduate attributes, will be addressed in varying degrees of detail.

Implementation at level 1

As was previously indicated, many participants thought that the feedback viva should be used
on other courses, and it was often felt that it would have considerable benefits if introduced in
the first year of study. The following comment has further implications in terms of solving
the first of the ‘preparation’ problems previously discussed:

       I think having vivas in first year would make it much less scary, you’d get used to them and,
       you’d probably take much more out of them (Student 5).

A compulsory, but not summatively assessed, viva in first year would be one way of
introducing students to a novel assessment form in a way that does not risk compromising
their grade for the course. Another useful method for familiarising students with forms of oral
assessment (presentations, debates, vivas) is the use of videoed examples.

Time constraints

As Carless (2002, 360) makes apparent, from the teaching perspective the most off-putting
feature of this approach to assessments is that it is time-consuming. In our case we spent
fifteen minutes per student (10 minutes for the viva, 5 discussing the grade), such that for an
average cohort of sixty, fifteen extra contact hours are needed from each of two staff members. (Two are
required for moderation purposes.) If, as is desired by the students, we made the vivas longer,
then the problem clearly increases.
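The staffing arithmetic above can be sketched as a small calculation; the function and its name are purely illustrative, restating the figures given in the text (15 minutes per student, two staff present, an average cohort of sixty):

```python
def viva_staff_hours(cohort_size, minutes_per_student=15, staff_per_viva=2):
    """Estimate staff time for a round of feedback vivas.

    minutes_per_student: the 10-minute viva plus 5 minutes discussing the grade.
    staff_per_viva: two academics attend each viva, for moderation purposes.
    Returns (contact_hours, total_staff_hours).
    """
    contact_hours = cohort_size * minutes_per_student / 60
    total_staff_hours = contact_hours * staff_per_viva
    return contact_hours, total_staff_hours

# For the average cohort of sixty cited above:
contact, total = viva_staff_hours(60)
# 15 hours of viva sessions, i.e. 30 staff-hours when both markers attend
```

Lengthening the vivas, as students requested, scales these figures linearly: a 20-minute slot for the same cohort would require 20 hours of sessions.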
        A ‘naïve’ response to this obstacle is to say that if this is as effective a means of
assessment as it appears to be, then it is worth devoting staff time to it, even if this means
that more teaching hours (or more teaching staff) need to be made available. A mitigating
factor is that since many departments rely on relatively cheap part-time (often postgraduate)
tutors, and since these are the people who typically mark the students’ work, they would
play a key role in the vivas. Another economic argument supporting extra resources for this
assessment method is this: if it is introduced in first year and is indeed effective, then this will
reduce the time markers, and effective learning advisors, spend helping and correcting
students’ work in later years.3
        One further economic argument concerns student retention. The allocation of some
discrete and specialised time with academics, focused exclusively on their work, helps the
student feel more valued. This would contribute to strengthening their link with the
institution, making them less likely to drop out. Indeed evidence from student surveys
suggests that it is the individual attention paid by academics that students most
value (Guildhall University, 2008; Alderman and Palfreyman, 2011).
        In the face of structural and financial barriers, however, further measures will no
doubt be needed to offset the human resources issue. One simple solution is to drop (or
shorten) other assessments (typically the exam) in order to reallocate time for the feedback
viva. This remedy is more palatable as there is no suggestion that reflective vivas be used in
all or the majority of courses. For instance, some strategic thinking within an academic unit
could arrange an assessment structure that guarantees all students one feedback viva per year,
minimising interference with assessments in other courses.

Graduate attributes

It was stated at the beginning of the article that the main reason for introducing this method
of assessment was to encourage students to read their feedback, learn from it, and be more
able to apply it to other assignments in other courses. It has, therefore, an immediate
pedagogical benefit, and serves as a tool to assist students’ learning to learn.
        There is, however, a further dimension to the viva experience – one that quickly
became apparent to the authors as they began to conduct these assessments – concerning its
relevance to the development of graduate attributes. A relevant example of a list of graduate
attributes is that of our own institution (University of Glasgow, 2012).4 Of a matrix of ten
main categories, and thirty sub-categories, our research (and our own experiences with this
form of assessment) highlights several that are enhanced by the feedback viva.
3
  We are grateful to Barbara Weightman, the effective learning adviser in the College of Social
Sciences, University of Glasgow, for providing this argument.
4
  Another good example is the University of Sydney: http://www.itl.usyd.edu.au/ graduate attributes/

        One is ‘confidence’; graduates should be able to ‘defend their ideas in dialogue with
peers and challenge disciplinary assumptions’ (ibid). Most dialogue on academic topics will
be beneficial to some degree, but the emphasis here on ‘peers’ perhaps says more about the
difficulty many universities have in committing adequate staff time to this kind of interaction
than it does about their position on the relative merits of different kinds of interaction. For
example, Rob Wass et al (2011) offer evidence that dialogue with lecturers in particular is
‘central to higher-order thinking.’ They explain how,

        improved access to teaching staff allowed [students] to explore their understanding of the
        course material at a much deeper level and some reported that their questions gradually
        became more sophisticated as confidence in their relationship with the lecturer increased.
        (ibid, 324)

A pair of attributes fairly clearly developed by the viva process is ‘reflective learners’,
including being able to ‘use feedback productively to reflect on their work, achievements and
self-identity’ (ibid); and ‘effective communicators’. Of less obvious but perhaps more
significant relevance is the attribute of being ‘resourceful and responsible’, incorporating the
notion of ‘accountability’.5 As one might expect, and as many who took part in the interviews
confirmed, the viva creates a pronounced requirement for students to take responsibility for
their output. It is far harder to hide from one’s motives when explaining oneself to another
person than it is when explaining oneself to oneself. The types of self-protective devices
familiar from our own self-reflection, from informal discussions, and from interviews
conducted in this kind of research – such as leaving an essay to the last minute so that the
rushed nature in which it is executed provides a convenient excuse for its weaknesses, or just
telling yourself that your poor grade was because you ‘didn’t try’6 – are more readily exposed
for what they are under these conditions. As Michael Bonnett says, ‘we reveal ourselves most
fundamentally through those of our actions that directly affect others, and that by choice or
necessity are taken up by them in some way (including ... responses of rejection)’ (2009,
360). In the context of higher education, assignments are a principal way in which the student
‘directly affects’ her teachers, and one formative upshot of a viva is to make the nature and
extent of this impact unusually vivid.

5
   According to the Graduate Attributes booklet, it is important for the Glasgow University graduate to
be able to ‘Manage their personal performance to meet expectations and demonstrate drive,
determination and accountability’.
6
   An attributional bias discussed by one of the interviewees, the broad form of which is familiar from
literature on the self-serving bias (e.g. Taylor and Brown, 1988).

Benjamin Franks and Stuart Hanscomb—Learning Through Reflective Dialogue

        The following comment from one of the interviewees expresses, with some irritation,
the link between seeking and accepting criticism and maturity:

        People who don’t feel comfortable talking to other people about their mistakes … need to
        grow up. Because, if you’re working for some company and you have a report due and you
        turn it in, and your boss wasn’t happy with some of the things he calls you into the office and,
        you know, like: “look, you didn’t do this so well...” and you just look down at the ground and
        you’re like: “oh yeah, I’m sorry...” I mean that doesn’t work in real life and so, so, I think this is
        a reality that … this viva prepares you for. … About reflecting on your own work. Being able to
        see the strengths and the weaknesses and being able to use your strengths to improve on
        your weaknesses (Student 3).

Defensiveness in response to negative feedback will impair learning and development, and
will have an impact on many areas of students’ lives, including the workplace once they
graduate. It is of course very hard to take criticism on the chin, but since it is a foundation of
effective self-reflection it is a vital capacity to develop, and the sooner this process begins the
better. Framed in terms of virtues, being able to respond constructively to criticism is related
to resilience, modesty and courage, and it highlights the extent to which this kind of growth is
enabled by our relationships (including our dialogues) with others.

Conclusion

The feedback viva was considered a valuable method of assessment by the vast majority of
the students surveyed and interviewed. Many enjoyed the experience (even if they didn’t
enjoy the prospect of it), and most of those who found it stressful still regarded it as
something which should be maintained and extended to other courses (especially level 1
courses, where practice would make the level 2 experience easier and more beneficial). Its
main benefits can be summarized as: the value of dialogue for instilling learning; the value of
being forced to reflect deeply on one’s academic strengths and weaknesses; and the
individual attention and closer contact with the academic staff that it facilitates. Many,
however, would prefer it to be longer; improvements can be made to the way we prepare
students, and consideration should be given to how we might further challenge the
minority who are already suitably reflective and habitually engage with the written feedback
they receive. As well as showing strong signs of helping students become better learners, the
viva process would appear to contribute to the development of graduate attributes such
as ‘accountability’. We would add, lastly, that this link raises further questions about the
potential for ‘reflection on the qualities of assignment’ to serve as a tool for student
self-assessment of these wider virtues and attributes.7 This will hopefully constitute the
subject of a later inquiry.

Discourse Volume 11 Number 2 Summer 2012

References

Alderman, G. and Palfreyman, D., ‘The Birth of the Market place in English Higher
        Education’, Perspectives: Policy and Practice in Higher Education, 15 (3), (2011),
        pp. 79-83.
Bonnett, M., ‘Education and Selfhood: A Phenomenological Investigation’, Journal of
        Philosophy of Education, 43 (2009), pp. 357-370.
Brockbank, A. and McGill, I., Facilitating Reflective Learning in Higher Education,
        (Buckingham: Open University Press, 1998).
Carless, D.R., ‘The ‘Mini-Viva’ as a Tool to Enhance Assessment for Learning’, Assessment
        & Evaluation in Higher Education, 27 (4), (2002), pp. 353-363.
Carless, D.R., ‘Differing Perceptions in the Feedback Process’, Studies in Higher Education,
        31 (2), (2006), pp. 219-233.
Gibbs, G. and Simpson, C., ‘Conditions Under Which Assessment Supports Students’
        Learning’, Learning and Teaching in Higher Education, 1, (2004), pp. 3-31.
Gilbert, M., Coalescent Argumentation, (New York & London: Routledge, 1997).

Glover, C. and Brown, E., ‘Written Feedback for Students: Too Much, Too Detailed or Too
        Incomprehensible to be Effective?’, BEE-j, 7, (May 2006), available at
        http://www.bioscience.heacademy.ac.uk/journal/vol7/beej-7-3.pdf.
GuildHE, ‘Individual Attention Brings Student Satisfaction’, (2008), available at
        http://www.guildhe.ac.uk/en/news/index.cfm/nid/C344838A-B4CF-4847-
        ACC59E88FEF2E5D4, (accessed 26th July 2012).
Higgins, R., Hartley, P. and Skelton, A., ‘Getting the Message Across: The Problem of
        Communicating Assessment Feedback’, Teaching in Higher Education, 6 (2), (April
        2001), pp. 269-74.

7
  Recent work touching on this relationship by John Lippitt and Sylvie Magerstädt at the University
of Hertfordshire was presented at the HEA Arts and Humanities Annual Conference in May
2012; and the idea has been addressed explicitly by one of the authors of this piece (Hanscomb) at
the University of Glasgow Annual Learning & Teaching Conference, April 2011 (see
http://www.gla.ac.uk/media/media_190723_en.pdf).


Juwah, C., Macfarlane-Dick, D., Matthew, B., Nicol, D., Ross, D. and Smith, B., Enhancing
        Student Learning Through Effective Formative Feedback, (Higher Education Academy,
        2004), available at
        http://www.heacademy.ac.uk/assets/documents/resources/database/id353_senlef_guide.pdf
        (accessed 4th July 2012).
Larvor, B., ‘Manifesto for Higher Education: Students are Human Beings (Discuss)’,
        Discourse, 6 (1), (2006), pp. 225-236.
Lizzio, A. and Wilson, K., ‘Feedback on Assessment: Students’ Perceptions of Quality and
        Effectiveness’, Assessment & Evaluation in Higher Education, 33 (3), (2008),
        pp. 263-275.
Mays, N. and Pope, C., ‘Assessing Quality in Qualitative Research’, British Medical Journal,
        320 (January 2000), pp. 50-52.
Nicol, D. and Macfarlane-Dick, D., ‘Formative Assessment and Self-regulated Learning: a
        Model and Seven Principles of Good Feedback Practice’, Studies in Higher
        Education, 31 (2), (2006), pp. 199-218.
Palfreyman, D. (ed.), The Oxford Tutorial: ‘Thanks, You Taught Me How to Think’, (Oxford:
        Oxford Centre for Higher Education Policy Studies, 2008).
Price, M., Carroll, J., O’Donovan, B. and Rust, C., ‘If I was Going There I Wouldn’t Start from
        Here: A Critical Commentary on Current Assessment Practice’, Assessment &
        Evaluation in Higher Education, 36 (4), (2011), pp. 479-492.
Prowse, S., Duncan, N., Hughes, J. and Burke, D., ‘“….Do that and I’ll raise your grade”.
        Innovative Module Design and Recursive Feedback’, Teaching in Higher Education,
        12 (4), (August 2007), pp. 437-445.
Taylor, S. and Brown, J., ‘Illusion and Well-Being: A Social Psychological Perspective on
        Mental Health’, Psychological Bulletin, 103, (1988), pp. 193-210.
Thomas, D.R., ‘A General Inductive Approach for Analysing Qualitative Evaluation Data’,
        American Journal of Evaluation, 27 (2) (2006), pp. 237-246.
Tindale, C., Rhetorical Argumentation, (Thousand Oaks, CA.: Sage, 2004).
University of Glasgow, Graduate Attributes (2012) available at:
        http://www.gla.ac.uk/services/careers/attributes/ (accessed 25th July 2012)
University of Sydney, Graduate Attributes (2012) available at:
        http://www.itl.usyd.edu.au/graduateAttributes/policy.htm (accessed 25th July 2012)
Walton, D., Informal Logic, (Cambridge: Cambridge University Press, 1989).


Wass, R., Harland, T. and Mercer, A., ‘Scaffolding Critical Thinking in the Zone of Proximal
       Development’, Higher Education Research and Development, 30 (3), (June 2011), pp.
       317-328.
Williams, J., ‘Providing Feedback on ESL Students' Written Assignments’, The Internet
        TESL Journal, 9 (10), (October 2003), available at
        http://iteslj.org/Techniques/Williams-Feedback.html.

Appendix: Interview questions

           1. Can you describe the procedure and rationale of the feedback viva in your own
              words?

           2. What are your overall feelings about the feedback viva as a form of
              assessment?

              a)       Did you enjoy it? (Why/Why not?)
              b)       Did you find it difficult or easy in comparison to other forms of
                       assessment? (Why?)

           3. In comparison to other forms of assessment, what would you say have been its
              benefits for you as a learner (if any)?

           4. In comparison to other forms of assessment, what would you say are its
              limitations (if any)?

           5. What (if anything) could be changed to improve the viva?

           6. How, if at all, has participating in the viva altered your approach to academic
              work?

           7. What reasons would you have for or against seeing a similar technique used in
              other courses?

           8. Are there any other points you would like to make about the feedback viva?


             9. Are there any questions you would like to ask about this research?
