Through a glass darkly: Assessment of a real client, compulsory clinic in an undergraduate law programme

Cath Sylvester1, Northumbria University, UK

At Northumbria Law School the real client clinic (the Student Law Office) is an integrated capstone experience in the four year Masters in Law course. The programme's integrated approach, with an assessed clinic, was introduced in 1992 and drew on the teaching hospital model in medical education, where no distinction is made between education and training. The programme was designed to meet the requirements of the Quality Assurance Framework for UK undergraduate programmes, the professional body requirements for subject knowledge2 and the procedural and legal skills knowledge required by the vocational Legal Practice Course.3 Students acquired an academic qualification and met the competence standards required for day one of a training contract. At the time it was unique: in 1996 the ACLEC report4 referred to the Northumbria model as "allowing for progressive learning of analytical skills and conceptual understanding of both substantive law and procedure, and the acquisition of basic professional skills and values." However, the academic/vocational divide has persisted, and whilst the model has been replicated it has not proliferated. In the recent LETR review5 it was identified as one of the examples of 'considerable flexibility' in the system of legal education and training.

There are many reasons why law schools may not wish, or be able, to deliver a similar model, and as part of the flexibility agenda no one would want uniformity. However, one of the prevailing misconceptions of the integrated approach is that it is only relevant to those wishing to become lawyers, and therefore, by implication, that the skills required to become a lawyer conflict with or detract from the skills acquired as part of the academic study of law. As Bradney6 succinctly states, "being a lawyer is not the same as studying law and being a lawyer is what only a minority of law students will be". Taking this to its logical conclusion, Van der Vleuten's longitudinal utility model for the assessment of medical training would appear to have limited relevance in the non-vocational law degree, where mastery of the subject is evidenced by traditional undergraduate methods. Nevertheless, few students would consider an undergraduate programme that equipped them with nothing more than core discipline knowledge and the ability to study law to be a useful investment.
In the light of the year on year increase in the number of students studying law as a discipline,7 there would seem to be a perception that the range of intellectual and other skills developed by the study of law is worth having as useful preparation for employment. Whilst the Northumbria degree is designed to meet the existing professional body requirements, its central epistemology is that by embedding propositional knowledge8 in a practice orientated setting, students would develop more sophisticated skills for using their knowledge. Broudy adopted a four stage model of knowledge use: replication, application, interpretation and association. Students using their knowledge in the clinical setting, or in other enquiry based exercises, are required to go beyond application of knowledge and to interpret their knowledge so that it can be applied in different factual settings.9 As Eraut identifies when discussing professional expertise, "The process of using knowledge transforms that knowledge so that it is no longer the same knowledge".10

The QAA subject benchmark for undergraduate law programmes in England and Wales has recently been substantially revised and marks a significant move away from predominantly prescribing discipline knowledge towards a broader, skills-based approach. It states: "We have made considerable changes to the structure of the statement. We have done so to reflect the panel's view that a law graduate is far more than a sum of their knowledge and understanding and is a well skilled graduate with considerable transferable generic and subject-specific knowledge, skills and attributes".11 The benchmark specifies generic skills linked to broader professional expertise, for example "self-management, including the ability to reflect on their own learning, make use of feedback. A willingness to acknowledge and correct errors and an ability to work collaboratively". This approach is mirrored by the growing use of generic graduate attributes in some universities. At Northumbria, such attributes are incorporated into the programme outcomes of all undergraduate programmes offered by the University.12 At the other end of the training process, the SRA has recently revised its competency statement for solicitors13 and has adopted an approach of focussing on "the activities that all solicitors need to be able to do competently, rather than describing the attributes that solicitors require in order to be competent". It sets out four domains of solicitors' competence: ethics, professionalism and judgement; technical legal practice; managing themselves and their own work; and working with other people.
As the language of professional competency and academic programme aims and objectives come closer together, and our module, year and programme outcomes and graduate attributes start to sound very like some of the professional body competencies, it is a good time to review assessment and its place in the law curriculum as a whole, and to consider how we can effectively assess these attributes, align them to the objectives and measure them. Currently the majority of undergraduate law provision has its emphasis on measuring students' ability by subject matter or skills area rather than their reliability as competent practitioners.14 Adopting the language of competency does not, on its own, ensure that programme design and assessment deliver competency. Eraut refers to the assessment of competency as requiring a change in emphasis: instead of making separate judgements about each piece of evidence, judgements of competence have to rest on separate decisions about each element of competence, taking into account all relevant sources of evidence. Thus assessment criteria "belong to the elements of competence not to the pieces of evidence".15 This echoes Van der Vleuten's longitudinal approach to assessment, which should theoretically fit well with the constructively aligned curriculum through which competencies can be tracked at different levels. For example, in year three of the Northumbria programme students' interviewing skills are assessed using a standardised client process, and in the year four clinic interviewing is assessed in a real client setting; however, each of these individual assessments is lost in the overall degree classification, which remains the primary concern for students, employers and universities.

Nevertheless, on a module level, the clinical programme embedded in the curriculum has the potential to assess the development of professional competency and the use of knowledge skills, and to offer an alternative to the measurement approach. By taking assessment seriously in clinic, and being able to articulate and justify our approach and grading process, we achieve a number of very significant benefits. These include providing a measure of competence which informs students of their strengths and weaknesses as they progress through the clinical module. It also provides a more nuanced and authentic reflection of students' achievements for external purposes, as well as building up a level of expertise amongst assessors in the assessment of broad based professional competence rather than the components of competence. The use of a range of more innovative methods of assessment in clinic adds depth to the largely traditional assessment methods elsewhere in the curriculum, and the intense scrutiny of clinical work lends itself very well to repeat sampling, which improves the reliability of clinical marks.
Clinic is a constructivist teaching methodology. It can deliver discipline and procedural legal knowledge, but more often its role is emphasised in terms of teaching legal and intellectual skills and as a method of inculcating professional values and ethics through its traditional involvement in social justice. In the SLO we draw on the transformational qualities of the method and the impact of the real client on student learning. Whilst the knowledge may be delivered in the classroom, the context of clinic is unique: it involves a real client and real emotions, an unknown dynamic with changing and evolving factual perspectives, and an unknown outcome with uncertain content, and it is delivered through a distinctive working relationship with a supervisor. This is a powerful methodology; students will have variable experiences and will construct their knowledge accordingly. Standardising assessment in these circumstances takes it out of the clinical setting. Eraut argues that the combination of using propositional knowledge and process knowledge (by which he means skills such as how to acquire information, and deliberative processes such as planning or problem solving) constitutes professional knowledge: "although knowledge may be included in the curriculum because somebody else has deemed it relevant to professional practice, it does not become part of professional knowledge unless and until it has been used for a professional purpose".16 Van der Vleuten's utility model17 offers reassurance that we can assess what is unique about clinic without disassociating the assessment from the clinic or limiting assessment to specific tasks within clinic. In addition, by assessing the real clinical process we require students to focus on developing these complex competencies. As Biggs and Tang state, "Assessment is the senior partner in learning and teaching. Get it wrong and the rest collapses."18

One of the most complex tasks for clinical providers is deciding which learning outcomes will be assessed. This process has often been influenced by a desire to assess only outcomes that can be standardised. Van der Vleuten warns against the risk of atomisation of competencies, which has the capacity to "trivialise content and threaten validity".19 With multiple sampling opportunities the constraints of standardisation are reduced. Nevertheless, the first step of the assessment design process in clinic is to ensure that the outcomes/competencies to be assessed are expressed in such a way as to embrace the range of experiences and to fit the type of clinical programme on offer. Clinical programmes vary in length and content; students in an advice-only, short optional clinic may experience only one client, so the concept of sampling across a range of client contact experiences is not realistic. A recurring and legitimate question from students in the live client clinic is how they can be assessed fairly when every student in clinic has a different experience. Can we be sure that the student who has a difficult, demanding and disorganised client is assessed on interviewing skills in the same way as the student who has an organised, articulate and accepting client? To some extent these issues can be addressed by carefully worded outcomes. There is a need to share and develop the language of competencies and outcomes in the clinical setting.
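It may help to make the shape of the utility model explicit. Van der Vleuten summarises utility as a multiplicative function of the qualities of an assessment; the notation below is an illustrative sketch of that formulation rather than his own symbols:

\[
U \;=\; R \times V \times E \times A \times C
\]

where R is reliability, V validity, E educational impact, A acceptability and C cost-efficiency, each carrying a weight that varies with the context of the assessment. Because the relationship is multiplicative, an assessment that scores near zero on any one element has little overall utility, however strong the others; conversely, the weaker standardisation of clinic can legitimately be traded against gains in validity and educational impact.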
In the UK the time is ripe for this, with the SRA's recent statement of solicitor competency and the QAA guidance on levels providing a frame for this discussion.

At Northumbria the clinical module is the largest credit bearing module in year four. Seventy per cent of the clinic mark is attributed to the practical work in clinic and the remaining thirty per cent to two pieces of reflective writing. The practical work is assessed with reference to a set of criteria, each one being described at a range of levels which equate with degree classifications. The criteria are evidenced by the collection of the student's clinical work in a portfolio, which is marked by the supervisor and moderated by other members of the team. The criteria for the practical work are not treated as distinct components of the assessment and include professional attributes and intellectual qualities as well as the more predictable tasks associated with work in the clinic, such as client interviewing and advising. The student's portfolio submission is not structured by criteria or competencies and its content is not prescribed. Supervisors will have given feedback on students' work through the year and draw on the portfolio to remind themselves of the entirety of the student's work; they are asked to indicate broad grade bandings for each of the criteria, both by way of explaining their grade and to focus their minds on the specified elements that make up the assessment for the practical work. This is not a mathematical formula, and by necessity expert judgement is called for.

Applying the validity element of the utility index to this approach, concerns may arise over the way the assessment criteria are broken down and then reconstituted into a single mark for 'practical work' by the supervisor at the end of the module. In some ways this is a longitudinal approach, drawing on the full range of the student's clinical experience. However, there is no formal process of measuring the various outcomes during the programme. There is a risk that the balancing act carried out by the supervisor is not transparent and, when applied to broadly worded assessment criteria, lends itself to a middle ground approach. The risk is that students will interpret this for themselves and do only what is needed to achieve what they require. A non-aligned assessment regime has the capacity to undermine the effectiveness of the method. Driessen and Van der Vleuten described this tussle effectively when discussing the use of examinations in a problem based learning law programme: "As usual the assessment programme gained the upper hand and slowly but progressively undermined the problem based learning approach".20

Viewed through the lens of Van der Vleuten's utility index there may also be an issue with reliability. The students are learning by doing and, as a consequence, their learning will be in response to what they are doing and will vary both in the nature of the task and in its complexity. In addition, their work is supervised by a single clinical supervisor. Van der Vleuten's evidence that reliability is predominantly a consequence of adequate sampling is of great significance in the clinical setting. It is inevitable that real casework will require every aspect of practice in clinic to be supervised by a qualified practitioner. Whilst these supervision processes may not take the form of a summative element of assessment, students receive extensive feedback on their efforts.
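The force of the sampling point can be made concrete with the Spearman-Brown prophecy formula, the standard psychometric expression of how reliability grows as independent observations are added (the formula and the figures below are offered as an illustration; they are not drawn from Van der Vleuten's data):

\[
R_n \;=\; \frac{n\,R_1}{1 + (n-1)\,R_1}
\]

where R_1 is the reliability of a judgement based on a single observed performance and n is the number of performances sampled. If a single assessed client interview had a notional reliability of 0.4, eight independently sampled performances across the year would give 8 × 0.4 / (1 + 7 × 0.4) ≈ 0.84. This is broadly why the routine, repeated scrutiny of casework in clinic can support judgements that no single assessed task could.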
In many settings the prospect of multiple sampling is a stumbling block on the cost-effectiveness element of the index. In Northumbria's year long in house clinic this level of scrutiny is already in place and, with some careful consideration, can easily be adapted to provide multiple points of sampling without turning every task into an assessment point. At Northumbria students receive a mid-year appraisal and are assessed on certain discrete skills for LPC21 purposes. In addition, feedback rubrics and guidance may be developed which tie into discrete SLO outcomes. What may be lacking in terms of sampling practice is a range of different types of assessment and of different assessors. Multiple small conversations take place between supervisor and clinic students on a daily basis about strategies on cases and how to respond to developments; it is a short step to use these in a more strategic way. Whilst oral assessment and presentations are used in the law curriculum in a variety of formats, clinic provides a wealth of opportunity for developing more practice orientated versions, informed by the experience of other work based assessments. By developing a range of assessments and a community of experienced assessors, clinic has the potential to offer new insights into assessment methodology in the wider law curriculum.

In one significant respect the sampling evidence relating to the reliability of a discrete SLO module may require significant change in assessment. Amsterdam22 argues that the relationship between student and supervisor is a key requirement of the clinical method. Typically the supervisor takes primary responsibility for the assessment of their supervisees. Whilst the normal checks for consistency are in place, through moderation and the external examiner's review of marks, these are hard to achieve effectively on the review of the portfolio alone. One of the ways repeat sampling improves reliability is through the involvement of multiple assessors. The in house clinic is not the same as a teaching hospital, where students will learn from many different experts as they rotate through different specialisms. Typically the SLO supervisor works on a mainly one to one basis with a small group of students throughout the entire clinical programme. This is to facilitate learning, particularly through the process of reflection and feedback, but also as a practical measure to enable supervisors to easily monitor cases within their specialism. However, there are benefits in involving other supervisors, both for students and for clinic. The clinical methodology should be the constant here, not the practice of the supervisor. Facilitating other supervisor involvement may result in students benefitting from a range of practice, as well as further developing core principles of approach in both clinical method and assessment.

To some extent the expert judgement approach to assessment of the practical work at Northumbria is counter-balanced by the assessment of the two reflective reports submitted at the end of the module.
The compulsory report is on skills in practice and the other can be selected from a range of optional subject areas, including clinic and my career, clinic and legal education, justice and ethics, clinic and public discourse, and law in action. Within these broad areas students can select any subject matter for discussion, although there is an expectation that it relates to some experience they have had through clinic.

Reflection is an integral part of clinic. Eraut includes it in his definition of experiential learning: "experience is initially apprehended at the level of impression, thus requiring a further period of reflective thinking before it is either assimilated into existing schemes of experience or induces those schemes to change".23 Students are provided with reading lists and lectures on the theory of reflection during the course of the module; they undertake preparatory exercises in firm meetings, and the content of the firm meeting itself will frequently focus on reflection, although not necessarily categorised as such. A practice reflective piece is submitted as part of the mid-year appraisal process, and students are encouraged to keep short reflective records on all they do in the SLO and are provided with a journal for this purpose (this is not part of the assessment). Nevertheless, students are resistant to the assessment of reflection. As one of our students reflected, "Reflective practice is and should be personal; what is valuable reflection will be different for each individual. As such it is difficult to understand how a mark can have any significant meaning and how marking reflection can aid the learning process."

Ledvinka states that the purpose of assessing reflection is to 'assess the learning journey'.24 Moon refers to reflective practice as a form of 'mental processing',25 or, as Race puts it, a way of making "sense of what we've learned" and to "link one increment of learning to the wider perspective of learning - heading towards seeing the bigger picture".26 It is also a process for learning which is central to continuing professional development. Whilst the student above cannot see beyond the content of reflective reports being right or wrong, the purpose of assessing reflection is to communicate the value of the ongoing process of assimilating new learning and to instil it as a lifelong approach to learning.

The 'one off, end of year' nature of the reflective report would appear to conflict with the utility approach, primarily in terms of reliability, which is increased with the additional number of samples, but also on the grounds of validity: the current assessment is more likely to assess a snapshot of reflection than evidence of a reflective practice. Whilst we might be able to assess the degree to which the student sees the links to the bigger picture, it is considerably harder to draw from these isolated examples of reflection an approach to mental processing in line with the learning cycle.27 The process of reflection does not always occur through a written process; a more authentic place for reflection might be as part of an assessed interview or presentation around a case. Within clinic we can introduce reflection as a routine part of the clinical process, a sort of think aloud commentary on the dilemmas faced when encountering day to day SLO work.
We may also consider assessing reflective work at other points in the curriculum. At Northumbria we have a number of modules delivered in a problem based learning format which use reflection, but only one of which currently assesses it, on a pass/fail basis.

The problems surrounding the assessment of clinical work have to some extent been aggravated by the difference in approaches between the assessment of academic work (essays, coursework and dissertations meeting grade descriptors) and the assessment of skills (portfolios and competencies). It is not surprising that clinical modules delivered within an undergraduate programme have struggled to find appropriate assessment methodologies. In many cases clinic has remained outside the curriculum entirely, open to self-selecting students, as a methodology that generally engages students without the need for the motivating factor of an assessment process; some argue that this is where clinic should remain. However, for the reasons explained above, clinic has a lot to contribute to the changing regime of legal and professional undergraduate education. Van der Vleuten urges us to look at the value of assessment methods outside of traditional academic assessment boundaries and to focus on their reliability, validity and educational impact. In one significant respect clinic lends itself to a range and number of assessment methods, in that the level of scrutiny and feedback on the students' clinical work is so extensive that formative assessment is taking place on a task by task basis. With some consideration and imagination, assessment points can be incorporated into the year to address the full range of criteria and to reinforce the learning delivered as part of the casework. In addition, processes can be designed to ensure consistency when marking portfolios.28 It is not a major departure from the normal day to day work of the clinic to utilise oral presentations, or feedback on letters and research reports, in a way that feeds into the students' grades in a more transparent way. We have only just started to explore the assessment toolbox, and each clinical programme will have its own aims and limitations, but we can start to draw on this widening pool of experience. Whilst the utility index does not introduce us to new concepts, it might give us confidence to use a range of assessment activities in a combination which is designed to support learning as well as to measure it.

Footnotes

1 Cath Sylvester is Principal Lecturer in Law at Northumbria University and leads on programme design.
2 The requirements for the Qualifying Law Degree were set out in the Joint Statement on the Academic Stage of Training, 2002.
3 The Legal Practice Course is the vocational course required by the Solicitors Regulation Authority for those wishing to qualify as a solicitor in England and Wales.
4 ACLEC, First Report on Legal Education and Training (1996), para 2.2. Lord Chancellor's Advisory Committee on Legal Education and Conduct.
5 J. Webb, J. Ching, P. Maharg and A. Sherr, Setting Standards: The Future of Legal Services Education and Training Regulation in England and Wales (London, Legal Education and Training Review, 2013) (LETR Report). Available at: http://letr.org.uk/the-report/index.html.
6 Anthony Bradney, SPTL (Society of Public Teachers of Law) Reporter 21, Winter 2000.
7 The Law Society entry trend records show that in 2012, 32,345 students applied to study law at university in the UK; of these, 20,070 accepted places.
8 M. Eraut, Developing Professional Knowledge and Competence (London, Falmer Press, 1994), p. 103. Eraut uses the term propositional knowledge to describe discipline based theories and concepts and practical principles in the applied field.
9 H.S. Broudy, personal communication (1980), as referred to by Eraut (supra n. 8), p. 26.
10 Eraut, supra n. 8, p. 25.
11 The Quality Assurance Agency for Higher Education, Subject Benchmark Statement: Law (July 2015), section 2. Available from www.qaa.ac.uk.
12 Northumbria University Graduate Attributes, 2015.
13 Solicitors Regulation Authority, Training for Tomorrow: A Competence Statement for Solicitors, 20.10.14.
14 D. Newble, B. Jolly and R.E. Wakeford, The Certification and Recertification of Doctors: Issues in the Assessment of Clinical Competence (Cambridge, Cambridge University Press, 1994).
15 Eraut, supra n. 8, p. 207.
16 Eraut, supra n. 8, p. 119.
17 C. Van der Vleuten and L.W.T. Schuwirth, 'Assessing professional competence: from methods to programmes', Medical Education 39 (2005).
18 John Biggs and Catherine Tang, Teaching for Quality Learning at University, 4th edition (Society for Research into Higher Education and Open University Press, 2011).
19 Van der Vleuten, supra n. 17.
20 E. Driessen and C. Van der Vleuten, 'Matching Student Assessment to Problem-based Learning: Lessons from experience in a law faculty', Studies in Continuing Education 22:2, 235-248.
21 The Legal Practice Course currently requires students to pass assessments on specific legal skills, including client interviewing and legal writing. These are assessed on a competent/not competent basis.
22 A. Amsterdam, 'Clinical Legal Education - A 21st Century Perspective', 34 J. Legal Educ. 612 (1984).
23 Eraut, supra n. 8, p. 107.
24 G. Ledvinka, 'Reflection and assessment in clinical legal education: Do you see what I see?', 9 Int'l J. Clinical Legal Educ. 29 (2006).
25 J. Moon, Reflection in Higher Education Learning, PDP Working Paper 4, LTSN (2001).
26 P. Race, Evidencing Reflection: Putting the "w" into reflection, ESCALATE Learning Exchange (2002).
27 D. Kolb, Experiential Learning: Experience as the Source of Learning and Development (Englewood Cliffs, NJ, Prentice Hall, 1984).
28 E. Driessen, C. Van der Vleuten, L. Schuwirth, J. Van Tartwijk and J. Vermunt, 'The Use of Qualitative Research Criteria for Portfolio Assessment as an Alternative to Reliability Evaluation: a Case Study', Medical Education 39(2) (2005), 214-220.