Staff and student perceptions of computer-assisted assessment for physiology practical classes
     Faculty of Life Sciences, The University of Manchester, Manchester, United Kingdom

    Address for reprint requests and other correspondence: R. Grady, Faculty of Life Sciences, The Univ. of Manchester, 1.800 Stopford Bldg., Oxford Rd., Manchester M13 9PT, UK (e-mail: ruth.grady@manchester.ac.uk)

    Abstract

    Effective assessment of laboratory practicals is a challenge for large classes. To reduce the administrative burden on staff members without compromising the student learning experience, we utilized dedicated computer software for short-answer question assessment for nearly 300 students and compared it with the more traditional, paper-based method of assessment of the same student cohort. Students were generally favorably disposed toward computer-assisted assessment (CAA): 75% of the students responded that, for future assignments, they either had no preference for the method of assessment or would prefer CAA. The perceived advantages were remote access to the questions and ease of submission. The most common disadvantage cited was lack of internet access. Staff members mentioned various advantages of CAA: notably, the reduction in marking time and paperwork as well as the potential for the software to detect plagiarism and to administer anonymous marking. The disadvantages of CAA were the need to tailor questions to the technology, having to adapt to reading answers and marking onscreen, and the quality of feedback to students. All of these disadvantages could be overcome by training and by improved versions of the CAA software, currently under development. The use of CAA has proved to be a welcome addition to the tools available to staff members for the assessment of practical classes, and future improved versions of the software will increase the utility of this assessment method.

    Key words: computer-based assessment; e-assessment; on-line assessment; laboratory practical class assessment; short-answer questions

    Introduction

    IN the Faculty of Life Sciences at The University of Manchester (Manchester, UK), physiology practical coursework has traditionally been assessed by data-handling exercises and short-answer questions (SAQs) based on principles encountered in the laboratory. The assessments are paper based and marked by a pool of academic staff members and postgraduate student demonstrators, a process similar to that employed at other academic institutions. However, an increase in student numbers (500 students expected in future first-year intakes) and the modularization of the curriculum have led to the demand for new assessment methods that reduce the administrative and logistical burden on staff members. Such methods need to fulfill the rigorous criteria required of assessment procedures (7, 8) but also have to be viable and usable for staff members and students alike.

    Computer-assisted assessment (CAA) engines have been employed in different educational settings (1, 12, 13). Commercial software packages such as Questionmark's Perception, EQL's I-Assess, and WebCT have been reviewed and found to have potential (6, 10), although their use has been limited to the "multiple-choice question" (MCQ) format. Although MCQs are easy to mark, it is not always appropriate or desirable to offer assessment in this form. SAQs give students the opportunity to structure a coherent answer in their own words or to solve a mathematical problem without any help from multiple-choice responses. The University of Manchester School of Computer Science has developed an assessment system, the Assess by Computer (ABC) software (9, 11), that allows students to input free-text answers, that is marked manually by staff members (so students are not penalized for typing errors), and that avoids the avalanche of paper normally produced by large classes.

    Before introducing CAA wholesale in the Faculty of Life Sciences, we aimed to compare CAA and paper-based assessment (PBA) to see whether CAA had any advantages over the more traditional PBA. ABC software was therefore used for the assessment of part of the practical component of a first-year Cardiovascular Physiology Unit (Body Systems) taken by nearly 300 undergraduate students. Student and staff member perceptions of the assessment methods were sought by questionnaire and face-to-face interviews and are presented in this study.

    METHODS

    Study design.

    In October 2004, 288 students registered for Body Systems Unit BL1811 in the Faculty of Life Sciences, University of Manchester. This unit had four associated practicals, which in previous years had been assessed by SAQs based on work covered in laboratory classes. In this trial, for two of the practicals, questions were posed, answers were submitted, and marks were awarded using ABC software (11); the remaining two practicals were assessed using the traditional paper-based format. All students attempted the SAQs for all four practicals.

    Students were surveyed via an anonymous questionnaire (see the Appendix) to establish their opinions regarding both assessment methods and to determine whether they had a preference for either method. The questionnaire was also designed to determine the students' base level of computer literacy before they completed the CAA. The questions were all of the "closed format" type, directing the responses of the students. However, for each question, there was also the option of adding additional comments if desired. Questionnaires were distributed to 100 students after they had received their mark for this unit; 83 were returned completed.

    Staff opinion regarding CAA was acquired via face-to-face interviews and informal e-mail feedback.

    Training for the assessment.

    Students were given a brief training session on how to use the ABC software. This took the form of a 20-min talk showing the students how to access the questions and how to enter, edit, save, and submit answers. Students were also given a printed set of instructions for reference, and the ABC software had a "Help" function integral to the package.

    Staff and demonstrator training was performed on an ad hoc basis; a half-hour tutorial on using the "Question Setting" and "Marking Tool" features was all that was required to master the ABC system. No training was deemed necessary for marking the PBA because all staff members and demonstrators had experience of marking PBA from previous years.

    Completion of assessment questions.

    The ABC software allows inputted questions to be accessed via an external website, with a separate URL for each practical. All students were given a unique identifier (student registration number and university username) and a password so they could access the URLs from any internet-linked computer. Students could enter the websites and attempt the questions in their own time by entering text-based answers (Fig. 1). Answers could be saved and edited before submission, with a submission deadline 8 wk after the website went live. Only one submission was allowed, and submissions were not possible after the deadline had lapsed. Submitted answers were saved on file and could be accessed by the administrator as required.
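
    To make the workflow concrete, the sketch below models the submission rules just described (answers can be saved and edited freely, only one submission is accepted, and nothing can be submitted after the deadline). It is a minimal, hypothetical illustration written for this article, not the actual ABC implementation; the class and field names are invented, and the deadline date is arbitrary.

# Hypothetical sketch of the submission rules described above; not the ABC code.
from dataclasses import dataclass, field
from datetime import datetime

DEADLINE = datetime(2004, 12, 17)   # illustrative date: 8 wk after the site went live

@dataclass
class AnswerFile:
    student_id: str                                # registration number / university username
    answers: dict = field(default_factory=dict)    # question number -> free-text answer
    submitted: bool = False

    def save(self, question: int, text: str, now: datetime) -> None:
        # Answers may be saved and edited freely before submission.
        if self.submitted:
            raise RuntimeError("Answers already submitted; no further edits allowed.")
        if now > DEADLINE:
            raise RuntimeError("The deadline has lapsed.")
        self.answers[question] = text

    def submit(self, now: datetime) -> str:
        # Only one submission is allowed, and none after the deadline.
        if self.submitted:
            raise RuntimeError("Only one submission is allowed.")
        if now > DEADLINE:
            raise RuntimeError("Submissions are not possible after the deadline.")
        self.submitted = True
        # A confirmatory receipt reference of this kind is a feature planned for future versions.
        return f"receipt-{self.student_id}"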

    PBA practical questions were distributed at the beginning of the semester. Hand-written or word-processed answers were submitted at the end of the semester to be marked.

    Marking of assessment questions.

    For CAA, answers saved on file could be passed to tutors for marking, with scripts anonymized automatically. To mark the assessment questions via computer, tutors could access the student answers onscreen and, depending on the length of the answer, could view several scripts at a time. Student responses could be compared with a preinputted "model answer" (Fig. 2), and marks were allocated manually as appropriate. Final marks were totaled automatically and presented in a spreadsheet for analysis.
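
    The fragment below is a rough, hypothetical illustration of this marking pipeline: scripts are anonymized before being passed to tutors, marks entered against each question are totaled automatically, and the totals are written out as a spreadsheet. None of the function names correspond to the real ABC code, and the use of a hash to anonymize scripts is an assumption made purely for the example.

# Hypothetical sketch of the marking workflow described above; not the ABC code.
import csv
import hashlib

def anonymize(student_id: str) -> str:
    # Replace the registration number with an opaque script code before marking.
    return hashlib.sha1(student_id.encode()).hexdigest()[:8]

def total_marks(marks_by_question: dict) -> int:
    # Final marks are totaled automatically, avoiding manual addition errors.
    return sum(marks_by_question.values())

def export_spreadsheet(results: dict, path: str) -> None:
    # results maps anonymized script code -> {question number: mark awarded}
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["script", "total"])
        for script, marks in results.items():
            writer.writerow([script, total_marks(marks)])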

    For CAA practicals, questions were marked by two academic staff members and one postgraduate student demonstrator, such that one person could mark all student answers for one question. Questions were assigned different marks ranging from 1 (for questions requiring one-word answers) to 10 (for questions requiring longer, more-detailed responses). Marks were allocated manually depending on the quality of the response as in traditional PBA.

    For PBA, one member of the academic staff (who also marked the CAA) and four postgraduate student demonstrators were employed.

    Hereafter, the term "staff" includes both "academic staff members" and "postgraduate student demonstrators."

    Administration of CAA.

    Questions were set by two staff members, although only one member of the staff was required to enter the questions into the ABC system. One computer technician was then required to oversee the process. This involved registering the students’ usernames and registration numbers to use the system and monitoring any problems the students had in accessing the websites.

    RESULTS

    Student questionnaire data.

    Most students rated themselves as at least "competent" in using e-mail, the internet, and word processing, with the majority (96%) rating their overall computer literacy as "adequate" or "more than adequate" before university. Despite this, 90% had no experience in using CAA software before taking this practical unit. Training in using the software was deemed to be "adequate" or "more than adequate" by the majority (99%); consequently, no respondents declared themselves "not confident" in being able to use CAA.

    More students were "confident" about paper-based work as a method of assessment (91%); only 50% made the same claim for CAA. Ten percent of the respondents declared themselves "not confident" in CAA as a method of assessment, whereas no one held this opinion about PBA.

    Anticipated and actual problems encountered by students using CAA are summarized in Fig. 3. The most common anticipated problems were logging on (45% of respondents) and saving/editing the answers (45%), followed by submitting the answers (29%) and accessing the software (25%); some students (22%) did not anticipate any problems at all. After the assessment was completed, the most common problem actually encountered by students was logging on (32%), with most (63%) reporting no problems.

    Several common advantages and disadvantages of CAA were perceived by the students and are summarized in Table 1. The most common advantages were remote access to the questions (26% of respondents) and ease of submission (24%). The most common disadvantage cited was lack of internet access (21%).

    More students (48%) believed that PBA took longer to complete than CAA, whereas 15% thought the opposite; 37% did not find a difference in the completion time (Fig. 4). Over half of the students (52%) rated CAA as "better" than PBA (Fig. 5), whereas 21% believed the opposite; 27% of the students had no preference. When asked which method of assessment they would prefer to use in future, 53% responded with CAA, 25% preferred PBA, and 22% displayed no preference (Fig. 6). There were no differences in responses between men and women (data not shown).

    Staff perceptions of CAA.

    Staff responses to CAA were elicited via e-mail and face-to-face interviews after the assessment procedure was completed. There were no discernible differences between the comments of academic staff members and those of postgraduate demonstrators. The topics that elicited the most comments were question setting, administration of the system, awarding marks, feedback to students, and confidence in CAA as a method of assessment. Advantages and disadvantages as perceived by staff members are summarized in Table 2.

    Question setting using CAA.

    SAQs have been used for several years in the Faculty of Life Sciences for the assessment of practical classes. The ease with which these questions could be transferred to a CAA platform was a concern for some staff members. Some questions were easily transferable, whereas others (such as mathematical calculations requiring scientific notation to be entered by the students and questions involving graph drawing and labeling) had to be amended. The training needed to enter questions into the ABC system was minimal, and, although some staff members felt that the system was not entirely intuitive, it was relatively easy to organize the questions and enter the model answers.

    Administration of the system.

    A computer technician was required to administer CAA. This involved registering the students’ usernames and chasing up students’ reported problems with logging on. In most instances, these were due to operator error and were easily overcome (the entry of the username was case sensitive, for example). One perceived benefit of CAA was the ease by which anonymous marking could be organized and administered, unlike with PBA.

    Awarding marks.

    The "Marking Tool" of the ABS system was easily mastered, with no staff members reporting problems in its use. The software is set up so that staff members can sort or cluster student answers in different ways: by the length of the answer, by similarity to the model answer, by unmarked status, and by keyword. The staff found marking easy and convenient to complete for the shorter questions (worth 5 marks or less) in CAA. The ABC system automatically calculates final marks for the students, removing the human errors associated with the checking of addition, rechecking, and inputting of marks to a final spreadsheet that accompanies PBA. However, for questions with several marks to allocate (6 or more), the ABC system was considered to be less user friendly than PBA. Staff members commented that this was due to there being no way of annotating the students’ answers in situ, to help determine the final mark allocated for an answer, and that it was harder to read a long answer onscreen than on paper.

    The CAA system allowed for the detection of plagiarism by clustering similar answers together, something not easily achieved with PBA.
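
    As a hypothetical illustration of this kind of plagiarism screening, the sketch below flags pairs of student answers that are nearly identical using a simple pairwise similarity threshold; ABC's actual clustering method is not described here, so this is only an indicative approach with invented names.

# Hypothetical sketch: flag suspiciously similar answer pairs; not the ABC method.
from difflib import SequenceMatcher
from itertools import combinations

def flag_similar_pairs(answers: dict, threshold: float = 0.9) -> list:
    # answers maps script id -> free-text answer; returns (id, id, similarity) tuples.
    flagged = []
    for (id_a, text_a), (id_b, text_b) in combinations(answers.items(), 2):
        ratio = SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()
        if ratio >= threshold:
            flagged.append((id_a, id_b, round(ratio, 2)))
    return flagged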

    Fewer staff members were needed to mark large numbers of students’ scripts in CAA than in PBA. The software was set up to enable a single member of the staff to mark all answers to a particular question rather than marking all questions in a limited number of scripts. This removed the problem of inconsistency between different markers and was seen as a huge advantage of CAA.

    Staff members found CAA faster to mark than PBA: three staff members took 12 h in total to mark all of the CAA scripts, whereas five staff members took 40 h in total for the PBA scripts.

    Feedback to students.

    Opinion was divided as to the usefulness of CAA in providing feedback to the students. One member of the staff felt that PBA allowed better feedback, because students could gain full access to their marked scripts and read comments added by the staff members. The CAA system does allow comments to be added to a script, but typing comments into a textbox would have severely slowed the marking process and was not as flexible as the markers required, so this feature was not used. However, one staff member felt that the faster marking enabled marks to be published earlier and that global feedback comments could be given to students to compensate for the lack of individual comments. The ABC system does allow students to access their marked scripts, but this function was not utilized in this study.

    Confidence in CAA as a method of assessment.

    In general, staff members were positive about using the ABC system as a method of assessment, although some concerns were raised. Staff members felt that they would have no counter for students who insisted they had submitted the work despite evidence to the contrary. The CAA system trialed here did not provide a receipt reference number on submission of answers, leaving open the possibility that students could forget to submit their work.

    DISCUSSION

    An increase in student numbers, ever-escalating work commitments for academic staff, and the advancement of internet technology have made the use of CAA an attractive proposition for many higher education institutions. However, it is important that the instrument of assessment does not place an additional burden on staff members or devalue the assessment process overall.

    The students in this trial were generally competent with computers, but most had not used them for inputting assessment answers before. Some students were concerned that they were "technophobic" and had poor typing and computer skills; however, for science students, the opportunity to master such skills is an undoubted asset. The CAA here was not completed under exam conditions, and so students with limited experience of computers should not have been unduly penalized.

    More students anticipated problems with the CAA than actually encountered them. The main problem was logging on; it was easily remedied and could be avoided in the future by focusing on this aspect in the training session given to the students, although, from a staff point of view, the initial panicky e-mail and/or phone call from a student could easily become tiresome.

    The training needed to use CAA was minimal, and students generally found it adequate (despite the logging-on problems mentioned above); however, it has been reported that even skilled computer users can improve test scores with additional training in using the technology (6). In future assessments, the training given to students may have to increase if questions become more sophisticated, for example, requiring graphical answers or diagrams to be input. If this is the case, there is a concern that mastery of the CAA technology becomes an end in itself, which was not the original intention. However, it is not unacceptable for competence in using such technology to become a requirement of practical science courses (2), and this competency could be added to the intended learning outcomes for the course unit. The use of CAA could then be exploited to its full potential, rather than using the system for text input only.

    Despite fewer students being confident about CAA before completing the assessment, more students stated a preference for CAA afterward, with no significant differences between the responses of men and women. A few students perceived that CAA could be completed in their own time and cited this as an advantage, whereas PBA was seen as more time constrained, even though this was not the case here. Despite these responses, more students believed that PBA took longer to complete than CAA, even though neither was time limited.

    Seventy-five percent of the students either had no preference or would choose CAA in future assessments, showing a willingness among students to move to new technology. Ogilvie et al. (5) also found that medical students held positive attitudes toward using computers for assessment and that this type of learning environment actually accentuated their learning experience.

    Negative attitudes toward CAA were generally confined to specific individual concerns, mainly relating to logging on or to a lack of home internet access for submitting answers. Students without internet access at home could, however, draft the CAA answers on paper and submit them using university computers.

    In future versions, the system will provide a confirmatory reference number to the student after the submission of answers to alleviate the worry of answers being "lost" by the system.

    From the staff perspective, using the "Question Setting Tool" was not difficult once staff members were trained in how to enter the questions efficiently. The fact that some questions posed via PBA could not be transferred directly to CAA was a minor issue for some staff members. This was an unforeseen problem that was readily surmounted, although a few staff members felt that it inevitably led to compromise and to the "dumbing down" of the assessment to fit the technology. However, once the worth of the technology is proved, it is possible that staff members will devote energy to generating more creative questions that assess the same learning outcomes rather than relying on questions that have "always been used in the past." In the future, improvements to the technology and a greater emphasis on student training will allow more sophisticated questions to be posed, which should alleviate some of the staff concerns regarding standards.

    Staff members did appreciate that the ABC software could be utilized in different ways. The software is Java based and uses XML data storage. This means it is highly portable, does not rely on external database systems, and can support large numbers of students simultaneously. Access to the ABC website can be controlled as the setter requires, allowing single, time-limited access (exam conditions) or multiple logons over a longer period of time as appropriate.
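
    To illustrate, purely hypothetically (the ABC file formats are not published here), how such access policies might be represented in portable XML, the sketch below builds a small record distinguishing single, time-limited access for exam conditions from multiple logons over a longer period. The element and attribute names are invented for this example.

# Purely illustrative sketch (not the ABC schema) of an XML access-policy record.
import xml.etree.ElementTree as ET

def access_policy(mode: str) -> ET.Element:
    record = ET.Element("assessment", attrib={"mode": mode})
    if mode == "exam":
        # Single, time-limited logon for exam conditions.
        ET.SubElement(record, "access", attrib={"logons": "1", "duration_min": "90"})
    else:
        # Coursework: multiple logons while the assessment is open (here, 8 wk).
        ET.SubElement(record, "access", attrib={"logons": "unlimited", "open_weeks": "8"})
    return record

print(ET.tostring(access_policy("exam"), encoding="unicode"))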

    The issue of reliably marking longer answers and assigning marks onscreen needs to be addressed. Not all staff members reported this as a problem, suggesting that it may be only a question of adjusting to a new mode of working rather than an intransigent barrier. In some higher education institutions, it is becoming the norm not to annotate students’ scripts at all during the marking process, a situation analogous to the CAA here. In response to staff requests, future versions of the ABC software will include a scratchpad for adding marks and comments in situ.

    The quality and quantity of feedback to students is a perennial issue (3, 4). The CAA system trialed here does allow specific comments to be entered by the marker for each student, although this was not exploited; again, this is analogous to the situation described above, where examination scripts were not annotated. As an alternative, global feedback was given to the students via a document posted on the student intranet that outlined common problems and misunderstandings as well as the individual marks obtained for each question. Adequate and appropriate feedback to students regarding CAA is currently being developed further in the Faculty of Life Sciences in consultation with academic staff members and students.

    In conclusion, both staff members and students responded positively, if cautiously, to the introduction of CAA for practical class assessment. The many benefits for both sets of users were seen to outweigh the disadvantages. CAA can detect plagiarism, allows anonymous marking with little extra effort, and substantially reduces staff marking time (in this trial, to less than one-third of the time required for PBA). The ability for one staff member to mark a whole question for large numbers of students helps ensure that the assessment process is fair, because it reduces the variability often observed when different markers are used.

    Further studies are currently being performed to monitor the extended use of such CAA technology for larger classes and for a variety of subjects in the Faculty of Life Sciences, The University of Manchester. In particular, we aim to demonstrate that the use of CAA does not prejudice the marks awarded but is an objective assessment tool.

    Appendix: Student Perceptions of CAA

    The following is the anonymous questionnaire given to students:

    At the end of the first semester, the practical component of the unit BL1811 (Body Systems) was assessed using two methods: half of the questions were posed and submitted online, whereas half of the questions were completed on paper. This questionnaire is designed to determine your perceptions concerning these methods of assessment and provides a chance for you to air any comments you may have.

    Thank you for taking the time to fill in this questionnaire,

    Drs. Liz Sheader, Niggy Gouldsborough, and Ruth Grady

    Please circle the most appropriate response to each question.

    Q1. Are you

    Female

    Male

    Q2. Prior to undertaking your degree, how would you rate your competence in using a computer for the following?

    E-mail

    Never used

    Less than competent

    Competent

    More than competent

    Internet

    Never used

    Less than competent

    Competent

    More than competent

    Word processing

    Never used

    Less than competent

    Competent

    More than competent

    Spreadsheets

    Never used

    Less than competent

    Competent

    More than competent

    Games

    Never used

    Less than competent

    Competent

    More than competent

    Please feel free to add any comments.

    Q3. How would you have rated your computer literacy overall prior to coming to university?

    Less than adequate

    Adequate

    More than adequate

    Please feel free to add any comments.

    Q4. Before completing the Body Systems assessment, had you any previous experience in submitting work online for assessment (other than answering multiple-choice questions)?

    Yes

    No

    Please give details.

    Q5. Were the instruction sheet, guidance, and on-line help adequate training for using the CAA software?

    Less than adequate

    Adequate

    More than adequate

    Please feel free to add any comments.

    Q6. How did you rate your confidence in your ability to use the CAA software?

    Not confident

    Fairly confident

    Confident

    Extremely confident

    Please feel free to add any comments.

    Q7. Prior to using CAA, how did you rate your confidence in the on-line system as a method of assessment?

    Not confident

    Fairly confident

    Confident

    Extremely confident

    Please feel free to add any comments.

    Q8. Prior to using CAA, how did you rate your confidence in using written paper answers as a method of assessment?

    Not confident

    Fairly confident

    Confident

    Extremely confident

    Please feel free to add any comments.

    Q9. Before you undertook the CAA, did you anticipate any of the following limitations?

    Accessing the software

    Security of the system

    Problems with logging on

    Submitting your answers

    Saving and editing your answers

    Did not anticipate any problems

    Other (please give details)

    Please feel free to add any comments.

    Q10. What actual limitations did you encounter when using the CAA?

    Accessing the software

    Security of the system

    Problems with logging on

    Submitting your answers

    Saving and editing your answers

    Did not encounter any problems

    Other (please give details)

    Please feel free to add any comments.

    Q11. What were the advantages of submitting some of your assessment questions online?

    Q12. What were the disadvantages of submitting some of your assessment questions online?

    Q13. Which method of assessment took longer to complete?

    CAA

    Paper assessment

    No difference

    Please feel free to add any comments.

    Q14. How do you rate the CAA compared with the paper-based assessment?

    CAA better

    Paper assessment better

    No difference

    Please feel free to add any comments.

    Q15. If you had the choice, would you prefer to use CAA or paper-based assessment?

    CAA

    Paper assessment

    No preference

    Please feel free to add any comments.

    Q16. Do you have any other comments you would like to add about CAA?

    Thank you for taking the time to fill in this questionnaire.

    Acknowledgments

    The authors thank Dr. Mary McGee Wood, Dr. John Sargeant, and Phil Reed (School of Computer Science, University of Manchester, Manchester, UK) for the technical advice and support regarding CAA.

    REFERENCES

    Bull J and Collins C. The use of computer-assisted assessment in engineering: some results from the CAA national survey conducted in 1999. Int J Electr Eng Educ 39: 91–99, 2002.

    Dowsing R. The computer-assisted assessment of practical IT skills. In: Computer-Assisted Assessment in Higher Education, edited by Brown S, Race P, and Bull J. London: Kogan Page, 1999, p. 131–138.

    Higgins R, Hartley P, and Skelton A. Getting the message across: the problem of assessment feedback. Teach Higher Educ 6: 269–274, 2001.

    Higher Education Academy Generic Centre. Resources Database: Enhancing Student Learning Through Effective Formative Feedback (online). http://www.heacademy.ac.uk/resources.asp?id=353&process=full_record&section=generic [30 October 2006].

    Ogilvie R, Trusk T, and Blue A. Students’ attitudes towards computer testing in a basic science course. Med Educ 33: 828–831, 1999.

    Pain D and Le Heron J. WebCT and online assessment: the best thing since SOAP? Educ Technol Soc 6: 62–71, 2003.

    Quality Assurance Agency for Higher Education. Code of Practice for the Assurance of Academic Quality and Standards in Higher Education (online). http://www.qaa.ac.uk/academicinfrastructure/codeOfPractice/section6/COP_AOS.pdf [30 October 2006].

    Race P. Designing assessment and feedback to enhance learning. In: The Lecturer’s Toolkit (2nd ed.). London: Kogan Page, 2001, p. 31–103.

    Sargeant J, Wood M, and Anderson A. A Human-Computer Collaborative Approach to the Marking of Free-Text Answers. Loughborough, UK: Eighth International CAA Conference, 2004, p. 361–370.

    Sclater N and Howie K. User requirements of the "ultimate" online assessment engine. Computers Educ 40: 285–306, 2003.

    University of Manchester. The ABC Project: a Brief Overview (online). http://elearn.cs.man.ac.uk/examweb/overview.htm [30 October 2006].

    Wang T, Wang K, Wang W, Huang S, and Chen Y. Web-based assessment and test analyses (WATA) system: development and evaluation. J Computer Assist Learn 20: 59–71, 2004.

    Wen M and Tsai CC. University students’ perceptions of and attitudes toward (online) peer assessment. Higher Educ 51: 27–44, 2006.