

Procedia Technology 12 (2014) 667 - 674

The 7th International Conference Interdisciplinarity in Engineering (INTER-ENG 2013)

Innovative method of peer assisted learning by technology and

assessment of practical skills

Liviu Moldovan*

"Petru Maior" University, 1 Nicolae Iorga street, 540088 Tirgu Mures, Romania

Abstract

The purpose of this paper is to present some results of the project entitled "Innovative Peer Learning Assessment System for Evaluation of Trainers and Quality VET Professional Programs" (iQVET), financed by the European Commission. Experiences with the delivery of the Quality auditor course by the Chamber of Commerce and Industry Mures are presented. The paper reports an experimental educational study with two objectives: to design and evaluate a pedagogical approach that can enhance the core competencies of quality auditors, and to innovate technology enhancement strategies that can create an effective collaborative peer assisted learning and assessment environment. The findings comprise the evaluation of the teaching intervention according to the Kirkpatrick methodology, at the levels of reaction, learning, behaviour and results. The learning outcome is evaluated with a new tool that provides verification feedback to trainees immediately after a test or exam, using the Peer Learning Assessment System Professional (PeLePro) for mobile technology. It doesn't require any reconstruction of classrooms or other infrastructure at the training site, due to the portability and availability of handheld tools among the trainees. It is possible to apply this evaluation method in any classroom connected to a Wi-Fi network.

© 2013 The Authors. Published by Elsevier Ltd.

Selection and peer-review under responsibility of the Petru Maior University of Tirgu Mures.

Keywords: peer assisted learning; peer assessment; vocational education and training; quality auditor; quality practical thing.

* Corresponding author. Tel.: +40740498427; fax: +40265211838. E-mail address: liviu.moldovan@ing.upm.ro

2212-0173 © 2013 The Authors. Published by Elsevier Ltd.

Selection and peer-review under responsibility of the Petru Maior University of Tirgu Mures. doi: 10.1016/j.protcy.2013.12.547

1. Introduction

With a reduced amount of course time spent on practical demonstrations, the practical aspects of quality are difficult to understand. Thus, for most vocational education and training (VET) instructors it is difficult to demonstrate practical quality skills to their trainees. Within such an approach, teaching quality concepts to large trainee groups within limited practical class time is difficult.

On the other hand, VET instructors are currently unable to judge and assess on the fly trainees' reproduction, demonstration and performance of learning. The short time available for actual interaction between trainer and trainees makes it challenging for the instructor to assess trainees' level of knowledge of the course material and of the prerequisite knowledge required. Assessment and certification usually target the amount of knowledge retained by the trainees immediately after the course rather than the amount of knowledge applied on the job some time afterwards. In-class communication, interaction and collaboration processes that may enhance effective peer learning, where trainees learn from their peers, are difficult to measure with existing ICT technologies.

Against this background, the project "Innovative Peer Learning Assessment System for Evaluation of Trainers and Quality VET Professional Programs" (acronym iQVET) [16], promoted by Sør-Trøndelag University College, Trondheim, Norway, together with four VET training organisations in Europe, including the Chamber of Commerce and Industry Mures (CCIM), provides instant evaluation in industry-oriented VET professional courses by adapting, testing and validating a new online service based on the Peer Learning Assessment System Professional (PeLePro). It extends Kirkpatrick's learning outcome evaluation model with modern, easy-to-use, inexpensive mobile response technology to include Assessment for Learning in competence-based systems.

2. Methods

Educational research uses two main designs: naturalistic and experimental. Naturalistic designs look at specific or general issues as they occur. In contrast to a naturalistic research design, experimental designs usually involve an educational intervention [4]. This paper reports an experimental educational study with the objectives:

• To design and evaluate a pedagogical approach which can enhance core competencies of quality auditors, encourage deeper understanding of key concepts and promote feedback during the same practical class of large trainee groups;

• To innovate technology enhancement strategies that can create an effective collaborative peer assisted learning and assessment environment for the current practical experiments of the VET trainees.

Sen and Selvaratnam [8] have described a technology enhanced teaching demonstration as an innovative method of peer assisted learning of practical skills in medicine for a non-dissection-based integrated anatomy curriculum.

The Chamber of Commerce and Industry Mures (CCIM) delivers VET courses according to the occupational standard for Quality auditor, code R43 [14], with a duration of 40 hours, one quarter of which is dedicated to practical applications [15].

During the quality auditor courses a teaching innovation has been successfully tested. Trainees, organised in collaborative learning groups, solve practical tasks specially designed by the instructor with the aid of various video resources. The practical sessions are structured in two parts: the first under the instructor's guidance, the second self-directed by the trainees. In the last part of the practical session, selected groups performed peer teaching by demonstrating key concepts to the whole group with the available quality resources: pictures, videos, documents, web resources, samples with defects, 3D images of defects, etc. (Fig. 1).

Then a peer learning assessment is performed by engaging large trainee groups in experiences of practical skills, in combination with appropriate real-time feedback, in order to improve their deep understanding and learning of real-world aspects.

The teaching intervention is evaluated according to the Kirkpatrick methodology at the levels of reaction, learning, behaviour and results. iQVET provides a faster and cheaper solution to this evaluation process, while at the same time adding new features by using mobile technology. The reaction and learning levels are evaluated with a new tool that provides verification feedback to trainees immediately after a test or exam, using the Peer Learning Assessment System Professional (PeLePro) for smartphones.

Fig. 1. Quality resources: (a) video; (b) picture.

Mobile technology doesn't require any reconstruction of classrooms or other infrastructure at the campus, due to the portability and availability of handheld tools among the trainees. It is possible to apply this evaluation method in any classroom connected to a Wi-Fi network.

3. Evaluation model

Evaluation activities are critical to the effective development of interactive learning systems. Evaluation should guide the creative development process by providing timely and insightful information about the status of the design ideas and the quality of their implementation [11]. Although training evaluation is recognized as an important component of the instructional design model, there are no theoretically based models of training evaluation [7]. Evaluation can be characterised as the process by which people make judgements about value and worth. Olivier [10] showed that the judgement process is sometimes complex and controversial, following the continuous development of teaching technology. The need for practitioners to carry out their own evaluations has led to concerns about expertise: instructors usually have expertise in evaluating the teaching of a certain subject, but they are not able to perform programme evaluations. Oliver and Conole [9] have structured evaluation design around a model that incorporates six stages. These steps are as follows:

1. Identification of stakeholders;

2. Selection and refinement of evaluation question(s), based on the stakeholder analysis;

3. Selection of an evaluation methodology;

4. Selection of data capture techniques;

5. Selection of data analysis techniques;

6. Choice of presentation format.

The first two and the last steps of the model relate to the context, while the middle three focus on the details of the study itself.
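The six-stage model above can be sketched as a simple ordered structure; the stage names and their context/study split follow the text, while the helper function and its name are purely illustrative:

```python
# Illustrative sketch of Oliver and Conole's six-stage evaluation design.
# The "context"/"study" tags reflect the split described in the text.

STAGES = [
    ("identify stakeholders", "context"),
    ("select and refine evaluation question(s)", "context"),
    ("select an evaluation methodology", "study"),
    ("select data capture techniques", "study"),
    ("select data analysis techniques", "study"),
    ("choose a presentation format", "context"),
]

def stages_by_focus(focus: str) -> list[str]:
    """Return, in order, the stage names with the given focus tag."""
    return [name for name, tag in STAGES if tag == focus]
```

Under this representation, the "study" query returns exactly the middle three stages, matching the description above.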

The evaluation of an educational event may have many purposes; each evaluation should be designed for the specific purpose for which it is required and for the stakeholders involved [1]. For a short educational course, for example, the purpose might be to assist organisers in planning improvements for the next time it is held. The systematic collection of participants' opinions using a specifically designed questionnaire may be appropriate.

Kirkpatrick [5] described four levels of evaluation in which the complexity of the behavioural change increases as evaluation strategies ascend to each higher level. According to the Kirkpatrick methodology, the teaching innovation of the Quality auditors delivered by CCIM is evaluated at levels of (Fig. 2):

1- Reaction (course evaluation and trainee feedback about course content and instructor);

2- Learning (the resulting increase in knowledge or capability certified by formative and summative exam grades);

3- Behaviour (extent of behaviour and capability improvement and implementation/application effect of the course in organisation);

4- Evaluation of results (product quality in society - the effects on the business or environment resulting from the trainee's performance).

Fig. 2. Kirkpatrick's hierarchy of evaluation levels.

Complexity of evaluation increases as evaluation of the intervention ascends the hierarchy, as shown in Fig. 3. At the first two levels the trainee's evaluation is dominant in relevance, while at the last two levels the employer's evaluation is dominant.

Fig. 3. Complexity of evaluation hierarchy.
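The hierarchy of Figs. 2 and 3 can be sketched as a small data structure; the level names and evidence follow the text, while the field names and helper function are invented for illustration:

```python
# Hypothetical sketch: Kirkpatrick's four levels, with the evaluator whose
# judgement dominates at each level as described in the text (trainee at
# levels 1-2, employer at levels 3-4).
from dataclasses import dataclass

@dataclass(frozen=True)
class KirkpatrickLevel:
    number: int
    name: str
    evidence: str            # what is measured at this level
    dominant_evaluator: str  # whose judgement is dominant in relevance

LEVELS = [
    KirkpatrickLevel(1, "Reaction", "course evaluation and trainee feedback", "trainee"),
    KirkpatrickLevel(2, "Learning", "formative and summative exam grades", "trainee"),
    KirkpatrickLevel(3, "Behaviour", "effect of the course in the organisation", "employer"),
    KirkpatrickLevel(4, "Results", "product quality in society", "employer"),
]

def evaluator_for(level_number: int) -> str:
    """Return the dominant evaluator for a given level (1-4)."""
    return next(l.dominant_evaluator for l in LEVELS if l.number == level_number)
```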

Kirkpatrick's learning evaluation model has been used in the training of professionals for nearly 50 years without new innovations on the technology side. iQVET provides a major innovation to Kirkpatrick's learning evaluation model by introducing assessment of learning outcomes using trainees' own mobile devices. The first two levels in Kirkpatrick's learning evaluation model are performed with a new tool, the Peer Learning Assessment System Professional (PeLePro) for mobile devices, a software tool developed in the iQVET project. It provides verification feedback to trainees immediately after assessment. The assessment comprises several steps:

Step 1: A typical PeLePro session consists of running the application on a computer connected to the iQVET server; trainees connect to the session using an electronic device, such as a smartphone, tablet or computer, connected to the same server as the application.

Step 2: During the assessment the trainees are presented with the test on the device screen and respond to it.

Step 3: The result-consideration phase follows test submission, when the instructor gets a complete overview of the results submitted by the trainees and of how they answered the assessment.

Step 4: In the post-assessment phase, new questions are elaborated, in connection with the problematic questions from the testing phase.
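The four steps above can be sketched as a minimal in-memory model; the class and method names are invented for illustration and do not reflect the real PeLePro API:

```python
# Illustrative sketch only: an in-memory model of the four assessment steps.
# The real PeLePro client-server protocol is not documented here.

class AssessmentSession:
    def __init__(self, questions):
        # Step 1: the session is opened on the server with a set of questions.
        self.questions = questions
        self.responses = {}  # trainee id -> list of answers (None until submitted)

    def connect(self, trainee_id):
        """Step 1 (client side): a trainee joins from a handheld device."""
        self.responses[trainee_id] = None

    def submit(self, trainee_id, answers):
        """Step 2: the trainee answers the test shown on the device screen."""
        self.responses[trainee_id] = answers

    def overview(self):
        """Step 3: the instructor's complete overview of submitted results."""
        return {t: a for t, a in self.responses.items() if a is not None}

    def problematic_questions(self, answer_key):
        """Step 4: indices of questions most trainees missed, as a starting
        point for elaborating new questions in the post-assessment phase."""
        submitted = list(self.overview().values())
        flagged = []
        for i, correct in enumerate(answer_key):
            wrong = sum(1 for ans in submitted if ans[i] != correct)
            if submitted and wrong * 2 > len(submitted):
                flagged.append(i)
        return flagged
```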

The mobile evaluation system gives instructors a new tool to provide verification or elaborative feedback to trainees immediately after a test or exam. This method is a form of collaboratively supported learning that helps trainees improve their knowledge of the study subject and facilitates active learning.

Use of mobile technology doesn't require any reconstruction of classrooms or other infrastructure at the campus, due to the portability and availability of handheld tools (smartphones, tablets and laptops) among the trainees. It is therefore possible to apply this evaluation method in any classroom connected to a Wi-Fi network, thus reducing the costs of carrying out advanced and improved evaluations in large-scale training environments with thousands of trainees [16].

Trainees' experiences with learning processes and response technologies have been described by Thorseth and Stav [13]. Experiences with the use of online response technologies for smartphones in the education of engineers have also been reported by Hansen-Nygard et al. [2, 3].

4. Course evaluation

Every year new challenges emerge in the field of training and development: for example, competency development, outsourcing, e-learning, and knowledge management, to name a few. In spite of the variety and complexity of these challenges, there is a common theme: business leaders want to see value for their investment. Do people's initial reactions to the learning experience indicate that the learning is relevant and immediately applicable to their needs? How effective is the learning and how sustainable will it be? What are people doing differently and better as a result? What results are these investments in learning and development having for the business? These are the fundamental questions that have been asked every year about training and development since Kirkpatrick put them on the agenda of business and learning leaders [6].

In this section, following Kirkpatrick's learning evaluation model, the four-level method of evaluation applied to the quality auditor course is presented. The course was delivered by CCIM over a period of one month in June 2013, to a group of 29 trainees studying to become quality auditor professionals.

In the first level of course evaluation, about reaction, according to Kirkpatrick's model, trainees' feedback was collected in a survey given at the end of the course, using a questionnaire with three sections.

The first section of the questionnaire assessed the trainee's prior involvement with the taught subject, and then evaluated, from the trainee's point of view, the immediate impression of the material presented, the courseware, the personal benefit of the course and the instructor's abilities in course delivery: in other words, what the trainees felt about the training.

It was assessed whether the trainees had previous experience in quality assurance, from both a theoretical and a practical point of view. The results show a mixed distribution in the group, with experienced (10) and medium-level (19) trainees.

In the second section of the reaction evaluation questionnaire the courseware was evaluated: the clarity of the handouts and PowerPoint presentation; whether the course material and presentation followed the course subject; and the overall rating of the course material provided during training. The third section addressed the personal benefit of the course: understanding of the course subject; the fit of the course content to the trainee's requirements; the trainee's self-assessed knowledge after the course; and the overall rating of the course.

In a further part of the reaction evaluation questionnaire the instructor was evaluated: professional knowledge of the training subject; practical examples offered during training; presentation abilities, clarity of expression and speech; pedagogical method and interaction with participants, including discussions; openness to suggestions from participants; and ability to respond appropriately to questions.

Data collection was done by means of the Peer Learning Assessment System Professional (PeLePro). The distribution of answers gives a clear picture of the trainees' reaction.

In the second level of course evaluation, about learning outcomes, according to Kirkpatrick's model, we started from the definition of learning evaluation: a process through which instructors appreciate the quality of teaching through the achievements of the trainees. Considering the moment of evaluation, it can be formative evaluation, monitoring trainees' learning continuously through ongoing feedback, or summative evaluation, conducted at the end of the course. Usually in education, courses finish with a final exam for the assessment of knowledge. Summative evaluations make a high contribution to the final grade, while formative assessments make little or none. Summative evaluation may consist of a final exam, project, essay or report. Results from summative evaluations may be used to improve teaching and learning efforts for the next generation of trainees.

In our course delivery the traditional exam was replaced with a number of tests that provide feedback from trainee to instructor and from instructor to trainee. The course was completed with a smaller final exam, since the tests contribute to the final score for the course.
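The weighting of in-course tests against a smaller final exam can be sketched as follows; the weights and function name are assumed for illustration and are not taken from the paper:

```python
# Minimal sketch (assumed weights): in-course tests contribute to the final
# score alongside a smaller final exam, as described in the text.

def final_score(test_scores, exam_score, exam_weight=0.4):
    """Combine the mean of in-course test scores with the final exam grade.

    test_scores: iterable of test scores in [0, 100]
    exam_score: final exam score in [0, 100]
    exam_weight: fraction of the grade carried by the exam (assumed value)
    """
    test_scores = list(test_scores)
    tests_mean = sum(test_scores) / len(test_scores)
    return (1 - exam_weight) * tests_mean + exam_weight * exam_score
```

For example, with two test scores of 80 and 90 and an exam score of 70, the combined grade under the assumed 60/40 split is 79.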

In the formative and summative evaluation phases the Peer Learning Assessment System Professional (PeLePro) for smartphones was used. It implements a new learning outcomes evaluation model in which test results for a class are turned into an active, creative and collaborative learning process by means of immediate feedback, as described by Stav [12]:

• Verification feedback led by the instructor, demonstrating to the trainees why a particular answer is correct and why the others are incorrect;

• An elaborative feedback discussion run by the trainees: the answers are displayed, but the trainees do not know which are correct or incorrect;

• An elaborative feedback discussion led by one trainee: the deviation from the correct answer is discussed without addressing why that answer is correct and the others are incorrect.

The methodology consists of preparing a number of multiple-choice tests on the study subject. These are distributed to trainees through the PeLe Professional software, and each trainee uses a mobile device to answer the questions. The evaluation system comprises an embedded automatic marking system, which helps the instructor see the trainees' degree of participation and the results of the evaluation.

In the second level of course evaluation, about learning, the increase in trainees' knowledge and capability was evaluated through formative and summative exam grades obtained with PeLe Professional. The instructor evaluated participants on the amount of knowledge received during training, using questions with multiple alternative answers, such as: The quality pyramid consists of: a) characteristics; b) delivery time; c) required resources; d) product price; e) after-sales service, etc.
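The embedded automatic marking of such multiple-alternative questions can be sketched as follows; the data structures and function names are invented for illustration and do not reflect the real PeLe Professional internals:

```python
# Hypothetical sketch of automatic marking: each question may have several
# correct alternatives, and the instructor sees both the participation
# degree and per-trainee results, as described in the text.

def mark_response(correct: set, chosen: set) -> bool:
    """A response counts as correct only if exactly the correct
    alternatives were chosen."""
    return chosen == correct

def class_results(answer_key, responses):
    """answer_key: list of sets of correct alternatives per question.
    responses: {trainee: list of chosen-alternative sets}.
    Returns (participation count, {trainee: number of correct answers})."""
    scores = {
        trainee: sum(mark_response(key, ans) for key, ans in zip(answer_key, answers))
        for trainee, answers in responses.items()
    }
    return len(responses), scores
```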

Data collection was done by means of the Peer Learning Assessment System Professional (Fig. 4). The distribution of answers gives a clear picture of the trainees' learning outcomes. The results were used for grading the trainees.

In the third level of evaluation, about behaviour, according to Kirkpatrick's model, we evaluate whether the quality auditor programme supports improvement of the quality management system in the company. The changes in behaviour determined by the effect of the course are evaluated: applying theory at the workplace/organisation in concrete situations of the workflow and production delivery, but also during audits.

Evaluation was done one month after course delivery, with information collected from both trainees and employers. The trainee evaluates the course's effect on quality behaviour: understanding current quality issues related to the workplace; identifying the requirements of ISO 9001/ISO 19011 for the organisation's quality management system and their application in the organisation; the degree of understanding of quality issues during internal/certification audits of the company/workplace; development of skills in quality assurance, etc.

The employer also evaluates the course's effect on quality behaviour, through the employee's capability to select which responsibilities and tasks to take on in the job; to select the activities needed to perform a quality-related task; to select the best methods and tools to use in the job; to plan their own work; and their preference for a particular type of work.

Fig. 4. (a) trainees evaluated with PeLePro; (b) results of evaluation.

The distribution of answers gives a clear picture of the transformation of behaviour after the trainees' graduation from the course. An improvement in quality behaviour in the company was reported by both trainees and employers.

In the fourth level of evaluation, about results, according to Kirkpatrick's model, we evaluate whether the quality auditor programme supports trainees' professional development and also improvement of the company's quality management system, through changes in product quality. The effects on the business or environment resulting from the trainees' performance are evaluated.

Evaluation is done two months after the course, with information collected from both trainee and employer. The trainee evaluates the course's effect on professional development and quality results: finding a new job in or related to the profile; promotion within the company; changes in the manager's attitude; improvement of results at work; performing quality tasks more easily, etc.

After the course, the employer evaluates the degree to which the employee fulfils a legal requirement for the company; the variation in the number of non-conformities in the employee's area of intervention; the quality level of the product/service in that area; and the efficiency of quality-related activities. The distribution of answers gives a clear picture of the transformation of product quality after the trainees' graduation from the course. Both trainees and employers reported an improvement in quality-related activities.

5. Discussion and conclusion

The study describes the successful design, introduction and evaluation of a novel technology-enhanced peer collaborative learning and assessment environment centred on the trainee. Carrying out focused practical tasks collaboratively and demonstrating them to peers, together with peer assessment, produced significant learning and development of transferable skills.

Sør-Trøndelag University College of Trondheim, Norway, is coordinator of the project "Innovative Peer Learning Assessment System for Evaluation of Trainers and Quality VET Professional Programs" (acronym iQVET), in partnership with four VET training organisations in Europe, including the Chamber of Commerce and Industry Mures (CCIM). The main deliverable of the iQVET project is the Peer Learning Assessment System Professional (PeLePro) for mobile devices. It is used in quality auditor vocational education and training to assess the reaction and learning outcome levels according to Kirkpatrick's learning evaluation methodology.

The teaching innovation ensured that the knowledge and skills learnt in the collaborative learning environment in individual small groups were reinforced to attain a higher level of competency by teaching and demonstrating to the whole group of trainees.

Trainees' evaluations of the collaborative learning environment and of peer evaluation are highly positive. An improvement of 12% in summative assessment scores was achieved.

The peer assessment of each demonstrating group indicated a high rating for the use of quality resources and the accuracy of content. Peer assessment encouraged feedback to the trainees, creating a collaborative learning arena.

Acknowledgements

Supported by a grant that is financed by the European Commission. This publication reflects the views only of the author, and the Commission cannot be held responsible for any use, which may be made of the information contained therein.

References

[1] Edwards J. Evaluation in adult and further education: a practical handbook for teachers and organisers. Workers' Educational Association Liverpool 1991.

[2] Hansen-Nygard G, Nielsen KL, Stav JB, Thorseth TM, Arnesen K. Experiences with online response technologies in education of engineers, Proceedings from the International Conference on Computer Supported Education (CSEDU 2011), Noordwijkerhout, Netherlands 2011.

[3] Hansen-Nygard G, Nielsen KL, Stav JB, Thorseth TM, Arnesen K. Experiences with use of Open, Web-based Student Response Services for Smartphones, Proceedings from the International Technology, Education and Development Conference (INTED 2011), Valencia, Spain 2011: 4944-4950.

[4] Hutchinson L. Evaluating and researching the effectiveness of educational interventions, BMJ 318(7193) 1999: 1267-1269. On line available http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1115652/

[5] Kirkpatrick DL. Evaluation of training. Training and development handbook. R. Craig, L. Bittel, Editors. McGraw-Hill, New York 1999.

[6] Kirkpatrick DL, Kirkpatrick JD. Evaluating training programs. The four levels. Third edition. Accessible Publishing Systems PTY, Ltd. 2010.

[7] Kraiger K, Ford JK, Salas E. Application of cognitive, skill-based, and affective theories of learning outcomes to new methods of training evaluation. Journal of Applied Psychology 1993; 78(2): 311-328.

[8] Sen A, Selvaratnam L. Technology enhanced teaching demonstration as an innovative method of peer assisted learning of practical skills, 3rd International Conference of Education, Research and Innovation, Madrid, Spain, ICERI2010 Proceedings 2010: 830-838.

[9] Oliver M, Conole G. The Evaluation of Learning Technology - an overview. Innovation in the Evaluation of Learning Technology. M. Oliver, Editor. University of North London Press, London 1998; 5-22.

[10] Olivier M. An introduction to the Evaluation of Learning Technology. Educational Technology & Society 2000; 3(4): online available: http://www.ifets.info/others/journals/3_4/intro.html

[11] Reeves TC, Hedberg JG. Interactive learning systems evaluation; Educational Technology Publications, Inc., New Jersey 2003.

[12] Stav JB. Designing new assessment methods that turn immediate results obtained in tests into a creative learning tool by use of smartphones, Proceedings from the 3rd annual International Conference on Education and New Learning Technologies, Barcelona, Spain 2011; 6797-6800.

[13] Thorseth TM, Stav JB. Students' experience with learning processes, response technologies and webapps for smartphones, Proceedings from the 3rd annual International Conference on Education and New Learning Technologies, Barcelona, Spain 2011; 6733-6740.

[14] *** http://www.anc.gov.ro/uploads/so/rZAuditor%20al%20calitatii.pdf

[15] *** http://www.cciams.ro/cursuri/index.html

[16] *** project iQVET, online at http://histproject.no/node/684.