1. “Securing Assessors' Professionalism: Meeting Assessor Requirements for the Purpose of Performing High-Quality (RPL) Assessments”
Author: Antoinette van Berkel
It mainly consists of four parts: (1) introducing “Recognition of Prior Learning” (RPL) in the Netherlands and its implementation, and establishing the necessity of securing assessor professionalism; (2) exploring the notion of assessor competence; (3) arguing for the importance of assessor certification and introducing the “Professionalization and Certification Programme for Assessors”, which comprises three parts: professional development, performance assessment, and maintenance of the certificate; (4) comparing two tried-and-tested certification programmes and analyzing the lessons learned in developing and enacting them.
Refining and Applying the Assessment Standard. Mastering the assessment standard is one of the basic assessor skills, which makes it a very important element of the training programme. The training programme pays special attention to making standards workable by defining three elements that make them suitable for use:
- Typical professional tasks the assessee should be able to perform. These are defined as key tasks, activities, critical situations and problems or dilemmas.
- Criteria that describe what competent performance entails, i.e. what adequate behaviour is expected.
- Examples of proof that assessees may submit to demonstrate competent performance.
2. “Components of Good Assessor Training”
Author: Mary Lee Howe
It is mainly about the author’s experience as a trainer of assessors and shares the components a good assessor training should have, such as the basic qualities a trainer should possess and the interaction between trainer and trainee.
When teaching a group of bright, competent adults, it is critical to be an articulate speaker, project self-confidence, manifest appropriate body language and physical gestures and, in general, be “quick on one’s feet”. Effective trainers utilize their voice to make a point, maintain eye contact with the audience, vary the pace to avoid monotony, move about the room, and generate enthusiasm throughout the presentation.
3. “Evaluating the Evaluator: Development, Field Testing, and Implications of a Client-Based Method for Assessing Evaluator Performance”
Author: Kathleen Dowell, Jean Haley, Jo Ann Doino-Ingersoll
It mainly covers five parts: (1) discussing, in general terms, the importance of performance assessment and client satisfaction; (2) introducing the development process of the Client Feedback Form (CFF); (3) introducing the contents of the CFF tool; (4) using the CFF tool to carry out a survey of evaluator performance and analyzing the data received; (5) concluding with the lessons learned from the field test about how to improve the CFF tool.
Overview of the CFF Tool. The CFF (Client Feedback Form) starts with a question asking respondents about their level of involvement in the project in question. Respondents are provided with nine choices and can select all that apply from the following list:
- Involved in selecting the evaluator
- Provided input to the evaluation plan
- Key decision maker (for example, approved instruments, reports, changes in the plan, and so on)
- Day-to-day point of contact with the evaluator
- Handled my organization’s responsibilities in the evaluation
- Approved invoices/interim status reports
- Read/commented on final evaluation report/s
- Participated in interpreting results/writing recommendations
4. “Reframing the Position of the Evaluator”
Author: S. Gattenhof
It explores the positioning of the evaluator and defines two partnership types that evaluators can adopt. It makes recommendations on which partnership type is more effective in a participatory evaluation model. It investigates the two positions most often adopted by researchers/evaluators – external and distanced, or embedded and collaborative – and argues the merits and deficiencies of the two approaches. The chapter also discusses how the embedded and collaborative approach can be aligned with the notion of co-production of research.
In some cases, arts projects and cultural institutions will provide:
- A list of generalized key performance indicators rather than clear expectations about what sort of evaluation data they want, how they want it, when they want it, and what they will do with it;
- A predetermined definition of value and predetermined data on value;
- Data gathered by artists or arts marketers days, months, or years after a project, and, in some cases, already collated into trends and statistics, with selected quotes.
5. “Beyond Being an Evaluator: The Multiplicity of Roles of the Internal Evaluator”
Author: Boris B. Volkov
It explores critical roles of internal evaluators in contemporary organizational settings. It highlights the need for an expanded, reconfigured, unorthodox set of roles and styles of work to meet the needs of emerging learning organizations effectively. A discussion of the major categories of internal evaluator roles, which emerged from an analysis of the evaluation literature and other sources, suggests new directions for how internal evaluation is conceptualized and practiced. Systematically promoting and advancing positive change, evaluation capacity building, decision making, learning, and evaluative thinking in organizations are seen as part of the harmonized internal evaluator role.
Table 3.1 (pp. 6–7): Major Categories of the Evaluator Roles Found in the Evaluation Literature
- Change agent: agent of social change; activist; promoting social justice, transformation, and democracy
- Educating about evaluation: teaching, training, coaching, mentoring, and providing technical assistance to managers, and other stakeholders; popularization of evaluation; resource for infusing evaluative thinking
- ECB (evaluation capacity building) practitioner: Building evaluation capacity; long-term education; promoting evaluation; infusing evaluative thinking; facilitating learning processes; leadership role in “mainstreaming” evaluation into the organization; building an organization’s skills and knowledge; driving force for ECB
- (Management) decision-making support: Supporting program decision making; management facilitator; decision-making supporter; analyst, adviser, and consultant; management supporter; management information resource; management decision-support specialist; administrator’s tool; problem solver; expert troubleshooter
- Consultant: Consultant; management consultant; program staff consultant; organizational development consultant; consultant–mediator; advisor or a consultant to program managers; adviser; counselor
- Researcher/technician/analyst: Researcher; applied researcher supporting organizational development and learning; social researcher and operations researcher; social scientist; technical servant; technician; technical geek; collaborative researcher; action researcher; policy analyst studying topics selected by top management
- Advocate: Program advocate; advocacy for support; champion of evaluation; advocate for intended primary users; evaluation use advocate; advocate for the program’s target groups; advocate for the most vulnerable populations; advocate for cultural justice
- Organizational learning promoter: Building, supporting, and promoting organizational learning; organizational development consultant; advancing organizational knowledge; educating about learning and change processes
- Other roles: Facilitator; generalist/jack-of-all-trades; planner; collaborator; Multiple authors; independent observer; evaluator; judge; information specialist