This section aims to provide evaluators and stakeholders with better insight into the evaluation process within MSF.

  • 1. Who does what during the Evaluation Process (Roles and Responsibilities)?

    These are the main roles and responsibilities during the evaluation process:

    EVALUATION UNIT

    • ToR definition together with commissioner of evaluation
    • Supervision of the evaluation: process, methods, analysis; feedback on the inception report
    • Approving draft report: collecting feedback, ensuring completion of final report
    • Organisation & communication: contracts, travel arrangements, briefings, arranging presentations, communication with stakeholders on the process
    • Dissemination of findings & follow-up of recommendations

    EVALUATOR(S)

    • Exchange and agree on process and methods with evaluation manager
    • Share a brief inception report at an early stage (before field visit)
    • Share bullet points of first findings after the field visit
    • Share first draft of report with evaluation manager
    • Finalize the report based on the feedback provided
    • Be available for presentations after the evaluation

    We expect evaluators to work with our Evaluation Manual and adhere to the templates for the inception and the final report. 

  • 2. How can I apply for the MSF Evaluators Pool?

    This is how you can apply to be included in the Evaluators Pool:
     
    Please download and fill in the Application Form. Return the completed form and your CV to evaluation@vienna.msf.org
     
    If your qualifications match MSF's requirements, your information will be included in the MSF Evaluators Pool. 
  • 3. Where do I find out about doing an evaluation with MSF?

    MSF regularly posts evaluation opportunities on the ALNAP portal. Please register for the ALNAP weekly newsletter to stay informed of current MSF evaluation consultant needs. 

    MSF also actively manages an evaluators pool. To register your interest with the MSF evaluators pool, please fill in this form and send it to evaluation@vienna.msf.org along with your CV.

  • 4. Where can I find a list of all MSF acronyms?

    There is a full and public 'MSF Glossary of Acronyms' you can refer to: https://tukul.msf.org/glossary_export/HomePage.html

  • 5. Evaluation Definitions

    Definitions of the different learning and accountability exercises in use in MSF:

    Evaluation

    Definition: The assessment, as systematic and objective as possible, of an ongoing or completed project, programme or policy, including its design, implementation and results. The aim is to determine the relevance, appropriateness, effectiveness (timeliness, reactivity), efficiency, continuity and/or impact.[i]

    Utilisation/purpose: Evidence base for important decisions (re-orientation, change). Understanding the effects/outcomes of (pilot) programmes and the reasons for success or failure. Organisational learning and accountability.

    Review

    Definition: An assessment of the performance of a project/programme, similar to an evaluation but by choice less comprehensive; it may compromise on methodological rigour.

    Utilisation/purpose: Rapid overview for management.

    Real-time evaluation

    Definition: A real-time evaluation (RTE) is conducted at the early stages of the response to a humanitarian emergency. The evaluator(s) act as "a stranger who sees more" because of their distance from the day-to-day activities. The focus is on gaining as much information as possible in a short time frame. The primary audience for an RTE is MSF staff, at field and headquarters level.

    Utilisation/purpose: The main purpose is to provide feedback to operational staff in real time.

    Peer review

    Definition: A review conducted by a team in an equal position, with comparable know-how and experience.

    Utilisation/purpose: Learning from peers for programme improvement.

    Meta-review

    Definition: A comparison/synthesis of different evaluations on a common subject in order to draw overall conclusions and lessons learnt, and/or to compare the approach, methodology and quality of the evaluations.

    Utilisation/purpose: Concise overview/synthesis of findings on a specific subject.

    Retrospects

    Definition: Also called a lessons-learned exercise or After Action Review. A structured working session with key people involved in an (emergency) intervention in order to critically reflect on the successes and shortcomings of the intervention. It may be complemented by a preparatory analysis of key issues.

    Utilisation/purpose: Rapid documentation of lessons learnt in a particular setting and definition of recommendations to improve the next response. Joint/reinforced learning for the team involved.

    Capitalisation

    Definition: Structured reflection on experiences and lessons learnt at the end of a project.

    Utilisation/purpose: Documentation and transfer of knowledge obtained in an intervention.

    Assessment

    Definition: Analysis of a situation in order to identify (health and humanitarian) problems, their sources and consequences, so as to determine the best response. The focus is on the current and future needs of the population.

    Utilisation/purpose: Evidence base for a decision on an intervention.

    Socio-cultural assessment / study

    Definition: Examines the socio-cultural characteristics of a society or group of people. This includes everything related to a social group that shares values, norms, knowledge, etc. Culture is seen as a complex, dynamic process that can change constantly.

    Utilisation/purpose: Understanding socio-cultural determinants before project start or at mid-term, to inform operational decisions on culturally adapted activities.

    Audit

    Definition: Objective assessment of either compliance with applicable statutes and regulations ("regularity audit") or of relevance, economy, efficiency and effectiveness ("performance audit").

    Utilisation/purpose: Internal or external accountability.

    Quality of care assessment

    Definition: Cross-sectional study design to assess the structural, process and outcome quality of (health) care, primarily at hospital level.

    Utilisation/purpose: Comprehensive picture of the quality of care in a health structure (or project), including scores for different performance levels.

    The choice of methodology for each assignment depends on the questions to be answered. In most cases it is a mix of qualitative methodologies (in-depth interviews, focus group discussions) and quantitative methodologies (numerical data analysis, surveys, structured observation).

     



    [i] Evaluations may be of different kinds according to:

    • their timing: prior to the action (ex ante), during it (mid-term), or after completion (ex post)

    • who conducts them: the people involved in the action itself (self or internal evaluation), people from the organisation who were not involved in the programme (semi-external), or external consultants (external)

    • their focus: on accountability (summative) or on learning and improving performance (formative)

    • their scope: at project or programme level, or transversal/thematic (across projects) on a specific subject

     

    MSF Evaluation Units, 2016

    http://evaluation.msf.org