Monitoring & Evaluation: A Glossary for Project Success
February 26, 2025
Introduction
This comprehensive glossary provides definitions and practical guidance for terms commonly used in Monitoring and Evaluation (M&E) of development projects and programs. Each entry includes:
- Definition: A clear explanation of the term
- Example: A practical illustration of how the term is applied in M&E contexts
- When to use: Guidance on appropriate contexts for applying the concept
- Related concepts: Similar or connected terms to enhance understanding
This resource is designed for project managers, M&E specialists, donors, implementing partners, and anyone involved in designing, implementing, or assessing development interventions.
A
Absorption Capacity
Definition: The ability of a program, organization, or system to effectively utilize available resources within a given timeframe.
Example: The project faced challenges due to low absorption capacity of local implementing partners, resulting in underspending of allocated funds.
When to use: When analyzing financial performance and implementation pace of projects or organizations.
Related concepts: Implementation Rate, Burn Rate, Resource Utilization
Accountability
Definition: The obligation to account for the use of resources and the decisions made, to demonstrate that work has been done according to agreed rules and standards, and to report fairly and accurately on performance results.
Example: A project manager submits quarterly reports to donors detailing how funds were used and what outcomes were achieved.
When to use: When discussing transparency of project implementation, resource utilization, and reporting obligations.
Related concepts: Transparency, Reporting, Responsibility
Action Research
Definition: A reflective process of progressive problem-solving where practitioners study their own actions to improve practices and address issues.
Example: The health team conducted action research by testing different community engagement methods, documenting outcomes, and adapting approaches based on findings.
When to use: When practitioners want to improve their practice through ongoing cycles of planning, action, observation, and reflection.
Related concepts: Participatory Research, Learning by Doing, Reflective Practice
Activity
Definition: Actions taken or work performed through which inputs such as funds, technical assistance, and other resources are mobilized to produce specific outputs.
Example: Training 50 health workers on HIV testing protocols is an activity that transforms financial and human resources into skilled personnel.
When to use: When describing the specific tasks undertaken in a project to achieve desired outputs.
Related concepts: Inputs, Outputs, Implementation
Activity Tracking
Definition: Monitoring the implementation of planned activities against schedules, targets, and resource allocations.
Example: Weekly activity tracking showed that 70% of planned training sessions were completed on schedule, while community mobilization activities were delayed.
When to use: When regularly monitoring the execution of work plans to ensure timely implementation.
Related concepts: Implementation Monitoring, Progress Tracking, Milestone Tracking
Adaptive Management
Definition: A process that integrates project design, management, and monitoring to provide a framework for testing assumptions, adaptation, and learning.
Example: After the mid-term review showed lower-than-expected participation, the team redesigned the outreach strategy based on community feedback.
When to use: When discussing how projects respond to changing circumstances and incorporate lessons learned.
Related concepts: Learning, Flexibility, Course Correction
Additionality
Definition: The extent to which a development intervention creates benefits or results that would not have occurred without the intervention.
Example: The evaluation demonstrated additionality by showing that private sector investment in agricultural value chains would not have occurred without the initial program support.
When to use: When assessing whether an intervention creates genuine additional value rather than displacing existing activities or resources.
Related concepts: Added Value, Net Impact, Deadweight
After Action Review
Definition: A structured review process for analyzing what happened, why it happened, and how it can be done better.
Example: Following the emergency response, the team conducted an after action review to identify successes, failures, and lessons for future responses.
When to use: When concluding a major activity or milestone to capture immediate learning while experience is fresh.
Related concepts: Lessons Learned, Reflection, Process Review
Annual Review
Definition: An assessment of the performance of a project, conducted yearly to review progress toward objectives.
Example: The annual review meeting brought together all stakeholders to assess achievements against the annual work plan and make recommendations for the next year.
When to use: When discussing regular yearly assessments of project performance.
Related concepts: Review, Performance Assessment, Work Plan
Annual Work Plan and Budget (AWPB)
Definition: The project's annual operational plan, detailing resource allocation and the specific activities planned for the year.
Example: The AWPB outlined monthly targets for community outreach sessions and allocated resources accordingly.
When to use: When discussing operational planning and resource allocation for the upcoming year.
Related concepts: Planning, Budgeting, Implementation
Appreciative Inquiry
Definition: An approach focusing on identifying what works well in an organization or program and leveraging these strengths to create positive change.
Example: The evaluation used appreciative inquiry to identify successful community mobilization strategies that could be amplified and replicated.
When to use: When seeking to build on existing strengths rather than focusing primarily on problems or weaknesses.
Related concepts: Strength-based Approach, Positive Deviance, Solutions Focus
Appraisal
Definition: Assessment of the feasibility and acceptability of a project prior to a funding commitment, according to established decision criteria.
Example: During appraisal, the team assessed whether the proposed intervention was contextually appropriate and financially viable.
When to use: When discussing pre-funding assessment of project viability.
Related concepts: Feasibility, Ex-ante Evaluation, Project Design
Assessment
Definition: A process of gathering information, analyzing it, and making judgments based on the information.
Example: The needs assessment revealed that access to clean water was the community’s highest priority.
When to use: When discussing information gathering to inform decisions or judgments.
Related concepts: Evaluation, Analysis, Judgment
Assumption
Definition: External factors (events, conditions, or decisions) that could affect project progress or success, largely beyond project control but necessary to achieve objectives.
Example: The project design assumed that local government would maintain political support throughout implementation.
When to use: When identifying external factors that may impact project success but are outside direct control.
Related concepts: Risks, External Factors, Preconditions
Attribution
Definition: The causal link between observed changes and a specific intervention, determining the extent to which changes can be linked to the intervention.
Example: The evaluation used a control group to attribute the 15% reduction in malnutrition to the nutrition education program.
When to use: When discussing how to determine whether observed changes are caused by the intervention.
Related concepts: Causality, Contribution, Impact
Attribution Gap
Definition: The difference between observed changes and changes that can be credibly attributed to a specific intervention.
Example: The attribution gap was significant because improved nutrition outcomes were influenced by both the program and concurrent improvements in economic conditions.
When to use: When discussing limitations in establishing causal relationships between interventions and observed changes.
Related concepts: Causality Challenges, Alternative Explanations, Contribution
Audit
Definition: Independent verification of the legality and regularity of resource use, determining whether activities and procedures conform to established norms and criteria.
Example: The annual financial audit verified that all expenditures matched approved budget categories and procurement procedures were followed.
When to use: When discussing verification of compliance with financial and procedural requirements.
Related concepts: Compliance, Verification, Financial Management
B
Backcasting
Definition: Planning approach that starts with defining a desired future and then works backward to identify policies and programs that will connect the future to the present.
Example: The evaluation used backcasting to determine what interventions would be needed to achieve the long-term goal of carbon neutrality by 2050.
When to use: When planning interventions that need to achieve specific long-term outcomes or targets.
Related concepts: Reverse Planning, Visioning, Future-back Thinking
Baseline Drift
Definition: Gradual, unintended change in measurement parameters over time, affecting the comparability of data collected at different points.
Example: Baseline drift occurred when staff turnover led to inconsistent interpretation of indicator definitions over the five-year project period.
When to use: When identifying methodological issues that may affect the validity of trend data over extended periods.
Related concepts: Measurement Error, Data Consistency, Methodological Rigor
Baseline Information
Definition: Facts and figures collected at initial stages of a project that provide a basis for measuring progress toward objectives.
Example: The baseline survey showed that 60% of households lacked access to safe drinking water, providing a reference point to measure improvements.
When to use: When establishing initial conditions before project implementation.
Related concepts: Benchmark, Starting Point, Pre-intervention Measurement
Baseline Study/Survey
Definition: Analysis describing the situation in a project area prior to intervention, allowing progress assessment and comparison over time.
Example: The baseline study documented existing agricultural practices before introducing new techniques.
When to use: When gathering comprehensive pre-intervention data to enable future comparisons.
Related concepts: Situation Analysis, Pre-intervention Assessment, Reference Point
Benchmark
Definition: Reference point or standard against which performance or achievements can be compared.
Example: The program used WHO guidelines as benchmarks to assess the quality of maternal healthcare services.
When to use: When establishing standards for comparison or performance targets.
Related concepts: Standards, Targets, Reference Points
Benchlearning
Definition: Structured process of learning from the practices and experiences of others to improve performance.
Example: The benchlearning exercise brought together six similar projects to compare implementation approaches and identify successful strategies for reaching youth.
When to use: When facilitating learning between comparable programs or organizations to identify and adapt good practices.
Related concepts: Peer Learning, Best Practice Exchange, Comparative Learning
Beneficiaries
Definition: Individuals, groups, or organizations who benefit directly or indirectly from the development intervention.
Example: Primary beneficiaries included 500 smallholder farmers, while indirect beneficiaries included their families and local markets.
When to use: When identifying who receives benefits from an intervention.
Related concepts: Target Group, Stakeholders, Recipients
Beneficiary Assessment
Definition: An approach to evaluation that involves systematic consultation with project beneficiaries to assess the value of activities to their lives.
Example: The beneficiary assessment used in-depth interviews with program participants to understand how the vocational training had affected their livelihoods.
When to use: When seeking to understand how beneficiaries perceive project benefits and relevance to their needs.
Related concepts: Participatory Assessment, Client Feedback, User Perspectives
Beneficiary Feedback Mechanism
Definition: A formal system enabling project participants to provide input on activities, express concerns, and influence decision-making.
Example: The water project established community feedback boxes, SMS reporting, and quarterly community meetings to gather user perspectives.
When to use: When implementing systems to ensure participant voices inform ongoing implementation and adjustments.
Related concepts: Accountability Mechanisms, Grievance Procedures, Community Voice
Beneficiary Targeting
Definition: The process of identifying, selecting, and reaching specific population groups intended to receive program benefits.
Example: The nutrition program used community-based targeting to identify households with children under five at risk of malnutrition.
When to use: When designing mechanisms to ensure program benefits reach intended populations, especially vulnerable or marginalized groups.
Related concepts: Participant Selection, Eligibility Criteria, Inclusion Strategies
Benefit-Cost Ratio
Definition: The ratio of the present value of benefits to the present value of costs of an intervention.
Example: The vaccination campaign had a benefit-cost ratio of 16:1, indicating that every dollar invested generated $16 in economic and health benefits.
When to use: When comparing the economic efficiency of different interventions or making investment decisions.
Related concepts: Return on Investment, Economic Evaluation, Value for Money
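For illustration, here is a minimal sketch of how a discounted benefit-cost ratio might be computed; the cash-flow figures and the 5% discount rate are purely hypothetical.

```python
def present_value(flows, rate):
    """Discount a list of yearly flows (year 0 first) to present value."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# Hypothetical streams: costs are front-loaded, benefits accrue later.
costs = [100_000, 20_000, 20_000, 20_000]   # USD per year
benefits = [0, 60_000, 90_000, 120_000]     # USD per year
rate = 0.05                                 # assumed annual discount rate

bcr = present_value(benefits, rate) / present_value(costs, rate)
print(f"Benefit-cost ratio: {bcr:.2f}")     # a ratio above 1 means benefits exceed costs
```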
Bias
Definition: Systematic error in data collection, analysis, or interpretation that can lead to inaccurate conclusions.
Example: Selection bias occurred when only program participants who completed the full training were included in the outcome assessment.
When to use: When examining the validity and reliability of evaluation findings and identifying potential sources of error.
Related concepts: Validity Threats, Systematic Error, Methodological Limitations
Big Data
Definition: Extremely large datasets that may be analyzed computationally to reveal patterns, trends, and associations related to human behavior and interactions.
Example: The program used big data from mobile phone usage patterns to track population mobility during the disease outbreak.
When to use: When analyzing large, complex datasets to identify patterns that couldn’t be detected with traditional data analysis.
Related concepts: Data Analytics, Machine Learning, Data Mining
Blockchain for M&E
Definition: A distributed ledger technology that can verify and record transactions and information in a transparent, secure, and tamper-proof way.
Example: The aid distribution program used blockchain to track resources from donor to recipient, ensuring transparency and preventing duplication.
When to use: When immutable record-keeping and transparent tracking of resources or outcomes is essential.
Related concepts: Distributed Ledger Technology, Digital Verification, Supply Chain Tracking
Boundary Partner
Definition: Individuals, groups, or organizations with whom a program interacts directly and with whom it can anticipate opportunities for influence.
Example: Local government officials were identified as key boundary partners whose policies and practices the advocacy program sought to influence.
When to use: When identifying and analyzing the stakeholders a program works with directly to bring about change.
Related concepts: Sphere of Influence, Direct Stakeholders, Change Agents
Budget Plan Schedule
Definition: Plan assigning quarterly costs to different activities and subdividing costs based on funding sources.
Example: The budget plan schedule allocated $10,000 for training in Q1, with 70% from donor funds and 30% from government contribution.
When to use: When planning detailed financial allocations over time and by source.
Related concepts: Financial Planning, Expenditure Projection, Resource Allocation
Burden of Disease
Definition: Measurement of the gap between current health status and an ideal health situation where the entire population lives to an advanced age, free from disease and disability.
Example: The health program prioritized interventions targeting conditions representing the highest burden of disease in the region, as measured by disability-adjusted life years.
When to use: When analyzing health priorities and evaluating the potential impact of health interventions.
Related concepts: Health Impact, Morbidity, Mortality, DALY
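The disability-adjusted life year (DALY) mentioned in the example combines mortality and morbidity in one measure. In its basic, undiscounted form:

```latex
\text{DALY} = \text{YLL} + \text{YLD}, \quad
\text{YLL} = N \times L_{\text{exp}}, \quad
\text{YLD} = I \times DW \times L_{\text{dur}}
```

where N is the number of deaths, L_exp the standard life expectancy at age of death, I the number of incident cases, DW the disability weight (0 to 1), and L_dur the average duration of the disability.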
C
Capacity
Definition: The ability of individuals and organizations to perform functions effectively, efficiently, and sustainably.
Example: The NGO had strong capacity in community mobilization but limited capacity in data management.
When to use: When discussing organizational or individual abilities to accomplish objectives.
Related concepts: Capability, Competence, Skills
Capacity-Building
Definition: Processes through which capacity is created or enhanced, a key crosscutting issue in poverty alleviation projects.
Example: The project included workshops, mentoring, and on-the-job training to build staff capacity in results-based management.
When to use: When discussing efforts to strengthen abilities of individuals or organizations.
Related concepts: Training, Organizational Development, Skills Enhancement
Causal Loop Diagram
Definition: Visualization tool that shows how different variables in a system are interrelated through causal relationships and feedback loops.
Example: The causal loop diagram illustrated how improvements in maternal education created reinforcing feedback loops affecting child nutrition, health-seeking behavior, and economic opportunities.
When to use: When mapping complex systems and identifying leverage points for intervention in systems with multiple interdependencies.
Related concepts: Systems Mapping, Feedback Loops, Complexity Analysis
Causal Relationship
Definition: Logical connection or cause-and-effect linkage between related, interdependent results.
Example: The evaluation established a causal relationship between improved water access and reduced diarrheal disease.
When to use: When examining how different results are connected through cause and effect.
Related concepts: Attribution, Impact Pathway, Theory of Change
Causality Analysis
Definition: Study of cause-and-effect relations that link an intervention to its impacts.
Example: Causality analysis revealed that improved crop yields resulted from a combination of better seeds, training, and favorable weather.
When to use: When examining the mechanisms through which an intervention leads to observed changes.
Related concepts: Impact Pathway, Contribution Analysis, Theory of Change
Ceiling Effect
Definition: Limitation in measurement when a significant proportion of subjects score at or near the maximum possible score, reducing the ability to detect improvements.
Example: The literacy assessment showed a ceiling effect among urban children, making it difficult to measure further improvements from the advanced reading program.
When to use: When designing assessments or identifying limitations in measurement tools, especially when tracking change over time.
Related concepts: Measurement Limitations, Data Range, Assessment Design
Change Management
Definition: Structured approach to transitioning individuals, teams, and organizations from a current state to a desired future state.
Example: The project implemented a change management strategy to help health facility staff adapt to the new electronic medical records system.
When to use: When implementing significant changes in systems, procedures, or technologies that affect how people work.
Related concepts: Organizational Development, Transition Management, Adoption Strategies
Client Satisfaction
Definition: The extent to which beneficiaries or service users are satisfied with the quality, relevance, and delivery of services provided.
Example: Client satisfaction surveys revealed that while beneficiaries valued the health services provided, long waiting times significantly reduced overall satisfaction.
When to use: When assessing service quality from the perspective of users and identifying areas for improvement.
Related concepts: User Experience, Service Quality, Client Feedback
Climate-Smart M&E
Definition: Monitoring and evaluation approaches that consider climate change impacts, vulnerabilities, and adaptation in program design and assessment.
Example: The agricultural project’s M&E system tracked both immediate yield improvements and resilience to changing rainfall patterns.
When to use: When evaluating interventions in sectors affected by climate change or specifically focused on climate adaptation.
Related concepts: Environmental Monitoring, Resilience Measurement, Adaptation Tracking
Cluster Evaluation
Definition: Evaluation approach examining a group of similar or related projects to identify common themes, findings, and lessons.
Example: The cluster evaluation of 12 youth employment projects identified common success factors and implementation challenges across different contexts.
When to use: When seeking to understand patterns and generate knowledge across a portfolio of related interventions.
Related concepts: Thematic Evaluation, Cross-project Analysis, Meta-evaluation
Coherence
Definition: The compatibility of an intervention with other interventions in a country, sector, or institution, including consistency with international norms and standards.
Example: The evaluation assessed coherence by examining how well the disaster risk reduction project aligned with national policies, other donor initiatives, and international frameworks.
When to use: When assessing how well interventions align with and complement other relevant efforts and frameworks.
Related concepts: Alignment, Coordination, Complementarity, Harmonization
Collaboration
Definition: Process of working together toward a common goal, sharing knowledge, learning, and building consensus.
Example: Collaboration between health, education, and water departments strengthened the integrated approach to improving school health.
When to use: When discussing how different entities work together on shared objectives.
Related concepts: Partnership, Coordination, Joint Action
Collective Impact
Definition: Approach to tackling complex social problems through a structured form of collaboration across sectors around a common agenda.
Example: The evaluation found that the collective impact initiative had successfully aligned activities of government, business, and civil society actors around shared education outcomes.
When to use: When assessing multi-stakeholder collaborative efforts addressing complex social problems.
Related concepts: Cross-sector Collaboration, Shared Measurement, Backbone Organization
Communication Strategy
Definition: Planned approach to sharing information about a project, its activities, and results with key stakeholders.
Example: The communication strategy included quarterly newsletters, community radio spots, and an annual stakeholder forum to share project progress.
When to use: When planning systematic approaches to information sharing for transparency and engagement.
Related concepts: Outreach Plan, Information Dissemination, Stakeholder Engagement
Community Participation
Definition: Active involvement of community members in local development activities, ranging from token involvement to empowerment-oriented decision-making.
Example: Community participation in the water management project included needs identification, implementation planning, and ongoing system maintenance.
When to use: When discussing how local people are involved in development initiatives affecting them.
Related concepts: Stakeholder Engagement, Participatory Approaches, Ownership
Community Score Card
Definition: Participatory tool that solicits user perceptions of the quality, efficiency, and transparency of services, compares them with providers' self-evaluation, and facilitates dialogue.
Example: The community score card process revealed discrepancies between health providers’ assessment of service quality and users’ experiences, leading to agreed improvement actions.
When to use: When promoting dialogue between service providers and users to improve accountability and service quality.
Related concepts: Social Accountability, Citizen Report Card, Service Quality Assessment
Completion
Definition: Final phase in the project cycle when a completion report is produced, lessons learned are identified, and project closure activities take place.
Example: During completion, the team documented successful approaches for future replication and conducted handover meetings with local partners.
When to use: When discussing project closure and documentation of final results.
Related concepts: Closure, Project Cycle, Final Evaluation
Completion Evaluation
Definition: External evaluation conducted after project completion to assess overall achievements and impact.
Example: The completion evaluation conducted six months after project closure revealed sustained benefits and identified factors critical to success.
When to use: When assessing final outcomes and sustainability after project closure.
Related concepts: Ex-post Evaluation, Impact Assessment, Final Assessment
Completion Report
Definition: Document detailing the project situation at closure, including achievements, challenges, and lessons learned.
Example: The completion report highlighted how the project exceeded targets for community centers but faced challenges in achieving women’s participation.
When to use: When documenting final project status, achievements, and lessons.
Related concepts: Final Report, Project Documentation, Lessons Learned
Complexity-Aware Monitoring
Definition: Monitoring approaches that are designed to operate in complex, dynamic environments where linear cause-effect relationships are not easily discernible.
Example: The complexity-aware monitoring system tracked emergent outcomes, changing relationships between actors, and real-time adaptation rather than only predefined indicators.
When to use: When monitoring interventions in unpredictable environments where activities and strategies need to adapt continuously.
Related concepts: Adaptive Monitoring, Emergent Learning, Systems Monitoring
Conceptual Model
Definition: Diagram depicting relationships between factors believed to impact or lead to a target condition, forming the foundation of project design and monitoring.
Example: The conceptual model illustrated how improved agricultural practices, market access, and weather information would lead to increased farmer incomes.
When to use: When visually representing the theory of how change will happen in a project.
Related concepts: Theory of Change, Results Chain, Logic Model
Confidence Interval
Definition: A range of values that is likely to contain the true value of a population parameter with a specific level of confidence.
Example: Survey results showed that 45% of farmers adopted the new technique, with a 95% confidence interval of 41-49%, indicating the reliability of the estimate.
When to use: When reporting survey results to indicate the precision and reliability of estimates.
Related concepts: Statistical Precision, Margin of Error, Sampling Error
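A minimal sketch of the normal-approximation (Wald) interval behind figures like those in the example; the sample size of 600 and the number of adopters are hypothetical.

```python
import math

n, adopters = 600, 270        # hypothetical sample size and number of adopters
p = adopters / n              # sample proportion (0.45)
z = 1.96                      # z-score for 95% confidence

margin = z * math.sqrt(p * (1 - p) / n)
print(f"Adoption rate: {p:.0%}, 95% CI: {p - margin:.0%}-{p + margin:.0%}")
# -> Adoption rate: 45%, 95% CI: 41%-49%
```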
Conflict Sensitivity
Definition: Ability of an organization to understand its operating context, recognize how its interventions interact with conflict, and act to minimize negative impacts.
Example: The M&E system included regular conflict analysis to ensure aid distribution wouldn’t exacerbate tensions between different community groups.
When to use: When working in conflict-affected contexts where projects may interact with conflict dynamics.
Related concepts: Do No Harm, Peace and Conflict Impact Assessment, Context Analysis
Control Group
Definition: Specially selected subgroup that does not receive the intervention, allowing comparison with the target group to measure intervention effects.
Example: By comparing literacy rates between the control and treatment groups, researchers determined the literacy program increased reading scores by 35%.
When to use: When designing rigorous impact evaluations requiring comparison groups.
Related concepts: Comparison Group, Counterfactual, Experimental Design
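As a sketch of the underlying comparison, assuming hypothetical post-test scores for the two groups (a real impact evaluation would also verify randomization and test statistical significance):

```python
# Hypothetical post-intervention reading scores (out of 100).
treatment = [72, 68, 75, 80, 66, 74, 71, 77]
control   = [55, 60, 52, 58, 61, 54, 57, 59]

# Simple difference in group means as the estimated effect.
effect = sum(treatment) / len(treatment) - sum(control) / len(control)
print(f"Estimated program effect: {effect:.1f} points")
```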
Contribution Analysis
Definition: Approach for assessing causal questions and inferring causality in real-life program evaluations when experimental designs are not possible.
Example: Contribution analysis helped determine the program’s role in reducing child mortality alongside other interventions and external factors.
When to use: When determining the extent to which observed results are due to program activities in complex environments with multiple influences.
Related concepts: Theory-Based Evaluation, Causal Inference, Alternative Explanation Assessment
Contribution Claim
Definition: An evidence-based assertion about the contribution an intervention has made to observed outcomes, acknowledging other influencing factors.
Example: The contribution claim stated that the advocacy program, alongside allied organizations, contributed significantly to policy change by providing evidence and mobilizing public support.
When to use: When making justified assertions about an intervention’s influence on outcomes in complex contexts with multiple influencing factors.
Related concepts: Attribution Analysis, Causal Inference, Evidence-based Claims
Cooperating Institution
Definition: Organization responsible for loan administration and project supervision on behalf of the funding agency.
Example: The World Bank acted as the cooperating institution, supervising implementation and providing technical support.
When to use: When discussing institutional arrangements for project oversight and fiduciary management.
Related concepts: Implementing Agency, Oversight, Supervision
Cost-Benefit Analysis
Definition: Comparison of investment and operating costs with direct benefits or impact generated by the investment, using various methods to express results.
Example: The cost-benefit analysis showed that every dollar invested in the vaccination program saved $16 in healthcare costs.
When to use: When assessing economic efficiency and return on investment.
Related concepts: Economic Analysis, Return on Investment, Efficiency
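A closely related summary statistic is net present value, which expresses the same comparison as a difference rather than a ratio; here B_t and C_t are benefits and costs in year t, and r is the discount rate:

```latex
\text{NPV} = \sum_{t=0}^{T} \frac{B_t - C_t}{(1 + r)^t}
```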
Cost Driver
Definition: A factor that causes a change in the cost of an activity or project.
Example: Analysis identified remote location of project sites as the primary cost driver, accounting for 35% of the difference in implementation costs compared to similar projects.
When to use: When analyzing factors that influence costs to improve efficiency and resource allocation decisions.
Related concepts: Cost Factors, Expense Analysis, Financial Efficiency
Cost Effectiveness
Definition: Comparison of relative costs of achieving a given result by different means, especially useful when benefits are difficult to monetize.
Example: The analysis found mobile clinics were more cost-effective than fixed facilities for delivering prenatal care in remote areas.
When to use: When comparing alternative approaches to achieve the same outcome based on cost.
Related concepts: Efficiency, Value for Money, Alternative Analysis
Cost-Utility Analysis
Definition: Economic evaluation that compares different interventions in terms of their cost and their utility, often measured in quality-adjusted life years (QALYs).
Example: The cost-utility analysis showed that the new treatment protocol cost $3,500 per QALY gained compared to the standard protocol.
When to use: When comparing health interventions based on both their costs and their effects on both length and quality of life.
Related concepts: QALY, Health Economics, Comparative Effectiveness Research
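The per-QALY figure in the example is an incremental ratio comparing the new protocol (subscript 1) with the standard one (subscript 0):

```latex
\text{Cost per QALY gained} = \frac{C_1 - C_0}{Q_1 - Q_0}
```

where C is the total cost of each protocol and Q the QALYs it produces.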
Counterfactual
Definition: What would have happened to beneficiaries in the absence of the intervention; used to determine attribution.
Example: The comparison group provided a counterfactual scenario showing what literacy rates would have been without the educational intervention.
When to use: When attempting to establish causal relationships between interventions and observed changes.
Related concepts: Control Group, Comparison, Attribution
Critical Assumption
Definition: Important external factor that influences activity success but over which managers have no control.
Example: A critical assumption was that political stability would continue throughout the implementation period.
When to use: When identifying key external factors that must hold true for project success.
Related concepts: Risks, External Factors, Preconditions
Critical Reflection
Definition: Questioning and analyzing experiences, observations, theories, beliefs, and assumptions.
Example: The team engaged in critical reflection about why certain communities participated more actively than others, leading to revised engagement strategies.
When to use: When examining underlying causes, assumptions, and biases affecting project performance.
Related concepts: Learning, Adaptive Management, Reflection
Cross-sectional Study
Definition: Research design that collects data from a population at a specific point in time to examine the relationship between variables of interest.
Example: The cross-sectional study gathered data from 1,200 households to analyze the relationship between water access, hygiene practices, and health outcomes.
When to use: When seeking to understand the current situation or relationships between variables without tracking changes over time.
Related concepts: Prevalence Study, Snapshot Analysis, Status Assessment
Cultural Competence
Definition: The ability to understand, communicate with, and effectively interact with people across different cultures in evaluation practice.
Example: Cultural competence was demonstrated by adapting data collection methods to align with indigenous communication patterns and decision-making processes.
When to use: When designing and implementing evaluations in cross-cultural contexts or with diverse stakeholder groups.
Related concepts: Cultural Sensitivity, Inclusivity, Cross-cultural Communication
Cultural Responsiveness
Definition: Ability to recognize, respect, and respond effectively to cultural factors in the design, implementation, and evaluation of programs.
Example: The evaluation methodology was adapted to include storytelling approaches that aligned with indigenous knowledge systems and communication norms.
When to use: When working with culturally diverse populations to ensure evaluation methods respect and reflect their values and perspectives.
Related concepts: Cultural Competence, Cultural Sensitivity, Indigenous Evaluation
D
Data
Definition: Specific quantitative and qualitative information or facts that are collected and analyzed.
Example: The project collected data on attendance, test scores, and teacher perceptions to evaluate the educational intervention.
When to use: When discussing the information collected to inform decision-making or assessment.
Related concepts: Information, Evidence, Metrics
Data Collection Tools
Definition: Instruments and approaches used to gather information for monitoring and evaluation purposes.
Example: The evaluation used household surveys, focus group discussions, and health clinic records as data collection tools.
When to use: When discussing methods for gathering information from various sources.
Related concepts: Methodology, Research Instruments, Surveys
Data Dashboard
Definition: Visual display of key performance indicators, metrics, and data points that enable quick understanding of current performance.
Example: The project dashboard displayed real-time information on beneficiary reach, activity completion rates, and budget utilization.
When to use: When presenting complex data in an accessible format for decision-makers and stakeholders.
Related concepts: Data Visualization, Management Information Systems, Performance Monitoring
Data Disaggregation
Definition: Process of breaking down data by demographic, geographic, or other characteristics to reveal patterns, trends, and disparities.
Example: Data disaggregation by gender, age, and disability status revealed that program benefits were not reaching older women with disabilities.
When to use: When examining how outcomes vary across different subgroups or identifying inequities in program access or benefits.
Related concepts: Stratified Analysis, Demographic Analysis, Equity Assessment
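A minimal sketch of disaggregation in pandas; the dataset and column names are illustrative.

```python
import pandas as pd

# Hypothetical monitoring records.
df = pd.DataFrame({
    "sex":       ["F", "M", "F", "M", "F", "F"],
    "age_group": ["18-35", "36-60", "60+", "18-35", "60+", "36-60"],
    "benefited": [1, 1, 0, 1, 0, 1],
})

# Share of participants who benefited, broken down by sex and age group.
print(df.groupby(["sex", "age_group"])["benefited"].mean())
```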
Data Ethics
Definition: Moral principles guiding the collection, analysis, and use of data, especially concerning privacy, consent, and potential harm.
Example: The evaluation team established data ethics protocols ensuring informed consent, confidentiality, and secure data storage.
When to use: When handling sensitive data or working with vulnerable populations to ensure ethical standards are maintained.
Related concepts: Privacy, Informed Consent, Data Protection
Data Interoperability
Definition: Ability of different information systems, devices, or applications to access, exchange, and cooperatively use data.
Example: The health monitoring systems across different implementing partners were designed with data interoperability standards, allowing unified reporting.
When to use: When multiple systems need to share or combine data for comprehensive analysis or reporting.
Related concepts: Data Standards, Systems Integration, Information Exchange
Data Quality Assessment
Definition: Systematic review of data to ensure it meets quality standards for accuracy, reliability, completeness, precision, timeliness, and integrity.
Example: Quarterly data quality assessments identified reporting inconsistencies that were addressed through additional training and supervision.
When to use: When verifying that data used for decision-making and reporting meets established quality standards.
Related concepts: Data Verification, Quality Control, Information Integrity
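As an illustration of one quality dimension (completeness), a minimal sketch using a hypothetical dataset:

```python
import pandas as pd

# Hypothetical monitoring records with some missing values.
df = pd.DataFrame({
    "facility_id":  [101, 102, 103, 104],
    "visit_date":   ["2025-01-10", None, "2025-01-12", "2025-01-15"],
    "clients_seen": [34, 28, None, 41],
})

# Completeness: share of non-missing values per column.
print((1 - df.isna().mean()).round(2))
```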
Data Revolution
Definition: Transformation in how data is produced and used to drive decision-making, innovation, and citizen empowerment in development contexts.
Example: The data revolution enabled real-time monitoring of health indicators through mobile data collection rather than paper-based quarterly reports.
When to use: When discussing fundamental shifts in data availability, quality, usability, and analysis in development contexts.
Related concepts: Open Data, Big Data, Data for Development
Data Saturation
Definition: Point in qualitative data collection when additional data no longer generates new insights or themes.
Example: After conducting 15 interviews with health workers, data saturation was reached as subsequent interviews yielded no new perspectives on implementation challenges.
When to use: When determining adequate sample size for qualitative research or deciding when to conclude data collection.
Related concepts: Theoretical Saturation, Information Redundancy, Qualitative Sampling
Deadweight
Definition: Outcomes that would have occurred even without the intervention.
Example: Analysis suggested that 30% of the observed employment increase represented deadweight, as these individuals would likely have found jobs without the training program.
When to use: When assessing the net impact of an intervention by accounting for changes that would have happened anyway.
Related concepts: Counterfactual, Natural Change, Additionality
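Using the example's 30% deadweight share and a hypothetical 400 observed job placements, the net effect is what remains after deadweight is subtracted:

```latex
\text{Net effect} = \text{observed change} \times (1 - \text{deadweight share}) = 400 \times (1 - 0.30) = 280 \ \text{jobs}
```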
Developmental Evaluation
Definition: Evaluation approach that supports innovation by collecting and analyzing real-time data in complex, dynamic environments.
Example: Developmental evaluation provided ongoing feedback during the pilot phase, allowing rapid iteration of the community banking model.
When to use: When evaluating innovative, emergent, and complex initiatives where goals and implementation approaches evolve over time.
Related concepts: Adaptive Evaluation, Real-time Evaluation, Innovation Support
Diffusion of Effects
Definition: The spread of intervention effects beyond the immediate target population or area.
Example: Diffusion of effects was observed when non-participating farmers in neighboring villages began adopting techniques learned from program participants.
When to use: When examining the wider spread of program influence beyond direct beneficiaries or intervention areas.
Related concepts: Spillover Effects, Demonstration Effects, Knowledge Diffusion
Digital Data Collection
Definition: Use of electronic devices and applications to gather, store, and transmit information for monitoring and evaluation purposes.
Example: The project used tablet-based surveys with built-in validation checks, reducing errors and enabling real-time data analysis.
When to use: When implementing efficient, accurate data collection systems that reduce paper use and data entry time.
Related concepts: Mobile Data Collection, Electronic Surveys, Computer-Assisted Interviewing
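A minimal sketch of the kind of built-in validation check such a system might apply as records are entered; field names and plausibility ranges are illustrative.

```python
def validate_record(record):
    """Flag implausible values before a survey record is accepted."""
    errors = []
    if not 0 <= record.get("age", -1) <= 120:
        errors.append("age out of range")
    if record.get("household_size", 0) < 1:
        errors.append("household_size must be at least 1")
    return errors

print(validate_record({"age": 230, "household_size": 4}))  # -> ['age out of range']
```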
Disaggregated Data
Definition: Data that has been broken down by detailed sub-categories, for example by marginalized group, gender, region, or level of education.
Example: Vaccination coverage data was disaggregated by gender, age group, and geographical location to identify underserved populations.
When to use: When analyzing whether program benefits reach all intended groups and identifying disparities or inequities.
Related concepts: Stratified Analysis, Demographic Breakdown, Equity Analysis
Do No Harm
Definition: Principle that interventions should avoid causing inadvertent harm to target communities or exacerbating existing conflicts or inequalities.
Example: The food distribution program conducted regular context analysis to ensure aid wasn’t creating dependency or disrupting local markets.
When to use: When planning, implementing, and evaluating interventions in fragile contexts or with vulnerable populations.
Related concepts: Conflict Sensitivity, Humanitarian Principles, Unintended Consequences
Downward Accountability
Definition: Process by which development organizations are held accountable to partners and marginalized groups, typically through greater participation and transparency.
Example: The project established community feedback mechanisms and shared monitoring results publicly as part of downward accountability.
When to use: When discussing accountability toward beneficiaries rather than just donors.
Related concepts: Transparency, Participation, Responsiveness
E
Economic Evaluation
Definition: Application of analytical techniques to identify, measure, value, and compare costs and outcomes of alternative interventions.
Example: The economic evaluation compared three vaccination strategies based on costs per disability-adjusted life year (DALY) averted.
When to use: When assessing the economic value and efficiency of interventions.
Related concepts: Cost-Benefit Analysis, Cost-Effectiveness, Efficiency
Effect
Definition: Intended or unintended change resulting directly or indirectly from a development intervention.
Example: An unexpected effect of the agricultural program was improved household nutrition from increased crop diversity.
When to use: When discussing changes resulting from an intervention without specifying causality strength.
Related concepts: Impact, Outcome, Results
Effectiveness
Definition: Extent to which a project attains its objectives at the goal or purpose level under normal conditions in a real-life setting.
Example: The HIV prevention program was effective at increasing condom use among the target population.
When to use: When assessing whether a project achieved its intended results in real-world conditions.
Related concepts: Results, Achievement, Success
Effectiveness Index
Definition: Composite measure combining multiple indicators to provide an overall assessment of program effectiveness.
Example: The effectiveness index combined measures of reach, service quality, behavior change, and sustainability to compare performance across program sites.
When to use: When needing to synthesize multiple performance dimensions into a single metric for comparison or tracking.
Related concepts: Composite Indicators, Performance Index, Weighted Scoring
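A minimal sketch of a weighted composite, assuming hypothetical dimension scores already normalized to a 0-1 scale and illustrative weights that sum to 1:

```python
# Hypothetical normalized scores (0-1) for one program site.
scores  = {"reach": 0.80, "quality": 0.65, "behavior_change": 0.50, "sustainability": 0.70}
weights = {"reach": 0.25, "quality": 0.25, "behavior_change": 0.30, "sustainability": 0.20}

index = sum(scores[k] * weights[k] for k in scores)
print(f"Effectiveness index: {index:.2f}")   # -> 0.65
```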
Efficacy
Definition: Extent to which an intervention produces expected results under ideal conditions in a controlled environment.
Example: Clinical trials demonstrated the efficacy of the treatment, showing 95% success rates under controlled conditions.
When to use: When discussing potential for success under ideal or optimal conditions.
Related concepts: Effectiveness, Controlled Results, Potential
Efficiency
Definition: Measure of how economically inputs (funds, expertise, time) are converted into outputs and results.
Example: The project delivered training to 2,000 farmers at 30% lower cost than similar initiatives, demonstrating high efficiency.
When to use: When assessing the relationship between resources used and results achieved.
Related concepts: Cost-Effectiveness, Value for Money, Resource Utilization
Efficiency Analysis
Definition: Assessment of how well resources (funds, expertise, time) are converted into results, examining alternative approaches to achieve the same outcomes.
Example: The efficiency analysis determined that the community-based approach delivered similar results at 40% lower cost than the previous centralized model.
When to use: When comparing alternative implementation models or seeking to optimize resource use.
Related concepts: Cost-effectiveness, Resource Optimization, Comparative Efficiency
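A minimal sketch of a unit-cost comparison; the totals are hypothetical but chosen to mirror the 40% difference in the example.

```python
# Hypothetical totals for two delivery models reaching the same number of people.
models = {
    "centralized":     {"cost": 500_000, "people_served": 4_000},
    "community_based": {"cost": 300_000, "people_served": 4_000},
}

for name, m in models.items():
    print(f"{name}: ${m['cost'] / m['people_served']:.2f} per person served")
# centralized: $125.00; community_based: $75.00 (40% lower)
```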
Emergent Outcomes
Definition: Unforeseen or unplanned results that arise from complex interactions within a program or between a program and its context.
Example: An emergent outcome of the women’s economic empowerment program was the spontaneous formation of advocacy groups addressing gender-based violence.
When to use: When identifying and documenting unexpected results and adaptively responding to them in complex programs.
Related concepts: Unintended Consequences, Complex Outcomes, Adaptive Response
Empowerment
Definition: Process of increasing the capacity of individuals or groups to make choices and transform those choices into desired actions and outcomes.
Example: The women’s economic empowerment program enabled participants to start businesses and increase decision-making at household level.
When to use: When discussing initiatives that strengthen agency and capacity of marginalized groups.
Related concepts: Capacity Building, Agency, Participation
Empowerment Evaluation
Definition: Evaluation approach focused on fostering self-determination among program participants by building their capacity to plan, implement, and evaluate their own programs.
Example: The empowerment evaluation helped community groups develop skills to define success criteria, collect data, and use findings to improve their own initiatives.
When to use: When seeking to build evaluation capacity and self-determination among program participants and implementing organizations.
Related concepts: Participatory Evaluation, Capacity Building, Self-determination
Equity
Definition: The absence of avoidable or remediable differences among groups of people defined socially, economically, demographically, or geographically.
Example: The evaluation found that while overall outcomes improved, equity concerns remained as benefits were not reaching the most marginalized communities.
When to use: When examining whether program benefits are fairly distributed and whether disparities between different groups are being addressed.
Related concepts: Fairness, Justice, Equality, Inclusion
Equity-Focused Evaluation
Definition: Evaluation that particularly assesses the extent to which an intervention addresses systemic inequalities and reaches the most disadvantaged groups.
Example: The equity-focused evaluation examined whether the education program succeeded in improving outcomes for girls, ethnic minorities, and children with disabilities.
When to use: When assessing how well interventions address disparities and promote fairness in access to benefits.
Related concepts: Social Justice Evaluation, Gender-Responsive Evaluation, Rights-Based Approach
Ethical Evaluation
Definition: Practice of conducting evaluations in accordance with ethical principles such as respect, beneficence, justice, and minimizing harm.
Example: The ethical evaluation framework ensured proper consent procedures, confidentiality protections, and culturally sensitive data collection methods.
When to use: When establishing standards for responsible and respectful evaluation practice, particularly with vulnerable populations.
Related concepts: Research Ethics, Professional Standards, Ethical Frameworks
Evaluability
Definition: Extent to which an activity or project can be evaluated in a reliable and credible fashion.
Example: The evaluability assessment found the project lacked baseline data and clear indicators, making rigorous impact evaluation difficult.
When to use: When assessing whether conditions exist for meaningful evaluation.
Related concepts: Evaluation Design, Feasibility, Methodology
Evaluation
Definition: Systematic examination of a planned, ongoing, or completed project to determine its merit, worth, and lessons learned.
Example: The mid-term evaluation revealed implementation bottlenecks and recommended adjusting the targeting strategy.
When to use: When assessing project value, effectiveness, and learning opportunities through systematic inquiry.
Related concepts: Assessment, Appraisal, Review
Evaluation Criteria
Definition: Set of standards used to assess the value of an intervention, typically including relevance, effectiveness, efficiency, impact, and sustainability.
Example: The evaluation used the OECD-DAC criteria as a framework for assessing the agricultural development program.
When to use: When establishing a comprehensive framework for assessing different aspects of program performance.
Related concepts: Assessment Framework, Evaluative Standards, Performance Dimensions
Evaluation Design
Definition: Overall plan for conducting an evaluation, including methodology, data sources, analysis approach, and timing.
Example: The evaluation design incorporated mixed methods to triangulate findings about program outcomes and implementation processes.
When to use: When planning how to structure and conduct an evaluation to answer key questions reliably.
Related concepts: Methodology, Evaluation Plan, Research Design
Evaluation Matrix
Definition: Planning tool that links evaluation questions with data sources, methods, and analysis strategies.
Example: The evaluation matrix outlined specific indicators, data collection methods, and sampling approaches for each evaluation question.
When to use: When planning an evaluation to ensure comprehensive and methodical data collection and analysis.
Related concepts: Evaluation Framework, Research Matrix, Methodological Plan
Evaluation Synthesis
Definition: Systematic approach to organizing and combining information from multiple evaluations to develop comprehensive evidence about program effectiveness.
Example: The evaluation synthesis combined findings from 23 evaluations of similar programs to identify consistent success factors and common implementation challenges.
When to use: When seeking to generate broader insights and evidence base from multiple individual evaluations.
Related concepts: Meta-evaluation, Evidence Synthesis, Knowledge Aggregation
Evaluative Rubric
Definition: A tool that defines criteria for assessing performance and describes what performance looks like at different levels of quality or achievement.
Example: The evaluative rubric clearly defined what “excellent,” “good,” “adequate,” and “poor” sustainability looked like across multiple dimensions.
When to use: When establishing transparent, consistent standards for making evaluative judgments about program performance.
Related concepts: Assessment Criteria, Performance Standards, Quality Definitions
Evidence
Definition: Information that serves as proof or support for conclusions, encompassing a range of data types from different sources.
Example: The evaluation drew on multiple forms of evidence, including quantitative survey data, qualitative interviews, and administrative records.
When to use: When discussing the basis for findings, conclusions, or recommendations.
Related concepts: Data, Proof, Information, Facts
Evidence-Based Decision Making
Definition: Use of the best available evidence from research, context, and experience to inform policy and practice decisions.
Example: The ministry used findings from four rigorous evaluations to inform the redesign of the national nutrition program.
When to use: When discussing how evidence informs practical decisions about program design, implementation, or policy.
Related concepts: Data-Driven Decisions, Research Utilization, Knowledge Translation
Ex-Ante Evaluation
Definition: Assessment conducted before implementation to estimate potential impacts, risks, and feasibility.
Example: The ex-ante evaluation modeled potential economic returns and social impacts of three different program approaches.
When to use: When assessing proposed interventions to inform design decisions or funding allocations.
Related concepts: Feasibility Study, Appraisal, Prospective Assessment
Ex-Post Evaluation
Definition: Evaluation conducted some time after program completion to assess long-term impacts and sustainability of results.
Example: The ex-post evaluation conducted three years after project closure found that 70% of community water committees remained functional and infrastructure was still maintained.
When to use: When assessing enduring changes and sustainability of benefits after external support has ended.
Related concepts: Impact Assessment, Sustainability Evaluation, Long-term Follow-up
External Evaluation
Definition: Evaluation carried out by entities independent from those implementing the project.
Example: The donor commissioned an external evaluation team with no prior involvement in the project to ensure objective assessment.
When to use: When independent, objective assessment is required, especially for accountability.
Related concepts: Independent Assessment, Third-Party Evaluation, Objectivity
External Validity
Definition: The extent to which evaluation findings can be generalized to other contexts, populations, or time periods.
Example: The impact evaluation’s external validity was strengthened by implementing the program in diverse settings and analyzing how effects varied by context.
When to use: When discussing the broader applicability and generalizability of evaluation findings beyond the specific program and context studied.
Related concepts: Generalizability, Transferability, Applicability
F
Facility Survey
Definition: Survey of a representative sample of facilities to assess service readiness, infrastructure, supplies, and quality of care.
Example: The facility survey revealed that 70% of health centers lacked essential medicines and trained staff for emergency obstetric care.
When to use: When assessing health or service delivery infrastructure capacity and quality.
Related concepts: Health System Assessment, Quality Assessment, Infrastructure
Facilitator
Definition: Person who helps group members conduct meetings efficiently and effectively without dictating outcomes.
Example: The facilitator guided the stakeholder workshop through prioritization exercises without influencing the final selections.
When to use: When discussing roles for supporting participatory processes and group decision-making.
Related concepts: Moderator, Process Guide, Neutral Party
Feasibility Study
Definition: Analysis to determine whether a proposed project is technically, financially, and operationally viable.
Example: The feasibility study assessed market demand, financial projections, and technical requirements before investing in the irrigation system.
When to use: When determining if a project idea can realistically be implemented.
Related concepts: Appraisal, Viability Assessment, Pre-investment Study
Feedback
Definition: Transmission of evaluation findings to relevant parties to facilitate learning, including collection and dissemination of findings, conclusions, and lessons learned.
Example: Feedback sessions with community members validated findings and generated additional insights about program effects.
When to use: When sharing evaluation results with stakeholders to promote understanding and improvement.
Related concepts: Learning, Communication, Knowledge Sharing
Feedback Loop
Definition: Process by which information about program implementation and results is systematically gathered, analyzed, and used to inform adjustments.
Example: The feedback loop ensured that beneficiary satisfaction data collected monthly was reviewed by management and led to service improvements.
When to use: When establishing systems for continuous learning and adaptation based on performance information.
Related concepts: Learning Cycle, Adaptive Management, Information Flow
Findings
Definition: Factual statements based on evidence from one or more evaluations.
Example: Key findings included 30% increase in crop yields, improved market access, and gender disparities in program benefits.
When to use: When presenting evidence-based observations from evaluations.
Related concepts: Results, Evidence, Observations
Formative Evaluation
Definition: Evaluation conducted during implementation to improve performance, intended for managers and direct supporters of a project.
Example: The formative evaluation after six months of implementation identified bottlenecks in the referral system that required immediate adjustment.
When to use: When seeking to improve ongoing implementation through early feedback.
Related concepts: Process Evaluation, Implementation Assessment, Improvement-Oriented Evaluation
Funding Proposal
Definition: Document outlining a project concept, objectives, implementation approach, and budget to seek financial support.
Example: The funding proposal detailed the community water project’s objectives, implementation approach, budget, and expected outcomes.
When to use: When seeking financial resources from donors or investors for a project.
Related concepts: Project Proposal, Grant Application, Resource Mobilization
G
Gantt Chart
Definition: A project management tool that illustrates a project schedule, showing the start and finish dates of various elements such as activities, milestones, and deliverables.
Example: The evaluation Gantt chart visually displayed the timeline for key activities including instrument development, data collection phases, analysis, and reporting.
When to use: When planning and tracking evaluation or project activities against a timeline (see the plotting sketch below).
Related concepts: Project Schedule, Timeline, Activity Planning
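A Gantt chart can be produced with any general-purpose plotting tool. The following is a minimal sketch in Python using matplotlib; the task names and week numbers are illustrative, not drawn from any real work plan.

```python
import matplotlib.pyplot as plt

# Illustrative evaluation tasks: (name, start week, duration in weeks)
tasks = [
    ("Instrument development", 0, 3),
    ("Data collection", 3, 6),
    ("Analysis", 9, 4),
    ("Reporting", 13, 3),
]

fig, ax = plt.subplots()
for row, (name, start, duration) in enumerate(tasks):
    ax.barh(row, duration, left=start)  # one horizontal bar per task
ax.set_yticks(range(len(tasks)))
ax.set_yticklabels([name for name, _, _ in tasks])
ax.invert_yaxis()                       # first task at the top
ax.set_xlabel("Week")
ax.set_title("Evaluation work plan")
plt.tight_layout()
plt.show()
```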
Gender
Definition: Socially constructed roles, behaviors, activities, and attributes that a given society considers appropriate for men and women.
Example: The M&E system included gender-disaggregated indicators to track differential impacts on men and women.
When to use: When analyzing how project impacts differ based on gender roles and relations.
Related concepts: Gender Analysis, Sex-Disaggregated Data, Gender Equality
Gender Analysis
Definition: Examination of how gender roles, norms, and relations affect program access, participation, benefits, and impacts.
Example: Gender analysis revealed that the timing of training sessions prevented women from participating due to household responsibilities.
When to use: When examining how programs affect and are affected by gender dynamics in a specific context.
Related concepts: Gender Assessment, Gender Mainstreaming, Gender-Responsive Programming
Generalizability
Definition: Extent to which findings can be assumed true for the entire target population, not just the sample studied.
Example: The nationally representative survey design ensured findings could be generalized to all rural households in the country.
When to use: When discussing whether findings from a sample can apply to broader populations.
Related concepts: External Validity, Representativeness, Statistical Inference
Geographic Information System (GIS)
Definition: System designed to capture, store, analyze, and present spatial or geographic data.
Example: GIS mapping showed the correlation between program coverage and reduction in malnutrition rates across different regions.
When to use: When analyzing and visualizing the spatial dimensions of program implementation and results.
Related concepts: Spatial Analysis, Mapping, Geospatial Visualization
Goal
Definition: Broad statement of a desired, usually longer-term, outcome of a program that guides program development.
Example: The project’s goal was to reduce maternal mortality in the region by 30% within five years.
When to use: When stating the highest-level intended change a project contributes to.
Related concepts: Impact, Long-term Objective, Vision
Governance
Definition: Systems, processes, and institutions through which decisions are made, authority is exercised, and accountability is ensured.
Example: The evaluation assessed whether improved local governance resulted in more transparent, participatory decision-making about public resources.
When to use: When examining decision-making structures, accountability mechanisms, and power relations in programs or institutions.
Related concepts: Accountability, Transparency, Institutional Arrangements
Grassroots Organizations
Definition: Community-based organizations that represent primary stakeholders and may serve as implementing partners.
Example: Local farmer associations and women’s groups served as grassroots organizations delivering project activities at community level.
When to use: When discussing community-level organizations involved in project implementation.
Related concepts: Community-Based Organizations, Local Partners, Civil Society
H
Hawthorne Effect
Definition: A type of reactivity in which individuals modify their behavior when aware they are being observed or studied.
Example: The evaluation noted potential Hawthorne effect as health workers demonstrated better protocol adherence during announced monitoring visits than in routine practice.
When to use: When discussing potential biases in data collection due to observed subjects changing their behavior.
Related concepts: Observer Effect, Reactivity, Behavioral Modification
Health Information System (HIS)
Definition: Data system, usually computerized, that routinely collects and reports information about health service delivery, costs, demographics, and health status.
Example: The district health information system tracked monthly data on vaccination coverage, disease incidence, and service utilization.
When to use: When discussing systems for routine health data collection and management.
Related concepts: HMIS, Data Management, Health Metrics
Horizontal Logic
Definition: Component of a logframe that defines how objectives will be measured and verified, forming the basis of the M&E matrix.
Example: The horizontal logic specified indicators, data sources, and assumptions for each level of the results chain.
When to use: When explaining how achievement of project objectives will be measured.
Related concepts: Logframe, Indicators, Means of Verification
Hot Spot Analysis
Definition: Method of identifying geographic clusters where indicators show significantly high or low values compared to the overall average.
Example: Hot spot analysis revealed that child malnutrition was significantly concentrated in three sub-districts, prompting targeted intervention.
When to use: When identifying geographic patterns and targeting interventions to areas of greatest need or opportunity (see the simplified sketch below).
Related concepts: Geographical Targeting, Spatial Analysis, Cluster Detection
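Formal hot spot analysis usually applies local spatial statistics (for example, the Getis-Ord Gi* statistic) to geolocated data. As a simplified, non-spatial stand-in, the sketch below flags sub-districts whose rates sit well above the overall mean; all figures are invented for illustration.

```python
import pandas as pd

# Hypothetical sub-district malnutrition rates (illustrative values only)
df = pd.DataFrame({
    "sub_district": ["A", "B", "C", "D", "E", "F"],
    "malnutrition_rate": [0.12, 0.31, 0.14, 0.29, 0.11, 0.33],
})

# Standardize each rate against the overall mean and spread, then flag
# sub-districts more than one standard deviation above the mean as
# candidate hot spots for targeted intervention.
mean = df["malnutrition_rate"].mean()
std = df["malnutrition_rate"].std()
df["z_score"] = (df["malnutrition_rate"] - mean) / std
df["candidate_hot_spot"] = df["z_score"] > 1.0

print(df)
```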
Human Rights-Based Approach
Definition: Conceptual framework that integrates human rights norms and principles into development programming and evaluation.
Example: The evaluation examined whether the legal aid program increased access to justice for marginalized groups as a fulfillment of their rights.
When to use: When assessing programs through the lens of rights fulfillment, non-discrimination, participation, and accountability.
Related concepts: Rights-Based Programming, Empowerment Evaluation, Justice-Oriented Evaluation
I
Impact
Definition: Long-term, cumulative effect of programs over time on what they ultimately aim to change, such as disease prevalence or poverty reduction.
Example: Five years after project completion, evaluation showed sustained impact through 40% reduction in childhood stunting.
When to use: When discussing long-term, significant changes resulting from interventions.
Related concepts: Long-term Outcomes, Ultimate Results, Transformative Change
Impact Assessment
Definition: Systematic process of identifying and measuring the broader effects, intended or unintended, of a program in an intervention area.
Example: The impact assessment used mixed methods to determine whether improved water access led to education and economic benefits.
When to use: When measuring the broader, longer-term effects of an intervention.
Related concepts: Impact Evaluation, Results Assessment, Effects Measurement
Impact Evaluation
Definition: Evaluation that assesses the changes in target outcomes attributable to a specific intervention using rigorous methods.
Example: The randomized control trial provided strong evidence that the cash transfer program reduced child malnutrition by 25%.
When to use: When determining causal relationships between an intervention and observed changes.
Related concepts: Attribution Analysis, Counterfactual Analysis, Causal Inference
Impact Investing
Definition: Investments made with the intention to generate positive, measurable social and environmental impact alongside a financial return.
Example: Impact investors required rigorous impact measurement to verify that the social enterprise was achieving both financial sustainability and improved livelihoods.
When to use: When discussing investment approaches that require evaluation of both social/environmental and financial performance.
Related concepts: Social Finance, Double Bottom Line, Sustainable Investing
Impact Monitoring
Definition: Tracking of health-related events such as disease prevalence or incidence; often referred to as “surveillance” in public health.
Example: Impact monitoring tracked HIV prevalence rates quarterly in sentinel sites to detect changes in the epidemic.
When to use: When tracking population-level changes in key impact indicators over time.
Related concepts: Surveillance, Population Monitoring, Trend Analysis
Impact Pathway
Definition: Visual representation of the causal chain from project activities through outputs, outcomes, and impacts, showing the mechanisms of change.
Example: The impact pathway illustrated how improved agricultural techniques would lead to higher yields, increased income, better nutrition, and ultimately improved health outcomes.
When to use: When mapping the causal mechanisms through which an intervention is expected to achieve its long-term goals.
Related concepts: Results Chain, Causal Pathway, Theory of Change
Implementation Environment
Definition: The set of external conditions, constraints, and opportunities that affect program implementation and results.
Example: The evaluation analyzed how the political, institutional, and economic implementation environment influenced the effectiveness of policy reform efforts.
When to use: When examining contextual factors that enable or constrain program implementation and effectiveness.
Related concepts: Context Analysis, Enabling Environment, Constraining Factors
Implementation Fidelity
Definition: Extent to which a program is delivered as intended in the original design or protocol.
Example: Implementation fidelity assessment showed that while the core components of the curriculum were consistently delivered, the duration of sessions varied significantly.
When to use: When assessing whether lack of results might be due to implementation failure rather than theory failure.
Related concepts: Program Integrity, Adherence, Compliance
Implementation Research
Definition: Study of methods to promote the systematic uptake of research findings and evidence-based practices into routine practice.
Example: Implementation research identified barriers to adoption of the new treatment protocol and tested strategies to overcome health worker resistance.
When to use: When investigating factors that influence the adoption, implementation, and sustainability of interventions in real-world settings.
Related concepts: Knowledge Translation, Evidence Uptake, Implementation Science
Implementing Partners
Definition: Organizations subcontracted or officially identified in agreements as responsible for implementing defined aspects of a project.
Example: The ministry of health and three NGOs served as implementing partners, each responsible for different geographic zones.
When to use: When discussing organizations directly involved in delivering project activities.
Related concepts: Stakeholders, Service Providers, Collaborating Agencies
Independent Evaluation
Definition: Evaluation carried out by entities free from control by those responsible for designing and implementing the intervention.
Example: The independent evaluation by external consultants provided an objective assessment of project achievements and shortcomings.
When to use: When unbiased, external assessment is needed for credibility and objectivity.
Related concepts: External Evaluation, Third-Party Assessment, Objectivity
Indicator
Definition: Quantitative or qualitative factor providing a simple and reliable means to measure achievement, reflect changes, or assess performance.
Example: The indicator “percentage of pregnant women receiving at least four antenatal visits” measured access to maternal health services.
When to use: When defining specific metrics to track progress toward objectives.
Related concepts: Metrics, Measures, Performance Criteria
Indicator Alignment
Definition: The degree to which selected indicators appropriately measure the intended outputs, outcomes, or impacts of a program.
Example: The indicator alignment assessment found that while output indicators were well-defined, outcome indicators did not adequately capture the program’s intended effects on systemic change.
When to use: When assessing whether monitoring and evaluation frameworks measure what matters most for program success.
Related concepts: Measurement Validity, Indicator Relevance, Results Framework Quality
Indigenous Evaluation
Definition: Evaluation approaches that center indigenous values, knowledge systems, and methods to assess programs in culturally appropriate ways.
Example: The indigenous evaluation framework incorporated traditional storytelling, elder consultations, and community consensus-building in the assessment process.
When to use: When evaluating programs with indigenous communities to ensure cultural appropriateness and respect for knowledge sovereignty.
Related concepts: Culturally Responsive Evaluation, Indigenous Methods, Decolonizing Evaluation
Indirect Effects
Definition: Changes brought about by an intervention beyond its direct, intended effects, often unplanned.
Example: An indirect effect of the road improvement project was increased school attendance as children could travel safely.
When to use: When discussing unintended consequences or ripple effects of interventions.
Related concepts: Unintended Consequences, Spillover Effects, Secondary Impacts
Information Management System
Definition: System for inputting, collating, and organizing data to provide management with the information needed for monitoring and controlling resources, activities, and results.
Example: The project information management system integrated financial, activity, and results data for real-time decision support.
When to use: When discussing systems for organizing and using project data.
Related concepts: MIS, Data Management, Knowledge Management
Input
Definition: Financial, human, and material resources necessary to produce intended outputs of a project.
Example: Project inputs included staff time, training materials, equipment, and operational funding.
When to use: When discussing resources required to implement activities.
Related concepts: Resources, Investments, Requirements
Input and Output Monitoring
Definition: Tracking information about resources used (inputs) and results of program activities (outputs).
Example: Monthly input and output monitoring tracked budget utilization, staff time, and the number of training sessions delivered.
When to use: When tracking operational efficiency and immediate results of activities.
Related concepts: Process Monitoring, Activity Tracking, Resource Utilization
Institutional Assessment
Definition: Systematic analysis of an organization’s capabilities, systems, structures, and processes to identify strengths and areas for improvement.
Example: The institutional assessment revealed strong technical capacity but weak financial management systems that required strengthening.
When to use: When evaluating organizational performance or capacity development needs of implementing partners.
Related concepts: Organizational Diagnosis, Capacity Assessment, Institutional Analysis
Intangible Benefits
Definition: Positive effects of an intervention that are real but difficult to quantify or monetize.
Example: While economic returns were modest, intangible benefits included increased community cohesion, enhanced leadership skills, and strengthened local institutions.
When to use: When identifying and describing important non-quantifiable or non-monetary benefits of interventions.
Related concepts: Qualitative Outcomes, Soft Benefits, Non-economic Value
Integrated Data Management
Definition: Coordinated approach to collecting, storing, and analyzing data from multiple sources and systems.
Example: The integrated data management system combined health facility data, community outreach information, and household survey results in a central dashboard.
When to use: When establishing systems to manage diverse data streams for comprehensive program monitoring and evaluation.
Related concepts: Data Integration, Information Systems, Unified Reporting
Interim Evaluation
Definition: Project evaluation undertaken toward the end of implementation (about one year before closing) when considering a second phase or new project in the same area.
Example: The interim evaluation documented achievements and challenges to inform design of the proposed follow-up project.
When to use: When assessing performance before project completion, especially to inform future phases.
Related concepts: Mid-term Evaluation, Formative Evaluation, Pre-extension Assessment
Internal Evaluation
Definition: Evaluation conducted by individuals who report to the management of the organization responsible for the intervention.
Example: The program team conducted an internal evaluation to quickly identify implementation challenges and make adjustments.
When to use: When rapid feedback and learning are priorities and external perspective is less critical.
Related concepts: Self-Assessment, Internal Review, Organizational Learning
Internal Validity
Definition: The extent to which a causal relationship can be established between the intervention and the observed outcomes, ruling out alternative explanations.
Example: The randomized design strengthened internal validity by ensuring that differences between treatment and control groups could be attributed to the intervention.
When to use: When assessing the strength of evidence for causal claims about program effects.
Related concepts: Causal Inference, Research Design, Methodological Rigor
Intervention
Definition: Specific activity or set of activities intended to bring about change in some aspect of the target population’s status.
Example: The nutrition intervention combined supplementary feeding, education, and growth monitoring to address child malnutrition.
When to use: When discussing specific approaches to address development challenges.
Related concepts: Program, Project, Initiative
Intervention Logic
Definition: Causal chain showing how inputs lead to outputs, outcomes, and impacts through project activities.
Example: The intervention logic showed how improved seeds and training would lead to increased yields, then to higher incomes and reduced poverty.
When to use: When explaining the causal pathway from activities to intended results.
Related concepts: Theory of Change, Results Chain, Logic Model
J
Joint Evaluation
Definition: Evaluation to which different institutions and/or partners contribute.
Example: The joint evaluation involved the donor, government, and NGO partners sharing costs, expertise, and responsibilities.
When to use: When multiple stakeholders want to collaborate on assessing shared initiatives.
Related concepts: Collaborative Assessment, Multi-stakeholder Evaluation, Partnership Evaluation
K
Kappa Statistic
Definition: A measure of inter-rater reliability that assesses the degree of agreement between two or more raters beyond what would be expected by chance.
Example: The data quality assessment reported a Cohen’s kappa of 0.82 for consistent classification of household vulnerability status by different field staff.
When to use: When assessing the reliability of categorical judgments or classifications made by multiple data collectors or analysts (see the calculation sketch below).
Related concepts: Inter-rater Reliability, Agreement Coefficient, Consistency Measurement
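A minimal calculation sketch using scikit-learn's cohen_kappa_score, with invented ratings from two hypothetical field staff. Kappa near 1 indicates strong agreement; kappa near 0 indicates agreement no better than chance.

```python
from sklearn.metrics import cohen_kappa_score

# Vulnerability classifications of the same 10 households by two field staff
# (illustrative labels, not real data)
rater_1 = ["high", "low", "low", "high", "medium",
           "low", "high", "medium", "low", "high"]
rater_2 = ["high", "low", "medium", "high", "medium",
           "low", "high", "medium", "low", "low"]

# Cohen's kappa corrects raw percent agreement for the agreement
# that would be expected by chance alone.
kappa = cohen_kappa_score(rater_1, rater_2)
print(f"Cohen's kappa: {kappa:.2f}")
```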
Key Informant
Definition: Person with specialized knowledge who can provide insights about a community, context, or intervention.
Example: Village elders, health workers, and local officials served as key informants providing historical context and cultural insights for the evaluation.
When to use: When seeking expert or insider perspectives on program context or implementation.
Related concepts: Resource Person, Expert Informant, Local Knowledge Source
Key Performance Indicator (KPI)
Definition: Critical measure used to track performance against strategic objectives, typically focusing on factors most important to success.
Example: The organization’s KPIs included cost per beneficiary reached, adoption rate of promoted practices, and percentage of initiatives showing sustained results after one year.
When to use: When establishing priority metrics for tracking organizational or program performance against strategic goals.
Related concepts: Strategic Metrics, Dashboard Indicators, Performance Measurement
Knowledge Management
Definition: Systematic processes for creating, storing, sharing, and using organizational knowledge to improve performance.
Example: The knowledge management system captured lessons from project evaluations across countries and made them accessible to new project teams.
When to use: When establishing systems to leverage learning and knowledge for improved programming and decision-making.
Related concepts: Organizational Learning, Knowledge Sharing, Information Management
L
Learning
Definition: Reflecting on experience to identify potential improvements and then applying that knowledge in practice.
Example: Quarterly learning sessions analyzed monitoring data to identify successful outreach strategies for replication.
When to use: When discussing how organizations use evidence and experience to improve performance.
Related concepts: Knowledge Management, Adaptive Management, Reflection
Lessons Learned
Definition: Knowledge generated by reflecting on experience that has potential to improve future actions.
Example: A key lesson learned was that involving community leaders from the planning stage significantly improved participation rates.
When to use: When documenting insights from experience to inform future practice.
Related concepts: Best Practices, Knowledge Sharing, Insights
Loan Agreement
Definition: Agreement spelling out project goal, area, components, and budget, containing formal compliance conditions related to procurement, reporting, and financial management.
Example: The loan agreement specified that quarterly financial reports and annual audits were required for continued disbursement.
When to use: When discussing formal contractual arrangements for project financing.
Related concepts: Financing Agreement, Legal Document, Contract
Logic Model
Definition: Visual representation of the theory of change showing the relationships between resources, activities, outputs, outcomes, and impacts.
Example: The logic model illustrated how training community health workers would lead to improved service delivery, increased care seeking, and ultimately better health outcomes.
When to use: When creating a simplified visual representation of program logic for communication or planning purposes.
Related concepts: Program Theory, Results Chain, Conceptual Framework
Logic Testing
Definition: Systematic examination of a program’s theory of change to assess its plausibility, feasibility, and potential gaps or flaws.
Example: Logic testing revealed unrealistic assumptions about how quickly policy changes would translate into behavioral changes among target populations.
When to use: When critically examining the logic and assumptions underlying program design before or during implementation.
Related concepts: Theory Assessment, Assumption Testing, Plausibility Check
Logical Framework Approach (LFA)
Definition: Analytical and management tool involving problem analysis, stakeholder analysis, objective hierarchy development, and implementation strategy selection.
Example: The team used the logical framework approach to systematically analyze the problem, identify stakeholders, and develop coherent objectives.
When to use: When designing projects using structured analytical processes that link problems, solutions, and measurement.
Related concepts: Results Framework, Project Design, Planning Tool
Longitudinal Study
Definition: Research design that involves repeated observations of the same variables over extended periods of time.
Example: The longitudinal study tracked cohorts of program participants over five years to assess how early childhood interventions affected later educational outcomes.
When to use: When examining changes, developments, or trends over time, especially for outcomes that may take time to manifest.
Related concepts: Panel Study, Cohort Study, Trend Analysis
M
M&E Budget
Definition: Financial resources allocated specifically for monitoring and evaluation activities within a project or program.
Example: The M&E budget allocated 7% of total program funds to baseline studies, ongoing monitoring, mid-term and final evaluations, and learning events.
When to use: When planning and tracking resources dedicated to measurement, assessment, and learning activities.
Related concepts: Evaluation Resources, Monitoring Costs, M&E Planning
M&E Framework
Definition: Document outlining how a program will be monitored and evaluated, including indicators, data collection methods, analysis plans, and reporting requirements.
Example: The M&E framework established clear responsibilities, timelines, and methodologies for tracking progress across all project components.
When to use: When establishing the overall approach and system for monitoring and evaluating a program.
Related concepts: Results Framework, Performance Measurement System, Evaluation Plan
M&E Plan
Definition: Detailed document describing how the M&E framework will be operationalized, including specific roles, resources, and timelines.
Example: The M&E plan specified who would collect what data, when, how, and with what resources throughout the project lifecycle.
When to use: When translating an M&E framework into concrete operational guidelines for implementation.
Related concepts: Implementation Plan, Operational Guidelines, Monitoring Strategy
Meta-data
Definition: Structured information that describes, explains, locates, or otherwise makes it easier to retrieve, use, or manage data resources.
Example: The project database included meta-data on each indicator specifying the definition, calculation method, data source, collection frequency, and responsible staff.
When to use: When documenting and organizing information about data to ensure consistent understanding and appropriate use (see the example record below).
Related concepts: Data Documentation, Information Architecture, Data Dictionary
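One lightweight way to hold indicator meta-data is a data-dictionary record per indicator. The sketch below uses a plain Python dictionary with illustrative field names; it is not a standard schema.

```python
import json

# A hypothetical data-dictionary record for one indicator; the field
# names are illustrative, not a standard metadata schema.
indicator_metadata = {
    "indicator": "Percentage of pregnant women receiving at least four antenatal visits",
    "definition": "Women with >= 4 ANC visits / women who gave birth in the period",
    "calculation": "numerator / denominator * 100",
    "data_source": "District health information system",
    "collection_frequency": "quarterly",
    "responsible_staff": "District M&E officer",
}

print(json.dumps(indicator_metadata, indent=2))
```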
Meta-evaluation
Definition: Evaluation of evaluations to assess their quality, credibility, and utility.
Example: The meta-evaluation reviewed 12 project evaluations to assess methodological rigor and identify common weaknesses in evaluation practice.
When to use: When assessing the quality of evaluations or synthesizing findings across multiple evaluation studies.
Related concepts: Quality Assessment, Evaluation Review, Evaluation Synthesis
Methodology
Definition: Systematic approach to data collection and analysis, including research design, sampling, data collection methods, and analytical techniques.
Example: The evaluation methodology combined household surveys, in-depth interviews, and administrative data analysis to triangulate findings.
When to use: When describing the approach to gathering and analyzing information in monitoring and evaluation.
Related concepts: Research Methods, Data Collection Approach, Analytical Strategy
Mid-term Evaluation
Definition: Assessment conducted at the middle of the implementation period to assess progress and make course corrections.
Example: The mid-term evaluation identified implementation bottlenecks and recommended adjustments to the targeting strategy and budget allocation.
When to use: When assessing initial implementation experience to improve performance in the remaining project period.
Related concepts: Formative Evaluation, Interim Assessment, Implementation Review
Milestone
Definition: Significant point or event in a project used to measure development or progress toward ultimate objectives.
Example: The completion of the baseline study, launch of the training program, and establishment of community committees were key project milestones.
When to use: When defining markers of progress for monitoring implementation timelines and achievements.
Related concepts: Benchmark, Checkpoint, Key Deliverable
Monitoring
Definition: Continuous process of collecting and analyzing information to track progress against plans and check compliance with established standards.
Example: Regular monitoring revealed that attendance at training sessions was below target in certain regions, prompting outreach strategy adjustments.
When to use: When tracking implementation progress and identifying issues requiring attention during program execution.
Related concepts: Tracking, Oversight, Surveillance
Monitoring, Evaluation, Research, and Learning (MERL)
Definition: Integrated approach to generating and using evidence throughout the program cycle, combining monitoring, evaluation, research, and intentional learning processes.
Example: The MERL framework ensured that evidence from routine monitoring, formal evaluations, targeted research, and reflective learning all informed program adaptation.
When to use: When establishing comprehensive systems for generating and using different types of evidence to improve program performance.
Related concepts: Evidence-based Management, Learning Organization, Knowledge System
Most Significant Change
Definition: Participatory monitoring and evaluation technique based on collecting and analyzing stories about important changes experienced by stakeholders.
Example: Most Significant Change stories revealed unexpected psychosocial benefits of the economic empowerment program that weren’t captured in quantitative indicators.
When to use: When capturing complex, diverse, and unexpected changes from the perspective of program participants.
Related concepts: Storytelling, Qualitative Monitoring, Narrative Methods
Multi-level Evaluation
Definition: Evaluation approach that examines changes at different levels of a system or intervention, such as individual, organizational, and policy levels.
Example: The multi-level evaluation assessed changes in provider knowledge and practices, health facility capacities, district management systems, and national policies.
When to use: When evaluating complex interventions that operate at multiple levels and seek to create change across a system.
Related concepts: Systems Evaluation, Comprehensive Assessment, Hierarchical Analysis
N
Needs Assessment
Definition: Systematic process for determining gaps between current and desired conditions to guide intervention design.
Example: The needs assessment identified lack of technical knowledge, market access, and credit as key constraints for smallholder farmers.
When to use: When identifying priority problems and their causes before designing interventions.
Related concepts: Situation Analysis, Gap Analysis, Problem Assessment
Negative Case Analysis
Definition: The process of identifying and examining cases that contradict patterns or explanations emerging from data analysis.
Example: Negative case analysis focused on understanding why some communities showed no improvement despite receiving the same intervention as successful communities.
When to use: When refining and strengthening findings by exploring exceptions or contradictions to apparent patterns.
Related concepts: Exception Analysis, Alternative Explanation, Critical Inquiry
Network Analysis
Definition: Method for mapping and measuring relationships between people, groups, organizations, or other entities within a system.
Example: Network analysis of stakeholder relationships revealed that information flowed primarily through district health officers, making them critical nodes for program communication.
When to use: When examining collaboration patterns, information flow, or influence relationships within complex stakeholder systems (see the sketch below).
Related concepts: Social Network Analysis, Systems Mapping, Relationship Mapping
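A minimal sketch using the networkx library on a hypothetical stakeholder network. Betweenness centrality is one common way to surface "critical nodes" like the district health officers in the example above; the actors and links here are invented.

```python
import networkx as nx

# Hypothetical communication links among program stakeholders
edges = [
    ("District health officer", "Clinic A"),
    ("District health officer", "Clinic B"),
    ("District health officer", "NGO partner"),
    ("NGO partner", "Community group"),
    ("Clinic A", "Community group"),
]
graph = nx.Graph(edges)

# Betweenness centrality scores actors by how often they sit on the
# shortest paths between other actors -- a proxy for brokerage of
# information flow within the network.
centrality = nx.betweenness_centrality(graph)
for actor, score in sorted(centrality.items(), key=lambda item: -item[1]):
    print(f"{actor}: {score:.2f}")
```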
Norm-Referenced
Definition: Assessment approach that compares individual or group performance against the performance of a defined reference group or norm.
Example: The norm-referenced analysis showed that the project district’s vaccination rates were in the top quartile of districts nationwide.
When to use: When comparing performance relative to established benchmarks or peer groups rather than against absolute standards.
Related concepts: Comparative Assessment, Benchmarking, Standardized Comparison
O
Objective
Definition: Statement of desired results to be achieved within a specified time frame, described in concrete, measurable terms.
Example: The project objective was to increase agricultural yields by 30% among 10,000 smallholder farmers within three years.
When to use: When defining specific, measurable results the intervention aims to achieve.
Related concepts: Goal, Target, Expected Result
Objective-Based Evaluation
Definition: Evaluation that focuses primarily on determining the extent to which a program’s stated objectives have been achieved.
Example: The objective-based evaluation systematically assessed achievements against each of the five specific objectives defined in the project document.
When to use: When primarily interested in measuring success as defined by the program’s own stated objectives.
Related concepts: Goal-Based Evaluation, Effectiveness Assessment, Objective Achievement
Outcome
Definition: The likely or achieved short-term and medium-term effects of an intervention’s outputs, representing changes in conditions resulting from the intervention.
Example: Improved farming practices (outcome) resulted from the training and extension services (outputs) provided by the project.
When to use: When discussing the changes in behavior, knowledge, skills, or conditions that result from project activities and outputs.
Related concepts: Results, Effects, Changes
Outcome Harvesting
Definition: Evaluation approach that collects evidence of what has changed (outcomes) and then works backward to determine whether and how an intervention contributed to these changes.
Example: Outcome Harvesting identified shifts in government policy that advocates claimed resulted from the program’s policy dialogue and technical assistance.
When to use: When identifying and understanding outcomes in complex situations where cause-effect relationships are not easily established.
Related concepts: Outcome Mapping, Contribution Analysis, Non-linear Assessment
Outcome Mapping
Definition: Methodology focusing on changes in behavior, relationships, actions, or activities of people and organizations directly influenced by a program.
Example: Outcome mapping documented how health officials gradually changed their approach to community engagement after participating in the program.
When to use: When tracking behavioral and relationship changes in boundary partners that contribute to development impacts.
Related concepts: Boundary Partners, Behavior Change, Progress Markers
Output
Definition: The products, capital goods, and services that result from development interventions; includes changes resulting from interventions that are relevant to the achievement of outcomes.
Example: Project outputs included 20 training sessions conducted, 500 farmers trained, and 15 demonstration plots established.
When to use: When discussing the direct, tangible products of activities.
Related concepts: Deliverables, Products, Immediate Results
P
Participatory Evaluation
Definition: Evaluation approach involving stakeholders, especially primary stakeholders, in designing, conducting, and using evaluation.
Example: The participatory evaluation engaged community members in developing evaluation questions, collecting data, and interpreting findings about the water project.
When to use: When seeking to build ownership, empowerment, and utilization of evaluation findings among stakeholders.
Related concepts: Stakeholder Involvement, Collaborative Evaluation, Empowerment Evaluation
Participatory Monitoring
Definition: Process through which primary stakeholders actively engage in monitoring project activities, sharing in decision-making about what to monitor and how findings will be used.
Example: Community health committees conducted participatory monitoring of local health services using simple tools to track service quality and accessibility.
When to use: When enabling beneficiaries and local stakeholders to track and provide feedback on intervention implementation.
Related concepts: Community Monitoring, Citizen Oversight, Feedback Mechanisms
Performance Management
Definition: Using information from monitoring and evaluation to improve effectiveness and efficiency of an organization or intervention.
Example: Performance management meetings reviewed quarterly monitoring data to identify implementation bottlenecks and allocate resources to underperforming areas.
When to use: When establishing systems to regularly use data for operational improvements and strategic decision-making.
Related concepts: Results-Based Management, Data-Driven Decision Making, Continuous Improvement
Portfolio Evaluation
Definition: Assessment of a suite of related projects, programs, or investments to understand collective performance, strategic alignment, and optimization.
Example: The portfolio evaluation examined the complementarity and combined effectiveness of five different educational interventions across the region.
When to use: When assessing the collective value and strategic coherence of multiple related interventions rather than individual projects.
Related concepts: Strategic Evaluation, Program Evaluation, Investment Review
Process Evaluation
Definition: Assessment focusing on how a program is implemented and operates, examining whether it is operating as intended and identifying areas for improvement.
Example: The process evaluation examined barriers to client enrollment, quality of service delivery, and staff adherence to program protocols.
When to use: When assessing implementation quality, fidelity, and efficiency rather than outcomes or impacts.
Related concepts: Implementation Assessment, Formative Evaluation, Quality Assurance
Program
Definition: Group of related projects managed in a coordinated way to obtain benefits and control not available from managing them individually.
Example: The maternal and child health program encompassed multiple projects on nutrition, immunization, and antenatal care.
When to use: When referring to a coordinated set of activities or projects with common objectives.
Related concepts: Initiative, Portfolio, Strategic Framework
Program Theory
Definition: Explanation of how and why an intervention is expected to lead to intended outcomes, articulating causal mechanisms and assumptions.
Example: The program theory explained how community mobilization, coupled with improved service access, would lead to behavior change and health improvements.
When to use: When examining the underlying logic and assumptions of how an intervention is expected to work.
Related concepts: Theory of Change, Impact Pathway, Intervention Logic
Project
Definition: Planned set of interrelated tasks to be executed over a fixed period and within certain cost and other limitations to achieve specific objectives.
Example: The irrigation project aimed to install water management systems in 50 villages over three years with a budget of $2 million.
When to use: When discussing a specific, time-bound set of activities with defined objectives, resources, and management arrangements.
Related concepts: Initiative, Intervention, Activity Set
Project Cycle
Definition: Set of clearly defined phases through which a project progresses from start to completion.
Example: The project cycle included identification, preparation, appraisal, implementation, monitoring and evaluation, and completion.
When to use: When discussing the sequential stages or phases through which projects typically progress.
Related concepts: Project Management, Project Phases, Project Lifecycle
Project Supervision
Definition: Process by which a cooperating institution or implementing agency monitors compliance with agreements, makes recommendations, and helps to resolve problems.
Example: Project supervision missions conducted twice yearly assessed implementation progress, financial management, and compliance with safeguards.
When to use: When discussing oversight and support provided to implementing partners during project execution.
Related concepts: Oversight, Implementation Support, Compliance Monitoring
Propensity Score Matching
Definition: Statistical matching technique that attempts to estimate the effect of a treatment or intervention by accounting for the factors that predict receiving the treatment.
Example: Propensity score matching was used to create comparable treatment and comparison groups by matching on baseline characteristics that influenced program participation.
When to use: When creating comparable groups for impact evaluation when random assignment is not possible (see the sketch below).
Related concepts: Quasi-experimental Design, Statistical Matching, Causal Inference
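A bare-bones sketch of the two matching steps using scikit-learn on simulated data; the covariate names and coefficients are invented. Real applications add diagnostics such as calipers, common-support checks, and covariate balance tests, all omitted here.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 200

# Simulated baseline data: participation depends on the covariates,
# so a raw treated-vs-control comparison would be confounded.
df = pd.DataFrame({
    "land_size": rng.normal(2.0, 0.5, n),
    "distance_to_market": rng.normal(10, 3, n),
})
logit = 0.8 * df["land_size"] - 0.1 * df["distance_to_market"]
df["treated"] = rng.random(n) < 1 / (1 + np.exp(-logit))

# Step 1: model the probability of treatment given baseline covariates.
covariates = df[["land_size", "distance_to_market"]]
model = LogisticRegression().fit(covariates, df["treated"])
df["pscore"] = model.predict_proba(covariates)[:, 1]

# Step 2: match each treated unit to the nearest control on the score.
treated = df[df["treated"]]
control = df[~df["treated"]]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched_control = control.iloc[idx.ravel()]

print("Treated mean score:", round(treated["pscore"].mean(), 3))
print("Matched control mean score:", round(matched_control["pscore"].mean(), 3))
```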
Purpose
Definition: The publicly stated objectives of a development program or project.
Example: The stated purpose of the irrigation project was to increase agricultural productivity and food security among smallholder farmers.
When to use: When communicating the primary intended objectives or aims of an intervention.
Related concepts: Objectives, Aims, Goals
Q
Qualitative Data
Definition: Information that describes qualities or characteristics, typically expressed in narrative form rather than numerically.
Example: Qualitative data from interviews revealed participant perceptions about program quality and barriers to participation.
When to use: When seeking to understand meanings, experiences, perspectives, and contextual factors.
Related concepts: Narrative Data, Descriptive Information, Textual Data
Qualitative Evaluation
Definition: Evaluation approach using primarily qualitative methods to understand program processes, contexts, and experiences.
Example: The qualitative evaluation explored how cultural factors influenced program acceptance and implementation in different communities.
When to use: When seeking deep understanding of how and why interventions work or don’t work in specific contexts.
Related concepts: Interpretive Evaluation, Naturalistic Inquiry, Case Study
Quality Assurance
Definition: Systematic process of checking whether a product or service meets specified requirements and quality standards.
Example: Quality assurance procedures included double data entry, supervisor review of surveys, and regular calibration of measurement equipment.
When to use: When establishing systems to maintain consistent quality in data collection, analysis, and reporting.
Related concepts: Quality Control, Data Quality, Standards Compliance
Quality Management
Definition: Coordinated activities to direct and control an organization with regard to quality, ensuring consistency and continuous improvement.
Example: The evaluation team’s quality management system included standardized protocols, peer review processes, and regular quality checks of data and analysis.
When to use: When establishing systems to ensure quality and consistency in program implementation or evaluation processes.
Related concepts: Quality Assurance, Quality Control, Continuous Improvement
Quantitative Data
Definition: Information that can be counted or measured and expressed numerically.
Example: Quantitative data showed that 78% of program participants reported increased income after project completion.
When to use: When precise measurement, statistical analysis, or numerical comparisons are needed.
Related concepts: Numerical Data, Statistical Information, Measurable Data
Quantitative Evaluation
Definition: Evaluation approach using primarily numerical data and statistical methods to assess program outcomes and impacts.
Example: The quantitative evaluation used survey data from 2,000 households to measure changes in health indicators before and after the intervention.
When to use: When seeking to measure and quantify program effects, particularly for larger populations.
Related concepts: Statistical Analysis, Impact Measurement, Outcomes Assessment
Quasi-Experimental Design
Definition: Research design that resembles an experiment but lacks random assignment of participants to treatment and control groups.
Example: The quasi-experimental design used difference-in-differences analysis to compare changes in program communities with similar non-program communities.
When to use: When seeking to establish causal relationships but random assignment is not feasible for practical or ethical reasons (see the difference-in-differences sketch below).
Related concepts: Non-randomized Trial, Comparison Group Design, Impact Evaluation
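As one concrete quasi-experimental technique, the sketch below estimates a difference-in-differences model with statsmodels on simulated data; the coefficient on the interaction term is the program effect estimate. All numbers are invented, with a true effect of 2.0 built in.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400

# Simulated panel: outcomes before/after in program vs comparison communities.
df = pd.DataFrame({
    "program": rng.integers(0, 2, n),   # 1 = program community
    "post": rng.integers(0, 2, n),      # 1 = after the intervention period
})
# A true effect of 2.0 applies only to program communities in the post period.
df["outcome"] = (
    5 + 1.0 * df["program"] + 0.5 * df["post"]
    + 2.0 * df["program"] * df["post"] + rng.normal(0, 1, n)
)

# "program * post" expands to both main effects plus their interaction;
# the interaction coefficient is the difference-in-differences estimate.
result = smf.ols("outcome ~ program * post", data=df).fit()
print(result.params["program:post"])  # should be close to 2.0
```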
Questionnaire
Definition: Set of structured questions designed to collect specific information from respondents for data analysis.
Example: The household questionnaire contained modules on demographics, economic activities, food security, and program participation.
When to use: When systematic collection of comparable data from multiple respondents is needed.
Related concepts: Survey Instrument, Data Collection Tool, Interview Schedule
R
Random Assignment
Definition: Process of assigning participants to treatment or control groups using probability methods to ensure no systematic differences between groups.
Example: Random assignment was used to determine which 50 of the 100 eligible schools would receive the new teaching materials.
When to use: When designing experimental evaluations to establish causal effects of interventions (see the sketch below).
Related concepts: Randomization, Experimental Design, Control Group
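A minimal sketch of how the school assignment in the example above could be implemented in Python; the school identifiers are placeholders, and fixing the random seed keeps the assignment reproducible and auditable.

```python
import random

random.seed(42)  # fixed seed so the assignment can be reproduced and audited

schools = [f"school_{i:03d}" for i in range(1, 101)]  # 100 eligible schools
treatment = set(random.sample(schools, 50))           # 50 drawn without replacement
control = [s for s in schools if s not in treatment]

print(f"{len(treatment)} treatment schools, {len(control)} control schools")
```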
Random Sampling
Definition: Selection process in which each member of the population has an equal chance of being selected for the sample.
Example: Random sampling selected 400 households from the complete list of 10,000 program beneficiaries for the survey.
When to use: When selecting representative samples for surveys or assessments to make inferences about the larger population.
Related concepts: Probability Sampling, Representative Sample, Statistical Inference
Randomized Controlled Trial (RCT)
Definition: Experimental evaluation design in which participants are randomly assigned to treatment or control groups to determine causal effects of interventions.
Example: The randomized controlled trial demonstrated that the agricultural extension program increased crop yields by an average of 23% compared to control farmers.
When to use: When seeking strongest evidence of causal impacts of an intervention, controlling for selection bias and confounding factors.
Related concepts: Experimental Design, Impact Evaluation, Causal Inference
Rapid Appraisal
Definition: Quick, low-cost assessment approach using simplified methods to provide timely information for decision-making.
Example: A rapid appraisal using key informant interviews and community transect walks provided initial insights within two weeks of the disaster.
When to use: When timely information is needed and comprehensive data collection is not feasible due to time or resource constraints.
Related concepts: Quick Assessment, Rapid Assessment, Expedited Study
Reach
Definition: The extent to which a program or intervention connects with its intended target population.
Example: The vaccination campaign achieved 85% reach among children under five in target districts.
When to use: When assessing the coverage and penetration of program services to intended beneficiaries.
Related concepts: Coverage, Participation Rate, Access
Real-Time Evaluation
Definition: Evaluation conducted during implementation to provide immediate feedback for program adjustments.
Example: The real-time evaluation provided bi-weekly feedback during the emergency response, allowing rapid course corrections to distribution methods.
When to use: When time-sensitive decision-making requires immediate evaluative feedback, especially in humanitarian or emergency contexts.
Related concepts: Developmental Evaluation, Formative Assessment, Rapid Feedback
Recommendations
Definition: Proposals for action based on evaluation findings and conclusions, aimed at improving performance.
Example: Key recommendations included redesigning the targeting criteria, strengthening the monitoring system, and increasing community engagement in planning.
When to use: When translating evaluation findings into actionable guidance for program improvement.
Related concepts: Action Items, Suggested Improvements, Evaluation Utilization
Reflection
Definition: Critical examination of experience to generate insights and learning that can be applied to improve future practice.
Example: The quarterly reflection workshops allowed field staff to share challenges, successes, and adaptations in program implementation.
When to use: When facilitating intentional learning from experience to improve program implementation and design.
Related concepts: Critical Thinking, Learning, After Action Review
Relevance
Definition: Extent to which an intervention’s objectives are consistent with beneficiaries’ requirements, country needs, global priorities, and partners’ and donors’ policies.
Example: The evaluation assessed the relevance of the vocational training program in relation to labor market demands and youth employment challenges.
When to use: When assessing whether an intervention addresses priority needs and aligns with strategic priorities.
Related concepts: Appropriateness, Alignment, Responsiveness
Reliability
Definition: Consistency or dependability of data collection procedures, measures, or instruments over time and across different observers.
Example: Inter-rater reliability testing showed 92% agreement between different enumerators using the observation protocol.
When to use: When assessing whether data collection methods produce consistent results when repeated.
Related concepts: Consistency, Dependability, Precision
Remote Monitoring
Definition: Collection and analysis of program information from a distance, without direct physical presence at implementation sites.
Example: Remote monitoring using satellite imagery, mobile data collection, and phone surveys allowed continued oversight during access restrictions.
When to use: When security concerns, geographic barriers, or other constraints prevent direct on-site monitoring.
Related concepts: Distance Monitoring, Third-Party Monitoring, Technology-Enabled Monitoring
Replication
Definition: Implementation of a successful intervention in a new context or with a different population to test its transferability.
Example: After successful pilots in two districts, the community health worker model was replicated in ten additional districts.
When to use: When scaling up proven interventions or testing whether successful approaches work in different contexts.
Related concepts: Scale-up, Transfer, Adaptation
Reporting
Definition: Process of communicating evaluation findings, conclusions, recommendations, and lessons learned to stakeholders.
Example: Evaluation reporting included a comprehensive final report, executive summary, community feedback sessions, and donor briefing.
When to use: When communicating evaluation results to diverse audiences for accountability and learning purposes.
Related concepts: Communication, Dissemination, Knowledge Sharing
Representative Sample
Definition: Subset of a population that accurately reflects the characteristics of the larger group.
Example: The representative sample included participants from all geographic regions, income levels, and demographic groups in proportion to their distribution in the program population.
When to use: When selecting participants for surveys or assessments to enable valid generalizations about the full population.
Related concepts: Random Sample, Statistical Validity, Generalizability
Research Design
Definition: Overall plan for addressing research or evaluation questions, including methodology, data collection, analysis approach, and sampling strategy.
Example: The research design combined a household survey, focus groups, and administrative data analysis to assess program impact and implementation quality.
When to use: When planning the overall approach to gathering and analyzing information to answer evaluation questions.
Related concepts: Methodology, Evaluation Plan, Study Framework
Results
Definition: The output, outcome, or impact of a development intervention, either intended or unintended, positive or negative.
Example: Results of the education program included increased enrollment rates, improved test scores, and reduced gender disparities in academic performance.
When to use: When discussing the consequences or effects of an intervention at any level of the results chain.
Related concepts: Effects, Consequences, Achievements
Results Chain
Definition: Causal sequence showing how program activities are expected to lead to outputs, outcomes, and ultimately impact.
Example: The results chain illustrated how teacher training (activity) would produce trained teachers (output), leading to improved teaching practices and better student learning (outcomes) and eventually higher graduation rates (impact).
When to use: When mapping the logical sequence of results from activities to long-term impacts.
Related concepts: Logical Framework, Theory of Change, Impact Pathway
Results Framework
Definition: Explicit articulation of the results expected from a particular intervention, depicting the causal relationships between inputs, activities, outputs, outcomes, and impact.
Example: The results framework visually displayed how program components would contribute to intermediate and final outcomes, with indicators for each level.
When to use: When establishing a comprehensive structure for planning, monitoring, and evaluating an intervention.
Related concepts: Strategic Framework, Logic Model, Performance Framework
Results-Based Management
Definition: Management strategy focusing on performance and achievement of outputs, outcomes, and impacts rather than activities or processes.
Example: The organization’s results-based management approach prioritized outcome-level measurement and created incentives for achieving intended results rather than just implementing activities.
When to use: When establishing management systems focused on achieving and demonstrating results rather than just implementation.
Related concepts: Managing for Results, Performance Management, Outcome-oriented Management
Return on Investment (ROI)
Definition: Performance measure used to evaluate the efficiency of an investment, typically calculated as the net benefit (return minus cost) divided by the cost; in development work, results are often reported as a benefit-cost ratio instead.
Example: The nutrition program showed a benefit-cost ratio of 16:1, meaning each dollar invested generated $16 in economic benefits through improved productivity and reduced healthcare costs.
When to use: When assessing the economic efficiency of interventions or comparing investment options (see the worked arithmetic below).
Related concepts: Cost-Benefit Ratio, Economic Return, Investment Efficiency
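A worked version of the arithmetic, using invented cost and benefit figures chosen to reproduce the 16:1 ratio in the example; note the distinction between ROI proper (net return over cost) and the benefit-cost ratio.

```python
# Illustrative figures only: total cost and monetized benefits of a program.
cost = 100_000
benefit = 1_600_000

roi = (benefit - cost) / cost        # net return per dollar invested
bcr = benefit / cost                 # benefit-cost ratio, often quoted as 16:1

print(f"ROI: {roi:.0%}")             # 1500%
print(f"Benefit-cost ratio: {bcr:.0f}:1")
```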
Review
Definition: Assessment of performance or progress of an intervention, periodically or on an ad hoc basis, less comprehensive than an evaluation.
Example: The quarterly program review examined implementation progress, budget utilization, and early results against the annual work plan.
When to use: When conducting regular or periodic assessments of program performance or progress.
Related concepts: Assessment, Performance Check, Progress Review
Risk Assessment
Definition: Systematic process to identify, analyze, and respond to risks that might affect program implementation and results achievement.
Example: The risk assessment identified political instability, seasonal flooding, and staff turnover as high-probability risks requiring mitigation strategies.
When to use: When identifying and planning responses to potential threats to program success.
Related concepts: Risk Management, Threat Analysis, Contingency Planning
S
Sample
Definition: Subset of a population selected for data collection and analysis to make inferences about the wider population.
Example: The evaluation collected data from a sample of 350 beneficiary households to represent the full program population of 5,000 households.
When to use: When it is not feasible or necessary to collect data from the entire population.
Related concepts: Population, Representativeness, Sampling Frame
Sampling Frame
Definition: Complete list of all members of the population from which a sample can be drawn.
Example: The program beneficiary database containing contact information for all 10,000 registered participants served as the sampling frame for the evaluation survey.
When to use: When defining the population from which a sample will be selected for data collection.
Related concepts: Population, Sample Selection, Enumeration
Sampling Method
Definition: Technique used to select a subset of units from a population for data collection and analysis.
Example: The evaluation used stratified random sampling to ensure proportional representation of urban and rural beneficiaries in the study.
When to use: When designing a study that requires data collection from a subset rather than an entire population.
Related concepts: Random Sampling, Purposive Sampling, Representative Sample
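Where simple random sampling is the chosen method, the draw itself can be a single line of code. A minimal Python sketch, using a hypothetical frame of 5,000 household IDs:

```python
import random

# Minimal sketch: drawing a simple random sample from a sampling frame.
# The frame here is a hypothetical list of beneficiary household IDs.
sampling_frame = [f"HH-{i:05d}" for i in range(1, 5001)]  # 5,000 households

random.seed(42)                          # fixed seed so the draw is reproducible
sample = random.sample(sampling_frame, k=350)

print(len(sample), sample[:3])           # 350 households drawn at random
```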
Scale-up
Definition: Expansion of a successful intervention to reach more people, cover wider geographic areas, or address additional issues.
Example: After successful piloting in three districts, the nutrition program was scaled up to all 15 districts in the province.
When to use: When discussing the expansion of effective interventions beyond initial implementation sites or populations.
Related concepts: Replication, Expansion, Growth
Scope
Definition: The boundaries of an evaluation or intervention, defining what is included and excluded in terms of time period, geographic area, target groups, and issues addressed.
Example: The evaluation scope focused on program implementation from 2018 to 2022 in northern provinces, examining agricultural productivity and market access outcomes.
When to use: When defining the parameters and limitations of an evaluation or intervention.
Related concepts: Boundaries, Coverage, Focus
Secondary Data
Definition: Information collected by someone other than the current user, often for purposes different from current research needs.
Example: The evaluation utilized secondary data from national household surveys and ministry health statistics to complement primary data collection.
When to use: When existing data sources can provide relevant information to address evaluation questions or establish context.
Related concepts: Existing Data, Archival Data, Administrative Data
Sector-Wide Approach (SWAp)
Definition: Development strategy where major funding agencies support a single sector policy and expenditure program under government leadership.
Example: The education sector-wide approach aligned donor investments with the national education strategic plan and used common monitoring frameworks.
When to use: When discussing coordinated approaches to sector development that align multiple donors and government under common frameworks.
Related concepts: Harmonization, Alignment, Coordinated Development
Self-Evaluation
Definition: Evaluation conducted by those responsible for designing and implementing an intervention.
Example: The project team conducted a self-evaluation of their community engagement methods to identify strengths and areas for improvement.
When to use: When program implementers assess their own work for learning and improvement purposes.
Related concepts: Internal Evaluation, Reflective Practice, Learning Review
Semi-Structured Interview
Definition: Interview methodology using a flexible guide with predetermined but open-ended questions, allowing new ideas to emerge during the conversation.
Example: Semi-structured interviews with health workers explored implementation challenges while allowing unexpected issues to emerge during discussions.
When to use: When seeking detailed information on specific topics while maintaining flexibility to explore emerging themes.
Related concepts: Qualitative Methods, In-depth Interview, Interview Guide
Significant Change
Definition: Important or noteworthy alteration in conditions, knowledge, attitudes, practices, or systems that represents meaningful progress toward desired outcomes.
Example: The most significant change reported by community members was increased female participation in village decision-making processes.
When to use: When identifying and analyzing important changes resulting from interventions, particularly from stakeholders’ perspectives.
Related concepts: Impact, Transformation, Important Outcome
SMART Criteria
Definition: Framework for setting objectives that are Specific, Measurable, Achievable, Relevant, and Time-bound.
Example: The project revised its objectives to make them SMART: “Increase vaccination coverage among children under 5 from 60% to 80% in target districts within 18 months.”
When to use: When formulating or assessing objectives to ensure they provide clear direction and enable measurement of success.
Related concepts: Goal Setting, Objective Formulation, Performance Management
Social Return on Investment (SROI)
Definition: Method for measuring extra-financial value (social, environmental, economic outcomes) relative to resources invested.
Example: The SROI analysis showed that for every $1 invested in the youth employment program, $4.20 of social value was created through reduced crime, increased tax revenue, and improved well-being.
When to use: When seeking to capture the full value of interventions beyond financial returns alone.
Related concepts: Cost-Benefit Analysis, Impact Valuation, Socioeconomic Return
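A minimal sketch of the core arithmetic, with fabricated outcome values echoing the youth employment example; real SROI analyses also adjust for deadweight, attribution, and drop-off, which are omitted here.

```python
# Minimal sketch of an SROI-style ratio: monetized outcome values
# (fabricated for illustration) divided by the total investment.

investment = 100_000
monetized_outcomes = {
    "reduced crime costs": 180_000,
    "increased tax revenue": 140_000,
    "improved well-being": 100_000,
}

sroi = sum(monetized_outcomes.values()) / investment
print(f"SROI: ${sroi:.2f} of social value per $1 invested")
# -> SROI: $4.20 of social value per $1 invested
```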
Stakeholder
Definition: Individual, group, or institution with an interest in an intervention or evaluation, either as participants, beneficiaries, implementers, funders, or those potentially affected by it.
Example: Key stakeholders for the water management project included community members, local government, the implementing NGO, donor organization, and downstream water users.
When to use: When identifying relevant parties who should be considered or engaged in planning, implementing, or evaluating an intervention.
Related concepts: Interested Parties, Participants, Partners
Stakeholder Analysis
Definition: Process of identifying and analyzing stakeholders, their interests, influence, and potential impact on or from an intervention.
Example: The stakeholder analysis revealed potential resistance from local merchants to the market intervention and informed engagement strategies to address their concerns.
When to use: When planning stakeholder engagement and communication strategies for an intervention or evaluation.
Related concepts: Power Analysis, Interest Mapping, Influence Assessment
Stakeholder Engagement
Definition: Process of involving relevant parties who may affect, be affected by, or have an interest in a project, throughout its lifecycle.
Example: Stakeholder engagement included quarterly review meetings with government partners, community town halls, and a beneficiary feedback mechanism.
When to use: When seeking to build ownership, relevance, and sustainability through meaningful participation of affected and interested parties.
Related concepts: Participation, Consultation, Inclusion
Statistical Power
Definition: Probability that a study will detect an effect when a true effect exists; equal to 1 minus the probability of a Type II error.
Example: The sample size was calculated to ensure 80% statistical power to detect a 10% difference in outcomes between treatment and control groups.
When to use: When designing impact evaluations to ensure they have sufficient ability to detect meaningful changes.
Related concepts: Sample Size, Effect Size, Type II Error
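A minimal sketch of the sample-size arithmetic behind such a calculation, for comparing two proportions. The assumed values (a baseline rate of 50% against a target of 60%, i.e., a 10-point difference, two-sided alpha of 0.05, and 80% power) are illustrative choices consistent with the example above.

```python
from scipy.stats import norm

# Minimal sketch: sample size per arm for comparing two proportions,
# using the standard normal-approximation formula.

p1, p2 = 0.50, 0.60        # assumed baseline and target outcome rates
alpha, power = 0.05, 0.80  # two-sided significance level and desired power

z_alpha = norm.ppf(1 - alpha / 2)   # ~1.96
z_beta = norm.ppf(power)            # ~0.84

n_per_arm = ((z_alpha + z_beta) ** 2 *
             (p1 * (1 - p1) + p2 * (1 - p2))) / (p1 - p2) ** 2

print(f"Required sample size: ~{n_per_arm:.0f} per group")  # ~385 per group
```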
Statistical Significance
Definition: Indication that an observed effect or relationship would be unlikely to occur by chance alone if no true effect existed.
Example: The evaluation found a statistically significant increase in crop yields (p<0.05) for farmers who received the improved seeds.
When to use: When determining whether observed differences or effects provide reliable evidence of program impact rather than random variation.
Related concepts: P-value, Confidence Level, Hypothesis Testing
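A minimal sketch of the kind of test behind such a finding, using simulated yield data; the scenario and all numbers below are illustrative only.

```python
import numpy as np
from scipy.stats import ttest_ind

# Minimal sketch: testing whether mean crop yields differ between
# farmers who received improved seeds and those who did not.

rng = np.random.default_rng(0)
treated = rng.normal(loc=2.4, scale=0.5, size=120)   # tonnes/ha, simulated
control = rng.normal(loc=2.1, scale=0.5, size=120)

t_stat, p_value = ttest_ind(treated, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Difference is statistically significant at the 5% level")
```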
Steering Committee
Definition: Group responsible for providing overall strategic direction and guidance to a project or evaluation.
Example: The evaluation steering committee, comprising donor representatives, government officials, and technical experts, approved the evaluation design and reviewed preliminary findings.
When to use: When establishing governance structures for oversight and strategic direction of evaluations or interventions.
Related concepts: Advisory Group, Oversight Committee, Governance Body
Stratified Sampling
Definition: Sampling technique that divides the population into distinct subgroups (strata) before sampling, ensuring representation from each subgroup.
Example: The evaluation used stratified sampling to ensure proportional representation of male and female beneficiaries, urban and rural locations, and different age groups.
When to use: When ensuring that important subgroups within a population are adequately represented in a sample.
Related concepts: Representative Sample, Sampling Strategy, Proportional Representation
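A minimal sketch of proportional stratified sampling with pandas, using a fabricated frame of 1,000 households split across urban and rural strata:

```python
import pandas as pd

# Minimal sketch: proportional stratified sampling with pandas.
# The beneficiary frame below is fabricated for illustration.

frame = pd.DataFrame({
    "household_id": range(1, 1001),
    "location": ["urban"] * 400 + ["rural"] * 600,
})

# Draw 10% from each stratum so urban/rural shares match the population.
sample = (frame.groupby("location", group_keys=False)
               .sample(frac=0.10, random_state=42))

print(sample["location"].value_counts())  # ~40 urban, ~60 rural
```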
Summative Evaluation
Definition: Assessment conducted at the end of an intervention or phase to determine the extent to which anticipated outcomes were achieved.
Example: The summative evaluation assessed the degree to which the five-year health program achieved its objectives and contributed to improved health outcomes.
When to use: When assessing overall merit, worth, and impact of a completed or established intervention.
Related concepts: Impact Evaluation, Final Evaluation, Outcomes Assessment
Sustainability
Definition: Continuation of benefits from an intervention after major development assistance has been completed, including environmental, institutional, and financial sustainability.
Example: The evaluation assessed sustainability by examining whether community structures, funding mechanisms, and technical capacity remained functional two years after project closure.
When to use: When assessing the likelihood or reality of continued benefits after external support ends.
Related concepts: Continuation, Institutionalization, Self-sufficiency
SWOT Analysis
Definition: Strategic planning technique used to identify Strengths, Weaknesses, Opportunities, and Threats related to a project, organization, or initiative.
Example: The SWOT analysis identified strong community ownership as a strength, limited technical capacity as a weakness, new government policy as an opportunity, and funding constraints as a threat.
When to use: When conducting strategic analysis to inform planning, decision-making, or evaluation.
Related concepts: Strategic Assessment, Situational Analysis, Environmental Scan
Systematic Sampling
Definition: Probability sampling technique where elements are selected from the population at regular intervals.
Example: Using systematic sampling, every 10th household on the village list was selected for the survey, after a random starting point was chosen.
When to use: When selecting a sample from an ordered population when a complete sampling frame exists.
Related concepts: Random Sampling, Sample Selection, Probability Sampling
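A minimal sketch of the procedure, choosing a random start and then taking every 10th unit from a hypothetical village list:

```python
import random

# Minimal sketch: systematic sampling — every k-th unit after a random start.
village_list = [f"HH-{i:04d}" for i in range(1, 501)]  # 500 households

k = 10                          # sampling interval
random.seed(7)
start = random.randrange(k)     # random start within the first interval

sample = village_list[start::k]  # every 10th household thereafter
print(len(sample), sample[:3])   # 50 households selected
```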
T
Target
Definition: Planned level of result to be achieved within a defined timeframe, expressed as a specific, measurable value.
Example: The project set a target of training 2,000 farmers and achieving a 30% increase in yields by the end of the third year.
When to use: When defining specific, measurable levels of achievement expected from an intervention.
Related concepts: Goal, Benchmark, Performance Standard
Target Group
Definition: Specific group of people or organizations for whom an intervention is designed to provide benefits.
Example: The primary target group for the nutrition program was pregnant women and children under five in high-poverty districts.
When to use: When defining the specific population that an intervention aims to reach and benefit.
Related concepts: Beneficiaries, Intended Recipients, Focus Population
Target Population
Definition: The entire set of units for which an intervention is planned to provide benefits.
Example: The target population for the vaccination campaign included all children under five years old living within the project districts.
When to use: When defining the broader group from which beneficiaries will be drawn or to which findings might be generalized.
Related concepts: Intended Beneficiaries, Coverage Population, Demographic Focus
Technical Assistance
Definition: Non-financial assistance provided by specialists, typically to transfer knowledge, skills, or technologies.
Example: Technical assistance from agricultural experts helped farmers adopt sustainable irrigation practices and integrated pest management.
When to use: When discussing knowledge transfer, capacity building, or specialized support provided to implementers or beneficiaries.
Related concepts: Capacity Building, Knowledge Transfer, Expert Support
Terms of Reference (ToR)
Definition: Document defining the purpose, scope, requirements, and expected deliverables for an evaluation, consultancy, or project.
Example: The evaluation terms of reference outlined the key questions, methodological requirements, timeline, and required expertise for the assessment.
When to use: When establishing formal parameters for commissioned work or services.
Related concepts: Scope of Work, Contract Specifications, Evaluation Brief
Theory of Change
Definition: Comprehensive description and illustration of how and why a desired change is expected to happen in a particular context.
Example: The program’s theory of change articulated how improved agricultural extension services would lead to adoption of better practices, increased productivity, higher income, and ultimately reduced poverty.
When to use: When articulating the causal logic, assumptions, and pathways through which an intervention is expected to achieve results.
Related concepts: Program Theory, Impact Pathway, Logic Model
Theory-Based Evaluation
Definition: Evaluation approach that examines the theories underlying an intervention and uses this understanding to guide the evaluation design and interpretation.
Example: The theory-based evaluation traced the causal pathways from microfinance services to business development, income generation, and household welfare improvements.
When to use: When seeking to understand not just whether an intervention worked, but how and why it did or did not achieve expected results.
Related concepts: Theory-Driven Evaluation, Causal Analysis, Process Tracing
Third-Party Monitoring
Definition: Monitoring conducted by entities independent from the implementing organization, often in contexts where direct access is limited.
Example: Third-party monitors collected data on aid distribution in conflict-affected areas where donor staff could not safely conduct direct observation.
When to use: When independent verification is needed or when security, access, or other constraints prevent direct monitoring by implementers or funders.
Related concepts: Independent Verification, External Monitoring, Remote Monitoring
Time Series Analysis
Definition: Statistical method examining data points collected over time to identify trends, patterns, and potentially causal relationships.
Example: Time series analysis of monthly malnutrition rates revealed seasonal patterns and a gradual declining trend following program implementation.
When to use: When analyzing changes over time and identifying patterns, trends, or the effects of interventions on trends.
Related concepts: Longitudinal Analysis, Trend Analysis, Sequential Data Analysis
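A minimal sketch of two common first steps, a moving average to expose the underlying trend and a fitted slope to quantify it, using simulated monthly rates (all data below are fabricated for illustration):

```python
import numpy as np
import pandas as pd

# Minimal sketch: inspecting trend and seasonality in monthly data.
months = pd.date_range("2020-01", periods=36, freq="MS")
rng = np.random.default_rng(1)
rates = (12 - 0.08 * np.arange(36)                        # gradual decline
         + 1.5 * np.sin(2 * np.pi * np.arange(36) / 12)   # seasonal swing
         + rng.normal(0, 0.4, 36))                        # noise
series = pd.Series(rates, index=months)

trend = series.rolling(window=12, center=True).mean()  # smooths out seasonality
slope = np.polyfit(np.arange(36), rates, deg=1)[0]

print(f"Trend (12-month moving average): "
      f"{trend.dropna().iloc[0]:.1f} -> {trend.dropna().iloc[-1]:.1f}")
print(f"Average change: {slope:.2f} percentage points per month")
```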
Tracer Study
Definition: Study that follows participants after an intervention to assess longer-term outcomes and impacts.
Example: The tracer study tracked graduates of the vocational training program for three years to assess employment stability, income progression, and career advancement.
When to use: When seeking to understand longer-term effects of an intervention on participants after they exit the program.
Related concepts: Follow-up Study, Graduate Tracking, Longitudinal Tracking
Triangulation
Definition: Use of multiple sources, methods, or team members to cross-check and validate data and information to limit biases.
Example: Findings on healthcare quality were triangulated using facility assessments, patient interviews, and service statistics.
When to use: When seeking to validate findings through multiple perspectives or data sources.
Related concepts: Cross-Validation, Multiple Methods, Verification
U
Unintended Outcomes
Definition: Results of an intervention that were not anticipated in the planning phase, either positive or negative.
Example: An unintended positive outcome of the agricultural program was increased school attendance as households could afford school fees due to higher incomes.
When to use: When identifying and assessing consequences of interventions that were not planned or expected.
Related concepts: Side Effects, Spillover Effects, Externalities
Upward Accountability
Definition: Responsibility of implementing organizations to report to donors, funders, or higher authorities about the use of resources and results achieved.
Example: Quarterly financial reports and annual results assessments fulfilled upward accountability requirements to the donor agency.
When to use: When discussing reporting obligations to funding sources or governing authorities.
Related concepts: Donor Reporting, Financial Accountability, Compliance
Utilization-Focused Evaluation
Definition: Evaluation approach based on the principle that an evaluation should be judged by its utility and actual use by intended users.
Example: The utilization-focused evaluation involved program managers in designing the evaluation to ensure findings would directly inform upcoming programming decisions.
When to use: When prioritizing practical utility and use of evaluation findings for decision-making by specific stakeholders.
Related concepts: User Engagement, Practical Evaluation, Evaluation Use
V
Validation
Definition: Process of cross-checking to ensure that data obtained from one monitoring method are confirmed by data from different methods.
Example: Survey findings on vaccination coverage were validated through health facility records and community spot checks.
When to use: When confirming the accuracy of data through independent verification.
Related concepts: Verification, Confirmation, Data Quality Assessment
Validity
Definition: Extent to which an instrument, method, or conclusion accurately measures or represents what it is intended to; distinct from reliability, which concerns consistency of measurement.
Example: The survey instrument demonstrated high validity through pilot testing and comparison with observed behaviors.
When to use: When assessing whether data collection methods accurately capture the intended information.
Related concepts: Accuracy, Correctness, Soundness
Value Chain Analysis
Definition: Assessment of the full range of activities required to bring a product or service from conception through production to delivery to consumers.
Example: Value chain analysis identified key bottlenecks in processing and transport that were limiting smallholder farmers’ access to profitable markets.
When to use: When examining the sequence of productive processes, identifying inefficiencies, and assessing intervention points in market systems.
Related concepts: Market Systems Analysis, Supply Chain Assessment, Economic Analysis
Value for Money
Definition: Assessment of whether resources are being used effectively, efficiently, and economically to achieve intended outcomes.
Example: The value for money analysis showed that the community-based approach delivered similar results at lower cost than the previous centralized model.
When to use: When assessing whether interventions are maximizing impact relative to resources invested.
Related concepts: Cost-Effectiveness, Efficiency, Economy
Variable
Definition: Characteristic, attribute, or property that can be measured and can vary across different cases or over time.
Example: Key variables in the household survey included income levels, food security status, agricultural practices, and access to services.
When to use: When identifying and defining the specific characteristics that will be measured or analyzed in an evaluation.
Related concepts: Indicator, Measure, Factor
Variance
Definition: Statistical measure of the spread or dispersion of a dataset, calculated as the average of the squared deviations from the mean.
Example: Analysis showed high variance in program effects across different regions, suggesting context-specific factors influenced success.
When to use: When examining the distribution and variability of outcomes or characteristics in a population.
Related concepts: Standard Deviation, Dispersion, Distribution
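A minimal sketch showing the calculation both via Python's standard library and by hand, using fabricated regional program effects:

```python
from statistics import mean, pstdev, pvariance

# Minimal sketch: variance as the average squared deviation from the mean.
effects = [4.0, 9.5, 2.1, 11.3, 6.0]   # e.g., percentage-point gains by region

mu = mean(effects)
var = pvariance(effects)                # population variance
print(f"mean = {mu:.2f}, variance = {var:.2f}, sd = {pstdev(effects):.2f}")

# Equivalent by hand:
var_manual = sum((x - mu) ** 2 for x in effects) / len(effects)
assert abs(var - var_manual) < 1e-9
```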
Vertical Logic
Definition: Summary of project causal relationships between each level of the objective hierarchy and the critical assumptions affecting these linkages.
Example: The vertical logic demonstrated how training activities would lead to improved practices, assuming participants had authority to implement changes.
When to use: When explaining how lower-level results contribute to higher-level objectives, considering external factors.
Related concepts: Causal Logic, Results Chain, If-Then Relationships
Vision Statement
Definition: Articulation of the desired future state toward which a program or organization is working.
Example: The vision statement described a future where all community members have reliable access to clean water and sanitation.
When to use: When establishing inspiring, long-term aspirations that guide program direction.
Related concepts: Mission, Goal, Aspiration
Volunteer
Definition: Individual who contributes time and effort to a cause or organization without payment.
Example: Community volunteers conducted household visits to promote hygiene practices, extending program reach beyond paid staff.
When to use: When discussing unpaid individuals supporting program implementation.
Related concepts: Community Resource Persons, Unpaid Workers, Contributors
W
Work Breakdown Structure
Definition: Hierarchical decomposition of the total work scope of a project into smaller, more manageable components.
Example: The evaluation work breakdown structure divided the assessment into discrete tasks including desk review, tool development, data collection, analysis, and reporting.
When to use: When planning and organizing project or evaluation work into structured, manageable units.
Related concepts: Task Analysis, Project Planning, Activity Definition
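One convenient way to capture such a hierarchy in code is a nested mapping; the sketch below uses hypothetical task names echoing the evaluation example above.

```python
# Minimal sketch: a work breakdown structure as a nested mapping,
# decomposing a hypothetical evaluation into tasks and sub-tasks.

wbs = {
    "evaluation": {
        "desk review": ["collect documents", "synthesize literature"],
        "tool development": ["draft survey", "pilot test", "revise"],
        "data collection": ["train enumerators", "field survey", "interviews"],
        "analysis": ["clean data", "quantitative analysis", "qualitative coding"],
        "reporting": ["draft report", "validation workshop", "final report"],
    }
}

def count_tasks(node) -> int:
    """Count leaf-level tasks in the hierarchy."""
    if isinstance(node, list):
        return len(node)
    return sum(count_tasks(child) for child in node.values())

print(count_tasks(wbs))  # 14 leaf tasks
```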
Work Plan
Definition: Detailed document stating which activities will be carried out in a given time period, how they will be implemented, and how they relate to common objectives.
Example: The quarterly work plan specified weekly outreach activities, responsible staff, resource requirements, and expected outputs.
When to use: When planning specific implementation activities and responsibilities for short to medium term.
Related concepts: Activity Plan, Implementation Schedule, Operational Plan
Conclusion
Effective monitoring and evaluation is essential for learning, accountability, and improving development outcomes. This glossary serves as a reference to promote common understanding of M&E terminology across stakeholders and to strengthen M&E practice.
The field of M&E continues to evolve with new approaches and methodologies. Practitioners are encouraged to adapt these concepts to their specific contexts while maintaining the core principles of evidence-based assessment and learning for improved results.
For additional guidance on M&E practice, refer to organizational M&E frameworks, donor guidelines, and international standards for evaluation.