A South African school’s guide to evaluating EdTech vendors

Choosing an education technology (EdTech) vendor is one of the most consequential procurement decisions a school—or a district, circuit, or provincial program—can make. The goal isn’t only to buy devices or software. It’s to secure learning impact, value for money, reliable support, and responsible data handling for learners and educators.

This guide is written for South African schools evaluating EdTech vendors under real-world constraints: variable connectivity, diverse learner needs, procurement compliance expectations, and the long-term costs that often get underestimated. You’ll find deep-dive evaluation methods, contract checklists, funding and budgeting considerations, and practical examples that map to South African conditions.

If you’re planning your purchase and rollout, you’ll also benefit from reading: How to budget for education technology procurement in South Africa and Key questions to ask before signing an education technology contract.

What “Good” Looks Like: Outcomes Before Features

Before you compare vendors, define what success means for your school. Vendors will always highlight features; procurement should anchor on measurable learning outcomes and operational sustainability.

A strong vendor proposal aligns with:

  • Learning outcomes: improvements in literacy, numeracy, subject mastery, attendance, or engagement.
  • Teacher enablement: ease of lesson planning, quality of training, and practical classroom support.
  • Equity and inclusion: accessibility features, language support, offline options, and support for learners with barriers.
  • Operational readiness: device management, content updates, network requirements, uptime commitments, and helpdesk capacity.
  • Governance and compliance: data protection, consent processes, and audit-ready documentation.

Avoid the “Demo Trap”

A vendor demo can look impressive while failing critical procurement tests. Demos often focus on the best-case classroom scenario (ideal devices, strong Wi-Fi, time to set up, and motivated learners). In reality, your school may face:

  • shared devices
  • limited bandwidth or electricity stability
  • slow teacher adoption
  • incomplete timetabling integration
  • support delays during term changes or holidays

To avoid this, evaluate vendors using evidence, pilot results, and implementation plans, not only screenshots.

The South African EdTech Context That Should Shape Your Evaluation

South African schools operate in a unique environment. Successful procurement depends on acknowledging constraints early—especially around connectivity, device lifecycle management, procurement capacity, and funding cycles.

If you want to broaden your funding understanding as part of vendor evaluation, see: Funding options for education technology projects in South Africa and How donor funding supports EdTech implementation in South Africa. These affect the vendor model you can realistically sustain.

Connectivity Reality: Designing for Intermittent Internet

Many learning platforms assume reliable broadband, which many South African schools simply don't have. Vendors should demonstrate:

  • offline functionality or download-and-sync workflows
  • content caching strategies
  • lightweight design for low bandwidth
  • clear options for mobile data, Wi-Fi hotspots, or offline tutoring
  • school-level reporting that still works with intermittent access

Power and Device Lifecycle

For schools with electricity interruptions, vendors must address:

  • offline operation windows
  • device battery life and charging solutions
  • spares strategy and repair turnaround times
  • warranties and replacement timelines
  • secure device storage and school policies

Language and Curriculum Alignment

South African learners are taught in multiple languages, and curriculum alignment is non-negotiable. Vendors should provide evidence of:

  • alignment to CAPS or relevant curriculum outcomes
  • multilingual content availability (where applicable)
  • lesson pacing and mapping to school timetables
  • assessment alignment and reporting suitability

Step 1: Define Your Use Case (Then Weight Vendor Criteria)

Start by choosing what problem the vendor must solve. Examples:

  • literacy remediation for grades 4–6
  • numeracy practice with adaptive pathways
  • teacher support tools for formative assessment
  • learner engagement and attendance improvement
  • administrative systems (LMS, reporting, parent communication)

Then assign weights to evaluation criteria. A common mistake is treating all criteria equally. For instance, if your school has unstable internet, offline capability should weigh more than “advanced analytics” in your decision.

Example Scoring Framework (Use or Adapt)

Create a weighted matrix so the team can compare objectively.

Suggested criteria (weights can be adjusted):

  • Learning impact evidence (25%)
  • Implementation and training plan (20%)
  • Total cost of ownership (15%)
  • Data privacy and security (15%)
  • Offline/low-connectivity performance (10%)
  • Hardware/device management (if applicable) (10%)
  • Support and service-level commitments (5%)

This kind of structure supports accountability and is especially helpful when procurement panels need justification.
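To make the weighting concrete, here is a minimal sketch of how a panel's scores could be combined, assuming each criterion is scored 1–5. The vendor scores shown are invented placeholders, not real data.

```python
# Minimal sketch of the weighted scoring matrix above.
# The weights mirror the suggested criteria; the 1-5 vendor scores
# are invented placeholders for illustration.

WEIGHTS = {
    "learning_impact": 0.25,
    "implementation_training": 0.20,
    "total_cost_of_ownership": 0.15,
    "data_privacy_security": 0.15,
    "offline_performance": 0.10,
    "device_management": 0.10,
    "support_sla": 0.05,
}

def weighted_score(scores):
    """Combine a panel's 1-5 scores into one weighted total out of 5."""
    return sum(WEIGHTS[criterion] * scores[criterion] for criterion in WEIGHTS)

vendor_a = {
    "learning_impact": 4, "implementation_training": 3,
    "total_cost_of_ownership": 4, "data_privacy_security": 5,
    "offline_performance": 2, "device_management": 3, "support_sla": 4,
}

print(f"Vendor A: {weighted_score(vendor_a):.2f} / 5")  # Vendor A: 3.65 / 5
```

Keeping the weights summing to 1.0 keeps totals comparable across vendors and makes re-weighting easy, for example raising offline performance for a low-connectivity school.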

For implementation planning, cross-reference: How to plan a successful EdTech rollout in South African schools.

Step 2: Verify Learning Impact (Evidence, Not Promises)

Vendor marketing often claims effectiveness. Your job is to verify with credible evidence that matches your learner profiles and use case.

What Evidence to Ask For

Request:

  • peer-reviewed studies or credible external evaluation reports
  • pilot outcomes with similar contexts (not only high-bandwidth schools)
  • clear metrics: baseline → intervention → measurable change
  • disaggregated results (e.g., grade levels, language groups, learners needing remediation)
  • assessment validity and reliability explanations
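To show what "baseline → intervention → measurable change" means in practice, here is a small illustration using invented pre/post test averages per grade; the figures are placeholders, not benchmarks.

```python
# A small illustration of "baseline -> intervention -> measurable change",
# using invented pre/post test averages (percent correct) per grade.

baseline = {"grade_4": 42.0, "grade_5": 48.0, "grade_6": 51.0}
endline  = {"grade_4": 55.0, "grade_5": 57.0, "grade_6": 58.0}

def disaggregated_gains(pre, post):
    """Absolute and relative gain per group, the granularity to ask vendors for."""
    return {
        group: {
            "absolute": round(post[group] - pre[group], 1),
            "relative_pct": round(100 * (post[group] - pre[group]) / pre[group], 1),
        }
        for group in pre
    }

for grade, gain in disaggregated_gains(baseline, endline).items():
    print(f"{grade}: +{gain['absolute']} points ({gain['relative_pct']}% over baseline)")
```

Disaggregating this way exposes whether gains are concentrated in one group, which is exactly the question a vendor's headline average can hide.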

Ask for Measurement Granularity

A vendor should explain what they measure and why it’s meaningful. For example:

  • Does the platform track skill mastery aligned to curriculum outcomes?
  • Are assessments formative (weekly) or summative (term)?
  • How do learning gains translate into teacher actions?
  • Do reports show actionable next steps—not just usage stats?

ROI Measurement Should Be Part of Vendor Evaluation

You should evaluate vendors with a view to return on investment. Explore the methodology here: How to measure return on investment for EdTech in South Africa.

Practical question to ask:

“If we invest RX per learner, what measurable learning or operational improvement should we expect in 3, 6, and 12 months—and what evidence supports that expectation?”
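One way to sanity-check the answer you get back is a simple ROI calculation. Every figure below, including the rand value placed on the learning outcome, is an illustrative assumption; agree on the benefit measure with your governance structures before relying on numbers like these.

```python
# Illustrative ROI arithmetic for the question above. Every figure,
# including the rand value placed on the learning outcome, is an
# invented assumption, not a benchmark.

investment_per_learner = 300          # R per learner per year (assumed)
learners = 400

# Assumed benefit measure: reduced need for external remediation,
# valued at R600 for each learner who reaches the target benchmark.
learners_reaching_benchmark = 260
value_per_benchmark_learner = 600

total_cost = investment_per_learner * learners
total_benefit = learners_reaching_benchmark * value_per_benchmark_learner

roi = (total_benefit - total_cost) / total_cost
print(f"ROI after 12 months: {roi:.1%}")  # ROI after 12 months: 30.0%
```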

Step 3: Evaluate Pedagogy and Teacher Workflow

EdTech succeeds when it becomes part of teaching practice. Don’t evaluate only what learners do. Evaluate what teachers can realistically manage.

Look for Teacher-First Design

Strong vendors support teachers through:

  • lesson plans or recommended sequences tied to curriculum
  • formative assessment insights
  • differentiation support (remedial, on-level, enrichment)
  • classroom dashboards that reduce cognitive load
  • tools that work during the school day—not only after-school training

Training and Enablement Must Be Realistic

A vendor should provide training with:

  • a clear schedule (pre-launch, during rollout, refresher sessions)
  • role-based materials (principals, teachers, ICT coordinators, subject heads)
  • practical examples aligned to your school timetable
  • ongoing coaching options (not just one-off workshops)

Ask for specifics:

  • How many hours of training are included?
  • Who delivers the training (and their qualifications)?
  • Is training available in accessible formats (English + other languages if relevant)?
  • How do you handle teacher turnover?

For broader change management, see: Change management tips for introducing EdTech in South African classrooms.

Step 4: Assess Offline, Low-Bandwidth, and Device Readiness

Connectivity and hardware constraints can derail even the best platform.

Offline Functionality: The “Must-Have” Test

Ask vendors to demonstrate:

  • offline learning sessions with progress synchronization
  • how content is stored and how often it updates
  • resilience during partial connectivity (queuing uploads without data loss)
  • device storage and memory requirements

Proof you can request:

  • a low-connectivity demo (not just offline in ideal conditions)
  • test scripts you can run during a pilot
  • documented limits (e.g., number of learners per offline cache)

Hardware Management (If the Vendor Supplies Devices)

If the vendor includes devices, evaluate:

  • device warranty terms (duration and coverage)
  • repair logistics and turnaround times
  • spare device inventory plans
  • remote device management capabilities (if applicable)
  • asset tracking and security controls

Procurement risk: schools sometimes buy devices without a lifecycle plan. The result is that devices become unusable before the school can renew hardware.

For procurement challenges and avoidance strategies, refer to: Procurement challenges for South African education institutions and how to avoid them.

Step 5: Total Cost of Ownership (TCO) and Long-Term Sustainability

One of the biggest reasons EdTech projects fail is that procurement focuses on upfront costs while ignoring ongoing expenses. You need TCO transparency.

Components of TCO to Demand

Ask vendors to break down costs for:

  • licensing fees (per learner/per device/per school)
  • support costs and helpdesk fees
  • content updates and curriculum alignment maintenance
  • training costs (including refreshers and onboarding of new teachers)
  • implementation services (setup, migration, integrations)
  • connectivity costs assumptions (if provided)
  • device replacements, repairs, and spares (if hardware is included)
  • security updates and subscription renewals

Ask for a 3-Year and 5-Year Cost Projection

A credible vendor provides:

  • year-by-year cost structure
  • expected cost increases or renewal schedules
  • what happens if usage targets aren’t met
  • contract termination and renewal options

Example question:

“Show a full 36-month cost plan including licensing, support, device management (if applicable), and training refreshers.”
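As a sketch of what that 36-month plan might look like once you have the vendor's line items, the projection below escalates recurring costs by an assumed 8% per year; every amount is an invented placeholder, not a quote.

```python
# A hedged illustration of a 3-year TCO projection.
# All rand amounts and the 8% annual escalation are invented assumptions;
# substitute the line items from the vendor's actual quote.

ANNUAL_ESCALATION = 0.08  # assumed yearly price increase on renewals

year_one = {
    "licensing_per_learner": 250 * 400,   # R250 x 400 learners (assumed)
    "support_and_helpdesk": 30_000,
    "training_and_refreshers": 45_000,
    "implementation_once_off": 60_000,    # year 1 only
}

def project_tco(years=3):
    """Project yearly cost; recurring items escalate, once-off items don't repeat."""
    recurring = sum(v for k, v in year_one.items() if k != "implementation_once_off")
    costs = []
    for year in range(years):
        cost = recurring * (1 + ANNUAL_ESCALATION) ** year
        if year == 0:
            cost += year_one["implementation_once_off"]
        costs.append(round(cost, 2))
    return costs

for year, cost in enumerate(project_tco(), start=1):
    print(f"Year {year}: R{cost:,.2f}")
```

A vendor who cannot fill in a structure like this, line by line, has not really answered the 36-month question.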

If your funding approach depends on a grant cycle, read: The role of grants in expanding education technology access in South Africa.

Step 6: Data Protection, Privacy, and Security (Non-Negotiables)

EdTech vendors typically handle learner data: names, assessment results, progress metrics, device identifiers, and sometimes behavior analytics. Under the Protection of Personal Information Act (POPIA), South African schools must treat privacy as a procurement requirement, not a “nice to have.”

What to Verify Under South African Data Protection Expectations

Ask for written documentation on:

  • how personal information is collected, processed, and stored
  • whether data is used for vendor analytics or third-party purposes
  • retention periods and deletion policies
  • how access is restricted internally (role-based access control)
  • encryption at rest and in transit
  • incident response processes (breach notification timelines)
  • whether the vendor uses subprocessors (and how those are controlled)
  • data residency and hosting location assumptions

Consent and Governance in Schools

Even when vendors handle the technology, schools must govern usage. Ask vendors to provide:

  • data processing agreements (DPAs) or equivalent documentation
  • templates for consent and policy alignment
  • guidance on user roles (teacher vs learner vs admin)
  • audit reports and log access for school administrators

Procurement checklist principle:
If you can’t clearly define who controls data, how it’s secured, and how it’s deleted, you should treat the vendor as high risk.

For legal/contract alignment, use: Key questions to ask before signing an education technology contract.

Step 7: Integrations and Ecosystem Fit

EdTech rarely operates in isolation. Evaluate whether the vendor fits your school’s existing ecosystem: LMS, assessment systems, student information systems, Google/Microsoft environments, or communication tools.

Ask for Integration Clarity

Request documentation on:

  • APIs and integration options
  • supported authentication methods (SSO, rostering)
  • import/export capabilities for learner rosters and assessment results
  • compatibility with your existing devices and accounts
  • data portability: can you exit the platform without losing your historical learning data?

Avoid Vendor Lock-In

A vendor might offer excellent tools but restrict data export or require costly migration. Ask:

  • Can we export our learner data and assessment history in usable formats (CSV, JSON, reports)?
  • What are the costs and timelines for migration?
  • Do you support open standards or interoperable formats?

Step 8: Service, Support, and Accountability

Even the best product fails without responsive support. In your procurement evaluation, prioritize service-level commitments.

Support Should Include “School-Realistic” Response Times

Ask vendors to define:

  • helpdesk hours (weekdays, term times, holidays)
  • response time and resolution targets (not just “best efforts”)
  • escalation paths for serious incidents
  • support channels (ticketing, WhatsApp, email, phone)
  • onboarding support and readiness checks

Who Owns Implementation Risks?

Make sure responsibilities are clear between vendor and school:

  • device setup and user provisioning
  • account management and password reset processes
  • content alignment and curriculum mapping tasks
  • training delivery responsibility and scope
  • troubleshooting responsibilities during rollout

Use contract language that reflects actual operational ownership.

Step 9: Conduct a Structured Pilot (Design It Like an Evaluation)

A pilot is not a casual trial. It should be designed to test assumptions and generate evidence to inform the final decision.

Pilot Design: What You Should Decide Upfront

Define:

  • pilot duration (e.g., 6–10 weeks)
  • grades and subjects included
  • baseline measurements (pre-tests or existing performance data)
  • success metrics (learning outcomes + adoption + reliability)
  • teacher roles and time commitment expectations
  • classroom support plan and escalation procedures

A Pilot Should Test Connectivity and Offline Behavior

Include scenarios like:

  • classes with intermittent internet
  • shared devices among multiple learners
  • teacher-led sessions where learners must continue offline
  • device battery/charging disruptions (within realistic constraints)

Pilot Evidence to Demand

Ask for:

  • usage logs and participation metrics
  • assessment results or learning progress reports
  • teacher feedback structured through rubrics
  • support ticket statistics (volume, resolution time, root causes)
  • qualitative notes on workflow usability

For rollout planning discipline, revisit: How to plan a successful EdTech rollout in South African schools.

Vendor Evaluation Deep-Dive: A Practical Scoring Matrix (No Guesswork)

Below is a detailed rubric you can adapt. Use it to score each vendor consistently across proposals and pilot findings.

Rubric Criteria (Detailed)

| Category | What to Check | Red Flags | Evidence to Request |
| --- | --- | --- | --- |
| Learning Impact | Curriculum alignment, assessment quality, measurable gains | “Trust us, it works” claims; no measurable outcomes | Pilot results, mapping to outcomes, sample reports |
| Teacher Workflow | Usability, lesson integration, differentiation | Overly complex dashboards; teacher burden | Teacher demos, workflow mockups, training plan |
| Offline/Connectivity | Caching, sync, offline lessons | Assumes stable Wi-Fi; unclear offline behavior | Offline test results, documented limitations |
| TCO | Full 3–5 year cost breakdown | Hidden renewal costs; vague support pricing | Licensing schedule, support costs, renewal terms |
| Privacy/Security | Data handling, encryption, retention | Vague privacy policy; unclear subprocessors | Data processing documentation, security overview |
| Support | Response times, escalation, term-time coverage | No SLA; “we’ll try” | SLA, support channels, incident process |
| Implementation | Onboarding and setup clarity | Unclear roles; unrealistic onboarding timelines | Implementation plan, timelines, responsibilities |
| Integration | Rostering, exports, interoperability | No data export; incompatible accounts | API/integration docs, export samples |
| Equity & Inclusion | Accessibility features, language support | One-size-fits-all design | Accessibility spec, multilingual options |

Expert Procurement Guidance: What South African Schools Often Miss

1) Not Evaluating Implementation Capacity

A vendor can deliver software, but the school must also be ready to implement. Evaluate whether the vendor supports:

  • user provisioning
  • content roll-out schedules
  • device readiness and classroom routines
  • monitoring and problem resolution

If you lack internal ICT capacity, ensure support services are included—or budget for internal capability building.

2) Treating Training as a One-Time Event

Teacher turnover and curriculum pacing change. Training must include refreshers and onboarding.

Ask:

  • Is there a training refresher included after 3 months?
  • Are training materials available for new teachers?
  • Is there ongoing coaching or office hours?

3) Overlooking Procurement and Governance Processes

In South Africa, schools must follow their governance structures and procurement processes. Vendors should be able to provide:

  • VAT invoices and compliant documentation
  • clear product/service descriptions
  • documentation to support procurement oversight
  • reporting for audit readiness

Procurement friction often causes delays that reduce learning time. Anticipate it by selecting vendors with responsive documentation and clear proposal structures.

For more on how to avoid procurement pitfalls, see: Procurement challenges for South African education institutions and how to avoid them.

Contract and Commercial Evaluation: The Fine Print That Matters

Even after technical evaluation, contracts can undermine your outcomes. Review terms carefully and insist on clarity.

Questions to Ask Before Signing

Use this as a starting set (and combine with the dedicated guide): Key questions to ask before signing an education technology contract.

Common contract issues to evaluate:

  • Renewal terms: price increases, notice periods, and renewal conditions
  • Termination clauses: what happens if the solution fails to meet performance expectations
  • SLA enforcement: whether service-level penalties exist or escalation is defined
  • Data ownership: who owns learner and assessment data
  • Data deletion: timelines and proof of deletion
  • Warranty and returns (for hardware): repair/replace conditions
  • Change management responsibility: who handles content updates and teacher re-training

Performance-Based Clauses (Where Appropriate)

If the vendor offers measurable outcomes, consider:

  • clauses tied to uptime (for platforms)
  • response times (support SLAs)
  • pilot deliverables and acceptance criteria

Even if you can’t fully quantify learning gains in a contract, you can still require proof of implementation delivery and service performance.

Implementation Readiness Checklist for South African Schools

Before procurement finalization, you should ensure your school has the operational readiness to realize value.

Internal Readiness Questions

  • Do we have a named project lead (ICT coordinator or delegated educator)?
  • Are teachers timetabled to use the solution consistently (not only “when there’s time”)?
  • Do we have a device care policy and accountability approach?
  • Can we manage logins, learner rosters, and user lifecycle?
  • Do we have a plan for offline use during connectivity issues?
  • What is our process for handling damaged devices or lost devices?

Vendor Readiness Requirements

  • Implementation plan with milestones and responsibilities
  • Training schedule and materials
  • Support plan with response times and escalation path
  • Offline/low-connectivity documentation and testing evidence
  • Data governance documentation and security documentation

If you want an operational step-by-step view, go deeper with: How to plan a successful EdTech rollout in South African schools.

Funding Alignment: Choosing Vendors That Fit Your Financing Model

Your funding model affects your vendor choice. Some vendors are ideal for short pilots; others are built for multi-year rollouts.

Common Funding Scenarios and Vendor Implications

Scenario A: Grant or Donor-Funded Pilot

In donor-funded contexts, you may need:

  • quick implementation timelines
  • measurable impact reporting for funders
  • documentation and evaluation support

Vendors that can provide evaluation reports and evidence of learning gains typically align better.

Learn more: How donor funding supports EdTech implementation in South Africa and The role of grants in expanding education technology access in South Africa.

Scenario B: School or SGB Budgeted Subscription

In school-funded models, you need:

  • predictable costs
  • low operational burden
  • strong training and self-sufficiency tools

Ask vendors to outline renewal costs and support packages clearly, so the school can plan multi-year sustainability. This connects directly to TCO evaluation and long-term viability.

Scenario C: District-Level Multi-School Rollout

District procurement requires:

  • centralized support and onboarding
  • standardization across schools
  • governance and reporting across multiple sites

The vendor should provide scalable implementation, training materials, and reporting structures.

For broader guidance on funding choices, review: Funding options for education technology projects in South Africa.

Measuring Success After Procurement: KPIs That Matter

A vendor evaluation doesn’t end at contract signature. You should monitor outcomes, adoption, and operational performance continuously.

Recommended KPI Categories

  • Learning indicators
    • assessment improvement over baseline
    • skill mastery progression
    • formative assessment completion rates
  • Teacher adoption
    • weekly active usage by teachers
    • lesson integration consistency
    • perceived usability (structured surveys)
  • Operational performance
    • platform uptime
    • device availability
    • support response and resolution times
  • Equity indicators
    • access and participation by grade/learning support needs
    • usability for learners with barriers
  • Cost indicators
    • cost per active learner
    • support cost per resolved incident

To connect measurement with financial evaluation, consult: How to measure return on investment for EdTech in South Africa.

Practical Examples: How Evaluation Plays Out in Real School Scenarios

Example 1: Literacy Program for Grades 4–6 with Limited Connectivity

A school wants a literacy platform for learners who need remediation. During evaluation, the vendor demonstrates:

  • offline lessons with sync
  • audio support for reading practice
  • teacher dashboards showing specific skill gaps
  • pilot results showing improvement in reading fluency metrics

The school confirms:

  • teachers can run sessions without constant connectivity
  • device charging plan is compatible with daily schedules
  • privacy documents cover learner data retention and export

Result: The vendor wins because offline reliability and measurable learning outcomes are validated.

Example 2: Teacher Support Tool with Strong Analytics but High Teacher Workload

Another vendor’s platform offers advanced analytics but requires manual lesson setup and constant dashboard review. Teachers report that the workflow is unrealistic in the first two weeks.

During the pilot:

  • usage drops after initial excitement
  • teachers struggle to integrate into lesson planning
  • support tickets rise due to onboarding gaps

Result: The school rejects the vendor despite strong analytics, because teacher workflow adoption failure would undermine learning impact.

Example 3: Hardware + Platform Bundle Without a Lifecycle Plan

A vendor offers devices at a good upfront price. However, warranty terms are weak and replacement lead times are unclear.

During evaluation, the school requests:

  • spares strategy
  • repair turnaround commitments
  • device management and asset tracking procedures

The vendor provides only vague answers.

Result: The school avoids long-term cost risk, selecting a vendor with a complete device lifecycle and support plan.

Build a Team: Who Should Participate in Vendor Evaluation?

EdTech evaluation should be cross-functional. A procurement decision made by only one role often misses risks and implementation realities.

Recommended roles include:

  • Principal or delegated leadership: accountability for academic priorities and governance
  • Subject leaders: curriculum alignment and learning impact relevance
  • Teachers (pilot participants): usability and workflow fit
  • ICT coordinator / IT support: offline, device readiness, integrations, and support feasibility
  • Finance and procurement officer / SGB governance: TCO, contract clarity, compliance documentation
  • Data/privacy responsible person: privacy, consent, and data security requirements

If your school lacks specialized staff, ensure training or advisory support is budgeted or requested from the vendor as part of onboarding.

Summary: A High-Confidence Vendor Evaluation Process

Evaluating EdTech vendors is not about picking the most impressive product. It’s about selecting the vendor that can deliver learning impact, practical classroom adoption, reliable service, and responsible data practices within your South African context.

Your Next Steps (Action-Oriented)

  • Define your use case and measurable outcomes before comparing vendors.
  • Use a weighted rubric and require evidence: pilot results, offline testing, and curriculum alignment.
  • Demand total cost of ownership transparency for 3–5 years.
  • Verify privacy, security, and data governance documentation.
  • Design a structured pilot with baseline metrics, offline scenarios, and teacher workflow testing.
  • Review contracts for renewal, data ownership, SLAs, and termination conditions.

If you want to take the process further into implementation mechanics, return to How to plan a successful EdTech rollout in South African schools and build your measurement plan using How to measure return on investment for EdTech in South Africa.

Frequently Asked Questions (FAQ)

How many vendors should we evaluate?

Typically, evaluate 3–5 vendors deeply. Too many options dilute pilot time and increase decision fatigue. Use your scoring rubric to narrow quickly based on non-negotiables like offline support, privacy documentation, and support SLAs.

Should we prioritize learning outcomes or infrastructure first?

Both matter, but prioritize what’s most likely to fail. If your school has weak connectivity, offline performance is often the deciding factor. If connectivity is stable, learning outcomes and teacher workflow usually become the priority.

What if the vendor cannot provide pilot data?

You can still pilot, but require that the vendor supplies:

  • clear success metrics
  • implementation commitments
  • a structured evaluation approach

Without evidence, you’re effectively funding experimentation, so protect yourself with pilot deliverables and acceptance criteria.

Are free trials enough?

Free trials often test the platform without the school’s real conditions. A structured pilot with baseline measures, offline scenarios, and training should replace “trial” as your decision-making tool.
