Zimbabwe's National Artificial Intelligence Strategy (NAIS) 2026–2030 is arguably the most ambitious national policy document the country has produced in the digital era. Rooted in Ubuntu philosophy and structured around six interconnected strategic pillars, it presents a credible roadmap for transforming Zimbabwe into Southern Africa's hub for inclusive, homegrown artificial intelligence. It is a document that deserves to be read, debated, and implemented with urgency. It is also a document with a structural flaw serious enough to undermine everything it promises, and that flaw concerns the 52% of Zimbabwe's population that the strategy claims to serve but never truly centres: women.

The NAIS does not ignore women. Gender inclusion appears across multiple pillars: women-led MSMEs receive targeted mention, gender impact assessments are mandated, and women's groups are included in community governance structures. But there is a fundamental analytical difference between a strategy that mentions women and one architecturally designed to include them. Scattered references across a 70-page document do not constitute a gender strategy. They constitute awareness. Without dedicated targets, ring-fenced budgets, gender-specific KPIs, and institutional accountability mechanisms, these commitments will remain aspirational: good intentions dissolved by implementation pressures that always seem more urgent in the moment and always prove more costly in the long run.

The most analytically significant gap concerns data, and it is dangerous precisely because it is invisible. AI systems learn from historical data, and Zimbabwe's historical data is a faithful record of decades of structural gender inequality. Agricultural datasets underrepresent women farmers despite women constituting the majority of Zimbabwe's smallholder agricultural workforce. Financial datasets underrepresent women in the informal economy despite women dominating that sector. The strategy's flagship Project Pangolin plans to digitize national datasets at an unprecedented scale, transformational in potential but deeply concerning in its absence of gender-disaggregation requirements.

Without explicit legal standards mandating gender-disaggregated data collection from the outset, Pangolin risks encoding inequality into Zimbabwe's digital infrastructure with the authority and apparent objectivity of an algorithm. An AI system trained on gender-blind data does not produce gender-neutral outcomes. It produces outcomes that systematically disadvantage women while appearing scientifically impartial, a combination of harm and invisibility that makes it uniquely difficult to challenge and correct.

The most structurally consequential gap, however, is in who builds the AI. Globally, women represent only 22% of AI professionals. In Zimbabwe, gender gaps in STEM participation, compounded by cultural barriers, mean that without deliberate intervention the country's AI systems will be predominantly designed by men, reflecting male priorities and male blind spots. This is simultaneously a justice argument and a quality argument. Research consistently demonstrates that diverse AI development teams produce more accurate, less biased, and more socially useful systems. A strategy that sets no gender targets for its AI Centres of Excellence or Presidential AI Scholarships is not simply being inequitable; it is being strategically inefficient, building its AI future with half its intellectual capacity underutilized.

What Zimbabwe needs is not cosmetic adjustment but a dedicated, standalone Women in AI Framework with real institutional teeth. Women must be positioned as AI builders with a minimum 40% target for female participation in all nationally funded programs. They must be AI beneficiaries with applications in agriculture, health, and finance co-designed with women from inception. They must be AI governors with gender parity mandated across the National AI Council and all Technical Working Groups. And they must be protected from AI harm with a dedicated Gender AI Ombudsperson empowered to investigate algorithmic discrimination with the same urgency applied to cybersecurity threats.

The strategy is built on Ubuntu, "I am because we are," a philosophy that makes the gender argument more powerfully than any policy framework can. An AI transformation that advantages men at the expense of women is not merely unjust within the Ubuntu framework. It is philosophically incoherent. Zimbabwe is building its AI infrastructure largely from scratch, and the window to embed gender equity at the foundation rather than retrofit it later remains open, but not indefinitely. The strategy has laid strong foundations. What it now requires is the honesty to acknowledge that foundations built on unequal ground will crack, and the political will to correct that before the building goes up.