
AI for Organizations of Persons with Disabilities: A Practical Guide


Image description: A medium shot of a young Black man with a shaved head and a faint mustache, wearing a grey and white horizontal striped short-sleeved t-shirt, sitting at a rustic wooden desk and typing on a silver tablet with a dark grey case and detached keyboard. He is looking at the screen with a focused expression. The tablet screen shows a messaging app with the text "Hello Assistant, I'm writing on behalf of my family and for persons with disabilities in our area to suggest a change of venue for our proposed summit as soon as common awareness of our need in this area grows. Many members of our groups are concerned and cannot make this venue". There is a beige ceramic mug with an off-white liquid and a blue spoon in it on the right, and a handwritten notebook with a blue pen on it. Behind the man, there is a poster with the "EnableMe DISABILITY RIGHTS" logo, a wheelchair icon, and the text "Global Goals for Sustainable Development". The room is dimly lit with natural light coming from a window on the left.

An OPD member writing a prompt on an AI tool (Nano Banana)

Across Kenya, Organisations of Persons with Disabilities (OPDs) already depend on digital tools in their daily work. Meetings are coordinated through WhatsApp, reports are shared online, and consultations increasingly take place on virtual platforms. Technology is already shaping how OPDs organise and communicate.

But Artificial Intelligence (AI) is now shaping how information is created, shared, and accessed.

For OPDs, this presents both an opportunity and a responsibility. Many OPDs operate with small teams, limited budgets, and large communities to serve. At the same time, they are expected to write proposals, participate in policy discussions, and communicate regularly with members across counties.

AI cannot replace the lived experience and leadership of persons with disabilities. But when used thoughtfully, it can help OPDs save time, strengthen advocacy, improve accessibility, and communicate more effectively with their members and partners.

The question for OPDs in Kenya is therefore not whether AI will affect our work; it already does. The real question is how we can understand it, shape it, and use it in ways that advance disability rights and inclusion.

This practical guide, inspired by a session at #ZeroCon26, introduces simple ways OPDs can begin exploring AI tools responsibly: starting small and focusing on areas where AI can make everyday organisational work easier and more effective.

Why AI Matters to OPDs

Organisations of Persons with Disabilities (OPDs) face a paradox: they represent communities with the greatest need for accessible and targeted communication, yet they typically have the smallest teams and the tightest budgets.

AI partially solves this paradox. Here is how:

Accessibility: AI is increasingly powering accessibility tools used by persons with disabilities, including screen readers, text-to-speech, automatic video captions, image-to-text conversion for blind users, emerging sign language avatars, and real-time meeting transcription.

Grant writing and reporting: AI can support OPDs in grant writing and reporting by helping structure proposals, summarizing project activities into reports, suggesting disability-inclusive indicators and M&E frameworks, and improving drafts for clarity, grammar, and persuasiveness.

Research and policy: Organizations of persons with disabilities are increasingly expected to present evidence in policy processes. AI can help by summarising complex documents, synthesising member feedback into position papers, and identifying gaps between Kenya’s disability laws and international standards.

Guide to How Your OPD Can Start

Getting started with AI requires curiosity and a willingness to experiment.

AI is a broad concept, but in this guide the term mainly refers to large language models (LLMs) and related tools. Examples of LLMs include ChatGPT, Claude, and Gemini.

The following practical steps can help an OPD start using AI intentionally and ethically.

 

  • 1

    Step 1: Build AI Literacy

    Before launching an AI-focused project for program participants, begin by strengthening your own AI knowledge and encouraging staff to do the same. Numerous free AI courses are available online that can help build practical skills and improve overall competence.

  • 2

    Step 2: Identify Organizational Needs

    Before adopting AI tools, it is useful for OPDs to reflect on their organizational needs and capacity. Many OPDs operate with small teams where staff handle multiple responsibilities, from communications and proposal writing to member outreach.

    AI can potentially support areas that consume significant time, such as content creation, grant writing, translation for local language engagement, organizing data, drafting routine documents, or producing meeting summaries.

    To make an informed decision, organizations can reflect on questions such as:

    • Where do we currently experience the greatest capacity constraints as a team?
    • Which tasks take a lot of time but follow predictable or repetitive processes?
    • Are there areas where faster communication or reporting would strengthen our advocacy and member engagement?
    • Do we struggle with translating information into accessible formats or local languages for our members?
    • Which routine administrative tasks consume time that could be redirected toward advocacy or program work?

    Reflecting on these questions can help organisations determine whether and where AI might add practical value to their work.

  • 3

    Step 3: Choose One Tool and One Use Case

    Rather than introducing AI across many activities at once, it can be helpful to begin with a single tool applied to one specific task. This allows teams to explore how the tool works, understand its strengths and limitations, and build confidence before expanding its use to other areas.

    Starting small also makes it easier to evaluate whether the tool genuinely improves efficiency or supports the organization’s work.

    The tool guide after Step 6 lists some recommended starting tools and their use cases.

  • 4

    Step 4: Learn to Give AI Good Instructions (Prompts)

    Using AI effectively requires learning how to communicate clearly with it. Unlike a search engine that simply retrieves information, AI responds to instructions and context. The quality of the output often depends on the clarity of the request.

    • Ask Better Questions: Train your team to give the AI context. A vague question will get a vague answer.

    • Use Natural Language: Speak to the AI as if you were giving instructions to a human assistant.

    • Refine and Repeat: If the first answer isn't right, don't give up. Tell the AI what to change and ask it to try again.

     

    A Simple Formula for OPDs

    Role + Context + Task + Format + Tone

    Example of a WEAK prompt:

    "Write about a public participation in Nakuru to inform members."

    Example of a STRONG prompt:

    "You are a communications officer for a Kenyan OPD. Write a 200-word WhatsApp message in English informing our members that the County Assembly will hold a public participation session on the disability services budget on Friday 15th November at County Hall, Nakuru. The tone should be warm and encouraging and include a call to action to attend or send written submissions."
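    For OPDs with a staff member comfortable with basic scripting, the same formula can also be reused as a template, for example for recurring announcements. The Python sketch below is illustrative only: the function name and field values are hypothetical, adapted from the strong prompt example, and it simply assembles the five parts into one prompt string.

```python
# Illustrative sketch of the Role + Context + Task + Format + Tone formula.
# The function name and all field values are hypothetical examples.

def build_prompt(role: str, context: str, task: str, fmt: str, tone: str) -> str:
    """Assemble the five parts of the formula into one prompt string."""
    return (
        f"You are {role}. {context} {task} "
        f"Format: {fmt}. Tone: {tone}."
    )

prompt = build_prompt(
    role="a communications officer for a Kenyan OPD",
    context=("The County Assembly will hold a public participation session on "
             "the disability services budget on Friday 15th November at "
             "County Hall, Nakuru."),
    task=("Write a message informing our members and inviting them to attend "
          "or send written submissions."),
    fmt="a 200-word WhatsApp message in English",
    tone="warm and encouraging",
)
print(prompt)
```

    The assembled string can then be pasted into any of the chat tools listed in this guide; only the context and task fields need to change from one announcement to the next.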

  • 5

    Step 5: Develop Internal AI Guidelines and Policies

    Even a simple one-page policy can help guide how AI is used within an OPD and provide basic safeguards for the organization. It does not need to be complex. Clear, practical guidelines are often enough to support responsible and consistent use.

    An AI policy for an OPD can outline a few clear principles to guide responsible use. These may include:

    • Permitted uses: Clarify the types of tasks where AI may be helpful, such as drafting, summarising, or translating content, while noting that decisions affecting members should always remain human-led.
    • Data protection: Emphasise that personal information about members such as names, identification numbers, or medical details should not be entered into public AI tools.
    • Human review: Ensure that any AI-generated content is reviewed and approved by a staff member before it is shared or published.
    • Transparency: Consider whether and how the organization will acknowledge the use of AI in its communications.
    • Accessibility checks: Confirm that AI-generated materials are reviewed to ensure they meet accessibility standards before being distributed.

    The following is a link to an AI Usage Policy template

  • 6

    Step 6: Advocate for Inclusive AI

    Engagement with AI should go beyond internal organizational use. There is an increasing need to ensure that AI technologies are designed and deployed in ways that are inclusive of persons with disabilities.

    OPDs have an important role to play in raising awareness, contributing perspectives from lived experience, and advocating for AI systems that are accessible, equitable, and responsive to the realities of persons with disabilities in Kenya and across the region.

Tool Guide: Best Starting Use Cases for OPDs

• ChatGPT (chat.openai.com) – Drafting letters, reports, social media posts. Works well in Swahili.
• Claude (claude.ai) – Summarising documents, grant writing support. Strong at nuanced language.
• Google Gemini (gemini.google.com) – Integrated with Google Docs/Drive. Good for members already in Google Workspace.
• Microsoft Copilot (copilot.microsoft.com) – Free with a Microsoft account. Integrated into Word and Outlook if your OPD uses Office 365.
• Otter.ai / Fireflies.ai – Transcribing meetings and interviews. Useful for qualitative data from member consultations.
• ElevenLabs (elevenlabs.io) – AI voice and transcription tool. Converts text to natural-sounding speech for audio newsletters, announcements, and accessible content for members with visual impairments or low literacy. Also transcribes audio to text.
• YouTube auto-captions – Free automatic captions on any video your OPD uploads. Enable in settings.

Risks OPDs Should Be Aware Of

While AI offers significant opportunities to strengthen the work of OPDs in Kenya, its use should be approached thoughtfully. Balancing innovation with responsibility helps ensure that AI supports, rather than undermines, disability inclusion efforts.

  • 1

    Bias and the "Representation Gap"

    AI systems learn from existing data. If that data lacks diversity or representation, the outputs may reflect those same gaps.

    • Stereotypes and exclusion: AI tools may unintentionally reproduce stereotypes or overlook accessibility needs if such perspectives were not well represented in their training data.
    • Global data imbalance: Much of the data used to train AI systems does not reflect the realities of many parts of the world. As a result, AI tools may struggle to fully understand local contexts in Kenya, including cultural dynamics, local languages, and the various barriers (physical, communication, attitudinal, and institutional) that persons with disabilities encounter.
  • 2

    Inaccurate Information and "Hallucinations"

    AI systems generate responses based on probability rather than verified truth.

    • Potential inaccuracies: AI may produce confident-sounding responses that are incomplete or incorrect. This can be particularly problematic when discussing sensitive topics such as legal rights, policies, or disability-related services.
    • Importance of verification: AI-generated outputs should always be reviewed and validated by a knowledgeable person before being shared.
  • 3

    Data Privacy and Security

    OPDs often manage sensitive personal information about their members, including identification details, contact information, and disability-related data.

    • Risks with public tools: Uploading documents or personal information into public AI systems may expose that information beyond the organization’s control.
    • Protecting confidentiality: Personal details such as names, identification numbers, member records, or confidential organizational documents should not be entered into public AI platforms without proper safeguards.

    Maintaining strong data protection practices helps ensure that the rights, dignity, and privacy of persons with disabilities are respected when using AI tools.
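    One practical safeguard is to remove obvious personal identifiers before any text is pasted into a public AI tool. The Python sketch below is a minimal, illustrative example, not a complete anonymiser: the patterns (7-8 digit strings for national ID numbers, Kenyan-style mobile numbers, email addresses) are assumptions and will miss many identifiers, so a person should still review everything before it is shared.

```python
import re

# Minimal, illustrative redaction sketch. The patterns below are
# assumptions (7-8 digit ID numbers, Kenyan-style 07... mobile numbers,
# email addresses) and will NOT catch every identifier. In particular,
# personal names are not detected and must be removed by hand.
PATTERNS = [
    (re.compile(r"(?:\+254|0)7\d{8}\b"), "[PHONE]"),   # e.g. 0712345678
    (re.compile(r"\b\d{7,8}\b"), "[ID NUMBER]"),       # 7-8 digit IDs
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),
]

def redact(text: str) -> str:
    """Replace likely personal identifiers with placeholder tags."""
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

print(redact("Member Jane, ID 12345678, phone 0712345678, jane@example.com"))
```

    Note that in the example above the name "Jane" passes through untouched, which is exactly why automated redaction can only complement, never replace, human review of what leaves the organisation.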

  • 4

    Over-reliance on Technology

    AI is designed to support human work, not replace human judgment.

    • Maintaining human connection: Relying heavily on automated tools for communication or support may weaken the personal relationships that are central to community-based advocacy.

    The use of AI should therefore complement, not replace, authentic storytelling and the lived experiences of persons with disabilities, which remain at the heart of effective advocacy and social change.

    • Building internal capacity: Organizations benefit most from AI when staff understand how to guide and manage these tools responsibly.

     

    The most practical approach for many OPDs is to start small by experimenting with one tool and one task. By learning, experimenting, and engaging with AI early, OPDs can help ensure that technology supports greater accessibility, participation, and disability inclusion.

    Article by: Maryanne Emomeri

