The educational technology sector is expanding rapidly, with a growing focus on creating tools for neurodiverse learners. Yet, a fundamental disconnect often exists between the products developed in sterile engineering environments and the dynamic, complex needs of students in the classroom.
Shafaq Bajwa, a data scientist and software engineer, experienced this gap firsthand after founding Techno-Stars, an interactive educational platform, and now works on the front lines as a Learning Support Assistant at Moorcroft School in London.
With a background that spans software development at Dubai Islamic Bank and an MSc in Data Science, Bajwa's journey from building technology to implementing it in a special needs classroom offers a critical perspective. Her experience reveals how the tech industry's core values—scalability, efficiency, and data-driven independence—can clash with the principles of inclusive education, which demand patience, collaboration, and a deep understanding of human context.
The Myth of Independence
A central goal in technology design is to foster user autonomy. However, this focus on individual achievement can overlook the collaborative reality of learning for many neurodiverse students, for whom progress is often built on supported interaction and trust, a core tenet of the Neurodiversity Paradigm.
Bajwa observed this directly in her work. "In the tech industry, products often aim to make users completely independent, but this doesn't always work for neurodiverse students. At Moorcroft, I've learned that real learning happens through trust, guidance, and collaboration," she explains.
The assumption that independence is the ultimate goal overlooks the power of connection. Frameworks like Universal Design for Learning (UDL) emphasize creating multiple pathways for engagement, which often involve structured support rather than isolation.
For many learners, real growth stems from shared experiences. Bajwa notes, "Progress comes from shared experiences and relationships, not isolation. True independence grows through connection and support, not by removing it." This insight suggests that EdTech should function less as a replacement for human interaction and more as a facilitator of it, fostering relationships that build confidence and security.
The Tension of Scalability
The economic model of venture-backed technology, which powered Bajwa’s own startup in a Plan 9 incubator, prizes rapid growth and scalability. This approach is often fundamentally at odds with the patient, resource-intensive, and highly personalized needs of neurodiverse learners. While the impact investing market has grown to $715 billion, the pressure for quick returns remains a powerful force.
"Venture-backed tech models focus on rapid growth and scalability, aiming to reach large markets quickly and show fast returns. However, working with neurodiverse students has shown me that meaningful impact often requires the opposite approach: time, patience, and deep personalization," Bajwa states. This creates a tension, as inclusive education demands empathy and adaptation, a philosophy echoed in the concept of philanthropic "patient capital".
The challenge is to reconcile market demands with human-centered design. "The needs of these learners can't be standardized or rushed, and progress is measured in individual milestones, not user metrics. This creates a tension: scalable products prioritize efficiency, while inclusive education demands empathy and adaptability," Bajwa concludes. This reflects a broader shift toward strategies like disability-lens investing, which prioritizes inclusion alongside financial goals.
Data's Missing Context
Data science excels at identifying “what” is happening but often struggles to explain “why”. In a classroom environment, this limitation can lead to flawed conclusions about student behavior without the essential context provided by human observation. This gap has led to the development of mixed-method approaches like “big-thick blending”, which combines quantitative data with qualitative insights.
Bajwa recounts a scenario where data alone would have been misleading. "In one classroom, a student's data might have shown frequent disengagement, leaving tasks incomplete and avoiding eye contact, which could easily be logged as a lack of focus or motivation. But being there in person, I noticed subtle cues. It wasn't disinterest; it was sensory overload," she says. This experience underscores the need for empathy in data interpretation, a key theme in Human-Computer Interaction (HCI) research, which increasingly uses mixed methods to contextualize user behavior.
By adjusting the environment, engagement improved dramatically. Bajwa reflects, "Data alone would have missed that context; my presence made it clear that the 'why' behind the behavior was emotional and sensory, not behavioral. This experience reminded me that data needs empathy and observation to tell the full story."
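The misreading Bajwa describes can be made concrete. The sketch below shows, in simplified form, how a quantitative-only pipeline would label that student "disengaged", while a blended approach lets a human observation change the interpretation. All names, thresholds, and labels here are illustrative assumptions, not part of any real Techno-Stars or Moorcroft system.

```python
from dataclasses import dataclass

# Hypothetical sketch: fields, thresholds, and labels are illustrative
# assumptions, not a real classroom analytics schema.

@dataclass
class Event:
    student: str
    task_completed: bool
    eye_contact: bool

def label_from_logs(events):
    """Quantitative-only labelling: frequent incompletion reads as 'disengaged'."""
    incomplete = sum(not e.task_completed for e in events)
    return "disengaged" if incomplete / len(events) > 0.5 else "engaged"

def label_with_context(events, observer_note):
    """'Big-thick blending': a qualitative observation can reinterpret the metric."""
    metric_label = label_from_logs(events)
    if metric_label == "disengaged" and "sensory overload" in observer_note:
        # Same observable behaviour, different cause, so a different response.
        return "overloaded"
    return metric_label

events = [Event("A", False, False), Event("A", False, False), Event("A", True, True)]
print(label_from_logs(events))                                  # data alone
print(label_with_context(events, "noisy room: sensory overload"))  # data plus observation
```

The point is not the toy logic but the shape of the pipeline: the human note does not replace the data, it re-labels the same events so that the response (adjust the environment) matches the actual cause.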
Redefining Successful Outcomes
Software development operates in 'sprints' measured in weeks, an agile framework that prioritizes rapid iteration and immediate results. This timeline is starkly different from the pace of human development, where a significant breakthrough can take months of quiet, consistent effort. This disparity exposes the limits of the tech world's preference for instant, visible results.
"Working in education has completely changed how I view progress. In software development, success is often measured by quick deliverables and visible results within short sprints," Bajwa explains.
Her classroom experience has reshaped this perception entirely. She notes that a student's growth can take months of quiet effort, a process better valued by metrics like Social Return on Investment (SROI), which assesses long-term quality of life improvements.
This radical shift in perspective highlights the importance of patience. "This experience has taught me that real success isn't always immediate or measurable, and that patience, consistency, and trust are just as valuable as speed," Bajwa says. It is a lesson in valuing slow, meaningful progress, a process that requires tools designed to support productive persistence over time.
The Invisible User Experience
A developer coding an application sees a logical user journey, but in a classroom, that journey is filtered through a complex layer of emotions, moods, and contextual challenges. This emotional landscape is often invisible during the development process, yet it is the most critical part of the user experience. This gap highlights the value of methodologies that prioritize lived experience.
"The most critical part that's invisible to a developer is the emotional and contextual layer of use. When you're coding, you imagine a logical, consistent user journey, but in reality, every student brings unique moods, needs, and challenges that can change minute to minute," Bajwa says. This aligns with principles from Feminist HCI, which uses personal narratives to uncover biases in design, and is supported by research into universally accessible instructional models.
This realization redefined usability for Bajwa. "I've seen how a child's frustration, excitement, or confusion can completely reshape how they interact with technology. It made me realize that usability isn't just about clean interfaces or smooth performance; it's about emotional safety, flexibility, and human connection," she reflects. To truly understand the user experience, developers must connect with the deeply human behaviors that shape it.
The Value of Productive Struggle
In engineering, a 'failed' process or point of friction is something to be eliminated. The tech industry's drive for seamless, frictionless user journeys reflects this ethos. However, in learning, moments of struggle are often where deep understanding and resilience are forged. By designing away all challenges, technology risks creating passive users instead of active learners.
"The tech industry often designs for smooth, flawless user journeys, minimizing friction so users never get stuck. But in learning, those moments of struggle are where real understanding takes root," Bajwa notes. A key risk in AI-driven education is that students may develop 'learned helplessness' if systems intervene too quickly, preventing them from engaging in productive struggle.
In the classroom, Bajwa has seen how guided challenges build confidence, a nuance that intelligent tutoring systems, which typically flag 'unproductive struggle' for quick intervention, can miss. "Tech's obsession with seamless success can unintentionally erase these growth opportunities. To truly support deep learning, we need tools that make space for safe struggle, reflection, and recovery, not just instant results," she asserts. This means designing for resilience, not just efficiency.
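One way a tool could "make space for safe struggle" is to gate its interventions on time and attempts rather than reacting to the first error. The sketch below is a minimal, hypothetical hint policy; the function name, thresholds, and intervention levels are assumptions for illustration, not the behavior of any real tutoring system.

```python
# Illustrative sketch of a hint policy that preserves a "safe struggle" window
# before intervening. Thresholds and names are assumptions, not a real API.

def hint_level(seconds_stuck: int, attempts: int,
               struggle_window: int = 90, max_attempts: int = 3) -> str:
    """Return how strongly to intervene, favouring productive struggle first."""
    if seconds_stuck < struggle_window and attempts < max_attempts:
        return "none"    # leave room for exploration, mistakes, and reflection
    if attempts < 2 * max_attempts:
        return "nudge"   # a gentle prompt toward the next step, not the answer
    return "guided"      # step in only after sustained, unproductive struggle

print(hint_level(seconds_stuck=30, attempts=1))   # early struggle: no hint
print(hint_level(seconds_stuck=120, attempts=4))  # window passed: gentle nudge
print(hint_level(seconds_stuck=200, attempts=7))  # sustained: guided support
```

The design choice worth noting is that "no hint" is an explicit, deliberate state rather than a failure of the system to respond, which is the inverse of the frictionless-by-default assumption Bajwa critiques.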
The Limits of Digital Sensors
A programmer’s world is mediated through a screen, but a special needs classroom is a rich, multi-sensory environment. The most critical data points are often non-digital cues—a shift in tone, a tensing of muscles, a change in breathing—that current technology cannot fully capture. While research explores real-time physiological measures, studies often lack neurodivergent participants, a significant gap highlighted in reviews of cognitive load research.
"The most critical missing ingredient is the sensory and emotional context. At Moorcroft, I rely on things no sensor or algorithm can yet fully capture: a change in tone of voice, body tension or the way a student's breathing shifts when they're overwhelmed or engaged," Bajwa emphasizes. Emerging systems using Bayesian Immediate Feedback Learning (BIFL) attempt to adapt to physiological signals, but they are still in early stages.
This sensory layer is vital for creating truly adaptive tools. "These signals tell me far more than a data dashboard ever could. They show the emotional state behind the action, not just the action itself," Bajwa says. The future of adaptive technology depends on its ability to interpret these human layers, not to replace empathy, but to inform a more responsive design.
Beyond the Engineering Mindset
An engineering mindset is trained to systematize, streamline, and optimize for efficiency. While invaluable in many contexts, this approach can conflict with the messy, non-linear, and deeply personal nature of human learning. A moment of hesitation or exploration, seen as inefficient by an algorithm, may be a crucial part of a student’s cognitive process.
Bajwa recalls a sequencing game where her instinct was to remove unnecessary steps. "My engineering instinct was to streamline the interface and optimize the process for efficiency. But in that moment, the student needed time to explore, make mistakes, and process each step at their own pace. Rushing or simplifying the task would have robbed them of a valuable learning experience," she remembers.
"That conflict taught me that the engineering mindset of efficiency and predictability has limits: human learning is messy, non-linear, and deeply personal," Bajwa reflects. "Success isn't always about optimization; sometimes it's about patience, presence, and honoring the learner's rhythm." This lesson highlights the need for a more holistic approach to EdTech development, one that balances technical rigor with profound human empathy.
Bajwa’s journey from a technology founder to a classroom aide underscores a critical message for the EdTech industry. Building truly effective tools for neurodiverse learners requires moving beyond metrics of speed and scale. It demands a new paradigm grounded in patience, direct observation, and a deep appreciation for the complex, individual paths that define human learning.