15th May 2026
Written by: Ashton Court

Artificial Intelligence (AI) is now firmly on the agenda for SMEs. For business schools delivering programmes such as the Help to Grow: Management Course, the expectation is clear: AI must be addressed, explained and embedded.

But there is a harder truth that isn't being discussed enough. Most organisations are not failing to adopt AI because they lack access to it - they are failing because they are not operationally ready for it. And, in many cases, that includes the programmes designed to support them.

The conversation has moved - the reality has not

AI has moved faster than the environments it is being introduced into. Across SME cohorts, we consistently see:

  • Strong interest and curiosity

  • Widespread experimentation

  • Very limited sustained adoption

Participants are not short of tools. They are short of:

  • Structured, clean and well-managed data

  • Defined processes

  • Integrated systems

  • Clear use cases tied to their business model

The gap is not technological - it is operational.

The myth that AI is a quick win

There is a growing narrative that AI can be 'plugged in' to improve performance. It cannot. AI doesn't fix:

  • Fragmented customer records

  • Inconsistent reporting

  • Manual workarounds

  • Disconnected systems

In fact, it tends to expose them. When introduced into weak operational environments, AI often creates:

  • Inconsistent outputs

  • Loss of trust

  • Increased complexity rather than reduced effort

AI amplifies what is already there - good or bad.

Myths vs reality

Let’s be direct about what we are seeing in practice.

Myth 1: SMEs just need access to AI tools

Reality: Most already have access. The issue is knowing where and how to apply them meaningfully.

Myth 2: AI adoption is primarily a skills issue

Reality: Skills matter, but the bigger constraint is poor data and unclear processes.

Myth 3: AI can drive immediate productivity gains

Reality: Gains are incremental and dependent on operational maturity.

Myth 4: AI can sit alongside existing systems

Reality: Without integration, AI creates duplication and inconsistency.

Myth 5: Teaching AI is enough

Reality: If participants cannot apply it in their own business context, it does not translate into impact.

Business schools face the same constraints

There is a tendency to position SMEs as the challenge. However, many programme delivery environments face similar issues:

  • Participant and alumni data held across multiple systems

  • Limited visibility of engagement across the lifecycle

  • Reporting that is retrospective rather than actionable

  • Difficulty evidencing long-term outcomes.

This creates a credibility gap.

If programmes are not data-driven in their own delivery, it is difficult to demonstrate what data-driven decision-making looks like in practice.

What we are seeing in practice

To make this concrete, here are a few anonymised examples from recent work:

Example 1: “AI for engagement” - without the data

A programme provider explored using AI to identify disengaged participants. The challenge:

  • Engagement data sat across email systems, event tools and a Customer Relationship Management (CRM) system

  • No consistent definition of “engagement”

  • No real-time visibility.

The outcome: AI models were not the issue — the lack of structured, unified data was.
The focus shifted to consolidating data before AI could be meaningfully applied.
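To illustrate what "consolidating data first" can mean, here is a minimal Python sketch that folds interactions from three disconnected sources into one record per participant and applies a single, shared definition of engagement. The source names, field names and the 90-day window are illustrative assumptions, not details of the programme described above.

```python
from collections import defaultdict
from datetime import date

# Hypothetical exports from three disconnected tools; the field
# names are illustrative, not a real schema.
email_opens = [{"email": "a@sme.co", "date": date(2026, 4, 2)}]
event_attendance = [{"email": "a@sme.co", "date": date(2026, 3, 20)},
                    {"email": "b@sme.co", "date": date(2026, 1, 5)}]
crm_touchpoints = [{"email": "b@sme.co", "date": date(2026, 1, 14)}]

def unify(*sources):
    """Fold every interaction into one record per participant."""
    participants = defaultdict(list)
    for source in sources:
        for event in source:
            participants[event["email"]].append(event["date"])
    return participants

def is_engaged(dates, as_of, window_days=90):
    """One shared definition: any interaction in the last 90 days."""
    return any((as_of - d).days <= window_days for d in dates)

unified = unify(email_opens, event_attendance, crm_touchpoints)
today = date(2026, 5, 15)
disengaged = [p for p, dates in unified.items()
              if not is_engaged(dates, today)]
print(disengaged)  # ['b@sme.co']
```

The point is that the hard part - a shared identifier and an agreed definition of "engaged" - is data and process work, not AI.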

Example 2: “Automated insight” - built on manual processes

An organisation attempted to automate reporting to stakeholders. The reality:

  • Data was manually compiled each month

  • Multiple versions of the same dataset existed

  • No single source of truth.

Introducing AI did not solve the problem.

Standardising data and processes delivered more value than automation alone.
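A minimal sketch of what "a single source of truth" can look like, assuming exports arrive in chronological order and records share a normalisable identifier; the field names and values are hypothetical:

```python
# Hypothetical monthly exports of the "same" dataset; in practice
# these often arrive as spreadsheets from different teams.
export_march = [{"email": "A@sme.co ", "revenue": 120}]
export_april = [{"email": "a@sme.co", "revenue": 125}]

def canonical_key(row):
    """Normalise the identifier so variants collapse to one record."""
    return row["email"].strip().lower()

def latest_version(*exports):
    """Later exports win: one row per participant."""
    truth = {}
    for export in exports:  # assumed chronological order
        for row in export:
            truth[canonical_key(row)] = {**row, "email": canonical_key(row)}
    return truth

truth = latest_version(export_march, export_april)
print(truth)  # one record for a@sme.co, carrying the April figures
```

Only once duplicates collapse deterministically like this does automated reporting on top of the data become trustworthy.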

Example 3: “Personalisation at scale” - without a platform

A team wanted to personalise communications to SME participants using AI. The limitation:

  • No segmentation model

  • Limited behavioural data

  • Disconnected communication tools.

AI could generate content — but had no context.

The missing piece was not AI capability — it was a structured engagement platform.
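As a sketch of the missing segmentation layer, here is a deliberately simple rule-based model; the segment names, fields and thresholds are illustrative assumptions, not a recommended design:

```python
# A minimal rule-based segmentation sketch. Even something this
# coarse gives AI-generated messages some context to work with.
def segment(participant):
    """Assign each SME participant a coarse lifecycle segment."""
    if participant["sessions_attended"] == 0:
        return "not-yet-started"
    if participant["sessions_attended"] >= 6 and participant["growth_plan_submitted"]:
        return "completing"
    return "in-progress"

cohort = [
    {"name": "SME A", "sessions_attended": 0, "growth_plan_submitted": False},
    {"name": "SME B", "sessions_attended": 4, "growth_plan_submitted": False},
    {"name": "SME C", "sessions_attended": 7, "growth_plan_submitted": True},
]

for p in cohort:
    print(p["name"], "->", segment(p))
```

With segments in place, generated content can at least be matched to where a participant is in the programme - without them, personalisation is guesswork.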

The role of CRM and data platforms has changed

The systems underpinning programme delivery are no longer just administrative tools. They are now expected to:

  • Provide real-time insight into engagement

  • Support segmentation and targeted communication

  • Enable longitudinal tracking of SME outcomes

  • Integrate with institutional and government reporting.

This is the foundation on which AI operates. Without it:

  • AI lacks context

  • Outputs are inconsistent

  • Trust is undermined.

Well-structured platforms make AI useful. Poorly structured ones make it obvious why it is not.

The real risk: creating a disconnect between expectation and experience

If AI is introduced at a conceptual level but not enabled operationally, a gap emerges. Participants expect:

  • Practical tools

  • Immediate relevance

  • Clear outcomes.

But what they experience is:

  • Abstract concepts

  • Limited applicability

  • No sustained change.

This is where programmes risk losing impact.

If AI cannot be demonstrated in practice, it becomes theoretical — and quickly forgotten.

A more honest starting point

For most organisations, the starting point is not AI. It is:

  • Data hygiene and consistency

  • Process clarity

  • System integration

  • Visibility of outcomes.

From there, AI can be introduced in targeted, practical ways:

  • Identifying patterns in engagement

  • Supporting decision-making

  • Enhancing communication

  • Improving efficiency.

This is slower — but it works.

Where the SBC network can lead

There is a real opportunity for the Small Business Charter community to take a more grounded approach.

Reframe AI as operational, not theoretical

Position AI within day-to-day business activity — not as a standalone topic.

Embed data literacy as a core capability

Ensure participants understand how data is structured, used and governed.

Modernise delivery infrastructure

Use better systems to:

  • Track engagement in real time

  • Personalise support

  • Demonstrate impact.

Shift from insight to evidence

Strengthen the ability to show measurable outcomes — increasingly critical for government-backed programmes.

The bottom line

AI is not the constraint. The constraint is everything around it.

  • Data

  • Systems

  • Processes

  • Delivery models.

AI will not fix weak foundations. But strong foundations will unlock AI.

Conclusion

AI will play a significant role in SME growth and productivity. But its impact will depend on how well it is integrated into real-world operations — not how effectively it is explained in isolation.

For business schools, the opportunity is not just to teach AI. It is to enable it — through better data, better systems and better delivery.


This blog was written by Ashton Court, one of our sponsors for the Small Business Charter Annual Conference 2026.

Ashton Court works with universities, government bodies and programme providers to design and deliver CRM, data and stakeholder engagement platforms that underpin large-scale initiatives such as the Help to Grow: Management Course programme.