Microsoft, AI & Small Businesses


Microsoft has announced that Copilot Pro, its premium business Artificial Intelligence (AI) tool, will become available to small businesses on Microsoft 365. This opens up fresh opportunities for SMEs to streamline services and create content. However, it comes as Microsoft is being investigated by the UK Competition & Markets Authority (CMA) over antitrust concerns. What does this mean, and how might it impact AI services?

What is Copilot?

Copilot is Microsoft’s AI tool. The basic version is being rolled out for free as part of the Windows 11 update and is marketed as your chatbot assistant, able to help answer questions, generate content or assist with coding.

A premium version designed for business use, Copilot Pro, is available on Microsoft 365. Access was initially restricted to organisations with a minimum of 300 users, but that requirement has now been dropped, so SMEs and even sole traders can benefit from quality AI tools. You do need a Microsoft 365 Business Standard or Business Premium licence, and you pay a monthly per-user fee for the Copilot service.

Copilot is integrated across the Microsoft 365 apps, so it can be used to automate a wide range of tasks. For example, the Pro version can generate reports in Word, analyse data in Excel or help build the slide deck for a PowerPoint presentation. As such, it can streamline processes and drive productivity.

Why Wouldn’t You Use Copilot Pro?

The additional monthly cost may be a barrier for many SMEs, and the need to upskill the team to make good use of AI features may also limit adoption. Another consideration is that AI depends on gathering user data and behaviours to inform relevant responses, which has implications for user privacy and data sharing.

One of the main considerations is that a Microsoft 365 tenancy must be correctly prepared before Copilot is deployed. Copilot’s default scope is anything within the organisation’s data repository, so without correct configuration of security settings and access privileges, Copilot may surface information that the individual user should not have access to.
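As an illustration of the kind of pre-deployment review we mean, the sketch below uses Python with the msal and requests libraries to pull an inventory of SharePoint sites from the Microsoft Graph API, so that each site’s membership and sharing settings can be checked before Copilot is switched on. It assumes an Azure AD app registration with the Sites.Read.All application permission; the tenant ID, client ID and client secret are placeholders for your own values, and this is a starting sketch rather than a definitive audit.

# Sketch: list SharePoint sites via Microsoft Graph ahead of a Copilot rollout,
# so over-shared sites can be reviewed. The placeholders below must be replaced
# with your own tenant and app registration details (Sites.Read.All consented).
import msal
import requests

TENANT_ID = "<your-tenant-id>"        # placeholder
CLIENT_ID = "<your-app-client-id>"    # placeholder
CLIENT_SECRET = "<your-app-secret>"   # placeholder

# Acquire an app-only access token for Microsoft Graph.
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
headers = {"Authorization": f"Bearer {token['access_token']}"}

# Enumerate every SharePoint site the app can see, following pagination.
url = "https://graph.microsoft.com/v1.0/sites?search=*"
while url:
    response = requests.get(url, headers=headers)
    response.raise_for_status()
    payload = response.json()
    for site in payload.get("value", []):
        # Each entry is a site whose permissions and sharing links should be
        # reviewed before Copilot is allowed to draw on its content.
        print(site.get("displayName"), "-", site.get("webUrl"))
    url = payload.get("@odata.nextLink")

In practice a full review would also cover sharing links, group membership and sensitivity labels; the inventory above is simply a starting point for that work.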

We have also picked up on concerns about the fact that Copilot is an integrated feature: it cannot be uninstalled from Microsoft’s platforms, although it can be disabled (at present). For organisations where privacy and data protection are the top priority, this is an issue.

Why is Microsoft Being Investigated?

Another concern is that Microsoft is currently under investigation by the CMA in relation to its involvement with OpenAI. OpenAI is a research organisation exploring the potential of AI in order to inform the development of the technology. In many respects, Microsoft and OpenAI have formed the perfect partnership: Microsoft has invested in OpenAI, enabling it to scale operations, whilst OpenAI technology has been adopted and integrated into Microsoft products and cloud services.

The issue being investigated is whether this partnership violates antitrust laws, which aim to prevent companies from working together to gain control, influence or exclusivity that undermines market fairness. So, has this strategic partnership limited other companies’ ability to benefit from AI technological developments, or has it impacted consumer choice?

If the CMA decides that Microsoft has breached antitrust laws, the company is likely to be fined. In addition, directors may be disqualified or prosecuted. However, we are unlikely to see Copilot or other AI tools being removed.

If Your Business Does Embrace AI

If your business does embrace AI, we advise that you create an AI strategy to map out:

  • Appropriate ways in which your team can use it to boost efficiency and productivity
  • Limitations or restrictions on use
  • Training and support to equip employees to use it as intended
  • The distinction between open and closed AI platforms
  • Privacy protection

As a starting point, we want to make you aware that AI tools gather data from their users to build knowledge and capabilities. The more a model learns, the better it becomes at simulating human intelligence and nuance.

The source of AI data needs careful control. The mainstream AI chat platforms have come under heavy fire for producing incorrect, inappropriate or copyright-infringing content, and training on reputable data sources is key to the quality of the output. Equally, confidential information should not be entered into publicly available AI tools, where it may be used for training.

It is also worth becoming familiar with the UK government’s 2023 white paper on AI regulation and keeping pace with developments relating to the accountability of businesses.

If you would like advice and support for your company’s software subscriptions, contact us today.
