
What Sales Enablement Teams Need to Know About the EU AI Act


Last month, European Union lawmakers voted overwhelmingly in favor of the Artificial Intelligence Act (AI Act). The goal is to establish a comprehensive legal framework for developing and using AI systems.

The AI Act aims to regulate AI systems’ use within the EU. It classifies AI systems into four categories based on risk: unacceptable risk, high risk, limited risk, and minimal risk. High-risk AI systems, such as those used in critical infrastructures or law enforcement, would require strict compliance measures, including risk assessments, data quality and documentation, and human oversight.

The Act prohibits certain AI practices, such as social scoring for surveillance, manipulation, and exploitation, and mandates transparency and accountability for AI developers and users. Additionally, it establishes a European Artificial Intelligence Board to oversee implementation and enforcement across member states. It’s expected to become law by June 2024, and provisions will take effect in stages.

As with the EU’s General Data Protection Regulation (GDPR), the AI Act is expected to guide other governments grappling with how to regulate AI.

The AI Act’s Impact on Software Users

The Act’s passage—at a time when nefarious use of AI has many people worried—was seen as a win for consumers. Deepfakes, automated fraud, and the spread of misinformation, for example, are more prevalent than before because of generative AI (genAI). It makes sense that lawmakers want to protect people from such misuses.

But people who use genAI tools such as OpenAI’s ChatGPT, or software powered by genAI, were left wondering how the AI Act impacts them. After all, nearly every piece of software these days includes AI features. Companies that make software for sales, enablement, marketing, word processing, and more all proudly say they have genAI features. If your sales reps, marketers, or other employees use them, do they have to change their strategies? Do they have to inform their prospects and customers about their genAI usage?

To clarify things, I spoke with Allego’s legal counsel, Richard Raihill, about the impact the EU AI Act will have on sales, marketing, and enablement teams, as well as on companies that purchase genAI-enhanced enablement software such as Allego.

For sales reps and marketers, much of what is included in the EU Act is irrelevant, he said. The biggest impact will be on high-risk providers.

“I don’t think [sales and marketing] strategies will need to be adjusted significantly for most that are not deemed high-risk providers,” Raihill said. “I do think they will need to assess the AI tools they use to ensure they are acting within the allowable bounds of these new regulations.”

That includes confirming whether the software they use comes from a company considered to be a high-risk provider, which can limit how they use AI, he said.

Continue reading to learn more from Raihill about how the EU AI Act impacts enablement teams and their purchase of genAI-enhanced software.

Michelle Davidson: For organizations that purchase/use genAI-enhanced revenue enablement software, how does the EU AI Act impact them?

Richard Raihill: I would say that question is too broad to effectively answer today. It will largely depend on what AI capabilities are included in the software, how it is being used, what genAI Foundation Models are utilized, and how the systems created under the Act evolve.

In general, the guidelines for AI users (or users of products that contain genAI capabilities) are fairly straightforward. For certain high-risk providers, there will be broad limitations and prescriptive guidelines on the ability to utilize genAI, and in particular certain highly sensitive capabilities of the new technologies. For most sales and marketing teams, a lot of this will not be relevant, as much of the data utilized by these teams is often derived from direct historical interaction with a customer (analyzing website behavior, purchase history, etc.).

Some of the most closely regulated AI activities fall outside those direct interactions and rely on biometrics, personality analysis, etc., so to the extent a software tool utilizes those elements, then the sellers and marketers using the tool will absolutely need to be sure to evaluate whether they can keep doing so going forward.

In light of the newly passed AI Act, do B2B sales and marketing teams need to adjust their strategies for using generative AI-enhanced products?

Raihill: I don’t think strategies will need to be adjusted significantly for most that are not deemed high-risk providers. I do think they will need to assess the AI tools they use to ensure they are acting within the allowable bounds of these new regulations. This includes confirming whether they are considered a “high-risk provider,” which can limit how they use AI. For example, if they are using a sentiment analysis tool, and that tool utilizes AI components, they may need to ensure that these do not include biometric components or emotion recognition capabilities. But I am expecting that providers like Allego that work with sales and marketing teams will not trigger higher scrutiny outside of the limited areas scrutinized as “high-risk.”

What adjustments do sales teams need to make? What adjustments do marketing teams need to make?

Raihill: First, they will need to do some scrutiny on their tools that utilize AI to ensure that they are not doing things prohibited by the Act. Again, things like sentiment analysis, personalized recommendation engines, and biometric-powered capabilities will be subject to higher levels of regulatory scrutiny, and so sales and marketing teams will need to evaluate whether the benefits of those capabilities outweigh the risks of running afoul of these regulations. In addition, the issues of data privacy, data location (processing and storage), and data security will be amplified in the context of AI tools, and they should ensure their software providers take those into account as they develop AI-based tools.

Also, as I said, the determination of whether an AI provider is considered high risk or low risk under the Act will likely determine how much scrutiny a seller or marketer should apply to their tool providers, so extra attention should be paid to ensure that analysis is done properly.

Finally, buyers should be extra vigilant in scrutinizing the AI Foundation Providers utilized in the tools they purchase. Foundation Providers should be able to provide data on their bias prevention approach, methods, data sources, algorithm transparency, and liability approach. For home-grown ML [machine learning] models and LLMs [large language models], these answers become even more critical. As the Act begins to be enforced, it is expected that certification systems will be put in place that can take much of this burden off the purchaser.

How important is it for sales teams to ensure humans are at the helm of genAI-enhanced functions?

Raihill: That is an important check on the AI systems and should be a part of any process using these genAI systems to engage buyers. This is especially true at this nascent stage in the lifecycle of AI, as issues like implicit bias and discrimination remain top-of-mind concerns. Note that this doesn’t require a human to “double-check” every output from an AI system, but it may require a human to monitor outputs generally so they can broadly adjust outputs to account for these concerns.

Will sales reps have to disclose if they use genAI-enhanced sales or sales enablement tools?

Raihill: It is more likely that the companies using the tools will be required to disclose certain elements of their AI usage. I don’t expect individual sellers to be required to make disclosures in their day-to-day interactions, although as with any new regulation buyers should monitor any future guidelines relative to these issues that may be developed by the relevant authorities.

If sales reps use genAI-enhanced tools to create content, such as emails, or recommend content, will they have to disclose that?

Raihill: No, not in the normal course, unless they are relying on systems that encompass high-risk activities under the Act to do so. For example, using facial recognition to discern a prospect’s mood, ethnic background, etc. before communicating with them would be a high-risk activity, and would potentially require such notice, if it even were to remain allowed.

If a sales rep shares with a buyer a piece of content that was created using genAI, do they have to disclose that genAI was used?

Raihill: No, again assuming the specific genAI functionality utilized is not a high-risk activity.

Will sellers in certain industries, such as financial services and life sciences, be impacted more than others by the AI Act?

Raihill: It is highly likely that sellers and marketers in certain sensitive industries such as financial services and health care will be more impacted than others. Buyers in those industries should be expected to act with greater care, and perform more due diligence, before buying tools with AI capabilities in general, because of the risk of an AI system exploiting inherent vulnerabilities in their products and customer bases.


About Richard Raihill: Richard Raihill is VP, general counsel & secretary at Allego. As such, he oversees all legal, regulatory, and compliance affairs for Allego.
