
Exabeam AI Security Expands to ChatGPT, Copilot, Gemini

Summary

– Exabeam has expanded its Agent Behavior Analytics (ABA) product.
– The expansion addresses a lack of direct visibility into employee use of AI assistants.
– This includes an inability to monitor what employees query and what data they share with AI.
– Organizations also cannot see how frequently these AI interactions occur.
– They further lack visibility into the locations from which these AI interactions happen.

The challenge of securing employee interactions with generative AI tools has become a critical priority for modern enterprises. Exabeam is addressing this gap by expanding its Agent Behavior Analytics (ABA) platform to monitor usage of popular assistants such as ChatGPT, Copilot, and Gemini. This move gives organizations the visibility they have been lacking into how these powerful tools are used across their networks.

Without direct oversight, companies remain blind to what employees are querying, what sensitive data might be shared, and the frequency and origin of these interactions. This lack of telemetry makes it impossible to establish a security baseline for normal AI activity, leaving organizations vulnerable to data leaks, policy violations, and sophisticated threats. The expanded ABA capability aims to close this visibility gap by collecting and analyzing detailed logs from these AI applications.

The platform functions by ingesting activity data to build individual user and entity behavior profiles. By understanding typical patterns for each employee, the system can then detect significant deviations that may signal risk. An example might be a user suddenly querying an AI tool for large volumes of proprietary source code, a behavior that would stand out against their established profile. This approach shifts security from a static rule-based model to a dynamic, behavior-focused strategy.
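The baseline-and-deviation approach described above can be illustrated with a minimal sketch. Exabeam's actual ABA models are proprietary and far richer; this hypothetical example simply builds a per-user history of daily AI-assistant query counts and flags any day that deviates from that user's mean by more than a chosen number of standard deviations, which is one simple way such a deviation could be scored.

```python
from collections import defaultdict
from statistics import mean, stdev

class BehaviorBaseline:
    """Toy per-user baseline of daily AI-assistant query counts.

    Illustrative only: flags a day whose query volume deviates from
    the user's historical mean by more than `threshold` standard
    deviations. Real UEBA systems model many more signals (data
    volume, content categories, time of day, location).
    """

    def __init__(self, threshold=3.0):
        self.threshold = threshold
        self.history = defaultdict(list)  # user -> list of daily query counts

    def observe(self, user, daily_count):
        """Record one day of observed activity for a user."""
        self.history[user].append(daily_count)

    def is_anomalous(self, user, daily_count):
        """Return True if today's count stands out against the baseline."""
        counts = self.history[user]
        if len(counts) < 5:  # not enough telemetry to establish a baseline
            return False
        mu, sigma = mean(counts), stdev(counts)
        if sigma == 0:
            return daily_count != mu
        return abs(daily_count - mu) / sigma > self.threshold


baseline = BehaviorBaseline(threshold=3.0)
for day_count in [10, 12, 9, 11, 10, 13, 12]:  # a week of typical usage
    baseline.observe("alice", day_count)

print(baseline.is_anomalous("alice", 11))   # a typical day -> False
print(baseline.is_anomalous("alice", 250))  # sudden spike, e.g. bulk code queries -> True
```

The point of the sketch is the shift the article describes: instead of a static rule ("block more than N queries"), the threshold is relative to each individual's own established pattern, so the same absolute volume can be normal for one user and anomalous for another.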

This expansion is a direct response to the rapid, often ungoverned, adoption of generative AI in the workplace. While these tools boost productivity, they introduce new data exfiltration and compliance risks that traditional security measures are not designed to catch. By applying behavioral analytics to AI interactions, Exabeam enables security teams to identify potentially malicious or negligent activity that would otherwise go unnoticed, allowing for faster investigation and response.

The enhanced ABA offering provides a framework for organizations to safely embrace the benefits of generative AI while implementing necessary guardrails. It represents a proactive step toward managing the shadow AI phenomenon, where employees use applications without official sanction or security review. With this level of insight, companies can move from a position of uncertainty to one of informed control, ensuring their AI adoption is both powerful and secure.

(Source: Help Net Security)
