Description of the fields in the Scan Configuration popup
The below screenshot shows the fields that appear in the Scan Configuration screen.
Please note that not all of these fields are available for all Data Sources.
Name
Set a unique name so that the Data Source is easy to identify.
Credentials
This is a dropdown to select the credentials that have already been configured for the Data Source.
Geographic Location
This is to indicate the physical location of the server the data sits on.
Path
This only needs to be defined if a specific location needs to be scanned.
If left blank the entire Data Source will be scanned.
Data Owner
This is the person who is responsible for the data.
This setting is optional.
If the Data streaming check box is not visible it may be because the license for DDR is not present.
To learn more about getting a license for DDR please reach out to the Getvisibility Enablement Team.

The two types of scan are Trustee Scan and File Scan.
Trustee Scan: provides the list of Users and Groups on a Data Source.
File Scan: provides information about files and folders on a Data Source, including structure and metadata.
Once both scans are completed, the data is processed and the two sets are combined to show who has access to which files.
How to find the list of permissions granted for a Data Source
The required permissions for scanning are documented by Data Source.
For more information please review the list here.
To check the configured permissions for a Data Source, navigate to Administration > Data Sources and click on the hamburger menu.
In the dropdown, click Permissions.
The example below shows the permissions for SharePoint Online.
How to find the history of Scans performed on a Data Source
Go to Administration > Data Sources
Click on a Data Source
Click on the “Last Scan Status” symbol
Alternatively:
Go to Administration > Data Sources
Click on a Data Source
Find the required Hamburger Menu
Click on Scan History
Either of the above options will show the history of scans performed on the relevant Data Source
How to set a specific schedule for a scan.
When a Data Source is added to Getvisibility for scanning, the scan begins automatically.
If a rescan is needed, this can be configured by clicking Administration > Data Sources > (the Data Source that needs a rescan, e.g. OneDrive) > Hamburger menu > Rescan Scheduler.
The default configuration is Does Not Repeat.
By clicking the drop-down menu other options can be chosen:
In this option both the time zone and time of day can be chosen
With this option, in addition to the above configuration, a specific day or multiple days of the week can be selected.
This gives the option to pick a specific day or days each month to run the rescan.
Getvisibility DDR offers a Quick Start option for enabling out-of-the-box data controls
Go to Administration > Quick Start.
Under the Data Controls section, enable predefined DDR rules, such as:
Public Exposure of Personally Identifiable Information (PII).
Monitoring of Payment Card Industry (PCI) data.
Import the desired Control Rules to start monitoring immediately.
A brief description of DDR
Getvisibility's Data Detection and Response (DDR) solution is designed to protect sensitive data by providing near real-time detection and response capabilities. It ensures that data across user environments is constantly monitored and that any potential threats are flagged immediately. DDR focuses on data-centric security, ensuring organisations have visibility and control over their critical information assets.
Real-Time Monitoring: DDR continuously monitors data activities, including access, modification, sharing and deletion, to identify suspicious and malicious events.
Automated Response: DDR sends instant alerts for quick remediation.
Risk Mitigation: It helps ensure compliance with privacy standards such as GDPR, HIPAA, PCI-DSS and CCPA.
AI-Powered Insights: DDR leverages Getvisibility's proprietary AI-mesh models to analyse data context for the best accuracy.
Data Intelligence: It provides dashboards with visibility into sensitive data and the risks to your data.
Data Analysis: DDR identifies all data across unstructured data environments and then classifies it based on its content and context.
Risk Analysis: It evaluates user access, permissions, sharing and data location to identify risks related to your data.
Policy Enforcement: DDR applies predefined and custom security policies to protect data based on its classification and sensitivity.
Incident Response: Upon detecting a threat, DDR generates alerts and enables users to take remediation actions, such as moving files or revoking access.
Full list of file types that can be scanned by DSPM
DOC
DOCX
DOCM
RTF
ODT
ODS
XLS
XLSX
XLSM
PPT
PPTX
PPTM
VSD
TXT
C
H
DESC
CSV
TSV
XML
XHTML
HTML
HTM
EML
MSG
PNG
JPG
JPEG
TIFF
TIF
Below is a list of Data Sources that Getvisibility DDR (Streaming) currently supports:
AWS IAM
AWS S3
Azure AD
Azure Files
Exchange Online
OneDrive
SharePoint Online
Box
Confluence Cloud
Gmail
Google Drive
Google IAM
SMB
LDAP (Windows AD)









Define Scopes: Specify the data sources that will be connected to.
Verify Configuration: Ensure that at least one data source is successfully connected. A green checkmark will confirm the completion.
Once the scan configuration is complete:
Go to Administration > Live Events Streaming to view real-time events.
Monitor Event Activity: Filter events by source, user name, action type (create, update, delete), and event type.
The Overview Page provides a comprehensive view of DDR's performance:
Event Statistics: Displays the number of events by source, such as Google Drive, SharePoint, OneDrive, and Box.
Data Source Activity: Visualizes active data sources and the volume of events generated by each.
Event Timeline: Shows when events occurred, helping identify peak activity periods and anomalies.
The Open Risks section highlights detected threats, categorised by risk type:
Public Exposure: Identifies sensitive files accessible to external users via public links.
External Sharing: Detects files shared outside the organisation, potentially exposing sensitive information.
Internal Over-Sharing: Flags data with excessive permissions within the organisation.
For each risk, DDR provides detailed insights, including the file path, user activity, and recommended remediation steps.
How to configure SMB/CIFS connection for scanning
Navigate to Administration -> Data Sources -> SMB -> New scan
Enter the details of the SMB server to scan
Name: Give a name to the scan to identify it later
Username: The user must be an admin-level account and have access to all the SMB/CIFS shares to be scanned
Password: Password for the admin user
Click on the Folder icon in Path to select a particular share/folder to scan, or leave the path empty to scan all shares
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start file scan to begin scanning
The scan results can be viewed under Dashboard -> Enterprise Search
The connector supports all SMB dialects up to SMB 3.1.1. A quick connectivity check is sketched below.
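Before saving the configuration, credentials and share visibility can be verified from any machine with the Samba client tools installed; a minimal sketch, assuming a hypothetical server and service account:

# Lists the shares visible to the scanning account (prompts for the password).
# Hostname, domain and username are illustrative placeholders.
smbclient -L //fileserver.example.local -U 'EXAMPLE\svc-gv-scan'

If the share list returns without errors, the same details should work in the scan configuration.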
How to configure Atlassian Confluence Cloud connection to scan it.
Log in to your Atlassian account and open the API tokens page
Click Create API token
From the dialog that appears, enter a memorable and concise Label for the token and click Create
Click Copy to clipboard, and save it somewhere secure. It isn't possible to view the token after closing the creation dialog
Navigate to Administration -> Data Sources -> Confluence Cloud -> New scan
Enter the details
Name: Give a name to the scan to identify it later
Username: The email address for the Atlassian account you used to create the token
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin the trustee scanning
The scan results can be viewed under Dashboard -> Access Governance
Click on the icon on the right and select Start file scan to begin the files scanning
The results can be viewed under Dashboard -> Enterprise Search
When a targeted rescan is needed it is possible to scan individual files or a specific selection.
Ensuring that recent changes to files are reflected in the UI.
If new patterns have been added to Pattern Matching.
If new rules have been added in Controls Orchestration.
Files can be sent for rescan individually by clicking on the hamburger menu for that file and selecting “Send to classification pipeline”.
There is also an option to reclassify multiple files at once by selecting them using the tickboxes on the left of the screen.
Once the required files are selected the option to rescan appears on the bottom right of the screen.
How to configure LDAP connection to gather permissions and access rights for groups, users, and other entities (Trustees) on an LDAP server.
Navigate to Administration -> Data Sources -> LDAP -> New scan
Enter the details of the LDAP server to scan
Name: Give a name to the scan to identify it later
Username: The user must be an admin-level account and have access to all the LDAP utilities to be scanned. The username should be entered in the format [email protected]
Password: Password for the admin user
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin scanning
The scan results can be viewed under Dashboard -> Access Governance
How to configure ChatGPT connection for scanning.
Owners can generate an API key in the OpenAI API Platform Portal. Note that the correct Organization must be selected when creating a key, corresponding to the administered workspace. Do not select the owner's personal organization.
Create a new API key:
Settings: Default Project | All Permissions
Note that this must be a new key. Once the Compliance API scopes are granted, all other scopes are revoked.
Reminder: This key can only be viewed/copied once. Store it securely.
Send an email to with:
The last 4 digits of the API key
The Key Name
The Created By Name
The OpenAI team will verify the key and grant the requested Compliance API scopes.
Administrators may then use this key or pass it to a partner for use with the Compliance API.
Workspace IDs can be found on the
Navigate to Administration -> Data Sources -> ChatGPT -> New scan
Provide the Workspace ID and the API key value obtained in the above steps
Click on the Folder icon in Path to select a particular user or GPT to scan, or leave the path empty to scan all
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin the trustee scanning
The scan results can be viewed under Dashboard -> Access Governance
Click on the icon on the right and select Start file scan to begin the files scanning
The results can be viewed under Dashboard -> Enterprise Search
This guide outlines how to configure Microsoft O365 Streaming in environments where Getvisibility’s Data Detection and Response (DDR) platform is deployed on-premise or in a private cloud. The integration enables DDR to receive and act upon real-time Microsoft 365 activity notifications.
Ensure the following prerequisites are in place before starting the integration:
A deployed and operational DDR instance.
A public DNS record pointing to the DDR listener endpoint.
A valid SSL/TLS certificate from a trusted Certificate Authority.
An internet-accessible port 443 (HTTPS) endpoint.
Make sure the DDR webhook endpoint is:
Publicly accessible via a fully qualified domain name (FQDN).
Protected with a valid SSL/TLS certificate.
Accessible on port 443 (HTTPS).
Note: You can use a reverse proxy (e.g., NGROK, NGINX) to securely expose internal services if needed. An external reachability check is sketched below.
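All three requirements can be confirmed at once with a plain HTTPS request made from outside the network; a sketch, assuming a hypothetical FQDN for the DDR listener:

# -v prints the TLS handshake (certificate validity), -I sends a HEAD request.
# ddr.example.com is an illustrative placeholder for your public DNS record.
curl -vI https://ddr.example.com/scan-manager/external/webhooks/notification

A certificate error or timeout here points to the certificate or firewall configuration rather than DDR itself.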
Microsoft recommends restricting webhook traffic to only allow inbound requests from Microsoft Graph servers. This reduces the attack surface and prevents spoofed webhook messages.
Allowlist Required Endpoints:
More info at
⚠️ Action Required: Your firewall or reverse proxy must allow inbound HTTPS traffic from all IP addresses Microsoft uses to deliver change notifications. Regularly update your rules using Microsoft’s published IP ranges.
Scan Analytics shows in-depth information gathered during the scan.
There are two ways to access Scan Analytics, either via the main Analytics Dashboard or via the Data Sources page.
To access the Analytics Dashboards click on the link on the Getvisibility homepage.
In the side bar detailed information regarding the scan of the chosen Data Source can be reviewed.
Clicking on any of the fields in the Sidebar brings up a more detailed view of the data as well as giving the option to Remediate any issues that have been found.
For a more detailed breakdown of Analytics please see .
Getvisibility DDR continuously monitors new files generated through streaming and provides real-time insights
Filter by Streaming: Under Enterprise Search, use the filter scanTrigger=streaming.
View File Details: DDR displays:
File Path: The location of the file in the data source.
Classification: Sensitivity level (Confidential, Highly Confidential, etc.).
Risk Level: Based on context and user activity.
Compliance Tags: Indicators for GDPR, HIPAA, PCI, and other regulations.
Detection Rules: The specific DDR rules triggered by the file.
Incident Response: If a high-risk file is detected, DDR generates an alert and suggests remediation steps, such as quarantining the file or revoking access.
The integration of Data Streaming and File Lineage into the DSPM platform provides a comprehensive solution for real-time data monitoring and tracking across both cloud and on-premises data sources. This enhancement enables organizations to dynamically track file origins, data transformations and movements, and end-usage in real time, strengthening security, compliance, and auditability. By introducing these functionalities, businesses can seamlessly monitor data activities and movements across various data sources, providing up-to-date visibility over the data estate and offering deeper insights into file history for e-forensics use cases and risk mitigation.
By implementing Streaming, we unlock crucial use cases such as File Lineage tracking, and Data Detection and Response capabilities, enabling real-time visibility into data activities. This also builds the foundation for anomaly detection capabilities, frequently requested by customers. For instance, scenarios like a user resetting their password, accessing confidential data, and downloading it can be quickly identified. By providing almost real-time updates and visibility into the data estate, businesses can seamlessly monitor data activities, mitigating risks and improving security.
PRECONDITION:
During cluster installation, network administrators need to open a firewall exclusion for incoming requests to the path:
https://${HOST_DOMAIN}/scan-manager/external/webhooks/notification
where ${HOST_DOMAIN} is the host domain of the DSPM platform installation.
Listed below are the languages supported by the ML (Machine Learning) classifiers, grouped by language pack.
Firewall rules allowing inbound traffic from Microsoft Graph servers.
The requested scope (read and delete)














Host IP Address: The IP Address of the SMB/CIFS server
Domain/Workgroup: The domain or workgroup to which the CIFS/SMB server belongs
Port: 445 is the default port; if the default port is not used, input the correct port number for the SMB protocol








IP Address: The IP Address of the server where the LDAP is installed
Certificate (Optional): If the server to be scanned uses LDAPS (LDAP over SSL/TLS) enter the certificate text here. Otherwise leave it blank
Port: 389 is the default port for LDAP, however for Secure LDAP 636 is used
Use the Global Catalog ports, 3268 (LDAP) and 3269 (LDAPS), in case the standard ports do not allow traversal of the whole LDAP tree
Inactivity: This defines inactive users. Default is 90 days
Search base: This is the point in the LDAP directory where Focus will start searching from. In this example:
DC stands for Domain Component. An attribute used to represent domain levels
aws-gv is the name of the first-level domain
local is the top-level domain
Together, DC=aws-gv,DC=local represents the domain aws-gv.local (a connection check using this search base is sketched below)
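Credentials, port and search base can be checked outside the platform with the standard ldapsearch tool; a minimal sketch using this section's example values (the server address and bind account are illustrative placeholders):

# -H: server URI, -D: bind user, -W: prompt for password, -b: search base
ldapsearch -H ldap://10.10.0.5:389 \
  -D "[email protected]" -W \
  -b "DC=aws-gv,DC=local" \
  "(objectClass=user)" cn

If this returns user entries, the same values should work in the scan configuration.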









Name: Languages in Pack
Chinese: English, Chinese (Simplified, Traditional)
Finnish: English, Finnish
West-Slavic-3: English, Polish, Czech, Slovak
German-Dutch: English, German, Dutch
Nordic-3: English, Danish, Swedish, Norwegian
Hebrew: English, Hebrew
Greek: English, Greek
Korean: English, Korean
Thai: English, Thai
Arabic: English, Arabic
Turkish: English, Turkish
Hindi: English, Hindi
Latin-5: English, French, Spanish, Portuguese, Italian, Romanian
Japanese: English, Japanese
If additional language packs are needed after the initial setup please reach out to support for assistance, as each additional pack is a separate AI model that needs to be added.
The host domain needs to be publicly available on the web.
Ensure that the certificate used is one that is trusted by the Data Source provider. For example, for Microsoft services, more information on the certificates they accept can be found here.
Multitenancy Setup
For the multitenancy setup, ${HOST_DOMAIN} needs to be specified as {{ .Values.clusterLabels.cluster_name }}.{{.Values.clusterLabels.rancher}}.app.getvisibility.com
For Data Detection and Response (DDR) to function effectively, the callback endpoint URL must remain open and accessible beyond just the initial setup phase. DDR relies on real-time event notifications and data stream updates, continuously sent to the callback URL. If the callback endpoint is closed or restricted after setup, DDR will fail to receive critical updates, which may result in:
Delayed or missing alerts on data access, movement, or security threats.
Incomplete monitoring of file lineage and activities, impacting compliance and forensic investigations.
To ensure uninterrupted functionality, organisations must configure their network to allow incoming requests to the callback URL from all necessary data sources.
Additionally, for on-premise deployments, it is critical that the webhook URL is accessible by external resources to receive notifications. If external services cannot reach the callback URL, DDR will not function correctly, leading to missed event detections and security blind spots. Network administrators must ensure the necessary firewall rules and routing configurations are in place to allow external communication with the webhook.
Domain: The Atlassian domain
Click on the Folder icon in Path to select a particular space to scan, or leave the path empty to scan all spaces







Provide the Domain URL, an admin username and its password
Click on the Folder icon in Site and path to select a particular site to scan, or leave the path empty to scan all sites
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start file scan to begin the scanning
The results can be viewed under Dashboard -> Enterprise Search
An admin-level user is required to scan and tag files in SharePoint On-Premise. The user must be a member of the Site Owners group, where they have full control permissions to the SharePoint site.
The default Getvisibility tags need to be created as a new column in their SharePoint. This process is described below:
In SharePoint, navigate to Documents
In the files view, select + Add column
Select Choice and then Next
Give the name as Classification and the choices as: Public, Internal, Confidential, Highly-Confidential. Select Save
Similarly create Compliance and Distribution columns (if required)
Getvisibility and SharePoint's tags are now aligned
When tags are written to SharePoint files automatically over the API, the Modified By field changes to System Account, as the tags are added by Getvisibility.
Getvisibility preserves the Modified date where applicable.
The connector supports SharePoint 2013, 2016, 2019.

In the Policy editor section, find the Select a service section, then choose IAM service, and select Next
In Actions allowed, choose the below actions to add to the policy:
Read > GetUser
Read > GetPolicyVersion
Read > GetPolicy
Read > GetUserPolicy
List > ListUserPolicies
List > ListAttachedGroupPolicies
List > ListAttachedUserPolicies
List > ListGroups
List > ListUsers
List > ListGroupsForUser
For Resources, choose All and select Create policy to save the new policy (a CLI equivalent is sketched below)
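For reference, the same policy can also be created from the AWS CLI; a sketch, assuming the CLI is configured with sufficient privileges (the policy name is illustrative):

# Write a policy document listing exactly the actions above, then create it.
cat > iam-connector-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "iam:GetUser",
        "iam:GetPolicyVersion",
        "iam:GetPolicy",
        "iam:GetUserPolicy",
        "iam:ListUserPolicies",
        "iam:ListAttachedGroupPolicies",
        "iam:ListAttachedUserPolicies",
        "iam:ListGroups",
        "iam:ListUsers",
        "iam:ListGroupsForUser"
      ],
      "Resource": "*"
    }
  ]
}
EOF
aws iam create-policy --policy-name iam-connector-policy --policy-document file://iam-connector-policy.json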
Sign in to the AWS Management Console and open the IAM console with the appropriate admin level account
In the navigation pane on the left, choose Users and then choose Create user
On the Specify user details page, under User details, in User name, enter the name for the new user, example iam-connector-user and select Next
On the Set permissions page, select Attach policies directly and choose the policy created in above steps
Select Next
Once the user is created, select it, and from the user page, choose Create access key
Select Other then Next
Enter a description if you wish and select Create access key
The Access and Secret Access Keys have now been created. These can be downloaded as a CSV, and also copied from this section. NOTE: the secret access key cannot be viewed once you leave this page
Navigate to Administration -> Data Sources -> AWS IAM -> New scan
Provide the access key and secret access key values generated in the above steps
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin the scanning
The scan results can be viewed under Dashboard -> Access Governance
How to configure Google IAM connection to gather permissions and access rights for trustees.
Create a Project in Google Cloud Console:
Go to the Google Cloud Console
Create a new project or select an existing project
Enable the Admin SDK:
In the Google Cloud Console, navigate to the "APIs & Services" > "Library"
Search for "Admin SDK" and click on it
Create OAuth 2.0 Credentials:
In the Google Cloud Console, go to APIs & Services > Credentials
Click "Create credentials" and select "Service account"
From your domain's Google Admin console, go to Main menu > Security > Access and data control > API controls
In the Domain wide delegation pane, select Manage Domain Wide Delegation
Click Add new
In the Client ID field, enter the client ID obtained from the service account creation steps above
In the OAuth Scopes field, enter a comma-delimited list of the scopes required for the application
Use the below scopes:
https://www.googleapis.com/auth/admin.directory.user.readonly
https://www.googleapis.com/auth/admin.directory.domain.readonly
https://www.googleapis.com/auth/admin.directory.group.readonly
https://www.googleapis.com/auth/admin.directory.rolemanagement.readonly
Navigate to Administration -> Data Sources -> Google IAM -> New scan
Enter the details of the OAuth2 credentials obtained previously
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin scanning
The scan results can be viewed under Dashboard -> Access Governance
This document provides information on how to configure a Gmail connection for the Focus product.
Create a Project in Google Cloud Console:
Go to the Google Cloud Console
Create a new project or select an existing project
Enable the Gmail API:
In the Google Cloud Console, navigate to the "APIs & Services" > "Library"
Search for "Gmail API" and click on it
Create OAuth 2.0 Credentials:
In the Google Cloud Console, navigate to the "APIs & Services" > "Credentials" tab
Click "Create credentials" and select "Service account"
From your domain's Google Admin console, go to Main menu > Security > Access and data control > API controls
In the Domain wide delegation pane, select Manage Domain Wide Delegation
Click Add new
In the Client ID field, enter the client ID obtained from the service account creation steps above
In the OAuth Scopes field, enter a comma-delimited list of the scopes required for the application
Use the below scopes:
For scanning
https://www.googleapis.com/auth/admin.directory.user.readonly
https://www.googleapis.com/auth/gmail.readonly
For tagging
This document provides information on configuring an AWS S3 connection with real-time event monitoring and data streaming.
To enable DDR (Streaming) for an existing AWS S3 scan, follow these steps:
Existing AWS S3 connection:
An AWS S3 scan configuration must already exist.
If an AWS S3 scan has not been configured yet, follow this guide to configure one and ensure the necessary credentials are set up.
Extend AWS S3 policy permissions to allow data streaming:
A separate set of permissions is required for the AWS SNS service:
Go to the Scan Configurations page in the product UI.
Select AWS S3 and create credentials for AWS S3
Find the AWS S3 scan configuration and select Edit Configuration from the options menu.
To validate that streaming events are coming through the system, check Administration -> Live Events -> Streaming
This guide provides steps on how to enable real-time data streaming for a Google Drive connection and monitor streaming events within the Getvisibility platform.
From the Data Sources page, select Google Drive from the list of available data sources. In the Scan Configurations list, create a New Configuration.
Make sure the connection has a Name and Credentials set, then click the Data streaming toggle and click Save & Close to finalize the changes
Clock icon: When data streaming is being activated, the "Requested" status will appear, indicating that the subscription is being processed. Once the subscription is activated, this status will change to "On".
After enabling Data Streaming, the system will automatically handle the subscription to Google Drive’s real-time events. There is no need to manually configure Webhooks.
After the subscription is activated, real-time events will start flowing into the platform, and can be monitored from the relevant parts of the platform.
Go to the Live Events section under Administration to view a detailed audit log of all streaming events.
Filter by source to get only Google Drive events
Overview
Extended streaming events provide deeper insights into file activities within Google Drive by leveraging the admin.reports.audit.readonly permission. This allows the system to capture additional event types beyond standard data streaming, such as file permission changes. These events are crucial for comprehensive monitoring, alerting, and data lineage tracking within the platform.
Prerequisites
Before enabling extended streaming events, ensure that:
The required permission https://www.googleapis.com/auth/admin.reports.audit.readonly is granted to your Google Drive connection.
You have followed the delegation process as outlined in the Getvisibility documentation.
Enabling Extended Streaming Events
If the necessary permission was not granted at the time of the initial streaming subscription, click on unsubscribe and then re-subscribe to streaming events from the Data Sources view.
Steps:
Go to the Data Sources section under Administration.
Locate the Google Drive connection.
If extended streaming is not enabled, uncheck the "Data streaming" box to unsubscribe from streaming events.
Ensure that the admin.reports.audit.readonly permission has been granted to the connection.
Monitoring Extended Streaming Events
Once extended streaming is enabled, events will be available for monitoring in multiple sections of the platform:
Live Events Section
Go to Live Events under Administration to view real-time extended events.
Use the filter options to narrow down events to only Google Drive activities.
Extended events such as permission changes, sharing modifications, and file deletions will be listed.
Data Lineage Tracking
Extended events are integrated into Data Lineage, providing a clear visualization of file activity over time.
Users can track who performed actions on a file and when, enabling forensic investigation and compliance tracking.
Alerting and Monitoring
Alerts can be configured for specific event types such as sensitive file shared externally, file permissions changed, or file deletion.
These alerts help organizations proactively detect potential security risks or data leaks.
This guide walks you through enabling real-time data streaming for a Box connection and how to monitor live streaming events within the Getvisibility platform.
If you haven't created Box credentials yet, follow this guide to create them and ensure the necessary credentials are set up.
Data streaming needs Manage webhooks scope, as shown below
Confirm that Manage Webhooks are present in App Scopes
Exit the Dev Console and switch to the Admin Console
From the Data Sources page, select Box from the list of available data sources. In the Scan Configurations list, create a New Configuration.
Make sure the connection has a Name and Credentials set. Then select the Path icon.
Click on the Folder icon in the Path field to select the folder you want to monitor for real-time events.
Magnifying glass icon: Folders with this icon next to them indicate that real-time events can be subscribed to from this directory.
After selecting the folder, click Save & Close to finalize the changes.
You can view the configured webhook in your Box Dev Console
Login to your Box account and navigate to
Select the Box app configured in previous steps
Navigate to Webhooks tab, here you can see the list of configured webhooks
After the subscription is activated (green magnifying glass icon), real-time events will start flowing into the platform, and you will be able to monitor them from various sections of Getvisibility.
Navigate to the Live Events section under Administration to view a detailed audit log of all streaming events.
In this section, you can filter and view event details.
How to create an iManage Connector app to connect to iManage accounts in the cloud.
Registering an iManage App
To register an iManage App you need to contact iManage support by sending an email to [email protected]
Once an account is created, log in to iManage
Click on username in the upper right corner and click Control Center
Note: Only users with admin role have access to Control Center
Go to the Applications menu item, click Desktop Auth Client and find Client ID
Customer ID should be provided by iManage admins, but if it is not provided, it can be retrieved from the /api response
Get Access Token
Get Customer ID
Click on the Folder icon in Path to select a particular path to scan, or leave the path empty to scan all
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin Trustee scanning
The scan results can be viewed under Dashboard -> Access Governance
Click on the icon on the right and select Start file scan to begin file scanning
The results can be viewed under Dashboard -> Enterprise Search
This guide explains the features available from the files Explore page on the Focus platform.
From the Getvisibility Dashboard, select the 'Enterprise Search' section.
Search: Use the search bar to type in a file or folder name to find specific items.
Utilise the dropdown menus to filter the files by:
File Extension: Choose from a list of file types.
Source: Select the source of the files.
Each file listed will have details displayed such as:
Path: The directory path of the file.
Category and Subcategory: The classification categories of the file.
Classification: The sensitivity level of the file.
Compliance Tags: Any compliance-related tags assigned to the file.
All available columns will be listed:
Use the 'EXPORT' button to create a CSV or JSONL download of the filtered data
Click 'CLEAR SEARCH' to reset all filters and search criteria.
To view or modify the permissions and access rights of a file, click on the Actions menu.
A detailed list will appear showing the Security Identifier (SID), the common name associated with that SID, the organizational unit, domain, and the specific permissions granted.
To adjust the machine learning classification of a file, click on the pencil icon under the 'Actions' column.
A dialog box titled 'Verify ML Classification' will appear, allowing you to change:
The 'Category' and 'Subcategory' of the document.
The 'Compliance' tags associated with the file.
The 'Classification' level (e.g., Public, Internal, Confidential).
Confidence levels for the category and subcategory will be displayed, providing insight into the ML model's certainty regarding its classification.
Toggle the 'Switch to advanced search' to use GQL (Getvisibility Query Language). Please refer to this for full guidance on using GQL.
Overview of Lineage
Data Lineage in Getvisibility provides a comprehensive view of a file's lifecycle, tracking its origin, movement, transformation, and usage. This enhances security, compliance, and forensic investigations by offering end-to-end visibility into data activities.
Traditional data monitoring provides static snapshots, which quickly become outdated, especially for large datasets. Real-time lineage addresses this by:
Reducing Dependency on Rescans: Once streaming is enabled, changes are captured instantly.
Improving Visibility: Organizations can see data movements in near real-time.
Enabling Faster Incident Response: Security teams can quickly assess and respond to threats.
Data Lineage was developed to enable forensic investigations, ensuring organisations can:
Investigate Incidents: Identify the root cause of security incidents, such as data breaches or unauthorised sharing.
Enhance Compliance: Maintain audit trails for regulatory requirements.
Support Risk Mitigation: Quickly respond to suspicious activities and apply appropriate remediation actions.
Connection to Each Data Source: Ensure that each Data Source to be monitored has been configured in Getvisibility.
Enabling Streaming: Activate real-time event streaming for each connector.
From Enterprise Search: Select a file and click on "Lineage" in the dropdown.
From Open Risks: Identify a flagged file and expand the side menu.
Event Type (Create, Modify, Delete, Share, Move, etc.)
Data Source
User Activity
Export lineage details to CSV for auditing and reporting.
Green: Normal activity
Yellow: Medium-risk events (e.g., permission changes)
Red: High-risk events (e.g., external sharing)
Lifecycle: Displays the complete lifecycle of a file from creation to current state.
Event Timeline: Chronological list of all file-related actions.
User & Device: Shows which users and devices interacted with the file.
File Path: Original and current locations of the file.
Create
Modify
Delete
Change Permissions
Share
Move
Copy
Google Drive: Audit log events available.
Azure (SharePoint Online, OneDrive, Blob, Files): Audit log events supported.
Box & Confluence: Extended events available in regular logs.
AWS S3, SMB, Dropbox: Limited to Create, Modify, and Delete.
Lineage supports forensic investigations, such as:
External Sharing Investigation: When a file is shared externally, security analysts can trace its history to determine if the action was intentional or accidental.
Suspicious Activity Investigation: If a user accesses and downloads sensitive information after a password reset, lineage provides detailed insights.
Incident Response: Analysts can determine what actions to take, such as revoking access, quarantining files, or addressing user behaviour.
Enterprise Search: Select the file, click the dropdown, and choose "Lineage."
File View: Expand the file details and navigate to the "Lineage" tab.
Event Description: Hovering over event icons shows a brief description.
Export: Export the entire lineage history, including metadata, to CSV for audit trails and reporting.
Data Lineage empowers organisations with real-time visibility, advanced threat detection, and comprehensive forensic capabilities, ensuring sensitive data remains secure and traceable.
This guide details how to create and configure an iManage connector to scan an on-premise iManage Work Server.
To connect Forcepoint DSPM to your iManage server, you will need to gather three key pieces of information:
Your Server's URL: The fully qualified domain name of your iManage server (e.g., imanage.mycompany.com).
An Application Client ID: A unique ID from your iManage Control Center that identifies the Getvisibility application.
Your Customer ID: The numeric ID for your iManage instance, retrieved via the API in Part 2.
This guide provides steps on how to enable real-time data streaming for a Dropbox connection and monitor streaming events within the Getvisibility platform.
This guide provides steps on how to enable real-time data streaming for a SharePoint Online connection and monitor streaming events within the Getvisibility platform.
This guide provides steps on how to enable real-time data streaming for an Azure AD connection and monitor streaming events within the Getvisibility platform.
This guide provides steps on how to enable real-time data streaming for an SMB connection and monitor streaming events within the Getvisibility platform.
This guide provides steps on how to enable real-time data streaming for a OneDrive connection and monitor streaming events within the Getvisibility platform.
Upload
Download


















Risk: Filter files based on content and the amount of user access to the file.
Keyword Hits: Filter by specific keywords using regex (regular expressions).
Group: Select a specific group that files may be associated with, such as Sensitive.
Category and Subcategory: Filter files based on predefined categories and subcategories.
Classification: Choose the classification level of the files (e.g., Public, Internal, Confidential).
Alias: Filter files by their Data Source aliases.
Compliance: Select a compliance regulation to see relevant files.
Flow: Filter by the data flow process (catalogued or classification).
Trustees: Filter files accessible by specific users or groups.
Created and Last Modified: Timestamps indicating when the file was created and last modified.
Subcategory Confidence: A confidence score indicating the accuracy of the subcategory classification.
Ingested Time: The timestamp when the file was ingested into the system.
Actions: Icons indicating possible actions to take on the file.
File Size: The size of the file.
These details can be configured by selecting the Column Configuration menu here:
After making changes, click 'SAVE' to apply them.













{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "SNSScoped",
"Effect": "Allow",
"Action": [
"sns:CreateTopic",
"sns:DeleteTopic",
"sns:TagResource",
"sns:SetTopicAttributes",
"sns:Subscribe",
"sns:ConfirmSubscription"
],
"Resource": [
"arn:aws:sns:*:876326936841:s3-event-topic-*"
]
},
{
"Sid": "S3BucketNatification",
"Effect": "Allow",
"Action": [
"s3:PutBucketNotification"
],
"Resource": "*"
}
]
}
Click the "Enable" button to enable the Admin SDK API for your project
Enter a name in the Service account name field and click CREATE CREDENTIALS
Under "Grant this service account access to the project," select role as Owner and click DONE
Select the newly created service account and click Keys > Add Key > Create new key
Make sure the key type is set to json and click CREATE
The new private key pair is generated and downloaded to the machine. Note the values of private_key, client_email and client_id
https://www.googleapis.com/auth/admin.directory.rolemanagement.readonly
Click Authorize. A CLI alternative to the console steps above is sketched below.
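The service account steps above can also be done with the gcloud CLI; a sketch, assuming an existing project (the service account name and project ID are illustrative placeholders):

# Enable the Admin SDK, create the service account and download a JSON key.
gcloud services enable admin.googleapis.com
gcloud iam service-accounts create gv-iam-connector --display-name="Getvisibility IAM connector"
gcloud iam service-accounts keys create key.json \
  --iam-account="gv-iam-connector@YOUR_PROJECT_ID.iam.gserviceaccount.com"
# key.json contains the private_key, client_email and client_id values noted above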












Click the "Enable" button to enable the Goolge Drive Activity API for your project
Enter a name in the Service account name field and click CREATE AND CONTINUE
Under Grant this service account access to the project, select role as Owner and click DONE
Select the newly created service account and click Keys > Add Key > Create new key
Make sure the key type is set to json and click Create
The new private key pair is generated and downloaded to the machine. Note the values of private_key, client_email and client_id
https://www.googleapis.com/auth/gmail.modify
https://www.googleapis.com/auth/gmail.labels
https://www.googleapis.com/auth/gmail.metadata
Click Authorize







Check the "Data streaming" box again to re-enable streaming with extended event tracking.
Verify the status of the subscription to ensure it is active.






In the Admin Console, go to Apps > Integration > Platform Apps Manager and locate the newly created app, then click the View button
Note: if Manage webhooks v2 is not visible in the list, create new Box credentials
Clock icon: When data streaming is being activated, the clock icon will appear, indicating that the subscription is being processed. Once the subscription is activated, this icon will change to a green magnifying glass.
After enabling Data Streaming, the system will automatically handle the subscription to Box’s real-time events. There is no need to manually configure Webhooks.









Go to the Roles menu item and set the following:
Select Global Management to set up admin roles. Enable the necessary options.
Select Library-level Management to set up library roles
Permissions required
For scanning
System Access > Read-only
To move files
Document > Delete
To revoke permissions
System Access > Not Read-only
For tagging
Document > Import / Create
Navigate to Administration -> Data Sources -> iManage -> New scan
Provide the Customer ID, Client ID, username, password and domain values












Browse to App Registration and select New registration
On the App Registration page enter the below information and click the Register button.
Name: (Enter a meaningful application name that will be displayed to users of the app)
Supported account types:
Select which accounts the application will support. The options should be similar to the below screenshot.
“Accounts in this organizational directory only” can be selected:
Leave the Redirect URI empty and click Register
Note the Application (client) ID, Directory (tenant) ID values
Navigate to Manage -> Certificates and secrets on the left menu, to create a new client secret
Provide a meaningful description and expiry to the secret, and click on Add
Once a client secret is created, note its Value and store it somewhere safe. NOTE: this value cannot be viewed once this page is closed.
Navigate to Manage -> API permissions on the left menu, and Add a permission
Select Microsoft APIs -> Microsoft Graph
Select Application permissions
Permissions required
Scanning only:
Microsoft Graph > Application permissions > AuditLog > AuditLog.Read.All
Microsoft Graph > Application permissions > Directory > Directory.Read.All
Once all the required permissions are added, click Grant admin consent
Navigate to Administration -> Data Sources -> Azure AD -> New scan
Provide the Directory (tenant) ID, Application (client) ID and Client Secret value generated in the above steps from the Azure application
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin scanning
The scan results can be viewed under Dashboard -> Access Governance
This guide will walk you through the steps for your iManage administrator to find this information and how to use it to configure the connector.
Before you begin, ensure the Forcepoint DSPM server has network access to your on-premise iManage server's API. You may need to configure internal firewall rules to allow this connection.
Before you begin, ensure you have the following:
Administrative access to your on-premise iManage Control Center.
The fully qualified domain name (hostname) of your on-premise iManage server (e.g., imanage.mycompany.com).
A dedicated iManage service account with a username and password.
This step must be performed by your internal iManage administrator.
Log in to your on-premise iManage server.
Click on your username in the upper-right corner and select Control Center.
From the side menu, navigate to Applications.
Select Desktop Auth Client from the list.
Copy the Client ID value. This ID is used to identify the Forcepoint DSPM application to your iManage server. You will need this for Part 2 and Part 4.
You can use a command-line tool like curl to perform these one-time steps. Replace your.imanage.server.com with your on-premise server's actual hostname in the commands below.
A. Get Access Token
Run the following command in your terminal. Be sure to replace the placeholder values (YOUR_USERNAME, YOUR_PASSWORD, YOUR_CLIENT_ID) with your actual service account credentials and the Client ID from Part 1.
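A minimal sketch of such a command, assuming the standard iManage Work OAuth2 password-grant token endpoint (the exact path can vary between iManage versions, so confirm it with your iManage administrator):

curl -s -X POST "https://your.imanage.server.com/auth/oauth2/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=password" \
  -d "username=YOUR_USERNAME" \
  -d "password=YOUR_PASSWORD" \
  -d "client_id=YOUR_CLIENT_ID"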
The JSON response will contain your access_token.
B. Get Customer ID
Run the next command, replacing YOUR_ACCESS_TOKEN with the access_token value you received from the previous step.
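A minimal sketch of that request; as noted in Part 1, the Customer ID can be retrieved from the server's /api response:

curl -s "https://your.imanage.server.com/api" \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN"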
The JSON response will contain your customer_id.
This is performed in the iManage Control Center to grant the service account the necessary permissions.
Navigate to Control Center > Roles.
Create or edit the role assigned to your service account.
Grant the following privileges:
For Scanning: System Access > Read-only
For Tagging: Document > Import / Create
For Moving Files: Document > Delete
For Revoking Permissions: System Access > Not Read-only
In the Forcepoint DSPM, navigate to Administration > Data Sources.
Find iManage in the list and click New Scan.
Fill in the connector configuration fields:
Field | Value | Description
Name | My On-Prem iManage | A friendly name for this connection.
Customer Id | (ID from Part 2B) | The numeric Customer ID for your instance.
Username | (Service Account) | The iManage service account username.
Password | (Service Account) | The service account password.
Click Save.
Find your newly configured iManage connection in the list.
Click the ... (three-dot) menu on the right.
Select Start trustee scan to scan permissions (Optional).
Once the trustee scan is complete (optional), click the ... menu again and select Start file scan to scan content.
Permission and access issues can be viewed in Dashboard > Access Governance (if you ran the trustee scan).
File classification and content results can be viewed in Dashboard > Enterprise Search.
In the Policy editor section, find the Select a service section, then choose S3 service, and select Next. Once the S3 service permissions are added, move on to the IAM service
In Actions allowed, choose the below actions to add to the policy:
For scanning
IAM service
Read > GetUser
Read > GetPolicyVersion
Read > GetPolicy
Read > GetUserPolicy
List > ListUserPolicies
List > ListAttachedUserPolicies
S3 service
Read > GetBucketAcl
Read > GetBucketLocation
Read > GetObject
EC2 service
List > DescribeRegions
For revoke permissions (S3 service)
Permission Management > PutBucketAcl
Permission Management > PutObjectAcl
For tagging (S3 service)
Write > DeleteObject
Write > PutObject
Tagging > DeleteObjectTagging
For Resources, choose All and select Create policy to save the new policy (a CLI equivalent is sketched below)
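For reference, a scanning-only version of this policy can be created from the AWS CLI; a sketch with an illustrative policy name (extend the Action list with the revoke and tagging permissions above if those features are needed):

cat > s3-connector-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "iam:GetUser",
        "iam:GetPolicyVersion",
        "iam:GetPolicy",
        "iam:GetUserPolicy",
        "iam:ListUserPolicies",
        "iam:ListAttachedUserPolicies",
        "s3:GetBucketAcl",
        "s3:GetBucketLocation",
        "s3:GetObject",
        "ec2:DescribeRegions"
      ],
      "Resource": "*"
    }
  ]
}
EOF
aws iam create-policy --policy-name s3-connector-policy --policy-document file://s3-connector-policy.json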
Sign in to the AWS Management Console and open the IAM console with the appropriate admin level account
In the navigation pane on the left, choose Users and then choose Create user
On the Specify user details page, under User details, in User name, enter the name for the new user, example S3-connector-user and select Next
On the Set permissions page, select Attach policies directly and choose the policy created in above steps
Select Next
Once the user is created, select it, and from the user page, choose Create access key
Select Other then Next
Enter a description if you wish and select Create access key
The Access and Secret Access Keys have now been created. These can be downloaded as a CSV, and also copied from this section. NOTE: the secret access key cannot be viewed once you leave this page
Navigate to Administration -> Data Sources -> AWS S3 -> New scan
Provide the access key and secret access key values generated in the above steps
Click on the Folder icon in Path to select a particular bucket to scan, or leave the path empty to scan all buckets
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start file scan to begin the scanning
The results can be viewed under Dashboard -> Enterprise Search
Create a credentials name and copy the Redirect URL - it will be needed later. The App Key and App Secret fields will be filled in later, once we register a Dropbox App.
Login to Dropbox
Go to Dropbox App Console and click Create app
On the App Creation page enter the below information and click the Create app button
Choose an API: Most applications will use "Dropbox API"
Choose Access Type: Select "Full Dropbox" for complete access.
Name Your App and click Create app: Enter a name that will be visible to users.
Go to the Settings tab and find app key and secret above the OAuth 2 section
We need to set proper permissions for Dropbox app. Below you can find a list of required permissions:
For scanning
Files and Folders > files.metadata.read, files.content.read
Collaboration > sharing.read
Go to the Permissions tab of the newly created App and set the following:
Account Info: account_info.read
Files and Folders: files.metadata.write, files.metadata.read, files.content.write, files.content.read
Once permissions are set click Save button located on the black snackbar at the bottom of the window.
Go back to the Settings tab and scroll to the Redirect URI section. Paste the link copied from the Dashboard and click Add
Then copy the App key from the Dropbox App settings page and put it into the App key field in the Dashboard Create connection form. Do the same for the App secret.
Once done click Authorize with Dropbox button as below:
You'll then be redirected to a page asking you to trust the application - click Continue
You'll then see the list of permissions the app will be granted - click Allow
Once done you'll be redirected back to the Dashboard page with a success message as below:
Connection has been configured successfully

If a specific path has not been set, the entire Data Source will be scanned.
Metadata (path, size, format, etc.) and permissions are extracted and recorded for each file.
This step ensures that every file and folder is identified and that access permissions are understood.
The scan discovery process can have the following statuses, reflecting its progress:
Not Started: Data Source has been added but the scan has not started.
Queued: Scan has been put into the queue for execution.
Failed To Start: Scan was unable to start, usually due to issues with permissions or network.
In Progress: Scan is actively running and processing data discovery.
Cancelled: Scan was manually stopped or automatically aborted.
Incomplete: Scan is partially completed, but permissions to files were changed during the scan.
Completed: Scan has successfully finished Discovery phase.
These statuses can be seen in the Last Scan Status column.
Metadata information is processed for each file that has been collected as part of the Discovery step.
A detailed analysis of each file's metadata is performed.
Permissions are analysed and the sharing level is identified.
A detailed analysis of each file's content is performed.
Content is extracted and the sensitivity level and risk of each file is determined for classification.
This is determined by the Patterns/Detector setting and the AI Mesh
This ensures that sensitive information is properly identified and protected.
This is a scan to determine the Users and Groups present in a Data Source.
Metadata is extracted for each user, with specific fields depending on the data source. Some of the fields that will be picked up by the scan include Enabled, Last Login, Last Modified, etc.
The statuses for these scans are the same as for files but there are two additional ones.
Completed Only Users: The scan has been completed only for user-specific policies.
Completed Only Groups: The scan has been completed only for group-specific policies.
To see additional information on a running or completed scan click on the Scan Analytics Icon.
This will pop out the Analytics sidebar where there is information such as scan duration, how many files have been scanned, classification insights, etc.

Browse to App Registration and select New registration
On the App Registration page enter the below information and click the Register button
Name: (Enter a meaningful application name that will be displayed to users of the app)
Supported account types:
Select which accounts the application will support. The options should be similar to those below. Select “Accounts in this organizational directory only”:
Leave the Redirect URI empty and click Register
Note the Application (client) ID, Directory (tenant) ID values
Navigate to Manage -> Certificates and secrets on the left menu, to create a new client secret
Provide a meaningful description and expiry to the secret, and click on Add
Once a client secret is created, note its Value and store it somewhere safe. NOTE: this value cannot be viewed once you leave this page
Navigate to Manage -> API permissions on the left menu, and Add a permission
Select Microsoft APIs -> Microsoft Graph
Select Application permissions
Permissions required
Microsoft Graph > Application permissions > Device > Device.Read.All
Microsoft Graph > Application permissions > Directory > Directory.Read.All
Microsoft Graph > Application permissions > Group > Group.Read.All
Microsoft Graph > Application permissions > User > User.Read.All
Once all the required permissions are added, click "Grant admin consent"
A connection string is needed for the storage account you wish to scan.
Login to Azure Portal
If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application needs to be registered, from the Directories + subscriptions menu.
Browse to Storage accounts and select the account to be scanned
Once the storage account is selected, note the Resource group and Subscription ID values in the Overview page
Navigate to Security + networking -> Access keys on the left menu, and click on Show on the Connection string
Copy this Connection string value; its general format is shown below
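The value follows Azure's standard storage connection string format; an illustrative example (the account name and key below are placeholders, use the exact value copied from the portal):

DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=<base64-account-key>;EndpointSuffix=core.windows.net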
Access Control (IAM) Role assignment
In the storage account, go to Access Control (IAM) and assign the Reader role to the Azure app created in the first step
Save the changes.
Navigate to Administration -> Data Sources -> Azure Files -> New scan
Provide the Connection string value obtained from above steps
Click on the Folder icon in Path to select a particular share to scan, or leave the path empty to scan all shares
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start file scan to begin scanning
The results can be viewed under Dashboard -> Enterprise Search
If you haven't created Dropbox credentials yet, follow this guide to create a new Dropbox app and ensure the necessary credentials are set up.
Data streaming needs the events.read scope, as shown below; the setting is located under Permissions for your app in the Dropbox App Console
From the Data Sources page, select Dropbox from the list of available data sources. In the Scan Configurations list, create a New Configuration.
Make sure the connection has a Name and Credentials set. Leave Path untouched. Dropbox supports only root level data streaming.
Enable the Data streaming checkbox and copy the Webhook URL
Go to the Dropbox App Console, open app Settings for your credentials and paste the previously copied Webhook URL into Webhook URIs
After clicking Add, the webhook should have the status Enabled (an endpoint check is sketched below)
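Dropbox verifies a new webhook URI with a challenge request that the endpoint must echo back; the same mechanism can be exercised manually as a quick check (the URL below stands in for your copied Webhook URL):

# A correctly exposed endpoint responds with the exact challenge value.
curl "https://YOUR_WEBHOOK_URL?challenge=ping123"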
Click Save & Close to finalize the changes.
Clock icon: When data streaming is being activated, the clock icon will appear, indicating that the subscription is being processed. On completion, the created configuration will show Data Streaming as Requested.
After the subscription is activated, real-time events will start flowing into the platform, and you will be able to monitor them from various sections of Getvisibility.
Navigate to the Live Events section under Administration to view a detailed audit log of all streaming events.
View streaming event details
It is also possible to monitor extended events.
If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application needs to be registered, from the Directories + subscriptions menu
Browse to App Registration and select your application that was created for the scanning
Navigate to Manage -> API permissions on the left menu, and Add a permission
Select Microsoft APIs -> Office 365 Management API
Select Application permission
Select ActivityFeed.Read permission
Permissions required
All the scanning permissions (https://docs.getvisibility.com/scan-with-getvisibility/configure-data-sources/onedrive)
Office 365 Management API ⇒ Application Permissions ⇒ ActivityFeed.Read
Once all the required permissions are added, click "Grant admin consent"
Sign into the Microsoft Purview portal using Microsoft Edge browser
Select the Audit solution card. If the Audit solution card isn't displayed, select View all solutions and then select Audit from the Core section
If auditing isn't turned on for your organization, a banner is displayed prompting you to start recording user and admin activity. Select the Start recording user and admin activity banner.
In certain cases, recording cannot be enabled immediately and requires additional configuration. If this applies, users will be prompted to enable the customization setting. Select OK, and a new banner will appear, informing you that the process may take 24 to 48 hours to complete. After this waiting period, repeat the previous step to proceed with enabling recording.
From the Data Sources page, select SharePoint Online from the list of available data sources. In the Scan Configurations list, create a New Configuration
Make sure the connection has a Name and Credentials set. Then select the Path icon.
Click on the Folder icon in the Path field to select the folder you want to monitor for real-time events.
Magnifying glass icon: Folders with this icon next to them indicate that real-time events can be subscribed to from this directory.
After selecting the folder, click Save & Close to finalize the changes.
Clock icon: When data streaming is being activated, the clock icon will appear, indicating that the subscription is being processed. Once the subscription is activated, this icon will change to a green magnifying glass.
After enabling Data Streaming, the system will automatically handle the subscription to SharePoint Online’s real-time events. There is no need to manually configure Webhooks.
After the subscription is activated (green magnifying glass icon), real-time events will start flowing into the platform, and you will be able to monitor them from various sections of Getvisibility.
Navigate to the Live Events section under Administration to view a detailed audit log of all streaming events.
In this section, you can filter and view event details
If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application needs to be registered, via the Directories + subscriptions menu
Browse to App Registration and select your application that was created for the scanning
Navigate to Manage -> API permissions on the left menu, and Add a permission
Select Microsoft APIs -> Office 365 Management API
Select Application permission
Select ActivityFeed.Read permission
Permissions required
Office 365 Management API ⇒ Application Permissions ⇒ ActivityFeed.Read
Microsoft Graph > Application permissions > AuditLog > AuditLog.Read.All
Microsoft Graph > Application permissions > Directory > Directory.Read.All
Once all the required permissions are added, click "Grant admin consent"
Sign into the Microsoft Purview portal using Microsoft Edge browser
Select the Audit solution card. If the Audit solution card isn't displayed, select View all solutions and then select Audit from the Core section
If auditing isn't turned on for your organization, a banner is displayed prompting you to start recording user and admin activity. Select the Start recording user and admin activity banner.
In certain cases, recording cannot be enabled immediately and requires additional configuration. If this applies, users will be prompted to enable the customization setting. Select OK, and a new banner will appear, informing you that the process may take 24 to 48 hours to complete. After this waiting period, repeat the previous step to proceed with enabling recording.
From the Data Sources page, select Azure AD from the list of available data sources. In the Scan Configurations list, create a New Configuration
Make sure the connection has a Name and Credentials set, and Data streaming is enabled.
Clock icon: When data streaming is being activated, the clock icon will appear, indicating that the subscription is being processed. Once the subscription is activated, this icon will change to a green magnifying glass.
After enabling Data Streaming, the system will automatically handle the subscription to Azure AD’s real-time events. There is no need to manually configure Webhooks.
Once streaming is enabled, events can be monitored across multiple sections of the platform, providing comprehensive visibility into user and group activities. The Streaming tab offers an overview of essential operations, such as user and group creation, updates, and deletions.
For deeper insights, Extended Streaming Events leverage Azure AD’s audit logging functionality along with the ActivityFeed.Read permission. This enables the system to capture a broader range of event types beyond standard data streaming, including administrative actions, role changes, and authentication events.
Navigate to the Live Events section under Administration and then to Streaming tab to view a detailed audit log of streaming events.
Navigate to the Live Events section under Administration and then to Extended Streaming tab to view a detailed audit log of extended streaming events.
In both sections, you can filter and view event details
From the Data Sources page, select SMB from the list of available data sources. In the Scan Configurations list, create a New Configuration
Make sure the connection has a Name and Credentials set. Then select the SMB share Path that is to be monitored.
After selecting the folder, select the Data streaming checkbox:
Follow the download tab link and the installation instructions for the SMB agent:
Follow the installation instructions for the SMB streaming agent:
This section addresses the different methods to install the SMB Connector on a single machine.
OS: Windows Server 2016 or later.
Processor: 2 GHz or faster, 2 cores (64-bit processor recommended).
Memory: 4GB RAM.
Hard Disk: 1GB free space.
Administrator Privileges: user needs admin permissions to install.
must be installed.
The SMB Connector supports various configuration options which can be specified via smb_connector_application_config.json
Pre-requisites:
The ZIP of the installer files.
smb_connector_application_config.json file.
Windows Server machine access.
Admin access to install the connector.
Steps
Download the SMB Connector ZIP File: Obtain the ZIP file and save it to the Windows machine.
Prepare for Installation:
Unzip the contents of the ZIP file
Place the smb_connector_application_config.json file in the same directory as the unzipped contents.
Configure the Installer:
Edit the smb_connector_application_config.json file as needed. Use the smb_connector_application_config.json.example file in the unzipped folder if creating the configuration from scratch.
Create a folder mapping for every SMB share on the server that is to be scanned. WatchFolder should be the root directory of the share, and WebhookUrl should be from the scan configuration page for the SMB share on the GV dashboard (shown below).
Keep useDefaultFileFilters set to false if you want all files in the share to be scanned. If set to true, the connector will only scan files supported by the GV Synergy agent for classification.
IncludedExtensions and AdditionalFileFilters can be used if you wish to apply filters other than the defaults. IncludedExtensions supports file extensions in the format .txt, etc. AdditionalFileFilters allows any custom file filter, including * as a wildcard (see the example configuration below).
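As an illustration only, a minimal smb_connector_application_config.json might look like the sketch below. The exact key names and nesting, including the folder-mapping array name used here, are assumptions; take the authoritative structure from the smb_connector_application_config.json.example file in the unzipped folder:

{
  "FolderMappings": [
    {
      "WatchFolder": "D:\\Shares\\Finance",
      "WebhookUrl": "https://tenant.getvisibility.com/scan-manager/external/webhooks/notification/<token-from-scan-configuration>"
    }
  ],
  "useDefaultFileFilters": false,
  "IncludedExtensions": [".txt", ".docx"],
  "AdditionalFileFilters": ["report-*"]
}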
Start the Installation:
Execute the install.ps1 script by right-clicking it and choosing Run with PowerShell
Complete the Installation:
After the installation completes, the PowerShell window can be closed.
Save Streaming configuration
After the subscription is activated (green magnifying glass icon), real-time events will start flowing into the platform, and you will be able to monitor them from various sections of Getvisibility.
Navigate to the Live Events section under Administration to view a detailed audit log of all streaming events (you may specify source filter to focus only on SMB events):
Browse to App Registration and select New registration
On the App Registration page enter below information and click Register button
Name: (Enter a meaningful application name that will be displayed to users of the app)
Supported account types:
Select which accounts you would like your application to support. You should see the options similar to below. You can select “Accounts in this organizational directory only”:
Leave the Redirect URI as empty and Click Register
Note the Application (client) ID, Directory (tenant) ID values
Navigate to Manage -> Certificates and secrets on the left menu, to create a new client secret
Provide a meaningful description and expiry to the secret, and click on Add
Once a client secret is created, note its Value and store it somewhere safe. NOTE: this value cannot be viewed once you leave this page
Navigate to Manage -> API permissions on the left menu, and Add a permission
Select Microsoft APIs -> Microsoft Graph
Select Application permissions
Permissions required
For scanning
Microsoft Graph > Application permissions > Mail > Mail.Read
Microsoft Graph > Application permissions > User > User.Read.All
Microsoft Graph > Application permissions > DeviceManagementApps > DeviceManagementApps.Read.All
Microsoft Graph > Application permissions > MailboxSettings > MailboxSettings.Read
For tagging
Microsoft Graph > Application permissions > Mail > Mail.ReadWrite
Once all the required permissions are added, Grant admin consent to them
If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application needs to be registered, via the Directories + subscriptions menu
Browse to App Registration and select your application that was created for the scanning
Navigate to Manage -> API permissions on the left menu, and Add a permission
Select Microsoft APIs -> Office 365 Management API
Select Application permission
Select ActivityFeed.Read permission
Permissions required
All the scanning permissions (https://docs.getvisibility.com/scan-with-getvisibility/configure-data-sources/sharepoint-online)
Office 365 Management API ⇒ Application Permissions ⇒ ActivityFeed.Read
Once all the required permissions are added, click "Grant admin consent"
Sign into the Microsoft Purview portal using Microsoft Edge browser
Select the Audit solution card. If the Audit solution card isn't displayed, select View all solutions and then select Audit from the Core section
If auditing isn't turned on for your organization, a banner is displayed prompting you to start recording user and admin activity. Select the Start recording user and admin activity banner.
In certain cases, recording cannot be enabled immediately and requires additional configuration. If this applies, users will be prompted to enable the customization setting. Select OK, and a new banner will appear, informing you that the process may take 24 to 48 hours to complete. After this waiting period, repeat the previous step to proceed with enabling recording.
From the Data Sources page, select OneDrive from the list of available data sources. In the Scan Configurations list, create a New Configuration.
Make sure the connection has a Name and Credentials set. Then select the Path icon.
Click on the Folder icon in the Path field to select the folder you want to monitor for real-time events.
Magnifying glass icon: Folders with this icon next to them indicate that real-time events can be subscribed to from this directory.
After selecting the folder, click Save & Close to finalize the changes.
Clock icon: When data streaming is being activated, the clock icon will appear, indicating that the subscription is being processed. Once the subscription is activated, this icon will change to a green magnifying glass.
After enabling Data Streaming, the system will automatically handle the subscription to OneDrive’s real-time events. There is no need to manually configure Webhooks.
After the subscription is activated (green magnifying glass icon), real-time events will start flowing into the platform, and you will be able to monitor them from various sections of Getvisibility.
Navigate to the Live Events section under Administration to view a detailed audit log of all streaming events.
In this section, you can filter and view event details.







How to configure Azure Blob connection for scanning.
Login to Azure Portal
If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application needs to be registered, via the Directories + subscriptions menu.
Browse to App Registration and select New registration
On the App Registration page enter below information and click Register button
Name: (Enter a meaningful application name that will be displayed to users of the app)
Supported account types:
Select which accounts the application will support. The options should be similar to those below. Select “Accounts in this organizational directory only”:
Navigate to Manage -> Certificates and secrets on the left menu, to create a new client secret
Provide a meaningful description and expiry to the secret, and click on Add
Once a client secret is created, note its Value and store it somewhere safe. NOTE: this value cannot be viewed once you leave this page
Navigate to Manage -> API permissions on the left menu, and Add a permission
Select Microsoft APIs -> Microsoft Graph
Select Application permissions
Permissions required
Microsoft Graph > Application permissions > Device > Device.Read.All
Microsoft Graph > Application permissions > Directory > Directory.Read.All
Microsoft Graph > Application permissions > Group > Group.Read.All
A connection string is needed for the storage account that is to be scanned.
Login to Azure Portal
If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application needs to be registered, via the Directories + subscriptions menu
Browse to Storage accounts and select the account to be scanned
Once the storage account is selected, note the Resource group and Subscription ID values on the Overview page
Navigate to Security + networking -> Access keys on the left menu, and click Show on the Connection string
Copy this Connection string value
Access Control (IAM) Role assignment - there are 2 options: one is to assign a built-in role, the other is to create and assign a custom role. Using a built-in role is easier to configure, while a custom role may be preferred to ensure least-privilege assignment for increased security.
Option 1: In the storage account, go to Access Control (IAM) and select either the Storage Blob Data Owner or Data Contributor role to assign to the blob storage. (The Data Contributor role is the least-privileged built-in role for listing containers.)
NOTE: Firewall rules must also be in place to allow the DSPM server to connect to the storage account.
Navigate to Administration -> Data Sources -> Azure Blob -> New scan
Provide the Connection string value obtained in the above steps
Click on the Folder icon in Path to select a particular container to scan, or leave the path empty to scan all containers
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start file scan to begin scanning
The results can be viewed under Dashboard -> Enterprise Search
This guide provides steps on how to enable real-time data streaming for an AWS IAM connection and monitor streaming events within the Getvisibility platform.
Create a policy
In the navigation pane on the left, choose Policies and then choose Create policy
In the Policy editor section, find the Select a service section, then choose IAM service, and select Next
In Actions allowed, choose the below actions to add to the policy:
GetPolicy
GetUserPolicy
ListUserPolicies
ListAttachedGroupPolicies
Choose SNS service and select the below actions:
CreateTopic
DeleteTopic
TagResource
Choose EventBridge service and select the below actions:
TagResource
PutTargets
EnableRule
Choose EC2 service and select the below action:
DescribeRegions
For Resources, choose All and select Create policy to save the new policy
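For reference, the policy document generated by the console should resemble the sketch below. The action names are those selected above; the events: prefix is assumed for the EventBridge actions, and Resource is set to all, as selected:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "iam:GetPolicy",
        "iam:GetUserPolicy",
        "iam:ListUserPolicies",
        "iam:ListAttachedGroupPolicies",
        "sns:CreateTopic",
        "sns:DeleteTopic",
        "sns:TagResource",
        "events:TagResource",
        "events:PutTargets",
        "events:EnableRule",
        "ec2:DescribeRegions"
      ],
      "Resource": "*"
    }
  ]
}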
Sign in to the AWS Management Console and open the IAM console with the appropriate admin-level account
In the navigation pane on the left, choose Users and then choose Create user
On the Specify user details page, under User details, in User name, enter the name for the new user, for example iam-connector-user, and select Next
On the Set permissions page, select Attach policies directly and choose the policy created in above steps
Select Next
Once the user is created, select it, and from the user page, choose Create access key
Select Other then Next
Enter a description if you wish and select Create access key
The Access and Secret Access Keys have now been created. These can be downloaded as a CSV, and also copied from this section. NOTE: the secret access key cannot be viewed once you leave this page
Navigate to Administration -> Data Sources -> AWS IAM -> Credentials -> New credentials
Provide the access key and secret access key values generated in the above steps and select Save & Create Scan
Make sure the connection has a Name and Credentials set, then click the Data streaming toggle and click Save & Close to finalize the changes
Status: When data streaming is being activated, the "Requested" status will appear, indicating that the subscription is being processed. Once the subscription is activated, this status will change to "On".
After enabling Data Streaming, the system will automatically handle the subscription to AWS IAM’s real-time events. There is no need to manually configure Webhooks.
After the subscription is activated, real-time events will start flowing into the platform, and can be monitored from the relevant parts of the platform.
Viewing Events in the Live Events Section
Go to the Live Events section under Administration to view a detailed audit log of all streaming events.
Filter by source to get only AWS IAM events
Monitoring Extended Streaming Events
Once extended streaming is enabled, events will be available for monitoring in multiple sections of the platform:
Live Events Section
Go to Live Events under Administration to view real-time extended events.
Use the filter options to narrow down events to only AWS IAM activities.
This document provides information on how to configure Confluence Cloud connection with real-time events monitoring and data streaming.
curl -X POST "https://your.imanage.server.com/auth/oauth2/token" \
  -d "username=YOUR_USERNAME" \
  -d "password=YOUR_PASSWORD" \
  -d "grant_type=password" \
  -d "client_id=YOUR_CLIENT_ID"

curl -X GET "https://your.imanage.server.com/api" \
  -H "X-Auth-Token: YOUR_ACCESS_TOKEN"

Client Id: the application Client ID (the ID from Part 1).
Domain: your.imanage.server.com. Crucial: this is your on-premise server's hostname.
Path: optional. Leave blank to scan all content, or click the folder icon to select a specific path.
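For reference, a successful call to the token endpoint above returns a standard OAuth2 JSON body similar to the following (the values are placeholders, and expires_in depends on the server configuration):

{
  "access_token": "YOUR_ACCESS_TOKEN",
  "token_type": "Bearer",
  "expires_in": 3600
}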


























List > ListAllMyBuckets
List > ListBucket
Tagging > PutObjectTagging
































































































































Team Data > team_data.member
Members > members.read, groups.read
For remediations
Collaboration > sharing.write
Files and Folders > files.content.write
For tagging
Files and Folders > files.content.write, files.metadata.write
Collaboration: sharing.read, sharing.write
Team: team_info.read
Team Data: team_data.member, team_data.content.write, team_data.content.read, files.team_metadata.write, files.team_metadata.read, files.permanent_delete
Members: members.read, groups.read














Leave the Redirect URI as empty and Click Register
Note the Application (client) ID, Directory (tenant) ID values
Microsoft Graph > Application permissions > User > User.Read.All
Once all the required permissions are added, click "Grant admin consent"
We also need to assign the Reader role to the Azure app created in the first step
Save the changes.
Option 2: This option creates a custom role and assigns the same permissions as the Data Contributor role, except for the delete permissions. In the Blob storage account, go to Access Control (IAM) and click Add to create a new role. Give the role a name, and choose the actions listed below to assign to this custom role. Select this custom role for the blob and save the changes.
We also need to assign the Reader role to the Azure app created in the first step
Real Time Events Monitoring (Streaming) Permissions: To enable "Real Time Events Monitoring (Streaming)", the following additional Azure permission roles are required:
EventGrid Data Contributor
EventGrid EventSubscription Contributor
EventGrid TopicSpaces Publisher
Assign these roles using Access Control (IAM) in the Blob storage account, similar to the steps mentioned above for assigning the Storage Blob Data Owner or Data Contributor role.
Next, in the Networking tab, under Public network access, select "Enabled from all networks" or "Enabled from select virtual networks and IP addresses". If the latter is chosen, then under the Firewall section add the IP address range for the DSPM server.
Enable "Allow trusted Microsoft services to access this storage account" and Save the changes.





















ListAttachedUserPolicies
ListGroups
ListUsers
ListGroupsForUser
PutRolePolicy
TagRole
GetGroup
GetRole
CreateRole
Subscribe
ConfirmSubscription
UntagResource
ListTargetsByRule
RemoveTargets
DeleteRule














Create a Project in Google Cloud Console:
Go to the Google Cloud Console
Create a new project or select an existing project
Enable the Admin SDK:
In the Google Cloud Console, navigate to the "APIs & Services" > "Library"
Search for "Admin SDK" and click on it
Click the "Enable" button to enable the Admin SDK API for your project
Create OAuth 2.0 Credentials:
In the Google Cloud Console, go to APIs & Services > Credentials
Click "Create credentials" and select "Service account"
Enter a name in the Service account name field and click CREATE AND CONTINUE
Under "Grant this service account access to the project," select role as Owner and click DONE
Select the newly created service account and click Keys > Add Key > Create new key
Make sure the key type is set to json and click CREATE
The new private key pair is generated and downloaded to the machine. Note the values of private_key, client_email and client_id
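For reference, the downloaded key file is a JSON document; a trimmed sketch showing the fields referenced above (all values are illustrative placeholders):

{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "0123456789abcdef",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "scanner@your-project-id.iam.gserviceaccount.com",
  "client_id": "123456789012345678901"
}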
From your domain's Admin console, go to Main menu > Security > Access and data control > API controls
In the Domain wide delegation pane, select Manage Domain Wide Delegation
Click Add new
In the Client ID field, enter the client ID obtained from the service account creation steps above
In the OAuth Scopes field, enter a comma-delimited list of the scopes required for the application
Use the below scopes:
https://www.googleapis.com/auth/admin.directory.user.readonly
https://www.googleapis.com/auth/admin.directory.domain.readonly
https://www.googleapis.com/auth/admin.directory.group.readonly
https://www.googleapis.com/auth/admin.directory.rolemanagement.readonly
https://www.googleapis.com/auth/admin.reports.audit.readonly
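When pasted into the OAuth Scopes field, the scopes above form a single comma-delimited line:

https://www.googleapis.com/auth/admin.directory.user.readonly,https://www.googleapis.com/auth/admin.directory.domain.readonly,https://www.googleapis.com/auth/admin.directory.group.readonly,https://www.googleapis.com/auth/admin.directory.rolemanagement.readonly,https://www.googleapis.com/auth/admin.reports.audit.readonly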
Click Authorize
Go to the Data Sources section under Administration.
From the Data Sources page, select Google IAM from the list of available data sources. In the Scan Configurations list, create a New Configuration.
Make sure the connection has a Name and Credentials set, then click the Data streaming toggle and click Save & Close to finalize the changes
Status: When data streaming is being activated, the "Requested" status will appear, indicating that the subscription is being processed. Once the subscription is activated, this status will change to "On".
After enabling Data Streaming, the system will automatically handle the subscription to Google IAM’s real-time events. There is no need to manually configure Webhooks.
After the subscription is activated, real-time events will start flowing into the platform, and can be monitored from the relevant parts of the platform.
Go to the Live Events section under Administration to view a detailed audit log of all streaming events.
Filter by source to get only Google IAM events
Monitoring Extended Streaming Events
Once extended streaming is enabled, events will be available for monitoring in multiple sections of the platform:
Live Events Section
Go to Live Events under Administration to view real-time extended events.
Use the filter options to narrow down events to only Google IAM activities.
Ensure the following prerequisites are met:
Existing Confluence Cloud Instance: There needs to be an active Confluence Cloud instance.
Enable Development Mode: Activate Development Mode on the Confluence Cloud site to be monitored. Refer to the official Confluence documentation.
Deploy Proxy Container: Set up the Getvisibility container with a public proxy to allow integration with Confluence Cloud.
In the product UI, go to the Data Sources > Confluence Cloud page.
Locate the existing Confluence Cloud scan configuration and select Edit Configuration.
Within the Edit Confluence Cloud Configuration page, toggle Data Streaming to ON.
Copy the Webhook URL provided, as it will be used later.
Click Save & Close to apply changes.
To enable data streaming, the confluence-cloud-streaming-proxy container will need to be deployed in the infrastructure, e.g. using Docker or Kubernetes. This step involves configuring environment variables and setting up Docker for integration with Confluence Cloud.
Deployment Instructions
Download Docker image parts: Please download all files listed below:
Merge Docker image parts:
Load Docker image:
Prepare a Docker Environment: Ensure that Docker is installed and configured on the infrastructure where the confluence-cloud-streaming-proxy application will be hosted. This will be the user environment.
Set Environment Variables: Configure the following environment variables to allow the Confluence Cloud instance to communicate with the proxy application:
APP_LISTENER_PUBLIC_ACCESSIBLE_URL
Publicly accessible URL at which the app can be accessed. It is used in communication between the Confluence Cloud webhook mechanism and the app
e.g.
APP_WEBHOOK_URL
Webhook URL (taken from Getvisibility UI Confluence Cloud connector configuration form)
e.g.
Map Persistent Volume: Map a persistent volume to the /app/db/ directory within the container to ensure data retention across sessions.
Example docker-compose.yml Configuration
Use the following example to help set up the Docker configuration. Update the values as needed for the specific environment:
Once configured, start the container by running docker-compose up -d or an equivalent command based on configured setup.
To expose the application publicly, consult the relevant internal team, such as IT or DevOps. For testing, ngrok's free plan can be used to expose the app port as needed.
Start the Application: Ensure the application runs before proceeding with the integration setup.
To install the integration, follow the steps:
Go to the Manage apps page in Confluence Cloud.
Select Upload app
Paste the publicly accessible address in the form and press Upload.
The application will install, and the integration will be ready in a few seconds.
To uninstall the integration follow the steps:
Go to the Manage apps page in Confluence Cloud.
Find Getvisibility Confluence Cloud Streaming Proxy and click Uninstall.
Confirm by selecting Uninstall app.
Delete any associated containers and settings from your organization’s infrastructure




How to create a Box Connector app to scan Box accounts.
Login to the relevant Box account.
Navigate to Dev Console.
Select Create New App and then Custom App
Select Server Authentication (with JWT) and enter app name, then click Create App
In the Configuration tab, change App Access Level to App + Enterprise Access, then enable Generate user access tokens and Make API calls using the as-user header.
Click on Save changes
Make sure the below Application Scopes are selected
Content Actions > Read all files and folders stored in Box
Content Actions > Write all files and folders stored in Box
Administrative Actions > Manage users
In the same Configuration tab, scroll down to Generate a Public/Private Keypair
This will result in a JSON file being downloaded by the browser
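For reference, the downloaded file follows Box's standard app-settings layout; a trimmed sketch with placeholder values (consult the actual downloaded file for the exact contents):

{
  "boxAppSettings": {
    "clientID": "YOUR_CLIENT_ID",
    "clientSecret": "YOUR_CLIENT_SECRET",
    "appAuth": {
      "publicKeyID": "abcd1234",
      "privateKey": "-----BEGIN ENCRYPTED PRIVATE KEY-----\n...\n-----END ENCRYPTED PRIVATE KEY-----\n",
      "passphrase": "YOUR_PASSPHRASE"
    }
  },
  "enterpriseID": "YOUR_ENTERPRISE_ID"
}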
In the Authorization tab, click Review and Submit, add a description, and submit the app for review
Make note of the User ID and Enterprise ID of the app in the General Settings tab
Exit Dev Console and switch to the Admin Console
In the Admin Console, go to Apps > Custom Apps Manager, locate the newly created app and click the View button
Review the information and Authorize the app
Navigate to Administration -> Data Sources -> Box -> New scan
Provide the values generated in the above steps from the Box application
Click on the Folder icon in Path to select a particular folder to scan, or leave the path empty to scan all folders
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin the trustee scan
The scan results can be viewed under Dashboard > Access Governance
Click on the icon on the right and select Start file scan to begin the file scan
The results can be viewed under Dashboard > Enterprise Search
The Box Pricing Plans required for metadata writing are Business Plus, Enterprise, or Enterprise Plus. The basic Business plan does not include custom metadata and metadata templates.
A metadata template must be created to support Getvisibility's tags. Please follow the below steps to achieve this.
In the Admin Console, in the lefthand navigation click Content
Toward the top of the page, click Metadata
How to create a OneDrive Connector app to scan OneDrive accounts.
The following URLs need to be whitelisted:
Microsoft Graph API: https://graph.microsoft.com
Azure Authentication: https://login.microsoftonline.com
Login to Azure Portal
If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application needs to be registered, via the Directories + subscriptions menu.
Browse to App Registration and select New registration
On the App Registration page enter below information and click Register button
Name: (Enter a meaningful application name that will be displayed to users of the app)
Supported account types:
Select which accounts the application will support. The options should be similar to those below. Select “Accounts in this organizational directory only”:
Navigate to Manage -> Certificates and secrets on the left menu, to create a new client secret
Provide a meaningful description and expiry to the secret, and click on Add
Once a client secret is created, note its Value and store it somewhere safe. NOTE: this value cannot be viewed once you leave this page
Navigate to Manage -> API permissions on the left menu, and Add a permission
Select Microsoft APIs -> Microsoft Graph
Select Application permissions
For UnifiedPolicy.Tenant.Read
Navigate to Manage -> API permissions on the left menu, and Add a permission
Select APIs my organization uses tab
Navigate to Administration -> Data Sources -> OneDrive -> New scan
Provide the Directory (tenant) ID, Application (client) ID and Client Secret value generated in the above steps from the Azure application
Click on the Folder icon in Path to select a particular user's OneDrive to scan, or leave the path empty to scan all users
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start file scan to begin scanning
The results can be viewed under Dashboard -> Enterprise Search
Microsoft.Storage/storageAccounts/blobServices/containers/read (Return a container or a list of containers)
Microsoft.Storage/storageAccounts/blobServices/containers/write (Modify a container's metadata or properties)
Microsoft.Storage/storageAccounts/blobServices/generateUserDelegationKey/action (Returns a user delegation key for the Blob service)
Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read (Return a blob or a list of blobs)
Microsoft.Storage/storageAccounts/blobServices/containers/blobs/write (Write to a blob)
Microsoft.Storage/storageAccounts/blobServices/containers/blobs/move/action (Moves the blob from one path to another)
Microsoft.Storage/storageAccounts/blobServices/containers/blobs/add/action (Returns the result of adding blob content)

Merge and load the Docker image (from the Confluence Cloud proxy deployment steps above):

cat confluence-cloud-streaming-proxy.tar.gz.part* > confluence-cloud-streaming-proxy.tar.gz.joined
docker load --input confluence-cloud-streaming-proxy.tar.gz.joined

Example docker-compose.yml configuration for the proxy:

services:
  app:
    image: getvisibility/confluence-cloud-streaming-proxy:v0.3.2
    ports:
      - "8080:8080"
    environment:
      APP_LISTENER_PUBLIC_ACCESSIBLE_URL: https://5977-88-156-142-22.ngrok-free.app
      APP_WEBHOOK_URL: https://tenantabc.getvisibility.com/scan-manager/external/webhooks/notification/71ccab3d56980a2d9c766f42c86d36ffedc34258a0f226aaf56a628f06e9d89d
    volumes:
      - ./app-db/:/app/db/








































Administrative Actions > Manage groups
Click Create New
Click Name Your Template and enter the name getvisibility
Create a new attribute named Classification with the options: Public, General Business, Confidential, Highly-Confidential
Similarly, create two more attributes:
Distribution with the options: Internal, External
Compliance with the options: PCI, PII, PHI
Use the Status drop-down to indicate this template is Visible
Click Save






















Leave the Redirect URI as empty and Click Register
Note the Application (client) ID, Directory (tenant) ID values
Search for Microsoft Information Protection Sync Service
Select Application permissions > UnifiedPolicy.Tenant.Read
For InformationProtectionPolicy.Read.All
Navigate to Manage -> API permissions on the left menu, and Add a permission
Select APIs my organization uses tab
Search for Microsoft Information Protection API
Select Application permissions > InformationProtectionPolicy.Read.All
For Azure Rights Management Services > Content.Writer
Navigate to Manage -> API permissions on the left menu, and Add a permission
Select Azure Rights Management Services tab
Permissions required
For scanning
Microsoft Graph > Application permissions > Sites > Sites.Read.All
Microsoft Graph > Application permissions > Directory > Directory.Read.All
Microsoft Graph > Application permissions > Files > Files.Read.All
Microsoft Graph > Application permissions > User > User.Read.All
For reading Sensitivity labels
Microsoft Graph > Application permissions > InformationProtectionPolicy > InformationProtectionPolicy.Read.All
APIs my organization uses > Microsoft Information Protection Sync Service > Application permissions > UnifiedPolicy.Tenant.Read
For revoke permissions
Microsoft Graph > Application permissions > Files > Files.ReadWrite.All
For tagging
Microsoft Graph > Application permissions > Sites > Sites.Manage.All
For MIP tagging
Azure Rights Management Services > Application permissions > Content.Writer
Microsoft Graph > Application permissions > Directory > Directory.Read.All
Microsoft Graph > Application permissions > Files > Files.ReadWrite.All
Once all the required permissions are added, click "Grant admin consent"

















How to create a SharePoint Connector app to scan SharePoint Online (SPO) accounts.
Login to Azure Portal
If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application needs to be registered, via the Directories + subscriptions menu.
Browse to App Registration and select New registration
On the App Registration page enter below information and click Register button
Name: (Enter a meaningful application name that will be displayed to users of the app)
Supported account types:
Select which accounts the application will support. The options should be similar to those below. Select “Accounts in this organizational directory only”:
Navigate to Manage -> Certificates and secrets on the left menu, to create a new client secret
Provide a meaningful description and expiry to the secret, and click on Add
Once a client secret is created, note its Value and store it somewhere safe. NOTE: this value cannot be viewed once the page is closed.
Navigate to Manage -> API permissions on the left menu, and Add a permission
Select Microsoft APIs -> Microsoft Graph
Select Application permissions
For UnifiedPolicy.Tenant.Read
Navigate to Manage -> API permissions on the left menu, and Add a permission
Select APIs my organization uses tab
Permissions required
For scanning
Microsoft Graph > Application permissions > Sites > Sites.Read.All
For reading Sensitivity labels
Navigate to Administration -> Data Sources -> SharePoint Online -> New scan
Provide the Directory (tenant) ID, Application (client) ID and Client Secret value generated in the above steps from the Azure application
Click on the Folder icon in Site and path to select a particular site to scan, or leave the path empty to scan all sites
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start file scan to begin scanning
The results can be viewed under Dashboard -> Enterprise Search
First create the default Getvisibility tags as a new column in SharePoint. This process is described below:
In SharePoint, navigate to Documents
In the files view, select + Add column

Select Application permissions
Select Content > Content.Writer
Microsoft Graph > Application permissions > Sites > Sites.Manage.All
Microsoft Graph > Application permissions > InformationProtectionPolicy > InformationProtectionPolicy.Read.All
APIs my organization uses > Microsoft Information Protection API > Application permissions > InformationProtectionPolicy.Read.All







Leave the Redirect URI as empty and Click Register
Note the Application (client) ID, Directory (tenant) ID values
Search for Microsoft Information Protection Sync Service
Select Application permissions > UnifiedPolicy.Tenant.Read
For InformationProtectionPolicy.Read.All
Navigate to Manage -> API permissions on the left menu, and Add a permission
Select APIs my organization uses tab
Search for Microsoft Information Protection API
Select Application permissions > InformationProtectionPolicy.Read.All
For Azure Rights Management Services > Content.Writer
Navigate to Manage -> API permissions on the left menu, and Add a permission
Select Azure Rights Management Services tab
Select Application permissions
Select Content > Content.Writer
Microsoft Graph > Application permissions > InformationProtectionPolicy > InformationProtectionPolicy.Read.All
APIs my organization uses > Microsoft Information Protection Sync Service > Application permissions > UnifiedPolicy.Tenant.Read
For revoke permissions
Microsoft Graph > Application permissions > Files > Files.ReadWrite.All
For tagging
Microsoft Graph > Application permissions > Sites > Sites.Manage.All
For MIP tagging
Azure Rights Management Services > Application permissions > Content.Writer
Microsoft Graph > Application permissions > Directory > Directory.Read.All
Microsoft Graph > Application permissions > Sites > Sites.Manage.All
Microsoft Graph > Application permissions > InformationProtectionPolicy > InformationProtectionPolicy.Read.All
APIs my organization uses > Microsoft Information Protection API > Application permissions > InformationProtectionPolicy.Read.All
Once all the required permissions are added, click "Grant admin consent"
Select Choice and then Next
Set the name to Classification and the choices as: Public, Internal, Confidential, Highly-Confidential.
Then click Save
Similarly, create Compliance and Distribution columns (if required)
Getvisibility and SharePoint's tags are now aligned
When tags are written to SharePoint files automatically over the API, the Modified By field changes to System Account, since the tags are added by Getvisibility.
Getvisibility preserves the Modified date where applicable.


















Create a new project or select an existing project
Enable the Google Drive, Drive Labels and Admin SDK API:
In the Google Cloud Console, navigate to APIs & Services > Library
Search for "Google Drive API" and click on it
Click the "Enable" button to enable the Google Drive API for the project
Search for "Admin SDK API" and click on it
Click the "Enable" button to enable the Admin SDK API for the project
Search for "Drive Labels API" and click on it
Click the "Enable" button to enable Drive Labels API for the project
Create OAuth 2.0 Credentials:
In the Google Cloud Console, navigate to the APIs & Services > Credentials
Click "Create credentials" and select "Service account"
Enter a name in the Service account name field and click CREATE AND CONTINUE
Under Grant this service account access to the project, select role as Owner and click DONE
Select the newly created service account and click Keys > Add Key > Create new key
Make sure the key type is set to json and click Create
The new private key pair is generated and downloaded to the machine. Note the values of private_key, client_email and client_id
From your domain's Admin console, go to Main menu > Security > Access and data control > API controls
In the Domain wide delegation pane, select "MANAGE DOMAIN-WIDE DELEGATION"
Click Add new
In the Client ID field, enter the client ID obtained from the service account creation steps above
In the OAuth Scopes field, enter a comma-delimited list of the scopes required for the application
Use the below scopes:
For scanning
https://www.googleapis.com/auth/admin.directory.user.readonly
https://www.googleapis.com/auth/admin.directory.group.readonly
https://www.googleapis.com/auth/drive.readonly
For revoke permissions
https://www.googleapis.com/auth/drive
For tagging
https://www.googleapis.com/auth/drive.file
https://www.googleapis.com/auth/drive
https://www.googleapis.com/auth/drive.admin.labels
https://www.googleapis.com/auth/drive.metadata
For Extended Streaming Events
https://www.googleapis.com/auth/admin.reports.audit.readonly
Click Authorize
In order to perform a scan using the Google Drive connector, a user with the below Admin roles assigned is needed:
Services Admin
User Management
Groups Reader
These can be added or checked for the user ID that will be used for impersonation: admin.google.com > Directory > Users > Assign roles > add the Services Admin, User Management, and Groups Reader roles, as follows:
Navigate to Admin console
Select Users under Directory from the left menu
Select a user you want to use for scanning
Navigate to User details -> Admin roles and privileges
Edit the roles, and enable:
Services Admin
User Management
Groups Reader
Click on Save
Note: It might take a few minutes before the changes take effect.
Navigate to Administration -> Data Sources -> Google Drive -> New scan
Enter the details of the OAuth2 credentials obtained previously, and add the user ID (in the form of [email protected]) of the user to whom you assigned roles in the above steps
Click on the Folder icon in Path to select a particular user's drive to scan, or leave the path empty to scan all users
Save the configuration
Once the configuration is saved, click on the icon on the right and select Start file scan to begin scanning
The scan results can be viewed under Dashboard -> Enterprise Search
Default Getvisibility labels need to be created in Google Drive. This process is described below:
Turn on Drive labels for the organization
In the Google Admin Console (at admin.google.com)
Go to Menu Apps > Google Workspace > Drive and Docs
Click Labels
Select Turn Labels On
Click Save
Create Drive labels:
Go to the labels manager at .
Requires having the .
Click New label.
To create one badged label:
Publish the labels
If it’s not open already, open the labels manager () and click the label.
Review the label and any fields.
Click Publish.
If an Azure Files scan does not already exist, follow this guide to create a new Azure Files scan and ensure the necessary credentials are set up.
Go to the Scan configurations page in the product UI.
Locate your existing Azure Files scan configuration and select Edit Configuration from the options menu. Note the configured path (folder) and save it, as it will be used in step 9 to replace {FolderPath}.
Within the Edit Azure Files Scan Configuration page, toggle Data Streaming to ON.
Copy the Webhook URL provided, as you will use it later in the Azure Portal. Save this Webhook URL, as it will be used in step 9 to replace {WebhookUrl}.
Click Save & Close button to save configuration.
Navigate to Azure Portal Event hubs and click Create
In Create Namespace Window fill in the details
Give it a Name
Select your subscription and resource group
Select location
Pricing tier - standard
Throughput Units - 1
Click on Review + Create and then Create after validation
After namespace is created, click on + Event Hub button
In the Create Event Hub window, fill in the name and click Review + Create, then Create after validation. Save the name of the Event Hub created in this step, as it will be used later in step 9 to replace {eventHubName}.
Configure access policy
In the event hubs namespace window click on Settings/Shared access policies and then +Add button
Fill in the details in the new tab, set LogicAppsListenerPolicy as name, select Listen policy, and click Save.
Click on the newly created policy, then copy and save the Connection string–primary key. This will be needed later in step 8b.
Navigate to Azure Portal and open your Storage Account.
Select needed account from the Storage Accounts
In the left-hand menu, select Monitoring/Diagnostic settings and click file
In Diagnostic settings Window click on "+ Add diagnostic setting" button
In Create Diagnostic setting Window fill in the details:
Give it a Name
Select Category groups allLogs
Select Destination details Stream to an event hub and select newly created Event Hub Namespace and Event Hub
Go to Azure logic apps and click "Add" button
In Create Logic App Window select Workflow Service Plan
In the Create Logic App (Workflow Service Plan) window, fill in the details:
Select your subscription and resource group
Give logic app name
Select region
Pricing plan should be WS1
In the monitoring tab select No for the application insights
Click Review + create button
Click Create after validation
In the newly created logic app, click on Workflows/Workflows and then the +Add button
In the new workflow tab, fill in the name, select State type: Stateful and click Create
In the created workflow, go to Developer/Designer and click on Add a trigger, then search for "Event hub" and select "When events are available in Event Hub"
Configure API connection
Click on the trigger, set "Temp" for Event Hub Name and then click on Change connection.
Then click Add New and fill in the details. Enter any name for the connection name and use the connection string {Connection string–primary key} from step 3.6.c.
In workflow navigation tab go to Developer/Code and set the provided code, then click save:
Replace {FolderPath} with the path to the streaming folder. For example, to receive events from the folder "StreamingFolder" located in the file share "DocumentsShare" under the folder "Personal", the path should be "DocumentsShare/Personal/StreamingFolder"
Replace {WebhookUrl} with the webhook URL provided in the scan configuration window of the application
After configuring the event subscription:
You may upload documents to the configured path.
The events triggered by these uploads will be processed by the Data Streaming setup, and the results will appear in your Getvisibility dashboard.
If you experience any issues with the configuration, ensure that:
The Webhook URL is correct and matches the configuration in Azure.
Steps 5.8 and 5.9 were properly executed and all the variables were replaced with real values.
You can also check whether the trigger was unsuccessful by navigating to the Logic App configured in the previous steps, then Workflow and Trigger History. If you see any failed triggers, you can inspect the error details to identify the issue.
This document provides information on how to configure Azure Blob connection with real-time events monitoring and data streaming.









https://www.googleapis.com/auth/drive.labels
Choose a badged label
Choose to start from an example, or from scratch.
Update the title as Classification.
(Optional) Add a description or a learn more URL that points to internal documentation about the label.
Customize options, and assign a colour.
To create a standard label:
Two standard labels need to be created: Distribution and Compliance
Click a standard label template or click Create New.
Enter or update the label name.
(Optional) Add a description.
Choose whether the label is copied when the file is copied.
Add a field.
Confirm that the label will be published by clicking Publish.






















Click Save.
On the Change Connection tab, click Details and copy the Name from the connection details. Save this Name, as it will be used later in step 9 to replace {connectionName}.
Click Save in the workflow designer window
Replace {eventHubName} with the Azure Event Hub name that was created previously
Replace {connectionName} with the connection name from the previous step

















If an Azure Blob scan has not yet been created, follow this guide to create a new Azure Blob scan and ensure the necessary credentials are configured.
Go to the Scan configurations page in the product UI.
Find the existing Azure Blob scan configuration and select Edit Configuration from the options menu.
Within the Edit Azure Blob Scan Configuration page, toggle Data Streaming to ON.
Copy the Webhook URL provided, as you will use it later in the Azure Portal.
Navigate to Azure Portal and open the Storage Account.
Select the required account from the Storage Accounts
In the left-hand menu, select Events and click Create Event Subscription
In Create Event Subscription Window fill in the details:
Give it a Name
Select endpoint type Web Hook
Select Configure an endpoint
Go to Filters Menu on top
In the Subject Filters section, enter the correct path format for the subscription:
Use the following pattern:
/blobServices/default/containers/{connectionDetails.ContainerName}/blobs/{connectionDetails.FolderPath}
For example, if the container is mycontainer and the folder path is accuracy test/repository1, the path will look like:
/blobServices/default/containers/mycontainer/blobs/accuracy test/repository1
Click Create to complete the Event Subscription setup.
Ensure the following permissions are assigned to the Azure Storage Account:
EventGrid Data Contributor
EventGrid EventSubscription Contributor
EventGrid TopicSpaces Publisher
For details on assigning these roles, refer to this documentation.
Navigate to Azure Portal Event hubs and click Create
In Create Namespace Window fill in the details
Give it a Name
Select your subscription and resource group
Select location
Pricing tier - standard
Throughput Units - 1
Click on Review + Create and then Create after validation
After namespace is created, click on + Event Hub button
In the Create Event Hub window, fill in the name and click Review + Create, then Create after validation. Save the name of the Event Hub created in this step, as it will be used later in step 9 to replace {eventHubName}.
Configure access policy
In the event hubs namespace window click on Settings/Shared access policies and then +Add button
Fill in the details in the new tab, set LogicAppsListenerPolicy as name, select Listen policy, and click Save.
Click on the newly created policy, then copy and save the Connection string–primary key. This will be needed later in step 8b.
Navigate to Azure Portal and open your Storage Account.
Select needed account from the Storage Accounts
In the left-hand menu, select Monitoring/Diagnostic settings and click blob
In Diagnostic settings Window click on "+ Add diagnostic setting" button
In Create Diagnostic setting Window fill in the details:
Give it a Name
Select Category groups allLogs
Select Destination details Stream to an event hub and select newly created Event Hub Namespace and Event Hub
Go to Azure logic apps and click "Add" button
In Create Logic App Window select Workflow Service Plan
In the Create Logic App (Workflow Service Plan) window, fill in the details:
Select your subscription and resource group
Give logic app name
Select region
Pricing plan should be WS1
In the monitoring tab select No for the application insights
Click Review + create button
Click Create after validation
In the newly created logic app, click on Workflows/Workflows and then the +Add button
In the new workflow tab, fill in the name, select State type: Stateful and click Create
In the created workflow, go to Developer/Designer and click on Add a trigger, then search for "Event hub" and select "When events are available in Event Hub"
Configure API connection
Click on the trigger, set "Temp" for Event Hub Name and then click on Change connection.
Then click Add New and fill in the details. Enter any name for the connection name and use the connection string {Connection string–primary key} from step 3.6.c.
In workflow navigation tab go to Developer/Code and set the provided code, then click save:
Replace {FolderPath} with the path to the streaming folder. For example, to receive events from the folder "StreamingFolder" located in the file share "DocumentsShare" under the folder "Personal", the path should be "DocumentsShare/Personal/StreamingFolder"
Replace {WebhookUrl} with the webhook URL provided in the scan configuration window of the application
If you experience any issues with the configuration, ensure that:
The Webhook URL is correct and matches the configuration in Azure.
Steps 5.8 and 5.9 were properly executed and all the variables were replaced with real values.
You can also check whether the trigger was unsuccessful by navigating to the Logic App configured in the previous steps, then Workflow and Trigger History. If you see any failed triggers, you can inspect the error details to identify the issue.
After configuring the event subscription:
Documents may be uploaded to the configured path.
The events triggered by these uploads will be processed by the Data Streaming setup, and the results will appear in the Getvisibility dashboard.
If there are any issues with the configuration, ensure that:
The Webhook URL is correct and matches the configuration in Azure.
The required Azure permissions are correctly assigned.
Steps 5.8 and 5.9 were properly executed and all the variables were replaced with real values.
You can also check whether the trigger was unsuccessful by navigating to the Logic App configured in the previous steps, then Workflow and Trigger History. If you see any failed triggers, you can inspect the error details to identify the issue.
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {
      "Filter_Records": {
        "type": "Query",
        "inputs": {
          "from": "@triggerBody()?['ContentData']?['records']",
          "where": "@and(not(empty(item()?['uri'])),or(contains(item()?['uri'], '{FolderPath}/'),contains(item()?['uri'], '{FolderPath}?')))"
        },
        "runAfter": {}
      },
      "Condition": {
        "type": "If",
        "expression": "@greater(length(body('Filter_Records')), 0)",
        "actions": {
          "HTTP-copy": {
            "type": "Http",
            "inputs": {
              "uri": "{WebhookUrl}",
              "method": "POST",
              "headers": {
                "Content-Type": "application/json"
              },
              "body": {
                "event": "@setProperty(triggerBody(),'ContentData',setProperty(triggerBody()?['ContentData'],'records',body('Filter_Records')))"
              }
            },
            "runAfter": {}
          }
        },
        "else": {},
        "runAfter": {
          "Filter_Records": [
            "Succeeded"
          ]
        }
      }
    },
    "contentVersion": "1.0.0.0",
    "outputs": {},
    "triggers": {
      "When_events_are_available_in_Event_Hub": {
        "type": "ApiConnection",
        "inputs": {
          "host": {
            "connection": {
              "referenceName": "{connectionName}"
            }
          },
          "method": "get",
          "path": "/@{encodeURIComponent('{eventHubName}')}/events/batch/head",
          "queries": {
            "contentType": "application/json",
            "consumerGroupName": "$Default",
            "maximumEventsCount": 50
          }
        },
        "recurrence": {
          "interval": 30,
          "frequency": "Second"
        },
        "splitOn": "@triggerBody()"
      }
    }
  },
  "kind": "Stateful"
}
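As a minimal sketch of the two replacements described above, assuming the example path "DocumentsShare/Personal/StreamingFolder" and a purely hypothetical webhook URL, the affected lines in the code would become:
"where": "@and(not(empty(item()?['uri'])),or(contains(item()?['uri'], 'DocumentsShare/Personal/StreamingFolder/'),contains(item()?['uri'], 'DocumentsShare/Personal/StreamingFolder?')))"
"uri": "https://gv-dashboard.example.com/api/ddr/webhook",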



Use the Webhook URL provided in step 2 as the Subscriber endpoint and click Confirm selection.
Make sure to replace {connectionDetails.ContainerName} and {connectionDetails.FolderPath} with the actual container name and folder path from the scan configuration.
Click Save.
On the Change Connection tab, click Details and copy the Name from the connection details. Save this Name, as it will be used later in step 9 to replace {connectionName}.
Click Save in the workflow designer window.
Replace {eventHubName} with the Azure Event Hub name that was created previously. Replace {connectionName} with the connection name from the previous step.
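For illustration only, if the Event Hub is named "Temp" (the name set on the trigger earlier) and the copied connection name is "eventhub-connection-1" (a hypothetical value), the trigger section of the code would read as follows, with the remaining properties unchanged:
"triggers": {
  "When_events_are_available_in_Event_Hub": {
    "type": "ApiConnection",
    "inputs": {
      "host": {
        "connection": {
          "referenceName": "eventhub-connection-1"
        }
      },
      "method": "get",
      "path": "/@{encodeURIComponent('Temp')}/events/batch/head"
    }
  }
}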




A comprehensive list of the supported event types by Data Source for DDR
When DDR (also known as streaming) is enabled and events start coming in from the Data Source, they fall into two types:
Informational events: examples include Read, View, etc. No actions are taken when these events are detected.
Change events: these are events that alter the file or the file permissions, such as creating a file or user, or changing a file name. When these events are detected, a scan or rescan of the item occurs so that it can be classified.
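As a purely illustrative sketch (the event names are taken from the lists below; the category and action fields are explanatory labels, not part of any API), the two types behave like this:
[
  { "eventType": "FILE.DOWNLOADED", "category": "informational", "action": "none" },
  { "eventType": "FILE.UPLOADED", "category": "change", "action": "scan or rescan the item so it can be classified" }
]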
CreateUser - A new user account is created.
CreateGroup - A new user group is created.
CreateRole - A new role is created with specific permissions.
UpdateUser - Modifications are made to an existing user.
UpdateGroup - Changes are made to a group, such as adding or removing members.
UpdateRole - A role is updated with new permissions or settings.
AttachUserPolicy - A policy is attached to a user, modifying access rights.
DeleteUser - A user account is deleted.
DeleteGroup - A group is deleted along with its associated permissions.
DeleteRole - A role is deleted from IAM.
ConsoleLogin - A user logs in through the AWS console.
SignInFailure - A login attempt fails.
SignInSuccess - A login attempt is successful.
FederatedLogin - A user logs in via federated authentication.
s3:ObjectCreated:Put - A new object is uploaded to an S3 bucket by an HTTP PUT operation.
s3:ObjectCreated:Post - A new object is uploaded to an S3 bucket by an HTTP POST operation.
s3:ObjectCreated:CompleteMultipartUpload - An object is created after a multipart upload operation.
s3:ObjectCreated:Copy - A new object is created by an S3 copy operation.
s3:ObjectRestore:Post - A restore request for an archived object is initiated.
s3:ObjectRestore:Delete - A restore request for an archived object is deleted.
s3:ObjectAcl:Put - Access control settings for an object are updated.
s3:ObjectTagging:Put - Tags for an object are added or modified.
s3:ObjectRemoved:Delete - An object is deleted from an S3 bucket.
s3:ObjectRemoved:DeleteMarkerCreated - A delete marker is created for an object, marking it as deleted.
s3:LifecycleExpiration:Delete - An object is removed due to lifecycle rules.
s3:LifecycleExpiration:DeleteMarkerCreated - A delete marker is created due to lifecycle rules.
s3:ReducedRedundancyLostObject - An object stored in Reduced Redundancy Storage is lost.
s3:LifecycleTransition - An object is transitioned to a different storage class based on lifecycle rules.
s3:Replication:OperationFailedReplication - The replication operation for an object failed.
s3:Replication:OperationNotTracked - The replication operation for an object is not tracked.
Microsoft.Storage.BlobCreated - A new blob is created or content is updated in a storage container.
Microsoft.Storage.DirectoryCreated - A new directory is created in a storage container.
Microsoft.Storage.BlobRenamed - A blob is renamed within a container.
Microsoft.Storage.DirectoryRenamed - A directory is renamed within a container.
Microsoft.Storage.BlobDeleted - A blob is deleted from a storage container.
Microsoft.Storage.DirectoryDeleted - A directory is deleted from a storage container.
Microsoft.EventGrid.SubscriptionValidationEvent - Event Grid validates a new event subscription (the validation handshake).
Microsoft.Storage.BlobTierChanged - The storage tier of a blob is modified.
GetBlobServiceProperties - Retrieves properties of the Blob service.
GetContainerProperties - Retrieves properties of a storage container.
CreateFile - A new file is created in an Azure Files share.
CreateDirectory - A new directory is created in an Azure Files share.
CopyFile - A file is copied to a new location.
SetFileProperties - The properties of a file are updated.
SetFileMetadata - Metadata of a file is updated.
DeleteFile - A file is deleted from an Azure Files share.
DeleteDirectory - A directory is deleted from an Azure Files share.
ListShares - Lists file shares in an account.
GetShareProperties - Retrieves properties of a file share.
GetShareMetadata - Retrieves metadata of a file share.
GetDirectoryProperties - Retrieves properties of a directory.
FILE.UPLOADED - A new file is uploaded.
FOLDER.CREATED - A new folder is created.
FILE.RESTORED - A previously deleted file is restored.
FOLDER.RESTORED - A previously deleted folder is restored.
FILE.MOVED - A file is moved to a new location.
FILE.RENAMED - A file is renamed.
FOLDER.RENAMED - A folder is renamed.
FOLDER.MOVED - A folder is moved to a new location.
FILE.TRASHED - A file is moved to the trash.
FILE.DELETED - A file is permanently deleted.
FOLDER.TRASHED - A folder is moved to the trash.
FOLDER.DELETED - A folder is permanently deleted.
FILE.DOWNLOADED - A file is downloaded.
FOLDER.DOWNLOADED - A folder is downloaded.
FILE.COPIED - A file is copied to another location.
FOLDER.COPIED - A folder is copied to another location.
page_created - A new page is created in Confluence.
blogpost_created - A new blog post is created.
attachment_created - A new attachment is uploaded.
page_updated - An existing page is modified.
blogpost_updated - A blog post is updated.
attachment_updated - An attachment is updated.
page_deleted - A page is deleted from Confluence.
blogpost_deleted - A blog post is deleted.
attachment_deleted - An attachment is removed.
All other events are categorized as informational.
MessagesAdded - A new email message is added.
LabelsAdded - A label is added to an email.
LabelsRemoved - A label is removed from an email.
MessagesDeleted - An email message is deleted.
create - A new file or folder is created.
upload - A new file is uploaded.
edit - A file or folder is modified.
rename - A file or folder is renamed.
move - An item is moved to a different location.
delete - An item is permanently removed.
trash - An item is moved to the trash.
view - A file or folder is viewed.
download - A file is downloaded.
preview - A file is previewed.
print - A file is printed.
create_group - A new group is created.
create_user - A new user is created.
2sv_disable - Two-step verification is disabled.
2sv_enroll - A user enrolls in two-step verification.
password_edit - A user's password is modified.
recovery_email_edit - A recovery email is changed.
delete_group - A group is deleted.
delete_user - A user is deleted.
archive_user - A user is archived.
unarchive_user - A user is unarchived.
login_success - A user successfully logs in.
login_failure - A login attempt fails.
login_challenge - A login challenge occurs.
application_login_failure - An application login fails.
FileUploaded - A new file is uploaded.
FolderCreated - A new folder is created.
FileRestored - A previously deleted file is restored.
FolderRestored - A previously deleted folder is restored.
FileModified - A file is modified.
FileMoved - A file is moved to a new location.
FileRenamed - A file is renamed.
FolderModified - A folder is modified.
FileDeleted - A file is permanently deleted.
FolderDeleted - A folder is permanently deleted.
FileRecycled - A file is moved to the recycle bin.
FolderRecycled - A folder is moved to the recycle bin.
FileAccessed - A file is accessed.
FileDownloaded - A file is downloaded.
FilePreviewed - A file is previewed.
FolderCopied - A folder is copied.
DetachUserPolicy - A policy is removed from a user, altering permissions.
PutUserPolicy - A new policy is assigned to a user.
AttachGroupPolicy - A policy is attached to a group, affecting all its members.
DetachGroupPolicy - A policy is removed from a group.
PutGroupPolicy - A policy is assigned to a group.
AttachRolePolicy - A policy is attached to a role, modifying access rights.
DetachRolePolicy - A policy is removed from a role.
PutRolePolicy - A new policy is assigned to a role.
ChangePassword - A user changes their password.
AddUserToGroup - A user is added to a group, changing their access permissions.
RemoveUserFromGroup - A user is removed from a group.
SessionStart - A session begins.
SessionEnd - A session ends.
GenerateCredentialReport - A report on credentials is generated.
GetCredentialReport - A credential report is retrieved.
ListAccessKeys - Access keys for a user are listed.
ListUserTags - Tags associated with a user are retrieved.
ListUsers - Users within an AWS account are listed.
ListGroups - Groups within an AWS account are listed.
ListRoles - Roles within an AWS account are listed.
GetUser - Information about a specific user is retrieved.
GetGroup - Information about a specific group is retrieved.
GetRole - Information about a specific role is retrieved.
s3:ObjectRestore:Completed - An archived object has been fully restored and is now available.
s3:ObjectTagging:Delete - Tags for an object are removed.
s3:Replication:OperationMissedThreshold - The replication operation did not meet its threshold requirements.
s3:Replication:OperationReplicatedAfterThreshold - The replication operation succeeded after surpassing the threshold.
s3:IntelligentTiering - An object is moved between storage tiers.
GetContainerServiceMetadata - Retrieves metadata for a storage container.
ListContainers - Lists storage containers in an account.
BlobPreflightRequest - A request to verify blob upload conditions.
ListBlobs - Lists blobs in a container.
GetBlobProperties - Retrieves properties of a blob.
GetBlobMetadata - Retrieves metadata associated with a blob.
GetBlockList - Retrieves the list of blocks in a blob.
GetContainerACL - Retrieves the access control list of a container.
GetContainerMetadata - Retrieves metadata for a container.
CopyBlob - Copies a blob from one location to another.
CopyBlobSource - Identifies the source blob for a copy operation.
CopyBlobDestination - Identifies the destination blob for a copy operation.
DeleteBlob - Deletes a blob from a container.
DeleteBlobSnapshot - Deletes a snapshot of a blob.
DeleteContainer - Deletes a storage container.
PutBlob - Uploads a new blob to a container.
PutBlock - Uploads a block for a blob.
PutBlockList - Commits a set of uploaded blocks as a blob.
CreateBlobSnapshot - Creates a snapshot of an existing blob.
CreateBlockBlob - Creates a new block blob.
CreateContainer - Creates a new storage container.
SetBlobMetadata - Updates metadata for a blob.
SetBlobProperties - Updates properties of a blob.
SetContainerMetadata - Updates metadata for a storage container.
SetContainerACL - Modifies the access control list of a container.
AcquireBlobLease - Acquires a lease on a blob.
ReleaseBlobLease - Releases a lease on a blob.
RenewBlobLease - Renews a lease on a blob.
BreakBlobLease - Breaks an active lease on a blob.
AcquireContainerLease - Acquires a lease on a container.
BreakContainerLease - Breaks an active lease on a container.
ChangeBlobLease - Changes an active lease on a blob.
ChangeContainerLease - Changes an active lease on a container.
RenewContainerLease - Renews a lease on a container.
UndeleteBlob - Restores a deleted blob.
GetFileProperties - Retrieves properties of a file.
ListDirectoriesAndFiles - Lists directories and files in a share.
GetFile - Retrieves a file from a share.
GetFileRangeList - Retrieves the range list of a file.
GetShareStats - Retrieves statistics for a file share.
CreateShare - Creates a new file share.
PutRange - Uploads a range of data to a file.
SetShareMetadata - Updates metadata for a file share.
SetShareProperties - Updates properties of a file share.
SetDirectoryMetadata - Updates metadata of a directory.
SetDirectoryProperties - Updates properties of a directory.
ResizeFile - Resizes an existing file.
SetFileTier - Sets the tier of a file.
SetShareQuota - Updates the quota of a file share.
SetShareACL - Updates the access control list of a file share.
SetDirectoryACL - Updates the access control list of a directory.
SetFileACL - Updates the access control list of a file.
DeleteShare - Deletes a file share.
AcquireShareLease - Acquires a lease on a file share.
ReleaseShareLease - Releases a lease on a file share.
RenewShareLease - Renews a lease on a file share.
BreakShareLease - Breaks an active lease on a file share.
ChangeShareLease - Changes an active lease on a file share.
StartCopyFile - Initiates a file copy operation.
AbortCopyFile - Cancels an ongoing file copy operation.
CopyFileSource - Specifies the source file in a copy operation.
CopyFileDestination - Specifies the destination file in a copy operation.
CreateShareSnapshot - Creates a snapshot of a file share.
DeleteShareSnapshot - Deletes a snapshot of a file share.
UndeleteShare - Restores a deleted file share.
UndeleteFile - Restores a deleted file.
UndeleteDirectory - Restores a deleted directory.
RenameFile - Renames a file within a share.
RenameFileSource - Specifies the source file in a rename operation.
RenameFileDestination - Specifies the destination file in a rename operation.
RenameDirectory - Renames a directory within a share.
RenameDirectorySource - Specifies the source directory in a rename operation.
RenameDirectoryDestination - Specifies the destination directory in a rename operation.
COLLABORATION.CREATED - A collaboration event is created.
COLLABORATION.REMOVED - A collaboration is removed.
COLLABORATION.UPDATED - A collaboration is updated.
SHARED_LINK.CREATED - A shared link is created.
SHARED_LINK.UPDATED - A shared link is updated.
SHARED_LINK.DELETED - A shared link is deleted.
FILE.LOCKED - A file is locked for editing.
FILE.UNLOCKED - A file is unlocked for editing.
COMMENT.CREATED - A comment is added to a file.
COMMENT.UPDATED - A comment is updated.
COMMENT.DELETED - A comment is deleted.
METADATA_INSTANCE.CREATED - A metadata instance is created.
METADATA_INSTANCE.UPDATED - A metadata instance is updated.
METADATA_INSTANCE.DELETED - A metadata instance is deleted.
TASK_ASSIGNMENT.CREATED - A task is assigned.
TASK_ASSIGNMENT.UPDATED - A task assignment is updated.
SIGN_REQUEST.COMPLETED - A signature request is completed.
SIGN_REQUEST.DECLINED - A signature request is declined.
SIGN_REQUEST.EXPIRED - A signature request expired.
SIGN_REQUEST.SIGNER_EMAIL_BOUNCED - A signature request email bounced.
sync - A file or folder is synced.
request_access - Access to an item is requested.
approval_requested - An approval request is sent.
approval_completed - An approval request is completed.
approval_canceled - An approval request is cancelled.
approval_comment_added - A comment is added to an approval request.
approval_due_time_change - The due time for an approval request is changed.
approval_reviewer_change - The reviewer of an approval request is changed.
approval_reviewer_responded - A reviewer responds to an approval request.
deny_access_request - An access request is denied.
expire_access_request - An access request expires.
change_owner - The owner of an item is changed.
change_document_access_scope - The access scope of a document is changed.
change_document_visibility - The visibility of a document is changed.
change_acl_editors - The list of editors for a document is modified.
change_user_access - User access permissions are modified.
shared_drive_membership_change - Membership in a shared drive is changed.
shared_drive_settings_change - Shared drive settings are modified.
apply_security_update - Security updates are applied.
shared_drive_apply_security_update - A security update is applied to a shared drive.
shared_drive_remove_security_update - A security update is removed from a shared drive.
remove_security_update - A security update is removed.
enable_inherited_permissions - Inherited permissions are enabled.
disable_inherited_permissions - Inherited permissions are disabled.
recovery_phone_edit - A recovery phone number is changed.
recovery_secret_qa_edit - A recovery question or answer is changed.
account_disabled_password_leak - A user account is disabled due to a password leak.
account_disabled_generic - A user account is disabled.
account_disabled_spamming - A user account is disabled due to spamming.
account_disabled_spamming_through_relay - A user account is disabled for spamming via relay.
accept_invitation - A user accepts an invitation.
add_info_setting - An informational setting is added.
add_member - A new member is added to a group.
add_member_role - A role is assigned to a member.
add_security_setting - A security setting is added.
add_service_account_permission - A permission is assigned to a service account.
approve_join_request - A join request is approved.
ban_member_with_moderation - A member is banned.
change_info_setting - An informational setting is modified.
change_security_setting - A security setting is changed.
change_group_setting - A group setting is modified.
change_group_name - A group's name is changed.
change_first_name - A user's first name is changed.
change_password - A user's password is changed.
suspend_user - A user is suspended.
unsuspend_user - A user is unsuspended.
update_group_settings - A group's settings are updated.
user_license_assignment - A license is assigned to a user.
user_license_revoke - A license is revoked from a user.
add_group_member - A member is added to a group.
remove_group_member - A member is removed from a group.
change_user_access - User access permissions are changed.
change_acl_editors - The list of editors for a document is changed.
application_login_success - An application login succeeds.
alert_center_view - The alert center is accessed.
request_to_join - A request to join a group is sent.
request_to_join_via_mail - A request to join a group via email is sent.
approval_requested - An approval request is made.
approval_canceled - An approval request is canceled.
approval_comment_added - A comment is added to an approval request.
approval_completed - An approval request is completed.
approval_due_time_change - The due time of an approval request is changed.
approval_reviewer_change - The reviewer of an approval request is changed.
approval_reviewer_responded - A reviewer responds to an approval request.
deny_access_request - An access request is denied.
expire_access_request - An access request expires.
shared_drive_membership_change - Membership in a shared drive is changed.
shared_drive_settings_change - Shared drive settings are changed.
apply_security_update - A security update is applied.
remove_security_update - A security update is removed.
shared_drive_apply_security_update - A security update is applied to a shared drive.
shared_drive_remove_security_update - A security update is removed from a shared drive.
suspicious_login - A suspicious login is detected.
suspicious_login_less_secure_app - A suspicious login from a less secure app is detected.
suspicious_programmatic_login - A suspicious programmatic login is detected.
user_signed_out_due_to_suspicious_session_cookie - A user is signed out due to a suspicious session cookie.
FolderRenamed - A folder is renamed.
FileSensitivityLabelChanged - A file's sensitivity label is modified.
FileSensitivityLabelApplied - A sensitivity label is applied to a file.
SharingSet - Sharing permissions are updated.
AddedToGroup - A user is added to a group.
SiteDeleted - A SharePoint site is deleted.
GroupRemoved - A group is removed.
SharedLinkCreated - A shared link is created.
SharedLinkDisabled - A shared link is disabled.
SharingInvitationAccepted - A sharing invitation is accepted.
SharingRevoked - A sharing invitation is revoked.
AnonymousLinkCreated - An anonymous link is created.
SecureLinkCreated - A secure link is created.
SecureLinkUpdated - A secure link is updated.
SecureLinkDeleted - A secure link is deleted.
AccessInvitationAccepted - An access invitation is accepted.
AccessInvitationRevoked - An access invitation is revoked.
AccessRequestApproved - An access request is approved.
AccessRequestRejected - An access request is rejected.
FileCheckOutDiscarded - A file checkout is discarded.
FileCheckedIn - A file is checked in.
FileCheckedOut - A file is checked out.
SharingInheritanceBroken - Sharing inheritance is broken.
AddedToSecureLink - A user is added to a secure link.
RemovedFromSecureLink - A user is removed from a secure link.
SiteCollectionCreated - A new SharePoint site collection is created.