Scan with Getvisibility

Configure data sources

Configure detectors

Scan Configuration Fields

Description of the fields in the Scan Configuration popup

The screenshot below shows the fields that appear on the Scan Configuration screen.

Please note that not all of these fields are available for all Data Sources.

Name

Set a unique name so that the Data Source is easy to identify.

Credentials

This is a dropdown for selecting the credentials that have already been configured for the Data Source.

Geographic Location

This is to indicate the physical location of the server the data sits on.

Path

This only needs to be defined if a specific location needs to be scanned.

If left blank the entire Data Source will be scanned.

Data Owner

This is the person responsible for the data.

This setting is optional.

If the Data streaming check box is not visible it may be because the license for DDR is not present.

To learn more about getting a license for DDR please reach out to the Getvisibility Enablement Team.
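The fields above can be pictured as a single configuration record. Below is a minimal sketch in Python; the class and field names (`ScanConfig`, `scope`) are illustrative assumptions, not the product's actual schema:

```python
# Hypothetical sketch of a Scan Configuration record; names are illustrative,
# not the product's API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScanConfig:
    name: str                         # unique, so the Data Source is easy to identify
    credentials: str                  # reference to pre-configured credentials
    geographic_location: str = ""     # physical location of the server holding the data
    path: str = ""                    # empty -> scan the entire Data Source
    data_owner: Optional[str] = None  # optional: person responsible for the data

    def scope(self) -> str:
        # An empty Path means the whole Data Source is scanned.
        return "entire data source" if not self.path else f"path: {self.path}"
```

For example, `ScanConfig(name="HR share", credentials="smb-admin").scope()` returns `"entire data source"` because no path was set.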

Types of Scan

The two types of scan are the Trustee Scan and the File Scan.

Trustee Scan

This scan provides the list of Users and Groups on a Data Source.

File Scan

This scan provides information about files and folders on a Data Source, including structure and metadata.

Once both scans are completed the data is processed and the two sets are combined to show who has access to what files.
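The combination of the two scan results can be sketched as a simple join. The data shapes below (`trustees`, `files`, `who_has_access`) are illustrative assumptions, not the product's actual schema:

```python
# Illustrative sketch: joining trustee-scan output (users/groups) with
# file-scan output (files and their permission entries) to answer
# "who has access to what". Data shapes are assumptions.
def who_has_access(trustees, files):
    """trustees: {group_name: [user, ...]}; files: {path: [trustee_name, ...]}."""
    access = {}
    for path, entries in files.items():
        users = set()
        for entry in entries:
            # A permission entry may name a group (expand it) or a single user.
            users.update(trustees.get(entry, [entry]))
        access[path] = sorted(users)
    return access

groups = {"hr-team": ["alice", "bob"]}
perms = {"/hr/salaries.xlsx": ["hr-team", "carol"]}
print(who_has_access(groups, perms))  # {'/hr/salaries.xlsx': ['alice', 'bob', 'carol']}
```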

Data Source Permissions

How to find the list of permissions granted for a Data Source

The required permissions for scanning are documented by Data Source.

For more information please review the list here.

To check the configured permissions for a Data Source, navigate to Administration > Data Sources and click on the hamburger menu.

In the dropdown, click Permissions:

The example below shows the permissions for SharePoint Online.

Scan History

How to find the history of Scans performed on a Data Source

Option 1:

  1. Go to Administration > Data Sources

  2. Click on a Data Source

  3. Click on the “Last Scan Status” symbol

Option 2:

  1. Go to Administration > Data Sources

  2. Click on a Data Source

  3. Find the required Hamburger Menu

  4. Click on Scan History

Either of the above options will show the history of scans performed on the relevant Data Source.

Scan Scheduler

How to set a specific schedule for a scan.

When a Data Source is added to Getvisibility for scanning, the scan begins automatically.

If a rescan is needed this can be configured by clicking on Administration > Data Source > (the Data Source that needs Rescan e.g. One Drive) > Hamburger menu > Rescan Scheduler.

The default configuration is Does Not Repeat.

By clicking the drop-down menu other options can be chosen:

Daily

With this option both the time zone and time of day can be chosen.

Weekly

With this option, in addition to the above configuration, one or more specific days of the week can be chosen.

Monthly

This gives the option to pick a specific day or days each month to run the rescan.
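A scheduler of this kind boils down to a next-run computation. The option names below mirror the UI; the `next_run` helper itself is an illustrative assumption, not the product's implementation:

```python
# Rough sketch of how a rescan scheduler might compute the next run time.
# Option names mirror the UI; the logic is an assumption for illustration.
from datetime import datetime, timedelta

def next_run(last_run: datetime, repeat: str, weekdays=None):
    if repeat == "Does Not Repeat":
        return None                        # default: no automatic rescan
    if repeat == "Daily":
        return last_run + timedelta(days=1)
    if repeat == "Weekly":
        # Advance day by day until we hit one of the chosen weekdays (0=Monday).
        candidate = last_run + timedelta(days=1)
        while candidate.weekday() not in (weekdays or [last_run.weekday()]):
            candidate += timedelta(days=1)
        return candidate
    raise ValueError(f"unsupported repeat option: {repeat}")

run = datetime(2024, 5, 6, 2, 0)              # a Monday, 02:00
print(next_run(run, "Weekly", weekdays=[4]))  # 2024-05-10 02:00:00 (Friday)
```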

Import Data Controls

Getvisibility DDR offers a Quick Start option for enabling out-of-the-box data controls.

  1. Go to Administration > Quick Start.

  2. Under the Data Controls section, enable predefined DDR rules, such as:

    • Public Exposure of Personal Identifiable Information (PII).

    • Detection of Protected Health Information (PHI).

    • Monitoring of Payment Card Industry (PCI) data.

  3. Import the desired Control Rules to start monitoring immediately.

How to Configure DDR Rules

Create Scan Configuration

To configure DDR rules, follow these steps:

  1. Access the Getvisibility DDR dashboard using your credentials.

  2. Under the DDR tab, select Create Scan Configuration to connect to the data sources to be monitored.

    1. Define Scopes: Specify the data sources that will be connected to.

    2. Verify Configuration: Ensure that at least one data source is successfully connected. A green checkmark will confirm the completion.

Check for Incoming Events

Once the scan configuration is complete:

  1. Go to Administration > Live Events Streaming to view real-time events.

  2. Monitor Event Activity: Filter events by source, user name, action type (create, update, delete), and event type.

Overview Page

The Overview Page provides a comprehensive view of DDR's performance:

  1. Event Statistics: Displays the number of events by source, such as Google Drive, SharePoint, OneDrive, and Box.

  2. Data Source Activity: Visualizes active data sources and the volume of events generated by each.

  3. Event Timeline: Shows when events occurred, helping identify peak activity periods and anomalies.

Open Risks

The Open Risks section highlights detected threats, categorised by risk type:

  • Public Exposure: Identifies sensitive files accessible to external users via public links.

  • External Sharing: Detects files shared outside the organisation, potentially exposing sensitive information.

  • Internal Over-Sharing: Flags data with excessive permissions within the organisation.

For each risk, DDR provides detailed insights, including the file path, user activity, and recommended remediation steps.

What is DDR?

A brief description of DDR

Getvisibility's Data Detection and Response (DDR) solution is designed to protect sensitive data by providing near real-time detection and response capabilities. It ensures that data across user environments is constantly monitored and any potential threats are flagged immediately. DDR focuses on data-centric security, ensuring organisations have visibility and control over their critical information assets.

Key Features of DDR:

  1. Real-Time Monitoring: DDR continuously monitors data activities, including access, modification, sharing, and deletion, to identify suspicious and malicious events.

  2. Automated Response: DDR sends instant alerts for quick remediation.

  3. Risk Mitigation: It ensures regulatory compliance with privacy standards such as GDPR, HIPAA, PCI-DSS, and CCPA.

  4. AI-Powered Insights: DDR leverages Getvisibility's proprietary AI-mesh models to analyse data context for the best accuracy.

  5. Data Intelligence: It provides dashboards with visibility into sensitive data and risks to your data.

How DDR Works:

  1. Data Analysis: DDR identifies all data across unstructured data environments and then classifies the data based on its content and context.

  2. Risk Analysis: It evaluates user access, permissions, sharing and data location to identify risks related to your data.

  3. Policy Enforcement: DDR applies predefined and custom security policies to protect data based on its classification and sensitivity.

  4. Incident Response: Upon detecting a threat, DDR generates alerts and enables users to take remediation actions, such as moving files or revoking access.

Supported File Types

Full list of file types that can be scanned by DSPM:

  • DOC

  • RTF

  • ODT

  • ODS

  • DOCX

  • DOCM

  • XLS

  • XLSX

  • XLSM

  • PPT

  • PPTX

  • PPTM

  • PDF

  • VSD

  • TXT

  • C

  • H

  • DESC

  • CSV

  • TSV

  • XML

  • XHTML

  • HTML

  • HTM

  • EML

  • MSG

  • PNG

  • JPG

  • JPEG

  • TIFF

  • TIF

Supported Data Sources

Below is a list of Data Sources that Getvisibility DDR (Streaming) currently supports:

  • AWS IAM

  • AWS S3

  • Azure AD

  • Azure Blob

  • Azure Files

  • Exchange Online

  • OneDrive

  • SharePoint Online

  • Box

  • Confluence Cloud

  • Gmail

  • Google Drive

  • Google IAM

  • SMB

  • LDAP (Windows AD)
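The Live Events filtering described in this documentation (by source, user name, and action type) can be sketched as follows; the event fields mirror the filter names, while the `filter_events` helper itself is an illustrative assumption:

```python
# Minimal sketch of filtering live streaming events; field names are assumed.
def filter_events(events, source=None, user=None, action=None):
    return [e for e in events
            if (source is None or e["source"] == source)
            and (user is None or e["user"] == user)
            and (action is None or e["action"] == action)]

events = [
    {"source": "OneDrive", "user": "alice", "action": "create"},
    {"source": "Box", "user": "bob", "action": "delete"},
    {"source": "OneDrive", "user": "bob", "action": "update"},
]
print(filter_events(events, source="OneDrive", action="update"))
# [{'source': 'OneDrive', 'user': 'bob', 'action': 'update'}]
```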

SMB

How to configure SMB/CIFS connection for scanning

Configuring SMB connector in Dashboard

  • Navigate to Administration -> Data Sources -> SMB -> New scan

  • Enter the details of the SMB server to scan

    • Name: Give a name to the scan to identify it later

    • Host IP Address: The IP Address of the SMB/CIFS server

    • Domain/Workgroup: The domain or workgroup to which the CIFS/SMB server belongs

    • Port: 445 is the default port; if the default port is not used, input the correct port number for the SMB protocol

    • Username: The user must be admin level and have access to all the SMB/CIFS shares to be scanned

    • Password: Password for the admin user

  • Click on the Folder icon in Path to select a particular share/folder to scan, or leave the path empty to scan all shares

  • Save the configuration

  • Once the configuration is saved, click on the icon on the right and select Start file scan to begin scanning

  • The scan results can be viewed under Dashboard -> Enterprise Search

Supported Protocols:

The connector supports all SMB dialects up to SMB 3.1.1

Confluence Cloud

How to configure Atlassian Confluence Cloud connection to scan it.

URLs for whitelisting if a proxy is in place:

https://{your-domain}/wiki/api/v2
https://{your-domain}/wiki/rest/api
*.atlassian.com
*.atlassian.net

Generating an API token

  • Log in to https://id.atlassian.com/manage-profile/security/api-tokens

  • Click Create API token

  • From the dialog that appears, enter a memorable and concise Label for the token and click Create

  • Click Copy to clipboard, and save the token somewhere secure. It isn't possible to view the token after closing the creation dialog

Configuring Confluence Cloud connector in Dashboard

  • Navigate to Administration -> Data Sources -> Confluence Cloud -> New scan

  • Enter the details

    • Name: Give a name to the scan to identify it later

    • Username: The email address for the Atlassian account you used to create the token

    • API Token: The API token created in previous steps

    • Domain: The Atlassian domain

  • Click on the Folder icon in Path to select a particular space to scan, or leave the path empty to scan all spaces

  • Save the configuration

  • Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin the trustee scanning

  • The scan results can be viewed under Dashboard -> Access Governance

  • Click on the icon on the right and select Start file scan to begin the file scanning

  • The results can be viewed under Dashboard -> Enterprise Search

Rescan Files

When a targeted rescan is needed it is possible to scan individual files or a specific selection.

Reasons for a rescan can include:

  • Ensuring that recent changes to files are reflected in the UI.

  • New patterns have been added to Pattern Matching.

  • New rules have been added in Controls Orchestration.

Files can be sent for rescan individually by clicking on the hamburger menu for that file and clicking on “send to classification pipeline”.

There is also an option to reclassify multiple files at once by selecting them using the tickboxes on the left of the screen.

Once the required files are selected the option to rescan appears on the bottom right of the screen.

LDAP

How to configure LDAP connection to gather permissions and access rights for groups, users, and other entities (Trustees) on an LDAP server.

Configuring LDAP connector in Dashboard

  • Navigate to Administration -> Data Sources -> LDAP -> New scan

  • Enter the details of the LDAP server to scan

    • Name: Give a name to the scan to identify it later

    • IP Address: The IP Address of the server where the LDAP is installed

    • Port: 389 is the default port for LDAP; for Secure LDAP, 636 is used

      • Use the Global Catalog ports 3268 (LDAP) and 3269 (LDAPS) if the standard ports do not allow traversal of the whole LDAP tree

    • Certificate (Optional): If the server to be scanned uses LDAPS (LDAP over SSL/TLS) enter the certificate text here. Otherwise leave it blank

    • Username: The user must be admin level and have access to all the LDAP utilities to be scanned. The username should be entered in the format [email protected]

    • Password: Password for the admin user

    • Inactivity: This defines inactive users. Default is 90 days

    • Search base: This is the point in the LDAP directory where Focus will start searching from. In this example:

      • DC stands for Domain Component, an attribute used to represent domain levels

      • aws-gv is the name of the first-level domain

      • local is the top-level domain

      Together, DC=aws-gv,DC=local represents the domain aws-gv.local

  • Save the configuration

  • Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin scanning

  • The scan results can be viewed under Dashboard -> Access Governance

ChatGPT

How to configure ChatGPT connection for scanning.

API key

Owners can generate an API key in the OpenAI API Platform Portal. Note that the correct Organization must be selected when creating a key, corresponding to the administered workspace. Do not select the owner's personal organization.

  • Create a new API key:

    • Settings: Default Project | All Permissions

    • Note that this must be a new key. Once the Compliance API scopes are granted, all other scopes are revoked.

    • Reminder: This key can only be viewed/copied once. Store it securely.

  • Send an email to [email protected] with:

    • The last 4 digits of the API key

    • The Key Name

    • The Created By Name

    • The requested scope (read and delete)

  • The OpenAI team will verify the key and grant the requested Compliance API scopes.

  • Administrators may then use this key or pass it to a partner for use with the Compliance API.

  • Workspace IDs can be found on the Admin dashboard

Configuring ChatGPT connector in Dashboard

  • Navigate to Administration -> Data Sources -> ChatGPT -> New scan

  • Provide the workspace ID and the API key value obtained from the above steps

  • Click on the Folder icon in Path to select a particular user or GPT to scan, or leave the path empty to scan all

  • Save the configuration

  • Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin the trustee scanning

  • The scan results can be viewed under Dashboard -> Access Governance

  • Click on the icon on the right and select Start file scan to begin the file scanning

  • The results can be viewed under Dashboard -> Enterprise Search

Enabling Microsoft O365 Streaming with on-premise or private cloud DDR deployments

Overview

This guide outlines how to configure Microsoft O365 Streaming in environments where Getvisibility’s Data Detection and Response (DDR) platform is deployed on-premise or in a private cloud. The integration enables DDR to receive and act upon real-time Microsoft 365 activity notifications.

Prerequisites

Ensure the following prerequisites are in place before starting the integration:

  • A deployed and operational DDR instance.

  • A public DNS record pointing to the DDR listener endpoint.

  • A valid SSL/TLS certificate from a trusted Certificate Authority.

  • An internet-accessible port 443 (HTTPS) endpoint.

  • Firewall rules allowing inbound traffic from Microsoft Graph servers.

Deployment Steps

Step 1: Expose DDR Webhook Endpoint

Make sure the DDR webhook endpoint is:

  • Publicly accessible via a fully qualified domain name (FQDN).

  • Protected with a valid SSL/TLS certificate.

  • Accessible on port 443 (HTTPS).

Note: You can use a reverse proxy (e.g., NGROK, NGINX) to securely expose internal services if needed.

Step 2: Configure Firewall for Microsoft Graph

Microsoft recommends restricting webhook traffic to only allow inbound requests from Microsoft Graph servers. This reduces the attack surface and prevents spoofed webhook messages.

Allowlist Required Endpoints. More info at:

  • Microsoft 365 URLs and IP Address Ranges

  • Additional Microsoft 365 IP Addresses and URLs

  • Graph Change Notification Delivery – Firewall Configuration

⚠️ Action Required: Your firewall or reverse proxy must allow inbound HTTPS traffic from all IP addresses Microsoft uses to deliver change notifications. Regularly update your rules using Microsoft’s published IP ranges.

Scan Analytics

Scan Analytics shows in-depth information gathered during the scan.

There are two ways to access Scan Analytics: via the main Analytics Dashboard or via the Data Sources page.

Analytics Dashboards

To access the Analytics Dashboards click on the link on the Getvisibility homepage.

Analytics by Data Source

To access a per Data Source drill-down click on Administration > Data Sources > Data Source of Choice (e.g. OneDrive) > Analytics Icon

In the sidebar, detailed information regarding the scan of the chosen Data Source can be reviewed.

Clicking on any of the fields in the sidebar brings up a more detailed view of the data as well as giving the option to Remediate any issues that have been found.

For a more detailed breakdown of Analytics please see .

Monitoring New Files via DDR Streaming

Getvisibility DDR continuously monitors new files generated through streaming and provides real-time insights.

  1. Filter by Streaming: Under Enterprise Search, use the filter scanTrigger=streaming.

  2. View File Details: DDR displays:

    1. File Path: The location of the file in the data source.

    2. Classification: Sensitivity level (Confidential, Highly Confidential, etc.).

    3. Risk Level: Based on context and user activity.

    4. Compliance Tags: Indicators for GDPR, HIPAA, PCI, and other regulations.

    5. Detection Rules: The specific DDR rules triggered by the file.

  3. Incident Response: If a high-risk file is detected, DDR generates an alert and suggests remediation steps, such as quarantining the file or revoking access.

SharePoint on-premise

How to configure SharePoint On-Premise connection to scan it.

Configuring SharePoint On-Premise connector in Dashboard

  • Navigate to Administration -> Data Sources -> SharePoint On-Premise -> New scan

  • Provide the Domain URL, an admin username and its password

  • Click on the Folder icon in Site and path to select a particular site to scan, or leave the path empty to scan all sites

  • Save the configuration

  • Once the configuration is saved, click on the icon on the right and select Start file scan to begin the scanning

  • The results can be viewed under Dashboard -> Enterprise Search

File tagging

Prerequisites

  • An admin level user is required to scan and tag files in SharePoint On-Premise. The user must be a member of the Site Owners Group, where they have full control permissions to the SharePoint site.

  • The default Getvisibility tags need to be created as a new column in SharePoint. This process is described below:

    • In SharePoint, navigate to Documents

    • In the files view, select + Add column

    • Select Choice and then Next

    • Give the name as Classification and the choices as: Public, Internal, Confidential, Highly-Confidential. Select Save

    • Similarly create Compliance and Distribution columns (if required)

    • Getvisibility and SharePoint's tags are now aligned

  • When tags are written to SharePoint files automatically over the API, as the tags are added by Getvisibility, Modified By changes to System Account.

    • Getvisibility preserves the Modified date where applicable.

Supported SharePoint On-Premise versions:

The connector supports SharePoint 2013, 2016, 2019.

AWS IAM

How to configure IAM connection to gather permissions and access rights for groups and users on AWS IAM.

Create a policy

  • Sign in to the AWS Management Console and open the IAM console with the appropriate admin level account

  • In the navigation pane on the left, choose Policies and then choose Create policy

  • In the Policy editor section, find the Select a service section, then choose the IAM service, and select Next

  • In Actions allowed, choose the below actions to add to the policy:

    • Read > GetUser

    • Read > GetPolicyVersion

    • Read > GetPolicy

    • Read > GetUserPolicy

    • List > ListUserPolicies

    • List > ListAttachedGroupPolicies

    • List > ListAttachedUserPolicies

    • List > ListGroups

    • List > ListUsers

    • List > ListGroupsForUser

  • For Resources, choose All and select Create policy to save the new policy

Create a user

  • Sign in to the AWS Management Console and open the IAM console with the appropriate admin level account

  • In the navigation pane on the left, choose Users and then choose Create user

  • On the Specify user details page, under User details, in User name, enter the name for the new user, for example iam-connector-user, and select Next

  • On the Set permissions page, select Attach policies directly and choose the policy created in the above steps

  • Select Next

  • Once the user is created, select it, and from the user page, choose Create access key

  • Select Other then Next

  • Enter a description if you wish and select Create access key

  • The Access and Secret Access Keys have now been created. These can be downloaded as a CSV, and also copied from this section. NOTE: the secret access key cannot be viewed once you leave this page

Configuring AWS IAM connector in Dashboard

  • Navigate to Administration -> Data Sources -> AWS IAM -> New scan

  • Provide the access key and secret access key values generated in the above steps

  • Save the configuration

  • Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin the scanning

  • The scan results can be viewed under Dashboard -> Access Governance

Streaming

The integration of Data Streaming and File Lineage into the DSPM platform provides a comprehensive solution for real-time data monitoring and tracking across both cloud and on-premises data sources. This enhancement enables organizations to dynamically track file origins, data transformations and movements, and end-usage in real time, strengthening security, compliance, and auditability. By introducing these functionalities, businesses can seamlessly monitor data activities and movements across various data sources, providing up-to-date visibility over the data estate and offering deeper insights into file history for e-forensics use cases and risk mitigation.

By implementing Streaming, we unlock crucial use cases such as File Lineage tracking, and Data Detection and Response capabilities, enabling real-time visibility into data activities. This also builds the foundation for anomaly detection capabilities, frequently requested by customers. For instance, scenarios like a user resetting their password, accessing confidential data, and downloading it can be quickly identified. By providing almost real-time updates and visibility into the data estate, businesses can seamlessly monitor data activities, mitigating risks and improving security.

PRECONDITION:

During cluster installation, network administrators need to open a firewall exclusion for incoming requests for the path:

https://${HOST_DOMAIN}/scan-manager/external/webhooks/notification

where ${HOST_DOMAIN} is the host domain of the DSPM platform installation.

The host domain needs to be publicly available on the web.

Ensure that the certificate used is one that is trusted by the Data Source provider. For example, with Microsoft services more information on the certificates that they accept can be found here.

For Data Detection and Response (DDR) to function effectively, the callback endpoint URL must remain open and accessible beyond just the initial setup phase. DDR relies on real-time event notifications and data stream updates, continuously sent to the callback URL. If the callback endpoint is closed or restricted after setup, DDR will fail to receive critical updates, which may result in:

  • Delayed or missing alerts on data access, movement, or security threats.

  • Incomplete monitoring of file lineage and activities, impacting compliance and forensic investigations.

To ensure uninterrupted functionality, organisations must configure their network to allow incoming requests to the callback URL from all necessary data sources.

Additionally, for on-premise deployments, it is critical that the webhook URL is accessible by external resources to receive notifications. If external services cannot reach the callback URL, DDR will not function correctly, leading to missed event detections and security blind spots. Network administrators must ensure the necessary firewall rules and routing configurations are in place to allow external communication with the webhook.

Multitenancy Setup

For the multitenancy setup, we need to specify ${HOST_DOMAIN} as:

{{ .Values.clusterLabels.cluster_name }}.{{.Values.clusterLabels.rancher}}.app.getvisibility.com

Supported Languages for ML Classifiers

Listed below are the languages supported by the ML (Machine Learning) classifiers, grouped by language pack (Name: Languages in Pack):

  • Chinese: English, Chinese (Simplified, Traditional)

  • Finnish: English, Finnish

  • West-Slavic-3: English, Polish, Czech, Slovak

  • German-Dutch: English, German, Dutch

  • Nordic-3: English, Danish, Swedish, Norwegian

  • Hebrew: English, Hebrew

  • Greek: English, Greek

  • Korean: English, Korean

  • Thai: English, Thai

  • Arabic: English, Arabic

  • Turkish: English, Turkish

  • Hindi: English, Hindi

  • Latin-5: English, French, Spanish, Portuguese, Italian, Romanian

  • Japanese: English, Japanese

If additional language packs are needed after the initial setup please reach out to support for assistance, as each additional pack is a separate AI model that needs to be added.
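The LDAP Search base described above (DC=aws-gv,DC=local for the domain aws-gv.local) can be derived mechanically from a DNS domain name. The helper below is illustrative, not part of the product:

```python
# Sketch: deriving an LDAP search base from a DNS domain name, matching the
# DC=aws-gv,DC=local example in the LDAP section. Illustrative helper only.
def domain_to_search_base(domain: str) -> str:
    # Each dot-separated label becomes one Domain Component (DC) attribute.
    return ",".join(f"DC={label}" for label in domain.split("."))

print(domain_to_search_base("aws-gv.local"))  # DC=aws-gv,DC=local
```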

    Google IAM

    How to configure Google IAM connection to gather permissions and access rights for trustees.

    Create OAuth2 Credentials

    • Create a Project in Google Cloud Console:

Go to the Google Cloud Console

      • Create a new project or select an existing project

    • Enable the Admin SDK:

      • In the Google Cloud Console, navigate to the "APIs & Services" > "Library"

      • Search for "Admin SDK" and click on it

    • Create OAuth 2.0 Credentials:

      • In the Google Cloud Console, go to APIs & Services > Credentials

      • Click "Create credentials" and select "Service account"

    Delegate domain-wide authority to your service account

From your domain's Admin console, go to Main menu > Security > Access and data control > API controls

    • In the Domain wide delegation pane, select Manage Domain Wide Delegation

    • Click Add new

    • In the Client ID field, enter the client ID obtained from the service account creation steps above

    • In the OAuth Scopes field, enter a comma-delimited list of the scopes required for the application

    • Use the below scopes:

      • https://www.googleapis.com/auth/admin.directory.user.readonly

      • https://www.googleapis.com/auth/admin.directory.domain.readonly

      • https://www.googleapis.com/auth/admin.directory.group.readonly

    Required scopes

    • DirectoryService.Scope.AdminDirectoryUserReadonly

    • DirectoryService.Scope.AdminDirectoryDomainReadonly

    • DirectoryService.Scope.AdminDirectoryGroupReadonly

    • DirectoryService.Scope.AdminDirectoryRolemanagementReadonly

    Configuring Google IAM connector in Dashboard

    • Navigate to Administration -> Data Sources -> Google IAM -> New scan

Enter the details of the OAuth2 credentials obtained previously

    • Save the configuration

    • Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin scanning

    • The scan results can be viewed under Dashboard -> Access Governance
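The OAuth Scopes field in the domain-wide delegation pane expects a single comma-delimited string. The scope URLs below are the ones listed in the steps above; the helper itself is an illustrative sketch:

```python
# Sketch: assembling the comma-delimited scope string pasted into the
# "OAuth Scopes" field for domain-wide delegation. Scope URLs come from the
# steps above; the helper is illustrative.
READONLY_SCOPES = [
    "https://www.googleapis.com/auth/admin.directory.user.readonly",
    "https://www.googleapis.com/auth/admin.directory.domain.readonly",
    "https://www.googleapis.com/auth/admin.directory.group.readonly",
]

def delegation_field(scopes) -> str:
    # The Admin console expects one comma-delimited string, not a list.
    return ",".join(scopes)

print(delegation_field(READONLY_SCOPES))
```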

    Gmail

This document provides information on how to configure a Gmail connection for the Focus product.

    Create OAuth2 Credentials

    • Create a Project in Google Cloud Console:

Go to the Google Cloud Console

      • Create a new project or select an existing project

Enable the Gmail API:

      • In the Google Cloud Console, navigate to the "APIs & Services" > "Library"

      • Search for "Gmail API" and click on it

    • Create OAuth 2.0 Credentials:

      • In the Google Cloud Console, navigate to the "APIs & Services" > "Credentials" tab

      • Click "Create credentials" and select "Service account"

    Delegate domain-wide authority to your service account

From your domain's Admin console, go to Main menu > Security > Access and data control > API controls

    • In the Domain wide delegation pane, select Manage Domain Wide Delegation

    • Click Add new

    • In the Client ID field, enter the client ID obtained from the service account creation steps above

    • In the OAuth Scopes field, enter a comma-delimited list of the scopes required for the application

    • Use the below scopes:

      For scanning

      • https://www.googleapis.com/auth/admin.directory.user.readonly

      • https://www.googleapis.com/auth/gmail.readonly

      For tagging

    AWS S3 Streaming Configuration

    This document provides information on configuring an AWS S3 connection with real-time event monitoring and data streaming.

    To enable DDR (Streaming) for an existing AWS S3 scan, follow these steps:

    Prerequisites

    Existing AWS S3 connection:

1. An AWS S3 scan configuration must already exist.

  • If an AWS S3 scan has not been configured yet, follow the AWS S3 configuration guide and ensure the necessary credentials are set up.

2. Extend the AWS S3 policy permissions to allow data streaming. A separate set of permissions is required for the AWS SNS service:

    Steps to Enable Data Streaming

Select an Existing Scan Configuration

  1. Go to the Scan Configurations page in the product UI.

  2. Select AWS S3 and create credentials for AWS S3.

  3. Find the AWS S3 scan configuration and select Edit Configuration from the options menu.

To validate that streaming events are coming through the system, check Administration -> Live Events -> Streaming
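For context, routing S3 object events to an SNS topic uses a bucket notification configuration with the shape sketched below. The topic ARN is a placeholder and the helper is illustrative; the exact permissions and configuration the product applies may differ:

```python
# Hedged sketch: shape of an S3 bucket notification configuration that routes
# object events to an SNS topic, as used for streaming. The topic ARN is a
# placeholder; applying it (e.g. via boto3) is out of scope here.
def s3_streaming_notification(topic_arn: str) -> dict:
    return {
        "TopicConfigurations": [{
            "TopicArn": topic_arn,
            "Events": [
                "s3:ObjectCreated:*",   # new and overwritten objects
                "s3:ObjectRemoved:*",   # deletions
            ],
        }]
    }

cfg = s3_streaming_notification("arn:aws:sns:eu-west-1:123456789012:example-topic")
print(cfg["TopicConfigurations"][0]["Events"])
```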

    Google Drive Streaming Configuration

    This guide provides steps on how to enable real-time data streaming for a Google drive connection and monitor streaming events within the Getvisibility platform.

    Steps to Enable Data Streaming for Google drive

1. From the Data Sources page, select Google Drive from the list of available data sources. In the Scan Configurations list, create a New Configuration.

2. Make sure the connection has a Name and Credentials set, then click the Data streaming toggle and click Save & Close to finalize the changes.

3. Clock icon: While data streaming is being activated, the "Requested" status will appear, indicating that the subscription is being processed. Once the subscription is activated, this status will change to "On".

4. After enabling Data Streaming, the system will automatically handle the subscription to Google Drive’s real-time events. There is no need to manually configure Webhooks.

    Monitoring Real-Time Events

    After the subscription is activated, real-time events will start flowing into the platform, and can be monitored from the relevant parts of the platform.

    Viewing Events in the Live Events Section

    1. Go to the Live Events section under Administration to view a detailed audit log of all streaming events.

    2. Filter by source to get only Google Drive events

    Viewing Extended Streaming Events in the Live Events Section

    Overview

    Extended streaming events provide deeper insights into file activities within Google Drive by leveraging the admin.reports.audit.readonly permission. This allows the system to capture additional event types beyond standard data streaming, such as file permission changes. These events are crucial for comprehensive monitoring, alerting, and data lineage tracking within the platform.

    Prerequisites

    Before enabling extended streaming events, ensure that:

    • The required permission https://www.googleapis.com/auth/admin.reports.audit.readonly is granted to your Google Drive connection.

    • You have followed the delegation process as outlined in the Getvisibility documentation.

    Enabling Extended Streaming Events

    If the necessary permission was not granted at the time of the initial streaming subscription, click on unsubscribe and then re-subscribe to streaming events from the Data Sources view.

    Steps:

    1. Go to the Data Sources section under Administration.

    2. Locate the Google Drive connection.

    3. If extended streaming is not enabled, uncheck the "Data streaming" box to unsubscribe from streaming events.

    4. Ensure that the admin.reports.audit.readonly permission is granted as per the prerequisites.

    Monitoring Extended Streaming Events

    Once extended streaming is enabled, events will be available for monitoring in multiple sections of the platform:

    Live Events Section

    • Go to Live Events under Administration to view real-time extended events.

    • Use the filter options to narrow down events to only Google Drive activities.

    • Extended events such as permission changes, sharing modifications, and file deletions will be listed.

    Data Lineage Tracking

    • Extended events are integrated into Data Lineage, providing a clear visualization of file activity over time.

    • Users can track who performed actions on a file and when, enabling forensic investigation and compliance tracking.

    Alerting and Monitoring

    • Alerts can be configured for specific event types such as sensitive file shared externally, file permissions changed, or file deletion.

    • These alerts help organizations proactively detect potential security risks or data leaks.

    Box Streaming Configuration

    This guide provides steps on how to enable real-time data streaming for a Box connection and monitor streaming events within the Getvisibility platform.


    Prerequisites

    1. Existing Box app

    • If you haven't created Box credentials yet, follow this guide and ensure the necessary credentials are set up.

    2. Additional scope to Box app

    • Data streaming requires the Manage webhooks scope, as shown below

    • Confirm that Manage Webhooks is present in App Scopes

      • Exit the Dev Console and switch to the Admin Console

    Steps to Enable Data Streaming for Box

    1. Create a New Scan Configuration

    1. From the Data Sources page, select Box from the list of available data sources. In the Scan Configurations list, create a New Configuration.

    2. Make sure the connection has a Name and Credentials set. Then select the Path icon.

    2. Pick a Folder for Real-Time Events

    1. Click on the Folder icon in the Path field to select the folder you want to monitor for real-time events.

      • Magnifying glass icon: Folders with this icon next to them indicate that real-time events can be subscribed to from this directory.

    2. After selecting the folder, click Save & Close to finalize the changes.

    View the created webhook in Box UI

    You can view the configured webhook in your Box Dev Console

    1. Log in to your Box account and navigate to the Dev Console

    2. Select the Box app configured in previous steps

    3. Navigate to the Webhooks tab, where you can see the list of configured webhooks

    Monitoring Real-Time Events

    After the subscription is activated (green magnifying glass icon), real-time events will start flowing into the platform, and you will be able to monitor them from various sections of Getvisibility.

    Viewing Events in the Live Events Section

    1. Navigate to the Live Events section under Administration to view a detailed audit log of all streaming events.

    2. In this section, you can filter and view event details.

    iManage Cloud

    How to create an iManage Connector app to connect to iManage Cloud accounts.

    Registering an iManage App

    • To register an iManage App, contact iManage support by sending an email to [email protected]

    • Once an account is created, log in to iManage

    • Click on your username in the upper-right corner and click Control Center

    • Note: Only users with admin role have access to Control Center

    • Go to the Applications menu item, click Desktop Auth Client and find Client ID

    • Customer ID should be provided by iManage admins, but if it is not provided, it can be retrieved from the /api response

      • Get Access Token

      • Get Customer ID

    • Click on the Folder icon in Path to select a particular path to scan, or leave the path as empty to scan all

    • Save the configuration

    • Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin Trustee scanning

    • The scan results can be viewed under Dashboard -> Access Governance

    • Click on the icon on the right and select Start file scan to begin file scanning

    • The results can be viewed under Dashboard -> Enterprise Search

    Enterprise Search Columns Explained

    This guide explains the features available on the files Explore page of the Focus platform.

    Accessing the Enterprise Search page:

    From the Getvisibility Dashboard, select the 'Enterprise Search' section.

    Enterprise Search dashboard.

    Filters and options explained:

    • Search: Use the search bar to type in a file or folder name to find specific items.

    • Utilise the dropdown menus to filter the files by:

      • File Extension: Choose from a list of file types.

      • Source: Select the source of the files.

    Reviewing File Details:

    Each file listed will have details displayed such as:

    • Path: The directory path of the file.

    • Category and Subcategory: The classification categories of the file.

    • Classification: The sensitivity level of the file.

    • Compliance Tags: Any compliance-related tags assigned to the file.

    All available columns will be listed:

    Exporting Data:

    • Use the 'EXPORT' button to create a CSV or JSONL download of the filtered data

    Clearing Search:

    • Click 'CLEAR SEARCH' to reset all filters and search criteria.

    File Actions:

    Permissions & Access Rights:

    • To view or modify the permissions and access rights of a file, click on the Actions menu.

    • A detailed list will appear showing the Security Identifier (SID), the common name associated with that SID, the organizational unit, domain, and the specific permissions granted.

    Manually Verify ML Classification:

    To adjust the machine learning classification of a file, click on the pencil icon under the 'Actions' column.

    A dialog box titled 'Verify ML Classification' will appear, allowing you to change:

    • The 'Category' and 'Subcategory' of the document.

    • The 'Compliance' tags associated with the file.

    • The 'Classification' level (e.g., Public, Internal, Confidential).

    • Confidence levels for the category and subcategory will be displayed, providing insight into the ML model's certainty regarding its classification.

    Advanced Search:

    • Toggle the 'Switch to advanced search' to use GQL (Getvisibility Query Language). Please refer to the GQL guide for full guidance on using GQL.

    Lineage

    Overview of Lineage

    Data Lineage in Getvisibility provides a comprehensive view of a file's lifecycle, tracking its origin, movement, transformation, and usage. This enhances security, compliance, and forensic investigations by offering end-to-end visibility into data activities.

    Traditional data monitoring provides static snapshots, which quickly become outdated, especially for large datasets. Real-time lineage addresses this by:

    1. Reducing Dependency on Rescans: Once streaming is enabled, changes are captured instantly.

    2. Improving Visibility: Organizations can see data movements in near real-time.

    3. Enabling Faster Incident Response: Security teams can quickly assess and respond to threats.

    Use Cases

    Data Lineage was developed to enable forensic investigations, ensuring organisations can:

    1. Investigate Incidents: Identify the root cause of security incidents, such as data breaches or unauthorised sharing.

    2. Enhance Compliance: Maintain audit trails for regulatory requirements.

    3. Support Risk Mitigation: Quickly respond to suspicious activities and apply appropriate remediation actions.

    Pre-Requisites to See Lineage

    1. Connection to Each Data Source: Ensure that each Data Source to be monitored has been configured in Getvisibility.

    2. Enabling Streaming: Activate real-time event streaming for each connector.

    Navigation to Lineage

    1. From Enterprise Search: Select a file and click on "Lineage" in the dropdown.

    2. From Open Risks: Identify a flagged file and expand the side menu.

    Lineage UI Explanation

    Filters:

    • Event Type (Create, Modify, Delete, Share, Move, etc.)

    • Data Source

    • User Activity

    Export:

    • Export lineage details to CSV for auditing and reporting.

    Color Scheme:

    • Green: Normal activity

    • Yellow: Medium-risk events (e.g., permission changes)

    • Red: High-risk events (e.g., external sharing)

    Description of the Lineage Screen

    Lifecycle: Displays the complete lifecycle of a file from creation to current state.

    Event Timeline: Chronological list of all file-related actions.

    User & Device: Shows which users and devices interacted with the file.

    File Path: Original and current locations of the file.

    List of Events Supported by Each Data Source

    Common Events:

    • Create

    • Modify

    • Delete

    Extended Events (via Audit Logs)

    • Change Permissions

    • Share

    • Move

    • Copy

    Data Source Specifics:

    • Google Drive: Audit log events available.

    • Azure (SharePoint Online, OneDrive, Blob, Files): Audit log events supported.

    • Box & Confluence: Extended events available in regular logs.

    • AWS S3, SMB, Dropbox: Limited to Create, Modify, and Delete.

    Use Case for Lineage

    Lineage supports forensic investigations, such as:

    1. External Sharing Investigation: When a file is shared externally, security analysts can trace its history to determine if the action was intentional or accidental.

    2. Suspicious Activity Investigation: If a user accesses and downloads sensitive information after a password reset, lineage provides detailed insights.

    3. Incident Response: Analysts can determine what actions to take, such as revoking access, quarantining files, or addressing user behaviour.

    How to Access Lineage

    1. Enterprise Search: Select the file, click the dropdown, and choose "Lineage."

    2. File View: Expand the file details and navigate to the "Lineage" tab.

    Hover and Export Options

    1. Event Description: Hovering over event icons shows a brief description.

    2. Export: Export the entire lineage history, including metadata, to CSV for audit trails and reporting.

    Data Lineage empowers organisations with real-time visibility, advanced threat detection, and comprehensive forensic capabilities, ensuring sensitive data remains secure and traceable.

    Azure AD

    How to create an Azure AD Connector app to connect to Azure Active Directory (Microsoft Entra ID).

    Registering an Azure App

    • Log in to the Azure Portal

    • If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application needs to be registered, from the Directories + subscriptions menu

    iManage On-Premise

    This guide details how to create and configure an iManage connector to scan an on-premise iManage Work Server.

    To connect Forcepoint DSPM to your iManage server, you will need to gather three key pieces of information:

    1. Your Server's URL: The fully qualified domain name of your iManage server (e.g., imanage.mycompany.com).

    2. An Application Client ID: A unique ID from your iManage Control Center that identifies the Getvisibility application.

    AWS S3

    How to create an AWS S3 user with policies, to connect to S3 accounts.

    Create a policy

    • Sign in to the AWS Management Console and open the IAM console with the appropriate admin-level account

    • In the navigation pane on the left, choose Policies and then choose Create policy

    Dropbox

    How to configure Dropbox connection to scan it.

    Initial creation of Dropbox connector in Dashboard

    • Navigate to Administration -> Data Sources -> Dropbox

    Scanning

    Scanning process and statuses

    To review Scans and their status, go to Data Sources in the Administration drop-down.

    The scanning process discovers and analyses files across all configured data sources. It operates in three steps:

    1) Discovery

    Azure Files

    How to configure an Azure Files connection for scanning.

    Registering an Azure App

    • Log in to the Azure Portal

    • If there are multiple tenants to choose from, use the Settings

    Dropbox Streaming Configuration

    This guide provides steps on how to enable real-time data streaming for a Dropbox connection and monitor streaming events within the Getvisibility platform.


    Prerequisites

    SharePoint Online Streaming Configuration

    This guide provides steps on how to enable real-time data streaming for a SharePoint Online connection and monitor streaming events within the Getvisibility platform.

    Configuring permissions for an Azure App

    • Log in to the Azure Portal

    Azure AD Streaming Configuration

    This guide provides steps on how to enable real-time data streaming for an Azure AD connection and monitor streaming events within the Getvisibility platform.

    Configuring permissions for an Azure App

    • Log in to the Azure Portal

    SMB Streaming Configuration

    This guide provides steps on how to enable real-time data streaming for an SMB connection and monitor streaming events within the Getvisibility platform.

    Steps to Enable Data Streaming for SMB

    Exchange Online

    This document describes how to create an Exchange Connector app, which the Focus product requires to connect to a customer's Exchange Online accounts.

    Registering an Azure App

    • Log in to the Azure Portal

    OneDrive Streaming Configuration

    This guide provides steps on how to enable real-time data streaming for a OneDrive connection and monitor streaming events within the Getvisibility platform.


    Configuring permissions for an Azure App

    • Log in to the Azure Portal

  • Rename

  • Upload

  • Download

  • Risk: Filter files based on content and the amount of user access to the file.

  • Keyword Hits: Filter by specific keywords using regex (regular expressions).

  • Group: Select a specific group that files may be associated with, such as Sensitive.

  • Category and Subcategory: Filter files based on predefined categories and subcategories.

  • Classification: Choose the classification level of the files (e.g., Public, Internal, Confidential).

  • Alias: Filter files by their Data Source aliases.

  • Compliance: Select a compliance regulation to see relevant files.

  • Flow: Filter by the data flow stage (catalogued or classified).

  • Trustees: Filter files accessible by specific users or groups.

  • Created and Last Modified: Timestamps indicating when the file was created and last modified.

  • Subcategory Confidence: A confidence score indicating the accuracy of the subcategory classification.

  • Ingested Time: The timestamp when the file was ingested into the system.

  • Actions: Icons indicating possible actions to take on the file.

  • File Size: The size of the file.

  • These details can be configured by selecting the Column Configuration menu here:

  • After making changes, click 'SAVE' to apply them.
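As an aside, the Keyword Hits filter above accepts regular expressions. A quick way to sanity-check a pattern before using it in the filter (illustrative only; the exact regex dialect the filter supports may differ) is:

```python
import re

# Illustrative pattern: 16-digit card-like numbers written in groups of
# four digits separated by optional spaces or dashes.
pattern = re.compile(r"\b\d{4}[- ]?\d{4}[- ]?\d{4}[- ]?\d{4}\b")

print(bool(pattern.search("card: 4111-1111-1111-1111")))  # prints True
```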

    GQL
    create a new AWS S3 scan
    Create AWS S3 credentials with required permissions
    Create new AWS S3 Scan
    Select Path to be tracked for streaming
    Select the Data streaming checkbox and modify the webhook host if required for firewall configuration
    {
    	"Version": "2012-10-17",
    	"Statement": [
    		{
    			"Sid": "SNSScoped",
    			"Effect": "Allow",
    			"Action": [
    				"sns:CreateTopic",
    				"sns:DeleteTopic",
    				"sns:TagResource",
    				"sns:SetTopicAttributes",
    				"sns:Subscribe",
    				"sns:ConfirmSubscription"
    			],
    			"Resource": [
    				"arn:aws:sns:*:876326936841:s3-event-topic-*"
    			]
    		},
    		{
			"Sid": "S3BucketNotification",
    			"Effect": "Allow",
    			"Action": [
    				"s3:PutBucketNotification"
    			],
    			"Resource": "*"
    		}
    	]
    }

    Click the "Enable" button to enable the Admin SDK API for your project

    Enter a name in the Service account name field and click CREATE CREDENTIALS

    • Under "Grant this service account access to the project," select role as Owner and click DONE

    • Select the newly created service account and click Keys > Add Key > Create new key

    • Make sure the key type is set to json and click CREATE

    • The new private key pair is generated and downloaded to the machine. Note the values of private_key, client_email and client_id
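The downloaded key is a standard JSON document, so the three values can be read out programmatically. A minimal Python sketch (the filename here is hypothetical):

```python
import json

def read_key_fields(path):
    """Return client_email, client_id and private_key from a downloaded
    Google service-account key file (standard JSON key format)."""
    with open(path) as f:
        key = json.load(f)
    return key["client_email"], key["client_id"], key["private_key"]

# Example (hypothetical filename):
# email, client_id, private_key = read_key_fields("service-account-key.json")
```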

    https://www.googleapis.com/auth/admin.directory.rolemanagement.readonly

  • Click Authorize

  • Google Cloud Console
    Admin console

    Click the "Enable" button to enable the Google Drive Activity API for your project

    Enter a name in the Service account name field and click CREATE AND CONTINUE

    • Under Grant this service account access to the project, select role as Owner and click DONE

    • Select the newly created service account and click Keys > Add Key > Create new key

    • Make sure the key type is set to json and click Create

    • The new private key pair is generated and downloaded to the machine. Note the values of private_key, client_email and client_id

    https://www.googleapis.com/auth/gmail.modify

  • https://www.googleapis.com/auth/gmail.labels

  • https://www.googleapis.com/auth/gmail.metadata

  • Click Authorize

  • Google Cloud Console
    Admin console
  • Check the "Data streaming" box again to re-enable streaming with extended event tracking.

  • Verify the status of the subscription to ensure it is active.

  • Delegate Domain-Wide Authority to Your Service Account

    In Admin Console, go to Apps > Integration > Platform Apps Manager, locate the newly created app, then click the View button

    Note: if Manage webhooks v2 is not visible in the list, create new Box credentials

  • Clock icon: When data streaming is being activated, the clock icon will appear, indicating that the subscription is being processed. Once the subscription is activated, this icon will change to a green magnifying glass.

  • After enabling Data Streaming, the system will automatically handle the subscription to Box’s real-time events. There is no need to manually configure Webhooks.

  • create a new Box app
    Dev Console
    Live Events Section
    Event Details View

    Go to the Roles menu item and set the following:

    • Select Global Management to set up admin roles. Enable the necessary options.

    • Select Library-level Management to set up library roles

  • Permissions required

    • For scanning

      • System Access > Read-only

    • To move files

      • Document > Delete

    • To revoke permissions

      • System Access > Not Read-only

    • For tagging

      • Document > Import / Create

    Configuring iManage connector in Dashboard

    • Navigate to Administration -> Data Sources -> iManage -> New scan

    • Provide the Customer ID, Client ID, username, password and domain values

  • Use the Settings icon in the top menu to switch to the tenant in which the application needs to be registered, from the Directories + subscriptions menu
    • Browse to App Registration and select New registration

    • On the App Registration page enter the below information and click the Register button.

      • Name: (Enter a meaningful application name that will be displayed to users of the app)

      • Supported account types:

        • Select which accounts the application will support. The options should be similar to the below screenshot.

        • “Accounts in this organizational directory only” can be selected.

        • Leave the Redirect URI empty and click Register

    • Note the Application (client) ID and Directory (tenant) ID values

    • Navigate to Manage -> Certificates and secrets on the left menu, to create a new client secret

    • Provide a meaningful description and expiry for the secret, and click Add

    • Once a client secret is created, note its Value and store it somewhere safe. NOTE: this value cannot be viewed once this page is closed.

    • Navigate to Manage -> API permissions on the left menu, and Add a permission

    • Select Microsoft APIs -> Microsoft Graph

    • Select Application permissions

    • Permissions required

      • Scanning only:

        • Microsoft Graph > Application permissions > AuditLog > AuditLog.Read.All

        • Microsoft Graph > Application permissions > Directory > Directory.Read.All

    • Once all the required permissions are added, click Grant admin consent

    Configuring Azure AD connector in Dashboard

    • Navigate to Administration -> Data Sources -> Azure AD -> New scan

    • Provide the Directory (tenant) ID, Application (client) ID and Client Secret value generated in the above steps from the Azure application

    • Save the configuration

    • Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin scanning

    • The scan results can be viewed under Dashboard -> Access Governance

    Azure Portal
    3. A Service Account: A dedicated iManage user account with specific permissions for scanning.

    This guide will walk you through the steps for your iManage administrator to find this information and how to use it to configure the connector.

    Network Access Requirement

    Before you begin, ensure the Forcepoint DSPM server has network access to your on-premise iManage server's API. You may need to configure internal firewall rules to allow this connection.

    Prerequisites

    Before you begin, ensure you have the following:

    • Administrative access to your on-premise iManage Control Center.

    • The fully qualified domain name (hostname) of your on-premise iManage server (e.g., imanage.mycompany.com).

    • A dedicated iManage service account with a username and password.

    Part 1: Obtain Your Client ID (via iManage Control Center UI)

    This step must be performed by your internal iManage administrator.

    1. Log in to your on-premise iManage server.

    2. Click on your username in the upper-right corner and select Control Center.

    3. From the side menu, navigate to Applications.

    4. Select Desktop Auth Client from the list.

    5. Copy the Client ID value. This ID is used to identify the Forcepoint DSPM application to your iManage server. You will need this for Part 2 and Part 4.

    Part 2: Get Access Token and Customer ID (via API)

    You can use a command-line tool like curl to perform these one-time steps. Replace your.imanage.server.com with your on-premise server's actual hostname in the commands below.

    A. Get Access Token

    Run the following command in your terminal. Be sure to replace the placeholder values (YOUR_USERNAME, YOUR_PASSWORD, YOUR_CLIENT_ID) with your actual service account credentials and the Client ID from Part 1.

    The JSON response will contain your access_token.

    B. Get Customer ID

    Run the next command, replacing YOUR_ACCESS_TOKEN with the access_token value you received from the previous step.

    The JSON response will contain your customer_id.
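If you prefer a scripted approach over raw curl, the two calls can be sketched with Python's standard library. Note that the /auth/oauth2/token path, the X-Auth-Token header, and the response field names are assumptions about the iManage Work API; confirm them against your server's API documentation:

```python
import urllib.parse
import urllib.request

# Hypothetical hostname; replace with your on-premise iManage server.
SERVER = "https://your.imanage.server.com"

def token_request(username, password, client_id):
    """Build the OAuth2 password-grant request for an access token.
    The /auth/oauth2/token path is an assumption; confirm the exact
    token endpoint with your iManage administrator."""
    body = urllib.parse.urlencode({
        "grant_type": "password",
        "username": username,
        "password": password,
        "client_id": client_id,
    }).encode()
    return urllib.request.Request(
        f"{SERVER}/auth/oauth2/token", data=body, method="POST")

def customer_id_request(access_token):
    """Build the GET /api request; the customer_id is read from its JSON
    response (the X-Auth-Token header name is an assumption)."""
    return urllib.request.Request(
        f"{SERVER}/api", headers={"X-Auth-Token": access_token})

# Sending the requests requires network access to the iManage server, e.g.:
# import json
# with urllib.request.urlopen(token_request(user, pw, cid)) as r:
#     access_token = json.load(r)["access_token"]
```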

    Part 3: Configure iManage Roles & Permissions (via iManage Control Center UI)

    This is performed in the iManage Control Center to grant the service account the necessary permissions.

    1. Navigate to Control Center > Roles.

    2. Create or edit the role assigned to your service account.

    3. Grant the following privileges:

      • For Scanning: System Access > Read-only

      • For Tagging: Document > Import / Create

      • For Moving Files: Document > Delete

      • For Revoking Permissions: System Access > Not Read-only

    Part 4: Configure the iManage Connector in Forcepoint DSPM

    1. In the Forcepoint DSPM, navigate to Administration > Data Sources.

    2. Find iManage in the list and click New Scan.

    3. Fill in the connector configuration fields:

    Field          Value                  Description

    Name           My On-Prem iManage     A friendly name for this connection.

    Customer Id    (ID from Part 2B)      The numeric Customer ID for your instance.

    Username       (Service Account)      The iManage service account username.

    Password       (Service Account)      The service account password.

    4. Click Save.

    Part 5: Run the Scan

    1. Find your newly configured iManage connection in the list.

    2. Click the ... (three-dot) menu on the right.

    3. Select Start trustee scan to scan permissions (optional).

    4. Once the trustee scan is complete, click the ... menu again and select Start file scan to scan content.

    Part 6: View Results

    • Permission and access issues can be viewed in Dashboard > Access Governance (if you ran the trustee scan).

    • File classification and content results can be viewed in Dashboard > Enterprise Search.

    Choose Policies and then choose Create policy
    • In the Policy editor section, find the Select a service section, then choose the S3 service, and select Next. Once the S3 service permissions are added, move on to the IAM service

    • In Actions allowed, choose the below actions to add to the policy:

      • For scanning

        • IAM service

          • Read > GetUser

          • Read > GetPolicyVersion

          • Read > GetPolicy

          • Read > GetUserPolicy

          • List > ListUserPolicies

          • List > ListAttachedUserPolicies

        • S3 service

          • Read > GetBucketAcl

          • Read > GetBucketLocation

          • Read > GetObject

        • EC2 service

          • List > DescribeRegions

      • For revoke permissions (S3 service)

        • Permission Management > PutBucketAcl

        • Permission Management > PutObjectAcl

      • For tagging (S3 service)

        • Write > DeleteObject

        • Write > PutObject

        • Tagging > DeleteObjectTagging

    • For Resources, choose All and select Create policy to save the new policy
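For reference, the selections above correspond to an IAM policy document along these lines (a sketch assembled from the actions listed; the Sid names are illustrative, and Resource can be narrowed to specific buckets):

```json
{
	"Version": "2012-10-17",
	"Statement": [
		{
			"Sid": "Scanning",
			"Effect": "Allow",
			"Action": [
				"iam:GetUser",
				"iam:GetPolicyVersion",
				"iam:GetPolicy",
				"iam:GetUserPolicy",
				"iam:ListUserPolicies",
				"iam:ListAttachedUserPolicies",
				"s3:GetBucketAcl",
				"s3:GetBucketLocation",
				"s3:GetObject",
				"ec2:DescribeRegions"
			],
			"Resource": "*"
		},
		{
			"Sid": "RevokePermissions",
			"Effect": "Allow",
			"Action": [
				"s3:PutBucketAcl",
				"s3:PutObjectAcl"
			],
			"Resource": "*"
		},
		{
			"Sid": "Tagging",
			"Effect": "Allow",
			"Action": [
				"s3:DeleteObject",
				"s3:PutObject",
				"s3:DeleteObjectTagging"
			],
			"Resource": "*"
		}
	]
}
```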

    Create a user

    • Sign in to the AWS Management Console and open the IAM console with the appropriate admin level account

    • In the navigation pane on the left, choose Users and then choose Create user

    • On the Specify user details page, under User details, in User name, enter a name for the new user, for example S3-connector-user, and select Next

    • On the Set permissions page, select Attach policies directly and choose the policy created in the above steps

    • Select Next

    • Once the user is created, select it, and from the user page, choose Create access key

    • Select Other then Next

    • Enter a description if you wish and select Create access key

    • The Access and Secret Access Keys have now been created. These can be downloaded as a CSV, and also copied from this section. NOTE: the secret access key cannot be viewed once you leave this page

    Configuring AWS S3 connector in Dashboard

    • Navigate to Administration -> Data Sources -> AWS S3 -> New scan

    • Provide the access key and secret access key values generated in the above steps

    • Click on the Folder icon in Path to select a particular bucket to scan, or leave the path as empty to scan all buckets

    • Save the configuration

    • Once the configuration is saved, click on the icon on the right and select Start file scan to begin the scanning

    • The results can be viewed under Dashboard -> Enterprise Search

    IAM console
    Then go to the Credentials tab and click New credentials
  • Enter a credentials name and copy the Redirect URL; it will be needed later. The App Key and App Secret fields will be filled in later, once we register a Dropbox App.

  • Registering a Dropbox App

    • Login to Dropbox

    • Go to Dropbox App Console and click Create app

    • On the App Creation page enter the below information and click the Create app button

      • Choose an API: Most applications will use "Dropbox API"

      • Choose Access Type: Select "Full Dropbox" for complete access.

      • Name Your App and click Create app: Enter a name that will be visible to users.

    • Go to the Settings tab and find the App key and App secret above the OAuth 2 section

    • We need to set the proper permissions for the Dropbox app. Below is a list of the required permissions:

      • For scanning

        • Files and Folders > files.metadata.read, files.content.read

        • Collaboration > sharing.read

    • Go to the Permissions tab of the newly created App and set the following:

      • Account Info: account_info.read

      • Files and Folders: files.metadata.write, files.metadata.read, files.content.write, files.content.read

    • Once permissions are set, click the Save button located on the black snackbar at the bottom of the window.

    • Go back to the Settings tab and scroll to the Redirect URI section. Paste the Redirect URL copied from the Dashboard and click Add

    • Then copy the App key from the Dropbox App settings page and paste it into the App key field in the Dashboard Create connection form. Do the same for the App secret.

    Finishing creation of Dropbox connector in Dashboard

    • Once done, click the Authorize with Dropbox button as below:

    • Then you'll be redirected to a page asking you to trust your application; click Continue

    • Then you'll see the list of permissions the app will be granted; click Allow

    • Once done you'll be redirected back to Dashboard page with success message as below:

    • Connection has been configured successfully
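The Authorize with Dropbox step above is a standard OAuth 2.0 authorization-code flow: the Dashboard sends the browser to Dropbox's authorization endpoint using the App key and Redirect URL you configured. A minimal sketch of the URL it constructs (the APP_KEY and REDIRECT_URI values here are placeholders, not real credentials):

```shell
# Placeholders for the App key from the Dropbox App settings page and the
# Redirect URL copied from the Dashboard credentials form.
APP_KEY="abc123"
REDIRECT_URI="https://dashboard.example.com/oauth/callback"

# Dropbox's standard OAuth 2.0 authorization endpoint; Dropbox redirects
# back to REDIRECT_URI with an authorization code after the user clicks Allow.
AUTH_URL="https://www.dropbox.com/oauth2/authorize?client_id=${APP_KEY}&redirect_uri=${REDIRECT_URI}&response_type=code"
echo "$AUTH_URL"
```

The code Dropbox returns to the Redirect URL is then exchanged for an access token by the Dashboard; no manual token handling is needed.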

    The system searches through all files and folders.
    • If a specific path has not been set, the entire Data Source will be scanned.

    • Metadata (path, size, format, etc.) and permissions are extracted and recorded for each file.

  • This step ensures that every file and folder is identified and that access permissions are understood.

  • The scan discovery process can have the following statuses, reflecting its progress:

    Not Started: Data Source has been added but the scan has not started.

    Queued: Scan has been queued for execution.

    Failed To Start: The scan was unable to start, usually due to permission or network issues.

    In Progress: Scan is actively running and processing data discovery.

    Cancelled: Scan was manually stopped or automatically aborted.

    Incomplete: The scan partially completed, but file permissions changed during the scan.

    Completed: Scan has successfully finished Discovery phase.

    These statuses can be seen in the Last Scan Status column.
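When scripting against the Last Scan Status column (for example in reports or exports), it can help to group the statuses above into pending and terminal states. The grouping below is illustrative only, not part of the product:

```shell
# Illustrative grouping of the Last Scan Status values listed above.
classify_status() {
  case "$1" in
    "Not Started"|"Queued"|"In Progress")        echo "running-or-pending" ;;
    "Completed")                                  echo "terminal-ok" ;;
    "Failed To Start"|"Cancelled"|"Incomplete")   echo "terminal-needs-attention" ;;
    *)                                            echo "unknown" ;;
  esac
}

classify_status "Completed"
```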

    2) Metadata classification

    This is the continuation of the Discovery process where:

    • Metadata information is processed for each file that has been collected as part of the Discovery step.

    • A detailed analysis of each file's metadata is performed.

    3) Content Classification

    • Permissions are analysed and the sharing level is identified.

    • A detailed analysis of each file's content is performed.

    • Content is extracted and the sensitivity level and risk of each file is determined for classification.

      • This is determined by the Patterns/Detector setting and the AI Mesh

    • This ensures that sensitive information is properly identified and protected.

    Trustee Scan

    This is a scan to determine the Users and Groups present in a Data Source.

    • Metadata is extracted for each user, with specific fields depending on the data source. Some of the fields that will be picked up by the scan include Enabled, Last Login, Last Modified, etc.

    The statuses for these scans are the same as for File Scans, with two additions.

    Completed Only Users: The scan has been completed only for user-specific policies.

    Completed Only Groups: The scan has been completed only for group-specific policies.

    To see additional information on a running or completed scan click on the Scan Analytics Icon.

    This will pop out the Analytics sidebar where there is information such as scan duration, how many files have been scanned, classification insights, etc.

    If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application is to be registered, from the Directories + subscriptions menu.
    • Browse to App Registration and select New registration

    • On the App Registration page enter the information below and click the Register button

      • Name: (Enter a meaningful application name that will be displayed to users of the app)

      • Supported account types:

        • Select which accounts the application will support. The options should be similar to those below. Select “Accounts in this organizational directory only”:

        • Leave the Redirect URI empty and click Register

    • Note the Application (client) ID, Directory (tenant) ID values

    • Navigate to Manage -> Certificates and secrets on the left menu, to create a new client secret

    • Provide a meaningful description and expiry to the secret, and click on Add

    • Once a client secret is created, note its Value and store it somewhere safe. NOTE: this value cannot be viewed once you leave this page

    • Navigate to Manage -> API permissions on the left menu, and Add a permission

    • Select Microsoft APIs -> Microsoft Graph

    • Select Application permissions

    • Permissions required

      • Microsoft Graph > Application permissions > Device > Device.Read.All

      • Microsoft Graph > Application permissions > Directory > Directory.Read.All

      • Microsoft Graph > Application permissions > Group > Group.Read.All

      • Microsoft Graph > Application permissions > User > User.Read.All

    • Once all the required permissions are added, click "Grant admin consent"
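For reference, the Application (client) ID, Directory (tenant) ID, and client secret noted above are the inputs to the OAuth 2.0 client-credentials grant used for app-only Microsoft Graph access. The platform performs this exchange itself; the sketch below only shows how the token endpoint is derived from the tenant ID (the TENANT_ID value is a placeholder):

```shell
# Directory (tenant) ID placeholder - substitute the value noted above.
TENANT_ID="00000000-0000-0000-0000-000000000000"

# Microsoft identity platform v2.0 token endpoint for this tenant.
TOKEN_URL="https://login.microsoftonline.com/${TENANT_ID}/oauth2/v2.0/token"
echo "$TOKEN_URL"

# The actual token request (not executed here) would look like:
# curl -s -X POST "$TOKEN_URL" \
#   -d "client_id=APPLICATION_CLIENT_ID" \
#   -d "client_secret=CLIENT_SECRET_VALUE" \
#   -d "scope=https://graph.microsoft.com/.default" \
#   -d "grant_type=client_credentials"
```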

    Azure Storage Subscription ID, Resource group and connection strings

    A connection string is needed for the storage account you wish to scan.

    • Login to Azure Portal

    • If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application is to be registered, from the Directories + subscriptions menu.

    • Browse to Storage accounts and select the account to be scanned

    • Once the storage account is selected, note the Resource group and Subscription ID values in the Overview page

    • Navigate to Security + networking -> Access keys on the left menu, and click on Show on the Connection string

    • Copy this Connection string value

    • Access Control (IAM) Role assignment

      • In the storage account, go to Access Control (IAM) and assign the Reader role to the Azure app created in the first step

      • Save the changes.
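An Azure Storage connection string is a semicolon-separated list of key=value pairs. As a quick sanity check before pasting it into the Dashboard, the AccountName can be extracted like this (the connection string below is a truncated example, not a real key):

```shell
# Example of the shape of an Azure Storage connection string (key redacted).
CONN="DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=REDACTED;EndpointSuffix=core.windows.net"

# Split on ';' and pull out the AccountName pair.
ACCOUNT_NAME=$(printf '%s' "$CONN" | tr ';' '\n' | sed -n 's/^AccountName=//p')
echo "$ACCOUNT_NAME"
```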

    Configuring Azure Files connector in Dashboard

    • Navigate to Administration -> Data Sources -> Azure Files -> New scan

    • Provide the Connection string value obtained from above steps

    • Click on the Folder icon in Path to select a particular share to scan, or leave the path as empty to scan all shares

    • Save the configuration

    • Once the configuration is saved, click on the icon on the right and select Start file scan to begin scanning

    • The results can be viewed under Dashboard -> Enterprise Search

    Azure Portal
    1. Existing Dropbox app
    • If you haven't created Dropbox credentials yet, follow this guide to create a new Dropbox app and ensure the necessary credentials are set up.

    2. Additional scope to Dropbox app

    • Data streaming requires the events.read scope, as shown below. The setting is located under Permissions for your app in the Dropbox App Console

    Steps to Enable Data Streaming for Dropbox

    1. From the Data Sources page, select Dropbox from the list of available data sources. In the Scan Configurations list, create a New Configuration.

    2. Make sure the connection has a Name and Credentials set. Leave Path untouched; Dropbox supports data streaming only at the root level.

    3. Enable the Data streaming checkbox and copy the Webhook URL

    4. Go to the Dropbox App Console, open the app Settings for your credentials and paste the previously copied Webhook URL into Webhook URIs

    5. After clicking Add, the webhook should have the status Enabled

    6. Click Save & Close to finalize the changes.

    7. Clock icon: While data streaming is being activated, the clock icon will appear, indicating that the subscription is being processed. On completion, the created configuration will be shown with Data Streaming in the Requested state.
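For background, when a Webhook URI is added, Dropbox first verifies the endpoint by sending a GET request with a challenge query parameter, which the endpoint must echo back in the response body. The platform handles this verification automatically; the sketch below only illustrates the echo step (the QUERY_STRING value is a made-up example):

```shell
# Example query string from Dropbox's webhook verification request.
QUERY_STRING="challenge=abcdef123456"

# Extract the challenge value; the verification response body must be
# exactly this value.
CHALLENGE=$(printf '%s' "$QUERY_STRING" | tr '&' '\n' | sed -n 's/^challenge=//p')
echo "$CHALLENGE"
```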

    Monitoring Real-Time Events

    After the subscription is activated, real-time events will start flowing into the platform, and you will be able to monitor them from various sections of Getvisibility.

    Viewing Events in the Live Events Section

    1. Navigate to the Live Events section under Administration to view a detailed audit log of all streaming events.

    2. View streaming event details

    3. It is also possible to monitor extended events.

    If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application was registered, from the Directories + subscriptions menu

  • Browse to App Registration and select your application that was created for the scanning

  • Navigate to Manage -> API permissions on the left menu, and Add a permission

  • Select Microsoft APIs -> Office 365 Management API

    • Select Application permission

    • Select ActivityFeed.Read permission

    • Permissions required

      • All the scanning permissions (https://docs.getvisibility.com/scan-with-getvisibility/configure-data-sources/onedrive)

      • Office 365 Management API ⇒ Application Permissions ⇒ ActivityFeed.Read

    • Once all the required permissions are added, click "Grant admin consent"
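For reference, the ActivityFeed.Read permission granted above is what allows a subscription to be started against the Office 365 Management Activity API. The platform manages this subscription itself; the sketch below only shows how the subscription endpoint is formed (TENANT_ID is a placeholder, and OneDrive/SharePoint activity falls under the Audit.SharePoint content type):

```shell
# Directory (tenant) ID placeholder - substitute the value noted during
# app registration.
TENANT_ID="00000000-0000-0000-0000-000000000000"
CONTENT_TYPE="Audit.SharePoint"   # covers OneDrive and SharePoint activity

SUB_URL="https://manage.office.com/api/v1.0/${TENANT_ID}/activity/feed/subscriptions/start?contentType=${CONTENT_TYPE}"
echo "$SUB_URL"

# The subscription call itself (performed by the platform, shown for reference):
# curl -X POST "$SUB_URL" -H "Authorization: Bearer ACCESS_TOKEN"
```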

    Enabling Auditing

    • Sign into the Microsoft Purview portal using Microsoft Edge browser

    • Select the Audit solution card. If the Audit solution card isn't displayed, select View all solutions and then select Audit from the Core section

    • If auditing isn't turned on for your organization, a banner is displayed prompting you to start recording user and admin activity. Select the Start recording user and admin activity banner.

    • In certain cases, recording cannot be enabled immediately and requires additional configuration. If this applies, users will be prompted to enable the customization setting. Select OK, and a new banner will appear, informing you that the process may take 24 to 48 hours to complete. After this waiting period, repeat the previous step to proceed with enabling recording.

    Steps to Enable Data Streaming for Sharepoint Online

    1. Create a New Scan Configuration

    1. From the Data Sources page, select Sharepoint Online from the list of available data sources. In the Scan Configurations list, create a New Configuration.

    2. Make sure the connection has a Name and Credentials set. Then select the Path icon.

    2. Pick a Folder for Real-Time Events

    1. Click on the Folder icon in the Path field to select the folder you want to monitor for real-time events.

      • Magnifying glass icon: Folders with this icon next to them indicate that real-time events can be subscribed to from this directory.

    2. After selecting the folder, click Save & Close to finalize the changes.

    3. Clock icon: When data streaming is being activated, the clock icon will appear, indicating that the subscription is being processed. Once the subscription is activated, this icon will change to a green magnifying glass.

    4. After enabling Data Streaming, the system will automatically handle the subscription to Sharepoint Online’s real-time events. There is no need to manually configure Webhooks.

    Monitoring Real-Time Events

    After the subscription is activated (green magnifying glass icon), real-time events will start flowing into the platform, and you will be able to monitor them from various sections of Getvisibility.

    Viewing Events in the Live Events Section

    1. Navigate to the Live Events section under Administration to view a detailed audit log of all streaming events.

    2. In this section, you can filter and view event details

    Azure Portal

    If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application was registered, from the Directories + subscriptions menu

  • Browse to App Registration and select your application that was created for the scanning

  • Navigate to Manage -> API permissions on the left menu, and Add a permission

  • Select Microsoft APIs -> Office 365 Management API

    • Select Application permission

    • Select ActivityFeed.Read permission

    • Permissions required

      • Office 365 Management API ⇒ Application Permissions ⇒ ActivityFeed.Read

      • Microsoft Graph > Application permissions > AuditLog > AuditLog.Read.All

      • Microsoft Graph > Application permissions > Directory > Directory.Read.All

    • Once all the required permissions are added, click "Grant admin consent"

    Enabling Auditing

    • Sign into the Microsoft Purview portal using Microsoft Edge browser

    • Select the Audit solution card. If the Audit solution card isn't displayed, select View all solutions and then select Audit from the Core section

    • If auditing isn't turned on for your organization, a banner is displayed prompting you to start recording user and admin activity. Select the Start recording user and admin activity banner.

    • In certain cases, recording cannot be enabled immediately and requires additional configuration. If this applies, users will be prompted to enable the customization setting. Select OK, and a new banner will appear, informing you that the process may take 24 to 48 hours to complete. After this waiting period, repeat the previous step to proceed with enabling recording.

    Steps to Enable Data Streaming for Azure AD

    1. Create a New Scan Configuration

    1. From the Data Sources page, select Azure AD from the list of available data sources. In the Scan Configurations list, create a New Configuration.

    2. Make sure the connection has a Name and Credentials set, and that Data streaming is enabled.

    3. Clock icon: When data streaming is being activated, the clock icon will appear, indicating that the subscription is being processed. Once the subscription is activated, this icon will change to a green magnifying glass.

    4. After enabling Data Streaming, the system will automatically handle the subscription to Azure AD’s real-time events. There is no need to manually configure Webhooks.

    Monitoring Real-Time Events

    Once streaming is enabled, events can be monitored across multiple sections of the platform, providing comprehensive visibility into user and group activities. The Streaming tab offers an overview of essential operations, such as user and group creation, updates, and deletions.

    For deeper insights, Extended Streaming Events leverage Azure AD’s audit logging functionality along with the ActivityFeed.Read permission. This enables the system to capture a broader range of event types beyond standard data streaming, including administrative actions, role changes, and authentication events.

    Viewing Events in the Live Events Section

    1. Navigate to the Live Events section under Administration and then to Streaming tab to view a detailed audit log of streaming events.

    2. Navigate to the Live Events section under Administration and then to Extended Streaming tab to view a detailed audit log of extended streaming events.

    3. In both sections, you can filter and view event details

    Azure Portal
    1. Create a New Scan Configuration
    1. From the Data Sources page, select SMB from the list of available data sources. In the Scan Configurations list, create a New Configuration.

    2. Make sure the connection has a Name and Credentials set. Then select the SMB share Path that is to be monitored.

    3. After selecting the folder, select the Data streaming checkbox:

    4. Follow the link on the download tab to download the SMB agent, then follow the installation instructions for the SMB streaming agent below:

    SMB Agent Installation

    This section addresses the different methods to install the SMB Connector on a single machine.

    SMB Connector Pre-requisites:

    • OS: Windows Server 2016 or later.

    • Processor: 2 GHz or faster, 2 cores (64-bit processor recommended).

    • Memory: 4GB RAM.

    • Hard Disk: 1GB free space.

    • Administrator Privileges: user needs admin permissions to install.

    • .NET 8 must be installed.

    Installation config

    The SMB Connector supports various configuration options which can be specified via smb_connector_application_config.json

    Manual Installation

    Pre-requisites:

    1. The ZIP of the installer files.

    2. smb_connector_application_config.json file.

    3. Windows Server machine access.

    4. Admin access to install the connector.

    Steps

    1. Download the SMB Connector ZIP File: Obtain the ZIP file and save it to the Windows machine.

    2. Prepare for Installation:

      • Unzip the contents of the ZIP file

      • Place the smb_connector_application_config.json file in the same directory as the unzipped contents.

    3. Configure the Installer:

      • Edit the smb_connector_application_config.json file as needed. Use the smb_connector_application_config.json.example file in the unzipped folder if creating the configuration from scratch.

    Create a folder mapping for every SMB share on the server that is to be scanned. WatchFolder should be the root directory of the share, and WebhookUrl should be from the scan configuration page for the SMB share on the GV dashboard (shown below).

    • Keep useDefaultFileFilters set to false if you want all files in the share to be scanned. If set to true, the connector will only scan files supported by the GV Synergy agent for classification.

    • IncludedExtensions and AdditionalFileFilters can be used if you wish to apply filters other than the defaults. IncludedExtensions supports file extensions in the format .txt, etc. AdditionalFileFilters allows for any custom file filter, including * as a wildcard
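Putting the options above together, a smb_connector_application_config.json might look roughly like the sketch below. The WatchFolder, WebhookUrl, useDefaultFileFilters, IncludedExtensions, and AdditionalFileFilters fields are the ones described above; the top-level folderMappings key and all values are illustrative guesses, so treat the smb_connector_application_config.json.example file shipped in the ZIP as the authoritative schema:

```json
{
  "folderMappings": [
    {
      "WatchFolder": "\\\\FILESERVER01\\Share1",
      "WebhookUrl": "https://gv-dashboard.example.com/api/smb/webhook/share1"
    }
  ],
  "useDefaultFileFilters": false,
  "IncludedExtensions": [".docx", ".xlsx", ".pdf"],
  "AdditionalFileFilters": ["*~$*", "*.tmp"]
}
```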

    4. Start the Installation:

      • Execute the install.ps1 script by right clicking and choosing Run with PowerShell

    5. Complete the Installation:

      • After the installation completes, the PowerShell window can be closed.

    5. Save the Streaming configuration

    Monitoring Real-Time Events

    After the subscription is activated (green magnifying glass icon), real-time events will start flowing into the platform, and you will be able to monitor them from various sections of Getvisibility.

    Viewing Events in the Live Events Section

    1. Navigate to the Live Events section under Administration to view a detailed audit log of all streaming events (you may specify source filter to focus only on SMB events):

    If you have access to multiple tenants, use the Settings icon in the top menu to switch to the tenant in which you want to register the application from the Directories + subscriptions menu
    • Browse to App Registration and select New registration

    • On the App Registration page enter the information below and click the Register button

      • Name: (Enter a meaningful application name that will be displayed to users of the app)

      • Supported account types:

        • Select which accounts you would like your application to support. You should see the options similar to below. You can select “Accounts in this organizational directory only”:

        • Leave the Redirect URI as empty and Click Register

    • Note the Application (client) ID, Directory (tenant) ID values

    • Navigate to Manage -> Certificates and secrets on the left menu, to create a new client secret

    • Provide a meaningful description and expiry to the secret, and click on Add

    • Once a client secret is created, note its Value and store it somewhere safe. NOTE: this value cannot be viewed once you leave this page

    • Navigate to Manage -> API permissions on the left menu, and Add a permission

    • Select Microsoft APIs -> Microsoft Graph

    • Select Application permissions

    • Permissions required

      • For scanning

        • Microsoft Graph > Application permissions > Mail > Mail.Read

        • Microsoft Graph > Application permissions > User > User.Read.All

        • Microsoft Graph > Application permissions > DeviceManagementApps > DeviceManagementApps.Read.All

        • Microsoft Graph > Application permissions > MailboxSettings > MailboxSettings.Read

      • For tagging

        • Microsoft Graph > Application permissions > Mail > Mail.ReadWrite

    • Once all the required permissions are added, Grant admin consent to them

    Azure Portal

    If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application was registered, from the Directories + subscriptions menu

  • Browse to App Registration and select your application that was created for the scanning

  • Navigate to Manage -> API permissions on the left menu, and Add a permission

  • Select Microsoft APIs -> Office 365 Management API

    • Select Application permission

    • Select ActivityFeed.Read permission

    • Permissions required

      • All the scanning permissions (https://docs.getvisibility.com/scan-with-getvisibility/configure-data-sources/sharepoint-online)

      • Office 365 Management API ⇒ Application Permissions ⇒ ActivityFeed.Read

    • Once all the required permissions are added, click "Grant admin consent"

    Enabling Auditing

    • Sign into the Microsoft Purview portal using Microsoft Edge browser

    • Select the Audit solution card. If the Audit solution card isn't displayed, select View all solutions and then select Audit from the Core section

    • If auditing isn't turned on for your organization, a banner is displayed prompting you to start recording user and admin activity. Select the Start recording user and admin activity banner.

    • In certain cases, recording cannot be enabled immediately and requires additional configuration. If this applies, users will be prompted to enable the customization setting. Select OK, and a new banner will appear, informing you that the process may take 24 to 48 hours to complete. After this waiting period, repeat the previous step to proceed with enabling recording.

    Steps to Enable Data Streaming for OneDrive

    1. Create a New Scan Configuration

    1. From the Data Sources page, select OneDrive from the list of available data sources. In the Scan Configurations list, create a New Configuration.

    2. Make sure the connection has a Name and Credentials set. Then select the Path icon.

    2. Pick a Folder for Real-Time Events

    1. Click on the Folder icon in the Path field to select the folder you want to monitor for real-time events.

      • Magnifying glass icon: Folders with this icon next to them indicate that real-time events can be subscribed to from this directory.

    2. After selecting the folder, click Save & Close to finalize the changes.

    3. Clock icon: When data streaming is being activated, the clock icon will appear, indicating that the subscription is being processed. Once the subscription is activated, this icon will change to a green magnifying glass.

    4. After enabling Data Streaming, the system will automatically handle the subscription to OneDrive’s real-time events. There is no need to manually configure Webhooks.

    Monitoring Real-Time Events

    After the subscription is activated (green magnifying glass icon), real-time events will start flowing into the platform, and you will be able to monitor them from various sections of Getvisibility.

    Viewing Events in the Live Events Section

    1. Navigate to the Live Events section under Administration to view a detailed audit log of all streaming events.

    2. In this section, you can filter and view event details.

    Azure Portal

    Azure Blob

    How to configure Azure Blob connection for scanning.

    Registering an Azure App

    • Login to Azure Portal

    • If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application is to be registered, from the Directories + subscriptions menu.

    • Browse to App Registration and select New registration

    • On the App Registration page enter the information below and click the Register button

      • Name: (Enter a meaningful application name that will be displayed to users of the app)

      • Supported account types:

        • Select which accounts the application will support. The options should be similar to those below. Select “Accounts in this organizational directory only”:

    • Navigate to Manage -> Certificates and secrets on the left menu, to create a new client secret

    • Provide a meaningful description and expiry to the secret, and click on Add

    • Once a client secret is created, note its Value and store it somewhere safe. NOTE: this value cannot be viewed once you leave this page

    • Navigate to Manage -> API permissions on the left menu, and Add a permission

    • Select Microsoft APIs -> Microsoft Graph

    • Select Application permissions

    • Permissions required

      • Microsoft Graph > Application permissions > Device > Device.Read.All

      • Microsoft Graph > Application permissions > Directory > Directory.Read.All

      • Microsoft Graph > Application permissions > Group > Group.Read.All

    Azure Storage Subscription ID, Resource group and connection strings

    A connection string is needed for the storage account that is to be scanned.

    • Login to the Azure Portal

    • If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application was registered, from the Directories + subscriptions menu

    • Browse to Storage accounts and select the account to be scanned

    • Once the storage account is selected, note the Resource group and Subscription ID values in the Overview page

    • Navigate to Security + networking -> Access keys on the left menu, and click on Show on the Connection string

    • Copy this Connection string value

    • Access Control (IAM) Role assignment - there are 2 options: one is to assign a built-in role, the other is to create and assign a custom role. Using a built-in role is easier to configure, while a custom role may be preferred to ensure least-privilege assignment for increased security.

      • Option 1: In the storage account, go to Access Control (IAM) and assign either the Storage Blob Data Owner or Storage Blob Data Contributor role to the blob storage. (Per Microsoft's documentation, the Data Contributor role is the least-privileged built-in role for listing containers.)

    Note: Firewall rules must also be in place to allow the DSPM server to connect to https://(mystorageaccount).blob.core.windows.net

    Configuring Azure Blob connector in Dashboard

    • Navigate to Administration -> Data Sources -> Azure Blob -> New scan

    • Provide the Connection string value obtained from above steps

    • Click on the Folder icon in Path to select a particular share to scan, or leave the path as empty to scan all shares

    • Save the configuration

    • Once the configuration is saved, click on the icon on the right and select Start file scan to begin scanning

    • The results can be viewed under Dashboard -> Enterprise Search

    AWS IAM Streaming Configuration

    This guide provides steps on how to enable real-time data streaming for an AWS IAM connection and monitor streaming events within the Getvisibility platform.

    Create a policy

    • In the navigation pane on the left, choose Policies and then choose Create policy

    • In the Policy editor section, find the Select a service section, then choose IAM service, and select Next

    • In Actions allowed, choose the below actions to add to the policy:

      • GetPolicy

      • GetUserPolicy

      • ListUserPolicies

      • ListAttachedGroupPolicies

    • Choose SNS service and select the below actions:

      • CreateTopic

      • DeleteTopic

      • TagResource

    • Choose the EventBridge service and select the below actions:

      • TagResource

      • PutTargets

      • EnableRule

    • Choose the EC2 service and select the below action:

      • DescribeRegions

    • For Resources, choose All and select Create policy to save the new policy
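The console steps above produce a policy document roughly equivalent to the following JSON sketch. The Sid is arbitrary, and only the actions listed in the steps above are shown; any additional required IAM, SNS, and EventBridge actions listed elsewhere on this page belong in the same Action array:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "GvIamStreaming",
      "Effect": "Allow",
      "Action": [
        "iam:GetPolicy",
        "iam:GetUserPolicy",
        "iam:ListUserPolicies",
        "iam:ListAttachedGroupPolicies",
        "sns:CreateTopic",
        "sns:DeleteTopic",
        "sns:TagResource",
        "events:TagResource",
        "events:PutTargets",
        "events:EnableRule",
        "ec2:DescribeRegions"
      ],
      "Resource": "*"
    }
  ]
}
```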

    Create a user

    • Sign in to the AWS Management Console and open the IAM console with the appropriate admin-level account

    • In the navigation pane on the left, choose Users and then choose Create user

    • On the Specify user details page, under User details, in User name, enter the name for the new user, for example iam-connector-user, and select Next

    • On the Set permissions page, select Attach policies directly and choose the policy created in the steps above

    • Select Next

    • Once the user is created, select it, and from the user page, choose Create access key

    • Select Other then Next

    • Enter a description if you wish and select Create access key

    • The Access and Secret Access Keys have now been created. These can be downloaded as a CSV, and also copied from this section. NOTE: the secret access key cannot be viewed once you leave this page

    Configuring AWS IAM connector in Dashboard

    • Navigate to Administration -> Data Sources -> AWS IAM -> Credentials -> New credentials

    • Provide the access key and secret access key values generated in the above steps and select Save & Create Scan

    • Make sure the connection has a Name and Credentials set then click on Data streaming toggle and click Save & Close to finalize the changes

    • Requested status: When data streaming is being activated, the "Requested" status will appear, indicating that the subscription is being processed. Once the subscription is activated, this status will change to "On".

    • After enabling Data Streaming, the system will automatically handle the subscription to AWS IAM’s real-time events. There is no need to manually configure Webhooks.

    Monitoring Real-Time Events

    After the subscription is activated, real-time events will start flowing into the platform, and can be monitored from the relevant parts of the platform.

    Viewing Events in the Live Events Section

    1. Go to the Live Events section under Administration to view a detailed audit log of all streaming events.

    2. Filter by source to get only AWS IAM events

    Monitoring Extended Streaming Events

    Once extended streaming is enabled, events will be available for monitoring in multiple sections of the platform:

    Live Events Section

    • Go to Live Events under Administration to view real-time extended events.

    • Use the filter options to narrow down events to only AWS IAM activities.

    Google IAM Streaming Configuration

    Steps to Enable Data Streaming for Google IAM

    Create OAuth2 Credentials

    Confluence Cloud Streaming Configuration

    This document provides information on how to configure Confluence Cloud connection with real-time events monitoring and data streaming.

    Overview

    Follow this guide to integrate Confluence Cloud with your system for real-time events monitoring.

    To enable DDR (Streaming) for an existing Confluence Cloud instance, follow these steps:

    curl -X POST "https://your.imanage.server.com/auth/oauth2/token" \
      -d "username=YOUR_USERNAME" \
      -d "password=YOUR_PASSWORD" \
      -d "grant_type=password" \
      -d "client_id=YOUR_CLIENT_ID"

    curl -X GET "https://your.imanage.server.com/api" \
      -H "X-Auth-Token: YOUR_ACCESS_TOKEN"

    Client Id: (ID from Part 1) - the application Client ID.

    Domain: your.imanage.server.com - crucial: your on-premise server's hostname.

    Path: (Optional) - leave blank to scan all content, or click the folder icon to select a specific path.

    Read > GetObjectAcl
  • List > ListAllMyBuckets

  • List > ListBucket

  • Tagging > PutObjectTagging


    Team Data > team_data.member

  • Members > members.read, groups.read

  • For remediations

    • Collaboration > sharing.write

    • Files and Folders > files.content.write

  • For tagging

    • Files and Folders > files.content.write, files.metadata.write

  • Collaboration: sharing.read, sharing.write

  • Team: team_info.read

  • Team Data: team_data.member, team_data.content.write, team_data.content.read, files.team_metadata.write, files.team_metadata.read, files.permanent_delete

  • Members: members.read, groups.read

  • Leave the Redirect URI as empty and Click Register

  • Note the Application (client) ID, Directory (tenant) ID values

  • Microsoft Graph > Application permissions > User > User.Read.All

  • Once all the required permissions are added, click "Grant admin consent"

  • We also need to assign the Reader role to the Azure app created in the first step

    • Save the changes.

    • Option 2: This option creates a custom role and assigns the same permissions as the Data Contributor role, except for the delete permissions. In the Blob storage account, go to Access Control (IAM) and click Add to create a new role. Name the role with a preferred name, and choose the following actions below to assign to this custom role. Select this custom role for the blob and save changes.

    • The Reader role must also be assigned to the Azure app created in the first step

  • Real Time Events Monitoring (Streaming) Permissions: To enable "Real Time Events Monitoring (Streaming)", the following additional Azure permission roles are required:

    • EventGrid Data Contributor

    • EventGrid EventSubscription Contributor

    • EventGrid TopicSpaces Publisher

    Assign these roles using Access Control (IAM) in the Blob storage account, similar to the steps mentioned above for assigning the Storage Blob Data Owner or Data Contributor role.
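As an alternative to the portal, the same three role assignments can be sketched with the Azure CLI. Here `<app-id>` and `<storage-account-resource-id>` are placeholders, and the commands are echoed as a dry run:

```shell
# Dry run: print one "az role assignment create" command per EventGrid role.
# Drop the leading "echo" (with the Azure CLI signed in) to run for real.
ROLES="EventGrid Data Contributor
EventGrid EventSubscription Contributor
EventGrid TopicSpaces Publisher"
printf '%s\n' "$ROLES" | while IFS= read -r role; do
  echo "az role assignment create --assignee <app-id> --role \"$role\" --scope <storage-account-resource-id>"
done
```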

  • Next, in the Networking tab, under Public network access, select "Enabled from all networks" or "Enabled from select virtual networks and IP addresses". If the latter is chosen, add the IP address range for the DSPM server under the Firewall section.

  • Enable "Allow trusted Microsoft services to access this storage account" and Save the changes.

    https://(mystorageaccount).blob.core.windows.net

  • ListAttachedUserPolicies

  • ListGroups

  • ListUsers

  • ListGroupsForUser

  • PutRolePolicy

  • TagRole

  • GetGroup

  • GetRole

  • CreateRole

  • SetTopicAttributes

  • Subscribe

  • ConfirmSubscription

  • PutRule

  • UntagResource

  • ListTargetsByRule

  • RemoveTargets

  • DeleteRule


  • Create a Project in Google Cloud Console:

    • Go to the Google Cloud Console

    • Create a new project or select an existing project

  • Enable the Admin SDK:

    • In the Google Cloud Console, navigate to the "APIs & Services" > "Library"

    • Search for "Admin SDK" and click on it

    • Click the "Enable" button to enable the Admin SDK API for your project

  • Create OAuth 2.0 Credentials:

    • In the Google Cloud Console, go to APIs & Services > Credentials

    • Click "Create credentials" and select "Service account"

    • Enter a name in the Service account name field and click CREATE AND CONTINUE

    • Under "Grant this service account access to the project," select role as Owner and click DONE

    • Select the newly created service account and click Keys > Add Key > Create new key

    • Make sure the key type is set to JSON and click CREATE

    • The new private key pair is generated and downloaded to the machine. Note the values of private_key, client_email and client_id

  • Delegate domain-wide authority to your service account

    • From your domain's Admin console, go to Main menu > Security > Access and data control > API controls

    • In the Domain wide delegation pane, select Manage Domain Wide Delegation

    • Click Add new

    • In the Client ID field, enter the client ID obtained from the service account creation steps above

    • In the OAuth Scopes field, enter a comma-delimited list of the scopes required for the application

    • Use the below scopes:

      • https://www.googleapis.com/auth/admin.directory.user.readonly

      • https://www.googleapis.com/auth/admin.directory.domain.readonly

      • https://www.googleapis.com/auth/admin.directory.group.readonly

      • https://www.googleapis.com/auth/admin.directory.rolemanagement.readonly

      • https://www.googleapis.com/auth/admin.reports.audit.readonly

    • Click Authorize
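Since the Domain wide delegation form takes the scopes as one comma-delimited string, the five scopes listed above can be joined and printed for pasting:

```shell
# Build the comma-delimited OAuth scope string for the Domain wide
# delegation form from the five scopes listed above.
SCOPES="https://www.googleapis.com/auth/admin.directory.user.readonly,\
https://www.googleapis.com/auth/admin.directory.domain.readonly,\
https://www.googleapis.com/auth/admin.directory.group.readonly,\
https://www.googleapis.com/auth/admin.directory.rolemanagement.readonly,\
https://www.googleapis.com/auth/admin.reports.audit.readonly"
echo "$SCOPES"
```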

    Steps to Enable Data Streaming for Google IAM

    1. Go to the Data Sources section under Administration.

    2. From the Data Sources page, select Google IAM from the list of available data sources. In the Scan Configurations list, create a New Configuration.

    3. Make sure the connection has a Name and Credentials set, then enable the Data streaming toggle and click Save & Close to finalize the changes.

    4. Clock icon: while data streaming is being activated, a "Requested" status appears, indicating that the subscription is being processed. Once the subscription is activated, this status changes to "On".

    5. After enabling Data Streaming, the system automatically handles the subscription to Google IAM's real-time events. There is no need to manually configure webhooks.

    Monitoring Real-Time Events

    After the subscription is activated, real-time events will start flowing into the platform, and can be monitored from the relevant parts of the platform.

    Viewing Events in the Live Events Section

    1. Go to the Live Events section under Administration to view a detailed audit log of all streaming events.

    2. Filter by source to get only Google IAM events

    Monitoring Extended Streaming Events

    Once extended streaming is enabled, events will be available for monitoring in multiple sections of the platform:

    Live Events Section

    • Go to Live Events under Administration to view real-time extended events.

    • Use the filter options to narrow down events to only Google IAM activities.

    Prerequisites

    Ensure the following prerequisites are met:

    1. Existing Confluence Cloud Instance: There needs to be an active Confluence Cloud instance.

    2. Enable Development Mode: Activate Development Mode on the Confluence Cloud site to be monitored. Refer to the official Confluence documentation.

    3. Deploy Proxy Container: Set up the Getvisibility container with a public proxy to allow integration with Confluence Cloud.

    Steps to Enable Data Streaming

    Step 1: Configure Confluence Cloud Data Streaming

    1. In the product UI, go to the Data Sources > Confluence Cloud page.

    2. Locate the existing Confluence Cloud scan configuration and select Edit Configuration.

    3. Within the Edit Confluence Cloud Configuration page, toggle Data Streaming to ON.

    4. Copy the Webhook URL provided, as it will be used later.

    5. Click Save & Close to apply changes.

    To enable data streaming, the confluence-cloud-streaming-proxy container must be deployed in the infrastructure, e.g. using Docker or Kubernetes. This step involves configuring environment variables and setting up Docker for integration with Confluence Cloud.

    Step 2: Set Up confluence-cloud-streaming-proxy Application

    Deployment Instructions

    1. Download Docker image parts: Please download all files listed below:

    2. Merge Docker image parts:

    3. Load Docker image:

    4. Prepare a Docker Environment: Ensure that Docker is installed and configured on the infrastructure where the confluence-cloud-streaming-proxy application will be hosted. This will be the user environment.

    5. Set Environment Variables: Configure the following environment variables to allow the Confluence Cloud instance to communicate with the proxy application:

    Environment variable
    Description
    Example

    APP_LISTENER_PUBLIC_ACCESSIBLE_URL

    Publicly accessible URL at which the app can be accessed. It is used in communication between the Confluence Cloud Webhook mechanism and the app.

    e.g. https://your-ngrok-url.app

    APP_WEBHOOK_URL

    Webhook URL (taken from the Getvisibility UI Confluence Cloud connector configuration form).

    e.g. https://tenantabc.getvisibility.com/...

    6. Map Persistent Volume: Map a persistent volume to the /app/db/ directory within the container to ensure data retention across sessions.
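The merge and load steps above follow a standard split-archive pattern; a sketch with dummy part files (the real parts are the .partaa/.partab files attached on this page):

```shell
# Demonstrate the part-merge step with dummy files; the shell glob sorts
# partaa before partab, so plain cat restores the original order.
printf 'first-half'  > demo.tar.gz.partaa
printf 'second-half' > demo.tar.gz.partab
cat demo.tar.gz.part* > demo.tar.gz.joined
cat demo.tar.gz.joined
# → first-halfsecond-half
# With the real parts:
#   cat confluence-cloud-streaming-proxy.tar.gz.part* > \
#     confluence-cloud-streaming-proxy.tar.gz.joined
#   docker load --input confluence-cloud-streaming-proxy.tar.gz.joined
```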

    Example docker-compose.yml Configuration

    Use the following example to help set up the Docker configuration. Update the values as needed for the specific environment:

    Once configured, start the container by running docker-compose up -d or an equivalent command for the configured setup.

    Step 3: Expose the Application

    To expose the application publicly, consult the relevant internal team, such as IT or DevOps. For testing, ngrok's free plan can be used to expose the app port as needed.

    1. Start the Application: Ensure the application runs before proceeding with the integration setup.

    Step 4: Install the Integration in Confluence Cloud

    To install the integration, follow the steps:

    1. Go to the Manage apps page in Confluence Cloud.

    2. Select the Upload app option.

    3. Paste the publicly accessible address in the form and press Upload.

    4. The application will install, and the integration will be ready in a few seconds.


    Uninstall integration from Confluence Cloud

    To uninstall the integration follow the steps:

    1. Go to the Manage apps page in Confluence Cloud.

    2. Find Getvisibility Confluence Cloud Streaming Proxy and click Uninstall.

    3. Confirm by selecting Uninstall app.

    4. Delete any associated containers and settings from your organization's infrastructure.

    • confluence-cloud-streaming-proxy.tar.gz.partaa (90MB)

    • confluence-cloud-streaming-proxy.tar.gz.partab (46MB)

    Box

    How to create a Box Connector app to scan Box accounts.

    Creating a Box app

    • Login to relevant Box account.

    • Navigate to Dev Console.

    • Select Create New App and then Custom App

    • Select Server Authentication (with JWT) and enter app name, then click Create App

    • In the Configuration tab, change App Access Level to App + Enterprise Access, then enable Generate user access tokens and Make API calls using the as-user header.

    • Click on Save changes

    • Make sure the below Application Scopes are selected

      • Content Actions > Read all files and folders stored in Box

      • Content Actions > Write all files and folders stored in Box

      • Administrative Actions > Manage users

    • In the same Configuration tab, scroll down to Generate a Public/Private Keypair

    • This will result in a JSON file being downloaded by the browser

    • In the Authorization tab, click Review and Submit, add a description, and submit the app for review

    • Make note of User ID and Enterprise ID of the App in General Settings tab

    • Exit Dev Console and switch to the Admin Console

    • In Admin Console, go to Apps > Custom Apps Manager and locate the newly created app and click View button

    • Review the information and Authorize the app

    Configuring Box connector in Dashboard

    • Navigate to Administration -> Data Sources -> Box -> New scan

    • Provide the values generated in the above steps from the Box application

    • Click on the Folder icon in Path to select a particular folder to scan, or leave the path as empty to scan all folders

    • Save the configuration

    • Once the configuration is saved, click on the icon on the right and select Start trustee scan to begin the trustee scanning

    • The scan results can be viewed under Dashboard > Access Governance

    • Click on the icon on the right and select Start file scan to begin the files scanning

    • The results can be viewed under Dashboard > Enterprise Search

    File tagging

    Prerequisites

    • The Box Pricing Plans required for metadata writing are Business Plus, Enterprise, or Enterprise Plus. The basic Business plan does not include custom metadata and metadata templates.

    • A metadata template must be created to support Getvisibility's tags. Please follow the steps below to achieve this.

      • In the Admin Console, in the left-hand navigation click Content

      • Toward the top of the page, click Metadata

    OneDrive

    How to create a OneDrive Connector app to scan OneDrive accounts.

    Required Whitelisting

    The following URLs need to be whitelisted:

    • Microsoft Graph API: https://graph.microsoft.com

    • Azure Authentication: https://login.microsoftonline.com

    Registering an Azure App

    • Login to Azure Portal

    • If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application is to be registered, via the Directories + subscriptions menu.

    • Browse to App Registration and select New registration

    • On the App Registration page, enter the information below and click the Register button

      • Name: (Enter a meaningful application name that will be displayed to users of the app)

      • Supported account types:

        • Select which accounts the application will support. The options should be similar to those below. Select “Accounts in this organizational directory only”:

    • Navigate to Manage -> Certificates and secrets on the left menu, to create a new client secret

    • Provide a meaningful description and expiry to the secret, and click on Add

    • Once a client secret is created, note its Value and store it somewhere safe. NOTE: this value cannot be viewed once you leave this page

    • Navigate to Manage -> API permissions on the left menu, and Add a permission

    • Select Microsoft APIs -> Microsoft Graph

    • Select Application permissions

    • For UnifiedPolicy.Tenant.Read

      • Navigate to Manage -> API permissions on the left menu, and Add a permission

      • Select APIs my organization uses tab

    Configuring OneDrive connector in Dashboard

    • Navigate to Administration -> Data Sources -> OneDrive -> New scan

    • Provide the Directory (tenant) ID, Application (client) ID and Client Secret value generated in the above steps from the Azure application

    • Click on the Folder icon in Path to select a particular user's OneDrive to scan, or leave the path as empty to scan all users

    • Save the configuration

    • Once the configuration is saved, click on the icon on the right and select Start file scan to begin the scanning

    • The results can be viewed under Dashboard -> Enterprise Search
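To verify the registered app's credentials outside the dashboard, a client-credentials token request against the whitelisted endpoints can be sketched as below. The tenant ID, client ID, and client secret are placeholders:

```shell
# Sketch: verify the Azure app credentials with a client-credentials token
# request, then call Microsoft Graph. All IDs and secrets are placeholders.
TENANT_ID="<tenant-id>"
TOKEN_URL="https://login.microsoftonline.com/${TENANT_ID}/oauth2/v2.0/token"
echo "$TOKEN_URL"
# curl -X POST "$TOKEN_URL" \
#   -d "client_id=<client-id>" \
#   -d "client_secret=<client-secret>" \
#   -d "scope=https://graph.microsoft.com/.default" \
#   -d "grant_type=client_credentials"
# Then, with the returned access token:
# curl "https://graph.microsoft.com/v1.0/users" -H "Authorization: Bearer <token>"
```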

    Microsoft.Storage/storageAccounts/blobServices/containers/read	(Return a container or a list of containers)
    Microsoft.Storage/storageAccounts/blobServices/containers/write	(Modify a container's metadata or properties)
    Microsoft.Storage/storageAccounts/blobServices/generateUserDelegationKey/action	(Returns a user delegation key for the Blob service)
    Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read	(Return a blob or a list of blobs)
    Microsoft.Storage/storageAccounts/blobServices/containers/blobs/write	(Write to a blob)
    Microsoft.Storage/storageAccounts/blobServices/containers/blobs/move/action	(Moves the blob from one path to another)
    Microsoft.Storage/storageAccounts/blobServices/containers/blobs/add/action	(Returns the result of adding blob content)
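The custom role described in Option 2 can also be sketched as an Azure role-definition JSON created via the CLI. The role name and subscription ID are placeholders, and the portal-listed actions are split into control-plane Actions and data-plane DataActions:

```shell
# Sketch: the custom-role actions above as a role-definition JSON.
# <subscription-id> is a placeholder; the az command is left commented.
cat > custom-blob-role.json <<'EOF'
{
  "Name": "Blob Contributor (no delete)",
  "IsCustom": true,
  "AssignableScopes": ["/subscriptions/<subscription-id>"],
  "Actions": [
    "Microsoft.Storage/storageAccounts/blobServices/containers/read",
    "Microsoft.Storage/storageAccounts/blobServices/containers/write",
    "Microsoft.Storage/storageAccounts/blobServices/generateUserDelegationKey/action"
  ],
  "DataActions": [
    "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read",
    "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/write",
    "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/move/action",
    "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/add/action"
  ]
}
EOF
# az role definition create --role-definition @custom-blob-role.json
grep -c 'containers/blobs/' custom-blob-role.json
```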
    cat confluence-cloud-streaming-proxy.tar.gz.part* > \
    confluence-cloud-streaming-proxy.tar.gz.joined
    docker load --input confluence-cloud-streaming-proxy.tar.gz.joined
    services:
      app:
        image: getvisibility/confluence-cloud-streaming-proxy:v0.3.2
        ports:
          - "8080:8080"
        environment:
          APP_LISTENER_PUBLIC_ACCESSIBLE_URL: https://5977-88-156-142-22.ngrok-free.app
          APP_WEBHOOK_URL: https://tenantabc.getvisibility.com/scan-manager/external/webhooks/notification/71ccab3d56980a2d9c766f42c86d36ffedc34258a0f226aaf56a628f06e9d89d
        volumes:
          - ./app-db/:/app/db/

    Administrative Actions > Manage groups

  • Click Create New

  • Click Name Your Template and enter the name getvisibility

  • Create a new attribute named Classification with the options: Public, General Business, Confidential, Highly-Confidential

  • Similarly, create two more attributes:

    • Distribution with the options: Internal, External

    • Compliance with the options: PCI, PII, PHI

  • Use the Status drop-down to indicate this template is Visible

  • Click Save
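As an alternative to the Admin Console steps above, the same template can be sketched against Box's public metadata-template API. The access token is a placeholder, and the enum field shapes follow Box's documented schema:

```shell
# Sketch: creating the "getvisibility" template through the Box Metadata
# API instead of the Admin Console. <ACCESS_TOKEN> is a placeholder; the
# payload mirrors the attributes and options listed above.
cat > gv-template.json <<'EOF'
{
  "scope": "enterprise",
  "displayName": "getvisibility",
  "fields": [
    {"type": "enum", "key": "classification", "displayName": "Classification",
     "options": [{"key": "Public"}, {"key": "General Business"},
                 {"key": "Confidential"}, {"key": "Highly-Confidential"}]},
    {"type": "enum", "key": "distribution", "displayName": "Distribution",
     "options": [{"key": "Internal"}, {"key": "External"}]},
    {"type": "enum", "key": "compliance", "displayName": "Compliance",
     "options": [{"key": "PCI"}, {"key": "PII"}, {"key": "PHI"}]}
  ]
}
EOF
# curl -X POST "https://api.box.com/2.0/metadata_templates/schema" \
#   -H "Authorization: Bearer <ACCESS_TOKEN>" \
#   -H "Content-Type: application/json" \
#   -d @gv-template.json
grep -c '"type": "enum"' gv-template.json
```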

  • Leave the Redirect URI empty and click Register

  • Note the Application (client) ID and Directory (tenant) ID values

  • Search for Microsoft Information Protection Sync Service

    • Select Application permissions > UnifiedPolicy.Tenant.Read

  • For InformationProtectionPolicy.Read.All

    • Navigate to Manage -> API permissions on the left menu, and Add a permission

    • Select APIs my organization uses tab

    • Search for Microsoft Information Protection API

    • Select Application permissions > InformationProtectionPolicy.Read.All

    • For Azure Rights Management Services > Content.Writer

      • Navigate to Manage -> API permissions on the left menu, and Add a permission

      • Select Azure Rights Management Services tab

  • Permissions required

    • For scanning

      • Microsoft Graph > Application permissions > Sites > Sites.Read.All

      • Microsoft Graph > Application permissions > Directory > Directory.Read.All

      • Microsoft Graph > Application permissions > Files > Files.Read.All

      • Microsoft Graph > Application permissions > User > User.Read.All

    • For reading Sensitivity labels

      • Microsoft Graph > Application permissions > InformationProtectionPolicy > InformationProtectionPolicy.Read.All

      • APIs my organization uses > Microsoft Information Protection Sync Service > Application permissions > UnifiedPolicy.Tenant.Read

    • For revoke permissions

      • Microsoft Graph > Application permissions > Files > Files.ReadWrite.All

    • For tagging

      • Microsoft Graph > Application permissions > Sites > Sites.Manage.All

    • For MIP tagging

      • Azure Rights Management Services > Application permissions > Content.Writer

      • Microsoft Graph > Application permissions > Directory > Directory.Read.All

      • Microsoft Graph > Application permissions > Files > Files.ReadWrite.All

  • Once all the required permissions are added, click "Grant admin consent"

  • https://login.microsoftonline.com
    Azure Portal

    SharePoint Online

    How to create a SharePoint Connector app to scan SharePoint Online (SPO) accounts.

    Registering an Azure App

    • Login to Azure Portal

    • If there are multiple tenants to choose from, use the Settings icon in the top menu to switch to the tenant in which the application is to be registered, via the Directories + subscriptions menu.

    • Browse to App Registration and select New registration

    • On the App Registration page, enter the information below and click the Register button

      • Name: (Enter a meaningful application name that will be displayed to users of the app)

      • Supported account types:

        • Select which accounts the application will support. The options should be similar to those below. Select “Accounts in this organizational directory only”:

    • Navigate to Manage -> Certificates and secrets on the left menu, to create a new client secret

    • Provide a meaningful description and expiry to the secret, and click on Add

    • Once a client secret is created, note its Value and store it somewhere safe. NOTE: this value cannot be viewed once the page is closed.

    • Navigate to Manage -> API permissions on the left menu, and Add a permission

    • Select Microsoft APIs -> Microsoft Graph

    • Select Application permissions

    • For UnifiedPolicy.Tenant.Read

      • Navigate to Manage -> API permissions on the left menu, and Add a permission

      • Select APIs my organization uses tab

    • Permissions required

      • For scanning

        • Microsoft Graph > Application permissions > Sites > Sites.Read.All

      • For reading Sensitivity labels

    Configuring SharePoint Online connector in Dashboard

    • Navigate to Administration -> Data Sources -> SharePoint Online -> New scan

    • Provide the Directory (tenant) ID, Application (client) ID and Client Secret value generated in the above steps from the Azure application

    • Click on the Folder icon in Site and path to select a particular site to scan, or leave the path as empty to scan all sites

    • Save the configuration

    • Once the configuration is saved, click on the icon on the right and select Start file scan to begin the scanning

    • The results can be viewed under Dashboard -> Enterprise Search

    File tagging

    Prerequisites

    • First create the default Getvisibility tags as a new column in SharePoint. This process is described below:

      • In SharePoint, navigate to Documents

      • In the files view, select + Add column

    Google Drive

    How to configure a Google Drive connection to scan files and folders.

    Create OAuth2 Credentials

    • Create a Project in Google Cloud Console:

    Azure Files Streaming Configuration

    This document provides information on how to configure Azure Files connection with real-time events monitoring and data streaming.

    Prerequisites

    1. Existing Azure Files connection: An Azure Files scan configuration must already exist.

  • Select Application permissions

    • Select Content > Content.Writer

    Microsoft Graph > Application permissions > Sites > Sites.Manage.All

  • Microsoft Graph > Application permissions > InformationProtectionPolicy > InformationProtectionPolicy.Read.All

  • APIs my organization uses > Microsoft Information Protection API > Application permissions > InformationProtectionPolicy.Read.All

  • Leave the Redirect URI empty and click Register

  • Note the Application (client) ID and Directory (tenant) ID values

  • Search for Microsoft Information Protection Sync Service

    • Select Application permissions > UnifiedPolicy.Tenant.Read

  • For InformationProtectionPolicy.Read.All

    • Navigate to Manage -> API permissions on the left menu, and Add a permission

    • Select APIs my organization uses tab

    • Search for Microsoft Information Protection API

    • Select Application permissions > InformationProtectionPolicy.Read.All

  • For Azure Rights Management Services > Content.Writer

    • Navigate to Manage -> API permissions on the left menu, and Add a permission

    • Select Azure Rights Management Services tab

    • Select Application permissions

    • Select Content > Content.Writer

    • Microsoft Graph > Application permissions > InformationProtectionPolicy > InformationProtectionPolicy.Read.All

    • APIs my organization uses > Microsoft Information Protection Sync Service > Application permissions > UnifiedPolicy.Tenant.Read

  • For revoke permissions

    • Microsoft Graph > Application permissions > Files > Files.ReadWrite.All

  • For tagging

    • Microsoft Graph > Application permissions > Sites > Sites.Manage.All

  • For MIP tagging

    • Azure Rights Management Services > Application permissions > Content.Writer

    • Microsoft Graph > Application permissions > Directory > Directory.Read.All

    • Microsoft Graph > Application permissions > Sites > Sites.Manage.All

    • Microsoft Graph > Application permissions > InformationProtectionPolicy > InformationProtectionPolicy.Read.All

    • APIs my organization uses > Microsoft Information Protection API > Application permissions > InformationProtectionPolicy.Read.All

  • Once all the required permissions are added, click "Grant admin consent"

  • Select Choice and then Next

    • Set the name to Classification and the choices to: Public, Internal, Confidential, Highly-Confidential.

    • Then click Save

    • Similarly create Compliance and Distribution columns (if required)

    • Getvisibility and SharePoint's tags are now aligned

  • When tags are written to SharePoint files automatically over the API, Modified By changes to System Account, because the tags are added by Getvisibility.

    • Getvisibility preserves the Modified date where applicable.

  • Go to the Google Cloud Console
  • Create a new project or select an existing project

  • Enable the Google Drive, Drive Labels and Admin SDK API:

    • In the Google Cloud Console, navigate to APIs & Services > Library

    • Search for "Google Drive API" and click on it

    • Click the "Enable" button to enable the Google Drive API for the project

    • Search for "Admin SDK API" and click on it

    • Click the "Enable" button to enable the Admin SDK API for the project

    • Search for "Drive Labels API" and click on it

    • Click the "Enable" button to enable Drive Labels API for the project

  • Create OAuth 2.0 Credentials:

    • In the Google Cloud Console, navigate to the APIs & Services > Credentials

    • Click "Create credentials" and select "Service account"

    • Enter a name in the Service account name field and click CREATE AND CONTINUE

    • Under Grant this service account access to the project, select role as Owner and click DONE

    • Select the newly created service account and click Keys > Add Key > Create new key

    • Make sure the key type is set to JSON and click Create

    • The new private key pair is generated and downloaded to the machine. Note the values of private_key, client_email and client_id

  • Delegate domain-wide authority to your service account

    • From your domain's Admin console, go to Main menu > Security > Access and data control > API controls

    • In the Domain wide delegation pane, select "MANAGE DOMAIN-WIDE DELEGATION"

    • Click Add new

    • In the Client ID field, enter the client ID obtained from the service account creation steps above

    • In the OAuth Scopes field, enter a comma-delimited list of the scopes required for the application

    • Use the below scopes:

      For scanning

      • https://www.googleapis.com/auth/admin.directory.user.readonly

      • https://www.googleapis.com/auth/admin.directory.group.readonly

      • https://www.googleapis.com/auth/drive.readonly

      For revoke permissions

      • https://www.googleapis.com/auth/drive

      For tagging

      • https://www.googleapis.com/auth/drive.file

      • https://www.googleapis.com/auth/drive

      • https://www.googleapis.com/auth/drive.admin.labels

      • https://www.googleapis.com/auth/drive.metadata

      For Extended Streaming Events

      • https://www.googleapis.com/auth/admin.reports.audit.readonly

    • Click Authorize

    Provide required Admin roles to a user

    To perform a scan, the Google Drive connector needs a user with the below Admin roles assigned:

    • Services Admin

    • User Management

    • Groups Reader

    These roles can be added or checked for the user ID that will be used for impersonation (admin.google.com > Directory > Users > Assign roles), as follows:

    • Navigate to Admin console

    • Select Users under Directory from the left menu

    • Select a user you want to use for scanning

    • Navigate to User details -> Admin roles and privileges

    • Edit the roles, and enable:

      • Services Admin

      • User Management

      • Groups Reader

    • Click on Save

    Note: It might take a few minutes before the changes take effect.

    Configuring Google Drive connector in Dashboard

    • Navigate to Administration -> Data Sources -> Google Drive -> New scan

    • Enter the details of the OAuth2 credentials obtained previously, and add the user ID (an email address) of the user assigned roles in the above steps

    • Click on the Folder icon in Path to select a particular user's drive to scan, or leave the path as empty to scan all users

    • Save the configuration

    • Once the configuration is saved, click on the icon on the right and select Start file scan to begin scanning

    • The scan results can be viewed under Dashboard -> Enterprise Search

    File tagging

    Prerequisites

    Default Getvisibility labels need to be created in Google Drive. This process is described below:

    • Turn on Drive labels for the organization

      1. Sign in to the Google Admin Console (at admin.google.com)

      2. Go to Menu Apps > Google Workspace > Drive and Docs

      3. Click Labels

      4. Select Turn Labels On

      5. Click Save

    • Create Drive labels:

      1. Go to the labels manager at https://drive.google.com/labels.

        Requires having the .

      2. Click New label.

      3. To create one badged label:

    • Publish the labels

      1. If it’s not open already, open the labels manager (https://drive.google.com/labels) and click the label.

      2. Review the label and any fields.

      3. Click Publish.


    If an Azure Files scan does not already exist, follow this guide to create a new Azure Files scan and ensure the necessary credentials are set up.

    Steps to Enable Data Streaming

    1. Select an Existing Scan Configuration

    1. Go to the Scan configurations page in the product UI.

    2. Locate your existing Azure Files scan configuration and select Edit Configuration from the options menu. Note the configured path (folder) and save it, as it will be used in step 9 to replace {FolderPath}.

    2. Enable Data Streaming

    1. Within the Edit Azure Files Scan Configuration page, toggle Data Streaming to ON.

    2. Copy the Webhook URL provided, as you will use it later in the Azure Portal. Save this Webhook URL, as it will be used in step 9 to replace {WebhookUrl}.

    3. Click the Save & Close button to save the configuration.

    3. Create Azure Event Hub

    1. Navigate to Azure Portal Event Hubs and click Create

    2. In the Create Namespace window fill in the details:

      1. Give it a Name

      2. Select your subscription and resource group

      3. Select location

      4. Pricing tier - Standard

      5. Throughput Units - 1

    3. Click on Review + Create and then Create after validation

    4. After the namespace is created, click on the + Event Hub button

    5. In the Create Event Hub window fill in the name, click Review + Create, and then Create after validation. Save the name of the Event Hub you created in this step, as it will be used later in step 9 to replace {eventHubName}.

    6. Configure access policy

      1. In the Event Hubs namespace window click on Settings/Shared access policies and then the +Add button

      2. Fill in the details in the new tab, set LogicAppsListenerPolicy as the name, select the Listen policy, and click Save.

      3. Click on the newly created policy, then copy and save the Connection string–primary key. This will be needed later in step 8b.

    4. Configure Azure Storage Diagnostic settings

    1. Navigate to Azure Portal and open Storage Accounts

    2. Select the needed account from the Storage Accounts list

    3. In the left-hand menu, select Monitoring/Diagnostic settings and click file

    4. In the Diagnostic settings window click the "+ Add diagnostic setting" button

    5. In Create Diagnostic setting Window fill in the details:

      1. Give it a Name

      2. Select Category groups allLogs

      3. Select Destination details Stream to an event hub and select newly created Event Hub Namespace and Event Hub

    5. Configure Azure Logic Apps

    1. Go to Azure logic apps and click "Add" button

    2. In Create Logic App Window select Workflow Service Plan

    3. In the Create Logic App (Workflow Service Plan) window fill in the details:

      1. Select your subscription and resource group

      2. Give logic app name

      3. Select region

      4. Pricing plan should be WS1

      5. In the monitoring tab select No for the application insights

      6. Click Review + create button

    4. Click Create after validation

    5. In newly created logic app click on Workflows/Workflows and then +Add button

    6. In new workflow tab fill in name, select State type: Stateful and click Create

    7. In created workflow go to Developer/Designer and click on Add a trigger, then in search type "Event hub" and select "When events are available in Event Hub"

    8. Configure API connection

      1. Click on the trigger, set "Temp" for Event Hub Name and then click on Change connection.

      2. Then click Add New and fill in the details. Enter any name for the connection name and use the connection string {Connection string–primary key} from step 3.6.c.

    9. In workflow navigation tab go to Developer/Code and set the provided code, then click save:

      1. Replace with a path to the streaming folder. For ex., you want to get events from the folder "StreamingFolder" which is located in file share with the name "DocumentsShare" and in the folder with the name "Personal". In this case, the path should be "DocumentsShare/Personal/StreamingFolder"

      2. Replace {WebhookUrl} with the Webhook URL provided in the application's Scan Configuration window

    Next Steps

    After configuring the event subscription:

    • You may upload documents to the configured path.

    • The events triggered by these uploads will be processed by the Data Streaming setup, and the results will appear in your Getvisibility dashboard.

    Troubleshooting

    If you experience any issues with the configuration, ensure that:

    1. The Webhook URL is correct and matches the configuration in Azure.

    2. Steps 5.8 and 5.9 were properly executed and all the variables were replaced with real values.

    3. You can also check whether the trigger failed by navigating to the Logic App configured in the previous steps, then to Workflow and Trigger History. If you see any failed triggers, inspect the error details to identify the issue.

    Azure Blob Streaming Configuration

    This document provides information on how to configure Azure Blob connection with real-time events monitoring and data streaming.

    To enable DDR (Streaming) for an existing Azure Blob scan, follow these steps:

    Prerequisites

    1. Existing Azure Blob connection: An Azure Blob scan configuration must already exist.

    https://www.googleapis.com/auth/drive.labels

  • Choose a badged label

  • Choose to start from an example, or from scratch.

  • Update the title as Classification.

  • (Optional) Add a description or a learn more URL that points to internal documentation about the label.

  • Customize options, and assign a colour.

  • To create a standard label:

    1. Two standard labels need to be created: Distribution and Compliance

    2. Click a standard label template or click Create New.

    3. Enter or update the label name.

    4. (Optional) Add a description.

    5. Choose whether the label is copied when the file is copied.

    6. Add a field.

  • Confirm that the label will be published by clicking Publish.


    If an Azure Blob scan has not yet been created, follow this guide to create a new Azure Blob scan and ensure the necessary credentials are configured.

    Steps to Enable Data Streaming

    1. Select an Existing Scan Configuration

    1. Go to the Scan configurations page in the product UI.

    2. Find the existing Azure Blob scan configuration and select Edit Configuration from the options menu.

    2. Enable Data Streaming

    1. Within the Edit Azure Blob Scan Configuration page, toggle Data Streaming to ON.

    2. Copy the Webhook URL provided, as you will use it later in the Azure Portal.

    3. Configure Azure Event Grid Subscription

    1. Navigate to Azure Portal and open the Storage Account.

    2. Select the required account from the Storage Accounts list

    3. In the left-hand menu, select Events and click Create Event Subscription

    4. In Create Event Subscription Window fill in the details:

      1. Give it a Name

      2. Select endpoint type Web Hook

      3. Click Configure an endpoint, set the Subscriber Endpoint to the Webhook URL copied in step 2, and confirm the selection

    5. Go to Filters Menu on top

    6. In the Subject Filters section, enter the correct path format for the subscription:

      • Use the following pattern: /blobServices/default/containers/{connectionDetails.ContainerName}/blobs/{connectionDetails.FolderPath}

      • For example, if the container is mycontainer and the folder path is accuracy test/repository1, the path will look like: /blobServices/default/containers/mycontainer/blobs/accuracy test/repository1

    7. Click Create to complete the Event Subscription setup.
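The subject filter above is just a fixed prefix followed by the container name and folder path; a minimal sketch (the helper name is hypothetical):

```python
def blob_subject_filter(container_name: str, folder_path: str) -> str:
    """Build the Subject Filter value for the Event Grid subscription.

    Blob events carry subjects of the form
    /blobServices/default/containers/<container>/blobs/<blob path>.
    """
    return f"/blobServices/default/containers/{container_name}/blobs/{folder_path}"

# Reproduces the example above:
print(blob_subject_filter("mycontainer", "accuracy test/repository1"))
# → /blobServices/default/containers/mycontainer/blobs/accuracy test/repository1
```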

    4. Assign Required Azure Permissions

    Ensure the following permissions are assigned to the Azure Storage Account:

    • EventGrid Data Contributor

    • EventGrid EventSubscription Contributor

    • EventGrid TopicSpaces Publisher

    For details on assigning these roles, refer to this documentation.

    5. Create Azure Event Hub

    1. Navigate to Azure Portal Event Hubs and click Create

    2. In the Create Namespace Window fill in the details:

      1. Give it a Name

      2. Select your subscription and resource group

      3. Select location

      4. Pricing tier - Standard

      5. Throughput Units - 1

    3. Click Review + Create and then Create after validation

    4. After the namespace is created, click the + Event Hub button

    5. In the Create Event Hub Window fill in the name, click Review + Create and then Create after validation. Save the name of the Event Hub you created in this step, as it will be used later in step 7.9 to replace {eventHubName}.

    6. Configure access policy

      1. In the Event Hubs namespace window click Settings/Shared access policies and then the +Add button

      2. Fill in the details in the new tab, set LogicAppsListenerPolicy as the name, select the Listen policy, and click Save.

      3. Click on the newly created policy, then copy and save the Connection string–primary key. This will be needed later in step 7.8.

    6. Configure Azure Storage Diagnostic settings

    1. Navigate to Azure Portal and open your Storage Account.

    2. Select needed account from the Storage Accounts

    3. In the left-hand menu, select Monitoring/Diagnostic settings and click blob

    4. In Diagnostic settings Window click on "+ Add diagnostic setting" button

    5. In Create Diagnostic setting Window fill in the details:

      1. Give it a Name

      2. Under Category groups select allLogs

      3. Under Destination details select Stream to an event hub, then select the newly created Event Hub Namespace and Event Hub

    7. Configure Azure Logic Apps

    1. Go to Azure logic apps and click "Add" button

    2. In Create Logic App Window select Workflow Service Plan

    3. In the Create Logic App (Workflow Service Plan) Window fill in the details and click "Review + Create":

      1. Select your subscription and resource group

      2. Give the logic app a name

      3. Select region

      4. Pricing plan should be WS1

      5. In the Monitoring tab select No for Application Insights

      6. Click the Review + create button

    4. Click Create after validation

    5. In the newly created logic app click Workflows/Workflows and then the +Add button

    6. In the new workflow tab fill in the name, select State type: Stateful and click Create

    7. In the created workflow go to Developer/Designer and click Add a trigger, then search for "Event hub" and select "When events are available in Event Hub"

    8. Configure API connection

      1. Click on the trigger, set "Temp" for Event Hub Name and then click on Change connection.

      2. Then click Add New and fill in the details. Enter any name for the connection and use the Connection string–primary key saved when configuring the access policy in step 5 (Create Azure Event Hub).

    9. In the workflow navigation tab go to Developer/Code, paste the provided code, then click Save:

      1. Replace {FolderPath} with the path to the streaming folder. For example, to receive events from the folder "StreamingFolder", located in the file share "DocumentsShare" under the folder "Personal", the path should be "DocumentsShare/Personal/StreamingFolder"

      2. Replace {WebhookUrl} with the Webhook URL provided in the application's Scan Configuration window
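The two replacements above are plain string substitutions into the workflow code; a sketch (the helper name and example values are hypothetical):

```python
def fill_placeholders(workflow_code: str, folder_path: str, webhook_url: str) -> str:
    """Substitute the {FolderPath} and {WebhookUrl} placeholders before
    pasting the workflow definition into Developer/Code."""
    return (workflow_code
            .replace("{FolderPath}", folder_path)
            .replace("{WebhookUrl}", webhook_url))

# Illustrative fragment of the workflow definition:
snippet = "contains(item()?['uri'], '{FolderPath}/') -> POST {WebhookUrl}"
print(fill_placeholders(snippet, "DocumentsShare/Personal/StreamingFolder",
                        "https://example.com/webhook"))
```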


    Next Steps

    After configuring the event subscription:

    • Documents may be uploaded to the configured path.

    • The events triggered by these uploads will be processed by the Data Streaming setup, and the results will appear in the Getvisibility dashboard.

    Troubleshooting

    If there are any issues with the configuration, ensure that:

    1. The Webhook URL is correct and matches the configuration in Azure.

    2. The required Azure permissions are correctly assigned.

    3. Steps 7.8 and 7.9 were properly executed and all the variables were replaced with real values.

    4. You can also check whether the trigger failed by navigating to the Logic App configured in the previous steps, then to Workflow and Trigger History. If you see any failed triggers, inspect the error details to identify the issue.

    {
        "definition": {
            "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
            "actions": {
                "Filter_Records": {
                    "type": "Query",
                    "inputs": {
                        "from": "@triggerBody()?['ContentData']?['records']",
                        "where": "@and(not(empty(item()?['uri'])),or(contains(item()?['uri'], '{FolderPath}/'),contains(item()?['uri'], '{FolderPath}?')))"
                    },
                    "runAfter": {}
                },
                "Condition": {
                    "type": "If",
                    "expression": "@greater(length(body('Filter_Records')), 0)",
                    "actions": {
                        "HTTP-copy": {
                            "type": "Http",
                            "inputs": {
                                "uri": "{WebhookUrl}",
                                "method": "POST",
                                "headers": {
                                    "Content-Type": "application/json"
                                },
                                "body": {
                                    "event": "@setProperty(triggerBody(),'ContentData',setProperty(triggerBody()?['ContentData'],'records',body('Filter_Records')))"
                                }
                            },
                            "runAfter": {}
                        }
                    },
                    "else": {},
                    "runAfter": {
                        "Filter_Records": [
                            "Succeeded"
                        ]
                    }
                }
            },
            "contentVersion": "1.0.0.0",
            "outputs": {},
            "triggers": {
                "When_events_are_available_in_Event_Hub": {
                    "type": "ApiConnection",
                    "inputs": {
                        "host": {
                            "connection": {
                                "referenceName": "{connectionName}"
                            }
                        },
                        "method": "get",
                        "path": "/@{encodeURIComponent('{eventHubName}')}/events/batch/head",
                        "queries": {
                            "contentType": "application/json",
                            "consumerGroupName": "$Default",
                            "maximumEventsCount": 50
                        }
                    },
                    "recurrence": {
                        "interval": 30,
                        "frequency": "Second"
                    },
                    "splitOn": "@triggerBody()"
                }
            }
        },
        "kind": "Stateful"
    }
    
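The Filter_Records action in the workflow above keeps only records whose uri sits under the streaming folder; its where clause behaves like this Python sketch (URIs are placeholders):

```python
def matches_folder(record: dict, folder_path: str) -> bool:
    """Mirror of the where clause: the uri must be non-empty and contain
    either '<FolderPath>/' or '<FolderPath>?'."""
    uri = record.get("uri") or ""
    return bool(uri) and (f"{folder_path}/" in uri or f"{folder_path}?" in uri)

# Illustrative records, as delivered inside ContentData.records:
records = [
    {"uri": "https://acct.blob.core.windows.net/mycontainer/StreamingFolder/a.docx"},
    {"uri": "https://acct.blob.core.windows.net/mycontainer/Other/b.docx"},
]
filtered = [r for r in records if matches_folder(r, "StreamingFolder")]
print(len(filtered))  # → 1
```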

    Use the Webhook URL provided in step 2 as the Subscriber Endpoint and click Confirm Selection.

    Make sure to replace {connectionDetails.ContainerName} and {connectionDetails.FolderPath} with the actual container name and folder path from the scan configuration.

  • Click Save.

  • On the Change Connection tab, click Details and copy the Name from the connection details. Save this Name, as it will be used later in step 9 to replace {connectionName}.

  • Click Save in the workflow designer window

  • Replace {eventHubName} with the Azure Event Hub name that was created previously
  • Replace {connectionName} with the connection name from the previous step

    <figure><img src="../../.gitbook/assets/cab519c5-725f-4f62-a8d4-3bce7eb60737 (1).png" alt=""><figcaption></figcaption></figure>

    DDR Supported Events

    A comprehensive list of the supported event types by Data Source for DDR

    When DDR (also known as streaming) is enabled and events start coming in from the data source, there are two types of events:

    Informational Events:

    Examples would be Read, View, etc.

    No actions are taken when these events are detected.

    Modification Events:

    These are events that alter the file or the file permissions. Examples would include creating a file or user, changing a file name etc.

    When these types of events are detected a scan or rescan of the item will occur so that it can be classified.
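The split described above can be pictured as a simple dispatch table; the event sets below are illustrative samples drawn from the lists that follow, not the full lists:

```python
# Sketch of the informational vs. modification split (sample event names only).
MODIFICATION_EVENTS = {"FileUploaded", "FileRenamed", "FileDeleted", "CreateUser"}
INFORMATIONAL_EVENTS = {"FileAccessed", "FileDownloaded", "ConsoleLogin"}

def handle_event(event_type: str) -> str:
    """Return the action taken for an incoming event."""
    if event_type in MODIFICATION_EVENTS:
        return "rescan"       # file or permissions changed -> (re)classify the item
    return "record-only"      # informational (or unrecognized) -> no scan triggered

print(handle_event("FileUploaded"))  # → rescan
print(handle_event("FileAccessed"))  # → record-only
```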

    AWS IAM

    Events that Trigger (Re)Scan:

    Create Events:

    • CreateUser - A new user account is created.

    • CreateGroup - A new user group is created.

    • CreateRole - A new role is created with specific permissions.

    Update Events:

    • UpdateUser - Modifications are made to an existing user.

    • UpdateGroup - Changes are made to a group, such as adding or removing members.

    • UpdateRole - A role is updated with new permissions or settings.

    • AttachUserPolicy - A policy is attached to a user, modifying access rights.

    Delete Events:

    • DeleteUser - A user account is deleted.

    • DeleteGroup - A group is deleted along with its associated permissions.

    • DeleteRole - A role is deleted from IAM.

    Other Processed Events:

    Informational Events:

    • ConsoleLogin - A user logs in through the AWS console.

    • SignInFailure - A login attempt fails.

    • SignInSuccess - A login attempt is successful.

    • FederatedLogin - A user logs in via federated authentication.

    AWS S3

    Events that Trigger (Re)Scan:

    Create Events:

    • s3:ObjectCreated:Put – A new object is uploaded to an S3 bucket (PUT operation).

    • s3:ObjectCreated:Post – A new object is uploaded to an S3 bucket by an HTTP POST operation.

    • s3:ObjectCreated:CompleteMultipartUpload – An object was created after a multipart upload operation.

    • s3:ObjectCreated:Copy – A new object is created by an S3 copy operation.

    Update Events:

    • s3:ObjectRestore:Post – A restore request for an archived object is initiated.

    • s3:ObjectRestore:Delete – A restore request for an archived object is deleted.

    • s3:ObjectAcl:Put – Access control settings for an object are updated.

    • s3:ObjectTagging:Put – Tags for an object are added or modified.

    Delete Events:

    • s3:ObjectRemoved:Delete – An object is deleted from an S3 bucket.

    • s3:ObjectRemoved:DeleteMarkerCreated – A delete marker is created for an object, marking it as deleted.

    • s3:LifecycleExpiration:Delete – An object is removed due to lifecycle rules.

    • s3:LifecycleExpiration:DeleteMarkerCreated – A delete marker is created due to lifecycle rules.

    Other Processed Events:

    Informational Events:

    • s3:ReducedRedundancyLostObject - An object stored in Reduced Redundancy Storage is lost.

    • s3:LifecycleTransition – An object is transitioned to a different storage class based on lifecycle rules.

    • s3:Replication:OperationFailedReplication – The replication operation for an object failed.

    • s3:Replication:OperationNotTracked – The replication operation for an object is not tracked.
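For reference, these event types map onto an S3 bucket notification configuration; a sketch of such a configuration (the queue ARN is a placeholder, and the actual destinations used by the product may differ):

```python
# Sketch only: an S3 notification configuration covering the event families
# listed above. The queue ARN is a placeholder, not a real destination.
notification_config = {
    "QueueConfigurations": [
        {
            "QueueArn": "arn:aws:sqs:eu-west-1:123456789012:ddr-events",  # placeholder
            "Events": [
                "s3:ObjectCreated:*",    # Put, Post, Copy, CompleteMultipartUpload
                "s3:ObjectRemoved:*",    # Delete, DeleteMarkerCreated
                "s3:ObjectRestore:Post",
                "s3:ObjectTagging:Put",
                "s3:ObjectAcl:Put",
            ],
        }
    ]
}
# With boto3 this could be applied via:
#   boto3.client("s3").put_bucket_notification_configuration(
#       Bucket="my-bucket", NotificationConfiguration=notification_config)
print(len(notification_config["QueueConfigurations"][0]["Events"]))  # → 5
```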

    Azure Blob

    Events that Trigger (Re)Scan:

    Create Events:

    • Microsoft.Storage.BlobCreated - A new blob is created or content is updated in a storage container.

    • Microsoft.Storage.DirectoryCreated - A new directory is created in a storage container.

    Update Events:

    • Microsoft.Storage.BlobRenamed - A blob is renamed within a container.

    • Microsoft.Storage.DirectoryRenamed - A directory is renamed within a container.

    Delete Events:

    • Microsoft.Storage.BlobDeleted - A blob is deleted from a storage container.

    • Microsoft.Storage.DirectoryDeleted - A directory is deleted from a storage container.

    Other Processed Events:

    • Microsoft.EventGrid.SubscriptionValidationEvent - A subscription validation event.

    • Microsoft.Storage.BlobTierChanged - The storage tier of a blob is modified.

    • GetBlobServiceProperties - Retrieves properties of the Blob service.

    • GetContainerProperties - Retrieves properties of a storage container.
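A handler receiving Event Grid notifications can apply the split above by checking eventType; a minimal sketch using the blob event types listed in this section:

```python
# Azure Blob event types that should trigger a (re)scan, per the lists above.
RESCAN_TYPES = {
    "Microsoft.Storage.BlobCreated",
    "Microsoft.Storage.BlobDeleted",
    "Microsoft.Storage.BlobRenamed",
    "Microsoft.Storage.DirectoryCreated",
    "Microsoft.Storage.DirectoryRenamed",
    "Microsoft.Storage.DirectoryDeleted",
}

def should_rescan(event: dict) -> bool:
    """True when the Event Grid event alters a blob or directory."""
    return event.get("eventType") in RESCAN_TYPES

# Illustrative event payload (subject value is a placeholder):
sample = {"eventType": "Microsoft.Storage.BlobCreated",
          "subject": "/blobServices/default/containers/mycontainer/blobs/a.docx"}
print(should_rescan(sample))  # → True
```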

    Azure Files

    Events that Trigger (Re)Scan:

    Create Events:

    • CreateFile - A new file is created in an Azure Files share.

    • CreateDirectory - A new directory is created in an Azure Files share.

    • CopyFile - A file is copied to a new location.

    Update Events:

    • SetFileProperties - The properties of a file are updated.

    • SetFileMetadata - Metadata of a file is updated.

    Delete Events:

    • DeleteFile - A file is deleted from an Azure Files share.

    • DeleteDirectory - A directory is deleted from an Azure Files share.

    Other Processed Events:

    • ListShares - Lists file shares in an account.

    • GetShareProperties - Retrieves properties of a file share.

    • GetShareMetadata - Retrieves metadata of a file share.

    • GetDirectoryProperties - Retrieves properties of a directory.

    Box

    Events that Trigger (Re)Scan:

    Create Events:

    • FILE.UPLOADED - A new file is uploaded.

    • FOLDER.CREATED - A new folder is created.

    • FILE.RESTORED - A previously deleted file is restored.

    • FOLDER.RESTORED - A previously deleted folder is restored.

    Update Events:

    • FILE.MOVED - A file is moved to a new location.

    • FILE.RENAMED - A file is renamed.

    • FOLDER.RENAMED - A folder is renamed.

    • FOLDER.MOVED - A folder is moved to a new location.

    Delete Events:

    • FILE.TRASHED - A file is moved to the trash.

    • FILE.DELETED - A file is permanently deleted.

    • FOLDER.TRASHED - A folder is moved to the trash.

    • FOLDER.DELETED - A folder is permanently deleted.

    Other Processed Events:

    • FILE.DOWNLOADED - A file is downloaded.

    • FOLDER.DOWNLOADED - A folder is downloaded.

    • FILE.COPIED - A file is copied to another location.

    • FOLDER.COPIED - A folder is copied to another location.

    Confluence Cloud

    Events that Trigger (Re)Scan:

    Create Events:

    • page_created - A new page is created in Confluence.

    • blogpost_created - A new blog post is created.

    • attachment_created - A new attachment is uploaded.

    Update Events:

    • page_updated - An existing page is modified.

    • blogpost_updated - A blog post is updated.

    • attachment_updated - An attachment is updated.

    Delete Events:

    • page_deleted - A page is deleted from Confluence.

    • blogpost_deleted - A blog post is deleted.

    • attachment_deleted - An attachment is removed.

    Other Processed Events:

    • All other events are categorized as informational.

    Gmail Events

    Events that Trigger (Re)Scan:

    Create Events:

    • MessagesAdded - A new email message is added.

    Update Events:

    • LabelsAdded - A label is added to an email.

    • LabelsRemoved - A label is removed from an email.

    Delete Events:

    • MessagesDeleted - An email message is deleted.

    Google Drive Events

    Events that Trigger (Re)Scan:

    Create Events:

    • create - A new file or folder is created.

    • upload - A new file is uploaded.

    Update Events:

    • edit - A file or folder is modified.

    • rename - A file or folder is renamed.

    • move - An item is moved to a different location.

    Delete Events:

    • delete - An item is permanently removed.

    • trash - An item is moved to the trash.

    Other Processed Events:

    • view - A file or folder is viewed.

    • download - A file is downloaded.

    • preview - A file is previewed.

    • print - A file is printed.

    Google IAM Events

    Events that Trigger (Re)Scan:

    Create Events:

    • create_group - A new group is created.

    • create_user - A new user is created.

    Update Events:

    • 2sv_disable - Two-step verification is disabled.

    • 2sv_enroll - Two-step verification is enrolled.

    • password_edit - A user's password is modified.

    • recovery_email_edit - A recovery email is changed.

    Delete Events:

    • delete_group - A group is deleted.

    • delete_user - A user is deleted.

    • archive_user - A user is archived.

    • unarchive_user - A user is unarchived.


    Other Processed Events:

    • login_success - A user successfully logs in.

    • login_failure - A login attempt fails.

    • login_challenge - A login challenge occurs.

    • application_login_failure - An application login fails.

    OneDrive and SharePoint Online Events

    Events that Trigger (Re)Scan:

    Create Events:

    • FileUploaded - A new file is uploaded.

    • FolderCreated - A new folder is created.

    • FileRestored - A previously deleted file is restored.

    • FolderRestored - A previously deleted folder is restored.

    Update Events:

    • FileModified - A file is modified.

    • FileMoved - A file is moved to a new location.

    • FileRenamed - A file is renamed.

    • FolderModified - A folder is modified.

    Delete Events:

    • FileDeleted - A file is permanently deleted.

    • FolderDeleted - A folder is permanently deleted.

    • FileRecycled - A file is moved to the recycle bin.

    • FolderRecycled - A folder is moved to the recycle bin.

    Other Processed Events:

    • FileAccessed - A file is accessed.

    • FileDownloaded - A file is downloaded.

    • FilePreviewed - A file is previewed.

    • FolderCopied - A folder is copied.

  • DetachUserPolicy - A policy is removed from a user, altering permissions.

  • PutUserPolicy - A new policy is assigned to a user.

  • AttachGroupPolicy - A policy is attached to a group, affecting all its members.

  • DetachGroupPolicy - A policy is removed from a group.

  • PutGroupPolicy - A policy is assigned to a group.

  • AttachRolePolicy - A policy is attached to a role, modifying access rights.

  • DetachRolePolicy - A policy is removed from a role.

  • PutRolePolicy - A new policy is assigned to a role.

  • ChangePassword - A user changes their password.

  • AddUserToGroup - A user is added to a group, changing their access permissions.

  • RemoveUserFromGroup - A user is removed from a group.

  • SessionStart - A session begins.

  • SessionEnd - A session ends.

  • GenerateCredentialReport - A report on credentials is generated.

  • GetCredentialReport - A credential report is retrieved.

  • ListAccessKeys - Access keys for a user are listed.

  • ListUserTags - Tags associated with a user are retrieved.

  • ListUsers - Users within an AWS account are listed.

  • ListGroups - Groups within an AWS account are listed.

  • ListRoles - Roles within an AWS account are listed.

  • GetUser - Information about a specific user is retrieved.

  • GetGroup - Information about a specific group is retrieved.

  • GetRole - Information about a specific role is retrieved.

  • s3:ObjectRestore:Completed – An archived object has been fully restored and is now available.

  • s3:ObjectTagging:Delete – Tags for an object are removed.

  • s3:Replication:OperationMissedThreshold – The replication operation did not meet its threshold requirements.

  • s3:Replication:OperationReplicatedAfterThreshold – The replication operation succeeded after surpassing the threshold.

  • s3:IntelligentTiering – An object is moved between storage tiers.

  • GetContainerServiceMetadata - Retrieves metadata for a storage container.

  • ListContainers - Lists storage containers in an account.

  • BlobPreflightRequest - A request to verify blob upload conditions.

  • ListBlobs - Lists blobs in a container.

  • GetBlobProperties - Retrieves properties of a blob.

  • GetBlobMetadata - Retrieves metadata associated with a blob.

  • GetBlockList - Retrieves the list of blocks in a blob.

  • GetContainerACL - Retrieves the access control list of a container.

  • GetContainerMetadata - Retrieves metadata for a container.

  • CopyBlob - Copies a blob from one location to another.

  • CopyBlobSource - Identifies the source blob for a copy operation.

  • CopyBlobDestination - Identifies the destination blob for a copy operation.

  • DeleteBlob - Deletes a blob from a container.

  • DeleteBlobSnapshot - Deletes a snapshot of a blob.

  • DeleteContainer - Deletes a storage container.

  • PutBlob - Uploads a new blob to a container.

  • PutBlock - Uploads a block for a blob.

  • PutBlockList - Commits a set of uploaded blocks as a blob.

  • CreateBlobSnapshot - Creates a snapshot of an existing blob.

  • CreateBlockBlob - Creates a new block blob.

  • CreateContainer - Creates a new storage container.

  • SetBlobMetadata - Updates metadata for a blob.

  • SetBlobProperties - Updates properties of a blob.

  • SetContainerMetadata - Updates metadata for a storage container.

  • SetContainerACL - Modifies the access control list of a container.

  • AcquireBlobLease - Acquires a lease on a blob.

  • ReleaseBlobLease - Releases a lease on a blob.

  • RenewBlobLease - Renews a lease on a blob.

  • BreakBlobLease - Breaks an active lease on a blob.

  • AcquireContainerLease - Acquires a lease on a container.

  • BreakContainerLease - Breaks an active lease on a container.

  • ChangeBlobLease - Changes an active lease on a blob.

  • ChangeContainerLease - Changes an active lease on a container.

  • RenewContainerLease - Renews a lease on a container.

  • UndeleteBlob - Restores a deleted blob.

  • GetFileProperties - Retrieves properties of a file.

  • ListDirectoriesAndFiles - Lists directories and files in a share.

  • GetFile - Retrieves a file from a share.

  • GetFileRangeList - Retrieves the range list of a file.

  • GetShareStats - Retrieves statistics for a file share.

  • CreateShare - Creates a new file share.

  • PutRange - Uploads a range of data to a file.

  • SetShareMetadata - Updates metadata for a file share.

  • SetShareProperties - Updates properties of a file share.

  • SetDirectoryMetadata - Updates metadata of a directory.

  • SetDirectoryProperties - Updates properties of a directory.

  • ResizeFile - Resizes an existing file.

  • SetFileTier - Sets the tier of a file.

  • SetShareQuota - Updates the quota of a file share.

  • SetShareACL - Updates the access control list of a file share.

  • SetDirectoryACL - Updates the access control list of a directory.

  • SetFileACL - Updates the access control list of a file.

  • DeleteShare - Deletes a file share.

  • AcquireShareLease - Acquires a lease on a file share.

  • ReleaseShareLease - Releases a lease on a file share.

  • RenewShareLease - Renews a lease on a file share.

  • BreakShareLease - Breaks an active lease on a file share.

  • ChangeShareLease - Changes an active lease on a file share.

  • StartCopyFile - Initiates a file copy operation.

  • AbortCopyFile - Cancels an ongoing file copy operation.

  • CopyFileSource - Specifies the source file in a copy operation.

  • CopyFileDestination - Specifies the destination file in a copy operation.

  • CreateShareSnapshot - Creates a snapshot of a file share.

  • DeleteShareSnapshot - Deletes a snapshot of a file share.

  • UndeleteShare - Restores a deleted file share.

  • UndeleteFile - Restores a deleted file.

  • UndeleteDirectory - Restores a deleted directory.

  • RenameFile - Renames a file within a share.

  • RenameFileSource - Specifies the source file in a rename operation.

  • RenameFileDestination - Specifies the destination file in a rename operation.

  • RenameDirectory - Renames a directory within a share.

  • RenameDirectorySource - Specifies the source directory in a rename operation.

  • RenameDirectoryDestination - Specifies the destination directory in a rename operation.

  • COLLABORATION.CREATED - A collaboration event is created.

  • COLLABORATION.REMOVED - A collaboration is removed.

  • COLLABORATION.UPDATED - A collaboration is updated.

  • SHARED_LINK.CREATED - A shared link is created.

  • SHARED_LINK.UPDATED - A shared link is updated.

  • SHARED_LINK.DELETED - A shared link is deleted.

  • FILE.LOCKED - A file is locked for editing.

  • FILE.UNLOCKED - A file is unlocked for editing.

  • COMMENT.CREATED - A comment is added to a file.

  • COMMENT.UPDATED - A comment is updated.

  • COMMENT.DELETED - A comment is deleted.

  • METADATA_INSTANCE.CREATED - A metadata instance is created.

  • METADATA_INSTANCE.UPDATED - A metadata instance is updated.

  • METADATA_INSTANCE.DELETED - A metadata instance is deleted.

  • TASK_ASSIGNMENT.CREATED - A task is assigned.

  • TASK_ASSIGNMENT.UPDATED - A task assignment is updated.

  • SIGN_REQUEST.COMPLETED - A signature request is completed.

  • SIGN_REQUEST.DECLINED - A signature request is declined.

  • SIGN_REQUEST.EXPIRED - A signature request expires.


  • SIGN_REQUEST.SIGNER_EMAIL_BOUNCED - A signature request email bounced.
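
The dot-separated names above are Box webhook triggers, delivered in the `trigger` field of the webhook payload. A minimal sketch of routing payloads by trigger (the bucket names here are our own, not Box's):

```python
def classify_box_event(payload: dict) -> str:
    """Bucket a Box webhook payload by its trigger name.

    Assumes the webhook body carries the event name in a "trigger"
    field (e.g. "SHARED_LINK.CREATED"); bucket names are illustrative.
    """
    trigger = payload.get("trigger", "")
    category = trigger.split(".")[0]  # e.g. "SHARED_LINK"
    if category in {"SHARED_LINK", "COLLABORATION"}:
        return "sharing"
    if category in {"COMMENT", "TASK_ASSIGNMENT"}:
        return "collaboration"
    if category == "SIGN_REQUEST":
        return "signature"
    return "other"

print(classify_box_event({"trigger": "SHARED_LINK.DELETED"}))  # sharing
```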

  • access_item_content - An item's content is accessed.

  • sync - A file or folder is synced.

  • request_access - Access to an item is requested.

  • approval_requested - An approval request is sent.

  • approval_completed - An approval request is completed.

  • approval_canceled - An approval request is canceled.

  • approval_comment_added - A comment is added to an approval request.

  • approval_due_time_change - The due time for an approval request is changed.

  • approval_reviewer_change - The reviewer of an approval request is changed.

  • approval_reviewer_responded - A reviewer responds to an approval request.

  • deny_access_request - An access request is denied.

  • expire_access_request - An access request expires.

  • change_owner - The owner of an item is changed.

  • change_document_access_scope - The access scope of a document is changed.

  • change_document_visibility - The visibility of a document is changed.

  • change_acl_editors - The list of editors for a document is modified.

  • change_user_access - User access permissions are modified.

  • shared_drive_membership_change - Membership in a shared drive is changed.

  • shared_drive_settings_change - Shared drive settings are modified.

  • apply_security_update - Security updates are applied.

  • shared_drive_apply_security_update - A security update is applied to a shared drive.

  • shared_drive_remove_security_update - A security update is removed from a shared drive.

  • remove_security_update - A security update is removed.

  • enable_inherited_permissions - Inherited permissions are enabled.

  • disable_inherited_permissions - Inherited permissions are disabled.

  • recovery_phone_edit - A recovery phone number is changed.

  • recovery_secret_qa_edit - A recovery question or answer is changed.

  • account_disabled_password_leak - A user account is disabled due to a password leak.

  • account_disabled_generic - A user account is disabled.

  • account_disabled_spamming - A user account is disabled due to spamming.

  • account_disabled_spamming_through_relay - A user account is disabled for spamming via relay.

  • accept_invitation - A user accepts an invitation.

  • add_info_setting - An informational setting is added.

  • add_member - A new member is added to a group.

  • add_member_role - A role is assigned to a member.

  • add_security_setting - A security setting is added.

  • add_service_account_permission - A permission is assigned to a service account.

  • approve_join_request - A join request is approved.

  • ban_member_with_moderation - A member is banned.

  • change_info_setting - An informational setting is modified.

  • change_security_setting - A security setting is changed.

  • change_group_setting - A group setting is modified.

  • change_group_name - A group's name is changed.

  • change_first_name - A user's first name is changed.

  • change_password - A user's password is changed.

  • suspend_user - A user is suspended.

  • unsuspend_user - A user is unsuspended.

  • update_group_settings - A group's settings are updated.

  • user_license_assignment - A license is assigned to a user.

  • user_license_revoke - A license is revoked from a user.

  • add_group_member - A member is added to a group.

  • remove_group_member - A member is removed from a group.

  • change_user_access - User access permissions are changed.

  • change_acl_editors - The list of editors for a document is changed.

  • application_login_success - An application login succeeds.

  • alert_center_view - The alert center is accessed.

  • request_to_join - A request to join a group is sent.

  • request_to_join_via_mail - A request to join a group via email is sent.

  • approval_requested - An approval request is made.

  • approval_canceled - An approval request is canceled.

  • approval_comment_added - A comment is added to an approval request.

  • approval_completed - An approval request is completed.

  • approval_due_time_change - The due time of an approval request is changed.

  • approval_reviewer_change - The reviewer of an approval request is changed.

  • approval_reviewer_responded - A reviewer responds to an approval request.

  • deny_access_request - An access request is denied.

  • expire_access_request - An access request expires.

  • shared_drive_membership_change - Membership in a shared drive is changed.

  • shared_drive_settings_change - Shared drive settings are changed.

  • apply_security_update - A security update is applied.

  • remove_security_update - A security update is removed.

  • shared_drive_apply_security_update - A security update is applied to a shared drive.

  • shared_drive_remove_security_update - A security update is removed from a shared drive.

  • suspicious_login - A suspicious login is detected.

  • suspicious_login_less_secure_app - A suspicious login from a less secure app is detected.

  • suspicious_programmatic_login - A suspicious programmatic login is detected.

  • user_signed_out_due_to_suspicious_session_cookie - A user is signed out due to a suspicious session cookie.
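
The snake_case names above are Google Workspace activity event names as returned by the Admin SDK Reports API, where each activity item carries an `events` list whose entries have a `name` field. A minimal sketch of picking out the suspicious-login events (the sample activity items are hypothetical):

```python
SUSPICIOUS = {
    "suspicious_login",
    "suspicious_login_less_secure_app",
    "suspicious_programmatic_login",
    "user_signed_out_due_to_suspicious_session_cookie",
}

def suspicious_activities(activities):
    """Yield (actor email, event name) for suspicious-login events.

    Assumes the Reports API activity shape: each item has an "actor"
    with an "email" and an "events" list of {"name": ...} entries.
    """
    for item in activities:
        for event in item.get("events", []):
            if event.get("name") in SUSPICIOUS:
                yield item["actor"]["email"], event["name"]

# Hypothetical sample items for illustration only.
sample = [
    {"actor": {"email": "a@example.com"},
     "events": [{"name": "suspicious_login"}]},
    {"actor": {"email": "b@example.com"},
     "events": [{"name": "change_password"}]},
]
print(list(suspicious_activities(sample)))  # [('a@example.com', 'suspicious_login')]
```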

  • FolderMoved - A folder is moved to a new location.

  • FolderRenamed - A folder is renamed.

  • FileSensitivityLabelChanged - A file's sensitivity label is modified.

  • FileSensitivityLabelApplied - A sensitivity label is applied to a file.

  • SharingSet - Sharing permissions are updated.

  • AddedToGroup - A user is added to a group.

  • SiteDeleted - A SharePoint site is deleted.

  • GroupRemoved - A group is removed.

  • FileCopied - A file is copied.

  • SharedLinkCreated - A shared link is created.

  • SharedLinkDisabled - A shared link is disabled.

  • SharingInvitationAccepted - A sharing invitation is accepted.

  • SharingRevoked - Sharing of an item is revoked.

  • AnonymousLinkCreated - An anonymous link is created.

  • SecureLinkCreated - A secure link is created.

  • SecureLinkUpdated - A secure link is updated.

  • SecureLinkDeleted - A secure link is deleted.

  • AccessInvitationAccepted - An access invitation is accepted.

  • AccessInvitationRevoked - An access invitation is revoked.

  • AccessRequestApproved - An access request is approved.

  • AccessRequestRejected - An access request is rejected.

  • FileCheckOutDiscarded - A file checkout is discarded.

  • FileCheckedIn - A file is checked in.

  • FileCheckedOut - A file is checked out.

  • SharingInheritanceBroken - Sharing inheritance is broken.

  • AddedToSecureLink - A user is added to a secure link.

  • RemovedFromSecureLink - A user is removed from a secure link.

  • SiteCollectionCreated - A new SharePoint site collection is created.
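
The PascalCase names above appear as the `Operation` field of SharePoint and OneDrive records in the Office 365 unified audit log. A minimal sketch of narrowing audit records to sharing-related operations (assuming the Management Activity record shape, with hypothetical sample records):

```python
# Sharing-related operations drawn from the list above.
SHARING_OPS = {
    "SharingSet", "SharedLinkCreated", "SharedLinkDisabled",
    "AnonymousLinkCreated", "SharingInvitationAccepted", "SharingRevoked",
    "SecureLinkCreated", "SecureLinkUpdated", "SecureLinkDeleted",
}

def sharing_events(records):
    """Filter audit log records down to sharing-related operations.

    Assumes each record is a dict with an "Operation" field naming
    the event, as in Office 365 Management Activity API records.
    """
    return [r for r in records if r.get("Operation") in SHARING_OPS]

# Hypothetical sample records for illustration only.
sample = [
    {"Operation": "AnonymousLinkCreated", "ObjectId": "doc1.docx"},
    {"Operation": "FileCheckedOut", "ObjectId": "doc2.docx"},
]
print([r["Operation"] for r in sharing_events(sample)])  # ['AnonymousLinkCreated']
```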