Integrating Cato Events with Azure Storage Account

This article explains how to integrate an Azure storage account with your Cato account to upload events directly to a storage account.

Overview of Events Integration

If you review and analyze event data in an Azure storage account, you can configure your Cato account to automatically and continuously upload events to it. This is different from the eventsFeed API, which requires you to pull the data from Cato and is subject to issues such as rate limiting.

The Cato Cloud uploads data to the storage account every 60 seconds, or as soon as more than 10 MB of event data has accumulated. Cato uses HTTPS to upload the data to the Azure storage account.
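The dual upload trigger described above (flush every 60 seconds, or as soon as the buffer exceeds 10 MB) can be illustrated with a small sketch. This is not Cato's actual implementation; the class and threshold names are assumptions for illustration only.

```python
import time

MAX_BATCH_BYTES = 10 * 1024 * 1024  # flush when more than 10 MB is buffered
MAX_BATCH_AGE_SECONDS = 60          # ...or when 60 seconds have passed

class EventBatcher:
    """Buffers events and decides when a batch should be uploaded."""

    def __init__(self, now=time.monotonic):
        self._now = now
        self._buffer = []
        self._size = 0
        self._started = now()

    def add(self, event_bytes: bytes):
        self._buffer.append(event_bytes)
        self._size += len(event_bytes)

    def should_flush(self) -> bool:
        # Flush on either trigger: total size or batch age.
        return (self._size > MAX_BATCH_BYTES
                or self._now() - self._started >= MAX_BATCH_AGE_SECONDS)

    def flush(self) -> list:
        batch, self._buffer = self._buffer, []
        self._size = 0
        self._started = self._now()
        return batch
```

An injectable clock (`now=`) makes the age-based trigger easy to test without real waiting.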

Note

You can define up to a total of three Event Integrations for your account.

Events Integration Use Case

A sample company is using the IPS Suspicious Activity Monitoring feature, which generates many security events. They decide to create an Azure storage account to store all the event data, which they can then integrate with their SIEM solution. The company enables Events Integration and adds the Azure storage account as an integration in their Cato account, so that all the IPS events are automatically uploaded to the Azure storage.

High-Level Overview of Azure Event Integration

  1. Create a new Azure storage account and container.

  2. Azure provides a connection string as follows:

    1. Access keys - the connection string is automatically generated.

    2. SAS - configure the recommended permissions and settings, and then the connection string is generated.

  3. Create the Azure integration in the Cato Management Application using the connection string from the previous step.

Configuring the Azure Storage Account

Create a new storage account and container for the Cato event data. We recommend that you don't use an existing storage account for the Event Integration. You can use an Azure connection string from an access key or from a Shared access signature (SAS).

Using Access Keys for the Connection String

If you are using Azure access keys to authenticate Cato to the storage account, copy the connection string. You will paste this connection string in the Cato Management Application when you configure the Azure integration.

To create a storage account that uses access keys:

  1. Create a new storage account with the appropriate settings.

    1. In the Instance details, select Standard performance.

      basic_storage_account.png
    2. Click Review and then click Create.

  2. Create a new container for the event data (Data storage > Containers).

    You will enter the container Name in the Cato Management Application when you create the integration for the events (below).

  3. In the left-hand navigation pane, go to the Security + networking section and select Access keys.

  4. Copy the access keys connection string for the storage account.

    access_key_string.png
  5. Continue with Adding Azure Account Storage for Events (below).
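Before pasting the connection string into the Cato Management Application, it can help to sanity-check that it contains the fields an access-key connection string normally carries. The key names below (`AccountName`, `AccountKey`, and so on) follow Azure's documented connection-string format; the helper itself is an illustrative sketch, not part of the Cato or Azure tooling.

```python
def parse_connection_string(conn_str: str) -> dict:
    """Split an Azure connection string into its key/value parts."""
    parts = {}
    for segment in conn_str.strip().split(";"):
        if not segment:
            continue
        # partition() splits on the first "=", so base64 keys that
        # end in "=" padding are preserved intact.
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts

def looks_like_access_key_string(conn_str: str) -> bool:
    """Check for the fields present in an access-key connection string."""
    parts = parse_connection_string(conn_str)
    return {"AccountName", "AccountKey"} <= parts.keys()
```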

Using SAS for the Connection String

Azure SAS lets you restrict permissions for the storage container, limit access to allowed IP addresses, and set an expiration date for the connection string.

The token for the SAS connection string includes an expiration date, which is shown on the Event Integration page. After the expiration date, the token is no longer valid, and Cato can't push events to the storage container. To maintain uninterrupted uploading of events, make sure to generate a new connection string and apply it to the integration before the SAS expiration date.
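Because the token stops working at its expiration date, it can be useful to check how much time is left before you need to rotate the connection string. The sketch below reads the `se` (signed expiry) parameter from a SAS token; the parameter name and ISO 8601 timestamp format follow Azure's documented SAS conventions, but the helper itself is illustrative only.

```python
from datetime import datetime, timezone
from urllib.parse import parse_qs

def sas_expiry(sas_token: str) -> datetime:
    """Return the expiration time encoded in a SAS token's `se` parameter."""
    params = parse_qs(sas_token.lstrip("?"))
    expiry = params["se"][0]
    # Azure uses ISO 8601 UTC timestamps, e.g. 2025-06-30T18:00:00Z
    return datetime.fromisoformat(expiry.replace("Z", "+00:00"))

def days_until_expiry(sas_token: str, now=None) -> float:
    """How many days remain before the token expires (negative if expired)."""
    now = now or datetime.now(timezone.utc)
    return (sas_expiry(sas_token) - now).total_seconds() / 86400
```

You could run a check like this on a schedule and alert when the remaining time drops below a comfortable rotation window.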

Note

If access to the third-party service is limited to specific IP addresses, refer to this article for the list of Cato IP addresses that you need to allow (you must be signed in to view this article).

To configure a storage account in Azure to receive Cato event data:

  1. Create a new storage account with the appropriate settings.

    1. In the Instance details, select Standard performance.

      basic_storage_account.png
    2. Click Review and then click Create.

  2. Create a new container for the event data (Data storage > Containers).

    You will enter the container Name in the Cato Management Application when you create the integration for the events (below).

  3. In the left-hand navigation pane, go to the Security + networking section and select Shared access signature.

  4. Configure the SAS with the following access permissions:

    • Allowed services - Blob, File

    • Allowed resource types - Container, Object

    • Allowed permissions - Read, Write, List

    SAS_settings.png
  5. Click Generate SAS and connection string.

  6. Copy the Connection string for the storage account. You will paste this string when you create the integration for the events (below).

    sas_string.png
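If you want to confirm that a generated token actually carries the access recommended in step 4, you can inspect its `ss`, `srt`, and `sp` query parameters (in Azure's account-SAS format, Blob/File are encoded as `b`/`f`, Container/Object as `c`/`o`, and Read/Write/List as `r`/`w`/`l`). A sketch, assuming the token follows that standard format:

```python
from urllib.parse import parse_qs

REQUIRED = {
    "ss": set("bf"),    # allowed services: Blob, File
    "srt": set("co"),   # allowed resource types: Container, Object
    "sp": set("rwl"),   # allowed permissions: Read, Write, List
}

def sas_has_required_access(sas_token: str) -> bool:
    """True if the SAS token grants at least the recommended access."""
    params = parse_qs(sas_token.lstrip("?"))
    for key, required in REQUIRED.items():
        granted = set(params.get(key, [""])[0])
        if not required <= granted:
            return False
    return True
```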

Adding Azure Account Storage for Events

Create a new integration for the Azure storage account in the Events Integration tab, and paste the connection string into the integration. This string gives Cato permission to upload the event data to the storage account. You can't edit the string after creating the integration; instead, click Reset to clear the field, and then paste a new connection string.

After you define and enable the Azure storage integration, it takes a few minutes for Cato to start uploading events to the storage account.

You can choose to filter the events that are uploaded to the storage account. For example, you can upload only the IPS events for your account. By default, no filter is applied and all events are uploaded to the storage account.
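Filter matching, where an event is uploaded only if it satisfies every defined filter (an AND relationship), can be sketched as follows. The event field names (`event_type`, `event_sub_type`) are hypothetical placeholders; the actual filter fields are the ones offered in the Cato Management Application.

```python
def event_matches(event: dict, filters: dict) -> bool:
    """True only if the event satisfies every filter (AND relationship)."""
    return all(event.get(field) == wanted for field, wanted in filters.items())

def events_to_upload(events, filters):
    """Keep only events that match all configured filters."""
    if not filters:          # default: no filter, upload everything
        return list(events)
    return [e for e in events if event_matches(e, filters)]
```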

EventIntegration.png

To add an Azure storage integration to upload events for your account:

  1. From the navigation menu, select Resources > Event Integrations.

  2. Select Enable integration with Cato events.

  3. Click New. The New Integration panel opens.

  4. In Integration, select Azure Account Storage and enter the Name for the integration.

  5. Enter these Connection Details for the integration based on the settings in Azure:

    • Connection String - Paste the connection string that you copied from the storage account

    • Name - The name of the container in the storage account (must match the container name exactly)

    • (Optional) Folder - The folder path within the container, if you use one (must match the path exactly)

  6. (Optional) Define the filter settings for events that are uploaded to the storage account.

    When you define multiple filters, there is an AND relationship, and the events that match all filters are uploaded.

  7. Click Apply. The Azure storage account is now integrated with your account.


