This article explains how to integrate an Amazon Web Services (AWS) S3 bucket with your Cato account so that events are uploaded directly to the bucket.
If you review and analyze event data in an AWS S3 bucket, you can configure your Cato account to automatically and continuously upload events to the bucket. This is different from the eventsFeed API, which requires you to pull the data from Cato and is subject to issues such as rate limiting.
Note: You can define up to three Event Integrations for your account.
A sample company uses the IPS Suspicious Activity Monitoring feature, which generates a large number of security events. They decide to create an AWS S3 bucket to store all the event data, which they can then integrate with their SIEM solution. The company enables Events Integration and adds the S3 bucket as an integration to their Cato account, so that all the IPS events are automatically uploaded to the S3 bucket.
Create a new S3 bucket and define the policy that allows it to receive data. Then define an IAM role for the S3 bucket that trusts Cato's role ARN, so that the bucket permissions allow Cato to upload data to the bucket.
The Cato Cloud uploads data to the S3 bucket every 60 seconds, or whenever there is more than 10 MB of data. Cato uses HTTPS to upload the data to the S3 bucket.
Notes:
- Only regions for S3 buckets where the Security Token Service (STS) is active are supported. For more information about enabling STS for a region, see the relevant AWS documentation.
- If access to the third-party service is limited to specific IP addresses, refer to this article for the list of Cato IP addresses that you need to allow (you must be signed in to view this article). A sketch of this type of source IP restriction is shown below.
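As an illustration only: the standard AWS pattern for restricting bucket access to specific source IP addresses is a Deny statement with a NotIpAddress condition on aws:SourceIp, and the Cato addresses from the linked article must be included in the allowed ranges. The following is a minimal sketch using Python and boto3; the bucket name and IP ranges are placeholders, not Cato's actual addresses.

# Illustrative sketch only: a bucket policy that denies requests from any
# source IP outside the allowed list. The bucket name and IP ranges are
# placeholders; include the Cato IP addresses from the linked article.
import json
import boto3

BUCKET = "example-cato-events"                        # hypothetical bucket name
ALLOWED_IPS = ["203.0.113.0/24", "198.51.100.10/32"]  # placeholders, not Cato's real IPs

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyAccessFromOtherIPs",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
            "Condition": {"NotIpAddress": {"aws:SourceIp": ALLOWED_IPS}},
        }
    ],
}

# Apply the policy to the bucket.
boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))

Be careful with Deny statements of this kind: if your own address is not in the allowed list, you can lock yourself out of the bucket.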
To configure an S3 bucket in AWS to receive Cato event data:
1. Create a new S3 bucket in the appropriate AWS Region.
2. Create a new IAM policy for the S3 bucket that allows uploading data to the bucket.
3. In the policy, click the JSON tab and copy the Cato JSON below. Edit the JSON to add the name of your S3 bucket, and then paste it into the tab.

   {
     "Version": "2012-10-17",
     "Statement": [
       {
         "Sid": "",
         "Effect": "Allow",
         "Action": [
           "s3:ListBucket",
           "s3:GetBucketLocation"
         ],
         "Resource": [
           "arn:aws:s3:::<bucket name>"
         ]
       },
       {
         "Sid": "",
         "Effect": "Allow",
         "Action": [
           "s3:PutObject"
         ],
         "Resource": [
           "arn:aws:s3:::<bucket name>/*"
         ]
       }
     ]
   }

4. Review the settings for the policy and click Create policy.
5. Create a new IAM role with Cato's ARN to allow Cato to upload events for your account to the S3 bucket.
6. In the Select trusted entity screen, add Cato's ARN to the role:

   arn:aws:iam::428465470022:role/cato-events-integration

   {
     "Version": "2012-10-17",
     "Statement": [
       {
         "Sid": "Statement1",
         "Effect": "Allow",
         "Principal": {
           "AWS": "arn:aws:iam::428465470022:role/cato-events-integration"
         },
         "Action": "sts:AssumeRole"
       }
     ]
   }

   Click Next.
7. In the Add permissions screen, attach the policy that you created in step 4 to the role. Click Next.
8. Enter the Role name and click Create role.

The AWS S3 bucket is ready to integrate with your Cato account.
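If you prefer to script the AWS side instead of using the console, the following is a minimal sketch of the same steps using Python and boto3. The bucket name, region, policy name, and role name are example values only, and the sketch assumes credentials with permission to create S3 and IAM resources; review the result in the AWS console before continuing.

# Minimal sketch of the AWS setup above using boto3.
# Bucket name, region, policy name, and role name are example values.
import json
import boto3

BUCKET = "example-cato-events"   # hypothetical bucket name
REGION = "eu-west-1"             # choose a region where STS is active

s3 = boto3.client("s3", region_name=REGION)
iam = boto3.client("iam")

# Step 1: create the S3 bucket (for us-east-1, omit CreateBucketConfiguration).
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)

# Steps 2-4: create the IAM policy that allows uploading data to the bucket.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Sid": "", "Effect": "Allow",
         "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
         "Resource": [f"arn:aws:s3:::{BUCKET}"]},
        {"Sid": "", "Effect": "Allow",
         "Action": ["s3:PutObject"],
         "Resource": [f"arn:aws:s3:::{BUCKET}/*"]},
    ],
}
policy = iam.create_policy(
    PolicyName="cato-events-upload",          # hypothetical policy name
    PolicyDocument=json.dumps(bucket_policy),
)

# Steps 5-6: create the IAM role that trusts Cato's role ARN.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Sid": "Statement1", "Effect": "Allow",
         "Principal": {"AWS": "arn:aws:iam::428465470022:role/cato-events-integration"},
         "Action": "sts:AssumeRole"},
    ],
}
role = iam.create_role(
    RoleName="cato-events-integration-role",  # hypothetical role name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# Steps 7-8: attach the upload policy to the role.
iam.attach_role_policy(
    RoleName="cato-events-integration-role",
    PolicyArn=policy["Policy"]["Arn"],
)

# The Role ARN to paste into the Cato Management Application.
print(role["Role"]["Arn"])

The printed Role ARN is the value that you add to the integration in the next procedure.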
Create a new integration for the AWS S3 bucket in the Events Integration tab, and add the Role ARN to the integration. This ARN gives Cato permission to upload the event data to the S3 bucket. After you define and enable the AWS S3 integration, it takes a few minutes for Cato to start uploading events to the S3 bucket.
You can choose to filter the events that are uploaded to the S3 bucket. For example, only upload IPS events for your account to the S3 bucket. By default, no filter is applied and all events are uploaded to the S3 bucket.
To add an AWS S3 bucket integration to upload events for your account:
1. From the navigation menu, select Administration > Event Integrations.
2. Select Enable integration with Cato events.
3. Click New. The New Integration panel opens.
4. Configure the settings for the S3 bucket integration:
   - Enter the Name for the integration.
   - Enter these Connection Details for the integration, based on the settings in AWS:
     - Bucket Name - Must be identical to the name of the S3 bucket
     - Folder - Must be identical to the folder path within the S3 bucket (if necessary)
     - Region - Must be identical to the region of the S3 bucket
       Note: Only regions for S3 buckets where the Security Token Service (STS) is active are supported.
     - Role ARN - Copy and paste the ARN of the IAM role for the S3 bucket
5. (Optional) Define the filter settings for events that are uploaded to the S3 bucket. When you define multiple filters, there is an AND relationship between them, and only events that match all of the filters are uploaded.
6. Click Apply. The AWS S3 bucket is now integrated with your account.
Note: You can define up to a total of three Event Integrations for your account.
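After the integration is enabled, it can take a few minutes for the first events to arrive. If you want to verify the upload from the AWS side, the following is a minimal sketch using Python and boto3. The bucket name and folder prefix are example values, and it assumes the uploaded objects are gzip-compressed, as reported in the comments below.

# Minimal verification sketch: list recent objects that Cato uploaded and
# inspect one. Bucket name and prefix are examples; the .gz assumption is
# based on user reports in the comments.
import gzip
import boto3

BUCKET = "example-cato-events"   # hypothetical bucket name
PREFIX = "cato/"                 # hypothetical folder path, if one was configured

s3 = boto3.client("s3")

# List the most recent objects under the prefix.
response = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX, MaxKeys=10)
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"], obj["LastModified"])

# Download one object and decompress it to inspect the raw event data.
if response.get("Contents"):
    key = response["Contents"][0]["Key"]
    body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
    events = gzip.decompress(body).decode("utf-8")
    print(events[:500])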
7 comments
I have followed the specified procedure, but it is still not working. I am unsure about allowing the Cato IP addresses. Where should I configure them?
Sai Woon Si Thanks for your question. If there is an existing IAM Role that limits access based on the Source IP address, then you need to edit that Role and add the Cato IP address to it.
This AWS article explains how to deny access based on the Source IP.
I have followed the steps as described, but no object is being created in the AWS S3 bucket. Are the following S3 settings unnecessary to configure?
- Block Public Access (Default: ON)
- Bucket Policy
- ACL
- Object Ownership
- Access Point
I appreciate your assistance.
Naoki Kimura Make sure that the ACL includes the Cato IP addresses in this article.
Otherwise, I suggest you create a topic in the Cato Community and see if there is a customer who is more familiar with the situation.
Naoki Kimura - I see that we updated the Cato IP addresses in the article linked above, this may help resolve the issue.
Sorry for the inconvenience.
Hi, Simon.
Thank you for sharing the information. Although we couldn't pinpoint the exact cause, we are now able to output logs to AWS S3 without registering allowed IPs in the ACL. The only notable change we made was shortening the bucket name.
Thank you very much.
This works, but Cato sends logs in a zipped format (.gz). This is not helpful when integrating it into any SIEM. How can we get the Cato logs sent to S3 in their original format (.log) so our SIEMs can parse them as unstructured data?