Streaming data
Stream data from an IBM Log Analysis instance to other corporate tools such as Security Information and Event Management (SIEM) tools.
As of 28 March 2024 the IBM Log Analysis and IBM Cloud Activity Tracker services are deprecated and will no longer be supported as of 30 March 2025. Customers will need to migrate to IBM Cloud Logs, which replaces these two services, prior to 30 March 2025. For information about IBM Cloud Logs, see the IBM Cloud Logs documentation.
When you stream data to data lakes, other analysis tools, or other SIEM tools, you can add capabilities beyond those provided by the IBM Log Analysis service:
- You can gain visibility into enterprise data across on-premises and cloud-based environments.
- You can identify and prioritize security threats that might affect your organization.
- You can detect vulnerabilities by using Artificial Intelligence (AI) to investigate threats and incidents.
You can stream data to an Event Streams instance or to an IBM Log Analysis instance. For example, when you enable streaming on an IBM Log Analysis instance, you configure Log Analysis to send data to an Event Streams instance. Then, you can configure Kafka Connect to consume the data and forward it to your destination tool. Once the data is persisted within Event Streams, you can configure any application or service to create a subscription and take action on log data being streamed.
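The consumer side of this flow can be sketched in Python. The helper below builds the connection settings that a Kafka client (for example, kafka-python's KafkaConsumer) would need to reach an Event Streams instance; the bootstrap servers, API key, and topic name are placeholders, and the SASL/PLAIN-over-TLS convention with the literal username "token" is assumed from the Event Streams client documentation.

```python
# Sketch: build Kafka client settings for connecting to an Event Streams
# instance. Assumes the Event Streams convention of SASL/PLAIN over TLS with
# the literal username "token" and the service API key as the password.
# The bootstrap servers, API key, and topic below are placeholders.

def event_streams_client_config(bootstrap_servers, api_key):
    """Return connection settings for a Kafka client such as kafka-python."""
    return {
        "bootstrap_servers": bootstrap_servers,  # from the service credentials
        "security_protocol": "SASL_SSL",         # Event Streams requires TLS
        "sasl_mechanism": "PLAIN",
        "sasl_plain_username": "token",          # literal string "token"
        "sasl_plain_password": api_key,          # the service API key
    }

config = event_streams_client_config(
    ["broker-0.example.eventstreams.cloud.ibm.com:9093"],  # placeholder
    "YOUR_API_KEY",                                        # placeholder
)

# With kafka-python installed, a consumer subscribed to the streaming topic
# could then look like this (commented out so the sketch stays self-contained):
#
# from kafka import KafkaConsumer
# consumer = KafkaConsumer("logdna-streaming-topic", **config)
# for record in consumer:
#     handle(record.value)  # forward to your SIEM or data lake
```

The same settings dictionary works for a Kafka Connect worker or a producer; only the client class changes.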
You can also configure streaming from one IBM Log Analysis instance to a second IBM Log Analysis instance.
You can only stream from one IBM Log Analysis instance to one other IBM Log Analysis instance. You cannot stream from the second IBM Log Analysis instance to another IBM Log Analysis instance.
Currently, you can stream only up to 1 TB of data per day.
If you have regulatory requirements for data residency and compliance, you must control the locations where Log Analysis, Event Streams, Kafka Connect, and the destination tool are available.
Configure streaming
For information on how to configure streaming, see Configuring streaming.
Consider the following information when streaming data to an Event Streams instance:
- You must have the manager role to configure streaming in the Log Analysis instance. This role includes the logdna.dashboard.manage IAM action, which allows a user to perform admin tasks such as configuring streaming.
- When you configure streaming, the IBM Log Analysis instance and the Event Streams instance must be provisioned in the same account.
- To connect the IBM Log Analysis instance to the Event Streams instance, you need the following information:
  - Endpoint URLs to call the APIs
  - Credentials for authentication
- If you configure the account to restrict access to configured IP addresses through IAM settings, or if the account limits the network locations that connections are accepted from through context-based restrictions (CBR) rules for the Event Streams service, you must allowlist the Log Analysis CIDR blocks in the account. For more information, see Log Analysis CIDR blocks and Event Streams - Restricting network access.
- To create a topic in Event Streams, you must have the manager role. This role includes the messagehub.topic.manage IAM action, which allows an app or user to create or delete topics.
- The credential that Log Analysis uses to publish data to Event Streams must have the writer role. This role includes the messagehub.topic.write IAM action, which allows an app or service to write data to one or more topics.
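If your account restricts network access, you can sanity-check allowlist coverage with a small script like the one below. The CIDR blocks shown are placeholders (reserved documentation ranges), not the actual Log Analysis blocks; substitute the ranges from the Log Analysis CIDR blocks page.

```python
import ipaddress

# Check whether a source address falls inside the CIDR blocks that your
# IAM allowlist or CBR rules permit. The blocks below are placeholders
# (reserved documentation ranges) -- use the published Log Analysis
# CIDR blocks instead.
ALLOWED_BLOCKS = [
    ipaddress.ip_network("192.0.2.0/24"),     # placeholder (TEST-NET-1)
    ipaddress.ip_network("198.51.100.0/24"),  # placeholder (TEST-NET-2)
]

def is_allowlisted(address):
    """Return True if `address` is covered by at least one allowed block."""
    ip = ipaddress.ip_address(address)
    return any(ip in block for block in ALLOWED_BLOCKS)
```

Running this against each published Log Analysis block before you enable streaming helps catch allowlist gaps that would otherwise surface only as failed delivery events.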
Consider the following information when streaming data to a Log Analysis instance:
- The Log Analysis instance that will receive data must be configured with a paid service plan. Log Analysis instances on the Lite plan cannot receive streamed data.
- You must have the manager role to configure streaming in the Log Analysis instance. This role includes the logdna.dashboard.manage IAM action, which allows a user to perform admin tasks such as configuring streaming.
- When you configure streaming, the source IBM Log Analysis instance and the destination IBM Log Analysis instance can be provisioned in the same account or in different accounts.
- To connect the source IBM Log Analysis instance to the destination IBM Log Analysis instance, you need the following information:
  - The ingestion URL of the destination IBM Log Analysis instance
  - The ingestion key of the destination IBM Log Analysis instance, for authentication
- If you configure your log sources to send data through private endpoints, or through a mix of private and public endpoints, make sure you configure a private ingestion endpoint for streaming.
- If you have regulatory restrictions to keep data within specific regions, make sure streaming is configured only to a valid destination.
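The ingestion URL and ingestion key listed above can be exercised with a small request builder like the following. The endpoint path, query parameter, and Basic-auth convention (ingestion key as the username, empty password) follow the LogDNA-style ingestion API that Log Analysis is based on, but treat the exact URL shape as an assumption and confirm it against your instance's documentation before relying on it.

```python
import base64
import json
import urllib.request

def build_ingest_request(ingestion_url, ingestion_key, hostname, lines):
    """Build (but do not send) a POST to a Log Analysis ingestion endpoint.

    Assumes the LogDNA-style API: Basic auth with the ingestion key as the
    username and an empty password, and a JSON body of log lines.
    """
    token = base64.b64encode(f"{ingestion_key}:".encode()).decode()
    body = json.dumps({"lines": lines}).encode()
    return urllib.request.Request(
        f"{ingestion_url}/logs/ingest?hostname={hostname}",
        data=body,
        headers={
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder values -- substitute the destination instance's private
# ingestion endpoint and its ingestion key.
req = build_ingest_request(
    "https://logs.private.us-south.logging.cloud.ibm.com",
    "YOUR_INGESTION_KEY",
    "stream-test",
    [{"line": "connectivity check", "app": "streaming-setup", "level": "info"}],
)
# urllib.request.urlopen(req)  # uncomment to actually send the test line
```

Sending one test line this way is a quick check that the ingestion key and the private endpoint are both valid before you enable streaming.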
Monitor streaming
To monitor streaming, you can use the following services:
- IBM Cloud Monitoring, to monitor streaming to an Event Streams instance: Event Streams is integrated with the Monitoring service. Monitoring provides a default template that you can customize to monitor the Event Streams instance, how data is streamed out of Log Analysis, and how it is consumed by any application or service that subscribes to Event Streams. For more information, see Monitoring streaming by using IBM Cloud Monitoring.
- IBM Cloud Activity Tracker: Streaming generates Activity Tracker events with the action logdna.streaming-logs.send to report failures when sending data. Failures can occur for different reasons, such as invalid credentials or a deleted topic. For more information, see Monitoring streaming by using Activity Tracker.
Conditional streaming
You can configure exclusion rules to filter out data from streaming. For more information, see Configuring conditional streaming.
- You configure streaming exclusion rules through Settings > Streaming > Exclusion rules.
- The exclusion rules that you define for streaming are different from the exclusion rules that you can define at the instance level through Settings > Usage > Exclusion rules.
When you define exclusion rules, either at the instance level, or for streaming, they are applied as follows:
- Exclusion rules that you define at the instance level are applied first.
- Only the data that is retained and available for search is in scope of the exclusion rules that you define for streaming.
- After a streaming exclusion rule is active, data that matches the filter criteria is not streamed.
- Conditions that are applied by a query are enforced.
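The ordering described above can be sketched as two filter passes: instance-level rules run first, and streaming rules only see the data that survived (that is, data that is retained and searchable). The substring matching below is a stand-in for the real query-based rule conditions.

```python
# Sketch of how exclusion rules compose: instance-level rules run first,
# and streaming rules only apply to data that was retained. Substring
# matching stands in for the real query-based rule conditions.

def apply_exclusions(lines, instance_rules, streaming_rules):
    """Return (retained, streamed) after both passes of exclusion rules."""
    retained = [l for l in lines if not any(r in l for r in instance_rules)]
    streamed = [l for l in retained if not any(r in l for r in streaming_rules)]
    return retained, streamed

retained, streamed = apply_exclusions(
    ["debug: cache miss", "info: login ok", "error: disk full"],
    instance_rules=["debug:"],   # dropped before retention
    streaming_rules=["info:"],   # retained and searchable, but not streamed
)
```

Note that a line excluded at the instance level can never be streamed, even if no streaming rule matches it.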
Activity Tracker events
The following Activity Tracker events are generated when you configure streaming:
Action | Description
---|---
logdna.streaming-configuration.validate | This event is generated when you configure the connection from Log Analysis to Event Streams.
logdna.streaming-samples.send | This event is generated when sample data is sent to verify the connection.
logdna.account-streaming-setting.configure | This event is generated when you start streaming.
logdna.streaming-configuration.deactivate | This event is generated when you stop streaming.
logdna.streaming-logs.send | This event is generated when there is a failure streaming data.
logdna.exclusion-rule.create | This event is generated when a streaming exclusion rule is configured.
logdna.exclusion-rule.delete | This event is generated when a streaming exclusion rule is deleted.