Streaming data
Stream data from an IBM® Cloud Logs instance to other corporate tools such as Security Information and Event Management (SIEM) tools by integrating IBM® Cloud Logs and Event Streams.
When you stream data to data lakes, other analysis tools, or other SIEM tools, you gain capabilities beyond those provided by the IBM Cloud Logs service:
- You can gain visibility into enterprise data across on-premises and cloud-based environments.
- You can identify and prioritize security threats that might affect your organization.
- You can detect vulnerabilities by using Artificial Intelligence (AI) to investigate threats and incidents.
For example, when you enable streaming on an IBM® Cloud Logs instance, you configure IBM® Cloud Logs to send data to an Event Streams instance. Then, you can configure Kafka Connect to consume the data and forward it to your destination tool. After the data is persisted in Event Streams, any application or service can create a subscription and take action on the streamed log data.
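The following sketch shows one way an application might subscribe to the streamed log data with a Kafka client. The topic name, bootstrap endpoint, and API key are placeholders, not values from this documentation; Event Streams Kafka clients typically authenticate over SASL_SSL with the literal username `token` and a service credential API key as the password.

```python
# Minimal sketch: consume streamed IBM Cloud Logs records from an Event Streams topic.
# Assumptions: the topic name, broker list, and API key below are placeholders,
# and the streamed records are JSON-encoded.
import json
import os

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "ibm-cloud-logs-stream",                       # hypothetical topic used for streaming
    bootstrap_servers=os.environ["ES_BROKERS"].split(","),
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="token",                   # Event Streams expects the user "token"
    sasl_plain_password=os.environ["ES_API_KEY"],  # service credential API key
    auto_offset_reset="latest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for record in consumer:
    # Forward the log entry to your SIEM, data lake, or other analysis tool here.
    print(record.value)
```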
If you have regulatory requirements for data residency and compliance, you must control the locations where IBM Cloud Logs, Event Streams, Kafka Connect, and the destination tool are available.
Configure streaming
For information on how to configure streaming, see Integrating IBM Cloud Logs with Event Streams.
Consider the following information when configuring the streaming feature between an IBM Cloud Logs instance and an Event Streams instance:
- The IBM Cloud Logs instance and the Event Streams instance must be provisioned in the same account.
- You must have the manager role for the IBM Cloud Logs instance to configure streaming.
- To connect the IBM Cloud Logs instance to the Event Streams instance, you must define a service-to-service authorization. The credential that IBM Cloud Logs uses to publish data to Event Streams must have the writer role. This role includes the messagehub.topic.write IAM action, which allows an app or service to write data to one or more topics.
- To create a topic in Event Streams, you must have the manager role for the Event Streams instance. This role includes the messagehub.topic.manage IAM action, which allows an app or user to create or delete topics. For an illustration of topic creation, see the sketch after this list.
- You can define DataPrime data rules to filter the data that you stream from IBM Cloud Logs to Event Streams. For more information, see Configuring streaming data rules.
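As a sketch of the topic-creation step, the following example uses a Kafka admin client with a credential that has the manager role. The topic name, partition count, replication factor, and endpoints are assumptions; you can also create the topic from the Event Streams console or CLI.

```python
# Minimal sketch: create the target topic in Event Streams with a Kafka admin client.
# Assumptions: topic name, partitions, and broker list are placeholders; the API key
# must belong to a credential with the manager role (messagehub.topic.manage).
import os

from kafka.admin import KafkaAdminClient, NewTopic  # pip install kafka-python

admin = KafkaAdminClient(
    bootstrap_servers=os.environ["ES_BROKERS"].split(","),
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="token",
    sasl_plain_password=os.environ["ES_MANAGER_API_KEY"],
)

admin.create_topics(
    [NewTopic(name="ibm-cloud-logs-stream", num_partitions=1, replication_factor=3)]
)
```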
Activity Tracker events
The following Activity Tracker Event Routing events are generated when you manage your streaming configuration:
Action | Required permission | Description
---|---|---
logs.logs-stream-setup.get | reader | This event is generated when details about a specific streaming configuration between IBM Cloud Logs and Event Streams are retrieved.
logs.logs-stream-setup.list | reader | This event is generated when a list of all streaming configurations between IBM Cloud Logs and Event Streams is requested.