Configuring the TCO Optimizer

In IBM® Cloud Logs, you can configure the Total Cost of Ownership (TCO) Optimizer and define policies that specify how to route logs to different data pipelines based on their business value. Each pipeline has a different storage price and offers different features. By defining the data pipeline based on the importance of the data to your business, the TCO Optimizer can help you improve real-time analysis and alerting and manage costs. In addition, if you enable archive retention tags in your IBM Cloud Logs instance, you can also configure different data retention periods per policy.

About the TCO Optimizer

By default, when you send data to an IBM Cloud Logs instance, data is routed to the Priority insights data pipeline at ingestion. This data is retained and available for search by the IBM® Cloud Logs service for the number of days that you specify at the instance level. You can configure 7, 14, 30, or 90 days.
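Because the supported retention periods form a small fixed set, a client-side check of a chosen retention value can be sketched as follows. This is a minimal illustration only; the function name is hypothetical and not part of any IBM Cloud Logs SDK.

```python
# The Priority insights retention periods supported at the instance level,
# per the paragraph above. The validation helper is a hypothetical sketch,
# not part of any IBM Cloud Logs SDK.
SUPPORTED_RETENTION_DAYS = (7, 14, 30, 90)

def validate_retention(days: int) -> int:
    """Reject retention values that the instance does not support."""
    if days not in SUPPORTED_RETENTION_DAYS:
        raise ValueError(f"retention must be one of {SUPPORTED_RETENTION_DAYS}")
    return days

print(validate_retention(30))  # 30
```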

The TCO Optimizer includes the following data pipelines:

Priority insights
Logs that require immediate access and full IBM Cloud Logs analysis capabilities. These logs are typically high-severity or business-critical logs that need to be analyzed or queried individually.
Analyze and alert
Logs that require processing and can be queried later if needed from an archive. These logs are typically logs used for monitoring, troubleshooting, and statistical analysis.
Store and search
Logs that need to be kept for compliance or post-processing reasons but can be maintained and queried from an archive.
Blocked
Logs that are discarded and are not available for search.

When you provision an IBM Cloud Logs instance without an IBM Cloud Object Storage bucket, data can only be routed to the Priority insights pipeline or blocked.

When you provision an IBM Cloud Logs instance with an IBM Cloud Object Storage bucket, data routed to the Priority insights pipeline, the Analyze and alert pipeline, and the Store and search pipeline is stored in the IBM Cloud Object Storage data bucket.

To configure the TCO Optimizer and use different data pipelines, you must have an IBM Cloud Object Storage bucket attached to your IBM Cloud Logs instance. If you do not have the data bucket attached to the instance and you configure policies that send data to the Analyze and alert or the Store and search data pipelines, the data is not available for search or alerting because it cannot be archived in the bucket.

Data that you send to the Store and search pipeline by configuring a block parsing rule with the option to view the data through live tail and search is also stored in the data bucket. However, note that block parsing rules are applied after the TCO Optimizer policies. Archive retention tags cannot be used with data sent to Store and search through a block parsing rule.

When you configure TCO policies, each policy is assigned a priority. The selected priority determines the TCO data pipeline for the logs that match the criteria.

Mapping of policy priority to TCO pipeline

  Priority value   TCO pipeline
  High             Priority insights
  Medium           Analyze and alert
  Low              Store and search
  Blocked          (no pipeline) [*]

[*] Logs matching policies with the Blocked priority are dropped and are not sent to any TCO pipeline.

You can apply policies to data based on the application name, the subsystem name, and the log severity. These three fields are metadata fields that all log data must have. For more information, see Metadata fields.
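As a conceptual illustration, matching a policy filter against these metadata fields can be sketched as a simple predicate. The operator names mirror the filter options described later in this topic (All, Is, Is Not, Includes, Starts With); the function and variable names are hypothetical and are not part of the service.

```python
# Conceptual sketch of how a TCO policy filter could match log metadata.
# The operator names mirror the UI (All, Is, Is Not, Includes, Starts With);
# everything else (function name, arguments) is hypothetical.

def field_matches(rule: str, value: str, actual: str) -> bool:
    """Apply one filter operator to a metadata field."""
    if rule == "All":
        return True
    if rule == "Is":
        return actual == value
    if rule == "Is Not":
        return actual != value
    if rule == "Includes":
        return value in actual
    if rule == "Starts With":
        return actual.startswith(value)
    raise ValueError(f"unknown operator: {rule}")

# Example: a policy filter that matches applications starting with "billing-"
print(field_matches("Starts With", "billing-", "billing-api"))  # True
print(field_matches("Is Not", "test", "production"))            # True
```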

The following image shows the available data pipelines.

Each pipeline offers different features:

Features available in each data pipeline

  Feature                                                   Priority insights   Analyze and alert   Store and search
  High-speed search                                         Yes                 No                  No
  Dashboards and analytics using hot storage                Yes                 No                  No
  Dashboards and analytics using IBM Cloud Object Storage   Yes                 Yes                 No
  Intelligent log analytics                                 Yes                 Yes                 No
  Alert on logs                                             Yes                 Yes                 No
  Metrics maintained on log data for up to 1 year           Yes                 Yes                 No
  Re-index logs for further analysis                        Yes                 Yes                 Yes
  Search logs in IBM Cloud Object Storage                   Yes                 Yes                 Yes
  Store logs in IBM Cloud Object Storage                    Yes                 Yes                 Yes
  Parsing rules                                             Yes                 Yes                 Yes
  Custom data enrichment                                    Yes                 Yes                 Yes
  Schema store                                              Yes                 Yes                 Yes
  Dynamic alerting                                          Yes                 Yes                 No
  Templating                                                Yes                 Yes                 No
  Anomaly detection                                         Yes                 Yes                 No

Policies are evaluated in order, from the top down. The first policy that matches is applied, and the remaining policies are ignored.

  • If policies conflict, the first policy that is listed on the TCO Optimizer page takes precedence.
  • Define policies that block data last so that they do not prevent other policies from being evaluated.

Priority insights data pipeline

Use the Priority insights data pipeline for high priority logs that require the most immediate attention and intervention, such as logs for troubleshooting problems or analyzing unexpected behavior.

Features available for high priority logs are:

  • Serverless monitoring

  • Rapid query

  • Custom dashboards

  • Service Catalog

  • Service Map

  • Alerting

  • Events to Metrics

  • Query archive

  • Viewing traces in your explore screen

Analyze and alert data pipeline

Use the Analyze and alert data pipeline for medium priority logs that might require attention at some point, but not immediately.

Features available for medium priority logs are:

  • Service Catalog

  • Service Map

  • Alerting

  • Events to Metrics

  • Query archive

  • Viewing traces in your explore screen

Store and search data pipeline

Use the Store and search data pipeline for logs that you must keep for compliance purposes but do not require action to be taken.

Features available for low priority logs include:

  • Query archive

  • Viewing traces in your explore screen

Accessing the TCO Optimizer

Complete the following steps to access the TCO Optimizer:

  1. Launch the IBM Cloud Logs UI.

  2. Click the Data pipeline icon > TCO Optimizer.

The TCO Optimizer page shows the percentage of ingested data that is flowing to each pipeline after the configured policies are applied.

Creating a policy

You must have an IBM Cloud Object Storage data bucket configured before you create a policy.

On the TCO Optimizer page, complete the following steps to create a new policy:

  1. Click Create policy.

  2. In the Details section, complete the following tasks:

    Enter a policy name.

    Enter a description. The description is optional.

    Define the policy order. The order determines which policy is applied when multiple policies match. By default, the first policy has the highest priority.

  3. In the Filters section, add one or more filters to configure the applications, subsystems, and severity values that are relevant for this policy.

    For applications and subsystems, you can specify matching criteria by using one of the following operators: All, Is, Is Not, Includes, or Starts With.

  4. In the Priority section, set the priority for the policy. The priority determines the pipeline for logs that are matched by the policy.

    Valid values are: High for data managed through Priority insights, Medium for data managed through Analyze and alert, Low for data managed through Store and search, and Block for data that is dropped and not available for search.

    The default value is High.

  5. In the Archive retention section, choose a retention tag.

    By default, the Default tag is selected.

    Retention tags are available only if they are defined and activated.

  6. Click Apply.
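To make the fields in the preceding steps concrete, the following sketch shows one way the resulting policy could be represented as a data structure. This is an illustrative shape only, not the IBM Cloud Logs API schema; all field names and values are hypothetical.

```python
# Hypothetical representation of a policy configured through the steps above.
# This is NOT the IBM Cloud Logs API schema; all field names are illustrative.
policy = {
    "name": "payments-debug-to-archive",
    "description": "Route payments debug logs to the archive",  # optional
    "order": 1,                      # evaluated first; first match wins
    "filters": {
        "applications": {"rule": "Starts With", "value": "payments-"},
        "subsystems": {"rule": "All", "value": None},
        "severities": ["Debug", "Verbose"],
    },
    "priority": "Low",               # Low routes to the Store and search pipeline
    "archive_retention": "Default",  # a defined and activated retention tag
}

print(policy["priority"])  # Low
```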

Modifying a policy

On the TCO Optimizer page, complete the following steps to modify an existing policy:

  1. Click the policy that you want to change.

  2. Modify the criteria.

  3. Click Apply.

If you want to change the policy priority value, you can also change the priority in the Priority drop-down list for the policy in the policy list. The priority determines the pipeline for logs that are matched by the policy.

Deleting a policy

On the TCO Optimizer page, complete the following steps to delete an existing policy:

  1. Click the policy that you want to delete.

  2. Click Delete.

  3. Confirm that you want to delete the policy.

Creating an application and policy override

Configuring application and policy overrides is only available through the UI.

The Application and policy overrides section displays the usage of all applications and subsystems producing logs, sorted by the top producers. You can use the filters to easily search and filter the list.

Click a row to see a detailed view of the application-subsystem pair usage, organized by severity level and the assigned TCO pipeline priority.

In this view, you can change the priority for an entire application-subsystem pair, or change the priority for specific log severities within any application-subsystem pair. The priority determines the TCO data pipeline where the data for the pair is sent. If different severities have different priorities, the policy status displayed in the Application and policy overrides table is Multiple.
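The Multiple status can be understood as a summary of per-severity priorities: when every severity of a pair shares one priority, that priority is shown; otherwise the table shows Multiple. The sketch below illustrates this; the function name and data shapes are hypothetical.

```python
# Sketch of how per-severity overrides produce the "Multiple" status shown in
# the Application and policy overrides table. All names are illustrative.

def override_status(severity_priorities: dict[str, str]) -> str:
    """Summarize an application-subsystem pair's overrides for display."""
    distinct = set(severity_priorities.values())
    return distinct.pop() if len(distinct) == 1 else "Multiple"

# Mixed priorities across severities collapse to "Multiple".
pair = {"Error": "High", "Warning": "Medium", "Debug": "Blocked"}
print(override_status(pair))                                # Multiple
print(override_status({"Error": "High", "Info": "High"}))   # High
```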

To override the priority for a specific application-subsystem pair, complete the following steps:

  1. In Application and policy overrides, click the application-subsystem pair that you want to modify.

  2. Update the priority as needed.

    • To update the priority for the entire application-subsystem pair, change Set priority to to the preferred priority.

    • To update the priority based on the log severity, in Tune severity change the priority for the specific severity to the preferred priority.

  3. Click Apply to save the changes.

To reset all overrides for all application-subsystem pairs to the default behavior, click Reset All Overrides.