IBM Cloud Docs
Controlling data ingested for search in IBM Cloud Logs

You can control data that is ingested, and is available for search in IBM Cloud Logs. Data can be dropped during ingestion by using TCO policies or by using parsing rules.

Figure: Flow of logs through IBM Cloud Logs

TCO policy to drop logs

You can configure TCO policies to manage logs through different data pipelines based on application name, subsystem name, and severity.

You can also define a TCO policy that drops logs matching these criteria. TCO policies are applied when data is received by the ingestion endpoint, before any other IBM Cloud Logs processing.
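The drop behavior can be sketched as a simple filter over incoming logs. This is a minimal simulation of the matching logic described above, not the actual IBM Cloud Logs implementation; the field names and policy values are assumptions for illustration.

```python
# Illustrative drop-type TCO policy: a log is dropped when it matches
# the policy's application name, subsystem name, and severity criteria.
DROP_POLICY = {
    "applications": {"test-app"},
    "subsystems": {"debug-worker"},
    "severities": {"Debug", "Verbose"},
}

def is_dropped(log: dict, policy: dict = DROP_POLICY) -> bool:
    """Return True when a log matches every criterion of the drop policy."""
    return (
        log.get("applicationName") in policy["applications"]
        and log.get("subsystemName") in policy["subsystems"]
        and log.get("severity") in policy["severities"]
    )

incoming = [
    {"applicationName": "test-app", "subsystemName": "debug-worker",
     "severity": "Debug", "text": "cache miss"},
    {"applicationName": "payments", "subsystemName": "api",
     "severity": "Error", "text": "charge failed"},
]

# Only logs that do not match the drop policy continue through ingestion.
ingested = [log for log in incoming if not is_dropped(log)]
```

Because the policy runs at the ingestion endpoint, a dropped log never reaches any later parsing rules.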

Using parsing rules

After TCO policies are applied, you can drop or remove data within ingested logs by using parsing rules.

Block parsing rule

You can drop ingested logs that weren't dropped by TCO policies by using the block parsing rule. The block rule drops any log whose content matches a regular expression.

If you configure a rule group, any application name, subsystem name, or severity filtering is applied before the block rule.

When you define a block rule, you can select View blocked logs in LiveTail and archive to IBM Cloud Object Storage. Dropped logs are then saved in the Store and search pipeline, and you can still search them from archived data. In this way, ingested log data is not lost.

Using the block rule is a more refined way to move logs to low priority than using TCO policies. For performance reasons, place block rules in a rule group before any other parsing rules.
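The block rule's behavior can be sketched as a regular-expression match that splits logs into kept and blocked sets. This is an illustrative simulation, assuming a hypothetical health-check pattern and a `text` field; it is not the IBM Cloud Logs rule engine itself.

```python
import re

# Illustrative block rule: drop any log whose text matches the pattern.
BLOCK_PATTERN = re.compile(r"GET /healthcheck\b")

def apply_block_rule(logs, pattern=BLOCK_PATTERN, keep_blocked=True):
    """Split logs into (kept, blocked).

    With keep_blocked=True, blocked logs are retained rather than
    discarded, mimicking the option to view blocked logs in LiveTail
    and archive them to IBM Cloud Object Storage.
    """
    kept, blocked = [], []
    for log in logs:
        (blocked if pattern.search(log["text"]) else kept).append(log)
    return (kept, blocked) if keep_blocked else (kept, [])

logs = [
    {"text": "GET /healthcheck 200"},
    {"text": "POST /orders 500"},
]
kept, blocked = apply_block_rule(logs)
```

With the archive option enabled, the blocked logs remain searchable; with it disabled, they are discarded outright.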

Remove parsing rule

You can drop parts of ingested logs that you don't need by using the remove parsing rule.

By removing log data that you do not need, you can control IBM Cloud Logs costs.
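A remove rule can be sketched as a regular-expression substitution that strips an unneeded portion of each log before it is stored. The pattern and the user-agent example below are assumptions for illustration, not an actual IBM Cloud Logs rule definition.

```python
import re

# Illustrative remove rule: strip a verbose user-agent attribute that is
# not needed for search, reducing the volume of stored log data.
REMOVE_PATTERN = re.compile(r'\s+user_agent="[^"]*"')

def apply_remove_rule(text: str) -> str:
    """Return the log text with the matched portion removed."""
    return REMOVE_PATTERN.sub("", text)

line = 'GET /orders 200 user_agent="Mozilla/5.0 (X11; Linux x86_64)"'
cleaned = apply_remove_rule(line)  # 'GET /orders 200'
```

Removing high-volume, low-value fields like this reduces the amount of data indexed for search, which is how remove rules help control costs.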