Operational logging
Operational logs are an important complement to audit logs for ensuring your application runs smoothly. Proper operational logging can help you determine whether you need to fail over to an alternative storage or processing site. In addition, operational logging can help you determine whether operations have returned to normal after a system disruption.
Audit logs contain auditable events, while operational logs contain everything else that might be logged. For example, an operational log might contain entries that reflect what's happening as a program executes, such as functions being called or exceptions being thrown. While this kind of log data is key to keeping a system running smoothly, it is typically not sent to a SIEM.
Operational logs can be split into two categories:
- Application - Log data generated by software components that you deploy and manage within your deployment. This might come from your own code or from other software components, such as databases or message queues.
- Platform - Log data from IBM Cloud service instances that is not contained in the audit data sent to Activity Tracker Event Routing.
Application logging
You need to install your own software solution for capturing application log data within your VPC. An operational logging solution can be implemented in various ways. See Setting up an operational logging solution for one example that uses the Elasticsearch, Fluentd, and Kibana (EFK) stack on Red Hat OpenShift on IBM Cloud.
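As an illustrative sketch of how an application fits into such a pipeline (the field names here are conventions of this example, not an EFK requirement), the application can emit one JSON object per line to stdout, which a log agent such as Fluentd can then tail, parse, and forward to Elasticsearch:

```python
import json
import sys
import time

def log_event(level, message, **fields):
    """Write one JSON log record per line to stdout.

    A line-oriented JSON format is straightforward for a collection agent
    to parse; the field names (ts, level, message) are illustrative.
    """
    record = {"ts": time.time(), "level": level, "message": message, **fields}
    sys.stdout.write(json.dumps(record) + "\n")
    return record  # returned for convenience when testing

log_event("info", "cache warmed", entries=1024)
log_event("error", "queue connection lost", broker="amqp-1")
```

Structured records like these let the downstream stack index and query on individual fields rather than grepping free-form text.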
Platform logging
VPC network traffic
If you are using virtual server instances, you are required to use Flow Logs for VPC to collect, store, and present information about the Internet Protocol (IP) traffic going to and from network interfaces on virtual server instances within your VPC.
Flow logs can help with a number of tasks, including:
- Troubleshooting why specific traffic isn't reaching an instance, which helps to diagnose restrictive security group rules
- Recording the metadata of network traffic that is reaching your instance
- Determining source and destination traffic from the network interfaces
- Adhering to compliance regulations
- Assisting with root cause analysis
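To illustrate the troubleshooting task above, here is a small Python sketch that filters a batch of flow log records for rejected traffic. The record shape is simplified and hypothetical, not the exact schema that Flow Logs for VPC writes to Object Storage:

```python
def rejected_flows(records):
    """Return (source, destination, port) tuples for flows that were denied.

    Each record is a dict in a simplified, hypothetical shape carrying the
    initiator/target addresses and whether the flow was accepted or rejected.
    """
    return [
        (r["initiator_ip"], r["target_ip"], r["target_port"])
        for r in records
        if r["action"] == "rejected"
    ]

sample = [
    {"initiator_ip": "10.0.0.5", "target_ip": "10.0.1.9",
     "target_port": 443, "action": "accepted"},
    {"initiator_ip": "10.0.0.7", "target_ip": "10.0.1.9",
     "target_port": 22, "action": "rejected"},
]
# A rejected SSH flow like the second record often points at a
# restrictive security group rule blocking the traffic.
print(rejected_flows(sample))
```

A rejected flow to a port you expect to be open is a strong hint that a security group or network ACL rule is dropping the traffic.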
See Creating a flow log collector for more details. You must configure flow logs to go to a properly secured IBM Cloud Object Storage bucket that is encrypted with Keep Your Own Key (KYOK).
Red Hat OpenShift on IBM Cloud and virtual server instances
Platform logs generated by Red Hat OpenShift on IBM Cloud and virtual server instances can be collected by using your own software solution or by using the approach described in Setting up an operational logging solution.
Other IBM Cloud services
There is no Financial Services Validated solution for capturing operational platform logs from other IBM Cloud services. These logs are usually collected by sending them to IBM Cloud Log Analysis. However, if you are seeking the Financial Services Validated designation, you should not send platform logs to IBM Cloud Log Analysis.
Related controls in IBM Cloud Framework for Financial Services
The following IBM Cloud Framework for Financial Services controls are most related to this guidance. However, in addition to following the guidance here, do your own due diligence to ensure you meet the requirements.
Family | Control
---|---
Contingency Planning (CP) | CP-2 (3) Contingency Plan \| Resume Essential Missions / Business Functions
Contingency Planning (CP) | CP-6 Alternate Storage Site
Contingency Planning (CP) | CP-7 Alternate Processing Site
Contingency Planning (CP) | CP-10 Information System Recovery and Reconstitution
System and Information Integrity (SI) | SI-11 Error Handling