IBM Cloud API Docs

Introduction

You can use a collection of IBM DataStage REST APIs to process, compile, and run flows. DataStage flows are design-time assets that contain data integration logic in JSON-based schemas.

Process flows
Use the processing API to manipulate data that you have read from a data source before writing it to a data target.

Compile flows
Use the compile API to compile flows. All flows must be compiled before you can run them.

Run flows
Use the run API to run flows. When you run a flow, the extraction, loading, and transformation tasks that were built into the flow design are carried out.

You can use the DataStage REST APIs for both DataStage in Cloud Pak for Data as a service and DataStage in Cloud Pak for Data.

For more information on the DataStage service, see the following links:

The code examples on this tab use the client library that is provided for Java.

Maven

<dependency>
  <groupId>com.ibm.cloud</groupId>
  <artifactId>datastage</artifactId>
  <version>0.0.1</version>
</dependency>

Gradle

compile 'com.ibm.cloud:datastage:0.0.1'


The code examples on this tab use the client library that is provided for Node.js.

Installation

npm install datastage


The code examples on this tab use the client library that is provided for Python.

Installation

pip install --upgrade "datastage>=0.0.1"


Authentication

Before you can call an IBM DataStage API, you must first create an IAM bearer token. Tokens support authenticated requests without embedding service credentials in every call. Each token is valid for one hour. After a token expires, you must create a new one if you want to continue using the API. The recommended method to retrieve a token programmatically is to create an API key for your IBM Cloud identity and then use the IAM token API to exchange that key for a token. For more information on authentication, see the following links:

Replace {apikey} and {url} with your service credentials.

curl -X {request_method} -u "apikey:{apikey}" "{url}/v4/{method}"
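The token exchange itself can be sketched in Python with only the standard library. The helper names below are illustrative, not part of any IBM SDK; the IAM endpoint and grant type are the documented values.

```python
import json
import urllib.parse
import urllib.request

IAM_URL = "https://iam.cloud.ibm.com/identity/token"

def build_token_request(apikey: str) -> urllib.request.Request:
    """Build (but do not send) the IAM token-exchange request."""
    body = urllib.parse.urlencode({
        "grant_type": "urn:ibm:params:oauth:grant-type:apikey",
        "apikey": apikey,
    }).encode("utf-8")
    return urllib.request.Request(
        IAM_URL,
        data=body,
        headers={
            "Content-Type": "application/x-www-form-urlencoded",
            "Accept": "application/json",
        },
        method="POST",
    )

def fetch_iam_token(apikey: str) -> str:
    """Exchange an API key for a bearer token (valid for one hour)."""
    with urllib.request.urlopen(build_token_request(apikey)) as resp:
        return json.load(resp)["access_token"]
```

Pass the returned token in an `Authorization: Bearer` header on subsequent calls, and fetch a new one after it expires.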

Setting client options through external configuration

Example environment variables, where <API_KEY> is your IAM API key

DATASTAGE_AUTH_TYPE=iam
DATASTAGE_URL=https://api.dataplatform.cloud.ibm.com/data_intg
DATASTAGE_APIKEY=<API_KEY>
DATASTAGE_AUTH_URL=https://iam.cloud.ibm.com/identity/token

Example of constructing the service client

import com.ibm.cloud.datastage.v3.Datastage;
Datastage service = Datastage.newInstance();

Setting client options through external configuration

Example environment variables, where <API_KEY> is your IAM API key

DATASTAGE_AUTH_TYPE=iam
DATASTAGE_URL=https://api.dataplatform.cloud.ibm.com/data_intg
DATASTAGE_APIKEY=<API_KEY>
DATASTAGE_AUTH_URL=https://iam.cloud.ibm.com/identity/token

Example of constructing the service client

const DatastageV3 = require('datastage/datastage/v3');
const datastageService = DatastageV3.newInstance({});

Setting client options through external configuration

To authenticate when using this SDK, you need an external credentials file (for example, credentials.env). In this file, you define the four required fields for authenticating your SDK use against IAM.

Example environment variables, where <API_KEY> is your IAM API key

DATASTAGE_AUTH_TYPE=iam
DATASTAGE_URL=https://api.dataplatform.cloud.ibm.com/data_intg
DATASTAGE_APIKEY=<API_KEY>
DATASTAGE_AUTH_URL=https://iam.cloud.ibm.com/identity/token

Example of constructing the service client

import os
from datastage.datastage_v3 import DatastageV3

# define path to external credentials file
config_file = 'credentials.env'
# define a chosen service name
custom_service_name = 'DATASTAGE'

datastage_service = None

if os.path.exists(config_file):
    # set environment variable to point towards credentials file path
    os.environ['IBM_CREDENTIALS_FILE'] = config_file

    # create datastage instance using custom service name
    datastage_service = DatastageV3.new_instance(custom_service_name)

Endpoint URLs

Identify the base URL for your service instance.

IBM Cloud URLs

The base URL comes from the service instance. To find it, open the Resource list, click the name of the service, and view the service credentials. Use the value of the url field, and append the method path to form the complete API endpoint for your request.

https://api.dataplatform.cloud.ibm.com/data_intg

Example API request

curl --request GET \
  --header "Content-Type: application/json" \
  --header "Accept: application/json" \
  --header "Authorization: Bearer <IAM token>" \
  --url "https://api.dataplatform.cloud.ibm.com/data_intg/v3/data_intg_flows?project_id=<Project ID>&limit=10"

Replace <IAM token> and <Project ID> in this example with the values for your particular API call.
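Composing the complete endpoint from the base URL, a method path, and query parameters can be sketched as follows; the `endpoint` helper is illustrative, not part of any SDK.

```python
import urllib.parse

BASE_URL = "https://api.dataplatform.cloud.ibm.com/data_intg"

def endpoint(method_path: str, **query: str) -> str:
    """Join the service base URL, a method path, and optional query parameters."""
    qs = urllib.parse.urlencode(query)
    return f"{BASE_URL}{method_path}" + (f"?{qs}" if qs else "")

# Full URL for listing up to 10 flows in a project (placeholder project ID).
url = endpoint("/v3/data_intg_flows",
               project_id="bd0dbbfd-810d-4f0e-b0a9-228c328a8e23",
               limit="10")
```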

Auditing

You can monitor API activity within your account by using the DataStage service. Whenever an API method is called, an event is generated that you can then track and audit from within Activity Tracker. The specific event type is listed for each individual method.

Error handling

DataStage uses standard HTTP response codes to indicate whether a method completed successfully. HTTP response codes in the 2xx range indicate success. A response in the 4xx range is some sort of failure, and a response in the 5xx range usually indicates an internal system error that cannot be resolved by the user. Response codes are listed with the method.
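The range-based convention above can be expressed as a small helper; `classify_status` is illustrative only, not part of the SDK.

```python
def classify_status(code: int) -> str:
    """Map an HTTP response code to the outcome classes described above."""
    if 200 <= code <= 299:
        return "success"
    if 400 <= code <= 499:
        return "client error"           # e.g. bad request, unauthorized
    if 500 <= code <= 599:
        return "internal system error"  # usually not resolvable by the user
    return "unexpected"
```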

ErrorResponse

| Name             | Type    | Description                         |
| ---------------- | ------- | ----------------------------------- |
| error            | string  | Description of the problem.         |
| code             | integer | HTTP response code.                 |
| code_description | string  | Response message.                   |
| warnings         | string  | Warnings associated with the error. |
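Assuming the field names above, an error body can be mapped onto a small container like this; the class and parser are a sketch, not an SDK type.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ErrorResponse:
    error: str                      # Description of the problem.
    code: int                       # HTTP response code.
    code_description: str           # Response message.
    warnings: Optional[str] = None  # Warnings associated with the error.

def parse_error(body: dict) -> ErrorResponse:
    """Map a decoded JSON error body onto the fields documented above."""
    return ErrorResponse(
        error=body.get("error", ""),
        code=int(body.get("code", 0)),
        code_description=body.get("code_description", ""),
        warnings=body.get("warnings"),
    )
```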

Methods

Delete DataStage flows

Deletes the specified data flows in a project or catalog (either project_id or catalog_id must be set).

If the deletion of the data flows and their runs will take some time to finish, then a 202 response will be returned and the deletion will continue asynchronously.

All the data flow runs associated with the data flows will also be deleted. If a data flow is still running, it will not be deleted unless the force parameter is set to true. If a data flow is still running and the force parameter is set to true, the call returns immediately with a 202 response. The related data flows are deleted after the data flow runs are stopped.


DELETE /v3/data_intg_flows
ServiceCall<Void> deleteDatastageFlows(DeleteDatastageFlowsOptions deleteDatastageFlowsOptions)
deleteDatastageFlows(params)
delete_datastage_flows(self,
        id: List[str],
        *,
        catalog_id: str = None,
        project_id: str = None,
        force: bool = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the DeleteDatastageFlowsOptions.Builder to create a DeleteDatastageFlowsOptions object that contains the parameter values for the deleteDatastageFlows method.

Query Parameters

  • id: The list of DataStage flow IDs to delete.

  • catalog_id: The ID of the catalog to use. catalog_id or project_id is required.

  • project_id: The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • force: Whether to stop all running data flows. Running DataStage flows must be stopped before the DataStage flows can be deleted.


  • curl -X DELETE --location \
      --header "Authorization: Bearer {iam_token}" \
      "{base_url}/v3/data_intg_flows?id=[]&project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • String[] ids = new String[] {flowID, cloneFlowID};
    DeleteDatastageFlowsOptions deleteDatastageFlowsOptions = new DeleteDatastageFlowsOptions.Builder()
      .id(Arrays.asList(ids))
      .projectId(projectID)
      .build();
    
    datastageService.deleteDatastageFlows(deleteDatastageFlowsOptions).execute();
  •      const params = {
           id: [flowID, cloneFlowID],
           projectId: projectID,
         };
         const res = await datastageService.deleteDatastageFlows(params);
  • response = datastage_service.delete_datastage_flows(
      id=[createdFlowId],
      project_id=config['PROJECT_ID']
    )

Response

Status Code

  • The requested operation is in progress.

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

No Sample Response

This method does not specify any sample responses.

Get metadata and lock information for DataStage flows

Lists the metadata, entity and lock information for DataStage flows that are contained in the specified project.


Use the following parameters to filter the results:

| Field              | Match type  | Example                        |
| ------------------ | ----------- | ------------------------------ |
| entity.name        | Equals      | entity.name=MyDataStageFlow    |
| entity.name        | Starts with | entity.name=starts:MyData      |
| entity.description | Equals      | entity.description=movement    |
| entity.description | Starts with | entity.description=starts:data |

To sort the results, use one or more of the parameters described in the following section. If no sort key is specified, the results are sorted in descending order on metadata.create_time (i.e. returning the most recently created data flows first).

| Field | Example                                                       |
| ----- | ------------------------------------------------------------- |
| sort  | sort=+entity.name (sort by ascending name)                    |
| sort  | sort=-metadata.create_time (sort by descending creation time) |

Multiple sort keys can be specified by delimiting them with a comma. For example, to sort in descending order on create_time and then in ascending order on name use: sort=-metadata.create_time,+entity.name.
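Filter and sort parameters combine into an ordinary query string; a minimal Python sketch, with illustrative values:

```python
import urllib.parse

# Filter to names starting with "MyData"; sort newest first, then by ascending name.
params = {
    "project_id": "bd0dbbfd-810d-4f0e-b0a9-228c328a8e23",
    "entity.name": "starts:MyData",
    "sort": "-metadata.create_time,+entity.name",
}
query_string = urllib.parse.urlencode(params)
```

Note that `urlencode` percent-encodes the `:`, `,`, and `+` characters, which the service decodes on receipt.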


GET /v3/data_intg_flows
ServiceCall<DataFlowPagedCollection> listDatastageFlows(ListDatastageFlowsOptions listDatastageFlowsOptions)
listDatastageFlows(params)
list_datastage_flows(self,
        *,
        catalog_id: str = None,
        project_id: str = None,
        sort: str = None,
        start: str = None,
        limit: int = None,
        entity_name: str = None,
        entity_description: str = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the ListDatastageFlowsOptions.Builder to create a ListDatastageFlowsOptions object that contains the parameter values for the listDatastageFlows method.

Query Parameters

  • catalog_id: The ID of the catalog to use. catalog_id or project_id is required.

  • project_id: The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • sort: The field to sort the results on, including whether to sort ascending (+) or descending (-), for example, sort=-metadata.create_time.

  • start: The page token indicating where to start paging from.

  • limit: The limit of the number of items to return, for example limit=50. If not specified, a default of 100 is used.

    Possible values: value ≥ 1

    Example: 100

  • entity.name: Filter results based on the specified name.

    Example: MyDataStageFlow

  • entity.description: Filter results based on the specified description.


  • curl -X GET --location \
      --header "Authorization: Bearer {iam_token}" \
      --header "Accept: application/json;charset=utf-8" \
      "{base_url}/v3/data_intg_flows?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23&limit=100"
  • ListDatastageFlowsOptions listDatastageFlowsOptions = new ListDatastageFlowsOptions.Builder()
      .projectId(projectID)
      .limit(Long.valueOf("100"))
      .build();
    
    Response<DataFlowPagedCollection> response = datastageService.listDatastageFlows(listDatastageFlowsOptions).execute();
    DataFlowPagedCollection dataFlowPagedCollection = response.getResult();
    
    System.out.println(dataFlowPagedCollection);
  •      const params = {
           projectId: projectID,
           sort: 'name',
           limit: 100,
         };
         const res = await datastageService.listDatastageFlows(params);
  • data_flow_paged_collection = datastage_service.list_datastage_flows(
      project_id=config['PROJECT_ID'],
      limit=100
    ).get_result()
    
    print(json.dumps(data_flow_paged_collection, indent=2))

Response

A page from a collection of DataStage flows.

Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

Example responses
  • {
      "data_flows": [
        {
          "entity": {
            "data_intg_flow": {
              "mime_type": "application/json",
              "dataset": false
            }
          },
          "metadata": {
            "asset_id": "{asset_id}",
            "asset_type": "data_intg_flow",
            "create_time": "2021-04-03 15:32:55+00:00",
            "creator_id": "IBMid-xxxxxxxxx",
            "description": " ",
            "href": "{url}/data_intg/v3/data_intg_flows/{asset_id}?project_id={project_id}",
            "name": "{job_name}",
            "project_id": "{project_id}",
            "resource_key": "{project_id}/data_intg_flow/{job_name}",
            "size": 5780,
            "usage": {
              "access_count": 0,
              "last_access_time": "2021-04-03 15:33:01.320000+00:00",
              "last_accessor_id": "IBMid-xxxxxxxxx",
              "last_modification_time": "2021-04-03 15:33:01.320000+00:00",
              "last_modifier_id": "IBMid-xxxxxxxxx"
            }
          }
        }
      ],
      "first": {
        "href": "{url}/data_intg/v3/data_intg_flows?project_id={project_id}&limit=2"
      },
      "next": {
        "href": "{url}/data_intg/v3/data_intg_flows?project_id={project_id}&limit=2&start=g1AAAADOeJzLYWBgYMpgTmHQSklKzi9KdUhJMjTUS8rVTU7WLS3WLc4vLcnQNbLQS87JL01JzCvRy0styQHpyWMBkgwNQOr____9WWCxXCAhYmRgZKhrYKJrYBxiaGplbGRlahqVaJCFZocB8XYcgNhxHrcdhlamhlGJ-llZAD4lOMI"
      },
      "total_count": 135
    }
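The next.href value in a paged response carries the start token for the following page; extracting it can be sketched as follows (the helper name is illustrative):

```python
import urllib.parse
from typing import Optional

def next_start_token(page: dict) -> Optional[str]:
    """Return the 'start' token from a paged response, or None on the last page."""
    href = page.get("next", {}).get("href")
    if not href:
        return None
    query = urllib.parse.urlparse(href).query
    return urllib.parse.parse_qs(query).get("start", [None])[0]
```

Pass the returned token as the start query parameter of the next list call; a missing next link means there are no further pages.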

Create DataStage flow

Creates a DataStage flow in the specified project or catalog (either project_id or catalog_id must be set). All subsequent calls to use the data flow must specify the project or catalog ID the data flow was created in.

POST /v3/data_intg_flows
ServiceCall<DataIntgFlow> createDatastageFlows(CreateDatastageFlowsOptions createDatastageFlowsOptions)
createDatastageFlows(params)
create_datastage_flows(self,
        data_intg_flow_name: str,
        *,
        pipeline_flows: 'PipelineJson' = None,
        catalog_id: str = None,
        project_id: str = None,
        asset_category: str = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the CreateDatastageFlowsOptions.Builder to create a CreateDatastageFlowsOptions object that contains the parameter values for the createDatastageFlows method.

Query Parameters

  • data_intg_flow_name: The data flow name.

  • catalog_id: The ID of the catalog to use. catalog_id or project_id is required.

  • project_id: The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • asset_category: The category of the asset. Must be either SYSTEM or USER. Only a registered service can use this parameter.

    Allowable values: [system,user]

Pipeline json to be attached.


  • curl -X POST --location \
      --header "Authorization: Bearer {iam_token}" \
      --header "Accept: application/json;charset=utf-8" \
      --header "Content-Type: application/json;charset=utf-8" \
      --data '{}' \
      "{base_url}/v3/data_intg_flows?data_intg_flow_name={data_intg_flow_name}&project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • PipelineJson exampleFlow = PipelineFlowHelper.buildPipelineFlow(flowJson);
    CreateDatastageFlowsOptions createDatastageFlowsOptions = new CreateDatastageFlowsOptions.Builder()
      .dataIntgFlowName(flowName)
      .pipelineFlows(exampleFlow)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlow> response = datastageService.createDatastageFlows(createDatastageFlowsOptions).execute();
    DataIntgFlow dataIntgFlow = response.getResult();
    System.out.println(dataIntgFlow);
  •      const pipelineJsonFromFile = JSON.parse(fs.readFileSync('testInput/rowgen_peek.json', 'utf-8'));
         const params = {
           dataIntgFlowName,
           pipelineFlows: pipelineJsonFromFile,
           projectId: projectID,
           assetCategory: 'system',
         };
         const res = await datastageService.createDatastageFlows(params);
  • data_intg_flow = datastage_service.create_datastage_flows(
      data_intg_flow_name='testFlowJob1',
      pipeline_flows=UtilHelper.readJsonFileToDict('inputFiles/exampleFlow.json'),
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow, indent=2))

Response

A DataStage flow model that defines physical source(s), physical target(s) and an optional pipeline containing operations to apply to source(s).

Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

Example responses
  • {
      "entity": {
        "data_intg_flow": {
          "dataset": false,
          "mime_type": "application/json"
        }
      },
      "metadata": {
        "asset_id": "{asset_id}",
        "asset_type": "data_intg_flow",
        "href": "{url}/data_intg/v3/data_intg_flows/{asset_id}",
        "name": "{job_name}",
        "origin_country": "US",
        "resource_key": "{project_id}/data_intg_flow/{job_name}"
      }
    }

Get DataStage flow

Gets the DataStage flow that is contained in the specified project. Attachments, metadata, and a limited number of attributes from the entity of the DataStage flow are returned.

GET /v3/data_intg_flows/{data_intg_flow_id}
ServiceCall<DataIntgFlowJson> getDatastageFlows(GetDatastageFlowsOptions getDatastageFlowsOptions)
getDatastageFlows(params)
get_datastage_flows(self,
        data_intg_flow_id: str,
        *,
        catalog_id: str = None,
        project_id: str = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the GetDatastageFlowsOptions.Builder to create a GetDatastageFlowsOptions object that contains the parameter values for the getDatastageFlows method.

Path Parameters

  • data_intg_flow_id: The DataStage flow ID to use.

Query Parameters

  • catalog_id: The ID of the catalog to use. catalog_id or project_id is required.

  • project_id: The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • curl -X GET --location \
      --header "Authorization: Bearer {iam_token}" \
      --header "Accept: application/json;charset=utf-8" \
      "{base_url}/v3/data_intg_flows/{data_intg_flow_id}?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • GetDatastageFlowsOptions getDatastageFlowsOptions = new GetDatastageFlowsOptions.Builder()
      .dataIntgFlowId(flowID)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlowJson> response = datastageService.getDatastageFlows(getDatastageFlowsOptions).execute();
    DataIntgFlowJson dataIntgFlowJson = response.getResult();
    
    System.out.println(dataIntgFlowJson);
  •      const params = {
           dataIntgFlowId: assetID,
           projectId: projectID,
         };
         const res = await datastageService.getDatastageFlows(params);
  • data_intg_flow_json = datastage_service.get_datastage_flows(
      data_intg_flow_id=createdFlowId,
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow_json, indent=2))

Response

A pipeline JSON containing operations to apply to source(s).

Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • Unexpected error.

Example responses
  • {
      "attachments": {
        "app_data": {
          "datastage": {
            "external_parameters": []
          }
        },
        "doc_type": "pipeline",
        "id": "98cc1fa0-0fd8-4d55-9b27-d477096b4b37",
        "json_schema": "{url}/schemas/common-pipeline/pipeline-flow/pipeline-flow-v3-schema.json",
        "pipelines": [
          {
            "app_data": {
              "datastage": {
                "runtime_column_propagation": "false"
              },
              "ui_data": {
                "comments": []
              }
            },
            "id": "287b2b30-95ff-4cc8-b18f-92e23c464134",
            "nodes": [
              {
                "app_data": {
                  "datastage": {
                    "outputs_order": "46e18367-1820-4fe8-8c7c-d8badbc76aa3"
                  },
                  "ui_data": {
                    "image": "../graphics/palette/PxRowGenerator.svg",
                    "label": "RowGen_1",
                    "x_pos": 239,
                    "y_pos": 236
                  }
                },
                "id": "77e6d535-8312-4692-8850-c129dcf921ed",
                "op": "PxRowGenerator",
                "outputs": [
                  {
                    "app_data": {
                      "datastage": {
                        "is_source_of_link": "55b884a7-9cfb-4e02-802b-82444ee95bb5"
                      },
                      "ui_data": {
                        "label": "outPort"
                      }
                    },
                    "id": "46e18367-1820-4fe8-8c7c-d8badbc76aa3",
                    "parameters": {
                      "buf_free_run": 50,
                      "disk_write_inc": 1048576,
                      "max_mem_buf_size": 3145728,
                      "queue_upper_size": 0,
                      "records": 10
                    },
                    "schema_ref": "07fed318-4370-4c95-bbbc-16d4a91421bb"
                  }
                ],
                "parameters": {
                  "input_count": 0,
                  "output_count": 1
                },
                "type": "binding"
              },
              {
                "app_data": {
                  "datastage": {
                    "inputs_order": "9e842525-7bbf-4a42-ae95-49ae325e0c87"
                  },
                  "ui_data": {
                    "image": "../graphics/palette/informix.svg",
                    "label": "informixTgt",
                    "x_pos": 690,
                    "y_pos": 229
                  }
                },
                "connection": {
                  "project_ref": "{project_id}",
                  "properties": {
                    "create_statement": "CREATE TABLE custid(customer_num int)",
                    "table_action": "append",
                    "table_name": "custid",
                    "write_mode": "insert"
                  },
                  "ref": "85193161-aa63-4cc5-80e7-7bfcdd59c438"
                },
                "id": "8b4933d9-32c0-4c40-9c47-d8791ab12baf",
                "inputs": [
                  {
                    "app_data": {
                      "datastage": {},
                      "ui_data": {
                        "label": "inPort"
                      }
                    },
                    "id": "9e842525-7bbf-4a42-ae95-49ae325e0c87",
                    "links": [
                      {
                        "app_data": {
                          "datastage": {},
                          "ui_data": {
                            "decorations": [
                              {
                                "class_name": "",
                                "hotspot": false,
                                "id": "Link_3",
                                "label": "Link_3",
                                "outline": true,
                                "path": "",
                                "position": "middle"
                              }
                            ]
                          }
                        },
                        "id": "55b884a7-9cfb-4e02-802b-82444ee95bb5",
                        "link_name": "Link_3",
                        "node_id_ref": "77e6d535-8312-4692-8850-c129dcf921ed",
                        "port_id_ref": "46e18367-1820-4fe8-8c7c-d8badbc76aa3",
                        "type_attr": "PRIMARY"
                      }
                    ],
                    "parameters": {
                      "part_coll": "part_type"
                    },
                    "schema_ref": "07fed318-4370-4c95-bbbc-16d4a91421bb"
                  }
                ],
                "op": "informix",
                "parameters": {
                  "input_count": 1,
                  "output_count": 0
                },
                "type": "binding"
              }
            ],
            "runtime_ref": "pxOsh"
          }
        ],
        "primary_pipeline": "287b2b30-95ff-4cc8-b18f-92e23c464134",
        "schemas": [
          {
            "fields": [
              {
                "app_data": {
                  "column_reference": "customer_num",
                  "is_unicode_string": false,
                  "odbc_type": "INTEGER",
                  "table_def": "Saved\\\\Link_3\\\\ifx_customer",
                  "type_code": "INT32"
                },
                "metadata": {
                  "decimal_precision": 0,
                  "decimal_scale": 0,
                  "is_key": false,
                  "is_signed": true,
                  "item_index": 0,
                  "max_length": 0,
                  "min_length": 0
                },
                "name": "customer_num",
                "nullable": false,
                "type": "integer"
              }
            ],
            "id": "07fed318-4370-4c95-bbbc-16d4a91421bb"
          }
        ],
        "version": "3.0"
      },
      "entity": {
        "data_intg_flow": {
          "mime_type": "application/json",
          "dataset": false
        }
      },
      "metadata": {
        "asset_id": "{asset_id}",
        "asset_type": "data_intg_flow",
        "create_time": "2021-04-08 17:14:08+00:00",
        "creator_id": "IBMid-xxxxxxxxxx",
        "description": "",
        "name": "{job_name}",
        "project_id": "{project_id}",
        "resource_key": "{project_id}/data_intg_flow/{job_name}",
        "size": 2712,
        "usage": {
          "access_count": 0,
          "last_access_time": "2021-04-08 17:14:10.193000+00:00",
          "last_accessor_id": "IBMid-xxxxxxxxxx",
          "last_modification_time": "2021-04-08 17:14:10.193000+00:00",
          "last_modifier_id": "IBMid-xxxxxxxxxx"
        }
      }
    }
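The pipeline JSON returned by this call can be navigated like any nested dictionary. The sketch below walks the `attachments.pipelines[].nodes[]` structure shown in the example response to list each stage's operator and label; `flow_json` is a trimmed stand-in for a real response, and `list_stages` is a hypothetical helper, not part of the SDK.

```python
# Trimmed stand-in for the JSON returned by getDatastageFlows;
# the real response carries the same pipelines/nodes nesting.
flow_json = {
    "attachments": {
        "pipelines": [
            {
                "nodes": [
                    {"op": "PxRowGenerator",
                     "app_data": {"ui_data": {"label": "RowGen_1"}}},
                    {"op": "informix",
                     "app_data": {"ui_data": {"label": "informixTgt"}}},
                ]
            }
        ]
    }
}

def list_stages(flow):
    """Return (operator, label) pairs for every node in every pipeline."""
    stages = []
    for pipeline in flow["attachments"].get("pipelines", []):
        for node in pipeline.get("nodes", []):
            label = node.get("app_data", {}).get("ui_data", {}).get("label")
            stages.append((node.get("op"), label))
    return stages

print(list_stages(flow_json))
# → [('PxRowGenerator', 'RowGen_1'), ('informix', 'informixTgt')]
```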

Update DataStage flow

Modifies a data flow in the specified project or catalog (either project_id or catalog_id must be set). All subsequent calls that use the data flow must specify the project or catalog ID in which the data flow was created.

PUT /v3/data_intg_flows/{data_intg_flow_id}
ServiceCall<DataIntgFlow> updateDatastageFlows(UpdateDatastageFlowsOptions updateDatastageFlowsOptions)
updateDatastageFlows(params)
update_datastage_flows(self,
        data_intg_flow_id: str,
        data_intg_flow_name: str,
        *,
        pipeline_flows: 'PipelineJson' = None,
        catalog_id: str = None,
        project_id: str = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the UpdateDatastageFlowsOptions.Builder to create a UpdateDatastageFlowsOptions object that contains the parameter values for the updateDatastageFlows method.

Path Parameters

  • The DataStage flow ID to use.

Query Parameters

  • The data flow name.

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

The pipeline JSON to attach.

The updateDatastageFlows options.

parameters

  • The DataStage flow ID to use.

  • The data flow name.

  • Pipeline flow to be stored.

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

  • curl -X PUT --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   --header "Content-Type: application/json;charset=utf-8"   --data '{}'   "{base_url}/v3/data_intg_flows/{data_intg_flow_id}?data_intg_flow_name={data_intg_flow_name}&project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • PipelineJson exampleFlowUpdated = PipelineFlowHelper.buildPipelineFlow(updatedFlowJson);
    UpdateDatastageFlowsOptions updateDatastageFlowsOptions = new UpdateDatastageFlowsOptions.Builder()
      .dataIntgFlowId(flowID)
      .dataIntgFlowName(flowName)
      .pipelineFlows(exampleFlowUpdated)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlow> response = datastageService.updateDatastageFlows(updateDatastageFlowsOptions).execute();
    DataIntgFlow dataIntgFlow = response.getResult();
    
    System.out.println(dataIntgFlow);
  •      const params = {
           dataIntgFlowId: assetID,
           dataIntgFlowName,
           pipelineFlows: pipelineJsonFromFile,
           projectId: projectID,
           assetCategory: 'system',
         };
         const res = await datastageService.updateDatastageFlows(params);
  • data_intg_flow = datastage_service.update_datastage_flows(
      data_intg_flow_id=createdFlowId,
      data_intg_flow_name='testFlowJob1Updated',
      pipeline_flows=UtilHelper.readJsonFileToDict('inputFiles/exampleFlowUpdated.json'),
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow, indent=2))

Response

A DataStage flow model that defines physical source(s), physical target(s) and an optional pipeline containing operations to apply to source(s).

Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

Example responses
  • {
      "attachments": [
        {
          "asset_type": "data_intg_flow",
          "attachment_id": "9081dd6b-0ab7-47b5-8233-c10c6e64509d",
          "href": "{url}/v2/assets/{asset_id}/attachments/9081dd6b-0ab7-47b5-8233-c10c6e64509d?project_id={project_id}",
          "mime": "application/json",
          "name": "data_intg_flows",
          "object_key": "data_intg_flow/{project_id}{asset_id}",
          "object_key_is_read_only": false,
          "private_url": false
        }
      ],
      "entity": {
        "data_intg_flow": {
          "dataset": false,
          "mime_type": "application/json"
        }
      },
      "metadata": {
        "asset_id": "{asset_id}",
        "asset_type": "data_intg_flow",
        "catalog_id": "{catalog_id}",
        "create_time": "2021-04-08 17:14:08+00:00",
        "creator_id": "IBMid-xxxxxxxxxx",
        "description": "",
        "href": "{url}/data_intg/v3/data_intg_flows/{asset_id}?catalog_id={catalog_id}",
        "name": "{job_name}",
        "origin_country": "us",
        "project_id": "{project_id}",
        "resource_key": "{project_id}/data_intg_flow/{job_name}",
        "size": 2712,
        "tags": [],
        "usage": {
          "access_count": 0,
          "last_access_time": "2021-04-08 17:21:33.936000+00:00",
          "last_accessor_id": "IBMid-xxxxxxxxxx"
        }
      }
    }
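A common pattern with this endpoint is a read-modify-update cycle: fetch the flow's pipeline JSON with the get call, adjust it in memory, then pass the modified dict back as pipeline_flows. The sketch below shows only the in-memory step, changing the RowGenerator's record count on a trimmed stand-in dict; `set_row_count` is a hypothetical helper, not part of the SDK.

```python
# Trimmed stand-in for a fetched pipeline flow; in practice this would
# come from get_datastage_flows(...).get_result().
flow_json = {
    "pipelines": [
        {"nodes": [
            {"op": "PxRowGenerator",
             "outputs": [{"parameters": {"records": 10}}]}
        ]}
    ]
}

def set_row_count(flow, count):
    """Set the 'records' parameter on every PxRowGenerator output port."""
    for pipeline in flow.get("pipelines", []):
        for node in pipeline.get("nodes", []):
            if node.get("op") == "PxRowGenerator":
                for port in node.get("outputs", []):
                    port.setdefault("parameters", {})["records"] = count
    return flow

updated = set_row_count(flow_json, 100)
# The updated dict would then be passed as pipeline_flows to
# update_datastage_flows along with the flow ID and name.
print(updated["pipelines"][0]["nodes"][0]["outputs"][0]["parameters"]["records"])
# → 100
```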

Clone DataStage flow

Create a DataStage flow in the specified project or catalog based on an existing DataStage flow in the same project or catalog.

POST /v3/data_intg_flows/{data_intg_flow_id}/clone
ServiceCall<DataIntgFlow> cloneDatastageFlows(CloneDatastageFlowsOptions cloneDatastageFlowsOptions)
cloneDatastageFlows(params)
clone_datastage_flows(self,
        data_intg_flow_id: str,
        *,
        catalog_id: str = None,
        project_id: str = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the CloneDatastageFlowsOptions.Builder to create a CloneDatastageFlowsOptions object that contains the parameter values for the cloneDatastageFlows method.

Path Parameters

  • The DataStage flow ID to use.

Query Parameters

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

The cloneDatastageFlows options.

parameters

  • The DataStage flow ID to use.

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

  • curl -X POST --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   "{base_url}/v3/data_intg_flows/{data_intg_flow_id}/clone?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • CloneDatastageFlowsOptions cloneDatastageFlowsOptions = new CloneDatastageFlowsOptions.Builder()
      .dataIntgFlowId(flowID)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlow> response = datastageService.cloneDatastageFlows(cloneDatastageFlowsOptions).execute();
    DataIntgFlow dataIntgFlow = response.getResult();
    
    System.out.println(dataIntgFlow);
  •      const params = {
           dataIntgFlowId: assetID,
           projectId: projectID,
         };
         const res = await datastageService.cloneDatastageFlows(params);
  • data_intg_flow = datastage_service.clone_datastage_flows(
      data_intg_flow_id=createdFlowId,
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow, indent=2))

Response

A DataStage flow model that defines physical source(s), physical target(s) and an optional pipeline containing operations to apply to source(s).

Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

Example responses
  • {
      "entity": {
        "data_intg_flow": {
          "dataset": false,
          "mime_type": "application/json"
        }
      },
      "metadata": {
        "asset_id": "{asset_id}",
        "asset_type": "data_intg_flow",
        "href": "{url}/data_intg/v3/data_intg_flows/{asset_id}",
        "name": "{job_name_copy}",
        "origin_country": "US",
        "resource_key": "{project_id}/data_intg_flow/{job_name_copy}"
      }
    }

Compile DataStage flow to generate runtime assets

Generate the runtime assets for a DataStage flow in the specified project or catalog for a specified runtime type. Either project_id or catalog_id must be specified.

POST /v3/ds_codegen/compile/{data_intg_flow_id}
ServiceCall<FlowCompileResponse> compileDatastageFlows(CompileDatastageFlowsOptions compileDatastageFlowsOptions)
compileDatastageFlows(params)
compile_datastage_flows(self,
        data_intg_flow_id: str,
        *,
        catalog_id: str = None,
        project_id: str = None,
        runtime_type: str = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the CompileDatastageFlowsOptions.Builder to create a CompileDatastageFlowsOptions object that contains the parameter values for the compileDatastageFlows method.

Path Parameters

  • The DataStage flow ID to use.

Query Parameters

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The type of the runtime to use, for example dspxosh or Spark. If not provided, the runtime type is read from the pipeline flow if available; otherwise the default of dspxosh is used.

The compileDatastageFlows options.

parameters

  • The DataStage flow ID to use.

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

  • The type of the runtime to use, for example dspxosh or Spark. If not provided, the runtime type is read from the pipeline flow if available; otherwise the default of dspxosh is used.

  • curl -X POST --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   "{base_url}/v3/ds_codegen/compile/{data_intg_flow_id}?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • CompileDatastageFlowsOptions compileDatastageFlowsOptions = new CompileDatastageFlowsOptions.Builder()
      .dataIntgFlowId(flowID)
      .projectId(projectID)
      .build();
    
    Response<FlowCompileResponse> response = datastageService.compileDatastageFlows(compileDatastageFlowsOptions).execute();
    FlowCompileResponse flowCompileResponse = response.getResult();
    
    System.out.println(flowCompileResponse);
  •      const params = {
           dataIntgFlowId: assetID,
           projectId: projectID,
         };
         const res = await datastageService.compileDatastageFlows(params);
  • flow_compile_response = datastage_service.compile_datastage_flows(
      data_intg_flow_id=createdFlowId,
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(flow_compile_response, indent=2))

Response

Describes the compile response model.

Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • Request object contains invalid information. Server is not able to process the request object.

  • Unexpected error.

Example responses
  • {
      "message": {
        "flowName": "{job_name}",
        "flow_name": "{job_name}",
        "result": "success",
        "runtime_code": "{compiled_OSH}",
        "runtime_type": "dspxosh"
      },
      "type": "ok"
    }
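Because all flows must be compiled before they run, it is useful to check the compile response before starting a run. The sketch below tests the `type` and `message.result` fields from the response shape shown above; `compile_response` is a stand-in for the dict returned by `compile_datastage_flows(...).get_result()`, and `compile_succeeded` is a hypothetical helper, not part of the SDK.

```python
# Stand-in for the dict returned by compile_datastage_flows(...).get_result().
compile_response = {
    "message": {"result": "success", "runtime_type": "dspxosh"},
    "type": "ok",
}

def compile_succeeded(resp):
    """True when the compile service reported an overall 'ok' with a
    successful result; suitable as a guard before submitting a run."""
    return (resp.get("type") == "ok"
            and resp.get("message", {}).get("result") == "success")

print(compile_succeeded(compile_response))
# → True
```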

Delete DataStage subflows

Deletes the specified data subflows in a project or catalog (either project_id or catalog_id must be set).

If deleting the data subflows will take some time to finish, a 202 response is returned and the deletion continues asynchronously.

DELETE /v3/data_intg_flows/subflows
ServiceCall<Void> deleteDatastageSubflows(DeleteDatastageSubflowsOptions deleteDatastageSubflowsOptions)
deleteDatastageSubflows(params)
delete_datastage_subflows(self,
        id: List[str],
        *,
        catalog_id: str = None,
        project_id: str = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the DeleteDatastageSubflowsOptions.Builder to create a DeleteDatastageSubflowsOptions object that contains the parameter values for the deleteDatastageSubflows method.

Query Parameters

  • The list of DataStage subflow IDs to delete.

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

The deleteDatastageSubflows options.

parameters

  • The list of DataStage subflow IDs to delete.

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

parameters

  • The list of DataStage subflow IDs to delete.

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23
  • curl -X DELETE --location --header "Authorization: Bearer {iam_token}"   "{base_url}/v3/data_intg_flows/subflows?id=[]&project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • String[] ids = new String[] {subflowID, cloneSubflowID};
    DeleteDatastageSubflowsOptions deleteDatastageSubflowsOptions = new DeleteDatastageSubflowsOptions.Builder()
      .id(Arrays.asList(ids))
      .projectId(projectID)
      .build();
    
    datastageService.deleteDatastageSubflows(deleteDatastageSubflowsOptions).execute();
  •      const params = {
           id: [assetID, cloneID],
           projectId: projectID,
         };
         const res = await datastageService.deleteDatastageSubflows(params);
  • response = datastage_service.delete_datastage_subflows(
      id=[createdSubflowId],
      project_id=config['PROJECT_ID']
    )

Response

Status Code

  • The requested operation is in progress.

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

No Sample Response

This method does not specify any sample responses.
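Because deletion can complete synchronously or continue asynchronously after a 202, a client should branch on the HTTP status code of the response (the Python SDK exposes it through `DetailedResponse.get_status_code()`). A minimal sketch of that branching logic, with a hypothetical helper name:

```python
def interpret_delete_status(status_code: int) -> str:
    """Map the documented status codes for subflow deletion to an outcome."""
    if status_code == 202:
        # Deletion was accepted and continues asynchronously.
        return "in_progress"
    if status_code in (200, 204):
        # Deletion completed synchronously.
        return "completed"
    if status_code in (401, 403):
        # Not authorized / not permitted; inspect the response body for details.
        return "not_authorized"
    return "error"
```

For example, `interpret_delete_status(response.get_status_code())` after calling `delete_datastage_subflows` tells the caller whether the subflows are already gone or the deletion is still running.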

Get metadata and lock information for DataStage subflows

Lists the metadata, entity and lock information for DataStage subflows that are contained in the specified project.

Use the following parameters to filter the results:

| Field              | Match type  | Example                        |
| ------------------ | ----------- | ------------------------------ |
| entity.name        | Equals      | entity.name=MyDataStageSubFlow |
| entity.name        | Starts with | entity.name=starts:MyData      |
| entity.description | Equals      | entity.description=movement    |
| entity.description | Starts with | entity.description=starts:data |

To sort the results, use one or more of the parameters described in the following section. If no sort key is specified, the results are sorted in descending order on metadata.create_time (that is, the most recently created data flows are returned first).

| Field | Example                                                       |
| ----- | ------------------------------------------------------------- |
| sort  | sort=+entity.name (sort by ascending name)                    |
| sort  | sort=-metadata.create_time (sort by descending creation time) |

Multiple sort keys can be specified by delimiting them with a comma. For example, to sort in descending order on create_time and then in ascending order on name, use sort=-metadata.create_time,+entity.name.

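The filter, sort, and paging values above are plain query-string parameters, so they can be assembled with the standard library before calling the endpoint. A minimal sketch (the helper name is hypothetical; the parameter names come from the tables above):

```python
from urllib.parse import urlencode

def build_list_query(project_id: str,
                     name_starts: str = None,
                     sort_keys: list = None,
                     limit: int = None) -> str:
    """Assemble the query string for GET /v3/data_intg_flows/subflows."""
    params = {"project_id": project_id}
    if name_starts:
        # The "starts:" prefix selects the starts-with match type.
        params["entity.name"] = "starts:" + name_starts
    if sort_keys:
        # Multiple sort keys are comma-delimited,
        # e.g. -metadata.create_time,+entity.name
        params["sort"] = ",".join(sort_keys)
    if limit is not None:
        params["limit"] = limit
    return urlencode(params)
```

For example, `build_list_query("bd0dbbfd-810d-4f0e-b0a9-228c328a8e23", name_starts="MyData", sort_keys=["-metadata.create_time", "+entity.name"], limit=50)` yields a URL-encoded query string ready to append after `?`.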

GET /v3/data_intg_flows/subflows
ServiceCall<DataFlowPagedCollection> listDatastageSubflows(ListDatastageSubflowsOptions listDatastageSubflowsOptions)
listDatastageSubflows(params)
list_datastage_subflows(self,
        *,
        catalog_id: str = None,
        project_id: str = None,
        sort: str = None,
        start: str = None,
        limit: int = None,
        entity_name: str = None,
        entity_description: str = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the ListDatastageSubflowsOptions.Builder to create a ListDatastageSubflowsOptions object that contains the parameter values for the listDatastageSubflows method.

Query Parameters

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The field to sort the results on, including whether to sort ascending (+) or descending (-), for example, sort=-metadata.create_time.

  • The page token indicating where to start paging from.

  • The limit of the number of items to return, for example limit=50. If not specified, a default of 100 is used.

    Possible values: value ≥ 1

    Example: 100

  • Filter results based on the specified name.

    Example: MyDataStageSubFlow

  • Filter results based on the specified description.

The listDatastageSubflows options.

parameters

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23
  • The field to sort the results on, including whether to sort ascending (+) or descending (-), for example, sort=-metadata.create_time.

  • The page token indicating where to start paging from.

  • The limit of the number of items to return, for example limit=50. If not specified, a default of 100 is used.

    Possible values: value ≥ 1

    Example: 100
  • Filter results based on the specified name.

  • Filter results based on the specified description.

parameters

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23
  • The field to sort the results on, including whether to sort ascending (+) or descending (-), for example, sort=-metadata.create_time.

  • The page token indicating where to start paging from.

  • The limit of the number of items to return, for example limit=50. If not specified, a default of 100 is used.

    Possible values: value ≥ 1

    Example: 100
  • Filter results based on the specified name.

  • Filter results based on the specified description.

  • curl -X GET --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   "{base_url}/v3/data_intg_flows/subflows?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23&limit=100"
  • ListDatastageSubflowsOptions listDatastageSubflowsOptions = new ListDatastageSubflowsOptions.Builder()
      .projectId(projectID)
      .limit(Long.valueOf("100"))
      .build();
    
    Response<DataFlowPagedCollection> response = datastageService.listDatastageSubflows(listDatastageSubflowsOptions).execute();
    DataFlowPagedCollection dataFlowPagedCollection = response.getResult();
    
    System.out.println(dataFlowPagedCollection);
  •      const params = {
           projectId: projectID,
           sort: 'name',
           limit: 100,
         };
         const res = await datastageService.listDatastageSubflows(params);
  • data_flow_paged_collection = datastage_service.list_datastage_subflows(
      project_id=config['PROJECT_ID'],
      limit=100
    ).get_result()
    
    print(json.dumps(data_flow_paged_collection, indent=2))

Response

A page from a collection of DataStage flows.


Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

Example responses
  • {
      "data_flows": [
        {
          "entity": {
            "data_intg_subflow": {
              "mime_type": "application/json",
              "dataset": false
            }
          },
          "metadata": {
            "asset_id": "{asset_id}",
            "asset_type": "data_intg_subflow",
            "create_time": "2021-04-03 15:32:55+00:00",
            "creator_id": "IBMid-xxxxxxxxx",
            "description": " ",
            "href": "{url}/data_intg/v3/data_intg_flows/subflows/{asset_id}?project_id={project_id}",
            "name": "{job_name}",
            "project_id": "{project_id}",
            "resource_key": "{project_id}/data_intg_subflow/{job_name}",
            "size": 5780,
            "usage": {
              "access_count": 0,
              "last_access_time": "2021-04-03 15:33:01.320000+00:00",
              "last_accessor_id": "IBMid-xxxxxxxxx",
              "last_modification_time": "2021-04-03 15:33:01.320000+00:00",
              "last_modifier_id": "IBMid-xxxxxxxxx"
            }
          }
        }
      ],
      "first": {
        "href": "{url}/data_intg/v3/data_intg_flows/subflows?project_id={project_id}&limit=2"
      },
      "next": {
        "href": "{url}/data_intg/v3/data_intg_flows/subflows?project_id={project_id}&limit=2&start=g1AAAADOeJzLYWBgYMpgTmHQSklKzi9KdUhJMjTUS8rVTU7WLS3WLc4vLcnQNbLQS87JL01JzCvRy0styQHpyWMBkgwNQOr____9WWCxXCAhYmRgZKhrYKJrYBxiaGplbGRlahqVaJCFZocB8XYcgNhxHrcdhlamhlGJ-llZAD4lOMI"
      },
      "total_count": 1
    }
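In the paged response above, the next.href carries the start token for the following page; a page without a next link is the last one. A minimal sketch of extracting that token so it can be passed as the start parameter of the next list call (hypothetical helper, standard library only):

```python
from urllib.parse import urlparse, parse_qs

def next_start_token(page: dict):
    """Return the 'start' token from a paged response, or None on the last page."""
    href = page.get("next", {}).get("href")
    if not href:
        return None
    query = parse_qs(urlparse(href).query)
    return query.get("start", [None])[0]
```

A paging loop would call the list endpoint, collect `data_flows`, then repeat with `start=next_start_token(page)` until the helper returns None.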

Create DataStage subflow

Creates a DataStage subflow in the specified project or catalog (either project_id or catalog_id must be set). All subsequent calls to use the data flow must specify the project or catalog ID the data flow was created in.


POST /v3/data_intg_flows/subflows
ServiceCall<DataIntgFlow> createDatastageSubflows(CreateDatastageSubflowsOptions createDatastageSubflowsOptions)
createDatastageSubflows(params)
create_datastage_subflows(self,
        data_intg_subflow_name: str,
        *,
        pipeline_flows: 'PipelineJson' = None,
        catalog_id: str = None,
        project_id: str = None,
        asset_category: str = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the CreateDatastageSubflowsOptions.Builder to create a CreateDatastageSubflowsOptions object that contains the parameter values for the createDatastageSubflows method.

Query Parameters

  • The DataStage subflow name.

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The category of the asset. Must be either SYSTEM or USER. Only a registered service can use this parameter.

    Allowable values: [system,user]

Pipeline json to be attached.

The createDatastageSubflows options.

parameters

  • The DataStage subflow name.

  • Pipeline flow to be stored.

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23
  • The category of the asset. Must be either SYSTEM or USER. Only a registered service can use this parameter.

    Allowable values: [system,user]

parameters

  • The DataStage subflow name.

  • Pipeline flow to be stored.

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23
  • The category of the asset. Must be either SYSTEM or USER. Only a registered service can use this parameter.

    Allowable values: [system,user]

  • curl -X POST --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   --header "Content-Type: application/json;charset=utf-8"   --data '{}'   "{base_url}/v3/data_intg_flows/subflows?data_intg_subflow_name={data_intg_subflow_name}&project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • PipelineJson exampleSubFlow = PipelineFlowHelper.buildPipelineFlow(subFlowJson);
    CreateDatastageSubflowsOptions createDatastageSubflowsOptions = new CreateDatastageSubflowsOptions.Builder()
      .dataIntgSubflowName(subflowName)
      .pipelineFlows(exampleSubFlow)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlow> response = datastageService.createDatastageSubflows(createDatastageSubflowsOptions).execute();
    DataIntgFlow dataIntgFlow = response.getResult();
    
    System.out.println(dataIntgFlow);
  •      const params = {
           dataIntgSubflowName: dataIntgSubFlowName,
           pipelineFlows: pipelineJsonFromFile,
           projectId: projectID,
           assetCategory: 'system',
         };
         const res = await datastageService.createDatastageSubflows(params);
  • data_intg_flow = datastage_service.create_datastage_subflows(
      data_intg_subflow_name='testSubflow1',
      pipeline_flows=UtilHelper.readJsonFileToDict('inputFiles/exampleSubflow.json'),
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow, indent=2))

Response

A DataStage flow model that defines physical source(s), physical target(s) and an optional pipeline containing operations to apply to source(s).


Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

Example responses
  • {
      "entity": {
        "data_intg_subflow": {
          "dataset": false,
          "mime_type": "application/json"
        }
      },
      "metadata": {
        "asset_id": "{asset_id}",
        "asset_type": "data_intg_subflow",
        "href": "{url}/data_intg/v3/data_intg_flows/subflows/{asset_id}",
        "name": "{subflow_name}",
        "origin_country": "US",
        "resource_key": "{project_id}/data_intg_subflow/{job_name}"
      }
    }
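As noted above, all subsequent calls must reference the subflow within the project or catalog it was created in, and the create response's metadata.asset_id is the ID those calls need. A minimal sketch of capturing it (hypothetical helper; the field names follow the example response above):

```python
def created_subflow_id(create_response: dict) -> str:
    """Extract metadata.asset_id from a create-subflow response.

    The returned ID is what later GET /v3/data_intg_flows/subflows/{id} and
    DELETE calls expect as data_intg_subflow_id.
    """
    return create_response["metadata"]["asset_id"]
```

For example, `created_subflow_id(data_intg_flow)` on the Python SDK's `get_result()` dict yields the ID to store alongside the project ID.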

Get DataStage subflow

Lists the DataStage subflow that is contained in the specified project. Attachments, metadata, and a limited number of attributes from the entity of each DataStage flow are returned.


GET /v3/data_intg_flows/subflows/{data_intg_subflow_id}
ServiceCall<DataIntgFlowJson> getDatastageSubflows(GetDatastageSubflowsOptions getDatastageSubflowsOptions)
getDatastageSubflows(params)
get_datastage_subflows(self,
        data_intg_subflow_id: str,
        *,
        catalog_id: str = None,
        project_id: str = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the GetDatastageSubflowsOptions.Builder to create a GetDatastageSubflowsOptions object that contains the parameter values for the getDatastageSubflows method.

Path Parameters

  • The DataStage subflow ID to use.

Query Parameters

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

The getDatastageSubflows options.

parameters

  • The DataStage subflow ID to use.

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

parameters

  • The DataStage subflow ID to use.

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23
  • curl -X GET --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   "{base_url}/v3/data_intg_flows/subflows/{data_intg_subflow_id}?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • GetDatastageSubflowsOptions getDatastageSubflowsOptions = new GetDatastageSubflowsOptions.Builder()
      .dataIntgSubflowId(subflowID)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlowJson> response = datastageService.getDatastageSubflows(getDatastageSubflowsOptions).execute();
    DataIntgFlowJson dataIntgFlowJson = response.getResult();
    
    System.out.println(dataIntgFlowJson);
  •      const params = {
           dataIntgSubflowId: subflow_assetID,
           projectId: projectID,
         };
         const res = await datastageService.getDatastageSubflows(params);
  • data_intg_flow_json = datastage_service.get_datastage_subflows(
      data_intg_subflow_id=createdSubflowId,
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow_json, indent=2))

Response

A pipeline JSON containing operations to apply to source(s).


Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • Unexpected error.

Example responses
  • {
      "metadata": {
        "asset_id": "7ad1e03c-5380-4bfa-8317-3604b95954c1",
        "asset_type": "data_intg_subflow",
        "catalog_id": "e35806c5-5314-4677-bb8a-416d3c628d41",
        "create_time": "2021-05-10T19:11:04.000Z",
        "creator_id": "IBMid-310000E15B",
        "name": "NSC2_Subflow",
        "origin_country": "us",
        "size": 5117,
        "project_id": "{project_id}",
        "resource_key": "baa8b445-9bea-4c7b-9930-233f57f8c629/data_intg_subflow/NSC2_Subflow",
        "description": "",
        "tags": [],
        "usage": {
          "last_access_time": "2021-05-10T19:11:05.474Z",
          "last_accessor_id": "IBMid-310000E15B",
          "access_count": 0
        }
      },
      "entity": {
        "data_intg_subflow": {
          "mime_type": "application/json",
          "dataset": false
        }
      },
      "attachments": {
        "doc_type": "pipeline",
        "version": "3.0",
        "json_schema": "https://api.dataplatform.ibm.com/schemas/common-pipeline/pipeline-flow/pipeline-flow-v3-schema.json",
        "id": "913abf38-fac2-4c56-815b-f6f21e140fa3",
        "primary_pipeline": "abd53940-0ab2-4559-978e-864800ee875a",
        "pipelines": [
          {
            "id": "abd53940-0ab2-4559-978e-864800ee875a",
            "runtime_ref": "pxOsh",
            "nodes": [
              {
                "outputs": [
                  {
                    "id": "5e514391-fc64-4ad9-b7ef-d164783d1484",
                    "app_data": {
                      "datastage": {
                        "is_source_of_link": "aaac7610-cf58-4b7c-9431-643afe952621"
                      },
                      "ui_data": {
                        "label": "outPort"
                      }
                    },
                    "schema_ref": "a479344e-7835-42b8-a5f5-7d88bc490dfe"
                  }
                ],
                "id": "602a1843-4cb2-4a28-93f3-f6d08e9910b6",
                "type": "binding",
                "app_data": {
                  "datastage": {
                    "outputs_order": "5e514391-fc64-4ad9-b7ef-d164783d1484"
                  },
                  "ui_data": {
                    "image": "",
                    "x_pos": 48,
                    "label": "Entry node 1",
                    "y_pos": 48
                  }
                }
              },
              {
                "outputs": [
                  {
                    "id": "c539d891-84a8-481e-82fa-a6c90e588e1d",
                    "app_data": {
                      "datastage": {
                        "is_source_of_link": "78c4cdcd-2f6b-474a-805f-f15d00b7cac2"
                      },
                      "ui_data": {
                        "label": "outPort"
                      }
                    },
                    "schema_ref": "d4ba6846-debd-47c5-90ec-dda663728a36"
                  }
                ],
                "inputs": [
                  {
                    "links": [
                      {
                        "node_id_ref": "602a1843-4cb2-4a28-93f3-f6d08e9910b6",
                        "type_attr": "PRIMARY",
                        "id": "aaac7610-cf58-4b7c-9431-643afe952621",
                        "link_name": "DSLink1E",
                        "app_data": {
                          "datastage": {},
                          "ui_data": {
                            "decorations": [
                              {
                                "path": "",
                                "outline": true,
                                "hotspot": false,
                                "id": "DSLink1E",
                                "label": "DSLink1E",
                                "position": "middle",
                                "class_name": ""
                              }
                            ]
                          }
                        },
                        "port_id_ref": "5e514391-fc64-4ad9-b7ef-d164783d1484"
                      }
                    ],
                    "id": "fb38f373-7b2c-4e70-8629-c4e5e05a7cff",
                    "app_data": {
                      "datastage": {},
                      "ui_data": {
                        "label": "inPort"
                      }
                    },
                    "parameters": {
                      "runtime_column_propagation": 0
                    },
                    "schema_ref": "a479344e-7835-42b8-a5f5-7d88bc490dfe"
                  }
                ],
                "id": "a2fb41ad-5088-4849-a3cc-453a6416492c",
                "type": "super_node",
                "app_data": {
                  "datastage": {
                    "inputs_order": "fb38f373-7b2c-4e70-8629-c4e5e05a7cff",
                    "outputs_order": "c539d891-84a8-481e-82fa-a6c90e588e1d"
                  },
                  "ui_data": {
                    "image": "../graphics/palette/Standardize.svg",
                    "expanded_height": 200,
                    "is_expanded": false,
                    "expanded_width": 300,
                    "x_pos": 192,
                    "label": "ContainerC3",
                    "y_pos": 48
                  }
                },
                "parameters": {
                  "output_count": 1,
                  "input_count": 1
                },
                "subflow_ref": {
                  "url": "app_defined",
                  "pipeline_id_ref": "default_pipeline_id"
                }
              },
              {
                "inputs": [
                  {
                    "links": [
                      {
                        "node_id_ref": "a2fb41ad-5088-4849-a3cc-453a6416492c",
                        "type_attr": "PRIMARY",
                        "id": "78c4cdcd-2f6b-474a-805f-f15d00b7cac2",
                        "link_name": "DSLink2E",
                        "app_data": {
                          "datastage": {},
                          "ui_data": {
                            "decorations": [
                              {
                                "path": "",
                                "outline": true,
                                "hotspot": false,
                                "id": "DSLink2E",
                                "label": "DSLink2E",
                                "position": "middle",
                                "class_name": ""
                              }
                            ]
                          }
                        },
                        "port_id_ref": "c539d891-84a8-481e-82fa-a6c90e588e1d"
                      }
                    ],
                    "id": "2a52a02d-113c-4d9b-8f36-609414be8bf5",
                    "app_data": {
                      "datastage": {},
                      "ui_data": {
                        "label": "inPort"
                      }
                    },
                    "schema_ref": "d4ba6846-debd-47c5-90ec-dda663728a36"
                  }
                ],
                "id": "547dcda4-a052-432d-ae4b-06df14e8e5b3",
                "type": "binding",
                "app_data": {
                  "datastage": {
                    "inputs_order": "2a52a02d-113c-4d9b-8f36-609414be8bf5"
                  },
                  "ui_data": {
                    "image": "",
                    "x_pos": 384,
                    "label": "Exit node 1",
                    "y_pos": 48
                  }
                }
              }
            ],
            "app_data": {
              "datastage": {
                "runtimecolumnpropagation": "true"
              },
              "ui_data": {
                "comments": []
              }
            }
          }
        ],
        "schemas": [
          {
            "id": "a479344e-7835-42b8-a5f5-7d88bc490dfe",
            "fields": [
              {
                "metadata": {
                  "item_index": 0,
                  "is_key": true,
                  "min_length": 0,
                  "decimal_scale": 0,
                  "decimal_precision": 0,
                  "max_length": 0,
                  "is_signed": true
                },
                "nullable": false,
                "name": "col1",
                "type": "integer",
                "app_data": {
                  "column_reference": "col1",
                  "odbc_type": "INTEGER",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "is_unicode_string": false,
                  "type_code": "INT32"
                }
              },
              {
                "metadata": {
                  "item_index": 0,
                  "is_key": false,
                  "min_length": 5,
                  "decimal_scale": 0,
                  "decimal_precision": 0,
                  "max_length": 5,
                  "is_signed": false
                },
                "nullable": false,
                "name": "col2",
                "type": "string",
                "app_data": {
                  "column_reference": "col2",
                  "odbc_type": "CHAR",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "is_unicode_string": false,
                  "type_code": "STRING"
                }
              },
              {
                "metadata": {
                  "item_index": 0,
                  "is_key": false,
                  "min_length": 0,
                  "decimal_scale": 0,
                  "decimal_precision": 0,
                  "max_length": 10,
                  "is_signed": false
                },
                "nullable": false,
                "name": "col3",
                "type": "string",
                "app_data": {
                  "column_reference": "col3",
                  "odbc_type": "VARCHAR",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "is_unicode_string": false,
                  "type_code": "STRING"
                }
              }
            ]
          },
          {
            "id": "d4ba6846-debd-47c5-90ec-dda663728a36",
            "fields": [
              {
                "metadata": {
                  "item_index": 0,
                  "is_key": true,
                  "min_length": 0,
                  "decimal_scale": 0,
                  "decimal_precision": 0,
                  "max_length": 0,
                  "is_signed": true
                },
                "nullable": false,
                "name": "col1",
                "type": "integer",
                "app_data": {
                  "column_reference": "col1",
                  "odbc_type": "INTEGER",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "is_unicode_string": false,
                  "type_code": "INT32"
                }
              },
              {
                "metadata": {
                  "item_index": 0,
                  "is_key": false,
                  "min_length": 5,
                  "decimal_scale": 0,
                  "decimal_precision": 0,
                  "max_length": 5,
                  "is_signed": false
                },
                "nullable": false,
                "name": "col2",
                "type": "string",
                "app_data": {
                  "column_reference": "col2",
                  "odbc_type": "CHAR",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "is_unicode_string": false,
                  "type_code": "STRING"
                }
              },
              {
                "metadata": {
                  "item_index": 0,
                  "is_key": false,
                  "min_length": 0,
                  "decimal_scale": 0,
                  "decimal_precision": 0,
                  "max_length": 10,
                  "is_signed": false
                },
                "nullable": false,
                "name": "col3",
                "type": "string",
                "app_data": {
                  "column_reference": "col3",
                  "odbc_type": "VARCHAR",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "is_unicode_string": false,
                  "type_code": "STRING"
                }
              }
            ]
          }
        ],
        "runtimes": [
          {
            "name": "pxOsh",
            "id": "pxOsh"
          }
        ],
        "app_data": {
          "datastage": {
            "version": "3.0.2"
          }
        }
      }
    }
  • {
      "metadata": {
        "asset_id": "7ad1e03c-5380-4bfa-8317-3604b95954c1",
        "asset_type": "data_intg_subflow",
        "catalog_id": "e35806c5-5314-4677-bb8a-416d3c628d41",
        "create_time": "2021-05-10T19:11:04.000Z",
        "creator_id": "IBMid-310000E15B",
        "name": "NSC2_Subflow",
        "origin_country": "us",
        "size": 5117,
        "project_id": "{project_id}",
        "resource_key": "baa8b445-9bea-4c7b-9930-233f57f8c629/data_intg_subflow/NSC2_Subflow",
        "description": "",
        "tags": [],
        "usage": {
          "last_access_time": "2021-05-10T19:11:05.474Z",
          "last_accessor_id": "IBMid-310000E15B",
          "access_count": 0
        }
      },
      "entity": {
        "data_intg_subflow": {
          "mime_type": "application/json",
          "dataset": false
        }
      },
      "attachments": {
        "doc_type": "pipeline",
        "version": "3.0",
        "json_schema": "https://api.dataplatform.ibm.com/schemas/common-pipeline/pipeline-flow/pipeline-flow-v3-schema.json",
        "id": "913abf38-fac2-4c56-815b-f6f21e140fa3",
        "primary_pipeline": "abd53940-0ab2-4559-978e-864800ee875a",
        "pipelines": [
          {
            "id": "abd53940-0ab2-4559-978e-864800ee875a",
            "runtime_ref": "pxOsh",
            "nodes": [
              {
                "outputs": [
                  {
                    "id": "5e514391-fc64-4ad9-b7ef-d164783d1484",
                    "app_data": {
                      "datastage": {
                        "is_source_of_link": "aaac7610-cf58-4b7c-9431-643afe952621"
                      },
                      "ui_data": {
                        "label": "outPort"
                      }
                    },
                    "schema_ref": "a479344e-7835-42b8-a5f5-7d88bc490dfe"
                  }
                ],
                "id": "602a1843-4cb2-4a28-93f3-f6d08e9910b6",
                "type": "binding",
                "app_data": {
                  "datastage": {
                    "outputs_order": "5e514391-fc64-4ad9-b7ef-d164783d1484"
                  },
                  "ui_data": {
                    "image": "",
                    "x_pos": 48,
                    "label": "Entry node 1",
                    "y_pos": 48
                  }
                }
              },
              {
                "outputs": [
                  {
                    "id": "c539d891-84a8-481e-82fa-a6c90e588e1d",
                    "app_data": {
                      "datastage": {
                        "is_source_of_link": "78c4cdcd-2f6b-474a-805f-f15d00b7cac2"
                      },
                      "ui_data": {
                        "label": "outPort"
                      }
                    },
                    "schema_ref": "d4ba6846-debd-47c5-90ec-dda663728a36"
                  }
                ],
                "inputs": [
                  {
                    "links": [
                      {
                        "node_id_ref": "602a1843-4cb2-4a28-93f3-f6d08e9910b6",
                        "type_attr": "PRIMARY",
                        "id": "aaac7610-cf58-4b7c-9431-643afe952621",
                        "link_name": "DSLink1E",
                        "app_data": {
                          "datastage": {},
                          "ui_data": {
                            "decorations": [
                              {
                                "path": "",
                                "outline": true,
                                "hotspot": false,
                                "id": "DSLink1E",
                                "label": "DSLink1E",
                                "position": "middle",
                                "class_name": ""
                              }
                            ]
                          }
                        },
                        "port_id_ref": "5e514391-fc64-4ad9-b7ef-d164783d1484"
                      }
                    ],
                    "id": "fb38f373-7b2c-4e70-8629-c4e5e05a7cff",
                    "app_data": {
                      "datastage": {},
                      "ui_data": {
                        "label": "inPort"
                      }
                    },
                    "parameters": {
                      "runtime_column_propagation": 0
                    },
                    "schema_ref": "a479344e-7835-42b8-a5f5-7d88bc490dfe"
                  }
                ],
                "id": "a2fb41ad-5088-4849-a3cc-453a6416492c",
                "type": "super_node",
                "app_data": {
                  "datastage": {
                    "inputs_order": "fb38f373-7b2c-4e70-8629-c4e5e05a7cff",
                    "outputs_order": "c539d891-84a8-481e-82fa-a6c90e588e1d"
                  },
                  "ui_data": {
                    "image": "../graphics/palette/Standardize.svg",
                    "expanded_height": 200,
                    "is_expanded": false,
                    "expanded_width": 300,
                    "x_pos": 192,
                    "label": "ContainerC3",
                    "y_pos": 48
                  }
                },
                "parameters": {
                  "output_count": 1,
                  "input_count": 1
                },
                "subflow_ref": {
                  "url": "app_defined",
                  "pipeline_id_ref": "default_pipeline_id"
                }
              },
              {
                "inputs": [
                  {
                    "links": [
                      {
                        "node_id_ref": "a2fb41ad-5088-4849-a3cc-453a6416492c",
                        "type_attr": "PRIMARY",
                        "id": "78c4cdcd-2f6b-474a-805f-f15d00b7cac2",
                        "link_name": "DSLink2E",
                        "app_data": {
                          "datastage": {},
                          "ui_data": {
                            "decorations": [
                              {
                                "path": "",
                                "outline": true,
                                "hotspot": false,
                                "id": "DSLink2E",
                                "label": "DSLink2E",
                                "position": "middle",
                                "class_name": ""
                              }
                            ]
                          }
                        },
                        "port_id_ref": "c539d891-84a8-481e-82fa-a6c90e588e1d"
                      }
                    ],
                    "id": "2a52a02d-113c-4d9b-8f36-609414be8bf5",
                    "app_data": {
                      "datastage": {},
                      "ui_data": {
                        "label": "inPort"
                      }
                    },
                    "schema_ref": "d4ba6846-debd-47c5-90ec-dda663728a36"
                  }
                ],
                "id": "547dcda4-a052-432d-ae4b-06df14e8e5b3",
                "type": "binding",
                "app_data": {
                  "datastage": {
                    "inputs_order": "2a52a02d-113c-4d9b-8f36-609414be8bf5"
                  },
                  "ui_data": {
                    "image": "",
                    "x_pos": 384,
                    "label": "Exit node 1",
                    "y_pos": 48
                  }
                }
              }
            ],
            "app_data": {
              "datastage": {
                "runtimecolumnpropagation": "true"
              },
              "ui_data": {
                "comments": []
              }
            }
          }
        ],
        "schemas": [
          {
            "id": "a479344e-7835-42b8-a5f5-7d88bc490dfe",
            "fields": [
              {
                "metadata": {
                  "item_index": 0,
                  "is_key": true,
                  "min_length": 0,
                  "decimal_scale": 0,
                  "decimal_precision": 0,
                  "max_length": 0,
                  "is_signed": true
                },
                "nullable": false,
                "name": "col1",
                "type": "integer",
                "app_data": {
                  "column_reference": "col1",
                  "odbc_type": "INTEGER",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "is_unicode_string": false,
                  "type_code": "INT32"
                }
              },
              {
                "metadata": {
                  "item_index": 0,
                  "is_key": false,
                  "min_length": 5,
                  "decimal_scale": 0,
                  "decimal_precision": 0,
                  "max_length": 5,
                  "is_signed": false
                },
                "nullable": false,
                "name": "col2",
                "type": "string",
                "app_data": {
                  "column_reference": "col2",
                  "odbc_type": "CHAR",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "is_unicode_string": false,
                  "type_code": "STRING"
                }
              },
              {
                "metadata": {
                  "item_index": 0,
                  "is_key": false,
                  "min_length": 0,
                  "decimal_scale": 0,
                  "decimal_precision": 0,
                  "max_length": 10,
                  "is_signed": false
                },
                "nullable": false,
                "name": "col3",
                "type": "string",
                "app_data": {
                  "column_reference": "col3",
                  "odbc_type": "VARCHAR",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "is_unicode_string": false,
                  "type_code": "STRING"
                }
              }
            ]
          },
          {
            "id": "d4ba6846-debd-47c5-90ec-dda663728a36",
            "fields": [
              {
                "metadata": {
                  "item_index": 0,
                  "is_key": true,
                  "min_length": 0,
                  "decimal_scale": 0,
                  "decimal_precision": 0,
                  "max_length": 0,
                  "is_signed": true
                },
                "nullable": false,
                "name": "col1",
                "type": "integer",
                "app_data": {
                  "column_reference": "col1",
                  "odbc_type": "INTEGER",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "is_unicode_string": false,
                  "type_code": "INT32"
                }
              },
              {
                "metadata": {
                  "item_index": 0,
                  "is_key": false,
                  "min_length": 5,
                  "decimal_scale": 0,
                  "decimal_precision": 0,
                  "max_length": 5,
                  "is_signed": false
                },
                "nullable": false,
                "name": "col2",
                "type": "string",
                "app_data": {
                  "column_reference": "col2",
                  "odbc_type": "CHAR",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "is_unicode_string": false,
                  "type_code": "STRING"
                }
              },
              {
                "metadata": {
                  "item_index": 0,
                  "is_key": false,
                  "min_length": 0,
                  "decimal_scale": 0,
                  "decimal_precision": 0,
                  "max_length": 10,
                  "is_signed": false
                },
                "nullable": false,
                "name": "col3",
                "type": "string",
                "app_data": {
                  "column_reference": "col3",
                  "odbc_type": "VARCHAR",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "is_unicode_string": false,
                  "type_code": "STRING"
                }
              }
            ]
          }
        ],
        "runtimes": [
          {
            "name": "pxOsh",
            "id": "pxOsh"
          }
        ],
        "app_data": {
          "datastage": {
            "version": "3.0.2"
          }
        }
      }
    }

Update DataStage subflow

Modifies a data subflow in the specified project or catalog (either project_id or catalog_id must be set). All subsequent calls to use the data flow must specify the project or catalog ID the data flow was created in.


PUT /v3/data_intg_flows/subflows/{data_intg_subflow_id}
ServiceCall<DataIntgFlow> updateDatastageSubflows(UpdateDatastageSubflowsOptions updateDatastageSubflowsOptions)
updateDatastageSubflows(params)
update_datastage_subflows(self,
        data_intg_subflow_id: str,
        data_intg_subflow_name: str,
        *,
        pipeline_flows: 'PipelineJson' = None,
        catalog_id: str = None,
        project_id: str = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the UpdateDatastageSubflowsOptions.Builder to create a UpdateDatastageSubflowsOptions object that contains the parameter values for the updateDatastageSubflows method.

Path Parameters

  • The DataStage subflow ID to use.

Query Parameters

  • The DataStage subflow name.

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

Pipeline json to be attached.

The updateDatastageSubflows options.

parameters

  • The DataStage subflow ID to use.

  • The DataStage subflow name.

  • Pipeline flow to be stored.

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.


  • curl -X PUT --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   --header "Content-Type: application/json;charset=utf-8"   --data '{}'   "{base_url}/v3/data_intg_flows/subflows/{data_intg_subflow_id}?data_intg_subflow_name={data_intg_subflow_name}&project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • PipelineJson exampleSubFlowUpdated = PipelineFlowHelper.buildPipelineFlow(updatedSubFlowJson);
    UpdateDatastageSubflowsOptions updateDatastageSubflowsOptions = new UpdateDatastageSubflowsOptions.Builder()
      .dataIntgSubflowId(subflowID)
      .dataIntgSubflowName(subflowName)
      .pipelineFlows(exampleSubFlowUpdated)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlow> response = datastageService.updateDatastageSubflows(updateDatastageSubflowsOptions).execute();
    DataIntgFlow dataIntgFlow = response.getResult();
    
    System.out.println(dataIntgFlow);
  • const params = {
      dataIntgSubflowId: subflow_assetID,
      dataIntgSubflowName: dataIntgSubFlowName,
      pipelineFlows: pipelineJsonFromFile,
      projectId: projectID,
      assetCategory: 'system',
    };
    const res = await datastageService.updateDatastageSubflows(params);
  • data_intg_flow = datastage_service.update_datastage_subflows(
      data_intg_subflow_id=createdSubflowId,
      data_intg_subflow_name='testSubflow1Updated',
      pipeline_flows=UtilHelper.readJsonFileToDict('inputFiles/exampleSubflowUpdated.json'),
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow, indent=2))

Response

A DataStage flow model that defines physical source(s), physical target(s) and an optional pipeline containing operations to apply to source(s).


Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

Example responses
  • {
      "attachments": [
        {
          "asset_type": "data_intg_subflow",
          "attachment_id": "9081dd6b-0ab7-47b5-8233-c10c6e64509d",
          "href": "{url}/v2/assets/{asset_id}/attachments/9081dd6b-0ab7-47b5-8233-c10c6e64509d?project_id={project_id}",
          "mime": "application/json",
          "name": "data_intg_flows",
          "object_key": "data_intg_subflow/{project_id}{asset_id}",
          "object_key_is_read_only": false,
          "private_url": false
        }
      ],
      "entity": {
        "data_intg_subflow": {
          "dataset": false,
          "mime_type": "application/json"
        }
      },
      "metadata": {
        "asset_id": "{asset_id}",
        "asset_type": "data_intg_subflow",
        "catalog_id": "{catalog_id}",
        "create_time": "2021-04-08 17:14:08+00:00",
        "creator_id": "IBMid-xxxxxxxxxx",
        "description": "",
        "href": "{url}/data_intg/v3/data_intg_flows/subflows/{asset_id}?catalog_id={catalog_id}",
        "name": "{subflow_name}",
        "origin_country": "us",
        "project_id": "{project_id}",
        "resource_key": "{project_id}/data_intg_subflow/{subflow_name}",
        "size": 2712,
        "tags": [],
        "usage": {
          "access_count": 0,
          "last_access_time": "2021-04-08 17:21:33.936000+00:00",
          "last_accessor_id": "IBMid-xxxxxxxxxx"
        }
      }
    }

Clone DataStage subflow

Create a DataStage subflow in the specified project or catalog based on an existing DataStage subflow in the same project or catalog.


POST /v3/data_intg_flows/subflows/{data_intg_subflow_id}/clone
ServiceCall<DataIntgFlow> cloneDatastageSubflows(CloneDatastageSubflowsOptions cloneDatastageSubflowsOptions)
cloneDatastageSubflows(params)
clone_datastage_subflows(self,
        data_intg_subflow_id: str,
        *,
        catalog_id: str = None,
        project_id: str = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the CloneDatastageSubflowsOptions.Builder to create a CloneDatastageSubflowsOptions object that contains the parameter values for the cloneDatastageSubflows method.

Path Parameters

  • The DataStage subflow ID to use.

Query Parameters

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

The cloneDatastageSubflows options.

parameters

  • The DataStage subflow ID to use.

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.


  • curl -X POST --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   "{base_url}/v3/data_intg_flows/subflows/{data_intg_subflow_id}/clone?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • CloneDatastageSubflowsOptions cloneDatastageSubflowsOptions = new CloneDatastageSubflowsOptions.Builder()
      .dataIntgSubflowId(subflowID)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlow> response = datastageService.cloneDatastageSubflows(cloneDatastageSubflowsOptions).execute();
    DataIntgFlow dataIntgFlow = response.getResult();
    
    System.out.println(dataIntgFlow);
  • const params = {
      dataIntgSubflowId: subflow_assetID,
      projectId: projectID,
    };
    const res = await datastageService.cloneDatastageSubflows(params);
  • data_intg_flow = datastage_service.clone_datastage_subflows(
      data_intg_subflow_id=createdSubflowId,
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow, indent=2))

Response

A DataStage flow model that defines physical source(s), physical target(s) and an optional pipeline containing operations to apply to source(s).


Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

No Sample Response

This method does not specify any sample responses.

Create V3 data flows from the attached job export file

Creates data flows from the attached job export file. This is an asynchronous call. The API call returns almost immediately, which does not necessarily imply the completion of the import request; it only means that the import request has been accepted. The status field of the import request is included in the import response object. A status of "completed", "in_progress", or "failed" indicates that the import request has completed, is in progress, or has failed, respectively. The job export file for an import request may contain one or more data flows. Unless the on_failure option is set to "stop", a completed import request may contain not only successfully imported data flows but also data flows that could not be imported.

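Because the import is asynchronous, a successful response only means the request was accepted; a client typically polls the status field until it reaches a terminal value. The sketch below shows such a loop. Here fetch_status is a hypothetical callable standing in for whatever status-retrieval call your client uses; it is not part of the documented API.

```python
import time

TERMINAL_STATUSES = {"completed", "failed"}

def wait_for_import(fetch_status, interval_s=1.0, timeout_s=300.0):
    """Poll fetch_status() until the import reaches a terminal state.

    fetch_status must return the current status string, e.g. "in_progress".
    Returns the final status, or raises TimeoutError if timeout_s elapses.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = fetch_status()
        if status in TERMINAL_STATUSES:
            return status
        time.sleep(interval_s)  # back off before checking again
    raise TimeoutError("import did not finish within %.0f seconds" % timeout_s)
```

Note that with on_failure set to "continue", a final status of "completed" can still include flows that failed to import, so the per-flow results in the import response should be inspected as well.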

POST /v3/migration/isx_imports
ServiceCall<ImportResponse> createMigration(CreateMigrationOptions createMigrationOptions)
createMigration(params)
create_migration(self,
        body: BinaryIO,
        *,
        catalog_id: str = None,
        project_id: str = None,
        on_failure: str = None,
        conflict_resolution: str = None,
        attachment_type: str = None,
        file_name: str = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the CreateMigrationOptions.Builder to create a CreateMigrationOptions object that contains the parameter values for the createMigration method.

Query Parameters

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • Action when the first import failure occurs. The default action, "continue", continues importing the remaining data flows; the "stop" action stops the import operation at the first error.

    Allowable values: [continue,stop]

    Example: continue

  • Resolution when a data flow to be imported has a name conflict with an existing data flow in the project or catalog. The default resolution, "skip", skips the data flow so that it is not imported. The "rename" resolution appends an "_Import_NNNN" suffix to the original name and uses the new name for the imported data flow, while the "replace" resolution first removes the existing data flow with the same name and then imports the new data flow. For the "rename_replace" option, when the flow name is already used, a new flow name with the suffix "_DATASTAGE_ISX_IMPORT" is used. If the name is not currently used, the imported flow is created with this name. If the new name is already used, the existing flow is removed before the imported flow is created. With the "rename_replace" option, job creation is determined as follows: if the job name is already used, a new job name with the suffix ".DataStage job" is used. If the new job name is not currently used, the job is created with this name. If the new job name is already used, the job is not created and an error is raised.

    Allowable values: [skip,rename,replace,rename_replace]

    Example: rename

  • Type of attachment. The default attachment type is "isx".

    Allowable values: [isx]

    Example: isx

  • Name of the input file, if it exists.

    Example: myFlows.isx

  • curl -X POST --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   --header "Content-Type: application/octet-stream"   --data 'createMockStream(This is a mock file.)'   "{base_url}/v3/migration/isx_imports?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23&on_failure=continue&conflict_resolution=rename&attachment_type=isx&file_name=myFlows.isx"
  • CreateMigrationOptions createMigrationOptions = new CreateMigrationOptions.Builder()
      .body(rowGenIsx)
      .projectId(projectID)
      .onFailure("continue")
      .conflictResolution("rename")
      .attachmentType("isx")
      .fileName("rowgen_peek.isx")
      .build();
    
    Response<ImportResponse> response = datastageService.createMigration(createMigrationOptions).execute();
    ImportResponse importResponse = response.getResult();
    
    System.out.println(importResponse);
  • const params = {
      body: Buffer.from(fs.readFileSync('testInput/rowgen_peek.isx')),
      projectId: projectID,
      onFailure: 'continue',
      conflictResolution: 'rename',
      attachmentType: 'isx',
      fileName: 'rowgen_peek.isx',
    };
    const res = await datastageService.createMigration(params);
  • import_response = datastage_service.create_migration(
      body=open(Path(__file__).parent / 'inputFiles/rowgen_peek.isx', "rb").read(),
      project_id=config['PROJECT_ID'],
      on_failure='continue',
      conflict_resolution='rename',
      attachment_type='isx',
      file_name='rowgen_peek.isx'
    ).get_result()
    
    print(json.dumps(import_response, indent=2))

Response

Response object of an import request.


Status Code

  • The requested import operation has been accepted. However, the import operation may or may not be completed. The status field in the import response object describes the current status of the import. The response "Location" header provides a convenient URL for retrieving the status with a GET request.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

Example responses
  • {
      "entity": {
        "conflict_resolution": "rename",
        "end_time": "2021-04-08 17:28:46.819000+00:00",
        "import_data_flows": [
          {
            "conflict_resolution_status": "import_flow_renamed",
            "end_time": "2021-04-08 17:28:46.811000+00:00",
            "id": "3fe0af3b-20a8-4bbe-86a8-6675c0b0d300",
            "job_id": "a7c0110f-920c-4a3a-aa3b-2b55ee4bdad7",
            "job_name": "rowgen_peek_Import_1617902925763.DataStage job",
            "job_type": "px_job",
            "name": "rowgen_peek_Import_1617902921560",
            "original_name": "rowgen_peek",
            "status": "completed",
            "type": "px_job"
          }
        ],
        "name": "Import_1617902920158",
        "notifications": [
          {
            "created_at": "2021-04-08 17:28:46.819000+00:00",
            "id": "752eaf78-8689-41e4-8f7d-9b2286682f6f",
            "status": "completed"
          },
          {
            "created_at": "2021-04-08 17:28:40.160000+00:00",
            "id": "38b13187-e18f-4905-a357-33d9981b06aa",
            "status": "queued"
          }
        ],
        "on_failure": "continue",
        "remaining_time": 0,
        "start_time": "2021-04-08 17:28:41.082000+00:00",
        "status": "completed",
        "tally": {
          "connections_total": 0,
          "deprecated": 0,
          "failed": 0,
          "imported": 1,
          "parameter_sets_total": 0,
          "pending": 0,
          "px_containers_total": 0,
          "renamed": 1,
          "replaced": 0,
          "sequence_jobs_total": 0,
          "skipped": 0,
          "table_definitions_total": 0,
          "total": 1,
          "unsupported": 0
        }
      },
      "metadata": {
        "created_at": "2021-04-08 17:28:40.158000+00:00",
        "created_by": "{ibm_id}",
        "id": "395d1b77-60eb-4f8f-81bd-643c20f99bfb",
        "modified_at": "2021-04-08 17:28:46.819000+00:00",
        "name": "rowgen_peek.isx",
        "project_id": "{project_id}",
        "project_name": "dstage",
        "url": "{url}/data_intg/v3/migration/isx_imports/395d1b77-60eb-4f8f-81bd-643c20f99bfb?project_id={project_id}"
      }
    }
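Because a "completed" import can still contain flows that failed to import when on_failure is "continue", it is worth inspecting the per-flow statuses in the response before relying on the result. The helper below is a minimal sketch that operates on the parsed JSON body shown in the example response; the function name `check_import_result` is illustrative and not part of the SDK.

```python
def check_import_result(import_response: dict) -> list:
    """Return the names of data flows that did not import cleanly.

    Operates on the parsed JSON body of an import response, such as
    the example response above. Raises if the import itself failed.
    """
    entity = import_response["entity"]
    if entity["status"] == "failed":
        raise RuntimeError("import request failed")
    # A completed import can still carry per-flow failures when
    # on_failure=continue, so inspect each flow's status.
    return [
        flow.get("original_name", flow.get("name"))
        for flow in entity.get("import_data_flows", [])
        if flow.get("status") != "completed"
    ]


# Example with an abridged version of the sample response above:
sample = {
    "entity": {
        "status": "completed",
        "import_data_flows": [
            {"original_name": "rowgen_peek", "status": "completed"}
        ],
        "tally": {"imported": 1, "failed": 0, "total": 1},
    }
}
print(check_import_result(sample))  # -> []
```

The `tally` object in the response gives the same information in aggregate (`failed`, `skipped`, `unsupported` counts); checking per-flow statuses additionally tells you which flows were affected.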

Cancel a previous import request

Cancel a previous import request. Use GET /v3/migration/imports/{import_id} to obtain the current status of the import, including whether it has been cancelled.


DELETE /v3/migration/isx_imports/{import_id}
ServiceCall<Void> deleteMigration(DeleteMigrationOptions deleteMigrationOptions)
deleteMigration(params)
delete_migration(self,
        import_id: str,
        *,
        catalog_id: str = None,
        project_id: str = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the DeleteMigrationOptions.Builder to create a DeleteMigrationOptions object that contains the parameter values for the deleteMigration method.

Path Parameters

  • Unique ID of the import request.

    Example: cc6dbbfd-810d-4f0e-b0a9-228c328aff29

Query Parameters

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • curl -X DELETE --location --header "Authorization: Bearer {iam_token}"   "{base_url}/v3/migration/isx_imports/{import_id}?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • DeleteMigrationOptions deleteMigrationOptions = new DeleteMigrationOptions.Builder()
      .importId(importID)
      .projectId(projectID)
      .build();
    
    datastageService.deleteMigration(deleteMigrationOptions).execute();
  • const params = {
      importId: importID,
      projectId: projectID,
    };
    const res = await datastageService.deleteMigration(params);
  • response = datastage_service.delete_migration(
      import_id=importId,
      project_id=config['PROJECT_ID']
    )

Response

Status Code

  • The import cancellation request was accepted.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • Status of the import request cannot be found. This can occur if the given import_id is not valid, or if the import completed long ago and its status information is no longer available.

  • An error occurred. See response for more information.

No Sample Response

This method does not specify any sample responses.
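Because the DELETE call only accepts the cancellation request (and returns no body), confirming the outcome requires a follow-up GET. The sketch below combines the two SDK calls documented here; the wrapper function `cancel_and_confirm` is an assumption of this sketch, not part of the SDK, and `service` stands in for a configured DataStage service client.

```python
def cancel_and_confirm(service, import_id: str, project_id: str) -> str:
    """Request cancellation of an import, then read back its status.

    `service` is expected to expose the delete_migration and
    get_migration calls shown in this section. DELETE only *accepts*
    the cancellation, so a follow-up GET is how you see the outcome.
    """
    service.delete_migration(import_id=import_id, project_id=project_id)
    body = service.get_migration(
        import_id=import_id, project_id=project_id
    ).get_result()
    return body["entity"]["status"]
```

Note that a cancellation may not take effect if the import has already completed; the returned status is authoritative.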

Get the status of a previous import request

Gets the status of an import request. The status field in the response object indicates if the given import is completed, in progress, or failed. Detailed status information about each imported data flow is also contained in the response object.


GET /v3/migration/isx_imports/{import_id}
ServiceCall<ImportResponse> getMigration(GetMigrationOptions getMigrationOptions)
getMigration(params)
get_migration(self,
        import_id: str,
        *,
        catalog_id: str = None,
        project_id: str = None,
        **kwargs
    ) -> DetailedResponse

Request

Use the GetMigrationOptions.Builder to create a GetMigrationOptions object that contains the parameter values for the getMigration method.

Path Parameters

  • Unique ID of the import request.

Query Parameters

  • The ID of the catalog to use. catalog_id or project_id is required.

  • The ID of the project to use. catalog_id or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • curl -X GET --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   "{base_url}/v3/migration/isx_imports/{import_id}?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • GetMigrationOptions getMigrationOptions = new GetMigrationOptions.Builder()
      .importId(importID)
      .projectId(projectID)
      .build();
    
    Response<ImportResponse> response = datastageService.getMigration(getMigrationOptions).execute();
    ImportResponse importResponse = response.getResult();
    
    System.out.println(importResponse);
  • const params = {
      importId: importID,
      projectId: projectID,
    };
    const res = await datastageService.getMigration(params);
  • import_response = datastage_service.get_migration(
      import_id=importId,
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(import_response, indent=2))

Response

Response object of an import request.


Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • Status of the import request cannot be found. This can occur if the given import_id is not valid, or if the import completed long ago and its status information is no longer available.

  • An error occurred. See response for more information.

Example responses
  • {
      "entity": {
        "conflict_resolution": "rename",
        "end_time": "2021-04-08 17:28:46.819000+00:00",
        "import_data_flows": [
          {
            "conflict_resolution_status": "import_flow_renamed",
            "end_time": "2021-04-08 17:28:46.811000+00:00",
            "id": "3fe0af3b-20a8-4bbe-86a8-6675c0b0d300",
            "job_id": "a7c0110f-920c-4a3a-aa3b-2b55ee4bdad7",
            "job_name": "rowgen_peek_Import_1617902925763.DataStage job",
            "job_type": "px_job",
            "name": "rowgen_peek_Import_1617902921560",
            "original_name": "rowgen_peek",
            "status": "completed",
            "type": "px_job"
          }
        ],
        "name": "Import_1617902920158",
        "notifications": [
          {
            "created_at": "2021-04-08 17:28:46.819000+00:00",
            "id": "752eaf78-8689-41e4-8f7d-9b2286682f6f",
            "status": "completed"
          },
          {
            "created_at": "2021-04-08 17:28:40.160000+00:00",
            "id": "38b13187-e18f-4905-a357-33d9981b06aa",
            "status": "queued"
          }
        ],
        "on_failure": "continue",
        "remaining_time": 0,
        "start_time": "2021-04-08 17:28:41.082000+00:00",
        "status": "completed",
        "tally": {
          "connections_total": 0,
          "deprecated": 0,
          "failed": 0,
          "imported": 1,
          "parameter_sets_total": 0,
          "pending": 0,
          "px_containers_total": 0,
          "renamed": 1,
          "replaced": 0,
          "sequence_jobs_total": 0,
          "skipped": 0,
          "table_definitions_total": 0,
          "total": 1,
          "unsupported": 0
        }
      },
      "metadata": {
        "created_at": "2021-04-08 17:28:40.158000+00:00",
        "created_by": "{ibm_id}",
        "id": "395d1b77-60eb-4f8f-81bd-643c20f99bfb",
        "modified_at": "2021-04-08 17:28:46.819000+00:00",
        "name": "rowgen_peek.isx",
        "project_id": "{project_id}",
        "project_name": "dstage",
        "url": "{url}/data_intg/v3/migration/isx_imports/395d1b77-60eb-4f8f-81bd-643c20f99bfb?project_id={project_id}"
      }
    }
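Since the import request is asynchronous, a client typically polls GET /v3/migration/isx_imports/{import_id} until the status leaves "in_progress". A minimal polling sketch using the get_migration call documented above (the `wait_for_import` helper and its timeout handling are assumptions of this sketch, not part of the SDK):

```python
import time


def wait_for_import(service, import_id: str, project_id: str,
                    poll_seconds: float = 5.0,
                    timeout: float = 600.0) -> dict:
    """Poll get_migration until the import leaves "in_progress".

    Returns the final import response body. The statuses polled here
    ("in_progress", "completed", "failed") are the ones described in
    this section; the timeout handling is this sketch's own choice.
    """
    deadline = time.monotonic() + timeout
    while True:
        body = service.get_migration(
            import_id=import_id, project_id=project_id
        ).get_result()
        if body["entity"]["status"] != "in_progress":
            return body
        if time.monotonic() >= deadline:
            raise TimeoutError("import did not finish in time")
        time.sleep(poll_seconds)
```

On completion, inspect the tally and per-flow statuses in the returned body to detect partially failed imports, as described under the create operation.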