IBM Cloud API Docs

Introduction

You can use a collection of IBM DataStage REST APIs to process, compile, and run flows. DataStage flows are design-time assets that contain data integration logic in JSON-based schemas.

Process flows
Use the processing API to manipulate data that you have read from a data source before writing it to a data target.

Compile flows
Use the compile API to compile flows. All flows must be compiled before you run them.

Run flows
Use the run API to run flows. When you run a flow, the extraction, loading, and transforming tasks that were built into the flow design are carried out.

You can use the DataStage REST APIs for both DataStage in Cloud Pak for Data as a service and DataStage in Cloud Pak for Data.

For more information, see the DataStage service documentation.

The code examples on this tab use the client library that is provided for Java.

<dependency>
<groupId>com.ibm.cloud</groupId>
<artifactId>datastage</artifactId>
<version>0.0.1</version>
</dependency>

Gradle

compile 'com.ibm.cloud:datastage:0.0.1'

GitHub

The code examples on this tab use the client library that is provided for Node.js.

Installation

npm install datastage

GitHub

The code examples on this tab use the client library that is provided for Python.

Installation

pip install --upgrade "datastage>=0.0.1"

GitHub

Authentication

Before you can call an IBM DataStage API, you must first create an IAM bearer token. Tokens support authenticated requests without embedding service credentials in every call. Each token is valid for one hour. After a token expires, you must create a new one if you want to continue using the API. The recommended way to retrieve a token programmatically is to create an API key for your IBM Cloud identity and then use the IAM token API to exchange that key for a token. For more information on authentication, see the IBM Cloud IAM documentation.

Replace {apikey} and {url} with your service credentials.

curl -X {request_method} -u "apikey:{apikey}" "{url}/v4/{method}"
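As a sketch of the same token exchange in Python, the snippet below builds the request that trades an API key for a bearer token. The endpoint and grant type are the standard IBM Cloud IAM values; `build_iam_token_request` is a hypothetical helper name, not part of any SDK.

```python
def build_iam_token_request(apikey: str) -> dict:
    """Return the pieces of the POST that exchanges an API key for a bearer token."""
    return {
        "url": "https://iam.cloud.ibm.com/identity/token",
        "headers": {"Content-Type": "application/x-www-form-urlencoded"},
        "data": {
            "grant_type": "urn:ibm:params:oauth:grant-type:apikey",
            "apikey": apikey,
        },
    }

# To send it with the requests library (network call, not executed here):
# import requests
# resp = requests.post(**build_iam_token_request("{apikey}"))
# token = resp.json()["access_token"]  # valid for one hour
```

Pass the resulting token in an `Authorization: Bearer` header on each API call, and request a new one after it expires.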

Setting client options through external configuration

Example environment variables, where <API_KEY> is your IAM API key

DATASTAGE_AUTH_TYPE=iam
DATASTAGE_URL=https://api.dataplatform.cloud.ibm.com/data_intg
DATASTAGE_APIKEY=<API_KEY>
DATASTAGE_AUTH_URL=https://iam.cloud.ibm.com/identity/token

Example of constructing the service client

import com.ibm.cloud.datastage.v3.Datastage;
Datastage service = Datastage.newInstance();

Setting client options through external configuration

Example environment variables, where <API_KEY> is your IAM API key

DATASTAGE_AUTH_TYPE=iam
DATASTAGE_URL=https://api.dataplatform.cloud.ibm.com/data_intg
DATASTAGE_APIKEY=<API_KEY>
DATASTAGE_AUTH_URL=https://iam.cloud.ibm.com/identity/token

Example of constructing the service client

const DatastageV3 = require('datastage/datastage/v3');
const datastageService = DatastageV3.newInstance({});

Setting client options through external configuration

To authenticate when using this SDK, you need an external credentials file (for example, credentials.env). In this file, define the four fields that are required to authenticate your SDK use against IAM.

Example environment variables, where <API_KEY> is your IAM API key

DATASTAGE_AUTH_TYPE=iam
DATASTAGE_URL=https://api.dataplatform.cloud.ibm.com/data_intg
DATASTAGE_APIKEY=<API_KEY>
DATASTAGE_AUTH_URL=https://iam.cloud.ibm.com/identity/token

Example of constructing the service client

import os
from datastage.datastage_v3 import DatastageV3

# define path to external credentials file
config_file = 'credentials.env'
# define a chosen service name
custom_service_name = 'DATASTAGE'

datastage_service = None

if os.path.exists(config_file):
    # set environment variable to point towards credentials file path
    os.environ['IBM_CREDENTIALS_FILE'] = config_file

    # create datastage instance using custom service name
    datastage_service = DatastageV3.new_instance(custom_service_name)

Endpoint URLs

Identify the base URL for your service instance.

IBM Cloud URLs

The base URLs come from the service instance. To find the URL, view the service credentials by clicking the name of the service in the Resource list. Use the value of the URL, and add the method path to form the complete API endpoint for your request.

https://api.dataplatform.cloud.ibm.com/data_intg

Example API request

curl --request GET --header "Content-Type: application/json" --header "Accept: application/json" --header "Authorization: Bearer <IAM token>" --url "https://api.dataplatform.cloud.ibm.com/data_intg/v3/data_intg_flows?project_id=<Project ID>&limit=10"

Replace <IAM token> and <Project ID> in this example with the values for your particular API call.
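Composing the endpoint from the base URL, the method path, and the query parameters can be sketched in Python; `build_endpoint` is a hypothetical helper, and the project ID below is the example value used elsewhere on this page.

```python
from urllib.parse import urlencode

def build_endpoint(base_url: str, path: str, **params) -> str:
    """Join the service base URL, an API method path, and query parameters."""
    query = urlencode({k: v for k, v in params.items() if v is not None})
    return f"{base_url.rstrip('/')}{path}" + (f"?{query}" if query else "")

url = build_endpoint(
    "https://api.dataplatform.cloud.ibm.com/data_intg",
    "/v3/data_intg_flows",
    project_id="bd0dbbfd-810d-4f0e-b0a9-228c328a8e23",
    limit=10,
)
```

The resulting `url` matches the `--url` value in the curl example above.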

Auditing

You can monitor API activity within your account by using IBM Cloud Activity Tracker. Whenever an API method is called, an event is generated that you can then track and audit from within Activity Tracker. The specific event type is listed for each individual method.

Error handling

DataStage uses standard HTTP response codes to indicate whether a method completed successfully. HTTP response codes in the 2xx range indicate success. A response code in the 4xx range indicates a failure in the request, and a response code in the 5xx range usually indicates an internal system error that cannot be resolved by the user. Response codes are listed with each method.
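The response-code ranges described above can be captured in a small helper; `classify_status` is a hypothetical name for illustration, not part of any SDK.

```python
def classify_status(code: int) -> str:
    """Map an HTTP response code to the outcome classes used by DataStage."""
    if 200 <= code < 300:
        return "success"        # e.g. 200 OK, 202 operation in progress
    if 400 <= code < 500:
        return "client error"   # e.g. 401 not authorized, 403 not permitted
    if 500 <= code < 600:
        return "server error"   # internal system error; not resolvable by the user
    return "unexpected"
```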

ErrorResponse

| Name | Type | Description |
| ---------------- | ------- | ------------------------------------ |
| error | string | Description of the problem. |
| code | integer | HTTP response code. |
| code_description | string | Response message. |
| warnings | string | Warnings associated with the error. |

Methods

Delete DataStage flows

Deletes the specified data flows in a project or catalog (either project_id or catalog_id must be set).

If the deletion of the data flows and their runs takes some time to finish, a 202 response is returned and the deletion continues asynchronously.

All the data flow runs that are associated with the data flows are also deleted. If a data flow is still running, it is not deleted unless the force parameter is set to true. If a data flow is still running and force is set to true, the call returns immediately with a 202 response; the data flows are deleted after their runs are stopped.


DELETE /v3/data_intg_flows
ServiceCall<Void> deleteDatastageFlows(DeleteDatastageFlowsOptions deleteDatastageFlowsOptions)
deleteDatastageFlows(params)
delete_datastage_flows(
        self,
        id: List[str],
        *,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        force: Optional[bool] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the DeleteDatastageFlowsOptions.Builder to create a DeleteDatastageFlowsOptions object that contains the parameter values for the deleteDatastageFlows method.

Query Parameters

  • The list of DataStage flow IDs to delete.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

  • Whether to stop all running data flows. Running DataStage flows must be stopped before the DataStage flows can be deleted.


  • curl -X DELETE --location --header "Authorization: Bearer {iam_token}"   "{base_url}/v3/data_intg_flows?id=[]&project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • String[] ids = new String[] {flowID, cloneFlowID};
    DeleteDatastageFlowsOptions deleteDatastageFlowsOptions = new DeleteDatastageFlowsOptions.Builder()
      .id(Arrays.asList(ids))
      .projectId(projectID)
      .build();
    
    datastageService.deleteDatastageFlows(deleteDatastageFlowsOptions).execute();
  •      const params = {
           id: [assetID, cloneID],
           projectId: projectID,
         };
         const res = await datastageService.deleteDatastageFlows(params);
  • response = datastage_service.delete_datastage_flows(
      id=createdFlowId,
      project_id=config['PROJECT_ID']
    )

Response

Status Code

  • The requested operation is in progress.

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

No Sample Response

This method does not specify any sample responses.

Get metadata for DataStage flows

Lists the metadata and entity for DataStage flows that are contained in the specified project.

Use the following parameters to filter the results:

| Field | Match type | Example |
| ------------------ | ----------- | ------------------------------- |
| entity.name | Equals | entity.name=MyDataStageFlow |
| entity.name | Starts with | entity.name=starts:MyData |
| entity.description | Equals | entity.description=movement |
| entity.description | Starts with | entity.description=starts:data |

To sort the results, use one or more of the parameters described in the following section. If no sort key is specified, the results are sorted in descending order on metadata.create_time (that is, the most recently created data flows are returned first).

| Field | Example |
| ----- | ------------------------------------------------------------- |
| sort | sort=+entity.name (sort by ascending name) |
| sort | sort=-metadata.create_time (sort by descending creation time) |

Multiple sort keys can be specified by delimiting them with a comma. For example, to sort in descending order on create_time and then in ascending order on name, use: sort=-metadata.create_time,+entity.name.
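A minimal sketch of assembling such a sort value from (field, ascending) pairs; `build_sort` is a hypothetical helper name for illustration.

```python
def build_sort(keys) -> str:
    """Build the `sort` query value from (field, ascending) pairs."""
    return ",".join(("+" if ascending else "-") + field for field, ascending in keys)

# Descending create time, then ascending name, as in the example above:
sort_value = build_sort([("metadata.create_time", False), ("entity.name", True)])
# -> "-metadata.create_time,+entity.name"
```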


GET /v3/data_intg_flows
ServiceCall<DataFlowPagedCollection> listDatastageFlows(ListDatastageFlowsOptions listDatastageFlowsOptions)
listDatastageFlows(params)
list_datastage_flows(
        self,
        *,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        sort: Optional[str] = None,
        start: Optional[str] = None,
        limit: Optional[int] = None,
        entity_name: Optional[str] = None,
        entity_description: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the ListDatastageFlowsOptions.Builder to create a ListDatastageFlowsOptions object that contains the parameter values for the listDatastageFlows method.

Query Parameters

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

  • The field to sort the results on, including whether to sort ascending (+) or descending (-), for example, sort=-metadata.create_time.

  • The page token indicating where to start paging from.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e2

  • The limit of the number of items to return on each page, for example limit=50. If not specified, a default of 100 is used. The maximum value of limit is 200.

    Example: 100

  • Filter results based on the specified name.

  • Filter results based on the specified description.
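Paging can be driven by the start token that the service returns in the next.href of a paged response. The sketch below extracts it with the standard library; `next_start_token` is a hypothetical helper name, not part of any SDK.

```python
from urllib.parse import urlparse, parse_qs

def next_start_token(next_href: str):
    """Extract the `start` page token from the `next.href` of a paged response."""
    params = parse_qs(urlparse(next_href).query)
    values = params.get("start")
    return values[0] if values else None
```

A client would call the list method, read `next.href` from the result, and pass the extracted token as the `start` parameter of the next call, stopping when `next` is absent.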


  • curl -X GET --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   "{base_url}/v3/data_intg_flows?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23&limit=100"
  • ListDatastageFlowsOptions listDatastageFlowsOptions = new ListDatastageFlowsOptions.Builder()
      .projectId(projectID)
      .limit(Long.valueOf("100"))
      .build();
    
    Response<DataFlowPagedCollection> response = datastageService.listDatastageFlows(listDatastageFlowsOptions).execute();
    DataFlowPagedCollection dataFlowPagedCollection = response.getResult();
    
    System.out.println(dataFlowPagedCollection);
  •      const params = {
           projectId: projectID,
           sort: 'name',
           limit: 100,
         };
         const res = await datastageService.listDatastageFlows(params);
  • data_flow_paged_collection = datastage_service.list_datastage_flows(
      project_id=config['PROJECT_ID'],
      limit=100
    ).get_result()
    
    print(json.dumps(data_flow_paged_collection, indent=2))

Response

A page from a collection of DataStage flows.


Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

Example responses
  • {
      "data_flows": [
        {
          "entity": {
            "description": " ",
            "name": "{job_name}",
            "rov": {
              "mode": 0
            }
          },
          "metadata": {
            "asset_id": "{asset_id}",
            "asset_type": "data_intg_flow",
            "create_time": "2021-04-03 15:32:55+00:00",
            "creator_id": "IBMid-xxxxxxxxx",
            "description": " ",
            "href": "{url}/data_intg/v3/data_intg_flows/{asset_id}?project_id={project_id}",
            "name": "{job_name}",
            "project_id": "{project_id}",
            "resource_key": "{project_id}/data_intg_flow/{job_name}",
            "size": 5780,
            "usage": {
              "access_count": 0,
              "last_access_time": "2021-04-03 15:33:01.320000+00:00",
              "last_accessor_id": "IBMid-xxxxxxxxx",
              "last_modification_time": "2021-04-03 15:33:01.320000+00:00",
              "last_modifier_id": "IBMid-xxxxxxxxx"
            }
          }
        }
      ],
      "first": {
        "href": "{url}/data_intg/v3/data_intg_flows?project_id={project_id}&limit=2"
      },
      "next": {
        "href": "{url}/data_intg/v3/data_intg_flows?project_id={project_id}&limit=2&start=g1AAAADOeJzLYWBgYMpgTmHQSklKzi9KdUhJMjTUS8rVTU7WLS3WLc4vLcnQNbLQS87JL01JzCvRy0styQHpyWMBkgwNQOr____9WWCxXCAhYmRgZKhrYKJrYBxiaGplbGRlahqVaJCFZocB8XYcgNhxHrcdhlamhlGJ-llZAD4lOMI"
      },
      "total_count": 135
    }

Create DataStage flow

Creates a DataStage flow in the specified project or catalog (either project_id or catalog_id must be set). All subsequent calls to use the data flow must specify the project or catalog ID the data flow was created in.


POST /v3/data_intg_flows
ServiceCall<DataIntgFlow> createDatastageFlows(CreateDatastageFlowsOptions createDatastageFlowsOptions)
createDatastageFlows(params)
create_datastage_flows(
        self,
        data_intg_flow_name: str,
        *,
        pipeline_flows: Optional['PipelineJson'] = None,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        directory_asset_id: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the CreateDatastageFlowsOptions.Builder to create a CreateDatastageFlowsOptions object that contains the parameter values for the createDatastageFlows method.

Query Parameters

  • The data flow name.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

  • The directory asset ID.

The pipeline JSON to attach as the request body.


  • curl -X POST --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   --header "Content-Type: application/json;charset=utf-8"   --data '{}'   "{base_url}/v3/data_intg_flows?data_intg_flow_name={data_intg_flow_name}&project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • PipelineJson exampleFlow = PipelineFlowHelper.buildPipelineFlow(flowJson);
    CreateDatastageFlowsOptions createDatastageFlowsOptions = new CreateDatastageFlowsOptions.Builder()
      .dataIntgFlowName(flowName)
      .pipelineFlows(exampleFlow)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlow> response = datastageService.createDatastageFlows(createDatastageFlowsOptions).execute();
    DataIntgFlow dataIntgFlow = response.getResult();
    System.out.println(dataIntgFlow);
  •      const pipelineJsonFromFile = JSON.parse(fs.readFileSync('testInput/rowgen_peek.json', 'utf-8'));
         const params = {
           dataIntgFlowName,
           pipelineFlows: pipelineJsonFromFile,
           projectId: projectID,
           assetCategory: 'system',
         };
         const res = await datastageService.createDatastageFlows(params);
  • data_intg_flow = datastage_service.create_datastage_flows(
      data_intg_flow_name='testFlowJob1',
      pipeline_flows=UtilHelper.readJsonFileToDict('inputFiles/exampleFlow.json'),
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow, indent=2))

Response

A DataStage flow model that defines physical source(s), physical target(s) and an optional pipeline containing operations to apply to source(s).


Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

Example responses
  • {
      "entity": {
        "data_intg_flow": {
          "dataset": false,
          "mime_type": "application/json"
        }
      },
      "metadata": {
        "asset_id": "{asset_id}",
        "asset_type": "data_intg_flow",
        "href": "{url}/data_intg/v3/data_intg_flows/{asset_id}",
        "name": "{job_name}",
        "origin_country": "US",
        "resource_key": "{project_id}/data_intg_flow/{job_name}"
      }
    }

Get DataStage flow

Gets the DataStage flow that is contained in the specified project. Attachments, metadata, and a limited number of attributes from the entity of the DataStage flow are returned.


GET /v3/data_intg_flows/{data_intg_flow_id}
ServiceCall<DataIntgFlowJson> getDatastageFlows(GetDatastageFlowsOptions getDatastageFlowsOptions)
getDatastageFlows(params)
get_datastage_flows(
        self,
        data_intg_flow_id: str,
        *,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the GetDatastageFlowsOptions.Builder to create a GetDatastageFlowsOptions object that contains the parameter values for the getDatastageFlows method.

Path Parameters

  • The DataStage flow ID to use.

Query Parameters

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

The getDatastageFlows options.

parameters

  • The DataStage flow ID to use.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

parameters

  • The DataStage flow ID to use.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f
  • curl -X GET --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   "{base_url}/v3/data_intg_flows/{data_intg_flow_id}?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • GetDatastageFlowsOptions getDatastageFlowsOptions = new GetDatastageFlowsOptions.Builder()
      .dataIntgFlowId(flowID)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlowJson> response = datastageService.getDatastageFlows(getDatastageFlowsOptions).execute();
    DataIntgFlowJson dataIntgFlowJson = response.getResult();
    
    System.out.println(dataIntgFlowJson);
  •      const params = {
           dataIntgFlowId: assetID,
           projectId: projectID,
         };
         const res = await datastageService.getDatastageFlows(params);
  • data_intg_flow_json = datastage_service.get_datastage_flows(
      data_intg_flow_id=createdFlowId,
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow_json, indent=2))

Response

A pipeline JSON containing operations to apply to source(s).

Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • Unexpected error.

Example responses
  • {
      "attachments": {
        "app_data": {
          "datastage": {
            "external_parameters": []
          }
        },
        "doc_type": "pipeline",
        "id": "98cc1fa0-0fd8-4d55-9b27-d477096b4b37",
        "json_schema": "{url}/schemas/common-pipeline/pipeline-flow/pipeline-flow-v3-schema.json",
        "pipelines": [
          {
            "app_data": {
              "datastage": {
                "runtime_column_propagation": "false"
              },
              "ui_data": {
                "comments": []
              }
            },
            "id": "287b2b30-95ff-4cc8-b18f-92e23c464134",
            "nodes": [
              {
                "app_data": {
                  "datastage": {
                    "outputs_order": "46e18367-1820-4fe8-8c7c-d8badbc76aa3"
                  },
                  "ui_data": {
                    "image": "../graphics/palette/PxRowGenerator.svg",
                    "label": "RowGen_1",
                    "x_pos": 239,
                    "y_pos": 236
                  }
                },
                "id": "77e6d535-8312-4692-8850-c129dcf921ed",
                "op": "PxRowGenerator",
                "outputs": [
                  {
                    "app_data": {
                      "datastage": {
                        "is_source_of_link": "55b884a7-9cfb-4e02-802b-82444ee95bb5"
                      },
                      "ui_data": {
                        "label": "outPort"
                      }
                    },
                    "id": "46e18367-1820-4fe8-8c7c-d8badbc76aa3",
                    "parameters": {
                      "buf_free_run": 50,
                      "disk_write_inc": 1048576,
                      "max_mem_buf_size": 3145728,
                      "queue_upper_size": 0,
                      "records": 10
                    },
                    "schema_ref": "07fed318-4370-4c95-bbbc-16d4a91421bb"
                  }
                ],
                "parameters": {
                  "input_count": 0,
                  "output_count": 1
                },
                "type": "binding"
              },
              {
                "app_data": {
                  "datastage": {
                    "inputs_order": "9e842525-7bbf-4a42-ae95-49ae325e0c87"
                  },
                  "ui_data": {
                    "image": "../graphics/palette/informix.svg",
                    "label": "informixTgt",
                    "x_pos": 690,
                    "y_pos": 229
                  }
                },
                "connection": {
                  "project_ref": "{project_id}",
                  "properties": {
                    "create_statement": "CREATE TABLE custid(customer_num int)",
                    "table_action": "append",
                    "table_name": "custid",
                    "write_mode": "insert"
                  },
                  "ref": "85193161-aa63-4cc5-80e7-7bfcdd59c438"
                },
                "id": "8b4933d9-32c0-4c40-9c47-d8791ab12baf",
                "inputs": [
                  {
                    "app_data": {
                      "datastage": {},
                      "ui_data": {
                        "label": "inPort"
                      }
                    },
                    "id": "9e842525-7bbf-4a42-ae95-49ae325e0c87",
                    "links": [
                      {
                        "app_data": {
                          "datastage": {},
                          "ui_data": {
                            "decorations": [
                              {
                                "class_name": "",
                                "hotspot": false,
                                "id": "Link_3",
                                "label": "Link_3",
                                "outline": true,
                                "path": "",
                                "position": "middle"
                              }
                            ]
                          }
                        },
                        "id": "55b884a7-9cfb-4e02-802b-82444ee95bb5",
                        "link_name": "Link_3",
                        "node_id_ref": "77e6d535-8312-4692-8850-c129dcf921ed",
                        "port_id_ref": "46e18367-1820-4fe8-8c7c-d8badbc76aa3",
                        "type_attr": "PRIMARY"
                      }
                    ],
                    "parameters": {
                      "part_coll": "part_type"
                    },
                    "schema_ref": "07fed318-4370-4c95-bbbc-16d4a91421bb"
                  }
                ],
                "op": "informix",
                "parameters": {
                  "input_count": 1,
                  "output_count": 0
                },
                "type": "binding"
              }
            ],
            "runtime_ref": "pxOsh"
          }
        ],
        "primary_pipeline": "287b2b30-95ff-4cc8-b18f-92e23c464134",
        "schemas": [
          {
            "fields": [
              {
                "app_data": {
                  "column_reference": "customer_num",
                  "is_unicode_string": false,
                  "odbc_type": "INTEGER",
                  "table_def": "Saved\\\\Link_3\\\\ifx_customer",
                  "type_code": "INT32"
                },
                "metadata": {
                  "decimal_precision": 0,
                  "decimal_scale": 0,
                  "is_key": false,
                  "is_signed": true,
                  "item_index": 0,
                  "max_length": 0,
                  "min_length": 0
                },
                "name": "customer_num",
                "nullable": false,
                "type": "integer"
              }
            ],
            "id": "07fed318-4370-4c95-bbbc-16d4a91421bb"
          }
        ],
        "version": "3.0"
      },
      "entity": {
        "description": "",
        "name": "{job_name}",
        "rov": {
          "mode": 0
        }
      },
      "metadata": {
        "asset_id": "{asset_id}",
        "asset_type": "data_intg_flow",
        "create_time": "2021-04-08 17:14:08+00:00",
        "creator_id": "IBMid-xxxxxxxxxx",
        "description": "",
        "name": "{job_name}",
        "project_id": "{project_id}",
        "resource_key": "{project_id}/data_intg_flow/{job_name}",
        "size": 2712,
        "usage": {
          "access_count": 0,
          "last_access_time": "2021-04-08 17:14:10.193000+00:00",
          "last_accessor_id": "IBMid-xxxxxxxxxx",
          "last_modification_time": "2021-04-08 17:14:10.193000+00:00",
          "last_modifier_id": "IBMid-xxxxxxxxxx"
        }
      }
    }

Update DataStage flow

Modifies a data flow in the specified project or catalog (either project_id or catalog_id must be set). All subsequent calls that use the data flow must specify the project or catalog ID that the data flow was created in.

PUT /v3/data_intg_flows/{data_intg_flow_id}
ServiceCall<DataIntgFlow> updateDatastageFlows(UpdateDatastageFlowsOptions updateDatastageFlowsOptions)
updateDatastageFlows(params)
update_datastage_flows(
        self,
        data_intg_flow_id: str,
        data_intg_flow_name: str,
        *,
        pipeline_flows: Optional['PipelineJson'] = None,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        directory_asset_id: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the UpdateDatastageFlowsOptions.Builder to create an UpdateDatastageFlowsOptions object that contains the parameter values for the updateDatastageFlows method.

Path Parameters

  • The DataStage flow ID to use.

Query Parameters

  • The data flow name.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

  • The directory asset ID.

The pipeline JSON to be attached.

The updateDatastageFlows options.

parameters

  • The DataStage flow ID to use.

  • The data flow name.

  • Pipeline flow to be stored.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f
  • The directory asset ID.

parameters

  • The DataStage flow ID to use.

  • The data flow name.

  • Pipeline flow to be stored.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f
  • The directory asset ID.

  • curl -X PUT --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   --header "Content-Type: application/json;charset=utf-8"   --data '{}'   "{base_url}/v3/data_intg_flows/{data_intg_flow_id}?data_intg_flow_name={data_intg_flow_name}&project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • PipelineJson exampleFlowUpdated = PipelineFlowHelper.buildPipelineFlow(updatedFlowJson);
    UpdateDatastageFlowsOptions updateDatastageFlowsOptions = new UpdateDatastageFlowsOptions.Builder()
      .dataIntgFlowId(flowID)
      .dataIntgFlowName(flowName)
      .pipelineFlows(exampleFlowUpdated)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlow> response = datastageService.updateDatastageFlows(updateDatastageFlowsOptions).execute();
    DataIntgFlow dataIntgFlow = response.getResult();
    
    System.out.println(dataIntgFlow);
  •      const params = {
           dataIntgFlowId: assetID,
           dataIntgFlowName,
           pipelineFlows: pipelineJsonFromFile,
           projectId: projectID,
           assetCategory: 'system',
         };
         const res = await datastageService.updateDatastageFlows(params);
  • data_intg_flow = datastage_service.update_datastage_flows(
      data_intg_flow_id=createdFlowId,
      data_intg_flow_name='testFlowJob1Updated',
      pipeline_flows=UtilHelper.readJsonFileToDict('inputFiles/exampleFlowUpdated.json'),
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow, indent=2))

Response

A DataStage flow model that defines physical source(s), physical target(s) and an optional pipeline containing operations to apply to source(s).

Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

Example responses
  • {
      "attachments": [
        {
          "asset_type": "data_intg_flow",
          "attachment_id": "9081dd6b-0ab7-47b5-8233-c10c6e64509d",
          "href": "{url}/v2/assets/{asset_id}/attachments/9081dd6b-0ab7-47b5-8233-c10c6e64509d?project_id={project_id}",
          "mime": "application/json",
          "name": "data_intg_flows",
          "object_key": "data_intg_flow/{project_id}{asset_id}",
          "object_key_is_read_only": false,
          "private_url": false
        }
      ],
      "entity": {
        "data_intg_flow": {
          "dataset": false,
          "mime_type": "application/json"
        }
      },
      "metadata": {
        "asset_id": "{asset_id}",
        "asset_type": "data_intg_flow",
        "catalog_id": "{catalog_id}",
        "create_time": "2021-04-08 17:14:08+00:00",
        "creator_id": "IBMid-xxxxxxxxxx",
        "description": "",
        "href": "{url}/data_intg/v3/data_intg_flows/{asset_id}?catalog_id={catalog_id}",
        "name": "{job_name}",
        "origin_country": "us",
        "project_id": "{project_id}",
        "resource_key": "{project_id}/data_intg_flow/{job_name}",
        "size": 2712,
        "tags": [],
        "usage": {
          "access_count": 0,
          "last_access_time": "2021-04-08 17:21:33.936000+00:00",
          "last_accessor_id": "IBMid-xxxxxxxxxx"
        }
      }
    }

Update DataStage flow attributes

Modifies attributes of a DataStage flow in the specified project or catalog (either project_id or catalog_id must be set).

PUT /v3/data_intg_flows/{data_intg_flow_id}/attributes
ServiceCall<DataIntgFlow> patchAttributesDatastageFlow(PatchAttributesDatastageFlowOptions patchAttributesDatastageFlowOptions)
patchAttributesDatastageFlow(params)
patch_attributes_datastage_flow(
        self,
        data_intg_flow_id: str,
        *,
        description: Optional[str] = None,
        directory_asset_id: Optional[str] = None,
        name: Optional[str] = None,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the PatchAttributesDatastageFlowOptions.Builder to create a PatchAttributesDatastageFlowOptions object that contains the parameter values for the patchAttributesDatastageFlow method.

Path Parameters

  • The DataStage flow ID to use.

Query Parameters

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

The attributes of the flow to modify.

The patchAttributesDatastageFlow options.

parameters

  • The DataStage flow ID to use.

  • The description of the asset.

  • The directory asset ID of the asset.

  • The name of the asset.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

parameters

  • The DataStage flow ID to use.

  • The description of the asset.

  • The directory asset ID of the asset.

  • The name of the asset.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

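Unlike the other operations in this section, no curl or SDK samples are shown for this endpoint. As a rough sketch (not the official SDK call), the raw HTTP request can be assembled with only the Python standard library; the `{base_url}`, `{iam_token}`, and flow ID values below are placeholders that you must replace with values from your own service instance:

```python
import json
import urllib.parse


def build_patch_attributes_request(base_url, flow_id, project_id,
                                   name=None, description=None,
                                   directory_asset_id=None):
    """Assemble the URL, headers, and JSON body for
    PUT /v3/data_intg_flows/{data_intg_flow_id}/attributes.

    Sketch only: the caller supplies base_url, flow_id, and project_id.
    """
    query = urllib.parse.urlencode({"project_id": project_id})
    url = f"{base_url}/v3/data_intg_flows/{flow_id}/attributes?{query}"
    # Include only the attributes that you want to modify.
    attrs = {"name": name, "description": description,
             "directory_asset_id": directory_asset_id}
    body = {k: v for k, v in attrs.items() if v is not None}
    headers = {
        "Authorization": "Bearer {iam_token}",  # replace with a real IAM token
        "Content-Type": "application/json;charset=utf-8",
        "Accept": "application/json;charset=utf-8",
    }
    return url, headers, json.dumps(body)


url, headers, payload = build_patch_attributes_request(
    "{base_url}",
    "{data_intg_flow_id}",
    "bd0dbbfd-810d-4f0e-b0a9-228c328a8e23",
    name="exampleFlowRenamed",
)
```

Send the result as a PUT request with any HTTP client; the body carries only the attributes being modified, matching the parameter list above.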
Response

A DataStage flow model that defines physical source(s), physical target(s) and an optional pipeline containing operations to apply to source(s).

Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

Example responses
  • {
      "attachments": [
        {
          "asset_type": "data_intg_flow",
          "attachment_id": "9081dd6b-0ab7-47b5-8233-c10c6e64509d",
          "href": "{url}/v2/assets/{asset_id}/attachments/9081dd6b-0ab7-47b5-8233-c10c6e64509d?project_id={project_id}",
          "mime": "application/json",
          "name": "data_intg_flows",
          "object_key": "data_intg_flow/{project_id}{asset_id}",
          "object_key_is_read_only": false,
          "private_url": false
        }
      ],
      "entity": {
        "data_intg_flow": {
          "dataset": false,
          "mime_type": "application/json"
        }
      },
      "metadata": {
        "asset_id": "{asset_id}",
        "asset_type": "data_intg_flow",
        "catalog_id": "{catalog_id}",
        "create_time": "2021-04-08 17:14:08+00:00",
        "creator_id": "IBMid-xxxxxxxxxx",
        "description": "",
        "href": "{url}/data_intg/v3/data_intg_flows/{asset_id}?catalog_id={catalog_id}",
        "name": "{job_name}",
        "origin_country": "us",
        "project_id": "{project_id}",
        "resource_key": "{project_id}/data_intg_flow/{job_name}",
        "size": 2712,
        "tags": [],
        "usage": {
          "access_count": 0,
          "last_access_time": "2021-04-08 17:21:33.936000+00:00",
          "last_accessor_id": "IBMid-xxxxxxxxxx"
        }
      }
    }

Clone DataStage flow

Creates a DataStage flow in the specified project, catalog, or space, based on an existing DataStage flow in the same project, catalog, or space.

POST /v3/data_intg_flows/{data_intg_flow_id}/clone
ServiceCall<DataIntgFlow> cloneDatastageFlows(CloneDatastageFlowsOptions cloneDatastageFlowsOptions)
cloneDatastageFlows(params)
clone_datastage_flows(
        self,
        data_intg_flow_id: str,
        *,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        directory_asset_id: Optional[str] = None,
        data_intg_flow_name: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the CloneDatastageFlowsOptions.Builder to create a CloneDatastageFlowsOptions object that contains the parameter values for the cloneDatastageFlows method.

Path Parameters

  • data_intg_flow_id: The ID of the DataStage flow to use.

Query Parameters

  • catalog_id: The ID of the catalog to use. One of catalog_id, project_id, or space_id is required.

  • project_id: The ID of the project to use. One of catalog_id, project_id, or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • space_id: The ID of the space to use. One of catalog_id, project_id, or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

  • directory_asset_id: The directory asset ID.

  • data_intg_flow_name: The name for the new DataStage flow.
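Every method in this collection identifies its container through catalog_id, project_id, or space_id, and at least one of them must be supplied. As a sketch, a client-side check before issuing a request could look like this in Python (container_params is a hypothetical helper, not part of the datastage SDKs):

```python
def container_params(catalog_id=None, project_id=None, space_id=None):
    """Return the container query parameters that were supplied.

    A hypothetical client-side helper, not part of the datastage SDKs:
    the API requires catalog_id or project_id or space_id on every call,
    so reject requests that supply none of them.
    """
    provided = {name: value for name, value in (
        ("catalog_id", catalog_id),
        ("project_id", project_id),
        ("space_id", space_id),
    ) if value is not None}
    if not provided:
        raise ValueError("catalog_id, project_id, or space_id is required")
    return provided
```

The returned dictionary can be merged directly into the query parameters of any of the calls below.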


  • curl -X POST --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   "{base_url}/v3/data_intg_flows/{data_intg_flow_id}/clone?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • CloneDatastageFlowsOptions cloneDatastageFlowsOptions = new CloneDatastageFlowsOptions.Builder()
      .dataIntgFlowId(flowID)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlow> response = datastageService.cloneDatastageFlows(cloneDatastageFlowsOptions).execute();
    DataIntgFlow dataIntgFlow = response.getResult();
    
    System.out.println(dataIntgFlow);
  •      const params = {
           dataIntgFlowId: assetID,
           projectId: projectID,
         };
         const res = await datastageService.cloneDatastageFlows(params);
  • data_intg_flow = datastage_service.clone_datastage_flows(
      data_intg_flow_id=createdFlowId,
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow, indent=2))

Response

A DataStage flow model that defines physical source(s), physical target(s) and an optional pipeline containing operations to apply to source(s).

Status Code

  • 200 The requested operation completed successfully.

  • 401 You are not authorized to access the service. See response for more information.

  • 403 You are not permitted to perform this action. See response for more information.

  • 500 An error occurred. See response for more information.

Example responses
  • {
      "entity": {
        "data_intg_flow": {
          "dataset": false,
          "mime_type": "application/json"
        }
      },
      "metadata": {
        "asset_id": "{asset_id}",
        "asset_type": "data_intg_flow",
        "href": "{url}/data_intg/v3/data_intg_flows/{asset_id}",
        "name": "{job_name_copy}",
        "origin_country": "US",
        "resource_key": "{project_id}/data_intg_flow/{job_name_copy}"
      }
    }

Compile DataStage flow to generate runtime assets

Generate the runtime assets for a DataStage flow in the specified project, catalog, or space for a specified runtime type. One of project_id, catalog_id, or space_id must be specified.

POST /v3/ds_codegen/compile/{data_intg_flow_id}
ServiceCall<FlowCompileResponse> compileDatastageFlows(CompileDatastageFlowsOptions compileDatastageFlowsOptions)
compileDatastageFlows(params)
compile_datastage_flows(
        self,
        data_intg_flow_id: str,
        *,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        runtime_type: Optional[str] = None,
        enable_sql_pushdown: Optional[bool] = None,
        enable_async_compile: Optional[bool] = None,
        enable_pushdown_source: Optional[bool] = None,
        enable_push_processing_to_source: Optional[bool] = None,
        enable_push_join_to_source: Optional[bool] = None,
        enable_pushdown_target: Optional[bool] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the CompileDatastageFlowsOptions.Builder to create a CompileDatastageFlowsOptions object that contains the parameter values for the compileDatastageFlows method.

Path Parameters

  • data_intg_flow_id: The ID of the DataStage flow to use.

Query Parameters

  • catalog_id: The ID of the catalog to use. One of catalog_id, project_id, or space_id is required.

  • project_id: The ID of the project to use. One of catalog_id, project_id, or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • space_id: The ID of the space to use. One of catalog_id, project_id, or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

  • runtime_type: The type of runtime to use, for example dspxosh or Spark. If not provided, the runtime type is read from the pipeline flow if available; otherwise the default of dspxosh is used.

  • enable_sql_pushdown: Whether to enable SQL pushdown code generation. When this flag is set to true, enable_pushdown_source and enable_pushdown_target are also set to true unless they are explicitly specified.

  • enable_async_compile: Whether to compile the flow asynchronously. When set to true, the compile request is queued and the response is returned immediately with a status of "Compiling". To check the compile status, call the get-compile-status API.

  • enable_pushdown_source: Whether to push SQL to source connectors. Setting this flag to true automatically sets enable_sql_pushdown to true, even if the latter is unspecified or explicitly set to false. It also sets enable_push_processing_to_source and enable_push_join_to_source to true unless they are explicitly specified.

  • enable_push_processing_to_source: Whether to push processing stages to source connectors. Setting this flag to true automatically sets enable_pushdown_source to true, even if the latter is unspecified or explicitly set to false.

  • enable_push_join_to_source: Whether to push join and lookup stages to source connectors. Setting this flag to true automatically sets enable_pushdown_source to true, even if the latter is unspecified or explicitly set to false.

  • enable_pushdown_target: Whether to push SQL to target connectors. Setting this flag to true automatically sets enable_sql_pushdown to true, even if the latter is unspecified or explicitly set to false.
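The pushdown flags above default off each other. The interactions can be sketched as plain Python (resolve_pushdown_flags is illustrative only; the service applies these rules server-side):

```python
def resolve_pushdown_flags(**flags):
    """Apply the documented defaulting rules between the compile pushdown
    flags. Illustrative only: the service applies these rules server-side,
    and unspecified flags are modeled here as None."""
    names = ("enable_sql_pushdown", "enable_pushdown_source",
             "enable_pushdown_target", "enable_push_processing_to_source",
             "enable_push_join_to_source")
    f = {name: flags.get(name) for name in names}
    # Pushing processing or join/lookup stages to source forces source pushdown on.
    if f["enable_push_processing_to_source"] or f["enable_push_join_to_source"]:
        f["enable_pushdown_source"] = True
    # Source pushdown forces SQL pushdown on (even if explicitly false) and
    # defaults its two sub-flags to true when they were not specified.
    if f["enable_pushdown_source"]:
        f["enable_sql_pushdown"] = True
        if f["enable_push_processing_to_source"] is None:
            f["enable_push_processing_to_source"] = True
        if f["enable_push_join_to_source"] is None:
            f["enable_push_join_to_source"] = True
    # Target pushdown likewise forces SQL pushdown on.
    if f["enable_pushdown_target"]:
        f["enable_sql_pushdown"] = True
    # SQL pushdown defaults source and target pushdown to true when unspecified.
    if f["enable_sql_pushdown"]:
        if f["enable_pushdown_source"] is None:
            f["enable_pushdown_source"] = True
        if f["enable_pushdown_target"] is None:
            f["enable_pushdown_target"] = True
    return f
```

For example, requesting only enable_push_join_to_source=true cascades into source pushdown and SQL pushdown both being enabled, while an explicit enable_pushdown_source=false is preserved even when enable_sql_pushdown is true.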


  • curl -X POST --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   "{base_url}/v3/ds_codegen/compile/{data_intg_flow_id}?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • CompileDatastageFlowsOptions compileDatastageFlowsOptions = new CompileDatastageFlowsOptions.Builder()
      .dataIntgFlowId(flowID)
      .projectId(projectID)
      .build();
    
    Response<FlowCompileResponse> response = datastageService.compileDatastageFlows(compileDatastageFlowsOptions).execute();
    FlowCompileResponse flowCompileResponse = response.getResult();
    
    System.out.println(flowCompileResponse);
  •      const params = {
           dataIntgFlowId: assetID,
           projectId: projectID,
         };
         const res = await datastageService.compileDatastageFlows(params);
  • flow_compile_response = datastage_service.compile_datastage_flows(
      data_intg_flow_id=createdFlowId,
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(flow_compile_response, indent=2))

Response

Describes the compile response model.

Status Code

  • 200 The requested operation completed successfully.

  • 401 You are not authorized to access the service. See response for more information.

  • 403 You are not permitted to perform this action. See response for more information.

  • 400 Request object contains invalid information. Server is not able to process the request object.

  • 500 Unexpected error.


Delete DataStage subflows

Deletes the specified data subflows in a project, catalog, or space (one of project_id, catalog_id, or space_id must be set).

If the deletion will take some time to finish, a 202 response is returned and the deletion continues asynchronously.

DELETE /v3/data_intg_flows/subflows
ServiceCall<Void> deleteDatastageSubflows(DeleteDatastageSubflowsOptions deleteDatastageSubflowsOptions)
deleteDatastageSubflows(params)
delete_datastage_subflows(
        self,
        id: List[str],
        *,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the DeleteDatastageSubflowsOptions.Builder to create a DeleteDatastageSubflowsOptions object that contains the parameter values for the deleteDatastageSubflows method.

Query Parameters

  • id: The list of DataStage subflow IDs to delete.

  • catalog_id: The ID of the catalog to use. One of catalog_id, project_id, or space_id is required.

  • project_id: The ID of the project to use. One of catalog_id, project_id, or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • space_id: The ID of the space to use. One of catalog_id, project_id, or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

  • curl -X DELETE --location --header "Authorization: Bearer {iam_token}"   "{base_url}/v3/data_intg_flows/subflows?id=[]&project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • String[] ids = new String[] {subflowID, cloneSubflowID};
    DeleteDatastageSubflowsOptions deleteDatastageSubflowsOptions = new DeleteDatastageSubflowsOptions.Builder()
      .id(Arrays.asList(ids))
      .projectId(projectID)
      .build();
    
    datastageService.deleteDatastageSubflows(deleteDatastageSubflowsOptions).execute();
  •      const params = {
           id: [assetID, cloneID],
           projectId: projectID,
         };
         const res = await datastageService.deleteDatastageSubflows(params);
  • response = datastage_service.delete_datastage_subflows(
      id=[createdSubflowId],
      project_id=config['PROJECT_ID']
    )

Response

Status Code

  • 202 The requested operation is in progress.

  • 200 The requested operation completed successfully.

  • 401 You are not authorized to access the service. See response for more information.

  • 403 You are not permitted to perform this action. See response for more information.

  • 500 An error occurred. See response for more information.

No Sample Response

This method does not specify any sample responses.

Get metadata for DataStage subflows

Lists the metadata and entity for DataStage subflows that are contained in the specified project.

Use the following parameters to filter the results:

| Field              | Match type  | Example                        |
| ------------------ | ----------- | ------------------------------ |
| entity.name        | Equals      | entity.name=MyDataStageSubFlow |
| entity.name        | Starts with | entity.name=starts:MyData      |
| entity.description | Equals      | entity.description=movement    |
| entity.description | Starts with | entity.description=starts:data |

To sort the results, use one or more of the parameters described in the following section. If no sort key is specified, the results are sorted in descending order on metadata.create_time (i.e. returning the most recently created data flows first).

| Field | Example                                                       |
| ----- | ------------------------------------------------------------- |
| sort  | sort=+entity.name (sort by ascending name)                    |
| sort  | sort=-metadata.create_time (sort by descending creation time) |

Multiple sort keys can be specified by delimiting them with a comma. For example, to sort in descending order on create_time and then in ascending order on name use: sort=-metadata.create_time,+entity.name
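Assembled into a request query string, the filter and sort parameters above might look like the following sketch (the project ID is illustrative; urlencode from the standard library handles the percent-encoding of the :, + and , characters):

```python
from urllib.parse import urlencode

# Build the query for: subflows whose name starts with "MyData", sorted by
# descending create_time and then ascending name.
params = {
    "project_id": "bd0dbbfd-810d-4f0e-b0a9-228c328a8e23",
    "entity.name": "starts:MyData",
    "sort": "-metadata.create_time,+entity.name",
    "limit": 100,
}
query = urlencode(params)  # percent-encoded, ready to append after "?"
```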


GET /v3/data_intg_flows/subflows
ServiceCall<DataFlowPagedCollection> listDatastageSubflows(ListDatastageSubflowsOptions listDatastageSubflowsOptions)
listDatastageSubflows(params)
list_datastage_subflows(
        self,
        *,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        sort: Optional[str] = None,
        start: Optional[str] = None,
        limit: Optional[int] = None,
        entity_name: Optional[str] = None,
        entity_description: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the ListDatastageSubflowsOptions.Builder to create a ListDatastageSubflowsOptions object that contains the parameter values for the listDatastageSubflows method.

Query Parameters

  • catalog_id: The ID of the catalog to use. One of catalog_id, project_id, or space_id is required.

  • project_id: The ID of the project to use. One of catalog_id, project_id, or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • space_id: The ID of the space to use. One of catalog_id, project_id, or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

  • sort: The field to sort the results on, including whether to sort ascending (+) or descending (-), for example sort=-metadata.create_time.

  • start: The page token that indicates where to start paging from.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e2

  • limit: The number of items to return for each page, for example limit=50. If not specified, a default of 100 is used. The maximum value of limit is 200.

    Example: 100

  • entity.name: Filter results on the specified name.

  • entity.description: Filter results on the specified description.


  • curl -X GET --location --header "Authorization: Bearer {iam_token}"   --header "Accept: application/json;charset=utf-8"   "{base_url}/v3/data_intg_flows/subflows?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23&limit=100"
  • ListDatastageSubflowsOptions listDatastageSubflowsOptions = new ListDatastageSubflowsOptions.Builder()
      .projectId(projectID)
      .limit(Long.valueOf("100"))
      .build();
    
    Response<DataFlowPagedCollection> response = datastageService.listDatastageSubflows(listDatastageSubflowsOptions).execute();
    DataFlowPagedCollection dataFlowPagedCollection = response.getResult();
    
    System.out.println(dataFlowPagedCollection);
  •      const params = {
           projectId: projectID,
           sort: 'name',
           limit: 100,
         };
         const res = await datastageService.listDatastageSubflows(params);
  • data_flow_paged_collection = datastage_service.list_datastage_subflows(
      project_id=config['PROJECT_ID'],
      limit=100
    ).get_result()
    
    print(json.dumps(data_flow_paged_collection, indent=2))

Response

A page from a collection of DataStage flows.

Status Code

  • 200 The requested operation completed successfully.

  • 401 You are not authorized to access the service. See response for more information.

  • 403 You are not permitted to perform this action. See response for more information.

  • 500 An error occurred. See response for more information.

Example responses
  • {
      "data_flows": [
        {
          "entity": {
            "data_intg_subflow": {
              "dataset": false,
              "mime_type": "application/json"
            }
          },
          "metadata": {
            "asset_id": "{asset_id}",
            "asset_type": "data_intg_subflow",
            "create_time": "2021-04-03 15:32:55+00:00",
            "creator_id": "IBMid-xxxxxxxxx",
            "description": " ",
            "href": "{url}/data_intg/v3/data_intg_flows/subflows/{asset_id}?project_id={project_id}",
            "name": "{job_name}",
            "project_id": "{project_id}",
            "resource_key": "{project_id}/data_intg_subflow/{job_name}",
            "size": 5780,
            "usage": {
              "access_count": 0,
              "last_access_time": "2021-04-03 15:33:01.320000+00:00",
              "last_accessor_id": "IBMid-xxxxxxxxx",
              "last_modification_time": "2021-04-03 15:33:01.320000+00:00",
              "last_modifier_id": "IBMid-xxxxxxxxx"
            }
          }
        }
      ],
      "first": {
        "href": "{url}/data_intg/v3/data_intg_flows/subflows?project_id={project_id}&limit=2"
      },
      "next": {
        "href": "{url}/data_intg/v3/data_intg_flows/subflows?project_id={project_id}&limit=2&start=g1AAAADOeJzLYWBgYMpgTmHQSklKzi9KdUhJMjTUS8rVTU7WLS3WLc4vLcnQNbLQS87JL01JzCvRy0styQHpyWMBkgwNQOr____9WWCxXCAhYmRgZKhrYKJrYBxiaGplbGRlahqVaJCFZocB8XYcgNhxHrcdhlamhlGJ-llZAD4lOMI"
      },
      "total_count": 1
    }
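The next.href in a paged response embeds the start token for the following page. A sketch of extracting it for the next list call (the field names come from the example response above; next_start_token is an illustrative helper, not SDK code):

```python
from urllib.parse import urlparse, parse_qs

def next_start_token(page):
    """Return the start token from a paged response, or None on the last page.

    Illustrative helper: reads the next.href field shown in the example
    response and pulls out its start query parameter.
    """
    href = (page.get("next") or {}).get("href")
    if not href:
        return None
    tokens = parse_qs(urlparse(href).query).get("start")
    return tokens[0] if tokens else None
```

A caller would keep passing the returned token as the start parameter of the next list request until the helper returns None.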

Create DataStage subflow

Creates a DataStage subflow in the specified project, catalog, or space (one of project_id, catalog_id, or space_id must be set). All subsequent calls that use the subflow must specify the project, catalog, or space ID that it was created in.

POST /v3/data_intg_flows/subflows
ServiceCall<DataIntgFlow> createDatastageSubflows(CreateDatastageSubflowsOptions createDatastageSubflowsOptions)
createDatastageSubflows(params)
create_datastage_subflows(
        self,
        data_intg_subflow_name: str,
        *,
        entity: Optional['SubFlowEntityJson'] = None,
        pipeline_flows: Optional['PipelineJson'] = None,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        directory_asset_id: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the CreateDatastageSubflowsOptions.Builder to create a CreateDatastageSubflowsOptions object that contains the parameter values for the createDatastageSubflows method.

Query Parameters

  • The DataStage subflow name.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

  • The directory asset ID.

Request body: the pipeline JSON to be attached.

The createDatastageSubflows options.

parameters

  • The DataStage subflow name.

  • Pipeline flow to be stored.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

  • The directory asset ID.

  • curl -X POST --location \
      --header "Authorization: Bearer {iam_token}" \
      --header "Accept: application/json;charset=utf-8" \
      --header "Content-Type: application/json;charset=utf-8" \
      --data '{}' \
      "{base_url}/v3/data_intg_flows/subflows?data_intg_subflow_name={data_intg_subflow_name}&project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • PipelineJson exampleSubFlow = PipelineFlowHelper.buildPipelineFlow(subFlowJson);
    CreateDatastageSubflowsOptions createDatastageSubflowsOptions = new CreateDatastageSubflowsOptions.Builder()
      .dataIntgSubflowName(subflowName)
      .pipelineFlows(exampleSubFlow)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlow> response = datastageService.createDatastageSubflows(createDatastageSubflowsOptions).execute();
    DataIntgFlow dataIntgFlow = response.getResult();
    
    System.out.println(dataIntgFlow);
  •      const params = {
           dataIntgSubflowName: dataIntgSubFlowName,
           pipelineFlows: pipelineJsonFromFile,
           projectId: projectID,
           assetCategory: 'system',
         };
         const res = await datastageService.createDatastageSubflows(params);
  • data_intg_flow = datastage_service.create_datastage_subflows(
      data_intg_subflow_name='testSubflow1',
      pipeline_flows=UtilHelper.readJsonFileToDict('inputFiles/exampleSubflow.json'),
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow, indent=2))
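The parameter notes above state that catalog_id or project_id or space_id is required. When calling the REST endpoint directly rather than through an SDK, a client-side check along these lines can catch the mistake before the request is sent (a hypothetical helper, assuming exactly one container ID must be supplied):

```python
def resolve_scope(catalog_id=None, project_id=None, space_id=None) -> dict:
    """Return the query-parameter dict naming the container for the request.

    Assumes the API requires exactly one of catalog_id, project_id, or space_id.
    """
    provided = {
        name: value
        for name, value in (
            ("catalog_id", catalog_id),
            ("project_id", project_id),
            ("space_id", space_id),
        )
        if value
    }
    if len(provided) != 1:
        raise ValueError("exactly one of catalog_id, project_id, or space_id is required")
    return provided

print(resolve_scope(project_id="bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"))
# -> {'project_id': 'bd0dbbfd-810d-4f0e-b0a9-228c328a8e23'}
```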

Response

A DataStage flow model that defines physical source(s), physical target(s) and an optional pipeline containing operations to apply to source(s).

Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

No Sample Response

This method does not specify any sample responses.

Get DataStage subflow

Retrieves the DataStage subflow that is contained in the specified project. Attachments, metadata, and a limited number of attributes from the entity of the DataStage subflow are returned.

GET /v3/data_intg_flows/subflows/{data_intg_subflow_id}
ServiceCall<DataIntgFlowJson> getDatastageSubflows(GetDatastageSubflowsOptions getDatastageSubflowsOptions)
getDatastageSubflows(params)
get_datastage_subflows(
        self,
        data_intg_subflow_id: str,
        *,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the GetDatastageSubflowsOptions.Builder to create a GetDatastageSubflowsOptions object that contains the parameter values for the getDatastageSubflows method.

Path Parameters

  • The DataStage subflow ID to use.

Query Parameters

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

The getDatastageSubflows options.

parameters

  • The DataStage subflow ID to use.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

  • curl -X GET --location \
      --header "Authorization: Bearer {iam_token}" \
      --header "Accept: application/json;charset=utf-8" \
      "{base_url}/v3/data_intg_flows/subflows/{data_intg_subflow_id}?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • GetDatastageSubflowsOptions getDatastageSubflowsOptions = new GetDatastageSubflowsOptions.Builder()
      .dataIntgSubflowId(subflowID)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlowJson> response = datastageService.getDatastageSubflows(getDatastageSubflowsOptions).execute();
    DataIntgFlowJson dataIntgFlowJson = response.getResult();
    
    System.out.println(dataIntgFlowJson);
  •      const params = {
           dataIntgSubflowId: subflow_assetID,
           projectId: projectID,
         };
         const res = await datastageService.getDatastageSubflows(params);
  • data_intg_flow_json = datastage_service.get_datastage_subflows(
      data_intg_subflow_id=createdSubflowId,
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow_json, indent=2))
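The returned document nests the full pipeline JSON under `attachments`, including a `schemas` array that describes the columns flowing through each link (see the sample response in this section). As a rough sketch, a hypothetical helper operating on a dict shaped like that sample, for listing the column names defined by each schema:

```python
def schema_columns(response_json: dict) -> dict:
    """Map each schema id in the attached pipeline JSON to its column names."""
    pipeline_doc = response_json.get("attachments", {})
    return {
        schema["id"]: [field["name"] for field in schema.get("fields", [])]
        for schema in pipeline_doc.get("schemas", [])
    }

# A trimmed-down stand-in for the sample response body.
sample = {
    "attachments": {
        "schemas": [
            {"id": "a479344e", "fields": [{"name": "col1"}, {"name": "col2"}, {"name": "col3"}]}
        ]
    }
}
print(schema_columns(sample))  # -> {'a479344e': ['col1', 'col2', 'col3']}
```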

Response

A pipeline JSON containing operations to apply to source(s).

Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • Unexpected error.

Example responses
  • {
      "attachments": {
        "app_data": {
          "datastage": {
            "version": "3.0.2"
          }
        },
        "doc_type": "pipeline",
        "id": "913abf38-fac2-4c56-815b-f6f21e140fa3",
        "json_schema": "https://api.dataplatform.ibm.com/schemas/common-pipeline/pipeline-flow/pipeline-flow-v3-schema.json",
        "pipelines": [
          {
            "app_data": {
              "datastage": {
                "runtimecolumnpropagation": "true"
              },
              "ui_data": {
                "comments": []
              }
            },
            "id": "abd53940-0ab2-4559-978e-864800ee875a",
            "nodes": [
              {
                "app_data": {
                  "datastage": {
                    "outputs_order": "5e514391-fc64-4ad9-b7ef-d164783d1484"
                  },
                  "ui_data": {
                    "image": "",
                    "label": "Entry node 1",
                    "x_pos": 48,
                    "y_pos": 48
                  }
                },
                "id": "602a1843-4cb2-4a28-93f3-f6d08e9910b6",
                "outputs": [
                  {
                    "app_data": {
                      "datastage": {
                        "is_source_of_link": "aaac7610-cf58-4b7c-9431-643afe952621"
                      },
                      "ui_data": {
                        "label": "outPort"
                      }
                    },
                    "id": "5e514391-fc64-4ad9-b7ef-d164783d1484",
                    "schema_ref": "a479344e-7835-42b8-a5f5-7d88bc490dfe"
                  }
                ],
                "type": "binding"
              },
              {
                "app_data": {
                  "datastage": {
                    "inputs_order": "fb38f373-7b2c-4e70-8629-c4e5e05a7cff",
                    "outputs_order": "c539d891-84a8-481e-82fa-a6c90e588e1d"
                  },
                  "ui_data": {
                    "expanded_height": 200,
                    "expanded_width": 300,
                    "image": "../graphics/palette/Standardize.svg",
                    "is_expanded": false,
                    "label": "ContainerC3",
                    "x_pos": 192,
                    "y_pos": 48
                  }
                },
                "id": "a2fb41ad-5088-4849-a3cc-453a6416492c",
                "inputs": [
                  {
                    "app_data": {
                      "datastage": {},
                      "ui_data": {
                        "label": "inPort"
                      }
                    },
                    "id": "fb38f373-7b2c-4e70-8629-c4e5e05a7cff",
                    "links": [
                      {
                        "app_data": {
                          "datastage": {},
                          "ui_data": {
                            "decorations": [
                              {
                                "class_name": "",
                                "hotspot": false,
                                "id": "DSLink1E",
                                "label": "DSLink1E",
                                "outline": true,
                                "path": "",
                                "position": "middle"
                              }
                            ]
                          }
                        },
                        "id": "aaac7610-cf58-4b7c-9431-643afe952621",
                        "link_name": "DSLink1E",
                        "node_id_ref": "602a1843-4cb2-4a28-93f3-f6d08e9910b6",
                        "port_id_ref": "5e514391-fc64-4ad9-b7ef-d164783d1484",
                        "type_attr": "PRIMARY"
                      }
                    ],
                    "parameters": {
                      "runtime_column_propagation": 0
                    },
                    "schema_ref": "a479344e-7835-42b8-a5f5-7d88bc490dfe"
                  }
                ],
                "outputs": [
                  {
                    "app_data": {
                      "datastage": {
                        "is_source_of_link": "78c4cdcd-2f6b-474a-805f-f15d00b7cac2"
                      },
                      "ui_data": {
                        "label": "outPort"
                      }
                    },
                    "id": "c539d891-84a8-481e-82fa-a6c90e588e1d",
                    "schema_ref": "d4ba6846-debd-47c5-90ec-dda663728a36"
                  }
                ],
                "parameters": {
                  "input_count": 1,
                  "output_count": 1
                },
                "subflow_ref": {
                  "pipeline_id_ref": "default_pipeline_id",
                  "url": "app_defined"
                },
                "type": "super_node"
              },
              {
                "app_data": {
                  "datastage": {
                    "inputs_order": "2a52a02d-113c-4d9b-8f36-609414be8bf5"
                  },
                  "ui_data": {
                    "image": "",
                    "label": "Exit node 1",
                    "x_pos": 384,
                    "y_pos": 48
                  }
                },
                "id": "547dcda4-a052-432d-ae4b-06df14e8e5b3",
                "inputs": [
                  {
                    "app_data": {
                      "datastage": {},
                      "ui_data": {
                        "label": "inPort"
                      }
                    },
                    "id": "2a52a02d-113c-4d9b-8f36-609414be8bf5",
                    "links": [
                      {
                        "app_data": {
                          "datastage": {},
                          "ui_data": {
                            "decorations": [
                              {
                                "class_name": "",
                                "hotspot": false,
                                "id": "DSLink2E",
                                "label": "DSLink2E",
                                "outline": true,
                                "path": "",
                                "position": "middle"
                              }
                            ]
                          }
                        },
                        "id": "78c4cdcd-2f6b-474a-805f-f15d00b7cac2",
                        "link_name": "DSLink2E",
                        "node_id_ref": "a2fb41ad-5088-4849-a3cc-453a6416492c",
                        "port_id_ref": "c539d891-84a8-481e-82fa-a6c90e588e1d",
                        "type_attr": "PRIMARY"
                      }
                    ],
                    "schema_ref": "d4ba6846-debd-47c5-90ec-dda663728a36"
                  }
                ],
                "type": "binding"
              }
            ],
            "runtime_ref": "pxOsh"
          }
        ],
        "primary_pipeline": "abd53940-0ab2-4559-978e-864800ee875a",
        "runtimes": [
          {
            "id": "pxOsh",
            "name": "pxOsh"
          }
        ],
        "schemas": [
          {
            "fields": [
              {
                "app_data": {
                  "column_reference": "col1",
                  "is_unicode_string": false,
                  "odbc_type": "INTEGER",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "type_code": "INT32"
                },
                "metadata": {
                  "decimal_precision": 0,
                  "decimal_scale": 0,
                  "is_key": true,
                  "is_signed": true,
                  "item_index": 0,
                  "max_length": 0,
                  "min_length": 0
                },
                "name": "col1",
                "nullable": false,
                "type": "integer"
              },
              {
                "app_data": {
                  "column_reference": "col2",
                  "is_unicode_string": false,
                  "odbc_type": "CHAR",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "type_code": "STRING"
                },
                "metadata": {
                  "decimal_precision": 0,
                  "decimal_scale": 0,
                  "is_key": false,
                  "is_signed": false,
                  "item_index": 0,
                  "max_length": 5,
                  "min_length": 5
                },
                "name": "col2",
                "nullable": false,
                "type": "string"
              },
              {
                "app_data": {
                  "column_reference": "col3",
                  "is_unicode_string": false,
                  "odbc_type": "VARCHAR",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "type_code": "STRING"
                },
                "metadata": {
                  "decimal_precision": 0,
                  "decimal_scale": 0,
                  "is_key": false,
                  "is_signed": false,
                  "item_index": 0,
                  "max_length": 10,
                  "min_length": 0
                },
                "name": "col3",
                "nullable": false,
                "type": "string"
              }
            ],
            "id": "a479344e-7835-42b8-a5f5-7d88bc490dfe"
          },
          {
            "fields": [
              {
                "app_data": {
                  "column_reference": "col1",
                  "is_unicode_string": false,
                  "odbc_type": "INTEGER",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "type_code": "INT32"
                },
                "metadata": {
                  "decimal_precision": 0,
                  "decimal_scale": 0,
                  "is_key": true,
                  "is_signed": true,
                  "item_index": 0,
                  "max_length": 0,
                  "min_length": 0
                },
                "name": "col1",
                "nullable": false,
                "type": "integer"
              },
              {
                "app_data": {
                  "column_reference": "col2",
                  "is_unicode_string": false,
                  "odbc_type": "CHAR",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "type_code": "STRING"
                },
                "metadata": {
                  "decimal_precision": 0,
                  "decimal_scale": 0,
                  "is_key": false,
                  "is_signed": false,
                  "item_index": 0,
                  "max_length": 5,
                  "min_length": 5
                },
                "name": "col2",
                "nullable": false,
                "type": "string"
              },
              {
                "app_data": {
                  "column_reference": "col3",
                  "is_unicode_string": false,
                  "odbc_type": "VARCHAR",
                  "table_def": "Basic3\\\\Basic3\\\\Basic3",
                  "type_code": "STRING"
                },
                "metadata": {
                  "decimal_precision": 0,
                  "decimal_scale": 0,
                  "is_key": false,
                  "is_signed": false,
                  "item_index": 0,
                  "max_length": 10,
                  "min_length": 0
                },
                "name": "col3",
                "nullable": false,
                "type": "string"
              }
            ],
            "id": "d4ba6846-debd-47c5-90ec-dda663728a36"
          }
        ],
        "version": "3.0"
      },
      "entity": {
        "data_intg_subflow": {
          "dataset": false,
          "mime_type": "application/json"
        }
      },
      "metadata": {
        "asset_id": "7ad1e03c-5380-4bfa-8317-3604b95954c1",
        "asset_type": "data_intg_subflow",
        "catalog_id": "e35806c5-5314-4677-bb8a-416d3c628d41",
        "create_time": "2021-05-10 19:11:04+00:00",
        "creator_id": "IBMid-xxxxxxxxx",
        "description": "",
        "name": "NSC2_Subflow",
        "origin_country": "us",
        "project_id": "{project_id}",
        "resource_key": "baa8b445-9bea-4c7b-9930-233f57f8c629/data_intg_subflow/NSC2_Subflow",
        "size": 5117,
        "tags": [],
        "usage": {
          "access_count": 0,
          "last_access_time": "2021-05-10 19:11:05.474000+00:00",
          "last_accessor_id": "IBMid-xxxxxxxxx"
        }
      }
    }

Update DataStage subflow

Modifies a data subflow in the specified project or catalog (either project_id or catalog_id must be set). All subsequent calls that use the subflow must specify the project or catalog ID that the subflow was created in.


PUT /v3/data_intg_flows/subflows/{data_intg_subflow_id}
ServiceCall<DataIntgFlow> updateDatastageSubflows(UpdateDatastageSubflowsOptions updateDatastageSubflowsOptions)
updateDatastageSubflows(params)
update_datastage_subflows(
        self,
        data_intg_subflow_id: str,
        data_intg_subflow_name: str,
        *,
        entity: Optional['SubFlowEntityJson'] = None,
        pipeline_flows: Optional['PipelineJson'] = None,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        directory_asset_id: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the UpdateDatastageSubflowsOptions.Builder to create a UpdateDatastageSubflowsOptions object that contains the parameter values for the updateDatastageSubflows method.

Path Parameters

  • The DataStage subflow ID to use.

Query Parameters

  • The DataStage subflow name.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

  • The directory asset ID.

Pipeline JSON to be attached.

The updateDatastageSubflows options.

parameters

  • The DataStage subflow ID to use.

  • The DataStage subflow name.

  • Pipeline flow to be stored.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Examples:
  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Examples:
  • The directory asset ID.

parameters

  • The DataStage subflow ID to use.

  • The DataStage subflow name.

  • Pipeline flow to be stored.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Examples:
  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Examples:
  • The directory asset ID.

  • curl -X PUT --location \
      --header "Authorization: Bearer {iam_token}" \
      --header "Accept: application/json;charset=utf-8" \
      --header "Content-Type: application/json;charset=utf-8" \
      --data '{}' \
      "{base_url}/v3/data_intg_flows/subflows/{data_intg_subflow_id}?data_intg_subflow_name={data_intg_subflow_name}&project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • PipelineJson exampleSubFlowUpdated = PipelineFlowHelper.buildPipelineFlow(updatedSubFlowJson);
    UpdateDatastageSubflowsOptions updateDatastageSubflowsOptions = new UpdateDatastageSubflowsOptions.Builder()
      .dataIntgSubflowId(subflowID)
      .dataIntgSubflowName(subflowName)
      .pipelineFlows(exampleSubFlowUpdated)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlow> response = datastageService.updateDatastageSubflows(updateDatastageSubflowsOptions).execute();
    DataIntgFlow dataIntgFlow = response.getResult();
    
    System.out.println(dataIntgFlow);
  • const params = {
      dataIntgSubflowId: subflow_assetID,
      dataIntgSubflowName: dataIntgSubFlowName,
      pipelineFlows: pipelineJsonFromFile,
      projectId: projectID,
      assetCategory: 'system',
    };
    const res = await datastageService.updateDatastageSubflows(params);
  • data_intg_flow = datastage_service.update_datastage_subflows(
      data_intg_subflow_id=createdSubflowId,
      data_intg_subflow_name='testSubflow1Updated',
      pipeline_flows=UtilHelper.readJsonFileToDict('inputFiles/exampleSubflowUpdated.json'),
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow, indent=2))

Response

A DataStage flow model that defines physical source(s), physical target(s) and an optional pipeline containing operations to apply to source(s).


Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

No Sample Response

This method does not specify any sample responses.

Update DataStage subflow attributes

Modifies attributes of a data subflow in the specified project or catalog (either project_id or catalog_id must be set).


PUT /v3/data_intg_flows/subflows/{data_intg_subflow_id}/attributes
ServiceCall<DataIntgFlow> patchAttributesDatastageSubflow(PatchAttributesDatastageSubflowOptions patchAttributesDatastageSubflowOptions)
patchAttributesDatastageSubflow(params)
patch_attributes_datastage_subflow(
        self,
        data_intg_subflow_id: str,
        *,
        description: Optional[str] = None,
        directory_asset_id: Optional[str] = None,
        name: Optional[str] = None,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the PatchAttributesDatastageSubflowOptions.Builder to create a PatchAttributesDatastageSubflowOptions object that contains the parameter values for the patchAttributesDatastageSubflow method.

Path Parameters

  • The DataStage subflow ID to use.

Query Parameters

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

The subflow attributes to modify.

The patchAttributesDatastageSubflow options.

parameters

  • The DataStage subflow ID to use.

  • The description of the asset.

  • The directory asset ID of the asset.

  • The name of the asset.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Examples:
  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Examples:

parameters

  • The DataStage subflow ID to use.

  • The description of the asset.

  • The directory asset ID of the asset.

  • The name of the asset.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Examples:
  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Examples:
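No sample request is shown for this method in the tabs above. As an illustrative sketch only, the following Python snippet assembles the URL and JSON body for the call; the base URL, IDs, and attribute values are hypothetical placeholders, not values from your service instance.

```python
import json
from urllib.parse import urlencode

# Hypothetical values for illustration; substitute your own service URL and IDs.
base_url = "https://api.dataplatform.cloud.ibm.com/data_intg"
subflow_id = "7ad1e03c-5380-4bfa-8317-3604b95954c1"
query = urlencode({"project_id": "bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"})

# PUT /v3/data_intg_flows/subflows/{data_intg_subflow_id}/attributes
url = f"{base_url}/v3/data_intg_flows/subflows/{subflow_id}/attributes?{query}"

# The body carries only the attributes to modify (name, description,
# directory_asset_id); the values shown here are illustrative.
body = json.dumps({"name": "NSC2_Subflow_v2", "description": "Updated subflow"})

print("PUT", url)
print(body)
```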

Response

A DataStage flow model that defines physical source(s), physical target(s) and an optional pipeline containing operations to apply to source(s).


Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

No Sample Response

This method does not specify any sample responses.

Clone DataStage subflow

Create a DataStage subflow in the specified project or catalog based on an existing DataStage subflow in the same project or catalog.


POST /v3/data_intg_flows/subflows/{data_intg_subflow_id}/clone
ServiceCall<DataIntgFlow> cloneDatastageSubflows(CloneDatastageSubflowsOptions cloneDatastageSubflowsOptions)
cloneDatastageSubflows(params)
clone_datastage_subflows(
        self,
        data_intg_subflow_id: str,
        *,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        directory_asset_id: Optional[str] = None,
        data_intg_subflow_name: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the CloneDatastageSubflowsOptions.Builder to create a CloneDatastageSubflowsOptions object that contains the parameter values for the cloneDatastageSubflows method.

Path Parameters

  • The DataStage subflow ID to use.

Query Parameters

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Example: 4c9adbb4-28ef-4a7d-b273-1cee0c38021f

  • The directory asset ID.

  • The data subflow name.

The cloneDatastageSubflows options.

parameters

  • The DataStage subflow ID to use.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Examples:
  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Examples:
  • The directory asset ID.

  • The data subflow name.

parameters

  • The DataStage subflow ID to use.

  • The ID of the catalog to use. catalog_id or project_id or space_id is required.

  • The ID of the project to use. catalog_id or project_id or space_id is required.

    Examples:
  • The ID of the space to use. catalog_id or project_id or space_id is required.

    Examples:
  • The directory asset ID.

  • The data subflow name.

  • curl -X POST --location \
      --header "Authorization: Bearer {iam_token}" \
      --header "Accept: application/json;charset=utf-8" \
      "{base_url}/v3/data_intg_flows/subflows/{data_intg_subflow_id}/clone?project_id=bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"
  • CloneDatastageSubflowsOptions cloneDatastageSubflowsOptions = new CloneDatastageSubflowsOptions.Builder()
      .dataIntgSubflowId(subflowID)
      .projectId(projectID)
      .build();
    
    Response<DataIntgFlow> response = datastageService.cloneDatastageSubflows(cloneDatastageSubflowsOptions).execute();
    DataIntgFlow dataIntgFlow = response.getResult();
    
    System.out.println(dataIntgFlow);
  • const params = {
      dataIntgSubflowId: subflow_assetID,
      projectId: projectID,
    };
    const res = await datastageService.cloneDatastageSubflows(params);
  • data_intg_flow = datastage_service.clone_datastage_subflows(
      data_intg_subflow_id=createdSubflowId,
      project_id=config['PROJECT_ID']
    ).get_result()
    
    print(json.dumps(data_intg_flow, indent=2))

Response

A DataStage flow model that defines physical source(s), physical target(s) and an optional pipeline containing operations to apply to source(s).


Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

No Sample Response

This method does not specify any sample responses.

Delete table definitions

Delete the specified table definitions from a project or catalog (either project_id or catalog_id must be set).


DELETE /v3/table_definitions
ServiceCall<Void> deleteTableDefinitions(DeleteTableDefinitionsOptions deleteTableDefinitionsOptions)
deleteTableDefinitions(params)
delete_table_definitions(
        self,
        id: List[str],
        *,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the DeleteTableDefinitionsOptions.Builder to create a DeleteTableDefinitionsOptions object that contains the parameter values for the deleteTableDefinitions method.

Query Parameters

  • The list of table definition IDs to delete.

  • The ID of the catalog to use. catalog_id, space_id, or project_id is required.

  • The ID of the project to use. catalog_id, space_id, or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id, space_id, or project_id is required.

The deleteTableDefinitions options.

parameters

  • The list of table definition IDs to delete.

  • The ID of the catalog to use. catalog_id, space_id, or project_id is required.

  • The ID of the project to use. catalog_id, space_id, or project_id is required.

    Examples:
  • The ID of the space to use. catalog_id, space_id, or project_id is required.

parameters

  • The list of table definition IDs to delete.

  • The ID of the catalog to use. catalog_id, space_id, or project_id is required.

  • The ID of the project to use. catalog_id, space_id, or project_id is required.

    Examples:
  • The ID of the space to use. catalog_id, space_id, or project_id is required.
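No sample request is provided for this method. The sketch below shows one way to assemble the request URL, assuming the service accepts the list as repeated id query parameters; the IDs and base URL are hypothetical placeholders.

```python
from urllib.parse import urlencode

# Hypothetical table definition IDs; substitute your own values.
table_ids = [
    "aaaaaaaa-1111-2222-3333-444444444444",
    "bbbbbbbb-5555-6666-7777-888888888888",
]
params = {"id": table_ids, "project_id": "bd0dbbfd-810d-4f0e-b0a9-228c328a8e23"}

# doseq=True expands the list into repeated id=... pairs.
query = urlencode(params, doseq=True)

# DELETE /v3/table_definitions
url = f"https://api.dataplatform.cloud.ibm.com/data_intg/v3/table_definitions?{query}"
print("DELETE", url)
```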

Response

Status Code

  • The requested operation completed successfully.

  • The requested operation is in progress.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • An error occurred. See response for more information.

No Sample Response

This method does not specify any sample responses.

List table definitions

Lists the table definitions that are contained in the specified project or catalog (either project_id or catalog_id must be set).

Use the following parameters to filter the results:

| Field | Match type | Example |
| ----- | ---------- | ------- |
| asset.name | Starts with | ?asset.name=starts:MyTable |
| asset.description | Equals | ?asset.description=starts:profiling |

To sort the returned results, use one or more of the parameters described in the following section. If no sort key is specified, the results are sorted in descending order on metadata.create_time, returning the most recently created data flows first.

| Field | Example |
| ----- | ------- |
| asset.name | ?sort=+asset.name |
| metadata.create_time | ?sort=-metadata.create_time |

Multiple sort keys can be specified by delimiting them with a comma. For example, to sort in descending order on create_time and then in ascending order on name, use: `?sort=-metadata.create_time,+asset.name`.


GET /v3/table_definitions
ServiceCall<TableDefinitionPagedCollection> getTableDefinitions(GetTableDefinitionsOptions getTableDefinitionsOptions)
getTableDefinitions(params)
get_table_definitions(
        self,
        *,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        sort: Optional[str] = None,
        start: Optional[str] = None,
        limit: Optional[int] = None,
        asset_name: Optional[str] = None,
        asset_description: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the GetTableDefinitionsOptions.Builder to create a GetTableDefinitionsOptions object that contains the parameter values for the getTableDefinitions method.

Query Parameters

  • The ID of the catalog to use. catalog_id, space_id, or project_id is required.

  • The ID of the project to use. catalog_id, space_id, or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id, space_id, or project_id is required.

  • The field to sort the results on, including whether to sort ascending (+) or descending (-), for example, sort=-metadata.create_time.

  • The page token indicating where to start paging from.

  • The limit of the number of items to return, for example limit=50. If not specified, a default of 100 is used.

    Possible values: value ≥ 1

    Example: 100

  • Filter results based on the specified name.

  • Filter results based on the specified description.

The getTableDefinitions options.

parameters

  • The ID of the catalog to use. catalog_id, space_id, or project_id is required.

  • The ID of the project to use. catalog_id, space_id, or project_id is required.

    Examples:
  • The ID of the space to use. catalog_id, space_id, or project_id is required.

  • The field to sort the results on, including whether to sort ascending (+) or descending (-), for example, sort=-metadata.create_time.

  • The page token indicating where to start paging from.

  • The limit of the number of items to return, for example limit=50. If not specified, a default of 100 is used.

    Possible values: value ≥ 1

    Examples:
  • Filter results based on the specified name.

  • Filter results based on the specified description.

parameters

  • The ID of the catalog to use. catalog_id, space_id, or project_id is required.

  • The ID of the project to use. catalog_id, space_id, or project_id is required.

    Examples:
  • The ID of the space to use. catalog_id, space_id, or project_id is required.

  • The field to sort the results on, including whether to sort ascending (+) or descending (-), for example, sort=-metadata.create_time.

  • The page token indicating where to start paging from.

  • The limit of the number of items to return, for example limit=50. If not specified, a default of 100 is used.

    Possible values: value ≥ 1

    Examples:
  • Filter results based on the specified name.

  • Filter results based on the specified description.
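As an illustrative sketch, the filter, sort, and paging parameters described above combine into a single query string; the project ID and base URL are hypothetical placeholders.

```python
from urllib.parse import urlencode

# Combine a name filter, two sort keys, and a page size as described above.
# urlencode percent-escapes ':', '+', and ',' in the values.
params = {
    "project_id": "bd0dbbfd-810d-4f0e-b0a9-228c328a8e23",
    "asset.name": "starts:MyTable",               # names starting with "MyTable"
    "sort": "-metadata.create_time,+asset.name",  # newest first, then by name
    "limit": 50,
}

# GET /v3/table_definitions
url = f"https://api.dataplatform.cloud.ibm.com/data_intg/v3/table_definitions?{urlencode(params)}"
print("GET", url)
```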

Response

A page from a collection of table definitions.

A page from a collection of table definitions.

A page from a collection of table definitions.

A page from a collection of table definitions.

Status Code

  • The requested operation completed successfully.

  • You are not permitted to perform this action. See response for more information.

  • You are not authorized to access the service. See response for more information.

  • An error occurred. See response for more information.

No Sample Response

This method does not specify any sample responses.

Create table definition

Creates a table definition in the specified project or catalog (either project_id or catalog_id must be set). All subsequent calls that use the table definition must specify the project or catalog ID that the table definition was created in.


POST /v3/table_definitions
ServiceCall<TableDefinition> createTableDefinition(CreateTableDefinitionOptions createTableDefinitionOptions)
createTableDefinition(params)
create_table_definition(
        self,
        entity: 'TableDefinitionEntity',
        metadata: 'TableDefinitionMetadata',
        *,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        directory_asset_id: Optional[str] = None,
        asset_category: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the CreateTableDefinitionOptions.Builder to create a CreateTableDefinitionOptions object that contains the parameter values for the createTableDefinition method.

Query Parameters

  • The ID of the catalog to use. catalog_id, space_id, or project_id is required.

  • The ID of the project to use. catalog_id, space_id, or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id, space_id, or project_id is required.

  • The directory asset ID to create the asset in or move to.

  • The category of the asset. Must be either SYSTEM or USER. Only a registered service can use this parameter.

    Allowable values: [SYSTEM,USER]

The table definition to be created.

The createTableDefinition options.

parameters

  • The underlying table definition.

  • System metadata about a table definition.

  • The ID of the catalog to use. catalog_id, space_id, or project_id is required.

  • The ID of the project to use. catalog_id, space_id, or project_id is required.

    Examples:
  • The ID of the space to use. catalog_id, space_id, or project_id is required.

  • The directory asset ID to create the asset in or move to.

  • The category of the asset. Must be either SYSTEM or USER. Only a registered service can use this parameter.

    Allowable values: [SYSTEM,USER]

parameters

  • The underlying table definition.

  • System metadata about a table definition.

  • The ID of the catalog to use. catalog_id, space_id, or project_id is required.

  • The ID of the project to use. catalog_id, space_id, or project_id is required.

    Examples:
  • The ID of the space to use. catalog_id, space_id, or project_id is required.

  • The directory asset ID to create the asset in or move to.

  • The category of the asset. Must be either SYSTEM or USER. Only a registered service can use this parameter.

    Allowable values: [SYSTEM,USER]
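No sample request is shown for this method. The sketch below assembles the URL and a skeleton body; the exact body schema is defined by the TableDefinitionEntity and TableDefinitionMetadata models, so the field shown, the IDs, and the base URL are hypothetical placeholders.

```python
import json
from urllib.parse import urlencode

# Hypothetical project ID; asset_category is only honored for registered services.
query = urlencode({
    "project_id": "bd0dbbfd-810d-4f0e-b0a9-228c328a8e23",
    "asset_category": "USER",
})

# POST /v3/table_definitions
url = f"https://api.dataplatform.cloud.ibm.com/data_intg/v3/table_definitions?{query}"

# Skeleton body: "metadata" and "entity" follow the TableDefinitionMetadata
# and TableDefinitionEntity models; the name shown is a placeholder.
body = json.dumps({
    "metadata": {"name": "MyTableDefinition"},
    "entity": {},
})
print("POST", url)
```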

Response

A table definition model that defines a set of parameters that can be referenced at runtime.

A table definition model that defines a set of parameters that can be referenced at runtime.

A table definition model that defines a set of parameters that can be referenced at runtime.

A table definition model that defines a set of parameters that can be referenced at runtime.

Status Code

  • The requested operation completed successfully.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • The data source type details cannot be found.

  • An error occurred. See response for more information.

No Sample Response

This method does not specify any sample responses.

Delete a table definition

Delete the specified table definition from a project or catalog (either project_id or catalog_id must be set).

DELETE /v3/table_definitions/{table_id}

Request

Path Parameters

  • Table definition ID.

Query Parameters

  • The ID of the catalog to use. catalog_id, space_id, or project_id is required.

  • The ID of the project to use. catalog_id, space_id, or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id, space_id, or project_id is required.

Response

Status Code

  • The requested operation completed successfully.

  • Bad request. See response for more information.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • The data source type details cannot be found.

  • Gone. See response for more information.

  • An error occurred. See response for more information.

No Sample Response

This method does not specify any sample responses.

Get table definition

Get table definition.


GET /v3/table_definitions/{table_id}
ServiceCall<TableDefinition> getTableDefinition(GetTableDefinitionOptions getTableDefinitionOptions)
getTableDefinition(params)
get_table_definition(
        self,
        table_id: str,
        *,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the GetTableDefinitionOptions.Builder to create a GetTableDefinitionOptions object that contains the parameter values for the getTableDefinition method.

Path Parameters

  • Table definition ID

Query Parameters

  • The ID of the catalog to use. catalog_id, space_id, or project_id is required.

  • The ID of the project to use. catalog_id, space_id, or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id, space_id, or project_id is required.

The getTableDefinition options.

parameters

  • Table definition ID.

  • The ID of the catalog to use. catalog_id, space_id, or project_id is required.

  • The ID of the project to use. catalog_id, space_id, or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id, space_id, or project_id is required.

Response

A table definition model that defines a set of parameters that can be referenced at runtime.

Status Code

  • Table definition found.

  • Bad request. See response for more information.

  • You are not authorized to retrieve the table definition.

  • You are not permitted to perform this action.

  • The table definition cannot be found.

  • The service is currently receiving more requests than it can process in a timely fashion. Please retry submitting your request later.

  • An error occurred. No table definitions were retrieved.

  • A timeout occurred when processing your request. Please retry later.

No Sample Response

This method does not specify any sample responses.
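The documented requirement that catalog_id, space_id, or project_id be supplied can be sketched as a small URL builder; this is a stdlib illustration of the request target, not the SDK's own implementation, and the IDs used below are hypothetical:

```python
from typing import Optional
from urllib.parse import urlencode

def table_definition_url(base_url: str, table_id: str, *,
                         catalog_id: Optional[str] = None,
                         project_id: Optional[str] = None,
                         space_id: Optional[str] = None) -> str:
    """Build the GET URL for /v3/table_definitions/{table_id}."""
    containers = {"catalog_id": catalog_id, "project_id": project_id,
                  "space_id": space_id}
    provided = {k: v for k, v in containers.items() if v is not None}
    if not provided:
        # Mirrors the documented requirement on the query parameters.
        raise ValueError("catalog_id, space_id, or project_id is required")
    return f"{base_url}/v3/table_definitions/{table_id}?{urlencode(provided)}"

# Hypothetical IDs for illustration.
url = table_definition_url(
    "https://example.cloud.ibm.com",
    "1963c51a-915e-4e24-a5a6-a75e919fd46e",
    project_id="bd0dbbfd-810d-4f0e-b0a9-228c328a8e23",
)
print(url)
```

When using the Python SDK instead, the same parameters map directly onto the `get_table_definition` keyword arguments shown above.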

Patch a table definition

Patch a table definition in the specified project, catalog, or space (catalog_id, space_id, or project_id must be set).

PATCH /v3/table_definitions/{table_id}
ServiceCall<TableDefinition> patchTableDefinition(PatchTableDefinitionOptions patchTableDefinitionOptions)
patchTableDefinition(params)
patch_table_definition(
        self,
        table_id: str,
        json_patch: List['PatchDocument'],
        *,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the PatchTableDefinitionOptions.Builder to create a PatchTableDefinitionOptions object that contains the parameter values for the patchTableDefinition method.

Path Parameters

  • Table definition ID

Query Parameters

  • The ID of the catalog to use. catalog_id, space_id, or project_id is required.

  • The ID of the project to use. catalog_id, space_id, or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id, space_id, or project_id is required.

The patch operations to apply.

The patchTableDefinition options.

parameters

  • Table definition ID.

  • The patch operations to apply.

  • The ID of the catalog to use. catalog_id, space_id, or project_id is required.

  • The ID of the project to use. catalog_id, space_id, or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id, space_id, or project_id is required.

Response

A table definition model that defines a set of parameters that can be referenced at runtime.

Status Code

  • The requested operation completed successfully.

  • Bad request. See response for more information.

  • You are not authorized to access the service. See response for more information.

  • You are not permitted to perform this action. See response for more information.

  • The table definition cannot be found.

  • An error occurred. See response for more information.

No Sample Response

This method does not specify any sample responses.
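The patch operations are expressed as a JSON Patch document (RFC 6902): a list of operations applied in order. A minimal sketch of building such a payload follows; the field paths (`/metadata/name`, `/metadata/description`) are hypothetical, and `application/json-patch+json` is the conventional media type for JSON Patch bodies (verify what your deployment expects):

```python
import json

# A JSON Patch document (RFC 6902): a list of operations to apply in order.
# The paths and value below are hypothetical -- adjust them to your table
# definition's actual fields.
json_patch = [
    {"op": "replace", "path": "/metadata/name", "value": "orders_table"},
    {"op": "remove", "path": "/metadata/description"},
]

# Serialized request body and headers for PATCH /v3/table_definitions/{table_id}.
body = json.dumps(json_patch).encode("utf-8")
headers = {
    "Authorization": "Bearer {iam_token}",          # IAM bearer token
    "Content-Type": "application/json-patch+json",  # conventional JSON Patch type
}
```

With the Python SDK, the same list is passed as the `json_patch` argument of `patch_table_definition`.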

Update a table definition with a replacement

Replace the contents of a table definition in the specified project, catalog, or space (catalog_id, space_id, or project_id must be set).

PUT /v3/table_definitions/{table_id}
ServiceCall<TableDefinition> updateTableDefinition(UpdateTableDefinitionOptions updateTableDefinitionOptions)
updateTableDefinition(params)
update_table_definition(
        self,
        table_id: str,
        entity: 'TableDefinitionEntity',
        metadata: 'TableDefinitionMetadata',
        *,
        catalog_id: Optional[str] = None,
        project_id: Optional[str] = None,
        space_id: Optional[str] = None,
        directory_asset_id: Optional[str] = None,
        **kwargs,
    ) -> DetailedResponse

Request

Use the UpdateTableDefinitionOptions.Builder to create an UpdateTableDefinitionOptions object that contains the parameter values for the updateTableDefinition method.

Path Parameters

  • Table definition ID

Query Parameters

  • The ID of the catalog to use. catalog_id, space_id, or project_id is required.

  • The ID of the project to use. catalog_id, space_id, or project_id is required.

    Example: bd0dbbfd-810d-4f0e-b0a9-228c328a8e23

  • The ID of the space to use. catalog_id, space_id, or project_id is required.

  • The ID of the directory asset in which to create the asset, or to which to move it.

The table definition to be updated.

The updateTableDefinition options.