Transform Services User Guide

Service Parameters

To run the service, specify the source object storage and identify the input data set.
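A transform request is a single JSON document combining the sections described below. As a quick orientation, here is a minimal request skeleton using only the required sections ("target" and "output" are optional; all values are placeholders):

{
  "source": {
    "url": "<URL>",
    "api": "<API>",
    "bucket": "<USER-BUCKET>",
    "user": "<USERID>",
    "password": "<PASSWORD>"
  },
  "input": {
    "name": "<DSN>",
    "complex": "<group-SYSPLEX>"
  }
}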

REQUIRED: "source"

Note
Identify the transform source object storage, where the input resides. The source object storage details appear in the Model9 agent configuration file.

Required Keywords for "source"

{
  "source": {
    "url": "<URL>",
    "api": "<API>",
    "bucket": "<USER-BUCKET>",
    "user": "<USERID>",
    "password": "<PASSWORD>"
  }
}

Optional Keywords for "source"

{
  "source": {
    "useS3V4Signatures": "false"|"true"
  }
}
Keyword | Description | Required | Default
------- | ----------- | -------- | -------
url | The object storage / proxy URL | YES | -
api | The API protocol used by this object storage / proxy | YES | -
bucket | The bucket defined within the object storage / proxy | YES | -
user | The user ID provided by the object storage / proxy | YES | -
password | The password provided by the object storage / proxy | YES | -
useS3V4Signatures | Whether to use the V4 protocol of S3. Required for certain object storage providers, such as HCP Cloud Scale and Cohesity. Relevant for api "S3" only. | NO | false
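For example, a sketch of a "source" section for an S3-compatible provider that requires V4 signatures (URL, API, and credential values are placeholders):

{
  "source": {
    "url": "<URL>",
    "api": "<API>",
    "bucket": "<USER-BUCKET>",
    "user": "<USERID>",
    "password": "<PASSWORD>",
    "useS3V4Signatures": "true"
  }
}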

OPTIONAL: "target"

Note
Identify the transform target object storage. Values not specified will be taken from the “source” parameter.
{
  "target": {
    "url": "<URL>",
    "api": "<API>",
    "bucket": "<USER-BUCKET>",
    "user": "<USERID>",
    "password": "<PASSWORD>",
    "useS3V4Signatures": "false"|"true"
  }
}
Keyword | Description | Required | Default
------- | ----------- | -------- | -------
url | The object storage / proxy URL | NO | Taken from "source"
api | The API protocol used by this object storage / proxy | NO | Taken from "source"
bucket | The bucket defined within the object storage / proxy | NO | Taken from "source"
user | The user ID provided by the object storage / proxy | NO | Taken from "source"
password | The password provided by the object storage / proxy | NO | Taken from "source"
useS3V4Signatures | Whether to use the V4 protocol of S3. Required for certain object storage providers, such as HCP Cloud Scale and Cohesity. Relevant for api "S3" only. | NO | false
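For example, to write the transformed output to a different bucket on the same object storage, it should be enough to override just that keyword; the remaining values are taken from "source" ("<TARGET-BUCKET>" is a hypothetical placeholder):

{
  "target": {
    "bucket": "<TARGET-BUCKET>"
  }
}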

REQUIRED: "input"

Note
If you specify VSAM keywords for a sequential input data set, the transform will be performed and a warning message will be issued.

Required Keywords for "input"

{
  "input": {
    "name": "<DSN>",
    "complex": "<group-SYSPLEX>"
  }
}

Optional Keywords for "input"

{
  "input": {
    "type": "backup"|"archive"|"import",
    "entry": "0"|"<N>",
    "prefix": "model9"|"<USER-PREFIX>",
    "recordBinary": "false"|"true",
    "recordCharset": "<CHARSET>",
    "vsam": {
      "keyBinary": "false"|"true",
      "keyCharset": "<CHARSET>"
    }
  }
}
Keyword | Description | Default
------- | ----------- | -------
name | Name of the original data set | A legal MF data set name, case insensitive
complex | The Model9 resource complex name as defined in the agent configuration file | String representing the complex
type | The type of the input data set, according to the Model9 Cloud Data Manager policy that created it: "backup" - a backup copy (default); "archive" - an archived data set; "import" - a data set imported from tape | "backup" (case insensitive)
entry | When the type is "backup", "entry" represents the generation. The default is "0", meaning the latest backup copy. Entry "1" is the backup copy taken prior to the latest copy, and so on. | "0"
prefix | The environment prefix as defined in the agent configuration file | "model9"
recordBinary | Whether the record input is binary. Applies to all "record" input (PS, PDS, VSAM data) | "false" (case insensitive)
recordCharset | If the record input is not binary, the character set of the input. Applies to all "record" input (PS, PDS, VSAM data) | "IBM-1047"
keyBinary | When the input is a VSAM data set, whether the VSAM key is binary. A binary key is output in base64 format | "false" (case insensitive)
keyCharset | When the input is a VSAM data set and the key is not binary, the character set of the VSAM key | "IBM-1047"
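For example, a sketch of an "input" section selecting the backup generation prior to the latest copy of a VSAM data set with a non-binary key (name and complex are placeholders; the keyCharset value shown is the documented default):

{
  "input": {
    "name": "<DSN>",
    "complex": "<group-SYSPLEX>",
    "type": "backup",
    "entry": "1",
    "vsam": {
      "keyBinary": "false",
      "keyCharset": "IBM-1047"
    }
  }
}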

OPTIONAL: "output"

Note
The output is the transformed data of the MF data set, accessible as an S3 object.
Note
    When transforming a file with the same name as an existing file in the target, the existing file is replaced by the newly transformed file.
    The service does not delete previously transformed files; it overwrites files with the same name. When re-transforming a file using the "split" function, remove any previously transformed files first, to avoid having split files of different versions.
    When splitting a file, wait for the successful completion of the transform function before continuing with the processing, to ensure all the parts of the file were created.
    Specifying "text" format for a "binary" input will cause the transform to fail.
{
  "output": {
    "prefix": "model9"|"<USER-PREFIX>",
    "compression": "none"|"gzip",
    "format": "JSON"|"text"|"CSV"|"RAW",
    "charset": "UTF8",
    "endWithNewLine": "false"|"true",
    "splitBySize": "<nnnn>[b|m|g]",
    "splitByRecords": "<n>"
  }
}
Keyword | Description | Default
------- | ----------- | -------
prefix | Prefix to be added to the object name: <prefix>/<object name> | "transform"
compression | Whether the output should be compressed: "gzip"|"none" | "gzip" (case insensitive)
format | The format of the output file: "JSON"|"text"|"CSV"|"RAW" | "JSON" (case insensitive)
charset | If the input is not binary, the character set of the output. Currently only "UTF8" is supported | "UTF8"
endWithNewLine | Whether to add a newline at the end of the file, before end of file. Required by some applications | "false"
splitBySize | Whether to split the output into several files of the requested size, for example "3000b", "1000m", "1g". The output files are numbered <file-name>.1, <file-name>.2, <file-name>.3 and so on. Mutually exclusive with splitByRecords. The minimum value is 1024 bytes; a smaller size cannot be specified. A number without a unit is treated as bytes, for example "splitBySize":"1024" splits the data set into files of 1024 bytes. A record is never split in the middle, and the last part can be smaller than the specified size. The value "0" means no split by size is performed | "0" (no split by size)
splitByRecords | Whether to split the output into several files by number of output records. The output files are numbered <file-name>.1, <file-name>.2, <file-name>.3 and so on. Mutually exclusive with splitBySize. A record is never split in the middle, and the last part can include fewer records than specified. The value "0" means no split by records is performed | "0" (no split by records)
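For example, a sketch of an "output" section producing uncompressed text output split into parts of up to 1 GB each ("<USER-PREFIX>" is a placeholder):

{
  "output": {
    "prefix": "<USER-PREFIX>",
    "compression": "none",
    "format": "text",
    "splitBySize": "1g"
  }
}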

Service parameter samples

Transforming a plain text data set

Transform the latest backup of a plain text data set, charset IBM-1047, converted to UTF8 and compressed.
{
  "input" : {
    "name" : "SAMPLE.TEXT",
    "complex" : "group-PLEX1"
  },
  "output" : {
    "format" : "text"
  },
  "source" : {
    "url" : "https://s3.amazonaws.com",
    "api" : "aws-s3",
    "bucket" : "prod-bucket",
    "user" : "sdsdDVDCsxadA43TERVGFBSDSSDff",
    "password" : "ddferdscsdW4REFEBA33DSffss344gbs4efe7"
  }
}

Transforming an unloaded DB2 table

Transform the latest backup of an unloaded DB2 table, charset IBM-1047, converted to UTF8 and compressed, with the output written under a specific prefix:
{
  "input" : {
    "name" : "DB2.UNLOADED.SEQ",
    "complex" : "group-PLEX1"
  },
  "output" : {
    "format" : "text",
    "prefix" : "DBprodCustomers"
  },
  "source" : {
    "url" : "https://s3.amazonaws.com",
    "api" : "aws-s3",
    "bucket" : "prod-bucket",
    "user" : "sdsdDVDCsxadA43TERVGFBSDSSDff",
    "password" : "ddferdscsdW4REFEBA33DSffss344gbs4efe7"
  }
}

Transforming a VSAM file using the defaults

When transforming a VSAM file, the defaults are a text key and binary data, transforming to a JSON output file:
{
  "input" : {
    "name" : "SAMPLE.VSAM",
    "complex" : "group-PLEX1"
  },
  "source" : {
    "url" : "https://s3.amazonaws.com",
    "api" : "aws-s3",
    "bucket" : "prod-bucket",
    "user" : "sdsdDVDCsxadA43TERVGFBSDSSDff",
    "password" : "ddferdscsdW4REFEBA33DSffss344gbs4efe7"
  }
}

Transforming a VSAM text file to CSV

Specify text data, transforming to a CSV output file:
{
  "input" : {
    "name" : "SAMPLE.VSAM",
    "complex" : "group-PLEX1",
    "vsam" : {
      "keyBinary" : "false",
      "keyCharset" : "<CHARSET>"
    }
  },
  "output" : {
    "format" : "CSV"
  },
  "source" : {
    "url" : "https://s3.amazonaws.com",
    "api" : "aws-s3",
    "bucket" : "prod-bucket",
    "user" : "sdsdDVDCsxadA43TERVGFBSDSSDff",
    "password" : "ddferdscsdW4REFEBA33DSffss344gbs4efe7"
  }
}

Transforming on Azure Storage using OAuth2

When transforming data on Azure blob storage with OAuth2, use the azureOauth section to specify the Azure OAuth arguments as follows:
{
  "input" : {
    "name" : "SAMPLE.PS",
    "complex" : "group-PLEX1"
  },
  "output" : {
    "format" : "CSV"
  },
  "source" : {
    "api" : "azureblob",
    "url" : "https://<azure-storage-account>.blob.core.windows.net",
    "bucket" : "<azure-container-name>",
    "user" : "<azure-application-uuid>",
    "password" : "<azure-application-client-secret>",
    "azureOauth" : {
      "oauthEndpoint" : "<azure-oauth-endpoint>",
      "storageAccount" : "<azure-storage-account>",
      "oauthAudience" : "<azure-oauth-audience>",
      "credentialType" : "<azure-credential-type>"
    }
  }
}
Table: Azure OAuth2 Arguments

Field Name | Description | Required | Default Value
---------- | ----------- | -------- | -------------
oauthEndpoint | The OAuth2 endpoint from which an OAuth2 token will be requested. This value usually takes the form https://login.microsoftonline.com/<tenant-id>/oauth2/token | true | N/A
storageAccount | The name of the Azure storage account that contains the Azure Blob container | true | N/A
oauthAudience | OAuth2 audience | false | https://storage.azure.com
credentialType | OAuth2 credential type | false | clientCredentialsSecret
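Since oauthAudience and credentialType have documented defaults, a minimal "azureOauth" section should need only the two required fields (tenant ID and storage account name are placeholders):

{
  "azureOauth" : {
    "oauthEndpoint" : "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    "storageAccount" : "<azure-storage-account>"
  }
}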

Service response and log

The transform service is invoked as an HTTP request. It returns:

HTTP status

Code | Description
---- | -----------
200 | OK
400 | Bad user input or unsupported data set
500 | Unexpected error

HTTP response

{
  "status" : "OK|WARNING|ERROR",
  "outputName" : "<OUTPUT-NAME>",
  "inputName" : "<DSN>",
  "outputCompression" : "none|gzip",
  "outputSizeInBytes" : "<SIZE-IN-BYTES>",
  "outputFormat" : "JSON|text|CSV"
}
Output keyword | Description
-------------- | -----------
status | "OK" - all is well, no log records; "WARNING" - minor problem, e.g. specifying parameters that do not fit the input data set, the log is returned; "ERROR" - major problem, e.g. inability to read the input data or a communication problem, the log is returned
outputName | The object name as it appears in the target object storage
inputName | The input data set name
outputCompression | The compression type as selected in the input parameters / default
outputSizeInBytes | The number of bytes in the output object
outputFormat | The format as selected in the input parameters / default

In case of a WARNING or an ERROR, the HTTP response also contains log messages.
Note
Informational messages are printed only to the service log, not to the HTTP response. The service log can be viewed in the AWS console when executing the service from AWS, or in the Docker log when executing the service on-premises.

Log

{
  "log": [
    "<INFO-MESSAGE>",
    "<WARNING-MESSAGE>",
    "<ERROR-MESSAGE>"
  ]
}

Service response and log samples

Status OK sample

{
  "status" : "OK",
  "outputName" : "transform/QA.SMS.MCBK.SG1QNOBK.DSERV.TXT.TMPPS!uuid=a641d670-2d05-41e7-9dd3-7815e1b2d4c4",
  "inputName" : "QA.SMS.MCBK.SG1QNOBK.DSERV.TXT.TMPPS",
  "outputCompression": "NONE",
  "outputSizeInBytes": 97,
  "outputFormat" : "JSON"
}

Status WARNING sample

{
  "log" : [
    "ZM9K001I Transform service started",
    "ZM9K108W Specifying input parameter vsam is ignored for input data set with DSORG PS",
    "ZM9K002I Transform service completed successfully, output is transform/QA.SMS.MCBK.SG1QNOBK.DSERV.TXT.TMPPS!uuid=d779fbf9-da6b-495b-b6b9-de7583905f19"
  ],
  "status" : "WARNING",
  "outputName" : "transform/QA.SMS.MCBK.SG1QNOBK.DSERV.TXT.TMPPS!uuid=d779fbf9-da6b-495b-b6b9-de7583905f19",
  "inputName" : "QA.SMS.MCBK.SG1QNOBK.DSERV.TXT.TMPPS",
  "outputCompression": "NONE",
  "outputSizeInBytes": 97,
  "outputFormat" : "JSON"
}

Status ERROR sample

{
  "status": "ERROR",
  "log" : [
    "ZM9K001I Transform service started",
    "ZM9K008E The input was not found: name QA.SMS.MCBK.DSERV.TXT.NON, archive false, entry (0)"
  ]
}

Input format support

Supported formats

    SMS-managed data sets
    Non-SMS managed data sets
    Sequential and extended-sequential data sets with the following RECFM:
      V
      VB
      F
      FB
      FBA
    Non-extended VSAM KSDS data sets

Unsupported formats

    RRDS, VRRDS, LINEAR, ESDS
    Extended format data sets with compression or encryption
    PDS data sets
    RECFM not mentioned above (U)

Output format support

    Text
    JSON
    CSV

DB2 Image Copy Transform Guide

Configuration

1. Make sure that <M9_HOME>/scripts/transform-service.sh has execute permissions. If not, add them using chmod a+x <M9_HOME>/scripts/transform-service.sh.
2. Copy M9XFDB2 from Model9's SAMPLIB data set to a PDS data set of your choosing.
3. Edit M9XFDB2 and replace the placeholders enclosed in angle brackets as follows:

Table: Placeholders

Placeholder Name | Replace with ...
---------------- | ----------------
<M9_SAMP> | Your Model9 SAMPLIB data set.
<M9_HOME> | Your Model9 installation directory.
<DB2_SDSNLOAD> | Your DB2 SDSNLOAD data set.
<TABLE_NAME> | The name of the table to be transformed.
<SCHEMA_NAME> | The schema of the table (shown in the CREATOR column of SYSIBM.SYSTABLES).
<DB2_SUBSYS> | The name of the DB2 subsystem.
<XFORM_SVC_URL> | The endpoint URL of the installed transform service.

4. Replace the remaining placeholders in the JCL as described in this manual.

Execute and verify results

When done, submit the job and make sure it ends with MAXCC of 0.
Using SDSF, verify that the transform service was called and completed successfully. Successful output would look something like this:
{
  "status": "OK",
  "outputNames": [
    "transform-output/M9.SHY.DB2.IMGCPY.M9DB.M9SEG4"
  ],
  "inputName": "M9.SHY.DB2.IMGCPY.M9DB.M9SEG4",
  "outputCompression": "NONE",
  "outputSizeInBytes": 1064,
  "outputFormat": "CSV"
}

Supported DB2 Column Types

Table: Supported DB2 Column Types for Transformation

DB2 SQL Code | Name
------------ | ----
392-3 | TIMESTAMP
2448-9 | TIMESTAMP WITH TIME ZONE
384-5 | DATE
388-9 | TIME
452-3 | CHAR
448-9, 456-7 | VARCHAR
480-1 | REAL/FLOAT/DOUBLE
484-5 | DECIMAL/DEC/NUMERIC
492-3 | BIGINT
496-7 | INTEGER/INT
500-1 | SMALLINT