tokenizationJob

Get all tokenization jobs

Security: api_key
Request
query Parameters
page_number
integer <int64>
Default: 1

The page number for which to get tokenization jobs. This will default to the first page if excluded.

page_size
integer <int64>

The maximum number of objects to return. This will default to the DefaultApiPageSize setting if not provided.

environment_id
integer <int32>

The ID of the environment to get all tokenization jobs from.

Responses
200

Success

400

Bad request

403

Forbidden access

404

Not found

GET /tokenization-jobs
Response samples
application/json
{
  "_pageInfo": {},
  "responseList": []
}
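
The paging parameters above can be assembled into a request without sending it. The sketch below builds the GET request pieces; the base URL and the `Authorization` header name are assumptions for illustration and may differ on a real engine:

```python
from urllib.parse import urlencode

def build_list_jobs_request(api_key, page_number=1, page_size=None, environment_id=None,
                            base_url="https://masking-engine.example.com/masking/api"):
    """Build the GET /tokenization-jobs request pieces (hypothetical base URL)."""
    params = {"page_number": page_number}  # defaults to the first page
    if page_size is not None:
        params["page_size"] = page_size
    if environment_id is not None:
        params["environment_id"] = environment_id
    return {
        "method": "GET",
        "url": f"{base_url}/tokenization-jobs?{urlencode(params)}",
        "headers": {"Authorization": api_key, "Accept": "application/json"},
    }

req = build_list_jobs_request("my-api-key", page_size=25, environment_id=3)
# req["url"] ends with: /tokenization-jobs?page_number=1&page_size=25&environment_id=3
```

Omitted optional parameters are simply left out of the query string, letting the engine apply its documented defaults.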

Create tokenization job

Security: api_key
Request
Request Body schema: application/json
required

The tokenization job to create

jobName
required
string <= 255 characters

The name of the tokenization job. Once the tokenization job is created, this field cannot be changed.

rulesetId
required
integer <int32>

The ID of the ruleset that this tokenization job is based on. Once the tokenization job is created, the underlying environment that is inferred from the ruleset will be unchangeable. That is, the job can only be updated to reference a ruleset that is in the same environment as the environment of the original ruleset.

email
string

The email address to send job status notifications to; note that the SMTP settings must be configured first to receive notifications.

feedbackSize
integer <int32> >= 1

The granularity with which the Masking Engine provides updates on the progress of the tokenization job. For instance, a feedbackSize of 50000 results in log updates whenever 50000 rows are processed during the masking phase.

jobDescription
string <= 255 characters

A description of the job.

maxMemory
integer <int32>

The maximum amount of memory, in MB, that the tokenization job can consume during execution.

minMemory
integer <int32>

The minimum amount of memory, in MB, that the tokenization job can consume during execution.

multiTenant
boolean
Default: false

This field determines whether the tokenization job, after creation, can be executed using a connector that is different from the underlying connector associated with the ruleset that this tokenization job is based on.

numInputStreams
integer <int32> >= 1
Default: 1

This field controls the amount of parallelism that the tokenization job uses to extract the data to be masked. For instance, when masking a database, specifying 5 input streams results in the tokenization job reading up to 5 database tables in parallel and then masking those 5 streams of data in parallel. The higher the value of this field, the more potential parallelism there will be in the job, but the tokenization job will consume more memory. If the number of input streams exceeds the number of units being masked (e.g. tables or files), the excess streams will do nothing.

onTheFlyMasking
boolean
Default: false

This field determines whether the tokenization job will be performed InPlace or OnTheFly. The process for InPlace masking is to read out the data to be masked, mask the data, and then load the masked data back into the original data source. The process for OnTheFly masking is to read out the data to be masked, mask the data, and then load the masked data back into a different data source. When masking OnTheFly, the field 'onTheFlyMaskingSource' must be provided.

databaseMaskingOptions
object (DatabaseMaskingOptions)

onTheFlyMaskingSource
object (OnTheFlyMaskingSource)
failImmediately
boolean
Default: false

This field determines whether the masking job will fail immediately or delay failure until job completion when a masking algorithm fails to mask its data. Setting this value to 'false' provides a means for a user to see all cumulative masking errors before the job is marked as failed.

Array of objects (JobTask)

This field determines what tasks to perform before/after a job from a set of available driver support tasks as indicated by the chosen target ruleset/connector.

streamRowLimit
integer >= -1

This value constrains the total number of rows that may enter the job for each masking stream. A setting of 0 means unlimited. A value of -1 selects the default value. The default value for this setting varies by job type. The minimum explicit value allowed is 20.

Responses
201

Success

400

Bad request

403

Forbidden access

404

Not found

409

Conflict

POST /tokenization-jobs
Request samples
application/json
{
  "jobName": "some_tokenization_job",
  "rulesetId": 7,
  "jobDescription": "This example illustrates a TokenizationJob with just a handful of the possible fields set. It is meant to exemplify a simple JSON body that can be passed to the endpoint to create a TokenizationJob.",
  "feedbackSize": 100000,
  "onTheFlyMasking": false,
  "databaseMaskingOptions": {}
}
Response samples
application/json
{
  "jobName": "some_tokenization_job",
  "rulesetId": 7,
  "jobDescription": "This example illustrates a TokenizationJob with just a handful of the possible fields set. It is meant to exemplify a simple JSON body that can be passed to the endpoint to create a TokenizationJob.",
  "feedbackSize": 100000,
  "onTheFlyMasking": false,
  "databaseMaskingOptions": {}
}
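
Several of the body constraints above (jobName length, feedbackSize and numInputStreams minimums, the streamRowLimit sentinel values, the onTheFlyMaskingSource requirement) can be checked client-side before the POST. A minimal validation sketch; the helper name is hypothetical and not part of this API:

```python
import json

def build_create_job_body(job_name, ruleset_id, **optional):
    """Validate a tokenization-job body against the documented schema constraints."""
    if not job_name or len(job_name) > 255:
        raise ValueError("jobName is required and limited to 255 characters")
    if "feedbackSize" in optional and optional["feedbackSize"] < 1:
        raise ValueError("feedbackSize must be >= 1")
    if "numInputStreams" in optional and optional["numInputStreams"] < 1:
        raise ValueError("numInputStreams must be >= 1")
    limit = optional.get("streamRowLimit")
    if limit is not None and limit not in (-1, 0) and limit < 20:
        raise ValueError("streamRowLimit must be -1 (default), 0 (unlimited), or >= 20")
    if optional.get("onTheFlyMasking") and "onTheFlyMaskingSource" not in optional:
        raise ValueError("onTheFlyMaskingSource is required when onTheFlyMasking is true")
    return json.dumps({"jobName": job_name, "rulesetId": ruleset_id, **optional})

body = build_create_job_body("some_tokenization_job", 7, feedbackSize=100000)
```

Catching these locally turns a 400 round-trip into an immediate, descriptive error.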

Get tokenization job by ID

Security: api_key
Request
path Parameters
tokenizationJobId
required
integer <int32>

The ID of the tokenization job to get.

Responses
200

Success

400

Bad request

403

Forbidden access

404

Not found

GET /tokenization-jobs/{tokenizationJobId}
Response samples
application/json
{
  "jobName": "some_tokenization_job",
  "rulesetId": 7,
  "jobDescription": "This example illustrates a TokenizationJob with just a handful of the possible fields set. It is meant to exemplify a simple JSON body that can be passed to the endpoint to create a TokenizationJob.",
  "feedbackSize": 100000,
  "onTheFlyMasking": false,
  "databaseMaskingOptions": {}
}
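
The path parameter is an int32, so a thin client can reject obviously invalid IDs before hitting the 400/404 cases. A sketch (base URL and header name are placeholders, not values defined by this API):

```python
def build_get_job_request(tokenization_job_id, api_key,
                          base_url="https://masking-engine.example.com/masking/api"):
    """Build the GET request for one tokenization job, validating the path parameter."""
    if not isinstance(tokenization_job_id, int) or not (0 <= tokenization_job_id < 2**31):
        raise ValueError("tokenizationJobId must be a non-negative int32")
    return {
        "method": "GET",
        "url": f"{base_url}/tokenization-jobs/{tokenization_job_id}",
        "headers": {"Authorization": api_key, "Accept": "application/json"},
    }

req = build_get_job_request(42, "my-api-key")
# req["url"] ends with: /tokenization-jobs/42
```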

Update tokenization job by ID

Security: api_key
Request
path Parameters
tokenizationJobId
required
integer <int32>

The ID of the tokenization job to update.

Request Body schema: application/json
required

The updated tokenization job

jobName
required
string <= 255 characters

The name of the tokenization job. Once the tokenization job is created, this field cannot be changed.

rulesetId
required
integer <int32>

The ID of the ruleset that this tokenization job is based on. Once the tokenization job is created, the underlying environment that is inferred from the ruleset will be unchangeable. That is, the job can only be updated to reference a ruleset that is in the same environment as the environment of the original ruleset.

email
string

The email address to send job status notifications to; note that the SMTP settings must be configured first to receive notifications.

feedbackSize
integer <int32> >= 1

The granularity with which the Masking Engine provides updates on the progress of the tokenization job. For instance, a feedbackSize of 50000 results in log updates whenever 50000 rows are processed during the masking phase.

jobDescription
string <= 255 characters

A description of the job.

maxMemory
integer <int32>

The maximum amount of memory, in MB, that the tokenization job can consume during execution.

minMemory
integer <int32>

The minimum amount of memory, in MB, that the tokenization job can consume during execution.

multiTenant
boolean
Default: false

This field determines whether the tokenization job, after creation, can be executed using a connector that is different from the underlying connector associated with the ruleset that this tokenization job is based on.

numInputStreams
integer <int32> >= 1
Default: 1

This field controls the amount of parallelism that the tokenization job uses to extract the data to be masked. For instance, when masking a database, specifying 5 input streams results in the tokenization job reading up to 5 database tables in parallel and then masking those 5 streams of data in parallel. The higher the value of this field, the more potential parallelism there will be in the job, but the tokenization job will consume more memory. If the number of input streams exceeds the number of units being masked (e.g. tables or files), the excess streams will do nothing.

onTheFlyMasking
boolean
Default: false

This field determines whether the tokenization job will be performed InPlace or OnTheFly. The process for InPlace masking is to read out the data to be masked, mask the data, and then load the masked data back into the original data source. The process for OnTheFly masking is to read out the data to be masked, mask the data, and then load the masked data back into a different data source. When masking OnTheFly, the field 'onTheFlyMaskingSource' must be provided.

databaseMaskingOptions
object (DatabaseMaskingOptions)

onTheFlyMaskingSource
object (OnTheFlyMaskingSource)
failImmediately
boolean
Default: false

This field determines whether the masking job will fail immediately or delay failure until job completion when a masking algorithm fails to mask its data. Setting this value to 'false' provides a means for a user to see all cumulative masking errors before the job is marked as failed.

Array of objects (JobTask)

This field determines what tasks to perform before/after a job from a set of available driver support tasks as indicated by the chosen target ruleset/connector.

streamRowLimit
integer >= -1

This value constrains the total number of rows that may enter the job for each masking stream. A setting of 0 means unlimited. A value of -1 selects the default value. The default value for this setting varies by job type. The minimum explicit value allowed is 20.

Responses
200

Success

400

Bad request

403

Forbidden access

404

Not found

PUT /tokenization-jobs/{tokenizationJobId}
Request samples
application/json
{
  "jobName": "some_tokenization_job",
  "rulesetId": 7,
  "jobDescription": "This example illustrates a TokenizationJob with just a handful of the possible fields set. It is meant to exemplify a simple JSON body that can be passed to the endpoint to create a TokenizationJob.",
  "feedbackSize": 100000,
  "onTheFlyMasking": false,
  "databaseMaskingOptions": {}
}
Response samples
application/json
{
  "jobName": "some_tokenization_job",
  "rulesetId": 7,
  "jobDescription": "This example illustrates a TokenizationJob with just a handful of the possible fields set. It is meant to exemplify a simple JSON body that can be passed to the endpoint to create a TokenizationJob.",
  "feedbackSize": 100000,
  "onTheFlyMasking": false,
  "databaseMaskingOptions": {}
}
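
Because PUT takes the full object and jobName is immutable after creation, the usual pattern is read-modify-write: fetch the current job, merge changes, and reject a jobName change client-side. A sketch (the ruleset-environment restriction can only be enforced server-side, so it is not checked here):

```python
def build_update_body(existing_job, **changes):
    """Merge changes into the current job state for a full-object PUT.

    jobName cannot change after creation; the new ruleset must also be in the
    same environment as the original, which the engine itself enforces.
    """
    if "jobName" in changes and changes["jobName"] != existing_job["jobName"]:
        raise ValueError("jobName cannot be changed after the job is created")
    return {**existing_job, **changes}

current = {"jobName": "some_tokenization_job", "rulesetId": 7, "feedbackSize": 100000}
updated = build_update_body(current, feedbackSize=50000)
# updated == {"jobName": "some_tokenization_job", "rulesetId": 7, "feedbackSize": 50000}
```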

Delete tokenization job by ID

Security: api_key
Request
path Parameters
tokenizationJobId
required
integer <int32>

The ID of the tokenization job to delete.

Responses
200

Success

400

Bad request

403

Forbidden access

404

Not found

DELETE /tokenization-jobs/{tokenizationJobId}
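
The four documented response codes can be handled explicitly rather than falling through a generic error path. A minimal sketch; the mapping mirrors the tables above, while the raise-on-anything-else strategy is an assumption:

```python
# Status codes documented for these endpoints and their meanings.
DOCUMENTED_STATUSES = {
    200: "Success",
    400: "Bad request",
    403: "Forbidden access",
    404: "Not found",
}

def interpret_response(status_code):
    """Return the documented meaning of a status code, or raise if undocumented."""
    meaning = DOCUMENTED_STATUSES.get(status_code)
    if meaning is None:
        raise RuntimeError(f"undocumented status code: {status_code}")
    return meaning
```

For the POST endpoint the success code is 201 rather than 200, and 409 (Conflict) is also possible, so a create-specific handler would extend this table accordingly.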