The Omnichannel File Status API provides detailed status information for your Omnichannel files. It returns a File Status that details when a file was imported, how it was processed, and any processing errors.
To use the Omnichannel File Status API, you need an API key and a bearer token.
The bearer token, not the API key, is used to authenticate all API calls; the API key is only used in the authentication call.
See the Getting Started guide to learn how to generate a bearer token from the API key.
A File Status is represented as a JSON object containing the following keys:
Object Name | Type | Description |
---|---|---|
account | string | The name of the AudienceStream account |
profile | string | The name of the profile in the account where your files are uploaded and processed |
created_at | UTC timestamp | The date and time when the file status was created, expressed in UTC format with the designator 'Z' |
file.name | string | The name of the file |
file.size | integer | The file size, in bytes |
file.source-type | string | The service used to upload the file: FTP, SFTP, Tealium S3, or S3 |
file.source-host | string | The name of the service host |
line_count | integer | The number of rows in the file |
status.state | string | Indicates the processing action on the file. Possible values are: DOWNLOADING, DOWNLOADED, PROCESSING, and PROCESSED |
status.timestamp | UTC timestamp | The date and time when the status.state was recorded, expressed in UTC format with the designator 'Z' |
lines_successfully_processed | integer | The number of rows successfully parsed |
lines_skipped | integer | The number of rows that could not be parsed due to missing data |
lines_failed_processing | integer | The number of rows that could not be parsed due to bad data or incorrect formatting |
last_failure | string | The error from a row or file that could not be processed |
The following example shows a sample file status:

```json
{
  "account": "company_xyz",
  "created_at": { "$date": "2017-04-04T22:27:56.993Z" },
  "file": {
    "name": "sales-transaction_2016feb4v50.csv",
    "size": 2744,
    "checksum": "44b12d35ea9fffdeeb69f98b03004f22",
    "source": { "type": "s3", "host": "companyxyz:" },
    "line_count": 35
  },
  "node_id": "bulk_downloader_i-cd504b49",
  "profile": "omnichannelv2",
  "status": [
    { "state": "DOWNLOADING", "timestamp": { "$date": "2017-04-04T22:27:57.285Z" } },
    { "state": "DOWNLOADED", "timestamp": { "$date": "2017-04-04T22:27:57.722Z" } },
    { "state": "PROCESSING", "timestamp": { "$date": "2017-04-04T22:27:57.780Z" }, "lines_skipped": 34 },
    { "state": "PROCESSED", "timestamp": { "$date": "2017-04-04T22:27:58.797Z" } }
  ]
}
```
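The fields above can be inspected with a few lines of code. The following sketch (the sample is trimmed to the fields it uses, and `parse_ts` is a helper name invented here) summarizes the processing states, skipped rows, and total elapsed time from a file status object:

```python
import json
from datetime import datetime

# A file status as returned by the API, abbreviated to the fields used below.
file_status = json.loads("""
{
  "account": "company_xyz",
  "file": {"name": "sales-transaction_2016feb4v50.csv", "line_count": 35},
  "status": [
    {"state": "DOWNLOADING", "timestamp": {"$date": "2017-04-04T22:27:57.285Z"}},
    {"state": "DOWNLOADED", "timestamp": {"$date": "2017-04-04T22:27:57.722Z"}},
    {"state": "PROCESSING", "timestamp": {"$date": "2017-04-04T22:27:57.780Z"}, "lines_skipped": 34},
    {"state": "PROCESSED", "timestamp": {"$date": "2017-04-04T22:27:58.797Z"}}
  ]
}
""")

def parse_ts(entry):
    # Timestamps use the UTC 'Z' designator; fromisoformat needs an explicit offset.
    return datetime.fromisoformat(entry["timestamp"]["$date"].replace("Z", "+00:00"))

states = [s["state"] for s in file_status["status"]]
skipped = sum(s.get("lines_skipped", 0) for s in file_status["status"])
elapsed = parse_ts(file_status["status"][-1]) - parse_ts(file_status["status"][0])

print(states)   # ['DOWNLOADING', 'DOWNLOADED', 'PROCESSING', 'PROCESSED']
print(skipped)  # 34
print(elapsed.total_seconds())
```

Since the status array is recorded in chronological order, the difference between its first and last timestamps gives the total download-and-process time.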
File Count returns a count of uploaded files that share the same file prefix. File Count uses the following query parameters:
Parameter | Description |
---|---|
filePrefix | The file prefix matching the Omnichannel file definition, such as sales-transaction |
startDate | The start date of the range of files to query, in the format 2016-07-31T13:45-0700 |
endDate | The end date of the range of files to query, in the same format |
Use the following GET command to obtain a file count:
```
GET /v2/omnichannel/accounts/{account}/profiles/{profile}/files/count
```
Use the following cURL command for File Count:
```
curl -H 'Authorization: Bearer {token}' \
  'https://api.tealiumiq.com/v2/omnichannel/accounts/{account}/profiles/{profile}/files/count?filePrefix={filePrefix}&startDate={startDate}&endDate={endDate}'
```

Quote the URL so the shell does not interpret the `&` characters in the query string.
If the file count exceeds 50, narrow the date range parameters to reduce the number of results.
The following example shows a typical response generated from the cURL command:

```json
{ "count": 42 }
```
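Because the query string joins several parameters with `&`, the URL must be quoted in a shell and the parameter values URL-encoded. The following sketch builds a File Count URL; the `count_url` helper and the account/profile values are invented for illustration, while the endpoint path comes from this guide:

```python
from urllib.parse import urlencode

# Endpoint path from this guide; {account} and {profile} are filled in below.
BASE = "https://api.tealiumiq.com/v2/omnichannel/accounts/{account}/profiles/{profile}/files/count"

def count_url(account, profile, file_prefix, start_date, end_date):
    """Build a File Count URL; dates use the 2016-07-31T13:45-0700 style format."""
    path = BASE.format(account=account, profile=profile)
    query = urlencode({
        "filePrefix": file_prefix,
        "startDate": start_date,
        "endDate": end_date,
    })  # urlencode percent-encodes the ':' in the timestamps
    return f"{path}?{query}"

# Hypothetical account, profile, and date range:
url = count_url("acme", "main", "sales-transaction",
                "2017-04-01T00:00-0700", "2017-04-30T00:00-0700")
print(url)
```

The resulting string can be passed to any HTTP client along with the `Authorization: Bearer {token}` header.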
Potential error messages for this task are:

Error Message | Description |
---|---|
400 Bad request | The request parameters are invalid, for example a malformed date range |
Use the following GET command to retrieve the status of a single file:
```
GET /v2/omnichannel/accounts/{account}/profiles/{profile}/files/{fileName}
```
Use the following cURL command to retrieve the status of a single file:
```
curl -H 'Authorization: Bearer {token}' \
  'https://api.tealiumiq.com/v2/omnichannel/accounts/{account}/profiles/{profile}/files/{fileName}.csv' \
  -o {fileName}.txt
```
Use distinct, clearly different names for your .csv files; avoid similar file names.
The following example shows a typical response generated from the cURL command:

```json
{
  "_id": { "$oid": "5702ea6db993f8f8cbea334c" },
  "account": "acme",
  "created_at": { "$date": "2016-04-04T22:27:56.993Z" },
  "file": {
    "name": "sales-transaction_2016feb4v50.csv",
    "size": 2744,
    "checksum": "44b12d35ea9fffdeeb69f98b03004f22",
    "source": { "type": "s3", "host": "johndoe:" },
    "line_count": 35
  },
  "node_id": "bulk_downloader_i-cd504b49",
  "profile": "main",
  "status": [
    { "state": "DOWNLOADING", "timestamp": { "$date": "2016-04-04T22:27:57.285Z" } },
    { "state": "DOWNLOADED", "timestamp": { "$date": "2016-04-04T22:27:57.722Z" } },
    { "state": "PROCESSING", "timestamp": { "$date": "2016-04-04T22:27:57.780Z" }, "lines_skipped": 34 },
    { "state": "PROCESSED", "timestamp": { "$date": "2016-04-04T22:27:58.797Z" } }
  ]
}
```
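The row counters in a status array can be reconciled against `file.line_count`. This sketch (the `reconcile_rows` helper is invented here; note the counters appear only on the status entries that report them) sums them up:

```python
def reconcile_rows(file_status):
    """Sum the per-state row counters across a file's status entries."""
    totals = {
        "lines_successfully_processed": 0,
        "lines_skipped": 0,
        "lines_failed_processing": 0,
    }
    for entry in file_status["status"]:
        for key in totals:
            totals[key] += entry.get(key, 0)  # counters are absent on most entries
    return totals

# Abbreviated status mirroring the sample response above:
status = {
    "file": {"line_count": 35},
    "status": [
        {"state": "PROCESSING", "lines_skipped": 34},
        {"state": "PROCESSED"},
    ],
}
print(reconcile_rows(status))
```

Comparing the summed counters to `file.line_count` gives a quick check for rows that were neither processed, skipped, nor failed.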
Potential error messages for this task are:
Error Message | Description |
---|---|
404 Not Found | Supplied file name not found |
400 Bad request | File name exceeds 150 characters |
Multiple File Statuses returns an array of file statuses within the specified date range, and only applies to files with the same file prefix.
Use the following GET command to retrieve the status of multiple files for a specific date range. For startDate and endDate, specify the date and time in the format 2016-07-31T13:45-0700.
```
GET /v2/omnichannel/accounts/{account}/profiles/{profile}/files/search?filePrefix={filePrefix}&startDate={startDate}&endDate={endDate}
```
Use the following cURL command to retrieve the status of multiple files for a specific date range:
```
curl -H 'Authorization: Bearer {token}' \
  'https://api.tealiumiq.com/v2/omnichannel/accounts/{account}/profiles/{profile}/files/search?filePrefix={filePrefix}&startDate={startDate}&endDate={endDate}' \
  -o {outputFile}.txt
```
The following sample shows file statuses for three files whose names consist of the prefix sales-transaction, an underscore ("_"), and a unique date identifier:

```json
[
  {
    "_id": { "$oid": "5702ea6db993f8f8cbea334c" },
    "account": "acme",
    "created_at": { "$date": "2017-04-04T22:27:56.993Z" },
    "file": {
      "name": "sales-transaction_2016feb4v50.csv",
      "size": 2744,
      "checksum": "44b12d35ea9fffdeeb69f98b03004f22",
      "source": { "type": "s3", "host": "johndoe:" },
      "line_count": 35
    },
    "node_id": "bulk_downloader_i-cd504b49",
    "profile": "main",
    "status": [
      { "state": "DOWNLOADING", "timestamp": { "$date": "2017-04-04T22:27:57.285Z" } },
      { "state": "DOWNLOADED", "timestamp": { "$date": "2017-04-04T22:27:57.722Z" } },
      { "state": "PROCESSING", "timestamp": { "$date": "2017-04-04T22:27:57.780Z" }, "lines_skipped": 34 },
      { "state": "PROCESSED", "timestamp": { "$date": "2016-04-04T22:27:58.797Z" } }
    ]
  },
  {
    "_id": { "$oid": "5702ea6bb993f8f8cbea334b" },
    "account": "acme",
    "created_at": { "$date": "2017-04-04T22:27:55.562Z" },
    "file": {
      "name": "sales-transaction_2016feb4v49.csv",
      "size": 2744,
      "checksum": "7b8f92474e220c275bf9931c0337abf3",
      "source": { "type": "s3", "host": "johndoe:" },
      "line_count": 35
    },
    "node_id": "bulk_downloader_i-cd504b49",
    "profile": "main",
    "status": [
      { "state": "DOWNLOADING", "timestamp": { "$date": "2017-04-04T22:27:55.601Z" } },
      { "state": "DOWNLOADED", "timestamp": { "$date": "2017-04-04T22:27:55.887Z" } },
      { "state": "PROCESSING", "timestamp": { "$date": "2017-04-04T22:27:56.045Z" }, "lines_skipped": 34 },
      { "state": "PROCESSED", "timestamp": { "$date": "2017-04-04T22:27:56.669Z" } }
    ]
  },
  {
    "_id": { "$oid": "5702ea69b993f8f8cbea3349" },
    "account": "acme",
    "created_at": { "$date": "2017-04-04T22:27:53.276Z" },
    "file": {
      "name": "sales-transaction_2016feb4v53.csv",
      "size": 2744,
      "checksum": "44b12d35ea9fffdeeb69f98b03004f22",
      "source": { "type": "s3", "host": "johndoe:" },
      "line_count": 35
    },
    "node_id": "bulk_downloader_i-cd504b49",
    "profile": "profile",
    "status": [
      { "state": "DOWNLOADING", "timestamp": { "$date": "2017-04-04T22:27:53.307Z" } },
      { "state": "DOWNLOADED", "timestamp": { "$date": "2017-04-04T22:27:54.249Z" } },
      { "state": "PROCESSING", "timestamp": { "$date": "2017-04-04T22:27:54.289Z" }, "lines_skipped": 34 },
      { "state": "PROCESSED", "timestamp": { "$date": "2017-04-04T22:27:55.480Z" } }
    ]
  }
]
```
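Since each status array is recorded in chronological order, the last entry reflects a file's current state. The following sketch (`latest_state` is a helper name invented here, and the sample data is abbreviated) summarizes a search result by file name:

```python
def latest_state(file_status):
    """Return the most recent state (the status array is in chronological order)."""
    return file_status["status"][-1]["state"]

# Abbreviated search result with two files:
statuses = [
    {"file": {"name": "sales-transaction_2016feb4v50.csv"},
     "status": [{"state": "DOWNLOADING"}, {"state": "PROCESSED"}]},
    {"file": {"name": "sales-transaction_2016feb4v49.csv"},
     "status": [{"state": "DOWNLOADING"}, {"state": "PROCESSING"}]},
]

summary = {s["file"]["name"]: latest_state(s) for s in statuses}
print(summary)
# {'sales-transaction_2016feb4v50.csv': 'PROCESSED', 'sales-transaction_2016feb4v49.csv': 'PROCESSING'}
```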
Potential error messages for this task are:

Error Message | Description |
---|---|
400 Bad request | The request parameters are invalid, for example a malformed date range |
File processing can fail for many reasons, such as invalid service credentials, missing row data, or an internal server error. In the File Status, specific errors are logged in the value of the last_failure key.
The following truncated sample shows the last failure for a file status:

```json
...
{
  "state": "PROCESSING",
  "timestamp": { "$date": "2017-12-05T18:09:54.014Z" },
  "lines_failed_processing": 1,
  "last_failure": "Failed to find attribute for column with id"
},
...
```
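A status array can be scanned for failure details. This sketch (`collect_failures` is a helper invented here) pulls the state and `last_failure` message from any entry that reports one:

```python
def collect_failures(status_entries):
    """Return (state, last_failure) pairs from entries that report a failure."""
    return [(e["state"], e["last_failure"]) for e in status_entries if "last_failure" in e]

# Abbreviated status entries mirroring the truncated sample above:
entries = [
    {"state": "DOWNLOADED"},
    {"state": "PROCESSING", "lines_failed_processing": 1,
     "last_failure": "Failed to find attribute for column with id"},
]

print(collect_failures(entries))
# [('PROCESSING', 'Failed to find attribute for column with id')]
```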
The errors listed in the following sections are specific to Omnichannel files and are different from API endpoint errors.
The following table describes errors specific to downloading and processing, with an explanation and a resolution for each.
Error message | Meaning | Resolution |
---|---|---|
Unknown file prefix, unable to parse file | A file for the supplied prefix (name) could not be found. Either the file does not exist or there is a typographical error in the prefix. | Re-upload the file and double-check the file prefix in the Omnichannel Definitions tab. If the error persists, contact your Tealium Account Manager. |
DBObject of size {###} is over Max BSON size {###} | The file is too large to be processed. | Split your file data into multiple file definitions. |
Failed to download (FTP, SFTP, S3) file | The file could not be downloaded due to errors in your service credentials. | For FTP/SFTP, double-check the host name, user name, and password. For S3/Tealium S3, double-check the access/secret keys and bucket/prefix. |
Invalid connection type – OR – Could not find required (FTP, SFTP, S3) configuration parameters for definition | Your service credentials under the Omnichannel Configuration tab could not be authenticated. | For FTP/SFTP, double-check the host name, user name, and password. For S3/Tealium S3, double-check the access/secret keys and bucket/prefix. |
The following table describes errors caused by incorrect column mappings in the definitions tab, with an explanation and a resolution for each.

Error Message | Meaning | Resolution |
---|---|---|
Failed to find attribute for column with id | AudienceStream could not generate an Omnichannel attribute for the mapped column name. | Replace the offending column header with a fresh instance and re-upload the file. |
Exception thrown while processing file | The value of your mapped column header could not be parsed into the Omnichannel attribute. | Replace the offending column value and re-upload the file. |
Secondary visitor ID column= {column_name_here} does not exist in the file, unable to continue | The column header mapped to the Visitor ID field does not exist in your file, due to a typographical error or an undefined header. | Ensure that the column header supplied matches the column header in the file. |
{date_format_here} could not be parsed as a Date | The date format specified for the mapped column name does not exist in your file. | Ensure that the date format you supplied matches the date format in the file. |
The number of columns to be processed {##} must match the number of CellProcessors {##}: check that the number of CellProcessors you have defined matches the expected number of columns being read/written | The column names mapped under Omnichannel attributes do not exist in your file, due to a typographical error or an undefined header. | Ensure that the column headers supplied match the column headers in the file. |
The following errors occur when the servers processing your files are unable to communicate with each other. To resolve these errors, re-upload your files.

- `Connection is already closed due to connection error; cause: com.rabbitmq.client.MissedHeartbeatException`
- `Can't connect to new replica set master [##.#.#.###:###], err: couldn't connect to server [##.#.#.###:###], error querying server`
- `DBCClientBase::findN: transport error`
- `socket exception [SEND_ERROR]`
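Because these errors are transient, a client that automates uploads may simply retry with backoff. The following sketch is illustrative only: `is_transient` and `retry_upload` are invented helpers, the marker strings are taken from the messages above, and it assumes the caller supplies an upload function that raises RuntimeError with the server's message.

```python
import time

# Substrings taken from the transient error messages listed above.
TRANSIENT_MARKERS = (
    "MissedHeartbeatException",
    "couldn't connect to server",
    "transport error",
    "SEND_ERROR",
)

def is_transient(message):
    """True if the error message matches a known transient server error."""
    return any(marker in message for marker in TRANSIENT_MARKERS)

def retry_upload(upload, attempts=3, delay=1.0):
    """Call upload(), retrying transient errors with exponential backoff."""
    for attempt in range(attempts):
        try:
            return upload()
        except RuntimeError as err:
            if not is_transient(str(err)) or attempt == attempts - 1:
                raise  # non-transient, or out of retries
            time.sleep(delay * (2 ** attempt))
```

For example, `retry_upload(my_upload_function)` would re-attempt only when the failure message matches one of the markers, and re-raise anything else immediately.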
Copyright All Rights Reserved © 2008-2022