Omnichannel File Status (v1 API)


The Omnichannel File Status API provides detailed status information for your Omnichannel Files. The API is available to AudienceStream-enabled accounts only. This article assumes you have already uploaded the Omnichannel Files in your AudienceStream profile.


File Status Fields

The API returns a File Status that details when a file was imported and how it was processed, including any processing errors. A File Status is represented as a JSON object containing the following keys:

Name | Type | Description
account | string | Name of the AudienceStream account
profile | string | Name of the profile (within the account) where your files are uploaded and processed
created_at | UTC timestamp | Date and time when the file status was created (expressed in UTC format with the designator 'Z')
file.name | string | Name of the file in question
file.size | integer | File size in bytes
file.source.type | string | Service used for uploading the file: FTP, SFTP, Tealium S3, or S3
file.source.host | string | Name of the service: the hosting account for FTP/SFTP, or the bucket name for S3/Tealium S3
file.line_count | integer | Number of rows in the file
status.state | string | Indicates the processing action on the file. Possible values are DOWNLOADING, DOWNLOADED, PROCESSING, and PROCESSED.
status.timestamp | UTC timestamp | Date and time when the status.state was recorded (expressed in UTC format with the designator 'Z')
lines_successfully_processed | integer | Number of rows parsed successfully
lines_skipped | integer | Number of rows that could not be parsed due to missing data
lines_failed_processing | integer | Number of rows that could not be parsed due to bad data or incorrect formatting
last_failure | string | Error that prevented the row or file from being processed (see the Omnichannel File Errors section below)
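
Note that the timestamp fields arrive as ISO 8601 strings wrapped in a "$date" object, as shown in the sample below. If you need native datetime values, a minimal Python sketch like the following works; the raw value is copied from the sample response and is illustrative only.

from datetime import datetime, timezone

# The raw value as it appears inside a "$date" wrapper in a file status.
raw = "2016-04-04T22:27:56.993Z"

# Parse the UTC timestamp; the trailing 'Z' designates UTC.
created_at = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=timezone.utc)
print(created_at.isoformat())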

Sample File Status

{
    "account": "company_xyz",
    "created_at": {
      "$date": "2016-04-04T22:27:56.993Z"
    },
    "file": {
      "name": "sales-transaction_2016feb4v50.csv",
      "size": 2744,
      "checksum": "44b12d35ea9fffdeeb69f98b03004f22",
      "source": {
        "type": "s3",
        "host": "companyxyz:"
      },
      "line_count": 35
    },
    "node_id": "bulk_downloader_i-cd504b49",
    "profile": "omnichannelv2",
    "status": [
      {
        "state": "DOWNLOADING",
        "timestamp": {
          "$date": "2016-04-04T22:27:57.285Z"
        }
      },
      {
        "state": "DOWNLOADED",
        "timestamp": {
          "$date": "2016-04-04T22:27:57.722Z"
        }
      },
      {
        "state": "PROCESSING",
        "timestamp": {
          "$date": "2016-04-04T22:27:57.780Z"
        },
        "lines_skipped": 34

      },
      {
        "state": "PROCESSED",
        "timestamp": {
          "$date": "2016-04-04T22:27:58.797Z"
        }
      }
    ]
}
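
As an illustration of how these fields fit together, here is a minimal Python sketch (not part of the API) that summarizes a file status like the one above: it reports the latest processing state and any skipped or failed line counts. The file name file_status.json and the helper name summarize_file_status are assumptions made for this example.

import json

def summarize_file_status(status_doc):
    """Report the latest processing state and any line-level issues for one file status."""
    file_info = status_doc.get("file", {})
    entries = status_doc.get("status", [])
    latest = entries[-1] if entries else {}

    print("File:   ", file_info.get("name"))
    print("Rows:   ", file_info.get("line_count"))
    print("State:  ", latest.get("state"), "at", latest.get("timestamp", {}).get("$date"))

    # Skipped/failed counts and last_failure appear on the PROCESSING entry when present.
    for entry in entries:
        if entry.get("state") == "PROCESSING":
            print("Skipped:", entry.get("lines_skipped", 0))
            print("Failed: ", entry.get("lines_failed_processing", 0))
            if "last_failure" in entry:
                print("Error:  ", entry["last_failure"])

# Example: load a local copy of the sample above and summarize it.
with open("file_status.json") as f:
    summarize_file_status(json.load(f))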

File Count

Returns the number of uploaded files that match the given file prefix within the specified date range.

The following query parameters are used:

  • filePrefix - the file prefix matching the Omnichannel file definition, e.g. "sales_transactions"
  • startDate - the start of the date range of files to query, e.g. "2017-01-01T12:34Z"
  • endDate - the end of the date range of files to query, e.g. "2017-01-08T12:34Z"

GET /v1/omnichannel/accounts/{account}/profiles/{profile}/files/count?utk={utk token}&filePrefix={file prefix}&startDate={start date}&endDate={end date}

Curl Request

curl -i -b JSESSIONID={session_id} \
  "https://api.tealiumiq.com/v1/omnichannel/accounts/{account}/profiles/{profile}/files/count?utk={utk token}&filePrefix={file prefix}&startDate={start date}&endDate={end date}"

If the file count exceeds 50, try narrowing the date range parameters.

Example Response

{ 
    "count" : 42 
}
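
The same request can be issued from a script. The sketch below uses the Python requests library and assumes you already have a valid JSESSIONID session cookie and utk token; the account, profile, prefix, and date values are placeholders.

import requests

# Illustrative placeholders; substitute your own values.
ACCOUNT = "company_xyz"
PROFILE = "main"
SESSION_ID = "your-jsessionid"
UTK = "your-utk-token"

url = (
    "https://api.tealiumiq.com/v1/omnichannel/accounts/"
    f"{ACCOUNT}/profiles/{PROFILE}/files/count"
)
params = {
    "utk": UTK,
    "filePrefix": "sales_transactions",
    "startDate": "2017-01-01T12:34Z",
    "endDate": "2017-01-08T12:34Z",
}

resp = requests.get(url, params=params, cookies={"JSESSIONID": SESSION_ID})
resp.raise_for_status()
print(resp.json()["count"])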

Error Messages

Error Type | Description
400 Bad Request | Returned when any of the following apply:
  • The file definition prefix exceeds 150 characters
  • The file definition prefix, start date, or end date is not supplied
  • The start or end date format is invalid
  • The end date is before the start date


Single File Status

GET /v1/omnichannel/accounts/{account}/profiles/{profile}/files/{file name}?utk={utk token}

Curl Request

curl -i -b JSESSIONID={session_id} \
  "https://api.tealiumiq.com/v1/omnichannel/accounts/{account}/profiles/{profile}/files/{filename}.csv?utk={utk token}" \
  -o {filename}.txt

Example Response

{
    "_id": {
      "$oid": "5702ea6db993f8f8cbea334c"
    },
    "account": "acme",
    "created_at": {
      "$date": "2016-04-04T22:27:56.993Z"
    },
    "file": {
      "name": "sales-transaction_2016feb4v50.csv",
      "size": 2744,
      "checksum": "44b12d35ea9fffdeeb69f98b03004f22",
      "source": {
        "type": "s3",
        "host": "johndoe:"
      },
      "line_count": 35
    },
    "node_id": "bulk_downloader_i-cd504b49",
    "profile": "main",
    "status": [
      {
        "state": "DOWNLOADING",
        "timestamp": {
          "$date": "2016-04-04T22:27:57.285Z"
        }
      },
      {
        "state": "DOWNLOADED",
        "timestamp": {
          "$date": "2016-04-04T22:27:57.722Z"
        }
      },
      {
        "state": "PROCESSING",
        "timestamp": {
          "$date": "2016-04-04T22:27:57.780Z"
        },
        "lines_skipped": 34
      },
      {
        "state": "PROCESSED",
        "timestamp": {
          "$date": "2016-04-04T22:27:58.797Z"
        }
      }
    ]
  }
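
For completeness, here is an equivalent of the curl request above as a minimal Python sketch using the requests library. The account, profile, file name, session cookie, and utk token values are placeholders to replace with your own.

import requests

ACCOUNT = "acme"                 # placeholder values
PROFILE = "main"
FILE_NAME = "sales-transaction_2016feb4v50.csv"
SESSION_ID = "your-jsessionid"
UTK = "your-utk-token"

url = (
    "https://api.tealiumiq.com/v1/omnichannel/accounts/"
    f"{ACCOUNT}/profiles/{PROFILE}/files/{FILE_NAME}"
)
resp = requests.get(url, params={"utk": UTK}, cookies={"JSESSIONID": SESSION_ID})
resp.raise_for_status()

# Print the state transitions for this file, mirroring the example response above.
status_doc = resp.json()
for entry in status_doc["status"]:
    print(entry["state"], entry["timestamp"]["$date"])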

Error Messages

Error Type | Description
404 Not Found | Supplied file name not found
400 Bad Request | File name exceeds 150 characters

Multiple File Statuses

Returns an array of file statuses within the specified date range. Only applies to files with the same file prefix.

GET /v1/omnichannel/accounts/{account}/profiles/{profile}/files/search?utk={utk token}&filePrefix={file prefix}&startDate={start date}&endDate={end date}

Curl Request

curl -i -b JSESSIONID={session_id} \
  "https://api.tealiumiq.com/v1/omnichannel/accounts/{account}/profiles/{profile}/files/search?utk={utk token}&filePrefix={file prefix}&startDate={start date}&endDate={end date}" \
  -o {filename}.txt

Example Response

Below is a sample response containing file statuses for three files, all prefixed with "sales-transaction". Each file name consists of the prefix followed by "_" and a unique date identifier. Each entry contains information about the file size, line count (i.e., number of rows), and number of lines processed, including any processing errors.

[
  {
    "_id": {
      "$oid": "5702ea6db993f8f8cbea334c"
    },
    "account": "acme",
    "created_at": {
      "$date": "2016-04-04T22:27:56.993Z"
    },
    "file": {
      "name": "sales-transaction_2016feb4v50.csv",
      "size": 2744,
      "checksum": "44b12d35ea9fffdeeb69f98b03004f22",
      "source": {
        "type": "s3",
        "host": "johndoe:"
      },
      "line_count": 35
    },
    "node_id": "bulk_downloader_i-cd504b49",
    "profile": "main",
    "status": [
      {
        "state": "DOWNLOADING",
        "timestamp": {
          "$date": "2016-04-04T22:27:57.285Z"
        }
      },
      {
        "state": "DOWNLOADED",
        "timestamp": {
          "$date": "2016-04-04T22:27:57.722Z"
        }
      },
      {
        "state": "PROCESSING",
        "timestamp": {
          "$date": "2016-04-04T22:27:57.780Z"
        },
        "lines_skipped": 34
      },
      {
        "state": "PROCESSED",
        "timestamp": {
          "$date": "2016-04-04T22:27:58.797Z"
        }
      }
    ]
  },
  {
    "_id": {
      "$oid": "5702ea6bb993f8f8cbea334b"
    },
    "account": "acme",
    "created_at": {
      "$date": "2016-04-04T22:27:55.562Z"
    },
    "file": {
      "name": "sales-transaction_2016feb4v49.csv",
      "size": 2744,
      "checksum": "7b8f92474e220c275bf9931c0337abf3",
      "source": {
        "type": "s3",
        "host": "johndoe:"
      },
      "line_count": 35
    },
    "node_id": "bulk_downloader_i-cd504b49",
    "profile": "main",
    "status": [
      {
        "state": "DOWNLOADING",
        "timestamp": {
          "$date": "2016-04-04T22:27:55.601Z"
        }
      },
      {
        "state": "DOWNLOADED",
        "timestamp": {
          "$date": "2016-04-04T22:27:55.887Z"
        }
      },
      {
        "state": "PROCESSING",
        "timestamp": {
          "$date": "2016-04-04T22:27:56.045Z"
        },
        "lines_skipped": 34
      },
      {
        "state": "PROCESSED",
        "timestamp": {
          "$date": "2016-04-04T22:27:56.669Z"
        }
      }
    ]
  },
  {
    "_id": {
      "$oid": "5702ea69b993f8f8cbea3349"
    },
    "account": "acme",
    "created_at": {
      "$date": "2016-04-04T22:27:53.276Z"
    },
    "file": {
      "name": "sales-transaction_2016feb4v53.csv",
      "size": 2744,
      "checksum": "44b12d35ea9fffdeeb69f98b03004f22",
      "source": {
        "type": "s3",
        "host": "johndoe:"
      },
      "line_count": 35
    },
    "node_id": "bulk_downloader_i-cd504b49",
    "profile": "profile",
    "status": [
      {
        "state": "DOWNLOADING",
        "timestamp": {
          "$date": "2016-04-04T22:27:53.307Z"
        }
      },
      {
        "state": "DOWNLOADED",
        "timestamp": {
          "$date": "2016-04-04T22:27:54.249Z"
        }
      },
      {
        "state": "PROCESSING",
        "timestamp": {
          "$date": "2016-04-04T22:27:54.289Z"
        },
        "lines_skipped": 34
      },
      {
        "state": "PROCESSED",
        "timestamp": {
          "$date": "2016-04-04T22:27:55.480Z"
        }
      }
    ]
  }
]
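
To review a batch of uploads at once, the search response can be reduced to a short per-file summary. The following Python sketch (requests library) is illustrative only; the account, profile, prefix, date range, session cookie, and utk token are placeholder assumptions.

import requests

ACCOUNT = "acme"                 # placeholder values
PROFILE = "main"
SESSION_ID = "your-jsessionid"
UTK = "your-utk-token"

url = (
    "https://api.tealiumiq.com/v1/omnichannel/accounts/"
    f"{ACCOUNT}/profiles/{PROFILE}/files/search"
)
params = {
    "utk": UTK,
    "filePrefix": "sales-transaction",
    "startDate": "2016-04-01T00:00Z",
    "endDate": "2016-04-08T00:00Z",
}
resp = requests.get(url, params=params, cookies={"JSESSIONID": SESSION_ID})
resp.raise_for_status()

# One line per file: name, total rows, and rows skipped or failed during PROCESSING.
for doc in resp.json():
    processing = next((e for e in doc["status"] if e["state"] == "PROCESSING"), {})
    print(
        doc["file"]["name"],
        doc["file"]["line_count"],
        processing.get("lines_skipped", 0),
        processing.get("lines_failed_processing", 0),
    )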

Error Messages

Error Type | Description
400 Bad Request | Returned when any of the following apply:
  • The file definition prefix exceeds 150 characters
  • The file definition prefix, start date, or end date is not supplied
  • The start or end date format is invalid
  • The end date is before the start date

 

Omnichannel File Errors

File processing can fail for many reasons: invalid service credentials, missing row data, an internal server error, and so on. In the File Status, the specific error is logged as the value of the last_failure key.

Here's a truncated sample of a file status:

    ... 
    {
        "state": "PROCESSING",
        "timestamp": {
            "$date": "2016-10-05T18:09:54.014Z"
        },
        "lines_failed_processing": 1,
        "last_failure": "Failed to find attribute for column with id"
    },     
    ...
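
When troubleshooting, it can help to pull every last_failure message out of a set of file statuses and compare them against the glossary below. The following is a minimal Python sketch, assuming the statuses have already been retrieved (for example, via the search endpoint above) and parsed into a list of dictionaries; the collect_failures helper is illustrative, not part of the API.

def collect_failures(status_docs):
    """Map each file name to the last_failure messages recorded in its status entries."""
    failures = {}
    for doc in status_docs:
        messages = [
            entry["last_failure"]
            for entry in doc.get("status", [])
            if "last_failure" in entry
        ]
        if messages:
            failures[doc["file"]["name"]] = messages
    return failures

# Example: print files that need attention.
# for name, messages in collect_failures(status_docs).items():
#     print(name, "->", "; ".join(messages))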

 

The errors listed below are specific to Omnichannel Files; they are distinct from the API endpoint errors described above.

Processing and Download Errors

Error Message | What It Means | How to Resolve
Unknown file prefix, unable to parse file | The file for the supplied prefix (name) could not be found; either the file does not exist or there is a typo in the prefix. | Re-upload the file and double-check the file prefix in the Omnichannel Definitions tab. If the error persists, contact your Tealium Account Manager.
DBObject of size {###} is over Max BSON size {###} | The file is too large to be processed. | Split your file data into multiple file definitions.
Failed to download (FTP, SFTP, S3) file | The file could not be downloaded due to errors in your service credentials (see the Omnichannel Configuration tab). | For FTP/SFTP, double-check the host name, user name, and password. For S3/Tealium S3, double-check the access/secret keys and bucket/prefix.
Invalid connection type OR Could not find required (FTP, SFTP, S3) configuration parameters for definition | Your service credentials (under the Omnichannel Configuration tab) could not be authenticated. | For FTP/SFTP, double-check the host name, user name, and password. For S3/Tealium S3, double-check the access/secret keys and bucket/prefix.

Configuration and Definition Errors

These errors are caused by incorrect column mappings in the Definitions tab.

Error Message | What It Means | How to Resolve
Failed to find attribute for column with id | AudienceStream could not generate an Omnichannel Attribute for the column name you mapped. | Replace the offending column header with a fresh instance and re-upload the file.
Exception thrown while processing file | The value of your mapped column header could not be parsed into the Omnichannel Attribute. | Replace the offending column value and re-upload the file.
Secondary visitor ID column= {column_name_here} does not exist in the file, unable to continue | The column header you mapped to the Visitor ID field does not exist in your file, either because of a typo or because the header is not defined. | Make sure the column header you supplied matches what is in the file.
{date_format_here} could not be parsed as a Date | The date format you specified for the mapped column does not match the dates in your file. | Make sure the date format you supplied matches what is in the file.
The number of columns to be processed {##} must match the number of CellProcessors {##}: check that the number of CellProcessors you have defined matches the expected number of columns being read/written | The column names you mapped under Omnichannel Attributes do not exist in your file, either because of a typo or because the headers are not defined. | Make sure the column headers you supplied match what is in the file.

Internal Server Errors

These errors occur when the different servers processing your files are unable to communicate with each other.

  • Connection is already closed due to connection error; cause: com.rabbitmq.client.MissedHeartbeatException
  • Can't connect to new replica set master [##.#.#.###:###], err: couldn't connect to server [##.#.#.###:###], error querying server
  • DBClientBase::findN: transport error
  • socket exception [SEND_ERROR]

Resolution: Re-upload your file(s).