This article describes how to import CSV files as a data source, allowing offline data to become enriched and actionable within the Customer Data Hub.
The File Import feature provides the ability to import a CSV file to supplement online visitor profiles with valuable offline data. Using this feature, you can import a CSV file using a file transfer service, such as an Amazon S3 Bucket. The Tealium server connects to the file service, reads the file and ingests the data. Each imported row is processed as an event. Once ingested, the data can then be enriched, stitched to existing visitor profiles, and sent to other vendors.
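For illustration, a small offline-purchases file might look like the following (the column names and values are hypothetical):

```
email,postalCode,order_total
jane@example.com,92121,54.99
sam@example.com,10001,12.50
```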
There are two components used in setting up a file import: a column mapping configuration and a file transfer service.
The column mapping configuration determines the event attributes that correspond to each column in the CSV file. The column names are often different from the attribute names in the Customer Data Hub, so this mapping ensures that the data is imported properly. For example, a CSV file might have a column named postalCode, but the matching event attribute is named customer_zip, so a column mapping is needed to associate the two.
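This renaming step can be sketched in Python (the mapping and sample data are illustrative, not Tealium's implementation):

```python
import csv
import io

# Hypothetical column mapping: CSV column name -> event attribute name
COLUMN_MAPPING = {"postalCode": "customer_zip"}

sample = "email,postalCode\njane@example.com,92121\n"

events = []
for row in csv.DictReader(io.StringIO(sample)):
    # Rename each CSV column to its mapped event attribute name
    events.append({COLUMN_MAPPING.get(col, col): val for col, val in row.items()})

print(events[0])  # {'email': 'jane@example.com', 'customer_zip': '92121'}
```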
Mappings can be configured based on an existing event specification or as a custom event.
When a file import uses an event specification mapping, the event attributes are preselected and you specify the CSV column name that corresponds to each attribute. Each row is processed as an event of the selected specification, for example, tealium_event = "purchase".
When a file import uses a custom event mapping, you specify the event attribute that corresponds to each CSV column. Each row is processed as an event with the following event identifier: tealium_event = "imported".
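The two behaviors can be sketched as follows (a simplified illustration, not Tealium's actual processing code):

```python
def build_event(row, specification=None):
    """Attach the Tealium event identifier to an imported row (sketch)."""
    event = dict(row)
    # Specification mappings use the spec name; custom mappings use "imported".
    event["tealium_event"] = specification if specification else "imported"
    return event

print(build_event({"order_id": "123"}, specification="purchase"))
# {'order_id': '123', 'tealium_event': 'purchase'}
print(build_event({"order_id": "123"}))
# {'order_id': '123', 'tealium_event': 'imported'}
```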
The file transfer service is a secure location where you upload the files to be imported. The following file transfer services are supported:
Tealium uses VPC endpoints to access S3 buckets directly through the AWS network. Use IAM credentials to allow Tealium to access your own bucket.
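As an illustration only, an IAM policy granting read access to an import bucket might look like the following (the bucket name and the exact set of required actions are assumptions; use the policy details provided in Tealium's setup instructions):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-import-bucket",
        "arn:aws:s3:::example-import-bucket/*"
      ]
    }
  ]
}
```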
If using your own file transfer service, be sure to have the connection details ready before proceeding.
The order of operations for importing a file as a data source is as follows:
For example, a file import data source might be named store_purchases. Once a file transfer service is assigned to a file import data source, you upload files to the service. The system then uses the following order of operations:
Grouping rows with the same visitor ID will increase the speed of the import.
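For example, a pre-upload sort by the visitor ID column can be sketched in Python (the id_column name "email" is an assumption about your file):

```python
import csv

def sort_csv_by_visitor(in_path, out_path, id_column="email"):
    """Sort rows so rows for the same visitor are adjacent (sketch)."""
    with open(in_path, newline="") as f:
        reader = csv.DictReader(f)
        fieldnames = reader.fieldnames
        rows = sorted(reader, key=lambda r: r[id_column])
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
```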
Setting up a file import data source requires that you prepare a CSV file, configure column mappings, and assign a file transfer service.
Before you begin, create a sample CSV file (less than 1,000 rows) to use during the setup process. It will be used to automatically detect the column mappings.
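One way to produce such a sample from a larger export is to copy the header plus the first rows; a minimal sketch (the row limit and file paths are illustrative):

```python
import itertools

def make_sample(in_path, out_path, max_rows=999):
    """Copy the header line plus at most max_rows data rows (sketch)."""
    with open(in_path) as src, open(out_path, "w") as dst:
        dst.writelines(itertools.islice(src, max_rows + 1))
```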
Use the following steps to begin setting up a file import:
Use the following steps to add a file import data source:
The first step in setting up a file import is to upload a sample CSV file. This automatically detects the column names in the file and pre-configures them on the column mappings screen. This step is optional; to skip it, click Continue to go directly to the column mappings screen.
Use the following steps to upload a sample file:
The Mapping screen is used to indicate the type of data being imported and how to map the CSV columns to event attributes.
Each row of the CSV file is processed as an event. Use the drop-down menu to select the specification that matches the data in the file, or select Custom. This selection determines the value of the tealium_event attribute during the import process. If Custom is selected, tealium_event is set to "imported"; otherwise, it is set to the corresponding specification name, for example "purchase".
Changing the selected specification will reset the column mapping table.
The column mapping table contains the following columns:
Use the following steps to map each column to an event attribute:
yyyy-MM-dd.
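A minimal sketch for normalizing date strings into the yyyy-MM-dd pattern before upload (the source format "%m/%d/%Y" is an assumption about your data):

```python
from datetime import datetime

def normalize_date(value, source_format="%m/%d/%Y"):
    """Reformat a date string to yyyy-MM-dd (source format is an assumption)."""
    return datetime.strptime(value, source_format).strftime("%Y-%m-%d")

print(normalize_date("07/04/2021"))  # 2021-07-04
```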
The file transfer service is a secure location where you upload your files for Tealium to retrieve. If you are using your own file transfer service, ensure that you have the connection details ready before proceeding.
Tealium supports the following file transfer services:
Use the following steps to add and configure your file transfer service:
In this final step, you will view the summary, make any needed corrections, and then save and publish.
The final step is to upload your CSV files, which is done outside of the product interface. To upload a file to a file service, a third-party application is required to initiate the upload. Though you may use any client for this purpose, Tealium recommends Cyberduck because it’s free and supports SFTP and Amazon S3.
When multiple files are uploaded at the same time using SFTP or S3, the files are processed in the order of the upload timestamp.
Use the following steps to upload a file via SFTP or Amazon S3 using Cyberduck:
When using SFTP, the file must be located in the root folder. Files cannot be located in the sub-folders of an SFTP connection.
Once the file import data source is set up and has begun importing files, you can view the import activity from the Status tab. A rolling one-month report is displayed in graphical format, with details about how many rows were processed with and without errors. A tabular view appears directly below the graph.
You can also see events imported from files on the Live Events screen by navigating to Live Events and selecting the File Import data source you want to view from the drop-down list.
If an error occurs, hover over the red area of a bar in the graph to display a breakdown of the errors. Tooltips show common errors with a link to a detailed report of all errors hosted in S3. Hover over any bar to view details about the number of rows processed.
This section addresses frequently asked questions (FAQs) about the feature described in this article.
If AudienceStream fails to copy a file using the file transfer service, it will retry in 10 minutes.
When a file process failure occurs, the file is ignored. The system will not attempt to process the file again. The most common reasons for failure are:
File import does not support PGP or GPG key encryption. Leave files unencrypted, or decrypt them prior to transfer, in order for them to be uploaded successfully. Files are secured by native SFTP in transit, or by native S3 in transit and at rest.
File import does not support compressed (zipped) files. Extract (unzip) files prior to transfer in order for them to be uploaded successfully.
© 2008-2021. All Rights Reserved.