09-05-2017 03:51 AM - edited 09-05-2017 03:52 AM
The documentation about importing offline data specifies the limits for the CSV files, as follows:
The following limits are in place for Omnichannel file processing:
If we would like to import bigger files, the recommendation is to use the Tealium Collect HTTP API. My question is: do we have to send one HTTP request to the API for each row of the CSV file with the offline data? Because for more than 1,000,000 rows, importing the data that way would take a very long time...
Many thanks in advance.
Below are a few points that I hope help guide you:
1) Each row of CSV data imported (Omnichannel), and each HTTP API call made, results in an event. It's important to combine all of a visitor's data into a single CSV row or HTTP API call to keep the number of events to a minimum.
2) Only the CSV import is subject to the Omnichannel import limitations. The API can handle much more volume: it's the same endpoint used by the Tealium Collect Tag and is therefore able to handle a very large amount of traffic. Any volume under 100 events/second does not require approval; anything above that amount needs approval, which can be arranged through your Account Manager.
3) If you are looking for a one-time initial import, we can be flexible on the number of rows imported per day for Omnichannel. Our Digital Strategy team can guide you on the best approach for that. Then, after the initial import, it's important to import only the change/delta in visitor data.
4) For future readers, you may be wondering why you would use one over the other. The first consideration is the base capability of the system you're exporting from: some systems can only export files, so the Omnichannel CSV is required; other systems are able to trigger an HTTP call, and that is where the API comes into play. The second is how quickly you need to make use of the data: the API excels at having the data available quickly, whereas the Omnichannel import is a batch process that takes time to churn through the data. The third is the capability of your team: small teams with limited resources tend to prefer CSV so that they aren't writing code, though other small teams may have a ninja who can quickly whip up a Perl/Python script that calls the API.
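To make point 1 and the 100 events/second guidance concrete, here is a minimal Python sketch of a throttled per-row uploader. This is a hypothetical illustration: the endpoint URL, the JSON POST shape, and the `tealium_account`/`tealium_profile`/`tealium_datasource` field names are assumptions you should verify against your account's Collect API documentation.

```python
import csv
import json
import time
import urllib.request

# Assumed Collect endpoint -- confirm against your Tealium documentation.
COLLECT_URL = "https://collect.tealiumiq.com/event"

def row_to_event(row, account, profile, datasource):
    """Map one CSV row to a single event payload.

    The tealium_* keys are the assumed identifying fields; every other
    CSV column is passed through as an event attribute, so one row
    should already carry all of a visitor's data (see point 1).
    """
    event = dict(row)
    event.update({
        "tealium_account": account,
        "tealium_profile": profile,
        "tealium_datasource": datasource,
    })
    return event

def send_events(rows, post, max_per_second=50):
    """Send one event per row, throttled below the 100 events/second
    threshold that needs no prior approval.

    `post` is injected so the transport (urllib, requests, or a test
    stub) is up to the caller. Returns the number of events sent.
    """
    interval = 1.0 / max_per_second
    sent = 0
    for row in rows:
        post(row)
        sent += 1
        time.sleep(interval)  # crude pacing; a token bucket also works
    return sent

def http_post(event):
    """Minimal urllib transport (assumed JSON POST body)."""
    req = urllib.request.Request(
        COLLECT_URL,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)
```

Usage would look like `send_events((row_to_event(r, "acct", "main", "ds123") for r in csv.DictReader(open("offline.csv"))), http_post)`. At 50 events/second, a 1,000,000-row file still takes roughly five and a half hours, which is why combining visitor data into fewer events matters.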
This may be a bit more than what you were looking for, but it's important when considering what best fits your needs.
Please let me know if you have any further questions.
Our guide on omnichannel file imports has a section on splitting large files into smaller files:
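For example, a minimal Python sketch of such a splitter, assuming a simple comma-delimited file with one header row (the function name and output naming scheme are illustrative; adjust the dialect and naming to match your import configuration):

```python
import csv

def split_csv(src_path, rows_per_file, dest_template="{stem}_part{n}.csv"):
    """Split a large CSV into chunks of at most `rows_per_file` data rows.

    The header row is repeated in every chunk so each output file can be
    imported on its own. Returns the list of file paths written.
    """
    written = []
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)
        chunk = []
        n = 0

        def flush():
            nonlocal chunk, n
            if not chunk:
                return
            n += 1
            path = dest_template.format(stem=src_path.rsplit(".", 1)[0], n=n)
            with open(path, "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(header)   # repeat header in every chunk
                writer.writerows(chunk)
            written.append(path)
            chunk = []

        for row in reader:
            chunk.append(row)
            if len(chunk) >= rows_per_file:
                flush()
        flush()  # write any remaining partial chunk
    return written
```

Calling `split_csv("visitors.csv", 500000)` on a 1,200,000-row file would produce three files, each within a 500,000-row budget and each carrying its own header.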
The batch-processing behavior of file imports should be described in the documentation.
Aside from the 10-minute cycle time to import, it would be nice to know how long it takes to process files.
As someone who is configuring by way of batch upload, it's a pain to test, configure, and iterate with both 10-minute cycles and an unknown processing time.