Hi @bjoern_koth, it has been my experience with other personalization tools that the only way to avoid the flicker issue is to implement synchronously.
That said, for the 3-second built-in timeout, have you tried lowering it to see if 2 seconds, or even 1 second, works? I worked on a similar issue before, and we went down the route of tuning the timeout to reduce the "hide" as much as possible.
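The trade-off behind that hide/timeout experiment can be sketched in plain JavaScript. The names below are illustrative, not Tealium's API: content stays hidden until the personalization code signals ready or the timeout fires, whichever comes first.

```javascript
// Minimal sketch of an anti-flicker "hide" gate with a configurable timeout.
// This is NOT Tealium's implementation; it just shows the pattern being tuned.
function createRevealGate(timeoutMs, reveal, setTimer = setTimeout) {
  let revealed = false;
  const doReveal = (reason) => {
    if (!revealed) {
      revealed = true;
      reveal(reason); // e.g. remove the CSS class that hides the page body
    }
  };
  // Fallback: never keep the page hidden longer than timeoutMs.
  setTimer(() => doReveal('timeout'), timeoutMs);
  return { ready: () => doReveal('ready'), isRevealed: () => revealed };
}
```

In a browser you would pass a reveal callback that removes the hiding style and call `ready()` from the personalization callback. Shortening `timeoutMs` caps the worst-case blank-screen time but increases the risk of briefly showing default content, which is exactly the trade-off to test at 2 seconds versus 1.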
In my last post I spoke about what to do when your QA test goes wrong. As you go through the process of testing your logic for assigning attributes and audiences, one important tool you need for validating end-to-end scenarios is Trace. That said, along the way, I have collected other tools that I rely on to ensure that everything is working as it should.
1. The Test User Badge
In AudienceStream, every update you make is "live," which means you need to be sure your updates will work before visitors encounter them. But what if you aren't ready for an update to reach PROD and would like to make sure it truly works first? In this case, you can leverage the Test User badge.
Why You Need This
By adding in a Test User Badge as a condition to your audience you are guaranteeing that the only people that can make it to your audience are people who have that badge assigned to them. This means you won't have any live site visitors getting caught up in your test experiments. Once you've passed all of your test scenarios, you can remove the badge condition and save. Once the audience is truly in PROD, you can run through another test to ensure that everything is still working properly.
2. EditThisCookie
This Chrome extension allows you to edit, delete, and add cookies to your site.
Why You Need This
In many use cases, your audience conditions may depend on cookies and their values. For example, if you have a cookie called "Segment" with possible values of A and B, you might want to quickly edit the cookie value to test that your audience assignment is working. EditThisCookie lets you change any cookie's value so that your edits persist once the page refreshes. The tool is also helpful as an easy way to keep track of all the cookies being set.
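The kind of edit EditThisCookie performs can also be done by hand. As a sketch, here is the read/write logic as plain functions over a raw cookie string, using the "Segment" cookie from the example above; working on a string (rather than `document.cookie`) keeps it runnable outside a browser.

```javascript
// Read one cookie's value out of a "name=value; name2=value2" string.
function getCookie(cookieString, name) {
  for (const pair of cookieString.split(';')) {
    const [key, ...rest] = pair.trim().split('=');
    if (key === name) return rest.join('=');
  }
  return null;
}

// Return a new cookie string with `name` set to `value`,
// replacing any existing entry for that name.
function setCookie(cookieString, name, value) {
  const pairs = cookieString
    .split(';')
    .map((p) => p.trim())
    .filter((p) => p && !p.startsWith(name + '='));
  pairs.push(`${name}=${value}`);
  return pairs.join('; ');
}
```

In a real browser console the equivalent edit is simply `document.cookie = 'Segment=B'` followed by a page refresh, which is what the extension saves you from typing.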
3. Tealium Tools
This robust browser extension is actually a suite of tools that help you in the management of your Tealium installation.
Why You Need This
Anyone who works in AudienceStream needs to be familiar with Tealium Tools, as it includes tools to audit your tags (Web Companion), inspect your data layer (UTAG Debugger), follow your site interactions in real time (Trace), and switch TiQ environments (Environment Switcher).
I find the Environment Switcher particularly useful when I have tags or extensions that have been published to DEV or QA in TiQ. By dropping down into the lower environment using this tool, those elements become live in your browser session until you switch back to PROD.
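Under the hood, what the Environment Switcher effectively changes is the TiQ loader URL, which encodes account, profile, and environment as path segments. As a sketch (the account and profile names below are placeholders):

```javascript
// The utag.js loader URL has the shape:
//   https://tags.tiqcdn.com/utag/{account}/{profile}/{env}/utag.js
// Switching environments means swapping the {env} path segment.
function switchEnvironment(utagSrc, targetEnv) {
  return utagSrc.replace(
    /(\/utag\/[^/]+\/[^/]+\/)[^/]+(\/utag\.js)/,
    `$1${targetEnv}$2`
  );
}
```

Seeing the environment in the script URL is also a handy sanity check: a quick look at the Network tab tells you which environment's utag.js your page actually loaded.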
4. Developer Tools (Console and Local Storage)
Why You Need This
With the Console open, you can inspect your data layer and see all of your badges, audiences, and UDO variables as they are populated. It's important to get comfortable looking "under the hood" of your site this way so you know what is going on at a high level.
In addition to the Console tab, I use the Application tab to look at Local Storage as well as Cookies. Under Local Storage for your site, the tealium_va key holds a list of all the audiences, badges, booleans (flags), and number attributes (metrics) associated with your profile on that page view. From this view, you will need to know your attribute ID numbers, since the pretty names are not visible. Knowing where all these elements live in DevTools means you can "trace" your journey without signing in to AudienceStream. In many cases you might want to run through a test scenario without having to sign in to enable Trace, which makes this useful for quick spot checks.
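If reading raw attribute IDs by eye gets tedious, a small console helper can summarize the payload. The payload shape assumed below (an audiences array plus badges/flags/metrics objects keyed by attribute ID) is an illustration only; inspect your own tealium_va value to confirm what your account stores.

```javascript
// Summarize a visitor-profile payload like the one kept under the
// tealium_va Local Storage key. The expected shape here is an assumption:
// { audiences: [...], badges: {id: ...}, flags: {id: ...}, metrics: {id: ...} }
function summarizeProfile(raw) {
  const profile = JSON.parse(raw);
  return {
    audiences: profile.audiences || [],
    badgeIds: Object.keys(profile.badges || {}),   // IDs, not pretty names
    flagIds: Object.keys(profile.flags || {}),
    metricIds: Object.keys(profile.metrics || {}),
  };
}
```

In DevTools you would feed it something like `localStorage.getItem('tealium_va')`, then map the IDs back to pretty names using your AudienceStream attribute list.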
Getting to know your site and data layer from many different lenses is a big help in the long run and the tools outlined above will certainly help you become an expert. Are there any tools that you use that should be added to the list? Comment below and share your picks!
Yes, absolutely @mark_dearlove , thank you for highlighting the importance of a Test User badge! In my next post, I'll be digging into recommended tools/methods for testing in AS and the Test User badge is definitely one I rely on quite a bit. :)
As a member of the Data Science and Analytics team for Publicis Sapient, the work that I do is centered on the concept of digital transformation. As strategic partners to our clients, we work to transform and accelerate digital businesses worldwide. There are many tools and platforms that aid in our work, a key one being Tealium. I’ve been a certified Tealium practitioner for the past 3 years, creating end-to-end use cases for a diverse group of clients including telco, pharmaceutical, and retail. As I’ve had the opportunity to ideate on many types of use cases, it stands to reason that I’d be a big proponent for having a solid process for standing up new audiences.
When you first start building attributes and audiences in AudienceStream, you'll find that the process is pretty straightforward. That said, as you continue to build out new audiences and have more team members working in the account, you may quickly find you have some unintended conflicts in audience assignment.
It's common for teams to run into unintended overlap or conflicting logic that impacts the success of a use case. This post will focus on how to create a solid test plan to ensure a successful launch of your AudienceStream solution.
One method my team uses to maintain an AudienceStream account is to develop QA test plans before going live with any new audience. These plans allow us to test our "join audience" logic and our "leave audience" logic, and to see how a new use case might overlap or compete with another audience that we did not anticipate.
Draw It Out
Even before creating your QA test plan, one step that I've found particularly important to any use case is to whiteboard your audience as a process flow diagram. A diagram breaks down the logic of a use case as a series of decision points and outcomes. Having a visual representation of your logic helps to validate that your use case is fundamentally sound.
Create Your Test Plan
Your QA test plan should be tightly scripted; that is, the instructions should be written so that a user without prior knowledge of the use case can follow them and successfully replicate the intended result.
Ideally, you will want to write a series of tests to validate the following key scenarios:
Joining the Audience - You are added to the audience based on the prescribed steps.
Attributes Get Set - The attributes that make up your audience are being assigned at the right step (validating your assignment logic).
Not Joining the Audience - You are not added to the audience when you don't match the audience criteria (validating your exclusion logic).
Leaving the Audience - Once you're in an audience, you're able to be removed from the audience (validating your removal logic).
Triggering Connectors - Connector(s) are firing at the appropriate time (end-to-end testing).
Triggering Personalization (if applicable) - If the audience is meant to trigger a personalized experience, you are seeing the right experience (cross-platform testing).
Avoiding Audience Overlap (if applicable) - You are not able to join more than one audience at a time (validating that your audiences are mutually exclusive).
What Your Test Plan Should Include
In my experience, I've found that documenting these test plans in a spreadsheet helps to track every last detail and keep everyone coordinated. And it needs to be accessible to your entire team!
I've also found that watertight QA plans should include the following elements:
Title and Description - What scenario does this test validate? What's the intended result?
Descriptive Instructions - Ideally, the person writing the test and the one doing the test will be different. Be detailed so that the tester knows when they have replicated the intended result.
Pass/Fail Status - Have a column dedicated to tracking whether the test passed or not. I've found with my clients that a simple color-coded "yes/no" column makes things clear at a glance.
Keep Notes - Have a notes section; the tester may want to document what they are seeing, especially if it differs from the intended result.
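The columns above map naturally onto a row-per-test structure. As a sketch (the field names and sample steps below are made up, not a prescribed template):

```javascript
// One row per test scenario, mirroring the spreadsheet columns above.
const testPlan = [
  {
    title: 'Joining the Audience',
    description: 'Visitor is added to the audience after the prescribed steps.',
    instructions: ['Clear cookies', 'Visit the landing page', 'View 3 product pages'],
    status: 'pass',   // pass | fail | pending
    notes: '',
  },
  {
    title: 'Not Joining the Audience',
    description: 'Visitor who does not match the criteria stays out.',
    instructions: ['Clear cookies', 'Visit the landing page only'],
    status: 'fail',
    notes: 'Visitor joined unexpectedly; suspect overlapping badge logic.',
  },
];

// Anything not passing still blocks launch.
const blockers = testPlan.filter((t) => t.status !== 'pass');
```

Keeping the plan as structured data (or a shared spreadsheet with the same columns) makes the "what still blocks launch?" question answerable at a glance.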
What To Do When It All Goes Wrong
As you make your way through validating each scenario, don't be surprised if there are some unintended effects that cause your test to fail. In the event something goes wrong, figure out which step isn't working and write clear notes about your observations.
With my clients, this is where collaborative troubleshooting kicks into high gear. The first thing we do is confirm that other testers on the team are able to replicate the behavior. Replication is key as it will confirm that there is something wrong with the audience or attribute logic and not an issue isolated to how one person might be conducting the test.
Coordinate your team efforts by running a Trace in AudienceStream. Using Trace to troubleshoot is great because you can inspect each event received and watch in real time what is happening (or not happening) with your attribute and audience assignments. For example, perhaps a typo in your badge rule is causing it not to be assigned; this is sometimes easier to see in Trace than on the badge screen. If you need to make a quick edit to a badge rule, save and publish, then rerun a new Trace to confirm that you've resolved the issue.
Doing this work upfront will save you time; you are much better served resolving bugs prior to launch than scrambling to fix a faulty audience that is already in production. It also helps ensure that everyone, from stakeholders to the implementation team, is on the same page about how the audience fulfills the business need for the client.
I'm interested to hear whether or not QA test planning is a tactic that you use. Share your comments below!