How to Create a Solid Test Plan for your AudienceStream Implementation

Jane_PS
Tealium Expert

I'm a member of the Data Science and Analytics team at Publicis Sapient, where my work centers on digital transformation. As strategic partners to our clients, we work to transform and accelerate digital businesses worldwide. Many tools and platforms aid in that work, a key one being Tealium. I've been a certified Tealium practitioner for the past 3 years, creating end-to-end use cases for a diverse group of clients across telco, pharmaceutical, and retail. Having had the opportunity to ideate on many types of use cases, it stands to reason that I'm a big proponent of having a solid process for standing up new audiences.

When you first start building attributes and audiences in AudienceStream, you'll find that the process is pretty straightforward. That said, as you continue to build out new audiences and have more team members working in the account, you may quickly find you have some unintended conflicts in audience assignment. 

It's common for teams to run into issues with unintended overlap or conflicting logic that will impact the success of a use case. This post will focus on how to create a solid test plan to ensure a successful launch of your AudienceStream solution.

One method my team uses to maintain an AudienceStream account is to develop QA test plans prior to going live with any new audience. These plans allow us to test our "join audience" logic and "leave audience" logic, and to see how a new use case might overlap or compete with another audience in ways we did not anticipate.

Draw It Out

Even before creating your QA test plan, one step that I've found particularly important to any use case is to whiteboard your audience as a process flow diagram. A diagram breaks down the logic of a use case as a series of decision points and outcomes. Having a visual representation of your logic helps to validate that your use case is fundamentally sound.
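
If it helps to make those decision points concrete, the same flow can also be sketched in a few lines of code before anything is built. The example below is a hypothetical "cart abandoner" audience with made-up attribute names, not the rules from any real account; it's just a quick way to sanity-check that the whiteboard logic holds together.

```python
# A hypothetical "cart abandoner" audience expressed as decision points.
# The attribute names (has_added_to_cart, has_purchased, minutes_since_last_event)
# are illustrative only -- substitute the attributes from your own profile.

def joins_cart_abandoner_audience(visitor: dict) -> bool:
    """Walk the same decision points you'd draw on the whiteboard."""
    if not visitor.get("has_added_to_cart", False):
        return False                      # never added to cart -> out
    if visitor.get("has_purchased", False):
        return False                      # completed the purchase -> out
    if visitor.get("minutes_since_last_event", 0) < 30:
        return False                      # still active -> not abandoned yet
    return True                           # added to cart, no purchase, idle -> in


# Quick sanity check of the logic before building it in AudienceStream
print(joins_cart_abandoner_audience(
    {"has_added_to_cart": True, "has_purchased": False, "minutes_since_last_event": 45}
))  # True
```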

Create Your Test Plan

Your QA test plan should be tightly scripted--that is, the instructions should be written in such a way that a user without prior knowledge of the use case would be able to follow them and successfully replicate the intended result.

Ideally, you will want to write a series of tests to validate the following key scenarios (a sketch turning these into a structured checklist follows the list):

  • Joining the Audience
    You are added to the audience based on the prescribed steps.
    • Attributes Get Set
      The attributes that make up your audience are being assigned at the right step (validating your assignment logic).
  • Not Joining the Audience
    You are not added to the audience when you don't match the audience criteria (validating your exclusion logic).
  • Leaving the Audience
    Once you're in an audience, you're able to be removed from the audience (validating your removal logic).
  • Triggering Connectors
    Connector(s) are firing at the appropriate time (end-to-end testing).
  • Triggering Personalization (if applicable)
    If the audience is meant to trigger a personalized experience, you see the right experience (cross-platform testing).
  • Avoiding Audience Overlap (if applicable)
    You are not able to join more than one audience at a time (validating that your audiences are mutually exclusive).
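
To keep track of these scenarios as the team works through them, it can help to capture them as a structured checklist before anyone starts clicking around. The sketch below is a minimal Python version of that checklist; the audience and the steps are hypothetical placeholders, and the titles mirror the scenarios above.

```python
# The key scenarios above as a structured checklist. The steps reference a
# hypothetical "Cart Abandoners" audience -- substitute your own use case.

SCENARIOS = [
    ("Joining the Audience", "Add an item to the cart, then stay idle past the timeout."),
    ("Attributes Get Set", "Confirm each attribute is assigned at the expected step."),
    ("Not Joining the Audience", "Browse without adding to cart; confirm you are excluded."),
    ("Leaving the Audience", "Complete a purchase; confirm you are removed from the audience."),
    ("Triggering Connectors", "Confirm the connector fires once the audience is joined."),
    ("Triggering Personalization", "Confirm the personalized experience renders on site."),
    ("Avoiding Audience Overlap", "Confirm the visitor sits in only one of the related audiences."),
]

# Record outcomes as testers work through the plan: "pass", "fail", or "" (not run).
results = {title: "" for title, _ in SCENARIOS}


def outstanding(results):
    """Return the scenario titles that have not passed yet."""
    return [title for title, _ in SCENARIOS if results.get(title) != "pass"]


results["Joining the Audience"] = "pass"
print(outstanding(results))  # every scenario except "Joining the Audience"
```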

What Your Test Plan Should Include

In my experience, documenting these test plans in a spreadsheet helps to track every last detail and keep everyone coordinated. Make sure it's accessible to your entire team!

I've also found that watertight QA plans should include the following elements (a minimal CSV sketch of these columns follows the list):

  • Title and Description - What scenario does this test validate? What's the intended result?
  • Descriptive Instructions - Ideally, the person writing the test and the one doing the test will be different. Be detailed so that the tester knows when they have replicated the intended result.
  • Pass/Fail Status - Have a column dedicated to tracking whether the test passed. I've found with my clients that a simple color-coded "yes/no" column makes things clear at a glance.
  • Keep Notes - Have a notes section; the tester might want to add notes to document what they are seeing, especially if it differs from the intended result.
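
As a rough illustration of those columns, here's a minimal sketch that writes one row of such a plan out as a CSV file the whole team can open. The column names follow the list above; the sample row and file name are hypothetical.

```python
# Writes a QA test plan skeleton to a CSV file using the columns described above.
# The sample row and the output file name are placeholders.
import csv

COLUMNS = ["Title", "Description", "Instructions", "Pass/Fail", "Notes"]

rows = [
    {
        "Title": "Joining the Audience",
        "Description": "Visitor joins the audience after following the prescribed steps.",
        "Instructions": "1) Start a fresh session. 2) Add any item to the cart. "
                        "3) Stay idle past the timeout. 4) Confirm membership via Trace.",
        "Pass/Fail": "",  # the tester fills this in
        "Notes": "",      # observations, especially if they differ from the intended result
    },
]

with open("audiencestream_test_plan.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerows(rows)
```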

What To Do When It All Goes Wrong

As you make your way through validating each scenario, don't be surprised if there are some unintended effects that cause your test to fail. In the event something goes wrong, figure out which step isn't working and write clear notes about your observations.

With my clients, this is where collaborative troubleshooting kicks into high gear. The first thing we do is confirm that other testers on the team are able to replicate the behavior. Replication is key as it will confirm that there is something wrong with the audience or attribute logic and not an issue isolated to how one person might be conducting the test.

Coordinate your team efforts by running a Trace in AudienceStream. Using Trace to troubleshoot is great because you're able to inspect each event received and watch in real time what is happening (or not happening) with your attribute and audience assignments. For example, perhaps there's a typo in your badge rule that is causing it not to be assigned; sometimes this is easier to see in Trace than on the badge screen. If you need to make a quick edit to a badge rule, save and publish, then run a new Trace to confirm that you've resolved the issue.
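
If some of your events come from outside the browser (for example, server-side or API-sourced events), your testers can attach the trace ID to those events as well. The sketch below assumes the Tealium Collect HTTP API endpoint and the tealium_trace_id field used by Trace; the account, profile, visitor ID, and data-layer keys are placeholders, so confirm the exact endpoint and required fields for your account before leaning on this.

```python
# A sketch of sending a test event into a running Trace session.
# Assumes the Tealium Collect HTTP API (collect.tealiumiq.com/event) and the
# tealium_trace_id field used by Trace; account, profile, and the data-layer
# keys below are placeholders for illustration only.
import json
import urllib.request

event = {
    "tealium_account": "your-account",        # placeholder
    "tealium_profile": "main",                # placeholder
    "tealium_visitor_id": "qa-tester-001",    # stable ID so the test profile is reusable
    "tealium_trace_id": "12345",              # the ID shown when you start the Trace
    "tealium_event": "cart_add",              # hypothetical event name
    "product_id": "SKU-42",                   # hypothetical data-layer key
}

req = urllib.request.Request(
    "https://collect.tealiumiq.com/event",
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # 200 means the event was accepted
```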

Doing this work upfront will save you time: you are much better served resolving bugs prior to launch than scrambling to fix a faulty audience that is already in production. It also helps to make sure that everyone, from stakeholders to the implementation team, is on the same page about how the audience fulfills the business need for the client.

I'm interested to hear whether or not QA test planning is a tactic that you use—share your comments below!

 

Comments
neilc
Tealium Employee

Nice work Jane!

Jane_PS
Tealium Expert

Thank you, @neilc ! :) 

kimmazzucco
Employee Emeritus

Jane, THANK YOU!  You are such a rockstar and your knowledge of our products is invaluable to our Tealium community!!!

BrittanyTracy
Employee Emeritus

Stellar write-up, @Jane_PS! <applause>

Jane_PS
Tealium Expert

Cheers @kimmazzucco  and @BrittanyTracy ! :)

LisaMadden
Tealium Employee

This is fantastic! Thank you for taking the time to pull this together. This will be extremely helpful to many. 

Jane_PS
Tealium Expert

Thank you @LisaMadden !

timhutton
Employee Emeritus

This is a great article, thanks for sharing @Jane_PS!

Jane_PS
Tealium Expert

Cheers, @timhutton ! Appreciate the feedback :) 

mark_dearlove
Employee Emeritus

This is great @Jane_PS - definitely a valuable insight for the community. One thing in addition that I always find useful is to ensure all my test visitors have the "Test User" badge so you can identify those visitors if you are using the Audience Discovery or other tools.
