A/B Testing

Tracking A/B tests in Usermaven using custom events

1. Introduction

While Usermaven doesn’t have a dedicated A/B testing module, its flexible custom event tracking allows you to effectively set up, run, and analyze A/B tests for your website or application. This guide will walk you through a recommended method using custom events to identify test variants and Usermaven’s analytics features to measure their performance.

This approach empowers you to:

  • Test changes to marketing pages, product features, user flows, and more.
  • Track user assignments to different test variations.
  • Analyze conversion rates and user behavior for each variant using funnels and trend reports.

2. The core method: Implementing A/B test tracking

The fundamental process involves:

  1. Programmatically assigning users to different variations of your test (e.g., “Control” vs. “Variant B”).
  2. Sending a custom event to Usermaven to record which variation a user was assigned to.
  3. Ensuring users consistently experience the same variation on subsequent visits or interactions.
  4. Tracking relevant goal events (conversions, key actions) for users in each variant.

A. Assigning users to test variants

Your website or application code will need to determine which version of a test a user sees. Common methods include:

  • Client-side assignment: Using JavaScript in the user’s browser to randomly assign users to a variant. This is frequently used for website UI/UX tests.
  • Server-side assignment: Using your backend system to assign variants, often based on user ID or other attributes for consistent experiences across sessions and devices.

The examples in this guide will primarily focus on client-side JavaScript for simplicity.
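If you use server-side assignment instead, a common approach is to hash a stable user identifier together with the test name so the same user always lands in the same bucket. The following is a minimal, illustrative sketch assuming a Node.js backend; the function name and details are placeholders, not part of any Usermaven API.

// Minimal server-side sketch (Node.js): deterministic variant assignment.
// Hashing the user ID with the test name returns the same variant on every
// request and device, with no need for localStorage.
const crypto = require('crypto');

function assignVariantServerSide(userId, testName, availableVariants) {
  const hash = crypto.createHash('sha256').update(`${testName}:${userId}`).digest();
  const bucket = hash.readUInt32BE(0) % availableVariants.length; // first 4 bytes as an unsigned int
  return availableVariants[bucket];
}

// Example: the same user always gets the same variant for a given test
// assignVariantServerSide('user_123', 'homepage_cta_color_test', ['control', 'blue_button']);

Whichever method you use, you would still send the ab_test_assigned event described below so Usermaven records the assignment.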

B. Sending the key Usermaven event: ab_test_assigned

When a user is assigned to a test variant, it’s crucial to immediately send a custom event to Usermaven. A standard event name like ab_test_assigned is recommended.

This event should include at least two key properties:

  • test_name: A unique identifier for your A/B test (e.g., homepage_cta_color_test, new_pricing_page_layout). This allows you to filter and analyze results for specific tests, especially if you’re running multiple experiments.
  • variant: The name of the specific variation the user was assigned to (e.g., control, blue_button, simplified_layout).
// Example: Sending the 'ab_test_assigned' event
// 'assignedVariant' is the variant name determined by your assignment logic.
// 'currentTestName' is the unique name you've given this specific A/B test.

Usermaven("track", 'ab_test_assigned', {
  test_name: currentTestName, // e.g., 'homepage_cta_color_test'
  variant: assignedVariant    // e.g., 'control' or 'blue_button'
});

C. Ensuring consistent user experience & tracking

Users should consistently see the same test variant across their session and on return visits to avoid skewing results. Browser localStorage is a common way to achieve this for client-side tests.

Here are example helper functions for managing variant assignment and retrieval:

/**
 * Assigns a user to a variant for a specific test or retrieves their existing assignment.
 * Stores the assignment in localStorage to ensure consistency.
 * Call this when a user first encounters a part of your site/app under an A/B test.
 * @param {string} testName - A unique name for your A/B test (e.g., 'homepage_cta_test').
 * @param {string[]} availableVariants - An array of variant names (e.g., ['control', 'variant_b']).
 * @returns {string} The assigned variant.
 */
function assignAndGetVariant(testName, availableVariants) {
  const storageKey = `usermaven_ab_test_${testName}`;
  let assignedVariant = localStorage.getItem(storageKey);

  // If no variant is stored, or if the stored variant isn't one of the currently available ones
  if (!assignedVariant || !availableVariants.includes(assignedVariant)) {
    // Simple random assignment. You might have more sophisticated logic.
    assignedVariant = availableVariants[Math.floor(Math.random() * availableVariants.length)];
    localStorage.setItem(storageKey, assignedVariant);
  }
  return assignedVariant;
}

/**
 * Retrieves the currently assigned variant for a user for a specific test from localStorage.
 * Call this when you need to know the user's variant for tracking goal events or other logic.
 * @param {string} testName - The unique name of your A/B test.
 * @returns {string|null} The assigned variant, or null if not found.
 */
function getAssignedVariant(testName) {
  return localStorage.getItem(`usermaven_ab_test_${testName}`);
}

Usage: Call assignAndGetVariant() when the user first encounters the test. Then, use getAssignedVariant() to retrieve this variant when tracking subsequent actions or applying variant-specific logic.
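Putting the two helpers together, a typical flow looks like the sketch below (the test and variant names are placeholders for your own):

const TEST_NAME = 'homepage_cta_color_test';  // placeholder test name
const VARIANTS = ['control', 'blue_button'];  // placeholder variant names

// On first encounter: assign (or re-read) the variant and report it to Usermaven
const assignedVariant = assignAndGetVariant(TEST_NAME, VARIANTS);
Usermaven("track", 'ab_test_assigned', {
  test_name: TEST_NAME,
  variant: assignedVariant
});

// Later: re-read the stored assignment before applying variant-specific logic
// or attaching the variant to goal events
const storedVariant = getAssignedVariant(TEST_NAME);
if (storedVariant === 'blue_button') {
  // ...render the blue button variant
}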


3. Tracking conversion events for A/B tests

After a user has been assigned to a test variant, you’ll track their interactions and whether they achieve the desired outcomes (e.g., sign-ups, purchases, feature usage).

For effective A/B test analysis, it’s highly recommended to include the test_name and variant properties in your conversion events as well. This simplifies segmenting results directly within reports for those specific actions.

// Example: Tracking a conversion event (e.g., a button click)
const currentTestName = 'homepage_cta_color_test'; // Should match the test name used in ab_test_assigned
const userVariant = getAssignedVariant(currentTestName);

// Ensure the user has been assigned to this test before tracking a conversion event for it
if (userVariant) {
  Usermaven("track", 'cta_clicked', {
    button_label: 'Get Started Free',
    test_name: currentTestName, // Include the test name
    variant: userVariant        // Include the variant they saw
    // ... other relevant properties for 'cta_clicked'
  });
}

4. Analyzing A/B test results in Usermaven

Once your tracking is implemented, Usermaven provides the tools to analyze the performance of each test variant.

A. Using trend reports

Trend reports can help monitor the distribution of users across variants or track the performance of specific goal events over time for each variant.

  1. Navigate to Trends (or the Dashboard section to add a new trend widget).
  2. Configure the trend report:
    • Event: Choose the event you want to analyze (e.g., ab_test_assigned to see variant distribution, or a specific goal event like purchase_completed).
    • Filter (Optional): If analyzing a goal event, add a filter where test_name equals the specific test you’re interested in.
    • Group by: Select the variant property.
  3. The report will display counts or trends for the chosen event, segmented by each test variant.

B. Note on statistical significance

Usermaven provides the raw data and conversion rates for your A/B tests. To determine if the observed differences between variants are statistically significant (i.e., not due to random chance), you will generally need to:

  1. Extract the key numbers from your Usermaven reports (e.g., total users assigned to Variant A, number of conversions for Variant A; same for Variant B, etc.).
  2. Use an external A/B test statistical significance calculator. Many free tools are available online for this purpose.
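If you prefer to run the calculation yourself, a standard two-proportion z-test is usually sufficient for simple conversion-rate comparisons. The sketch below assumes you have pulled assigned-user and conversion counts per variant from your Usermaven reports; it is an illustration, not a Usermaven feature.

// Two-proportion z-test: compares the conversion rates of two variants.
// Inputs come from your Usermaven reports (users assigned and conversions per variant).
function twoProportionZTest(visitorsA, conversionsA, visitorsB, conversionsB) {
  const pA = conversionsA / visitorsA;
  const pB = conversionsB / visitorsB;
  const pPooled = (conversionsA + conversionsB) / (visitorsA + visitorsB);
  const standardError = Math.sqrt(pPooled * (1 - pPooled) * (1 / visitorsA + 1 / visitorsB));
  const z = (pB - pA) / standardError;
  // |z| > 1.96 corresponds to roughly 95% confidence (two-tailed)
  return { z: z, significantAt95: Math.abs(z) > 1.96 };
}

// Example: 5,000 users per variant, 250 vs. 300 conversions
// twoProportionZTest(5000, 250, 5000, 300); // => { z: ~2.19, significantAt95: true }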

5. Implementation examples

Here are practical examples for different teams:

A. For marketing teams: Testing a landing page call-to-action (CTA)

Objective: Determine if changing the CTA button text (control_cta vs. benefit_driven_cta) improves click-through rates.
Test name: landingpage_cta_text_test

// --- On your landing page ---
const LP_CTA_TEST_NAME = 'landingpage_cta_text_test';
const LP_CTA_VARIANTS = ['control_cta', 'benefit_driven_cta'];

document.addEventListener('DOMContentLoaded', function() {
  // 1. Assign user to a variant & send tracking event
  const assignedVariant = assignAndGetVariant(LP_CTA_TEST_NAME, LP_CTA_VARIANTS);
  Usermaven("track", 'ab_test_assigned', {
    test_name: LP_CTA_TEST_NAME,
    variant: assignedVariant
  });

  // 2. Display the correct CTA button text
  const ctaButton = document.getElementById('main-cta-button');
  if (assignedVariant === 'benefit_driven_cta') {
    ctaButton.innerText = "Start Your Free Trial & Boost Productivity!";
  } else {
    ctaButton.innerText = "Sign Up Now"; // Control text
  }

  // 3. Track clicks on the CTA button
  ctaButton.addEventListener('click', function() {
    const userVariant = getAssignedVariant(LP_CTA_TEST_NAME);
    if (userVariant) { // Ensure user was part of this test
      Usermaven("track", 'cta_clicked', {
        button_id: 'main-cta-button',
        test_name: LP_CTA_TEST_NAME,
        variant: userVariant
      });
    }
  });
});

B. For product teams: Testing a new feature’s visibility

Objective: See if making a new feature more prominent (new_prominent_link) increases its adoption compared to the standard placement (standard_link).
Test name: new_feature_visibility_test

// --- When the relevant page/dashboard loads ---
const FEATURE_VISIBILITY_TEST_NAME = 'new_feature_visibility_test';
const FEATURE_VISIBILITY_VARIANTS = ['standard_link', 'new_prominent_link'];

function initializePageWithFeatureTest(userId) {
  // 1. Assign user to variant & send tracking event
  const assignedVariant = assignAndGetVariant(FEATURE_VISIBILITY_TEST_NAME, FEATURE_VISIBILITY_VARIANTS);
  Usermaven("track", 'ab_test_assigned', {
    test_name: FEATURE_VISIBILITY_TEST_NAME,
    variant: assignedVariant,
    user_id: userId // Useful to include if available
  });

  // 2. Logic to display the feature link based on the variant
  if (assignedVariant === 'new_prominent_link') {
    // renderFeatureLinkProminently();
  } else {
    // renderFeatureLinkStandard();
  }
}

// --- When the user clicks on the new feature link ---
function handleFeatureLinkClicked(userId) {
  const userVariant = getAssignedVariant(FEATURE_VISIBILITY_TEST_NAME);
  if (userVariant) {
    Usermaven("track", 'new_feature_clicked', {
      feature_name: 'SuperAnalyzer', // Example feature name
      test_name: FEATURE_VISIBILITY_TEST_NAME,
      variant: userVariant,
      user_id: userId
    });
  }
}

6. Key best practices for A/B testing

  • Clear hypothesis: Define what you’re testing, why, and what outcome you expect before starting.
  • One change at a time: Test isolated changes to clearly attribute performance differences.
  • Sufficient sample size & duration: Run tests long enough to gather enough data for statistical significance and to account for typical user behavior cycles (e.g., weekly patterns). Use online calculators to estimate sample size (a rough estimation sketch follows this list).
  • Consistent user experience: Ensure users consistently see the variant they were assigned.
  • Thorough QA: Test your A/B setup across different browsers and devices. Verify that events are firing correctly for all variants and that the test doesn’t negatively impact other functionalities.
  • Segment wisely: Consider if your test should target all users or specific segments (e.g., new vs. returning, different user plans). If segmenting, ensure your assignment logic and analysis account for this.
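For the sample-size point above, the standard two-proportion approximation gives a rough per-variant estimate. The sketch below assumes 95% confidence and 80% power; treat it as a back-of-the-envelope check, not a replacement for a dedicated calculator.

// Rough per-variant sample size needed to detect a lift from baselineRate to targetRate
// at ~95% confidence (z = 1.96) and ~80% power (z = 0.84).
function estimateSampleSizePerVariant(baselineRate, targetRate) {
  const zAlpha = 1.96;
  const zBeta = 0.84;
  const variance = baselineRate * (1 - baselineRate) + targetRate * (1 - targetRate);
  const effect = targetRate - baselineRate;
  return Math.ceil(Math.pow(zAlpha + zBeta, 2) * variance / Math.pow(effect, 2));
}

// Example: detecting an improvement from a 5% to a 6% conversion rate
// estimateSampleSizePerVariant(0.05, 0.06); // => roughly 8,100 users per variant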

7. Troubleshooting common issues

  • Events not appearing in Usermaven:
    • Verify that the Usermaven SDK is correctly initialized before any tracking calls are made.
    • Check event names and property spellings for typos or case sensitivity issues.
    • Use your browser’s developer tools (Network tab) to confirm events are being sent.
  • Inconsistent variant assignment:
    • Ensure your localStorage keys (or other storage mechanisms) are unique per test and correctly implemented (see the console check after this list).
    • Verify that the test_name is identical in all related tracking calls for a single test.
  • Low user counts or skewed distribution:
    • Check the code that assigns users to variants and fires the ab_test_assigned event. Ensure it’s being executed as expected for the target audience.
    • If randomization is purely client-side, small sample sizes might show uneven distribution initially; this typically evens out with more traffic.
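For the localStorage-based approach described in this guide, a quick way to check a user's stored assignment is to inspect the key directly in the browser console (replace the test name with your own):

// Run in the browser console. Returns the stored variant,
// or null if the user was never assigned to this test.
localStorage.getItem('usermaven_ab_test_homepage_cta_color_test');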

8. Conclusion

Leveraging Usermaven’s custom event tracking provides a robust and adaptable way to conduct A/B testing. By consistently sending an ab_test_assigned event with clear test_name and variant properties, and then including these in your conversion events, you can effectively use Usermaven’s funnels and trend reports to analyze performance and make data-driven decisions.

This method gives you control over your testing setup and integrates directly with your existing Usermaven analytics workflow.
