- Email Campaigns are an important KPI for only 14% of participants.
- Retention, after usage measurements, is considered an important indicator by 33%, while Expansion by 14%.
- Churn, Revenue, Integrations Enablement, and Team Activation are considered important by 10% of the participants.
- Webinar Attendance, NPS, Time to Initial Value, Usage, Educational Content, LTV, CS Feedback & In-app Engagement each have 5% preference.
Activation (61%) is top of mind for most participants, but its constituents, Time to Initial Value and Team Activation, fall behind with 5% and 10% preference respectively. The increased preference for Activation, combined with the neglect of Retention and Expansion, indicates that onboarding focus ends at the activation stage.
Conversion Optimization Techniques
CRO techniques are among the top three KPIs, but LTV (5%) is neglected.
16% don't set any KPIs to monitor onboarding.
NPS is among the least important KPIs (5%), implying that customer advocacy is not a priority when serving low-trajectory customers.
Product Management has increased involvement in the onboarding process (81%), yet critical indicators like Product Usage (5%), In-app Engagement measurement (5%) and Educational Content monitoring (5%) are neglected. The only exception to the rule is Feature Adoption (28%), which, given the performance of the other product measurements, is not evaluated efficiently.
Activation, Trial Length, User Type & Proficiency, Special Promos, and On-Site Chat Flows are each iterated on a consistent cadence by only 5% of the participants.
Only 10% experiment frequently with various onboarding strategies.
Content is iterated by only 14% of the participants.
Email practices (out-of-app onboarding activations) take the lead with 47% preference, followed by In-App Chat Flows (33%), testing of the Sign-up Experience (20%) and UI/UX practices (20%).
Conversion Optimization Techniques
CRO Techniques (57%)* are vital for onboarding evaluations, but their constituent, the Sign-up Experience, is optimized by only 20% of the participants.
Low levels of experimentation on Activation (5%) lead to the conclusion that acquisition practices (e.g. the Sign-up Experience) prevail.
Speed to Implement (Trial Length), the number one factor associated with Self-Serve activation, has only 5% preference.
20% of the participants do not test any aspect of their onboarding process.
The increased preference for UI/UX experimentation (20%), combined with decreased investment in User Type & Proficiency and Activation (5% each), makes the product experience subject to design practices.
Experimentation vs. Scalability
Corresponding onboarding tactics for the top five A/B constituents:
- In-App Demos
- In-App Surveys
- Live Chat Flows
Sign Up Experience
From the top five experimentation constituents, we exclude the Sign-up Experience, since it falls under Marketing practices and does not rely on in-app activations.
Content affects many scalable practices but is optimized by only 14%.
Despite Email Campaigns' prevalence (84%), only 47% of the participants iterate on them consistently.
In-App Tutorials (73%), Live Chat Flows (68%), Welcome Messages (68%) & In-App Sales Demos (57%) have high preference but are iterated by only 33%.
User Type & Proficiency
Limited investment in User Type & Proficiency experimentation (5%) implies that in-app messages are shown to users at random, neglecting their context of usage.
Displaying the same messages to all users presupposes a single route and overlooks learning curve, role, and proficiency level. Only product data analysis and historical usage can efficiently indicate where and why product engagement practices should exist.
This investment does not imply moving tooltips or product guides around for the sake of experimentation. Any product engagement iteration should follow usage data and be executed only when the onboarding team can validate the why behind the change. Random changes lead to confusion and friction while keeping conversion and retention rates low.