A/B test: Expanded Website Visitor Registration
Created by: a-bergevin
Motivation
Currently our "trial" experience for individual website visitors is fragmented.
- Users who choose to get started with cloud register and authenticate early (and are entered into an email campaign), so they have a more guided and nurtured experience.
- Users who choose to get started with self-hosted are sent to the docs, get no nurture emails, and have a very unguided experience.
We want to provide a similar experience to users regardless of deployment type. We hypothesize that expanded registration for all trial users at the top of the funnel will help us improve on several metrics. More details can be found in RFC 691.
Hypothesis
Presenting a straightforward registration step before any complexity around deployment will reduce the bounce rate from the Get Started landing page.
Test
The test will split traffic 50/50 for all users who click Get Started CTAs on both sourcegraph.com and about.sourcegraph.com.
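To make the 50/50 split sticky per visitor, one option is to hash a stable anonymous ID into a bucket. The sketch below is illustrative only: the helper name, the Variant type, and the assumption of a cookie-based anonymous ID are not part of any existing implementation.

```go
package abtest

import (
	"hash/fnv"
)

// Variant identifies which arm of the Get Started A/B test a visitor sees.
type Variant string

const (
	VariantControl Variant = "control" // current flow: deployment choice first
	VariantTest    Variant = "test"    // new flow: registration first
)

// bucketVisitor deterministically assigns a visitor to one arm of the test.
// Hashing a stable anonymous ID (e.g. from a first-party cookie) keeps the
// assignment sticky across page loads without any server-side state.
func bucketVisitor(anonymousID string) Variant {
	h := fnv.New32a()
	_, _ = h.Write([]byte(anonymousID))
	if h.Sum32()%2 == 0 {
		return VariantControl
	}
	return VariantTest
}
```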
Control: The control will be the current flow [INSERT FIGMA]. The deployment choice is presented up front.
- Self-hosted users click through to the docs page
- Cloud users hit a registration page next
Test: The test will be that users who click Get Started immediately hit a registration page (flow diagram). To the user it appears as if they are registering with "Sourcegraph" the company and not for a particular instance. On the back end they are registrants for Cloud. After registering they land on a deployment choice page, very similar to the control group. We will modify any language that may be confusing in the new flow/context, but the goal is to leave it largely unchanged.
- Self-hosted users click through to the docs page
- Cloud users go to the current post-registration flow (need to check but I think it's a step for "syncing repos")
What is not being tested: We are simply testing the effect of registration first for all users, followed by the deployment choice. We are not testing new layouts for that deployment choice, added information gathering (Company, Role, etc.), or any other post-registration flows at this time.
Metric and experimental design
Metric: We are tracking several key metrics:
- Reduce bounce rate at the Get Started landing page (Primary)
- Improve registration rates (Secondary)
- Qualified Opportunities / MQLs / PQLs - need to sort this out with marketing (Secondary)
- Activation rate (Secondary)
Smallest significant change:
Significance threshold: 5%
Duration/size: How long (and on how many users) do we need to run this A/B test for it to be significant? Include a link to a significance calculator.
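To put rough numbers on the duration/size question, the sketch below uses the standard two-proportion sample size approximation. The 60% baseline bounce rate and the 5 percentage point smallest change are placeholders, not measured values; substitute real figures from the landing page analytics before committing to a run length.

```go
package main

import (
	"fmt"
	"math"
)

// sampleSizePerVariant returns an approximate number of visitors needed in each
// arm to detect a drop of delta from a baseline rate p1, using the standard
// two-proportion z-test approximation with 5% significance (z = 1.96) and 80%
// power (z = 0.84).
func sampleSizePerVariant(p1, delta float64) int {
	const zAlpha = 1.96 // two-sided, alpha = 0.05
	const zBeta = 0.84  // power = 0.80
	p2 := p1 - delta
	pBar := (p1 + p2) / 2
	num := zAlpha*math.Sqrt(2*pBar*(1-pBar)) + zBeta*math.Sqrt(p1*(1-p1)+p2*(1-p2))
	return int(math.Ceil((num * num) / (delta * delta)))
}

func main() {
	// Placeholder numbers: 60% baseline bounce rate, detecting a 5 point drop.
	n := sampleSizePerVariant(0.60, 0.05)
	fmt.Printf("Visitors needed per variant: %d\n", n)
}
```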
Descriptive analytics
Flag
Checklist
- Share Figma Flow with team and get review link
- Get review on Figma Flow with team, lock it
- Get legal sign-off on the T&C/Privacy Policy being expanded to both deployment types
- Create deployment options page in sourcegraph repo PR #36347
- Make returnTo work with social logins on Sign Up page
- Ensure lazy feature flagging PR is merged
- Create /get-started A/B test redirection controller path in sourcegraph repo (see the sketch after this list)
- Integrate and test event logs for the whole flow
- Update Get Started redirects everywhere
- Confirm measurement plan with analytics, QA events, make sure dashboard is set up (Amplitude, Looker, GA, etc.)
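For the /get-started redirection controller item above, a minimal sketch of what the handler could look like is below, assuming the deterministic bucketing described in the Test section. The route, cookie name, and destination URLs are illustrative assumptions; the real controller in the sourcegraph repo should reuse the feature flag plumbing from the lazy feature flagging PR rather than this inline hash.

```go
package main

import (
	"hash/fnv"
	"net/http"
)

// inTestArm hashes the anonymous visitor ID and puts half of visitors into the
// registration-first arm, mirroring the sticky 50/50 split described above.
func inTestArm(anonymousID string) bool {
	h := fnv.New32a()
	_, _ = h.Write([]byte(anonymousID))
	return h.Sum32()%2 == 1
}

// getStartedHandler is an illustrative redirection controller for /get-started.
// Test-arm visitors go straight to registration; control-arm visitors keep the
// current flow and see the deployment choice first.
func getStartedHandler(w http.ResponseWriter, r *http.Request) {
	anonymousID, err := r.Cookie("sourcegraphAnonymousUid") // cookie name is an assumption
	if err != nil {
		// No stable ID: fall back to the control experience.
		http.Redirect(w, r, "/deployment-options", http.StatusFound)
		return
	}
	if inTestArm(anonymousID.Value) {
		// Registration first; returnTo brings the user back to the deployment choice.
		http.Redirect(w, r, "/sign-up?returnTo=/deployment-options", http.StatusFound)
		return
	}
	http.Redirect(w, r, "/deployment-options", http.StatusFound)
}

func main() {
	http.HandleFunc("/get-started", getStartedHandler)
	_ = http.ListenAndServe(":8080", nil)
}
```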