This Python script conducts a simulated A/B test, a method widely used in statistics and data science to compare two versions of a single variable and determine which performs better. The script serves as a practical case study of A/B testing.
Google Colab link with the Python code - Click here
The first step simulates click data for an experimental group (`exp`) and a control group (`con`):

- Uses the `numpy` and `pandas` libraries to generate random binary click data (`1` = click, `0` = no click).
- Creates two datasets: `df_exp` for the experimental group and `df_con` for the control group.
- Each group has 1000 samples, with different click probabilities (`0.5` for `exp` and `0.2` for `con`).
- Merges the data into a single DataFrame, `df_ab_test`, for analysis.
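The simulation step above could look roughly like this. The sample size and click probabilities come from the description; the random seed and the exact `pandas` layout are assumptions for illustration.

```python
import numpy as np
import pandas as pd

# Parameters from the description: 1000 samples per group,
# click probability 0.5 for exp and 0.2 for con.
N = 1000
p_exp, p_con = 0.5, 0.2

rng = np.random.default_rng(42)  # fixed seed for reproducibility (an assumption)

# 1 = click, 0 = no click
df_exp = pd.DataFrame({"group": "exp", "click": rng.binomial(1, p_exp, N)})
df_con = pd.DataFrame({"group": "con", "click": rng.binomial(1, p_con, N)})

# Merge both groups into a single DataFrame for analysis
df_ab_test = pd.concat([df_exp, df_con], ignore_index=True)
print(df_ab_test.groupby("group")["click"].mean())
```

The observed per-group means should land near the chosen probabilities (0.2 and 0.5), with sampling noise.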
The second step determines whether the difference in click rates between the experimental and control groups is statistically significant:

- Calculates the total number of clicks (`X_con`, `X_exp`) and the click probabilities (`p_con_hat`, `p_exp_hat`) for each group.
- Computes a pooled click probability (`p_pooled_hat`) and the pooled variance.
- Calculates the standard error (`SE`) to measure the precision of the click-probability estimates.
- Performs a two-sample Z-test, computing the test statistic (`Test_stat`), the critical value (`Z_crit`), and the p-value.
- Determines a confidence interval (`CI`) to estimate the true difference in click probabilities.
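The steps above can be sketched as a standard two-proportion Z-test. The variable names follow the description; the click counts here are hypothetical stand-ins for values the script would derive from `df_ab_test`, and `alpha = 0.05` is an assumed significance level.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical counts; in the script these come from df_ab_test.
N_con, N_exp = 1000, 1000
X_con, X_exp = 200, 500            # total clicks per group

p_con_hat = X_con / N_con          # per-group click probabilities
p_exp_hat = X_exp / N_exp

# Pooled probability and pooled variance under H0: p_con == p_exp
p_pooled_hat = (X_con + X_exp) / (N_con + N_exp)
pooled_var = p_pooled_hat * (1 - p_pooled_hat) * (1 / N_con + 1 / N_exp)

SE = np.sqrt(pooled_var)           # standard error of the difference
Test_stat = (p_exp_hat - p_con_hat) / SE

alpha = 0.05
Z_crit = norm.ppf(1 - alpha / 2)   # two-sided critical value
p_value = 2 * norm.sf(abs(Test_stat))

# Confidence interval for the difference, using the unpooled standard error
SE_unpooled = np.sqrt(p_exp_hat * (1 - p_exp_hat) / N_exp
                      + p_con_hat * (1 - p_con_hat) / N_con)
d_hat = p_exp_hat - p_con_hat
CI = (d_hat - Z_crit * SE_unpooled, d_hat + Z_crit * SE_unpooled)
print(Test_stat, p_value, CI)
```

With these illustrative counts the observed lift is 0.3, far outside what chance alone would produce, so the test statistic is large and the p-value is effectively zero.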
A website is testing two different webpage designs to see which one results in higher user engagement, measured by clicks.
- The `exp` group is shown the new webpage design, while the `con` group sees the original design.
- The script simulates user interactions and calculates the click-through rate for each group.
- Statistically analyzes the results to assess if the new design significantly improves user engagement.
The script provides statistical evidence on whether the new design (experimental group) leads to a higher click rate compared to the control group.
The p-value and confidence interval then inform the decision of whether to roll out the new design across the website.
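That decision rule can be sketched as a small helper. The function name, threshold, and return strings are assumptions for illustration, not part of the original script.

```python
def decide(p_value: float, ci: tuple, alpha: float = 0.05) -> str:
    """Reject H0 (no difference in click rates) only if the p-value is
    below alpha AND the confidence interval excludes zero."""
    if p_value < alpha and (ci[0] > 0 or ci[1] < 0):
        return "adopt new design"
    return "keep original design"

print(decide(0.001, (0.27, 0.33)))   # significant positive lift
print(decide(0.40, (-0.02, 0.05)))  # no evidence of improvement
```

Checking both the p-value and the interval guards against declaring significance when the estimated effect is too imprecise to act on.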
This case study illustrates the effective use of A/B testing in digital marketing, website optimization, and user experience research for data-driven decision-making.

