Leveraging Drona to improve the attendance-to-payment (A2P) ratio of masterclasses

Context
Drona is an in-house video conferencing tool built by Scaler. Masterclasses (MCs) are one of the primary services that acquire leads for Scaler: a masterclass is the central place where a potential learner gets in touch with Scaler's most prominent asset, a credible instructor. Through these classes, Scaler builds trust by offering direct, 1:1-style teaching that is not usually available elsewhere.
Within this offering, a learner can:
Primary:
- Learn the concept they were seeking to learn, and get their career questions answered by experienced industry professionals.
- Get learning materials for future reference.
Secondary:
- Understand how Scaler functions.
- Learn what goes into upskilling in a given domain.
- Earn a mini-course certificate.
Phases of the user journey
- Before registering for an MC
- After registering for an MC
- Before attending an MC
- While the MC is going on
- After attending the MC
Situation
We received a brief to improve the Attendance-to-Payment (A2P) ratio of these masterclasses. The current A2P stands at 1.6%.
There is no target metric, only a hypothesis that this number can be improved through design.

What factors influence learners’ decisions to enroll in paid programs after attending a free Masterclass on the Drona Platform?
Various touch-points of a masterclass — at this stage, before looking at the Drona platform itself, I spent time mapping all the communication that happens around an MC,
to answer questions like: what makes the promotional post for an MC exciting enough to join?
What takeaways does one get after attending the class?
MCs fall into two categories: Software Development and Data Science (DS).

Data Analysis
Looking at the numbers from MC activity over the last 3 months (Nov–Jan):
- Watching MC recordings as well as attending live classes was essential for making clear observations.
- I also needed to justify why I chose to watch specific classes. To answer that, I recorded specific data points from every MC that took place in those 3 months.
- As Registration-to-Attendance (R2A) is a primary metric, I used it to understand which kinds of classes tend to have high or low R2A (both funnel ratios are sketched in the code below).
- The table below shows the data points of the top- and low-performing MCs. Note that, apart from the usual MCs, the paid workshop-style MCs that run for 6 hours across weekends are excluded from this study: they are paid and naturally have high attendance.
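
For clarity, here is how the two funnel ratios used throughout this study can be computed. This is a minimal sketch; the field names are illustrative, not Drona's actual schema:

```python
# Minimal sketch of the funnel metrics used in this study.
# Field names (registered, attended, paid) are illustrative.

def funnel_ratios(registered: int, attended: int, paid: int) -> dict:
    """Registration-to-Attendance (R2A) and Attendance-to-Payment (A2P)."""
    return {
        "R2A": attended / registered if registered else 0.0,
        "A2P": paid / attended if attended else 0.0,
    }

# Example: 800 registrations, 400 attendees, 6 payments.
print(funnel_ratios(800, 400, 6))  # {'R2A': 0.5, 'A2P': 0.015} -> A2P of 1.5%
```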


Drop-off Trends




Behavioural Insights



Device-level Engagement





Are some MCs supposed to do better?
There are two factors that go into deciding the quality of a class.
- Instructor — Attendance gets a real boost when the instructor is a public figure or comes from a reputed background.
- Content — This impacts both the attendance and the drop-off rates.


What is different about the classes that perform exceptionally well?




Analysis & Hypothesis

Does this mean we can expect a similar pattern in 180-minute classes as well?

Going one step further — Drop-off trends and hotspots






Indicators of good engagement
Based on the analysis, good engagement in a class seems to be influenced by several factors:
- Device Type — Desktop users show higher engagement in terms of messages, questions, and reactions. Mobile users, despite a high drop-off rate, also contribute significantly to engagement metrics. This suggests that optimizing masterclasses for both desktop and mobile can enhance engagement.
- Class Completion Rate — The average completion rate among those who drop off (around 48.67%) highlights the importance of keeping attendees engaged beyond the halfway point of the class.
Observations
- MCs that cover hands-on topics tend to have higher retention rates than theory classes. This could be because learners expect a clear return on the 3 hours they invest.
- MCs whose content covers interviews or how-tos tend to have the lowest R2A and moderate retention. This could be due to learners wanting quick information and being impatient.
- Among DS MCs, theory classes tend to have higher R2A than hands-on ones.
- We need to understand why DS hands-on classes have only moderate R2A.
- Irrespective of R2A rates, retention percentages of the classes in these months were high. Could this be due to the presence of high-interest learners who are probably repeat visitors?
Actions
Analyzing MC content and feedback to discover patterns:
- We started observing the specific MCs noted in the table above. The reasons for choosing them:
- MCs with high R2A and high retention (and vice versa for DS).
- MCs with low R2A.
- To make this observation more structured, I made a small framework, color-coding specific types of observations.
This step is crucial in forming strong personas and class insights.


Summary of Observations
INSTRUCTOR
- The instructor carries most of the load of class engagement. For example, classes where the instructor followed a well-structured script showed highly engaged learners.
- Regular questions and nudges from the instructor were met with spontaneous thumbs up and thumbs down.
- Class comments showed that certain learners attend specifically to listen to particular instructors.
HOST
- At first glance, some classes appeared to show the host interrupting the learners' flow. In reality, the instructor used these host interruptions as breaks.
- Hosts tend to have a consistent delivery style across all MCs.
CLASS MATERIALS
- Learners are very keen on getting hold of materials like notes, certificates, and recordings.
- Nothing pressing surfaced in the class responses: 90% of ratings fall within 4–5 points, and the additional feedback is either positive or a generic "good"/"great" short answer.
- There is potential to capture better feedback through a survey.
tl;dr
Stakeholder Interviews — Instructor, Hosts, Learners, Program, Product, & Leadership
MCs that cover hands-on topics tend to have higher retention rates than theory classes. This could be because learners expect a clear return on the 3 hours they invest.
MCs whose content covers interviews or how-tos tend to have the lowest R2A and moderate retention. This could be due to learners wanting quick information and being impatient.
Among DS MCs, theory classes tend to have higher R2A than hands-on ones. We still need to understand why DS hands-on classes have only moderate R2A.
Irrespective of R2A rates, retention percentages of the classes in these months were high. Could this be due to the presence of high-interest learners who are probably repeat visitors?
Next Steps: Engagement Score
Taking a step back, we looked at various aspects of masterclass engagement.
We built a metric called the engagement score to map learners' actions during the class, including messages, quiz participation, polls, Q&A, etc. The weights shift depending on which interactions are launched in a class:
Case 1: When polls and quiz launched in class
Quiz:2 , Polls:2, Reaction:2, Chat:2 , Question:2Case 2: When polls are not launched, quiz launched
Quiz: 2.5 , Reaction:2.5 , Chat:2.5 , Question:2.5Case 3: When the quiz is not launched, polls launched
Polls: 2.5 , Reaction:2.5, Chat:2.5, Question 2.5Case 4: When both quizzes and polls are not launched
Question: 4, Reaction: 3, Chat:3
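
In each case the weights sum to 10, so the score lands on a 0–10 scale. Here is a minimal sketch of how such a score could be computed, assuming each component is first normalized to a 0–1 participation rate; the normalization and function names are my assumptions, not the shipped implementation:

```python
# Hypothetical sketch of the engagement score. Weights follow the four
# cases above and always sum to 10; normalizing each component to a 0-1
# participation rate is an assumption, not the shipped Drona logic.

WEIGHTS = {
    (True, True):   {"quiz": 2, "polls": 2, "reaction": 2, "chat": 2, "question": 2},
    (True, False):  {"quiz": 2.5, "reaction": 2.5, "chat": 2.5, "question": 2.5},
    (False, True):  {"polls": 2.5, "reaction": 2.5, "chat": 2.5, "question": 2.5},
    (False, False): {"question": 4, "reaction": 3, "chat": 3},
}

def engagement_score(rates: dict, quiz_launched: bool, polls_launched: bool) -> float:
    """Weighted sum of per-component participation rates (each in 0..1).

    Components absent from the active case simply do not count.
    Returns a score between 0 and 10.
    """
    weights = WEIGHTS[(quiz_launched, polls_launched)]
    return sum(w * rates.get(component, 0.0) for component, w in weights.items())

# Example: quiz launched but no polls; the learner answered 80% of quizzes,
# reacted to 50% of prompts, chatted a little, and asked no questions.
print(engagement_score(
    {"quiz": 0.8, "reaction": 0.5, "chat": 0.2, "question": 0.0},
    quiz_launched=True, polls_launched=False,
))  # 3.75
```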

User Personas



Competitive Analysis




UX Design
To improve the A2P of MCs, we figured a few things are important:
- Reducing drop-off and personalizing the experience to match expectations.
- Setting context before, during, and after the class.
- Attendees are not aware of what Scaler has to offer.
- Instructors are the main reason someone converts into a paid customer.
1. Reducing the drop-off in the first 30 minutes of the class
It was pretty clear that if we could reduce the drop-off, the class itself would create the value. We added an exit flow, which never existed before, to reduce drop-offs, along with nudges to improve class participation.
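
A minimal sketch of what that exit flow's routing could look like, assuming we know how far into the class a learner is when they try to leave; the threshold and nudge names are illustrative guesses based on the impact numbers below, not the shipped Drona logic:

```python
# Illustrative sketch of the exit-flow routing; the 75% threshold and
# nudge names are assumptions, not the shipped Drona implementation.

def on_leave_attempt(elapsed_fraction: float) -> str:
    """Decide which intervention to show when a learner tries to exit.

    elapsed_fraction: how much of the class has elapsed, in 0..1.
    """
    if elapsed_fraction < 0.75:
        # Early leavers: show the drop-off form and try to hold them
        # back in the class (e.g. by surfacing what is coming up next).
        return "show_dropoff_form"
    # Late leavers have already captured most of the class's value;
    # route them toward registering for an upcoming MC instead.
    return "show_upcoming_mc_nudge"

print(on_leave_attempt(0.4))  # show_dropoff_form
print(on_leave_attempt(0.9))  # show_upcoming_mc_nudge
```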



Impact Created
- 43% (323) of the 750 learners who joined tried to drop off during the first 0–75% of the MC's duration.
- Of those, the Drona drop-off form held back 21.98% (71) of learners into the class.
- Of the learners who received the Register for Upcoming MasterClass nudge, 6% (20) registered for an upcoming MC.
- 80% of the learners we held back stayed for ~5 minutes on average.
The remaining 20% stayed longer than 5 minutes and dropped off in the last 30% of the class.
2. Improving participation in the class to engage learners for longer
Apart from a few design changes, we suggested several operational actions based on the research we conducted:
- The instructor should use the first 15 minutes of the class to engage learners with the outcomes of the session, giving them a reason to stay.
- The host should make sure questions get answered, as most of the people who dropped off were those who did not get an apt response.




Impact Created
- Overall engagement with Quizzes, Polls, and QnA improved by 17%.
- The first-quiz take rate improved by 28%.
3. Instructor-led Retention
One clear learning from looking at various MCs: a few sessions outperformed the rest not just in registrations but also in attendance. It turns out payments followed the same trend, in sessions where learners had very high confidence in the instructor, mostly because the instructor was a public figure.
Often, learners were not aware of the instructor. The hypothesis: if we could leverage the first few minutes of the session to hype up the instructor, retention would improve.

Impact Created
- Though still in the MVP stage, we have seen a mixed response to this one. Live in a few classes, an average of 7% of learners checked the instructor's LinkedIn profile.
- User calls suggest that many learners started respecting the instructor more because of this screen and wanted to hear them out.
- Drop-off in the first 30 minutes reduced by 3%, while drop-off across the overall class remained the same.
4. Context Setting: Cue Cards
We solved a few problems through this:
- Showing the agenda of the session helped learners stay through the class.
- It also helped instructors adhere to the agenda and initiate engagement nudges like quizzes and polls.


Results
- With these changes shipped, the overall A2P of the sessions improved from 1.6% to 2.9%. In absolute numbers, this brought in roughly (16 × 800 × 0.013 × 12 × 3,50,000) INR 70 crores annually; see the sanity check after this list.
- The engagement score turned out to be flawed. Engagement during the class has little to do with a user's intention to buy into the program: the score's predictive efficiency turned out to be just 4%.
- Utilizing the first 5 minutes of the class helped build retention. We kept a slideshow going in the first few minutes.
- Not getting questions answered on the call was the major reason for drop-off. We are scoping out AI responses for curriculum- and program-related queries.
- A few learners were not aware of Scaler's offerings before they joined the masterclass. Cue cards helped us set expectations right from day 0.
- We recommended learners join via desktop for on-call practice sessions, lower drop-off rates, and better interactions.
- Working on the classroom PPT, the content, and an instructor manual for engagement were the key outcomes of this research.