Growth teams are commonplace among modern startups and organizations that run cross-functional initiatives, as these teams can help unlock a company's growth potential. A growth team is responsible for driving initiatives that improve a specific business metric. Typically, its role is to accelerate the adoption of product features by the target audience in order to achieve specific business outcomes.
Product growth increasingly depends on sophisticated techniques involving data modeling and machine learning. These can be crucial to building features that move the needle on key business metrics. However, they come with a unique set of challenges: data science initiatives can be time-consuming and are sometimes unnecessary. Such projects require an alternative framework for testing and validating your ideas quickly, lest your team waste precious resources on the wrong initiative.
This article outlines a simple framework I've developed for leveraging data science for growth initiatives and circumventing common problems—without compromising execution speed.
The primary goal of any growth initiative is to drive specific key performance indicators (KPIs) for the product and organization. Hence, it is advisable to evaluate the proposed project's impact on those KPIs. However, this is easier said than done.
Realistically, it is impossible to know the impact before executing the initiative. At this stage, the goal is not to quantify the impact to a T but to do your best by making reasonable assumptions. Over time, as you learn more about your project, you can revise those assumptions and arrive at a more accurate impact statement.
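One way to make those assumptions explicit is to write them down as a back-of-envelope calculation. The sketch below is purely illustrative; the function name, the inputs (eligible users, baseline conversion rate, assumed relative lift, expected adoption) and every number are hypothetical placeholders to be replaced with your own estimates and revised over time.

```python
# Illustrative back-of-envelope impact estimate.
# All inputs are assumptions, meant to be revised as you learn more.

def estimate_impact(eligible_users: int,
                    baseline_rate: float,
                    assumed_lift: float,
                    adoption: float) -> float:
    """Incremental conversions = users reached x baseline rate x relative lift."""
    return eligible_users * adoption * baseline_rate * assumed_lift

# Hypothetical example: 100k eligible users, 4% baseline conversion,
# an assumed 10% relative lift, and 60% expected feature adoption.
incremental = estimate_impact(100_000, 0.04, 0.10, 0.60)
print(f"Estimated incremental conversions: {incremental:.0f}")
```

Crude as it is, a calculation like this forces every assumption into the open, which is exactly what makes it revisable as the project progresses.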
Evaluating the return on investment can help you prioritize problems to solve and even help evaluate multiple solutions. Furthermore, if your teams have limited technical resources, the estimated impact of your project will help you easily maneuver prioritization and resourcing discussions with senior executives or partner teams.
Additionally, if your team or organization has a north star metric, it is desirable to go a step further and evaluate the project's impact on that metric. This will ease stakeholder conversations and make it easier to get buy-in for your project across the organization.
Data is the backbone of any data science initiative, the output of which is only as good as the data fed into it. Getting yourself familiar with the integrity of the data you are working with will help you optimize your solution and understand its limitations. For example, if the data is updated at a monthly cadence, it might not make sense to refresh the data science model every week.
In addition to basic checks, assess data quality for completeness, correctness, consistency, validity and timeliness. Thoroughly familiarizing yourself with your data streams before beginning the initiative can help you get ahead of problems that are simple to solve now but could become severe later.
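To make this concrete, here is a minimal sketch of the kind of checks I mean, covering completeness (missing values), validity (out-of-range values) and timeliness (stale records). The field names, the 30-day staleness threshold and the sample rows are all hypothetical; substitute your own schema and thresholds.

```python
# Minimal sketch of basic data-quality checks.
# Field names, threshold and sample data are hypothetical.
from datetime import date, timedelta

def check_quality(rows: list[dict], today: date) -> dict:
    issues = {"missing_revenue": 0, "negative_revenue": 0, "stale": 0}
    for row in rows:
        if row.get("revenue") is None:            # completeness
            issues["missing_revenue"] += 1
        elif row["revenue"] < 0:                  # validity
            issues["negative_revenue"] += 1
        if today - row["updated_at"] > timedelta(days=30):  # timeliness
            issues["stale"] += 1
    return issues

rows = [
    {"revenue": 120.0, "updated_at": date(2024, 5, 1)},
    {"revenue": None,  "updated_at": date(2024, 5, 2)},
    {"revenue": -5.0,  "updated_at": date(2024, 1, 1)},
]
print(check_quality(rows, today=date(2024, 5, 10)))
```

Even a lightweight report like this, run at the data's own refresh cadence, surfaces issues before they silently degrade a model.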
It is essential to fully understand the problems and pain points to build a successful solution. This is analogous to understanding the road conditions to inform whether to build a luxury car, sports car or off-roading ATV. Understanding the current process will help you identify the potential blockers you will encounter as you drive this initiative.
Working closely with the business teams provides crucial guidance for feature engineering. As you delve into the data, you may come across strongly correlated features, only to realize they are not independent and cannot be used for modeling. This is a common occurrence when teams use data that has been massaged; in such cases, the actual performance would differ greatly on real-world data. Furthermore, changes to data processes may have downstream impacts. Recognizing these impacts upfront can help loop in the right teams at the right time.
Now that you've done all the pre-work, it's time to evaluate solutions. Several solutions work well in theory, but some may not be practical. In addition to that, data science initiatives are often considered black-box algorithms that non-technical teams do not understand well.
To ensure your project sees the light of day, an end-to-end walkthrough with all stakeholders helps build trust. It is easy to skip past this step and dive directly into execution. Don't. This can be the single most significant point of failure of your project. Socializing your project plan can help stakeholders get comfortable with the approach, catch obvious gaps and add new perspectives to solve the problem.
It is common for data science growth initiatives to extend over a long time. However, business teams often do not have enough time or resources to invest without promising results. Like other product features, data science initiatives also require several iterations and take time to perfect. Breaking your project into smaller milestones helps build trust and confidence among stakeholders. But there's another way to speed this up.
Before the first line of code is written, think through a simple analytical approach for quick implementation. This requires working through all the expected changes but replacing time-consuming data modeling with a simple analytical approach. This is your project's minimum viable product. Use this MVP to get early feedback and resolve any remaining operational hassles. If you are looking to get buy-in from your leadership to invest in a complex attribution modeling technique, it would be useful to present expected gains from a basic, spreadsheet-based attribution model to reinforce the business need. Quick results further strengthen stakeholder buy-in and ease the pressure off data science teams to deliver impact in a limited time.
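As a concrete illustration of such an MVP, consider the attribution example: before investing in a complex model, a last-touch rule (credit each conversion entirely to the last channel touched) can be computed in a spreadsheet or a few lines of code. The sketch below is a hypothetical stand-in, not a recommended attribution method; the channel names and journeys are invented.

```python
# Hypothetical MVP: last-touch attribution as the simplest analytical
# stand-in for a full attribution model. Channels and journeys are made up.
from collections import Counter

def last_touch(journeys: list[list[str]]) -> Counter:
    """Credit each conversion entirely to the last channel touched."""
    return Counter(journey[-1] for journey in journeys if journey)

journeys = [
    ["email", "search", "social"],
    ["search"],
    ["social", "email"],
    ["search", "email"],
]
print(last_touch(journeys))
```

If even this crude model suggests meaningful gains from reallocating spend, that is a far stronger argument for funding the sophisticated version than a slide of promises.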
As data science becomes the backbone of most growth initiatives within organizations, it is essential for product growth teams to adopt best practices. The framework outlined here can help teams think through their data science growth initiative and ensure the project is a success.
This article was originally published on Forbes.com.