5 challenges Data Teams face supporting PLG revenue teams and how to address them
Diana Hsieh

Let’s not mince words. Product led growth (PLG) isn’t something that happens overnight. It has to permeate company culture, and it requires commitment from every team, not just the go-to-market (GTM) teams on the front lines.

PLG introduces a lot of new data about how customers are using the product, which GTM teams can leverage. But this data presents new challenges to the data teams tasked with figuring out how to model it in a scalable and reliable way. It doesn’t help that data leaders are constantly being bombarded with one-off analytics requests.

Although the situation sounds dire, enabling PLG data will pay dividends. Revenue teams at product led companies need to leverage product data in order to run the most effective, scalable GTM motions.

  • Sales teams can use product data to identify the best customers to connect with so that they are spending time on the customers most likely to buy. 
  • Marketing teams can use product data to refine nurture campaigns to provide a more personalized, relevant brand interaction with users. 
  • Customer Success teams can use product data to identify superstar customers or proactively reach out to customers who are at risk of churning. 

We’ve spoken to hundreds of companies implementing PLG or looking to transition to PLG, and throughout all these conversations, we’ve identified 5 common obstacles data teams need to address when it comes to enabling GTM teams with product data. In this blog post, we’ll elaborate on these 5 challenges and provide suggestions on how to address them.

The 5 common challenges for PLG data teams

  • Data is generated by multiple teams across your company
  • Data isn't owned by the stakeholders who use it (i.e. your revenue team)
  • Data is surfaced in platforms that teams don’t use on a daily basis
  • Data that is surfaced in downstream platforms is overwhelming in scope
  • Data that is converted into product qualified lead (PQL) scores becomes a black box that is rarely actionable

Data is generated by multiple teams across your company

Every team across your company is generating data about customers on a daily basis. 

  • Sales teams are manually inputting data about opportunities in their CRMs, tracking activity, or automatically enriching accounts and contacts with tools like Clearbit and ZoomInfo. 
  • Marketing teams are automatically adding leads into CRMs and tracking marketing activity. 
  • Product teams are tracking product usage data in data warehouses. 
  • Customer Success teams are working with customers through tools like Intercom. 

The data inputs are endless and are not uniformly collected, which means that everything lives in different data silos. It’s difficult to stitch all this data together into a single view for all teams to consume.

Further, when different teams are generating data, they are generating it for their own use cases rather than thinking about how to scale their data sets across the organization. We often observe this when sales teams get access to Segment data, which is typically generated by Marketing or Product. Sales teams don’t understand what various button clicks mean, making the barrier to adopting Segment data in sales processes that much higher. This is a challenge we navigated over at Correlated as well! For example, we track the various actions that users automate in our product and build alerts around “exceptions”. However, “exceptions” are labeled differently for different actions, so we have to catch all the labels correctly in order to get a complete alert for all “exceptions”. Obviously, if we threw this raw data at our sales team, they would have absolutely no idea how to use it.

Suggestion: Leverage a data warehouse as your source of truth and invest in enabling technologies in your data stack

Ultimately, the customers who are most successful in implementing a PLG GTM motion with us are the ones who have a data warehouse. This is because a data warehouse can act as your source of truth for all your customer and user data. It is certainly a challenge to sync all of your data sources with your data warehouse, but it is not an insurmountable one. ETL tools like Fivetran, Stitch, and Airbyte make it possible to do this using a single platform. 

Once all the data is loaded into your data warehouse, you can leverage all the features and functionalities of data warehouses to shape and model your data. We highly recommend dbt as a great option for those looking to scale their data modeling across their organization.
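
To make this concrete, here is a minimal sketch of the kind of normalization such a model might perform on the “exceptions” example described earlier, before the data ever reaches a sales team. The event names, labels, and mapping are hypothetical, and in practice this logic would typically live in a dbt model as SQL; the Python below just illustrates the idea.

```python
# Minimal sketch: normalizing inconsistently labeled "exception" events into a
# single canonical flag before exposing them to GTM teams. The event names,
# labels, and mapping below are made up for illustration; in practice this
# would usually be a dbt/SQL model rather than Python.

import pandas as pd

# Raw product events, roughly as they might land in the warehouse from Segment.
raw_events = pd.DataFrame([
    {"account_id": "acct_1", "event": "sync_failed",        "label": "error"},
    {"account_id": "acct_1", "event": "automation_skipped", "label": "exception"},
    {"account_id": "acct_2", "event": "webhook_timeout",    "label": "timed_out"},
    {"account_id": "acct_2", "event": "page_viewed",        "label": None},
])

# Each action labels its failure mode differently; map them all to one
# business-friendly category so downstream alerts catch every case.
EXCEPTION_LABELS = {"error", "exception", "timed_out"}
raw_events["is_exception"] = raw_events["label"].isin(EXCEPTION_LABELS)

# Roll up to one row per account -- a shape a revenue team can actually use.
exceptions_by_account = (
    raw_events.groupby("account_id")["is_exception"]
    .sum()
    .rename("exception_count")
    .reset_index()
)
print(exceptions_by_account)
```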

Leveraging a data warehouse is not without its limitations. It requires deep institutional knowledge and expertise from data teams in order to unblock key business challenges, and of course this becomes a cost center. Investing in tools that allow teams to self-serve their way into pre-built models is a great option, and there are many tools in the market today that can help. For example, Reverse ETL solutions like Census and Hightouch connect into the models your data teams have built and make it easy for downstream applications to consume that data. This makes it possible for your data teams to scale their efforts to support multiple use cases. BI solutions like Tableau, Looker, and more recently, Hex, provide business users with access to pre-modeled data. Tools like Correlated sit on top of your data warehouse to enable business users to build automated playbooks. 

Data isn't owned by the stakeholders who use it

One of the biggest challenges with implementing a data warehouse as described above is figuring out how to enable the various personas who need that data. If all you do is dump data into a data warehouse, none of the teams who rely on that data to do their jobs will be able to access it. Left to fester, this friction can end in a stalemate where data teams have implemented processes that prevent GTM teams from getting what they want. Alternatively, GTM teams could be requesting data team resources all day, every day, completely inundating data teams with requests. We’ve seen both situations at companies, and neither is pretty.

Suggestion: Focus on driving alignment between data teams and business goals. Make cross-team collaboration part of everyday processes.

Companies should view their data teams as a pooled resource that supports the entire company. As a result, data team resources should be allocated and scheduled similarly to how engineering teams schedule and allocate time to work on the product roadmap. It’s important, however, not to treat the data team as a separate silo where requests are sent (and often added to the backlog). Instead, scheduling and allocation of resources should happen hand in hand with the business units who will benefit from data team resources. This is the model that we’ve seen work best in companies with successful PLG GTM motions that leverage data.

One of the reasons why it’s important to intertwine business use cases with data team projects is that we’ve rarely seen companies successfully implement a product-led data strategy without doing so. If you send the data team off to “solve our usage tracking problems,” they’ll end up coming up with a solution that might not work with the downstream tools your business teams use, or they’ll try to find a solution that works for everything, essentially boiling the ocean on all possibilities. This ends up taking upwards of a year to implement, leaving GTM teams flying blind in the meantime. 

Instead, focus on use cases that you as a business want to accomplish and build out the infrastructure, models, and views that can support those use cases. Leveraging data is a core strategic competency for SaaS businesses, so it’s not realistic to expect to never have to touch your existing data models again. This is why the tools you choose to use are so important - although you will have to adjust your data models from time to time, it doesn’t have to be painful every single time.

Data is surfaced in platforms that revenue teams don’t use on a daily basis

Typically when companies start scratching the surface on leveraging product data to run a product led growth motion, they’ll take an initial approach of driving “visibility”. This is essentially Looker, Mode, Tableau, or some other business intelligence (BI) dashboard that shows things like how an account is using the product or a list of “product qualified leads”. This is a worthwhile exercise because it allows your data team to build out the data models that your GTM teams can benefit from. 

However, time and time again, we’ve heard feedback that GTM teams simply do not look at dashboards. In the rare cases that they do, they find that dashboards are rarely actionable. For example, let’s say you have a dashboard that lists out PQLs. An AE can’t just take an email address off that dashboard and start reaching out. That’s because (1) they have to figure out whether they’ve already emailed the user before (check Salesforce/Hubspot), (2) they have to figure out what to say to the user (check another dashboard on product usage), and (3) they aren’t sure why the user is even a PQL in the first place (Slack the data team to ask how a PQL is triggered).

Suggestion: Pipe data into downstream applications.

Rather than expecting GTM teams to look at dashboards and analyze the data, send data directly to the downstream applications they live in on a daily basis. We’ll talk about some of the downsides to this approach in the next “problem,” but one approach we’ve seen companies take is to build very specific views in Salesforce that contain all the information an AE (account executive) or CSM (customer success manager) needs to reach out in an intelligent way. For example, on the Account view, they might include things like seats sold, seats utilized, and number of active users. This allows the revenue team to stay in the tools they use, but still see the data they need to reach out to customers appropriately.
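
As an illustration, here is a rough sketch of how those account-level fields might be computed from warehouse-style tables before being synced into that Salesforce view. The table names, column names, and the 30-day definition of “active” are assumptions made purely for the example.

```python
# Rough sketch: computing the account-level fields mentioned above (seats sold,
# seats utilized, active users). Table/column names and the 30-day "active"
# window are assumptions for illustration only.

from datetime import datetime, timedelta, timezone
import pandas as pd

now = datetime.now(timezone.utc)

subscriptions = pd.DataFrame([
    {"account_id": "acct_1", "seats_sold": 50},
    {"account_id": "acct_2", "seats_sold": 10},
])

user_activity = pd.DataFrame([
    {"account_id": "acct_1", "user_id": "u1", "last_seen": now - timedelta(days=2)},
    {"account_id": "acct_1", "user_id": "u2", "last_seen": now - timedelta(days=45)},
    {"account_id": "acct_2", "user_id": "u3", "last_seen": now - timedelta(days=1)},
])

# Treat a user as "active" if seen within the last 30 days (assumed definition).
cutoff = now - timedelta(days=30)

account_usage = (
    user_activity.assign(active=user_activity["last_seen"] >= cutoff)
    .groupby("account_id")
    .agg(seats_utilized=("user_id", "nunique"), active_users=("active", "sum"))
    .reset_index()
    .merge(subscriptions, on="account_id")
)

print(account_usage[["account_id", "seats_sold", "seats_utilized", "active_users"]])
```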

Data that is surfaced in revenue team platforms is overwhelming in scope

This problem is linked in many ways to the prior problem: when companies discover that their teams are not using dashboards, their initial response is to pipe data into downstream applications. The key word here is data, because data is messy, there is a lot of it, and it requires analysis. We’ve seen companies pipe so much data into Salesforce that they literally run out of custom fields. For an AE or CSM looking at Salesforce, those views simply become information overload. You end up giving so much information to your GTM teams that they cannot do anything with it.

Suggestion: Strive to pipe actionable insights rather than raw data.

A better approach is to think about the end goal you want to achieve by sending data to downstream applications. Are you trying to surface a lead? Are you trying to suggest a next-best-action? That should determine what data you send and how you send it. 

For example, let’s dig into the “seats sold” and “seats utilized” example. You could pipe that information into Salesforce and surface it as a custom field on the Account object. However, wouldn’t it be better if you created a lead in Salesforce, assigned it to an AE, and added a note that said, “This customer is hitting 80% license utilization; add them to the ‘Upsell’ Outreach sequence”? That way you are sending insights that are immediately actionable. 
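
As a sketch of what that could look like, the snippet below assumes a small script (or reverse ETL job) running against the modeled data and uses the simple_salesforce library; the 80% threshold, field choices, and account payload are illustrative assumptions rather than a prescribed implementation.

```python
# Illustrative sketch: turn a high-utilization signal into an actionable
# Salesforce Lead rather than just another custom field. The threshold, field
# choices, and account payload are assumptions made for this example.

from simple_salesforce import Salesforce

UTILIZATION_THRESHOLD = 0.80  # assumed upsell trigger

def maybe_create_upsell_lead(sf: Salesforce, account: dict) -> None:
    """Create a Lead with an actionable note when license utilization is high."""
    utilization = account["seats_utilized"] / account["seats_sold"]
    if utilization < UTILIZATION_THRESHOLD:
        return

    sf.Lead.create({
        "LastName": account["admin_last_name"],
        "Company": account["account_name"],
        "Email": account["admin_email"],
        # The note tells the AE exactly why the lead exists and what to do next.
        "Description": (
            f"{account['account_name']} is at {utilization:.0%} license "
            "utilization. Add them to the 'Upsell' Outreach sequence."
        ),
    })

# Example usage (credentials and the account payload are placeholders):
# sf = Salesforce(username="...", password="...", security_token="...")
# maybe_create_upsell_lead(sf, {
#     "account_name": "Acme", "seats_sold": 50, "seats_utilized": 42,
#     "admin_last_name": "Smith", "admin_email": "admin@acme.com",
# })
```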

Some patterns we’ve seen that we really like are auto-creating Salesforce Leads, auto-creating Salesforce Opportunities, adding users to Outreach or Salesloft Sequences/Cadences, and adding users to lists in Hubspot. The opportunities are endless. By thinking about insights and expected actions over just data, you can ensure that all your efforts to build out a product-led data strategy are actually adopted by the GTM teams who have to put them into practice.

Data that is converted into PQL Scores becomes a black box

Final topic: PQL scoring. PQL scores have become very popular recently, but they are something you should approach with care. In some ways, PQL scores are the ultimate effort to convert data into insights: essentially, “we crunched all the data for you, and this customer is a great one to reach out to.” The problem is that GTM teams need to know how to reach out, which means they have to understand the “why” behind a PQL score. If they don’t understand the why, they are not going to adopt the score, and you’ll spend 6-12 months building out a score that ultimately no one uses. 

This is not to say scores are not useful - they certainly are, particularly when you’re trying to get an overall directional feel for whether someone is worth reaching out to. However, they should only be used as one piece of the puzzle and should not be viewed as a complete solution to PLG.

Suggestion: Think about the playbooks you want to enable after receiving an insight

There are a lot of vendors out there claiming to surface leads “magically” and with “machine learning”. We ourselves run machine learning models to help our customers identify which Signals might be worth going after, so it’s a viable strategy and can surface some interesting insights. However, again, insights are more valuable if they drive a downstream action. 

If you’re thinking of using a score, we’d suggest tying it to a playbook you want to execute. For example, create an “Upsell” score and align it with an “Upsell” playbook. Create an “Onboarding Self-Serve” score and align it with an appropriate playbook. If you rely on a single catch-all score, your results won’t be as good as when you focus on personalized, actionable playbooks that drive real results. 
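
One way to keep that pairing honest is to make it explicit in code or configuration, so a score only exists because a playbook consumes it. A hypothetical sketch, with made-up score names, thresholds, and actions:

```python
# Hypothetical sketch: every score is explicitly paired with the playbook that
# consumes it, so a score never becomes a free-floating black box. Score names,
# thresholds, and actions are made up for illustration.

from dataclasses import dataclass
from typing import Callable

@dataclass
class ScoredPlaybook:
    score_name: str
    threshold: float
    action: Callable[[dict], None]  # what should happen when the score fires

def add_to_upsell_sequence(account: dict) -> None:
    print(f"Add {account['name']} to the 'Upsell' Outreach sequence")

def enroll_in_onboarding_nurture(account: dict) -> None:
    print(f"Enroll {account['name']} in the self-serve onboarding nurture")

PLAYBOOKS = [
    ScoredPlaybook("upsell_score", 0.7, add_to_upsell_sequence),
    ScoredPlaybook("onboarding_self_serve_score", 0.5, enroll_in_onboarding_nurture),
]

def run_playbooks(account: dict) -> None:
    """Fire the action for every playbook whose score crosses its threshold."""
    for playbook in PLAYBOOKS:
        if account.get(playbook.score_name, 0.0) >= playbook.threshold:
            playbook.action(account)

run_playbooks({"name": "Acme", "upsell_score": 0.82, "onboarding_self_serve_score": 0.1})
```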

Time to dive in!

We’ve explored a lot of surface area in this blog post, from data warehouses to delivering data in actionable ways to GTM teams. This may sound like a lot, but we’ve seen plenty of companies succeed by taking things one step at a time. What we’ve seen work best is actually a more vertical approach. Rather than trying to solve problem 1, then moving on to problem 2, and so on, pick one important business challenge that aligns with a top-line goal and solve all the problems vertically.

For example, let’s say that your business wants to drive self-serve conversion. Next, identify the “actions” you want your GTM teams to take, the “data” you need to make that possible, and the “tools” that you’ll need to tie everything together. Finally, go forth and do it! By aligning data teams with actionable business outcomes, you’ll be able to get the backing and support you need across teams to get things done. 
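
If it helps, the plan for one vertical slice can be written down as a simple declarative spec before any tooling work starts. Everything below (the goal, actions, data, and tools) is just one hypothetical example, not a recommendation of specific vendors or steps.

```python
# Hypothetical example of one "vertical" use case, tying a single business goal
# to the actions, data, and tools that support it.
self_serve_conversion = {
    "goal": "Drive self-serve conversion",
    "actions": [
        "Notify the account owner when a workspace hits its activation milestone",
        "Add newly activated users to a conversion-focused email sequence",
    ],
    "data": [
        "Product events: signups, key feature usage, seats invited",
        "CRM: account owner, open opportunities, prior outreach",
    ],
    "tools": [
        "Warehouse + dbt models for account-level rollups",
        "Reverse ETL (or a tool like Correlated) to push insights into the CRM",
    ],
}
```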

If you'd like to learn how Correlated can assist your product-led journey, get started for free or schedule a call with our team.
