What's wrong with data teams
Why most data teams are expensive consulting shops — and how to fix it
"That sucks, but you're right; this is the best for the company. We actually will get by. We have workarounds; the salespeople don't."
I still remember the department head saying those words in our quarterly planning meeting at Unite. I'd just told him his team would get zero data support for the next twelve months while we focused on sales analytics instead. He could have fought me. He could have escalated to leadership.
Instead, he chose the company over his personal requests.
That moment changed everything I believed about data teams.
For a year, I'd been running our business intelligence team like Apple customer service—fast, responsive, focused on stakeholder satisfaction. We processed tickets, delivered dashboards, and measured success by delivery speed. With 500 employees and ~100 weekly BI users, we were the model of a modern data team. Our stakeholder satisfaction hit 95%+. Average ticket resolution: 3.2 days.
We were also completely wrong about what data teams should do.
The problem isn't that most data teams lack technical skills. Threads on data-focused subreddits make it painfully clear that even practitioners recognize their teams are fundamentally dysfunctional. Teams build beautiful, unused models. Executives ignore recommendations with practiced indifference. Industry analysis confirms that 73% of data science projects never reach production.
If you only have five minutes: the key points
Most data teams run a service-desk model that optimizes for ticket velocity and stakeholder happiness—not for business impact.
Three hidden dysfunctions fuel the problem: solution-first thinking, technical purity over pragmatism, and insulation from consequences.
When the business pivots, the service model crumbles because requests become fuzzy, context changes daily, and "fast delivery" no longer equals value.
Treat the data function as a product team whose customers are company decisions, not individual requestors. Success = decisions made, not dashboards shipped.
Measure your work by decision implementation and business outcomes; be willing to disappoint some stakeholders to serve the company's strategy.
The real tragedy isn't that data teams fail—it's that we keep building them like internal consulting shops when they should be strategic product teams.
The service model trap
Unite's data team in 2018 was everything the industry preached. We operated like a repair service: clear problems came in through tickets, we had step-by-step solutions, and we delivered quickly. Sales manager needed conversion metrics by product line? Done in two days. Marketing wanted cohort analysis? Ready by Friday.
The model worked because it met three conditions that felt permanent until they weren't. Problems were well-defined. Solutions followed predictable patterns. Tasks needed quick implementation, not strategic thinking.
Then Unite underwent a triple transformation that shattered our comfortable assumptions: a complete business model pivot, a migration to microservices, and a move to the cloud. Suddenly, those three conditions collapsed out from under us, like load-bearing beams pulled from a standing building.
Requests started coming in for products we'd never heard of—whole business lines that didn't exist the previous quarter. We analyzed data from unknown sources using terminology that changed weekly, sometimes daily. Our "step-by-step" solutions disappeared because every dashboard now required understanding context that evolved faster than we could learn it.
Our quarterly planning meetings became the perfect metaphor for dysfunction. Instead of daily ad-hoc requests, we'd gather everyone's wishlists at the start of each quarter. Twenty department heads with printed lists of desired reports and dashboards—dozens of disconnected requests masquerading as strategy. We'd spend four hours prioritizing these requests, creating elaborate spreadsheets with effort estimates and business value scores.
I convinced myself this was progress—moving from daily chaos to quarterly batches. But we hadn't solved anything. We'd just become a quarterly ticket processor instead of a daily one, like switching from individual pain pills to monthly injections of the same drug.
The real problem: we were optimizing for stakeholder satisfaction instead of business impact. Every request felt equally valid because we had no framework for distinguishing between what people wanted and what the company needed.
The three dysfunctions that kill data teams
The Unite transformation revealed three dysfunctions that plague most data teams—dysfunctions so embedded in how we think about data work that we mistake them for best practices. I started recognizing these patterns everywhere: in sleepless nights wondering why our beautiful work gathered dust, in the industry reports documenting systematic failure.
Dysfunction 1: Solution-first thinking
Data teams ask "What can we build?" instead of "What decision needs to be made?" This distinction seems subtle until you watch it destroy project after project. During our service model days, we'd get a request for "churn prediction modeling" and immediately think algorithms, feature engineering, model validation. My team would light up—finally, some real machine learning!
What we should have asked: "What specific decision will this enable, and who will make it?"
Often, the answer revealed the real need was basic cohort analysis—something solvable with SQL in hours, not machine learning over weeks. But solution-first thinking feels like expertise. It feels sophisticated to reach for advanced techniques. Meanwhile, that customer success manager is making retention decisions based on gut feelings because our sophisticated model takes three weeks to run and produces outputs they can't use.
Solution-first thinking creates sophisticated solutions to problems that don't exist while ignoring simple solutions to problems that matter.
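To make "SQL in hours" concrete, here is a minimal sketch of the kind of cohort query that often answered the real question. The subscriptions table and its columns are hypothetical, and the date functions are Postgres-flavored:

```sql
-- Monthly signup cohorts with three-month retention.
-- Assumes a hypothetical "subscriptions" table with one row per
-- customer: (customer_id, signup_date, cancel_date NULL if active).
SELECT
    DATE_TRUNC('month', signup_date) AS cohort_month,
    COUNT(*) AS customers,
    AVG(
        CASE
            WHEN cancel_date IS NULL
              OR cancel_date >= signup_date + INTERVAL '3 months'
            THEN 1.0 ELSE 0.0
        END
    ) AS retained_3m
FROM subscriptions
GROUP BY 1
ORDER BY 1;
```

A query like this won't impress anyone at a conference, but it puts a retention number in front of the customer success manager the same day they ask.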
Dysfunction 2: Technical purity over pragmatic impact
Data teams optimize for code quality, model sophistication, and architectural elegance instead of decision velocity. This isn't a lack of skill—it's misdirected excellence. I watched brilliant data scientists at Unite spend six months building ensemble models to predict customer lifetime value with 87% accuracy. The work was genuinely impressive—clean code, rigorous validation, the kind of project that would shine in an academic portfolio.
It influenced exactly zero business decisions.
Meanwhile, the sales team made territory assignments using Excel formulas and gut feelings. Account managers prioritized outreach with manual scoring systems they'd built in Google Sheets. The sophisticated model sat unused because it couldn't integrate with how sales managers actually worked—like building a Formula 1 car for people who need to drive to the grocery store.
Decision-makers need answers Tuesday, not perfect models eventually. They need insights they can act on with their existing tools and processes, not elegant solutions that require learning new interfaces.
Dysfunction 3: Insulation from consequences
Most data teams report to CTOs or Chief Data Officers, not business leaders whose decisions they're supposed to influence. This organizational structure creates a fundamental misalignment that quietly kills impact. At Unite, our success metrics were all delivery-focused: dashboards shipped, requests completed, stakeholder satisfaction. We were never measured on whether sales managers changed territory strategies or executives made different decisions based on our analysis.
This turns data teams into internal consulting firms—impressive presentations get delivered, sophisticated analyses get shared, recommendations get documented, and everyone moves to the next project. There's no skin in the game for whether insights actually change business operations.
The organizational chart reinforces this dysfunction at every level. When data teams report to technology leaders, they're evaluated using technology metrics. Code quality matters more than business impact. Model accuracy matters more than decision implementation. Architectural elegance matters more than stakeholder behavior change.
What actually works: The product transformation
The solution isn't better tools or smarter data scientists. It's treating data teams like product teams instead of service teams. But this transformation requires confronting the comfortable myth that being responsive to stakeholder requests equals being valuable to the business.
When I decided to flip our quarterly planning, I was risking the credibility I'd spent a year building as the PM who delivered on stakeholder requests. Instead of asking stakeholders what they wanted, I was going to propose what the company needed. Instead of collecting wishlists, I spent weeks understanding the critical decisions facing Unite's leadership during our transformation.
The result was radically different. Instead of thirty disconnected dashboards, we proposed two major initiatives: sales segmentation optimization and customer health scoring. Both aligned with Unite's strategic priorities. Both would be measured by decision implementation, not technical sophistication.
The magic was forcing hard choices and making the strategic case clearly. When stakeholders had to choose between personal requests and company-critical initiatives, they consistently chose the company. But this only worked because I'd done the homework to understand what the company actually needed.
The complete transformation process took three years and required fundamental changes in how we operated. We developed a dual communication strategy that acknowledged the reality of internal customers: informal updates with all users who needed tactical information, but strategic planning only with actual decision-makers who could implement our recommendations.
We learned to distinguish between people who support decisions and people who make them—a distinction that seems obvious until you're trying to keep everyone happy.
The vision that guided everything was deceptively simple: "To help employees make better and faster decisions thanks to data." This clarity revealed why the service model couldn't work—you can't optimize decision-making by optimizing ticket processing. You can't improve judgment by improving responsiveness.
Most importantly, we measured success differently. Instead of stakeholder satisfaction and delivery completion, we tracked decision implementation and business impact. Did sales managers actually use territory recommendations? Did customer success teams change outreach based on health scores?
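We never published our exact queries, but the shape of a decision-implementation metric is easy to sketch. Assuming hypothetical territory_recommendations and territory_assignments tables, adoption is simply recommended versus actual:

```sql
-- Of the accounts we recommended reassigning, what share actually
-- moved? All table and column names here are hypothetical.
SELECT
    r.quarter,
    COUNT(*) AS recommendations,
    AVG(
        CASE
            WHEN a.territory = r.recommended_territory
            THEN 1.0 ELSE 0.0
        END
    ) AS implemented_share
FROM territory_recommendations r
JOIN territory_assignments a
  ON a.account_id = r.account_id
 AND a.effective_quarter = r.quarter
GROUP BY r.quarter
ORDER BY r.quarter;
```

An implemented_share sitting near zero tells you more about a data team's impact than any satisfaction survey will.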
The results validated everything we'd suspected but had been too afraid to test. Stakeholders preferred strategic focus over responsive service, even when it meant getting less immediate attention. Decision-makers valued implementable insights over sophisticated analysis, even when it meant sacrificing technical elegance. The company made better decisions faster when data teams acted like strategic partners rather than technical consultants.
Why this matters now
Companies are spending $2-5M annually on data teams while most projects never reach production—burning cash on elaborate solutions to problems that don't exist. Post-2023 efficiency focus means executives are questioning every expensive team that doesn't directly drive results.
The companies that fix this will compound competitive advantages in ways that won't be immediately visible to competitors. While their rivals burn millions on unused ML models and ignored dashboards, they'll have embedded analysts making critical decisions faster and better.
The transformation isn't about changing tools or hiring different people. It's about changing how data teams define success and who they're accountable to. It's about having the courage to disappoint individual stakeholders to serve the broader organization—the same courage that department head showed when he chose company needs over personal convenience.
The stakeholder who told me "That sucks, but you're right" understood what most data leaders miss: sometimes the most strategic thing a data team can do is say no to the urgent so they can say yes to the important. Sometimes being truly data-driven means building fewer dashboards, not more.
The dysfunction isn't that data teams lack sophistication—it's that we've confused sophistication with impact. Until we fix that fundamental misalignment, we'll keep building beautiful systems that no one uses while wondering why organizations still make decisions based on intuition and Excel.