3 Lessons I Learned as a Data Leader (and How to Apply Them)

Discover 3 essential lessons every data leader should know to strengthen data leadership and drive business value.
Bruna Baungarten
Data Manager
Published on September 29, 2025
Last updated on October 9, 2025

Data is everywhere, but extracting real value from it requires technique, focus, and intention. Throughout my career, I have built a solid foundation that combines theoretical knowledge, including a PhD, with practical experience in data. I have worked in different roles, from Data Scientist to Data Manager, with experience in Analytics and Engineering along the way.

Over time, I developed a genuine passion for turning data into meaningful impact and for fostering a strong self-service culture. Along my journey, I have learned practical lessons that have shaped not only how I work but also how I lead teams and drive data strategy.

In this article, I share 3 key lessons that I pass on to my team and that can help you get more value from your data and achieve more reliable insights across the business.

Lesson 1: Create Business Value

It doesn’t matter whether we develop a sophisticated machine learning model or implement a data migration pipeline using the most advanced tools on the market. What truly counts is the impact the solution generates for the business and the value it delivers.

Balancing a highly technical field, focused on optimized algorithms, precise calculations, and statistically valid tests, with the company’s strategic concerns is a major challenge. 

At the end of the day, what executives really want are answers to questions like:

  • How can we increase sales in the next quarter?
  • Why aren’t customers returning after their first purchase?
  • How can we optimize results without raising costs?

These questions can (and should) be answered with data. But identifying the right insights among dozens of analyses and visualizations requires sensitivity, pragmatism, systems thinking, and a strong focus on business objectives.

More than just finding answers, it is essential to translate technical insights into a language that business teams can easily understand and to tell stories that highlight the relevance of those discoveries. Often, it is storytelling that determines whether your solution moves forward or ends up shelved.

For me, this ability to transform data techniques into real business value, paired with strong storytelling, is one of the key markers of seniority in the field.

Lesson 2: Master the Data Journey

As you probably know, the Data Journey is the path data takes until it generates value for the business, and there are several frameworks that describe this journey (e.g., CRISP-DM).

A basic model of the Data Journey might include:

  1. Understanding the business context
  2. Understanding the available data sources
  3. Data collection and storage
  4. Data processing
  5. Data analysis or modeling
  6. Value generation
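
To make these stages concrete, here is a minimal Python sketch that maps each step onto a small function. The in-memory orders sample, the field names, and the function names are illustrative assumptions rather than a prescribed implementation; in a real project, collection and storage would involve actual source systems and a warehouse or lake.

```python
# A minimal sketch of the Data Journey as a pipeline skeleton.
# The sample orders, field names, and functions are illustrative assumptions.

from statistics import mean

# Steps 1-2: business context and data sources, captured as plain documentation
BUSINESS_QUESTION = "Why aren't customers returning after their first purchase?"
SOURCE_DESCRIPTION = "orders: one row per order, with customer_id and amount"

# Step 3: data collection and storage (stubbed with an in-memory sample)
def collect_orders():
    return [
        {"customer_id": 1, "amount": 120.0},
        {"customer_id": 1, "amount": 80.0},
        {"customer_id": 2, "amount": 45.0},
        {"customer_id": 3, "amount": 200.0},
    ]

# Step 4: data processing, grouping order amounts by customer
def process(orders):
    per_customer = {}
    for order in orders:
        per_customer.setdefault(order["customer_id"], []).append(order["amount"])
    return per_customer

# Step 5: analysis, answering the business question with simple aggregates
def analyze(per_customer):
    returning = sum(1 for amounts in per_customer.values() if len(amounts) > 1)
    all_amounts = [a for amounts in per_customer.values() for a in amounts]
    return {
        "customers": len(per_customer),
        "returning_customers": returning,
        "avg_order_value": round(mean(all_amounts), 2),
    }

# Step 6: value generation, a result stakeholders can act on
if __name__ == "__main__":
    print(BUSINESS_QUESTION)
    print(analyze(process(collect_orders())))
```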

However, it is essential to understand that the Data Journey cannot be isolated from the context in which it is being applied. For each project, even within the same company, the path can (and usually will) be different.

In projects involving a company’s core data, for example, many of these steps are already resolved, allowing you to jump straight into analysis and modeling. 

On the other hand, there are cases where simply making the data available represents value generation for the business. In this scenario, the initial understanding steps are necessary, but the analysis step may be replaced by the very act of making the data accessible.

Mastering the Data Journey is vital for any professional in the field. Contrary to what many believe, it’s not enough to know just “your part” of the journey. 

Data Engineers, for example, need to understand the purpose of the data to act effectively in collection, storage, and processing. Data Scientists cannot perform good feature engineering without understanding the data sources and the processing applied by Engineers.

If I could give one piece of advice to start mastering the Data Journey, it would be this: focus on the initial stages! That’s where most of the success (or failure) of a project is defined. Let’s look at this in a practical way.

  • Understanding the business context: Without a clear understanding of the company’s goals and needs, it is impossible to properly direct the efforts of a data project. Always keep the organization’s strategic priorities in mind and analyze the potential impact that each deliverable can generate. Seek to deeply understand stakeholders’ pain points and objectives so your work can precisely address them. 

    At this stage, it’s worth scheduling conversations with key business areas. Conduct structured interviews with targeted questions that help uncover the root problem. Don’t expect the problem to come fully defined; it’s the data team’s role to translate and structure it.
  • Understanding the available data sources: A strong data professional must know their sources in order to analyze and experiment accurately. Understanding how data is generated and which product interactions produce each data point is essential to knowing what’s useful and how to apply it effectively in your project.

    At this point, keep close alignment with Product and Technology teams. Follow new feature launches and demos, and join training sessions that clarify how these interactions translate into data generation.

Understanding the Data Journey is not difficult, but it is labor-intensive and requires attention to detail, often involving a series of stakeholder interviews, cross-team alignments, and technical validations.

Without a doubt, mastering this journey makes all the difference in the development and quality of data projects. And if a professional truly wants to generate value with their work, they must venture into this journey.

Lesson 3: Keep it Simple

One of the most valuable lessons I’ve learned and shared in my career is that the famous design principle “Keep It Simple, Stupid!” (or KISS, for short) is just as essential in data projects.

This principle emphasizes simplicity in development and delivery, to ensure that solutions are successful, easy to understand, and effectively used.

In data, it’s no different. We should always strive for the simplest solution that meets the business objective. Excessive complexity tends to create barriers, delays, and maintenance challenges. Moreover, the more complex the delivery, the harder it is for stakeholders to understand it and, consequently, the lower the adoption rate.

Data concepts are already complex enough, even for those of us who work in the field. Whether it’s indicator analysis, performance monitoring, predictive algorithms, or data pipelines, complexity is part of our daily reality. When a project becomes unnecessarily complicated, it’s easy to lose sight of the main goal, over-test, and delay delivery. Simplicity, on the other hand, drives efficiency and productivity, keeping development focused and on track.

If we, as specialists, already face significant challenges working with data, imagine business users who don’t deal with databases, sources, or algorithms on a daily basis. They need to understand what was built so they can apply insights and transform the business. Stakeholder comprehension is vital to the success of data solutions. And here lies a crucial point: you must be able to explain your solution simply. If you can’t explain it as if to a child, perhaps you don’t fully understand it yourself (as the saying goes).

So how do you apply KISS in data projects? For me, it always comes down to finding the most direct route between a) the inputs and available data and b) the delivery objective. Often, this route may seem too simple, and we’re tempted to build something more “grandiose” to impress stakeholders. That’s a common pitfall: trying to prove yourself instead of focusing on delivering efficiently.
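
As an illustration of what the “most direct route” can look like in practice, the sketch below answers a question like “what should we expect in sales next quarter?” with a naive moving-average baseline before any sophisticated model is considered. The quarterly figures and the function name are made up for the example.

```python
# A deliberately simple first delivery: estimate next quarter's sales with a
# moving average of recent quarters. All figures below are made-up examples.

def moving_average_forecast(history, window=4):
    """Forecast the next value as the mean of the last `window` observations."""
    recent = history[-window:]
    return sum(recent) / len(recent)

quarterly_sales = [1.2, 1.4, 1.3, 1.6, 1.5, 1.7]  # in millions, illustrative
baseline = moving_average_forecast(quarterly_sales)
print(f"Baseline forecast for next quarter: {baseline:.2f}M")

# Only reach for a more complex model if this baseline clearly falls short of
# the business objective; until then, it is a cheap, explainable starting point.
```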

It’s also important to remember that the shortest solution may not cover every aspect of the problem. But here’s the question: couldn’t that still be a good first delivery? An initial value delivery until something more robust is developed and the scenario is better understood? In most cases, yes. A simple approach can serve as a valuable starting point, enabling continuous evolution and adaptation over time.

In short, the KISS principle shows us that in data, simplicity does not mean a lack of sophistication. It means clarity and efficiency. Simple solutions make it easier for business users to understand and adopt them, while also ensuring faster, more effective deliveries. So, when planning your next data project, remember: less is almost always more.

-

Ready to unlock the full potential of your company’s data and accelerate results?

👉 Connect with Shearwater today to discover how our on-demand, specialized data experts can help you scale analytics and drive greater value and impact from your data.

This post was written by

Bruna brings a unique blend of theoretical knowledge and practical experience in data. She’s passionate about turning data into impact and fostering a strong self-service culture.
