Lasting Success With Data Science: Fostering Agility Across Data Science Teams

By Kris Schroeder, Business Architect & Agilist
9/13/2022

In 2020, Forbes reported that nearly 15% of leading firms have deployed Artificial Intelligence (AI) capabilities into production. Companies that can respond quickly to change, grow their people and leadership capabilities, and emphasize effective delivery agility will gain a significant advantage in their marketplace.

Responding to your customers’ needs requires innovation — and agility is the key to driving that innovation.

In this third and final part of our series of blogs on data science, we’ll explore why your data science teams should be striving for agility. In part 1, we looked at common reasons why data science projects fall short. In part 2, we discussed how to improve your data science project maturity. Now, let’s dig into how to bolster your team's agility.

What is agility?

Agility means being responsive and adaptive. Innovation requires the ability to respond and adapt to a changing marketplace and changing business needs, and that is exactly why data science teams require agility.

When you hear the word agility, you may think of Agile or common frameworks like Scrum and Kanban. Those frameworks don’t fit neatly with data science, given the unpredictable nature of the work and the iterative, circular nature of the data science lifecycle. But a focus on agility can still help data science teams.

When thinking of agility, start with the Agile mindset. A mindset is the direction of one’s thinking: the beliefs that orient our thoughts and actions. One common mindset that people exhibit is glass-half-full or glass-half-empty thinking. The optimistic beliefs inherent in a glass-half-full mindset shape how someone responds to situations and the decisions they make. Similarly, your Agile mindset influences how you interact with your team and how you make decisions about your work.

The Agile mindset is defined by the values in the Agile Manifesto and described by the 12 Agile Principles. Since many data scientists are not familiar with the manifesto or its principles, let’s simplify. Alistair Cockburn, one of the original signers of the Agile Manifesto, distilled it into his Heart of Agile approach, which highlights four focus areas: collaborate, deliver, reflect, and improve.

Heart of Agile
Source: Heart of Agile, Alistair Cockburn

Data science agility

Just as in software development, data science teams need an approach for prioritizing work, decomposing it into small chunks, identifying success criteria, and collaborating to demonstrate business value. You may need to add talent to help data scientists adopt the Agile mindset and improve any stagnant processes that are holding the team back from delivering value.

Adapting to feedback and responding to change needs to happen at every step of the data science lifecycle.

Data science lifecycle

Business understanding & data collection: By increasing your focus on business outcomes, you gain clarity on the question you are trying to answer. After creating a hypothesis to achieve those outcomes and determining whether you have the data to test it, present your concept to the business and get feedback on your understanding.

  • Collaborate to gain clarity on business outcomes.
  • Deliver a hypothesis and present your concept.
  • Reflect on feedback from the business.
  • Improve and refine your hypothesis based on feedback.

Data preparation: Ensuring the team has the right data to test the hypothesis is only half the battle. Depending on data quality and how the pipelines are set up, entire iterations may be spent just cleaning and delivering the data. Once delivered, additional preparation and featurization are needed to get the data ready for model consumption. By involving the right stakeholders in your data governance process, you shorten feedback loops on data accuracy and build their confidence in the models you create. (A minimal preparation sketch follows the list below.)

  • Collaborate with stakeholders in your data governance process.
  • Deliver and share code.
  • Reflect on feedback from the business and reflect on your delivery capabilities.
  • Improve and refine your data preparation process.
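As promised above, here is one way that preparation step might look in Python using pandas. The column names (customer_id, order_date, amount) and the per-customer aggregations are illustrative assumptions, not a prescription for your pipeline.

```python
# A minimal data preparation sketch using pandas. The column names
# (customer_id, order_date, amount) and the aggregations are illustrative
# assumptions, not a prescription for your pipeline.
import pandas as pd


def prepare_features(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean raw transaction records and derive simple per-customer features."""
    df = raw.copy()

    # Basic cleaning: drop rows missing key fields and remove duplicates.
    df = df.dropna(subset=["customer_id", "order_date", "amount"]).drop_duplicates()

    # Coerce types so downstream steps behave predictably.
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df = df.dropna(subset=["order_date"])

    # Simple featurization: aggregate records per customer for model consumption.
    return (
        df.groupby("customer_id")
        .agg(
            total_spend=("amount", "sum"),
            order_count=("amount", "size"),
            last_order=("order_date", "max"),
        )
        .reset_index()
    )
```

Keeping steps like this in small, reviewable functions makes it easier to share code and act quickly on feedback about data accuracy.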

Model training & evaluation: As you test your hypothesis, continue to engage stakeholders frequently. These feedback sessions keep the data scientists from going off course or building something too complex for a simple problem. Intentionally seek feedback on the value being delivered. There needs to be a clear decision: deploy the model, investigate it further, or stop investing in this particular hypothesis and focus elsewhere. (A simple baseline training sketch follows the list below.)

  • Collaborate with your team to develop your model.
  • Deliver and present findings frequently in feedback sessions.
  • Reflect on feedback from the business and your processes.
  • Improve by adapting your model; try experimenting with new ways of working.
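Here is what that baseline training-and-evaluation step might look like with scikit-learn. The feature matrix X, target y, and the choice of ROC AUC as the success metric are assumptions to replace with whatever you and your stakeholders agree on.

```python
# A baseline train-and-evaluate sketch with scikit-learn. X, y, and the
# choice of ROC AUC are assumptions; substitute the data and success metric
# agreed on with your stakeholders.
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split


def train_and_evaluate(X, y):
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42, stratify=y
    )

    # Start with a simple, explainable baseline before adding complexity.
    model = LogisticRegression(max_iter=1000)
    model.fit(X_train, y_train)

    # Score against a single metric so the deploy / keep investigating / stop
    # decision can be made together with the business.
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    return model, auc
```

Reporting one agreed-upon metric at each feedback session makes the deploy, keep investigating, or stop conversation far easier to have with the business.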

Deployment & business integration: Model deployment requires specialized skills that many data science teams struggle to staff. An iterative mindset is needed to ensure models are delivered to the right place in the right way. After deployment, agility means adjusting to inevitable data and market changes, measuring the value delivered through adoption, and validating the ROI. (A small drift-check sketch follows the list below.)

  • Collaborate to validate the adoption of deployed models.
  • Deliver model adjustments based on learnings.
  • Reflect on market changes and user feedback.
  • Improve by incorporating learnings into future work.
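One small example of that post-deployment agility is a periodic check for data drift between the training data and what the model now sees in production. The sketch below uses a two-sample Kolmogorov-Smirnov test from SciPy; the feature list and the 0.05 threshold are illustrative choices, not a standard.

```python
# An illustrative post-deployment drift check. The feature list and the
# 0.05 significance threshold are assumptions to adapt to your context.
from typing import List

import pandas as pd
from scipy.stats import ks_2samp


def detect_drift(train_df: pd.DataFrame, live_df: pd.DataFrame,
                 features: List[str], alpha: float = 0.05) -> List[str]:
    """Return the features whose live distribution has shifted from training."""
    drifted = []
    for col in features:
        # The two-sample Kolmogorov-Smirnov test compares the distributions.
        _, p_value = ks_2samp(train_df[col].dropna(), live_df[col].dropna())
        if p_value < alpha:
            drifted.append(col)
    # Feed this list into the team's reflect-and-improve loop.
    return drifted
```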

The process then iterates: a new use case kicks it off again, or any individual step may need to be revisited and refined.

Experimentation

In Agile, we inspect and adapt. We value reflecting on what we’ve done to see if it’s working — and if it’s not, how we can try to do it better. In Scrum, we also value empiricism, which requires learning through our experiences. In Lean and UX, defining experiments helps combat uncertainty. Applying the idea of running experiments to data science teams can help us discover which practices are helpful and which don’t work.

In part 2, How to Improve Your Data Science Project Maturity, we introduced the experiment format:

  • Hypothesis: indicates what you will explore and the benefits you hope to achieve
  • Experiments: clarifies how you will test your hypothesis
  • Timebox: sets a reflection point to intentionally determine whether further investment should be made
  • People: identifies who will be accountable for the tests and any dependencies

This format helps clarify work and shorten feedback loops. It can also be used to improve the delivery approaches, practices, and techniques that enhance your team’s effectiveness. On a regular cadence, the team reflects on their processes and how they interact with each other, identifying potential improvements. They can then choose the most important improvement to focus on and document it in the form of an experiment. At the end of the timebox, they can decide whether to continue with that improvement or try a new one.
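If it helps to make the format tangible, one lightweight way a team might record an improvement experiment is sketched below; the fields mirror the list above, and the sample values are invented purely for illustration.

```python
# A lightweight record of an improvement experiment, mirroring the format
# above. The sample values are invented purely for illustration.
from dataclasses import dataclass
from datetime import date
from typing import List


@dataclass
class Experiment:
    hypothesis: str         # what you will explore and the benefit you expect
    experiments: List[str]  # how you will test the hypothesis
    timebox_ends: date      # when you will reflect and decide on further investment
    people: List[str]       # who is accountable, plus any dependencies


improvement = Experiment(
    hypothesis="A 30-minute stakeholder demo each iteration will reduce rework",
    experiments=["Schedule a demo at the end of each iteration for one month"],
    timebox_ends=date(2022, 10, 31),
    people=["Data science lead", "Product owner"],
)
```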

Wrapping up

Data science has become an integral part of staying competitive and relevant across industries. And though many data science teams face common challenges, gaining visibility into bottlenecks, defining success, understanding the iterative nature of data science, and adopting consistent feedback loops can all improve effectiveness. By focusing on continuous improvement and agility, organizations can improve their approach for the long term.


Organizations everywhere trust Insight to support data science projects with a proven-practices approach and a deep bench of experts who strategically deliver on your unique business outcomes.

Contact our team to see how we can work together.