
Understanding the DORA Metrics Exceptions

DORA (DevOps Research and Assessment) metrics have become widely adopted for measuring and improving software development and delivery performance. Although these metrics—deployment frequency, lead time for changes, change failure rate, and mean time to recovery (MTTR)—offer valuable insights for most development teams, there are specific scenarios where tracking DORA metrics may be ineffective or only partially applicable for an organization.
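For concreteness, here is one way the four metrics might be computed from raw delivery data. This is a minimal sketch; the record shapes and sample values below are hypothetical, not a prescribed data model:

```python
from datetime import datetime, timedelta

# Hypothetical deployment records: (commit_time, deploy_time, caused_failure)
deployments = [
    (datetime(2024, 5, 1, 9), datetime(2024, 5, 1, 15), False),
    (datetime(2024, 5, 2, 10), datetime(2024, 5, 3, 11), True),
    (datetime(2024, 5, 4, 8), datetime(2024, 5, 4, 12), False),
    (datetime(2024, 5, 5, 9), datetime(2024, 5, 6, 9), False),
]
# Hypothetical incident records: (started, resolved)
incidents = [(datetime(2024, 5, 3, 12), datetime(2024, 5, 3, 14))]

days_observed = 7

# Deployment frequency: deployments per day over the observation window
deploy_frequency = len(deployments) / days_observed

# Lead time for changes: mean commit-to-deploy duration
lead_times = [deploy - commit for commit, deploy, _ in deployments]
mean_lead_time = sum(lead_times, timedelta()) / len(lead_times)

# Change failure rate: share of deployments that caused a failure
change_failure_rate = sum(failed for *_, failed in deployments) / len(deployments)

# MTTR: mean time from incident start to resolution
mttr = sum((end - start for start, end in incidents), timedelta()) / len(incidents)

print(deploy_frequency, mean_lead_time, change_failure_rate, mttr)
```

The exceptions discussed below are cases where the inputs to calculations like these—deploy timestamps, failure attribution—are distorted by factors outside the team's control.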



In this blog, we’ll explore these exceptions and discuss alternative approaches to measurement in each case.


Mobile Development


Mobile development involves additional mechanisms and release flows that reduce the effectiveness of tracking DORA metrics.


Mobile releases and updates are subject to review by the application stores, so the deployment process is not entirely in the development team's hands. The delays these reviews introduce make lead time for changes and deployment frequency unstable and heavily skew both metrics.



Another aspect of mobile development is that deployments are often rolled out gradually, and in most cases it is up to users to choose when to update the application. This, again, distorts the same two metrics: deployment frequency and lead time for changes.


SUGGESTION


Focus on crash-free user rates, user retention, and app store ratings to gauge app performance and user satisfaction alongside the DORA metrics.
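As an illustration, a crash-free user rate can be derived from crash-reporting aggregates. The figures below are hypothetical sample values, not real data:

```python
# Hypothetical crash-report aggregates for one release window
total_active_users = 12_000
users_with_at_least_one_crash = 180

# Crash-free users: share of active users who never hit a crash
crash_free_rate = 1 - users_with_at_least_one_crash / total_active_users
print(f"crash-free users: {crash_free_rate:.2%}")
```

Unlike deployment frequency, this signal is unaffected by store review delays or users deferring updates, which is why it works well as a mobile-side complement to DORA.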



Hardware-Dependent Software


When software depends on specific hardware, such as embedded systems or IoT devices, releases and updates are tied to that hardware as well.


In such cases, software updates depend on hardware updates or on coordination with the hardware vendor, which again moves part of the process outside your team's control and affects deployment frequency.


The need to cover all supported hardware configurations adds another requirement to every release: teams must run extensive testing across many configurations. This comprehensive testing inevitably inflates lead time for changes and makes the metric less meaningful to track.


SUGGESTION


Measure hardware compatibility success rates and the time taken to validate updates across different hardware setups while tracking the related DORA metrics.
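A compatibility success rate and total validation time can be rolled up from per-configuration test results. The configuration names and durations below are hypothetical:

```python
from datetime import timedelta

# Hypothetical validation results per hardware configuration:
# config name -> (passed, time spent validating)
validation_runs = {
    "board-rev-a": (True, timedelta(hours=3)),
    "board-rev-b": (True, timedelta(hours=5)),
    "board-rev-c": (False, timedelta(hours=8)),
    "iot-gateway": (True, timedelta(hours=2)),
}

# Share of configurations that passed validation
passed = sum(ok for ok, _ in validation_runs.values())
compat_success_rate = passed / len(validation_runs)

# Total time spent validating this release across all configurations
total_validation_time = sum((t for _, t in validation_runs.values()), timedelta())
print(compat_success_rate, total_validation_time)
```

Tracking validation time per configuration also shows which hardware targets dominate the release cycle, explaining why lead time for changes looks inflated.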



Regulated Industries


Industries such as healthcare and finance, which operate under strict regulatory requirements, can also face challenges with DORA metrics.


Mandatory compliance checks, audit reviews, and documentation requirements introduce delays outside the development team's reach. The resulting back-and-forth and strict policies have a negative impact on lead time for changes.


Additionally, these controls introduce approval processes and detailed testing and validation steps for each release, which directly impacts deployment frequency.


SUGGESTION


Track compliance audit success rates, regulatory approval times, and the number of deployments completed without compliance issues alongside the related DORA metrics.



Legacy Systems


Legacy systems create unique challenges for the teams maintaining them because of their tangled dependencies and greater vulnerability to failures.


Legacy systems often carry complex webs of dependencies, creating a heightened risk of breaking existing functionality. That risk calls for additional checks and safeguards, which makes lead time for changes and deployment frequency harder to track meaningfully.


Because legacy systems are more prone to failures due to outdated technology stacks and accumulated technical debt, teams maintaining them tend to see a high change failure rate.


SUGGESTION


Focus on measuring technical debt reduction, system uptime, and the number of successfully refactored components while tracking the related DORA metrics.



Small Teams or Startups


Small teams or startups may find DORA metrics less relevant due to their scale and focus. However, adopting development and deployment visibility early can greatly improve a team's ability to make an impact in the market.


Limited resources may push small teams or startups to skip monitoring development and deployment metrics altogether. However, once they find a cost-effective, capable solution for tracking these metrics, such teams can quickly maximize their ability to ship new features.


The main focus for such teams should be prioritizing rapid iteration and eliminating manual metric tracking. Because manually tracking metrics like DORA demands constant effort, teams often abandon these valuable metrics over time.


SUGGESTION


Find a solution that continuously provides monitoring and actionable insights to maximize development and deployment capabilities, and measure product-market fit indicators, user feedback, and the speed of delivering key features and fixes.



Experimental or R&D Projects


Experimental or R&D projects have different dynamics and priorities: their nature is innovation and discovery, so tracking delivery metrics may not be a priority.


While working to innovate and experiment, teams take their time with brainstorming and evaluation, so the focus is not on fast recovery from failures or regular deployments. Experimenting and prototyping also create unpredictable timelines, which result in unpredictable lead times and deployment frequency.


SUGGESTION


Track innovation metrics such as the number of experiments conducted, prototype iterations, and successful proof-of-concept validations while tracking the related DORA metrics.



Conclusion


While DORA metrics provide valuable insights for many development teams, they are not universally applicable. Understanding the specific context and challenges of your development environment is crucial in determining the most effective metrics to track. By recognizing the limitations of DORA metrics and adopting alternative measurement approaches, teams can ensure they are focusing on the right metrics to drive improvement and success.


In all these scenarios, there are still benefits to be gained from tracking DORA metrics and combining them with observed development behaviors. At Valven, we are fully dedicated to improving team performance and aligning it with the deployment process. Our product Valven Atlas, an AI-powered Engineering Assistant, is tailored to diverse development contexts, helping visionary organizations measure and improve their performance effectively.



Explore Valven Atlas to find out how we can support your organization to start your journey toward excellence.


Reach out to us for more information regarding DORA metrics and how your organization can utilize these valuable insights accurately.

