Data Governance: Tackling Complexity with Baby Steps


Steve Gens – Founder, Gens & Associates; Preeya Beczek – Director, Beczek.COM

Everything in life sciences – from the latest measures around safety and regulatory rigor to renewed focus on agility, efficiency, and streamlined paths to market – relies heavily on companies’ effective organization and handling of data. Unless that data can be fundamentally trusted for its accuracy and consistency right across the organization, there will always be a sense that its authenticity must be checked before further actions can be taken.

It’s in this context that the concept and discipline of data quality governance come to the fore. The more critical data becomes to regulatory procedures, safety processes, clinical research, and R&D/manufacturing – and ultimately to connecting all of those parts of the life sciences end-to-end lifecycle value chain more seamlessly – the greater the need for formal strategies and provisions around the governance of that data’s quality (that is, its integrity, reliability, and trusted status) across all internal and external touchpoints.

Dependency on Data

In Gens & Associates’ most recent World-Class RIM survey of 76 companies, conducted in 2022, top-performing life sciences companies expected to have most of their systems connected and sharing data within two to three years – with electronic trial master file (eTMF), quality management system (QMS), master data management (MDM), and enterprise resource planning (ERP) systems being the highest priorities for investment.

Unless high trust in the data can be assumed, the risks to marketing authorization/licensing, to patient safety, to the company’s reputation, and to its financial performance can become intolerably high as companies’ dependency on the flow of good data broadens. If poor data forces teams back into manual data entry, quality checks, and numerous verification steps, it will undermine the ROI of digital process transformation (Gens & Associates refer to this as ‘technical debt’).

As companies across the pharma/biotech/medical devices spectrum strive to understand what all of this means for them practically, there is a temptation to create a major initiative supported by a large consulting budget, due to a lack of confidence in getting it right. But it’s a misconception that a mammoth engagement is required before companies can get started on the right track. On the contrary, it’s far more important that work starts now, to move things in the right direction.

Myth 1: This will inevitably be an overwhelming program.

The first barrier companies come up against is knowing where to start, when data quality governance by its very nature needs to be an enterprise-wide endeavor. There’s more than one way to skin a cat, though, and it’s fine for good habits to accrue in one corner of the business (e.g. regulatory, clinical, or manufacturing) before being extrapolated more broadly as teams and leaders learn what works.

All positive change has to start somewhere, so decide whether a top-down or a function-by-function (with consistent practices) approach will produce the quickest wins, and the greatest overall progress.

Myth 2: Complexity and high cost are unavoidable.

The ‘data-driven’ agenda might feel fresh and new in life sciences, but digital process transformation is well advanced in other industries, and the management of quality is firmly established across all sectors. This means that solid frameworks already exist and have been adapted appropriately for data quality governance in a life sciences ‘Regulatory+’ context.

Much of what’s needed has to do with nurturing the right culture, assembling the right teams or assigning key roles, communicating successes, and being on the same page as a company about the goals of this whole undertaking.

Myth 3: You’re doing this largely because you have to.

This links back to the last point. Compliance with IDMP/SPOR and other regulatory mandates might seem the most obvious driver for getting the company’s product and manufacturing process-related data in order. Yet there are many higher purposes for making data-related investments, ranging from more tightly run business operations to a safer and more convenient experience for patients as consumers of existing and new products. The tighter the controls around data quality, the more companies can do with their trusted data – use cases that could extend right out into the real world (such as faster access to real-time updates to patient advice).

Another way to look at the importance of data quality governance is to consider the risk to the business where it is compromised. Our findings suggest that these risks include:

  • The inability to realize automation/integration investments;
  • Limited reporting and dashboard effectiveness and efficiency;
  • Continued manual compliance that increases the company’s ‘technical debt’ (reliance on short-term, manual workarounds);
  • A lack of confidence in realizing global system/process investments; and
  • A major missed cultural opportunity to build a mentality of good data citizenship – a sense that everyone is in this together, so that from this point on data is captured correctly and well maintained from day one.

Myth 4: This is an IT/data management concern first and foremost.

Time and again, the evidence confirms that the key success factors for a data quality program have little to do with technology, and everything to do with culture, organization, and mindset. Specific contributors to progress, based on active programs today, most notably include:

  • A shared data quality vision – so that good data-related practice becomes second nature;
  • ‘Actionable governance’ in the form of a data quality office and an assigned data quality officer, whose remit is to oversee efforts to clean up and maintain good data;
  • End-to-end orientation: in other words, a perspective on data quality that extends beyond a single function to the full spectrum of data use cases and key functions;
  • Senior leaders who advocate for a culture of data quality built into rewards systems, and executives who drive a ‘right first time’ mindset around data as it is captured and first entered into a system; and
  • Formal continuous improvement – that is, continued rigor in raising the quality of data and making this consistent across the company over time.

Myth 5: You’re already vigilant about data quality, so you don’t need a formal program.

Not so. Here’s a quick checklist that will highlight whether a company has challenges with its data quality (challenges that will only deepen as data becomes increasingly fundamental to critical everyday processes):

  • Data quality is not viewed as an organizational competency or linked to the organizational culture;
  • There is no clear data quality vision, policy, or strategy;
  • There is no overarching data quality operating model;
  • There is a lack of transparency around the current/evolving data quality status and any trending/reporting;
  • Data quality is not built into rewards or recognition systems; and
  • Data connectivity is being prioritized ahead of the organizational support required to properly manage and leverage the value of connected data.

Defining Data Quality

Clearly, it’s important to establish what the company understands ‘good data quality’ to mean, and to communicate this so that everyone is aiming at the same benchmark. Building data quality checks, and reporting on them, is the best way to mold and bed in agreed expectations.

Based on observed leading practice to date, these checks are likely to include:

  • Accuracy – e.g. the expected renewal date for each country is correct;
  • Adherence to standards;
  • Timeliness;
  • Accessibility – e.g. access is allowed and enabled for any authorized person, process, or connected system;
  • Ownership – e.g. there is a clear data owner for the authoritative source of truth; and
  • Transparency of results – the ability to recognize the “good” and what’s “not there yet”.
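
To make this concrete, here is a minimal sketch of how a few such checks and a simple summary report might be expressed in Python. It is purely illustrative: the record fields (country, renewal_date, data_owner, last_updated), the sample data, and the one-year timeliness threshold are assumptions made for the example, not recommendations drawn from the survey findings.

"""A minimal, illustrative sketch of rule-based data quality checks and a summary report.
The record fields and thresholds are hypothetical, not a prescribed standard."""
from datetime import date, datetime, timedelta

# Hypothetical registration records, e.g. exported from a RIM system.
RECORDS = [
    {"country": "DE", "renewal_date": "2026-03-01", "data_owner": "RA-EU",
     "last_updated": "2025-01-10"},
    {"country": "JP", "renewal_date": "", "data_owner": "",
     "last_updated": "2023-06-30"},
]

def check_accuracy(rec):
    """Accuracy: the expected renewal date parses and lies in the future."""
    try:
        return datetime.strptime(rec["renewal_date"], "%Y-%m-%d").date() > date.today()
    except ValueError:
        return False

def check_ownership(rec):
    """Ownership: a clear data owner is assigned to the record."""
    return bool(rec["data_owner"].strip())

def check_timeliness(rec, max_age_days=365):
    """Timeliness: the record was updated within the chosen window (assumed one year)."""
    updated = datetime.strptime(rec["last_updated"], "%Y-%m-%d").date()
    return date.today() - updated <= timedelta(days=max_age_days)

CHECKS = {
    "accuracy": check_accuracy,
    "ownership": check_ownership,
    "timeliness": check_timeliness,
}

def quality_report(records):
    """Count passing records per dimension so results can be trended over time."""
    totals = {name: 0 for name in CHECKS}
    for rec in records:
        for name, check in CHECKS.items():
            if check(rec):
                totals[name] += 1
    return {name: f"{passed}/{len(records)} records pass"
            for name, passed in totals.items()}

if __name__ == "__main__":
    for dimension, result in quality_report(RECORDS).items():
        print(f"{dimension}: {result}")

In practice, checks of this kind would run against the authoritative source of truth (for example, a RIM or MDM system) and feed the transparency and trending/reporting described above.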

There is no time to delay, and it is entirely possible to start today by taking a bite-sized approach to data quality governance.

Author Biographies

Steve Gens is the managing partner of Gens & Associates, a global life sciences advisory and benchmarking firm specializing in strategic planning, RIM program development, industry benchmarking, and organizational performance. [email protected] http://gens-associates.com/

Preeya Beczek, Director of Beczek.COM, is an independent regulatory affairs expert, providing teams with valuable insights, advice, and strategies for operational excellence by optimizing processes, systems, roles, and operating models. https://www.linkedin.com/in/preeyabeczek/
