You’ve probably heard it said that data is the ‘new oil’ of our economy, a form of currency every bit as important to the modern enterprise as money itself. Why? Because properly managed, data:
- Improves operational efficiencies (cuts costs)
- Generates better results (increases revenues)
- Improves morale (fewer headaches equal happier, more productive employees)
All good, right? But all of this rests on one important premise: that your data is of the highest quality. Otherwise, your data assets become corporate liabilities. Indeed, nowhere does the old maxim – garbage in, garbage out – ring truer than when it comes to data.
Simple example: Over the past year John and Jack Doe have donated to – and advocated on behalf of – a nonprofit. But because two internal departments treat these two names as two distinct individuals (rather than one man using two versions of the same first name), the organization:
- Doubles the costs of engaging him
- Guarantees itself a chance to annoy him
- Risks altogether losing his support
- Makes it likely he will criticize the brand across his social media networks
That’s the problem with bad data. It can escalate rather quickly. Now picture this mess in a Big Data world where such errors are easily compounded by orders of magnitude (more people, more channels, more devices, more social outlets, etc.). Gasoline, meet fire.
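The John/Jack duplicate above is exactly the kind of error a simple matching pass can catch. The sketch below is illustrative only – the nickname table, field names, and match key are assumptions, not features of any particular CRM – but it shows the idea: normalize known nicknames before comparing records.

```python
# Minimal duplicate-detection sketch. The NICKNAMES map and the record
# fields (first_name, last_name, email) are hypothetical examples.
NICKNAMES = {"jack": "john", "jon": "john", "bill": "william", "bob": "robert"}

def canonical_key(record):
    """Build a match key from a nickname-normalized first name, surname, and email."""
    first = record["first_name"].strip().lower()
    first = NICKNAMES.get(first, first)
    return (first, record["last_name"].strip().lower(), record["email"].strip().lower())

def find_duplicates(records):
    """Group records that share a canonical key; groups of 2+ are likely one person."""
    groups = {}
    for rec in records:
        groups.setdefault(canonical_key(rec), []).append(rec)
    return [g for g in groups.values() if len(g) > 1]

donors = [
    {"first_name": "John", "last_name": "Doe", "email": "jdoe@example.org"},
    {"first_name": "Jack", "last_name": "Doe", "email": "jdoe@example.org"},
    {"first_name": "Jane", "last_name": "Roe", "email": "jroe@example.org"},
]
dupes = find_duplicates(donors)  # John and Jack Doe fall into one group
```

A production system would use fuzzier matching (edit distance, address history, phonetic encoding), but even this crude pass would have flagged the two Does before the second solicitation went out.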
The 3 Faces of the Nonprofit Data Enterprise
According to Ingmar Slomic, a senior technical consultant specializing in enterprise data management, when it comes to data quality most nonprofit organizations fall into one of three categories: Reactive, Siloed, or Committed.
- Reactive – Convinced they’ve got their data quality issues under control, these organizations routinely find themselves in firefighting mode, moving from one critical issue to the next and spending 2x, 5x, or even 10x what they should. Without standardized processes, these data fixes are often administered ad hoc, with questionable analysis and testing. The lack of documented methodology and process leaves little room for improvement.
- Siloed – In these organizations IT prioritizes data quality and has built out the proper policies and methodologies to maintain it. But IT is fighting an uphill battle against users who don’t share the same values and priorities. “These organizations still see data quality as an IT problem – that IT is going to fix all the data problems,” says Slomic. “That’s a huge mistake that will only get worse over time.”
- Committed – Across these organizations everyone prioritizes data quality because they understand its value. From ingestion and storage to analysis and reporting, every process, methodology, campaign, and strategy is built with data quality in mind. Errors or inconsistencies aren’t merely repaired but traced back to their cause. Every day employees who touch the data make hundreds of subtle decisions that impact – and improve – data quality, giving C-level executives confidence in their own data-driven decisions.
Assuming you haven’t reached the zenith of this hierarchy, what steps can be taken to improve your data quality?
An Executive Decision
The commitment to data integrity must start at the top, says Jonathan Sotsky, director of strategy and assessment at Knight Foundation. “I can say with 100 percent certainty that any effort to meaningfully use data to drive strategy cannot succeed without leadership buy-in.”
This means the C-suite must recognize – not just in words but in actions – that in the same way a dollar is a dollar anywhere within an organization, so too must data be treated as an invaluable asset across the entire enterprise, not just in IT or in the offices of the CMO or CDO.
The challenge arises in today’s environment, where would-be donors, volunteers, constituents, etc., can engage organizations across any number of channels and touch points. If data in Development is treated differently than it is in Marketing or Services, you’re going to end up with a problem – especially as data increasingly becomes the foundation on which the modern organization is built.
So assuming your C-suite is ready to issue an edict that all enterprise data shall be treated equal, what are some specific steps that can and should be taken to implement this change?
What Do You Need?
Data is a tool only to the extent that the organization agrees on its intended purposes. Establish your entire organization’s informational needs and data priorities. Overlook nothing, because it is far harder to retrofit a vital data element into a massive database after it has been built. If something as simple as a salutation or middle name may be needed downstream, plan for it now or pay for it later.
Once your data priorities are set, ensure that every department, employee, vendor, etc., is apprised of and in sync with those plans.
Start your data quality program at the point of ingestion by identifying every conceivable input channel – online forms, data entry, social media, caging vendors, fulfillment partners, etc. Then ensure that these channels are in tune with your data priorities. Be sure to establish internal policies for acceptable data integrity and the rules and business practices necessary to meet those policies. Again, when data becomes a centerpiece of every internal policy-making conversation, integrity will no longer be an issue.
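Enforcing those policies at the point of ingestion can be as simple as a validation gate that every inbound record passes through. The rules below – required fields and a basic email check – are examples of policies an organization might adopt, not a standard; the function names are illustrative.

```python
# Hedged sketch: applying agreed-upon data rules at ingestion time.
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
REQUIRED = ("first_name", "last_name", "email")

def validate(record):
    """Return a list of rule violations; an empty list means the record passes."""
    errors = []
    for field in REQUIRED:
        if not record.get(field, "").strip():
            errors.append(f"missing {field}")
    email = record.get("email", "")
    if email and not EMAIL_RE.match(email):
        errors.append("malformed email")
    return errors

def ingest(records):
    """Split incoming records into accepted rows and an exception queue for review."""
    accepted, exceptions = [], []
    for rec in records:
        (accepted if not validate(rec) else exceptions).append(rec)
    return accepted, exceptions
```

The point is that rejected records land in an exception queue someone actually reviews, rather than silently polluting the database – the same gate applied uniformly to every channel, from web forms to vendor files.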
Keep it Synched and Clean
IT remains central to any data integrity effort, of course: ensuring that databases are properly configured, maintained, and updated; that exception reporting and notification are properly managed; and that any third-party data assets are appropriately audited before ingestion.
But too often, says Gary Carr, CEO of Third Sector Labs, the nonprofit industry – and even some of its IT owners – “pictures data management as a daisy-chain of updating datasets from one repository to another. What they should do is build data repositories where all data is synched, cleaned, managed, and from there fed into the various systems that depend on it.”
One of the modern era’s great gifts is the ability to use analytical tools to create an iterative feedback loop that allows organizations to gradually work toward the business intelligence they most need.
So as you run reports, determine whether the business rules baked into your early data requirements and priorities are actually delivering the intelligence you need. If not, continue to update and refine them until you are getting what you need. Start with pilot programs and small datasets, test, repeat as needed until you’re satisfied, then expand to your larger data universe.
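The pilot-then-expand loop above can start with something as modest as a completeness score run against a small sample. The metric and field names below are assumptions for illustration; the point is to measure a pilot dataset before scaling any rule up.

```python
# Illustrative quality metric for a pilot dataset: what fraction of
# records actually carry a value for each field you said you needed?
def completeness(records, fields):
    """Return, per field, the fraction of records with a non-empty value."""
    scores = {}
    for field in fields:
        filled = sum(1 for r in records if str(r.get(field, "")).strip())
        scores[field] = filled / len(records) if records else 0.0
    return scores

pilot = [
    {"first_name": "John", "last_name": "Doe", "middle_name": ""},
    {"first_name": "Jane", "last_name": "Roe", "middle_name": "Q"},
]
scores = completeness(pilot, ["first_name", "last_name", "middle_name"])
# middle_name scores 0.5 here – a flag to revisit the intake forms before expanding
```

If a field you flagged as vital scores poorly on the pilot, that is the feedback loop telling you to refine the intake rules before the problem multiplies across your larger data universe.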
Junk the Excuses
An integral part of any data integrity program is tossing out aged, bad, or irrelevant data. While you’re at it, be sure to junk the excuses your organization may have been using to avoid the whole data integrity conversation. In this day and age, the idea that an organization lacks the time, resources, or talent to do data right simply cannot be accepted. As Jamey Heinze, CMO at CDS Global, recently wrote in ThirdSectorToday, it’s certainly not easy capturing and cataloging all of the right data in the right way, let alone mining it for the business intelligence you need. “But you need to try.”
In future issues we’ll cover other aspects of data as it pertains to our nonprofit community. In the meantime, if you’re concerned about your own data quality issues, please contact us.