
Data Quality Management in the Cloud: Too Important to Ignore


Managing corporate data in a way that lets knowledge drive business success has always been a challenge. Managing it in a cloud environment has compounded that challenge enormously. Heightened awareness is critical.

Even before we had computers, we had the need to track business data in ways that enabled enterprises to understand such basic information as inventory, cash outlays, and income; to make better decisions on what products and services to add in the future; and to support predictions of market evolution more effectively than "seat-of-the-pants educated guesswork." Big Data in a cloud environment has created knowledge possibilities that were simply on wish lists a decade ago and promises to continue to provide substantial rewards to organizations that can make fullest use of analyzing everything there is to know about their customers and markets.

But therein lies the rub. The sheer amount of data potentially available for analysis has grown exponentially as cloud storage technology and advances in gathering and parsing available data have improved. It's reaching the point that if your organization is a megacorporation with lots of financial and staff resources, you can access what you need to stay competitive. If not, well, your business venture may not survive competition against those who do have such access. This is the rather stark reality that's starting to come into focus as businesses compete for a pool of such talent as data analysts, data architects, and data scientists—which isn't growing fast enough to meet demand.

Traditional vs. Cloud Data Management

To illustrate, let's start with the monumental changes inherent in the evolution from using traditional on-premises data management to managing data that is stored in the cloud. Traditional data management practices developed in environments that were exclusively local or on-premises. Databases stored on in-house equipment used software stored on local area networks attached to central servers to give organization analysts and managers answers to basic questions about who the customers were and how the organization was serving them. Analysis tools, network connections, data integration algorithms, programming assistance, and hardware maintenance workers were under the full control of each company's own executives, who could make fixing problems and formulating new ways of looking at data a top priority for their own employees.

These aspects of data have always been important, but now that the cloud has become such a dominant medium in business data analysis, the ability to draw better information from larger amounts of data, faster and more accurately than the competition, has become a turning point. The opportunities of scale are evolving in favor of those with access to a larger pool of better data. This shift makes data quality more important than ever, even as data quantity itself has become a kind of quality. In short, cloud computing has morphed into an "information habitat" in which data has become analogous to food. Whoever gets the most access to the best of it is most likely to survive. Those left behind are increasingly likely to remain behind until they are themselves consumed by larger and more efficient competitors. It may even be reaching the point where public-cloud users are gaining an advantage over private-cloud users, because the latter live in a data universe that's inherently limited by the enterprise's own ability to gather and store relevant data. Add to that the advantages of using a large service provider that can offer better APIs, data-integration tools, reporting apps, and data-checking algorithms than a smaller enterprise may be able to deploy, and the playing field starts to look even more sloped.

This situation puts an unprecedented new emphasis on Data Quality Management (DQM), particularly in cloud environments. DQM can be defined as the art of bringing together a combination of the best technology, processes, and people skills to ensure the accuracy and utility of data to meet an organization's goals. Making the best use of the best-quality data is looking like the best way smaller enterprises can overcome the scale advantages of competitors with access to more data, more places to store it, and more ways to slice and dice it.

Data Quality Pitfalls

What's the most important factor in maintaining DQM? Sadly, this is a critical question to which there is no clear answer, despite scads of recommendations that vary significantly depending on the authors' point of view (e.g., marketing or technical) or the types of career-saving tools for data managers that their companies happen to be offering. Too many suggestions are horribly vague: "promote good data governance," "collaborate with data providers," "validate and cleanse your data," and "make sure your data is secure."

Well…duh! Maybe what's needed is to take a step back and take a broader view, even if that simply highlights many common pitfalls to achieving real data quality.

Data accuracy seems like a good place to start. Back in the ’70s, when all programming was being done with chisels on stone tablets, "garbage in, garbage out" was a favored explanation for how coding mistakes led to useless results. The joke was old 40 years ago, but the adage still holds true for data in the age of cloud. Today, data can't just be accurate; it must be in a consistent format, it must be refreshed frequently, and it must be "complete," whatever that may mean in your business context.
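
To make those requirements concrete, here's a minimal sketch of what automated completeness and freshness checks might look like, assuming records arrive as Python dictionaries. The field names and the 30-day freshness threshold are purely illustrative; your own business context defines what "complete" and "fresh" actually mean.

```python
from datetime import datetime, timedelta, timezone

# One illustrative record; the field names here are hypothetical.
record = {
    "customer_id": "C-1042",
    "zip": "60601",
    "last_updated": "2024-01-15T09:30:00+00:00",
}

REQUIRED_FIELDS = {"customer_id", "zip", "last_updated"}
MAX_AGE = timedelta(days=30)  # freshness threshold; tune to your context

def quality_issues(rec):
    """Return a list of basic completeness and freshness problems in one record."""
    issues = []

    # Completeness: every required field must be present and non-empty.
    for field in sorted(REQUIRED_FIELDS):
        if not rec.get(field):
            issues.append("missing or empty field: " + field)

    # Freshness: flag records older than the agreed threshold.
    ts = rec.get("last_updated")
    if ts:
        age = datetime.now(timezone.utc) - datetime.fromisoformat(ts)
        if age > MAX_AGE:
            issues.append("stale record: %d days old" % age.days)

    return issues

print(quality_issues(record))
```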

And quality issues don't stop there. If any of the data happens to have been fat-fingered in by a human at a keyboard, you'll need some kind of algorithm to watch for typos, fields that expect a five- or nine-digit ZIP code but receive six digits, fields whose entries are skipped altogether, and duplicate records whose contents are off by a single character and therefore don't show up as exact duplicates, just to name a few common instances.
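
A couple of those checks are straightforward to automate. The sketch below, again purely illustrative, validates ZIP code format with a regular expression and uses Python's standard difflib module to surface near-duplicate names; the 0.9 similarity threshold is an assumption you'd tune against your own data.

```python
import re
from difflib import SequenceMatcher

ZIP_RE = re.compile(r"^\d{5}(-\d{4})?$")  # five digits, optional four-digit extension

def zip_ok(value):
    """Flag entries that aren't a valid five- or nine-digit ZIP code."""
    return bool(ZIP_RE.match(value))

def near_duplicates(names, threshold=0.9):
    """Pair up records whose names are almost, but not exactly, identical."""
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            ratio = SequenceMatcher(None, a.lower(), b.lower()).ratio()
            if a != b and ratio >= threshold:
                pairs.append((a, b, round(ratio, 2)))
    return pairs

print(zip_ok("606011"))  # False: six digits slipped into a five-digit field
print(near_duplicates(["Acme Corp.", "Acme Corp", "Widgets Inc."]))
```

Neither check is sophisticated, but even this level of screening catches the single-character duplicates that exact-match logic misses.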

Then there's the matter of how, in some contexts, the data coming in (even from an automated source) isn't as intrinsically important as raising a flag when that data changes. There's also the problem of hidden data, which occurs because the tried-and-true data tool an accounts manager has always used ignores a field that today might reveal an opportunity, so that data point remains locked in a silo with no one the wiser.
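
What might such a flag look like in practice? One simple approach, sketched below under the assumption that a feed can be summarized as a list of numeric values, is to compare each incoming batch against a baseline and flag large swings in record count or average value. The 25 percent tolerance is arbitrary.

```python
from statistics import mean

def flag_changes(baseline, incoming, tolerance=0.25):
    """Raise flags when an incoming batch drifts noticeably from the baseline."""
    flags = []

    # Volume change: a feed that suddenly shrinks or balloons is suspect.
    ratio = len(incoming) / max(len(baseline), 1)
    if abs(ratio - 1.0) > tolerance:
        flags.append("record count changed by %.0f%%" % (abs(ratio - 1.0) * 100))

    # Distribution change: a shifted mean often signals an upstream change.
    if baseline and incoming:
        shift = abs(mean(incoming) - mean(baseline)) / (abs(mean(baseline)) or 1)
        if shift > tolerance:
            flags.append("mean value shifted by %.0f%%" % (shift * 100))

    return flags

print(flag_changes([100.0, 105.0, 98.0], [150.0, 160.0]))
```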

Data sources can present challenges. We'll assume your enterprise is correctly tracking its own customer transactions and has some workaround for data inconsistencies from those sources, but what other useful information is out there that might be germane? Market conditions? Data on competitors? News on supply chain problems for your providers? Changing consumer trends? Is there even someone in charge of looking for this additional information at your enterprise, or is it just vaguely left up to some executive committee meeting quarterly?

How does your enterprise (or cloud provider) handle your data? Could there be any data volume or distribution issues that could hamper accurate data analysis? Have there been any schema changes that might affect your company's results? Do some data sources report later than others so that some data isn't included in a particular analysis? Has a problem at some server farm dictated a failover? Do your enterprise's needs still match what's specifically in your service contract?
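
Some of those questions can be checked mechanically. Here's a hedged sketch of schema-drift detection: it compares each incoming row against an expected schema and reports columns that appear, disappear, or change type. The column names and types are hypothetical.

```python
# Expected schema on record; both the schema and the sample feed are hypothetical.
EXPECTED_SCHEMA = {"order_id": str, "amount": float, "region": str}

def schema_drift(batch):
    """Report fields that appeared, disappeared, or changed type in a batch."""
    problems = set()
    for row in batch:
        missing = EXPECTED_SCHEMA.keys() - row.keys()
        extra = row.keys() - EXPECTED_SCHEMA.keys()
        problems.update("missing column: " + c for c in missing)
        problems.update("unexpected column: " + c for c in extra)
        for col, expected_type in EXPECTED_SCHEMA.items():
            if col in row and not isinstance(row[col], expected_type):
                problems.add("type change in column: " + col)
    return problems

# One row where "region" was renamed and "amount" arrives as a string.
print(schema_drift([{"order_id": "A1", "amount": "19.99", "territory": "EU"}]))
```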

Are Your Data Tools Up to Date?

Another potential weak spot is how well your data analysis algorithms are working. Unless you've upgraded them in the past year, they're probably not as up to date as they could be. If you're reliant on in-house or provider-sourced tools, you could be missing out on something useful. There are many companies (too numerous to mention here) that are constantly coming out with new tools for slicing and dicing data, making consistency checks of existing data, and looking for fields left blank (null values) or other errors.

Faulty data models can be another source of problems. Two common trouble spots are referential integrity and relationship cardinality. The former refers to information that's stored or displayed in different places but is missing correlation. For example, a hospital patient might have been recorded as having received a procedure with a numeric code, but that code doesn't match the name of any corresponding procedure. The latter happens when two data objects have more than one relationship in a certain context. For example, a patient may live at a certain address, but the same address may be valid for another patient who, let's say, lives in the same apartment building. Other data-model problems include missing validation constraints, which check that values stored in certain database fields are standardized and properly formatted, and incorrect formulas for producing values calculated from other fields in a database.
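
The referential-integrity case maps directly to code. Below is a minimal sketch of the hospital example: each patient record's procedure code is checked against a lookup table, and codes that match no known procedure are reported. The table contents and record shapes are invented for illustration.

```python
# Hypothetical lookup table of valid procedure codes and their names.
PROCEDURE_CODES = {"0017": "Appendectomy", "0042": "MRI Scan"}

patient_records = [
    {"patient": "P-001", "procedure_code": "0042"},
    {"patient": "P-002", "procedure_code": "0099"},  # matches no known procedure
]

def referential_violations(records):
    """Find procedure codes recorded on patients that match no known procedure."""
    return [
        "%s: unknown procedure code %s" % (r["patient"], r["procedure_code"])
        for r in records
        if r["procedure_code"] not in PROCEDURE_CODES
    ]

print(referential_violations(patient_records))
```

The same pattern, run in the other direction, catches validation-constraint gaps: define once what a legal value looks like, then test every stored value against it.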

Remedies for Data Nutrition

Hopefully it's clear by now that fixing DQM problems means designating a person or department to check for these problems and to ask the questions above constantly. It's important that this be a primary responsibility for those assigned to keep an eye on it, not something delegated to some IT programmer with a dozen other priorities and projects on her plate. It's also important to create a culture in which data accuracy and quality are treated as being of paramount importance. Are employees being sufficiently trained to watch out for errors? Do they understand how data is structured and how to properly query information that is important to the enterprise mission?

Organizations should have a person or department that's specifically empowered to address all the issues raised so far, as well as many others there hasn't been room here to discuss, such as data governance rules, staying up to date on specific government laws and regulations that affect data, and riding herd on a cloud service provider, should you have one. Ideally, it will be a data architecture team with enough imagination to think about all these contingencies and enough clout to identify data challenges and bring them to the attention of those who can do something about them without simply having to accept the explanation that "we can't afford it right now."

Take DQM Seriously

DQM has become too important to let its stewardship slide. If your organization doesn't have someone—or better, a group of people—paying attention to DQM concerns, your enterprise may no longer be running fast enough to avoid becoming prey on the data savannah. Waiting to worry about it after the buyout will be too late.

John Ghrist

John Ghrist has been a journalist, programmer, and systems manager in the computer industry since 1982. He has covered the market for IBM i servers and their predecessor platforms for more than a quarter century and has attended more than 25 COMMON conferences. A former editor-in-chief with Defense Computing and a senior editor with SystemiNEWS, John has written and edited hundreds of articles and blogs for more than a dozen print and electronic publications.
