
Trends in Data Warehouse, Business Intelligence, and Analytics


Are we being led into an analytics arms race by software vendors?

 

The following statistics were derived from a survey of 600 business analysts, technologists, data analytics professionals, managers, and C-level professionals conducted by Lavastorm last March:

  • 75% of analysis in today's organizations is still being conducted using MS Excel or other spreadsheet applications.
  • 81.8% of the surveyed group was conducting data analysis as a normal part of the job.
  • 41.8% were combining information from a data warehouse with data derived from sources outside the data warehouse.
  • 26.9% predicted that they would need more evolved tools to handle unstructured data.
  • 25.2% felt that a dearth of analytics professionals was a significant problem.
  • 60% anticipated an increase in spending for analytics but expected to split those resources between buying tools and hiring personnel.
  • 30.4% predicted that they would increase spending on new data sources.

This survey raises some intriguing questions about what we are doing with analytics versus what we thought we were doing when our organizations purchased their suites of tools.

 

Gartner and others have been touting the growth of data warehouses, business intelligence (BI), and analytics for years, telling us that the future resided in implementing advanced tools, funding expensive resources, and thoroughly indoctrinating management into the value of analytics. And yet today, if the Lavastorm survey is truly representative of what's going on in the industry (and it certainly seems to be for most medium-sized organizations), most everybody (75%) is finding spreadsheet software "quite enough, thank you!" and only about a quarter (26.9%) think they might need more advanced tools to handle the current analytic boogeyman called "unstructured" data. Still, more than half (60%) said their organizations were going to pony up more money for analytics tools and personnel.

 

So the question that comes to mind is this: "Are we being led into an analytics arms race by large software application vendors when, in fact, the need for decision-support information can be readily handled by the corporate spreadsheet wizards with inexpensive spreadsheet and database tools?"

"Ev'rybody's Talkin' 'Bout"

Certainly, according to Lavastorm, nearly everybody (81.8%) is using analytical practices as a normal part of their job responsibilities. And, with apologies to John Lennon, "Ev'rybody's talkin' 'bout" analytics tools, but not everybody's buying them. Indeed, according to the Lavastorm survey, fewer than 10% are using the highfalutin self-service analytic tools that are pushed so hard by software companies and that get so much coverage in the press.

 

So let's look at what the marketing messages and the software industry's predictions are saying, the promises that are being proffered, and the potential that these predicted trends will actually deliver. Then, let's open up the topic to the forums to discuss whether there is any reality behind the hype. Your experiences, in your company, are, after all, the most important measure of the real trends in data warehousing, business intelligence, and analytics.

The Promise and the Dream

What might your management team learn if they could analytically investigate every avenue where there's data about your customers, your inventory, your sales force, your production process, your company's social profile, or other resources?

 

For instance, how could you improve inventory management by looking across retail sales, marketing efforts, Web traffic, and supply chain data? If only you could tap all those resources and bring them into an analytical framework in real time, couldn't your management team make better strategic decisions about the organization, instead of responding to events after the fact?
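
As a rough illustration of that kind of cross-source view, here is a minimal Python sketch. The SKUs, field names, volumes, and reorder rule are all hypothetical; they're only meant to show what joining retail sales, Web traffic, and supply chain data for an inventory decision might look like.

    # Hypothetical per-SKU data pulled from three different sources
    retail_sales = {"SKU-100": 420, "SKU-200": 75}      # units sold last week
    web_traffic  = {"SKU-100": 9800, "SKU-200": 650}    # product-page views
    lead_times   = {"SKU-100": 14, "SKU-200": 30}       # supplier days to restock
    on_hand      = {"SKU-100": 500, "SKU-200": 900}     # current inventory

    def restock_signal(sku):
        """Flag SKUs whose projected demand outruns inventory before a restock could arrive."""
        weekly_demand = retail_sales.get(sku, 0)
        demand_during_lead = weekly_demand * (lead_times.get(sku, 0) / 7)
        return demand_during_lead > on_hand.get(sku, 0)

    for sku in retail_sales:
        flag = "REORDER NOW" if restock_signal(sku) else "ok"
        print(f"{sku}: sold={retail_sales[sku]}, views={web_traffic[sku]}, "
              f"lead={lead_times[sku]}d -> {flag}")

The interesting part is not the arithmetic; it's that each dictionary comes from a different system, which is exactly the integration problem the rest of this article is about.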

 

That has been the dream of analysts since the earliest days of data warehouses, through the promising years offered by business intelligence, and on through today in the world of business analytics. "Bring it all together and analyze it in real time!" But have we actually achieved any of those lofty dreams of comprehensive real-time analytical understanding?

 

The quantity of information created by and about our organizations continues to multiply faster than our technological ability to store it, access it, or analyze it. Analytics software application providers promise they have the answer, and their latest predictions point to three highly publicized trends:

  • Cooperative-processing architectures
  • Converged analytics
  • Big Data

Cooperative-Processing Architectures

We all know that to build a traditional data warehouse you need a system that extracts, transforms, and loads (ETL) data from individual source datasets using rules that an analyst or data warehouse specialist has established. The ETL process populates those metadata fields with transaction content for BI tools to manipulate. This straight-line ETL methodology works well for individual datasets, but there's a problem when the sources of data are distributed or varied.
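
For readers who want the shape of that straight-line pattern in code, here is a minimal sketch. The field mapping, table name, and sample rows are invented, and a real shop would use an ETL tool rather than a hand-rolled script; the point is only the extract, transform, load sequence driven by analyst-defined rules.

    import sqlite3

    # Rules an analyst might define: source field name -> warehouse metadata name
    FIELD_MAP = {"cust_no": "customer_id", "ord_amt": "order_amount", "ord_dt": "order_date"}

    def extract():
        # Stand-in for reading rows from a source application or flat file
        return [{"cust_no": "C001", "ord_amt": 199.00, "ord_dt": "2014-03-02"},
                {"cust_no": "C002", "ord_amt": 54.50,  "ord_dt": "2014-03-03"}]

    def transform(rows):
        # Apply the mapping rules so rows carry the warehouse's metadata names
        return [{FIELD_MAP[k]: v for k, v in row.items()} for row in rows]

    def load(rows, conn):
        # Populate the warehouse table that BI tools will query
        conn.execute("CREATE TABLE IF NOT EXISTS sales_fact "
                     "(customer_id TEXT, order_amount REAL, order_date TEXT)")
        conn.executemany("INSERT INTO sales_fact VALUES "
                         "(:customer_id, :order_amount, :order_date)", rows)

    conn = sqlite3.connect(":memory:")
    load(transform(extract()), conn)
    print(conn.execute("SELECT * FROM sales_fact").fetchall())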

 

For instance, in a supply chain, data that comes down the pike must be translated into the metadata formats defined by the target data warehouse. Meanwhile, sales data and inventory data are arriving from other applications that may be in-house or may be from external sources.

 

Finally, as metadata is defined, the meaning of that data may be redefined for the purposes of the downstream analytics tool. This makes the pathway from the source data to the analytics layer increasingly complex and difficult to comprehend. For instance, if the ETL processes are not exact or the metadata definitions are not clear, the modification of a single data element derived from one source may have dire consequences for the analysis performed by other BI tools.
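
A small, contrived Python example of that failure mode (the field name and the unit change are invented for illustration): if one feed quietly redefines a data element, a downstream calculation that still assumes the old definition is silently wrong.

    def monthly_revenue_report(rows):
        # Downstream BI logic, written when order_amount meant dollars
        return sum(r["order_amount"] for r in rows)

    rows_before = [{"order_amount": 199.00}, {"order_amount": 54.50}]
    rows_after  = [{"order_amount": 19900}, {"order_amount": 5450}]   # source now sends cents

    print(monthly_revenue_report(rows_before))  # 253.5  -- what management expects
    print(monthly_revenue_report(rows_after))   # 25350  -- same report, 100x off, no error raised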

 

These issues have become increasingly apparent to analysts, especially in organizations that are using the built-in data warehousing tools that packaged solution providers are selling.

 

To compensate and make use of these tools, data warehousing professionals are now focusing on cooperative-processing architectures, instead of single-stroke, batch-input ETL architectures.

 

The advantage of a cooperative-processing data warehouse architecture is that the individual ETL processes closest to the source data can asynchronously build the appropriate metadata in intermediary steps. As processing power has improved and machine virtualization has advanced, analytics software providers are touting cooperative-processing data warehouse architectures that permit an organization to build a virtual data warehouse, closer to real time, from a broader variety of data sources.
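
To make the idea concrete, here is a minimal sketch of that asynchronous, build-near-the-source pattern. It is not any vendor's implementation; the source names, delays, and staging structure are hypothetical stand-ins for each source running its own small ETL step on its own schedule and a virtual view being assembled from whatever has been staged.

    import asyncio

    staged = {}  # intermediary datasets built near each source

    async def source_etl(name, rows, delay):
        await asyncio.sleep(delay)   # simulates each source running on its own schedule
        # Map/tag rows close to the source rather than in one central batch ETL run
        staged[name] = [dict(r, source=name) for r in rows]

    async def build_virtual_view():
        await asyncio.gather(
            source_etl("supply_chain", [{"sku": "SKU-100", "inbound_qty": 300}], 0.10),
            source_etl("sales",        [{"sku": "SKU-100", "units_sold": 420}], 0.20),
            source_etl("web",          [{"sku": "SKU-100", "page_views": 9800}], 0.05),
        )
        # The "virtual data warehouse": a view assembled from the staged pieces
        return [row for rows in staged.values() for row in rows]

    print(asyncio.run(build_virtual_view()))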

Converged Analytics

Gartner calls this advance a "logical data warehouse," and it sees this as the future of data warehousing, BI, and analytics. By using this virtualized data warehouse concept, companies are moving beyond simple data warehouses and are beginning to focus on something called "converged analytics."

 

But is the architecture of a converged analytics data warehouse really sustainable? If you have two or three or more virtual sources assembled for a data warehouse, all interacting with data derived from their own sources, are you not merely increasing the complexity of a system? Can it be asynchronously managed, upgraded, and maintained? Does this make the results better for management decisions, or does this architecture really just lead to a more fragile, rigid, and error-prone dataset?

 

The fact that Gartner and others see this as a positive trend in analytics is, in itself, a commentary about a perspective that sees "more as better" in making decision-support systems. Wouldn't it be more useful to have a system that delivers "better" instead of "more" information to our management?

 

Yet, as BI tools have proliferated in different parts of an organization, the logical outcome is silos of information specific to the area for which each data warehouse was designed.

 

This is why there is a desire for converged analytics: When one organization is a part of a larger supply chain or network of organizations, or when data is arriving from sources outside the organization, such as SaaS data stores, cloud services, or Big Data, the ability to pull all the threads together into a converged data warehouse can potentially generate rewarding insights into how the entire organization is achieving its goals.

 

But convergence has its own set of challenges. It relies upon a cooperative-processing architecture that is extremely time-sensitive in order to keep data points accurate. Keeping those datasets in sync requires tremendous control over the sources of information and pushes up against the limitations of IT's ability to keep systems on track. The more management strives for real-time computational analytics, the harder it becomes for IT to keep real-time systems functioning for the benefit of production itself.

 

Batch processing in a converged analytics architecture becomes essential in order to control those time slices. Yet coordinating those batch runs into a synchronous schedule that can pull together a converged dataset can lead to a byzantine operations workload. Moreover, the very nature of batch processing runs counter to the idea of a virtual, cooperative-processing environment. And architecturally, isn't it a win-lose/lose-win design, in which IT is grinding massive "data boulders" into "information dust" for the benefit of sprinkling a bit of "meaning" into a single spreadsheet element on a manager's dashboard? Software application vendors and analyst groups want us to believe this is the real trend. Why?
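
As a small sketch of that coordination problem (the job names, timestamps, and the 15-minute time slice are all invented), a converged extract has to hold until every upstream batch feed has landed inside the current window; one late feed stalls the whole converged view.

    from datetime import datetime, timedelta

    SLICE = timedelta(minutes=15)

    # Last successful load time reported by each upstream batch job
    last_load = {
        "supply_chain_etl": datetime(2014, 4, 5, 2, 5),
        "sales_etl":        datetime(2014, 4, 5, 2, 12),
        "web_traffic_etl":  datetime(2014, 4, 5, 1, 40),   # this feed missed the window
    }

    def converged_extract_ready(now):
        stale = [job for job, loaded in last_load.items() if now - loaded > SLICE]
        if stale:
            print("Hold the converged extract; stale feeds:", ", ".join(stale))
            return False
        return True

    converged_extract_ready(datetime(2014, 4, 5, 2, 15))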

 

Because once these elements have been established in the infrastructure's architecture, the next milestone for the analytics organization is the inclusion of Big Data.

Incorporating Big Data

Big Data is the term for a collection of datasets so large and complex that they become difficult to process using on-hand database management tools or traditional data processing applications. But it's more than just "big." Big Data has been likened to "standing in front of a fire hose" because it often includes "live" data streams arriving from sensors, social media, instrumentation, or other media.

 

Organizations in the past have been tempted to open their analytics architecture to Big Data because it can potentially provide broader, more robust, more instantaneous measurements of the processes that may be pertinent to an analytic investigation. But until recently, Big Data implementations have been highly selective about their resources and individually built by in-the-trenches programmers and systems administrators who were tasked with managing both the resource and its volatilities. That's because Big Data resources can be so inclusive that they often include data that is too detailed, too widely defined, or just plain too complex. Sometimes it's like opening the spigot to get a drip of information while the Big Data tool configurations are providing a flood of it. The practical results can be overwhelming. So, in a word, the maturity of Big Data initiatives could only be termed "experimental."
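
To put the spigot-versus-fire-hose point in code, here is a minimal sketch of reading a high-volume event stream and keeping only the narrow slice an analysis actually needs. The sensor feed, threshold, and volumes are made up for illustration; a real stream would arrive over a messaging or streaming platform rather than a Python generator.

    import random

    def event_stream(n=100_000):
        # Stand-in for a live feed (sensors, social media, clickstream)
        for _ in range(n):
            yield {"sensor": f"S{random.randrange(50):02d}",
                   "reading": random.gauss(100, 15)}

    THRESHOLD = 140   # only out-of-range readings matter to this hypothetical analysis

    relevant = [e for e in event_stream() if e["reading"] > THRESHOLD]
    print(f"kept {len(relevant)} of 100,000 events "
          f"({len(relevant) / 1000:.2f}% of the stream)")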

 

What's changed today are the investments of billions of dollars that IBM, Oracle, SAP, Amazon, Microsoft, and others have made in the technologies of Big Data, and the packaging of utilities that can tie together large datasets and stream them into analytics packages.

 

How organizations bring Big Data into their analytics architecture is still a very customized process, but the elements that are starting to come together can significantly help larger organizations manipulate enormous datasets, bringing them closer to real-time analysis.

 

Big Data is definitely a trend for some organizations that have massive computing capabilities, but the practical benefits for a medium-sized organization are still very questionable. Yet IBM, Oracle, Amazon, Microsoft, and many others see this as a trend that has specific resonance for organizations that are trying to get to the next level of competitive dominance in their respective industries.

Analytics Trends for the Mid-Sized Organization

For mid-sized organizations (what is traditionally thought of as the midrange), the high-value, high-cost analytic trends popularized by Gartner and others bear little resemblance to what is actually occurring in the analyst cubicles back home. And this gulf between what's being marketed by the big-name analytics vendors and what's being purchased by the medium-sized shops is driving software analysts berserk.

 

One brief example is the rumor that has been circulating for some time that IBM has scrapped its marketing campaigns for Cognos Express. Is it true? And if so, why?

 

Well, first of all, IBM is merely withdrawing some of the individual product elements of Cognos Express but is replacing certain parts of the product line with something also called Cognos Express (following the historic IBM lore of "This Page Left Intentionally Blank").

 

On the other hand, is Cognos Express really making money for IBM? That's hard to quantify from the outside. And what does this say about the usefulness of analytics for the midrange?

 

Yet little companies like NGS, Rapid Decision, and many others are doing well by focusing their software and marketing efforts on the real business of creating and maintaining reasonable data warehouses that feed into home-built management dashboards. Perhaps it's because the costs of these BI tools are much more realistic than the systems proposed by the big players in the analytics marketplace, especially considering that 75% of Lavastorm's surveyed analysts are still using MS Excel and MS Access as analytics tools.

 

So what are the real trends in data warehouses, business intelligence, and analytics? The answer seems to be that the big companies will continue to push their highly complex, high-value, visionary products and services onto their large corporate clientele. But the small and medium-sized organizations will continue to manage their analytics expenditures with acumen and moderation.

Thomas Stockwell

Thomas M. Stockwell is an independent IT analyst and writer. He is the former Editor in Chief of MC Press Online and Midrange Computing magazine and has over 20 years of experience as a programmer, systems engineer, IT director, industry analyst, author, speaker, consultant, and editor.  

 

Tom works from his home in the Napa Valley in California. He can be reached at ITincendiary.com.

 

 
