
IDG Contributor Network: In data management, the history is the future

Two-thirds of today’s businesses have digitized, and the rest are moving in that direction. While right now this is happening in lockstep with rapid, global growth in spaces like mobile e-commerce and the internet of things (IoT), in the near future it will go beyond internet-based and -connected businesses to mean augmented reality. No longer is digital a cost of business; it is business.

Yet obstacles stand in the way. Fear prevents some companies from embracing digital transformation, though a more quantifiable barrier is lack of skills. Only 15 percent of executives think their companies have the skills to complete that transformation—a percentage that hasn’t budged in two years.

How will companies compete if they can’t close the skills gap? The answer has less to do with skills than with mindset.


Data everywhere

As reams of raw letters and numbers stream through systems every second, managing the onslaught of information becomes the determining factor in whether a business grows or stagnates.

But the mindset surrounding data management hasn’t kept pace. Digital technology has created the need for data management to be sleek and agile, something that can only happen when it’s iterative. Yet too often software is developed definitively, rendering the product clunky and obtuse. This happens because of a lack of creativity and conceptualization. A solution for agile data management begins with its history.

The industrial blueprint and the origins of information technology

A century before American companies were pioneering computing technology, they were revolutionizing industrial technology. Henry Ford and his production boss, Charlie Sorensen, realized that by creating a line of employees, each charged with a specific assembly task, whose parts arrived on a conveyor belt, they could reduce assembly time, increase output, and control the workflow.

They began implementing their plan in April 1913. The results were immediately apparent. Engine assembly time was cut from ten hours to less than four; chassis from 12 to six and, by the end of the year, to 2.3. The number of cars produced, which had steadily risen to nearly 69,000 by 1912, skyrocketed to more than 170,000. Company profits rose, worker wages doubled, and market prices halved.

This ingenious feat of engineering, otherwise known as “waterfall design” for its largely irreversible downward flow, kept the American economy humming through two world wars and the ensuing decades.

In the meantime, the new industry of information technology, built on advances in computing and telecommunication systems, was speeding up business practices and daily life. Mechanical tasks that had forever been completed by people—like filing and accounting—were starting to be done by machines.

The machines, “hardware,” ran on “software,” lines of code that packed tons of transmissible information onto tiny chips. In the 1970s and ’80s, the major software used by businesses was enterprise resource planning (ERP), a monster that could integrate supply chain management (SCM), customer relationship management (CRM), and business intelligence (BI).

ERP was born out of, and created to facilitate, the waterfall design. The design’s five sequential stages—requirements, design, implementation, verification, maintenance—were considered the logical way to build a modern technology system. Why wouldn’t they be? If they had worked for industry, they would work for information. Even the lexicon of the nascent IT department reflected the industrial mindset: the data the systems tracked would be stored on a “base” until they grew large enough to be relocated to a “warehouse.”

The mindset stuck. IT teams, particularly at large organizations, have thrived on interfacing with the business teams to create a master blueprint for huge projects. And that’s been a problem.

As Moore’s Law prophesied more than 50 years ago, processing power has increased at warp speed. For instance, the first Nintendo Entertainment System, released in 1985, had more processing power than the Apollo Guidance Computer that helped take astronauts to the moon. Of all manmade things, software has grown like nothing else, and information has gone from scarce to superabundant.

For the software developers who power our modern information economy, this acceleration has been profound. No longer do they have time to waterfall-design huge computing projects. The miracle system that Ford created, and that drove our entire society, no longer applies.

Agile software development and the minimum viable product

Around 2005 (the early days of social media and smartphones), software firms started adopting agile software development principles. A main tenet was to work in short “sprints,” where developers would begin a project by asking themselves: “What do we think we want to accomplish, what’s a smart end result, and what can we hack together to get a minimum viable product (MVP)?” Once they had an MVP, they would bring it to a client (or consumer), gather feedback, and keep growing and refining the MVP from there.

A CRM provides a good example of why most software development teams today use agile methods. Historically, CRMs fell under IT’s domain. With input from Sales, Customer Service, Business, and Communications, IT gamed out a master blueprint, then returned a year later with a solution, only to discover that everything had changed. Sales and Customer Service had new problems, while Business, which hadn’t articulated its needs well in the first place, had lost interest. The CRM was subpar but workable; it met 60 percent of the requirements from a year earlier, and because so much had already been spent on it, it got adopted anyway.

An agile software development approach instead views a CRM as a dumb interface to the customer “master” record. It splits the CRM to address two distinct problems: usage and data mastery. The former asks how users will interact with it, while the latter addresses data issues, like duplication, identifiers, and matching and merging.
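The data-mastery half of that split is concrete enough to sketch in code. The example below is a minimal illustration, not any vendor’s API; all field names and records are hypothetical. It deduplicates raw customer records by a normalized email key and merges each group into one master record, keeping the first non-empty value per field:

```python
# Minimal sketch of the "data mastery" side of a decoupled CRM:
# deduplicate raw customer records and merge them into master records.
# Field names (email, name, phone) are illustrative assumptions.

def normalize_email(email):
    """Lowercase and strip whitespace so 'Ada@X.com ' matches 'ada@x.com'."""
    return email.strip().lower()

def merge_records(records):
    """Merge duplicates: keep the first non-empty value seen per field."""
    master = {}
    for rec in records:
        for field, value in rec.items():
            if value and not master.get(field):
                master[field] = value
    return master

def build_master(records):
    """Group raw records by normalized email, then merge each group."""
    groups = {}
    for rec in records:
        key = normalize_email(rec["email"])
        groups.setdefault(key, []).append(rec)
    return {key: merge_records(group) for key, group in groups.items()}

raw = [
    {"email": "Ada@example.com", "name": "Ada Lovelace", "phone": ""},
    {"email": "ada@example.com ", "name": "", "phone": "555-0100"},
]
masters = build_master(raw)
print(masters["ada@example.com"])
# {'email': 'Ada@example.com', 'name': 'Ada Lovelace', 'phone': '555-0100'}
```

In a real MDM system the matching key would be fuzzier (name plus address, phonetic matching, and so on), but the decoupling is the same: the CRM front end reads from the merged master records rather than owning the data itself.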

This decoupling cuts straight to the core of master data management (MDM). By starting with MDM questions—what data are we looking for and how will we manage it?—companies can get an MVP in weeks, not a year, and meet their business needs of the moment and the future.

Most companies have the skills to practice agile software development for MDM; they just need to abandon the waterfall mentality. Technology moves at lightning speed, software is increasingly complex, and competition in the global ecosystem of products and services demands quick, nimble solutions. It’s the only way to move ahead.

This article is published as part of the IDG Contributor Network.

https://www.infoworld.com/article/3237769/agile-development/in-data-management-the-history-is-the-future.html#tk.rss_applicationdevelopment


