There is rarely a moment in time when the forces of anarchy rise up and commit to a full-frontal assault on the concept of integrated, high-quality data.
No, the far more likely path to ruin is paved with short-sighted expediencies, misguided priorities, poorly planned performance metrics and weak-willed governance.
In any given organization, genuine data quality advocates are outnumbered a hundred or more to one. The slightest loss of conviction by executive leadership, when tempted with “quicker” screen delivery, will whip the pitchforks-and-torches crowd into a frenzy that soon overwhelms the voice of reason and leads to decisions that are immediately gratifying but strategically devastating.
As the integrated environment is incrementally polluted with data feeds of ever-diminishing quality, or bypassed entirely in search of a “quick win”, the prophets of doom who never really wanted to make the extra effort swoop in to capitalize on every tidbit of negative feedback, like vultures picking at a slowly decaying carcass.
Eventually, when the data can no longer be considered comprehensive, accurate or timely, the verdict is rendered that “integration” is a quixotic idea and that faster, easier and [immediately] cheaper is the way to go. The real potential benefits of organizational agility, informational stability and architectural flexibility are dismissed with a snort.