Big Data and Organizational Fluidity

This article also appeared in Business Insider

The term “Big Data” and the unfortunate hype surrounding it obscure a crucial development in the management of organizations of every size. We have definitively moved into an era of copious data, and the challenge for all stakeholders in an organization is to find ways to analyze that data, discern actionable insights from the analysis, implement changes based on those insights, and then analyze new data to measure actual versus forecast results. The days when analysis was anything other than a core feature of an organization's day-to-day operations are over; we have entered a period of continuous, iterative change.

This new world calls for experimentation as a central operating tenet. The organizations that thrive in coming years will be those capable of embracing this new fluidity.

Gold in the Terabytes

For capital providers, there are outsized returns to be generated by seizing the opportunity that our new, data-immersed age offers. The wealth of data presents tantalizing prospects: savvy management teams and their advisors can push companies willing to make the effort into a virtuous cycle of continuous improvement, identifying compelling growth opportunities, improving margins, eliminating unnecessary expenditures, and ultimately driving substantial gains in enterprise value.

We are moving far beyond simple SKU analysis and the development of optimal pricing models. By drawing on high-quality data sources both inside and outside an organization, management can relentlessly target areas of inefficiency, and with the continuous stream of data that most organizations generate, small projects that yield results can quickly be scaled into organization-wide initiatives.

The Certainty of Casualties

Management by rule of thumb is anachronistic. Based on what we are seeing in the market, we anticipate that small and mid-sized organizations willing and able to adjust will find a data-savvy business model to be a compelling force multiplier. Those organizations that fail to adapt will find themselves at a severe and growing disadvantage as their competitors use superior insights to grow market share and identify new pockets of opportunity.

Blind adherence to data will also produce casualties. Massive data sets are almost by definition “noisy,” and insights derived from such data must take into account both the data's strengths and its limitations. Ironically, this increasingly quantitative field needs a solid qualitative framework more than ever, to ensure that good “sanity checks” are not forgotten.

About the Author

David Johnson (@TurnaroundDavid) is Founder and Managing Partner of Abraxas Group, a boutique advisory firm focused on providing transformational leadership to middle market companies in transition. Over the course of his career, David has served as financial advisor and interim executive to dozens of middle market companies. David is also a recognized thought leader on the topics of business transformation, change management, interim leadership, restructuring, turnaround, and value creation. He can be contacted at: