Understanding spend analytics and where it's headed.
Not so long ago I wrote about “hacking innovation” and presented a basic model that starts with understanding your company’s entire development journey. This provides a useful, if not essential, framework from which to start mapping new and innovative “symmetrical” (as opposed to risky “asymmetrical”) extensions to an existing portfolio of products and services and an existing customer base. I applied the same process to the industry I have been in for some time and found it particularly helpful when considering how spend analytics is evolving and what I believe it may look like in the future.
The evolutionary phases of spend analytics:
Dark Ages – Focus on “who” you spend money with
Spend analytics, like many new business technologies, began tactically, with a few smart people hacking away at solving a problem. The first use cases focused on how organizations set about rationalizing their supplier bases in an effort to drive more spend through a chosen few. Understandably, the focus was on what we call direct spend categories, where the largest benefits existed and where the discipline around execution and process was strongest. As we have seen, supplier rationalization in the less critical indirect categories is far harder to implement and make stick.
In the early days, the enterprises implementing this tended to be US manufacturers. They were reacting to new, lower-cost overseas competitors who were entering their rich domestic market, aggressively exploiting a huge and captive consumer base and eroding the home-field advantage they had enjoyed for so long.
Pre-Dawn – Focus on “what” you spend money on
In order to remain competitive, spend management had to grow up. From its early tactical beginnings, procurement quickly became an activity with recognizable benefits that could, in theory, transform the fortunes of a business. Procurement teams started to look at spend more strategically, building teams around categories, which required a more granular understanding of the spend. This saw reasonably complex activities such as data classification and enrichment move from supplier level to line-item and material level.
It was at this juncture that spend data quality became a big issue. As soon as procurement teams broadened their data set beyond the vendor master file into the transactional data, they hit a bottleneck. The vision of becoming more data driven was being seriously frustrated by the incredibly poor quality of spend data. This was compounded by the complexities of extracting data from sources such as SAP. To overcome this, IT quite logically turned to large and expensive data warehouse projects, but in almost all situations these made a bad situation worse. That is a subject for another time, one that looks at the unique way spend data must be prepared and managed.
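To make the data-quality problem concrete, here is a minimal sketch (in Python, with invented field names and values) of the kind of normalization and de-duplication that raw vendor and invoice records typically need before any meaningful analysis is possible:

```python
import re
import pandas as pd

# Hypothetical raw AP extract: the same supplier appears under several
# spellings and amounts arrive as inconsistently formatted text.
raw = pd.DataFrame({
    "vendor_name": ["ACME Corp.", "Acme Corporation", "acme corp", "Globex LLC"],
    "amount": ["1,200.50", "980", "2 300.00", "5,000"],
})

def normalize_vendor(name: str) -> str:
    """Crude normalization: lowercase, strip punctuation and legal suffixes."""
    name = re.sub(r"[^a-z0-9 ]", "", name.lower())
    name = re.sub(r"\b(corp|corporation|inc|llc|ltd)\b", "", name)
    return " ".join(name.split())

clean = raw.assign(
    vendor_key=raw["vendor_name"].map(normalize_vendor),
    amount=pd.to_numeric(
        raw["amount"].str.replace(r"[,\s]", "", regex=True), errors="coerce"
    ),
)

# Three spellings of "Acme" collapse to one key, so spend can finally be aggregated.
print(clean.groupby("vendor_key")["amount"].sum())
```

In reality this is far messier – dozens of ERP extracts, multiple languages and millions of lines – which is why spend data preparation became a discipline in its own right.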
What do companies do when they have a problem they struggle to solve? Hire a consultant. Companies turned to consultants to help resolve these data issues and advise them on how best to extract value from the data. Consultants built teams to process this poor-quality data manually, and to a lesser extent they continue to do so.
It was during this second phase that we saw the first big players like Ariba emerge. They began to introduce packaged and integrated procurement technologies that helped enterprises standardize and scale procurement processes and services (e.g. eAuctions, Contract Management and SRM). These platforms included basic spend reporting, which tended to look at the spend being processed “on” or “within” these platforms, not elsewhere.
Dawn – Focus on “what” you buy, but “how” you buy started to gain traction
As companies shifted into strategic direct and indirect category management, they required a more sophisticated and holistic approach to spend analytics. To achieve this, a new and advanced approach to master spend data management and analytics was needed, and niche specialist players emerged and rose to the forefront, competing directly with both the manual processes and the modules on the P2P platforms. Spend analytics as a standalone decision-making platform began to gain credibility, standing alongside the P2P platforms that executed the associated business processes. In my previous life as an investment banker, we used a Bloomberg or Reuters platform to access the vast amounts of information needed to make decisions, which were then executed on a separate execution platform.
Two separate but interdependent categories emerged – analytical and transactional/process technologies. The style of computing and the technologies underpinning each are very different. With the general understanding that crystallizing and capturing the benefits of procurement transformation resided in understanding “how” you procure products and services across the organization, spend analytics took another step, from “who” and “what” to “how”. The great sourcing work that procurement was executing was being lost to poor compliance and weak processes. P2P analytics emerged, providing procurement and business transformation experts with insights into every aspect of the enterprise's P2P cycle. Uncovering which processes were broken and would benefit from being digitalized and migrated to the P2P platform was essential.
However, there was a fly in the ointment. The SIs advising on P2P deployments offered clients the chance to customize these P2P modules to accommodate their specific needs, which invariably varied between divisions and regions within the same company. France's processes might differ from the US's, and the SI would facilitate these differences. The platforms became over-customized, which made them difficult to manage and the expected savings hard to realize. These systems were sold on a vision of procurement transformation, but what actually transpired was procurement complication. SIs did well out of customizing and configuring P2P platforms, whilst the P2P vendors ended up suffering because clients struggled to release the expected value.
Cloud computing created a much-needed tectonic shift that materially reduced the risk in this business model. Cloud computing rests on a multi-tenant architecture that simply doesn't support high levels of customization. Companies now investing in new, cloud-delivered P2P platforms don't have the chance to customize as they did before; they can only configure, and so they have to comply with a single best-in-class process across the enterprise. SIs are no longer in the business of customization; they focus on some configuration and on the hard yards of “procurement transformation”, i.e. supporting a company as it changes and aligns to a single P2P process.
It was at this stage that Ariba really began to gain traction and contenders like Coupa emerged.
Today – Focus on “how” you buy
Spend analytics platforms are moving from passive reporting to predictive and “event-driven” analytics.
Spend or transactional data by itself, however clean and classified, is no longer enough, and the material challenge of “data convergence” (i.e. pulling data from multiple different sources into one) is something every procurement team is having to address. To gain an information advantage and retain the strategic importance of procurement in the organization, data from multiple sources and systems, both inside and outside the organization, needs to come together. Architecturally, P2P platforms providing spend analytics will always lag more specialist providers, and that is why spend analytics, or procurement analytics, is better suited to being a standalone module that is interoperable with the P2P platform.
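To illustrate what “event-driven” could mean in practice, here is a deliberately simple sketch (Python, with hypothetical categories, exposures and thresholds – not any particular vendor's product) that turns a commodity price move into category-level alerts rather than waiting for a monthly report:

```python
from dataclasses import dataclass

# Hypothetical mapping of spend categories to the commodity indices they are
# exposed to, plus the annual spend behind each category.
EXPOSURE = {
    "Packaging": {"index": "pulp_paper", "annual_spend": 4_000_000},
    "Logistics": {"index": "diesel", "annual_spend": 9_000_000},
}

@dataclass
class PriceEvent:
    index_name: str
    pct_change: float  # e.g. 0.07 = +7% move since the last observation

def alerts_for(event: PriceEvent, threshold: float = 0.05):
    """Turn a commodity price move into category-level alerts as it happens."""
    if abs(event.pct_change) < threshold:
        return []
    return [
        f"{category}: {event.index_name} moved {event.pct_change:+.1%}; "
        f"estimated annual impact ~${meta['annual_spend'] * event.pct_change:,.0f}"
        for category, meta in EXPOSURE.items()
        if meta["index"] == event.index_name
    ]

# Example: a 7% diesel move immediately flags the Logistics category.
for alert in alerts_for(PriceEvent("diesel", 0.07)):
    print(alert)
```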
Now and in the future, I see procurement teams adopting a far more sophisticated approach to analytics, which in turn will underpin their growing importance in their businesses. I see these teams becoming more integrated with finance, treasury, risk and supply chain teams, and becoming more akin to active managers of a strategic portfolio of risks and opportunities, with a growing number of levers they are able to pull and push.
Summary:
Spend analytics, or procurement analytics, has yet to have its day, but it has been proven time after time to add enormous value when the conditions are right. Every company on the planet is competing harder for less, and healthy margins are being eroded faster than ever before. Procurement plays a vital strategic role in the future success of all businesses. We are embarking on an exciting journey.
Some takeaways:
I mentioned “data convergence”. Here are some data sources that should be considered in a modern spend analytics solution (a rough sketch of how a few of them might be brought together follows the list):
ALL transactional data – Requisition, PO, SKU, Volume and Price, Invoice, Contract (as much as possible)
ALL P2P data – as above
D&B – firmographics, risk and compliance data (400+ dimensions)
MarkIT – macro- and micro-economic leading indicators (e.g. the purchasing managers' index)
Commodity Prices
Social Media – risk and intelligence purposes
Cyber Risk Scores – risk of supplier being hacked
Market Research – Beroe
Benchmarks
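As a rough, illustrative sketch of what converging a few of these sources might look like (Python, with invented supplier IDs, columns and values), here are internal transactions joined to supplier risk data and a commodity index to form a single analytical view:

```python
import pandas as pd

# Internal transactional spend (invoice/PO level) -- hypothetical columns.
transactions = pd.DataFrame({
    "supplier_id": ["S1", "S2", "S1"],
    "category": ["Steel", "Packaging", "Steel"],
    "amount": [120_000, 45_000, 80_000],
    "month": ["2024-01", "2024-01", "2024-02"],
})

# External enrichment keyed on supplier (e.g. firmographic / risk data).
supplier_risk = pd.DataFrame({
    "supplier_id": ["S1", "S2"],
    "country": ["DE", "US"],
    "risk_score": [72, 35],  # illustrative 0-100 scale
})

# External market data keyed on category and month (e.g. a commodity index).
commodity_index = pd.DataFrame({
    "category": ["Steel", "Steel", "Packaging"],
    "month": ["2024-01", "2024-02", "2024-01"],
    "index_value": [101.5, 106.2, 99.8],
})

# Converge the three sources into one analytical view.
converged = (
    transactions
    .merge(supplier_risk, on="supplier_id", how="left")
    .merge(commodity_index, on=["category", "month"], how="left")
)

# One question the converged view can answer: how much spend sits with higher-risk suppliers?
print(converged[converged["risk_score"] > 50][["supplier_id", "category", "amount"]])
```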
With large, diverse data sets and a need for a more active approach to analyzing them, here are some thoughts on the technologies and tools you will need:
Architecturally, you can see why a transactional P2P environment would struggle to support this, and as we move forward a standalone capability is likely to be more favorable.
Data integration – massively important
Access to multiple 3rd party data sources
Data cleansing and enrichment
Data governance
Hadoop and SQL
Live streaming
Classification and taxonomy creation and management (see the sketch after this list)
Analytical Capabilities
Data Mining
Data Visualization
Data Analytics – Statistics, Predictive
Virtual Development Environment – Procurement needs to be able to explore without IT!
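As one illustration of the classification and taxonomy point above, here is a deliberately naive, rule-based sketch in Python (real solutions use trained models and far richer taxonomies such as UNSPSC-style hierarchies) that maps free-text line items to a small two-level taxonomy:

```python
# A tiny, illustrative two-level taxonomy; real taxonomies are far larger.
TAXONOMY_RULES = {
    ("IT", "Hardware"): ["laptop", "monitor", "server"],
    ("IT", "Software"): ["license", "subscription", "saas"],
    ("Facilities", "Cleaning"): ["janitorial", "cleaning"],
}

def classify(line_item_text: str) -> tuple[str, str]:
    """Assign a line item to the first taxonomy node whose keywords match; else Unclassified."""
    text = line_item_text.lower()
    for node, keywords in TAXONOMY_RULES.items():
        if any(keyword in text for keyword in keywords):
            return node
    return ("Unclassified", "Unclassified")

line_items = [
    "Dell laptop 15in for finance team",
    "Annual SaaS subscription - CRM",
    "Janitorial services Q3",
    "Misc site expenses",
]

for item in line_items:
    category, subcategory = classify(item)
    print(f"{category:>12} / {subcategory:<12} <- {item}")
```

The unclassified bucket is the point: it is exactly the spend that passive reporting never surfaces and that a more active analytics capability has to chase down.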