This week’s webinar notes are from a May 21st event presented by ISM and Zycus, featuring Rob Handfield, Distinguished Professor of Supply Chain Management at North Carolina State University and Director of the Supply Chain Resource Cooperative, as the main speaker. The event is available on demand on the ISM website.
I have heard Rob Handfield speak a number of times, and have always found him both compelling and clear, no matter how complex the topic. In fact, his role was my primary reason for attending this webinar. Some companies and procurement organizations are leading the charge with regard to big data and what it can be used to accomplish. Other organizations have not yet felt the need (or had the opportunity) but will benefit from being well informed on the subject. As was discussed in the event, not being limited by current practices is a key success factor when it comes to working with data – big or otherwise.
The focus of this event was on how companies are using real-time data and improved analytics to mitigate and manage risk while simultaneously capitalizing on other opportunities that reveal themselves along the way. Many analyst and research firms, such as McKinsey, Gartner, and Procurement Leaders, have studied the application of big data in procurement. They look at what we are currently doing, what we should be doing, what is interesting, and what we may currently not be able to see. The reality for practitioners, however, is the need to prioritize these areas of consideration.
That priority will dictate the specific objectives that drive our use of big data. Without such objectives, big data is ineffective and overwhelming. There is a marked difference between open-endedly trying to get an organization’s data ‘in order’ and applying it to a targeted purpose. In a procurement-specific context, data and analytics have to grow beyond spend analysis. We should always be on the lookout for opportunities to create data assets as much as we expect to harness existing ones.
Because traditional methods for reducing costs are drying up, procurement should increase the time and effort invested in total cost modeling. Working to build accurate cost models not only informs internal decisions, it also provides procurement with the opportunity to impact pricing strategy, thereby delivering the top-line impact we desire.
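To make the idea concrete, here is a minimal sketch of what a total cost model can look like in Python. It is not from the webinar; every cost component and figure is a hypothetical example, and a real model would pull these inputs from actual spend, logistics, and quality data.

```python
# Illustrative total cost of ownership (TCO) model for a purchased part.
# All component names and figures are hypothetical examples.

def total_cost_per_unit(unit_price, freight, duty_rate,
                        annual_volume, carrying_rate, avg_inventory_units,
                        defect_rate, cost_per_defect):
    """Roll landed, inventory, and quality costs into one per-unit figure."""
    landed = unit_price + freight + unit_price * duty_rate
    carrying = (avg_inventory_units * unit_price * carrying_rate) / annual_volume
    quality = defect_rate * cost_per_defect
    return landed + carrying + quality

# Two suppliers: the lower sticker price is not the lower total cost.
supplier_a = total_cost_per_unit(10.00, 0.40, 0.02, 50_000, 0.20, 4_000, 0.01, 25.0)
supplier_b = total_cost_per_unit(10.50, 0.15, 0.00, 50_000, 0.20, 1_500, 0.002, 25.0)
print(f"Supplier A: ${supplier_a:.2f}/unit, Supplier B: ${supplier_b:.2f}/unit")
```

The point of the comparison: once freight, duties, inventory carrying, and quality costs are rolled in, the supplier with the lower unit price is not necessarily the lower total cost.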
Building an analytics solution should start with a definition of the problem and boundary conditions. Do not allow yourself to be limited by the data that is easily available. Remember to model the relationships between data sources and data points, and look for directional information (information that indicates a likely trend without direct access to proof data).
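As one way to picture ‘directional information’, the sketch below fits a simple least-squares trend to a proxy series (say, a monthly commodity index) to flag the likely direction of a supplier’s input costs when direct cost data is unavailable. The series, threshold, and interpretation are hypothetical illustrations, not anything presented in the webinar.

```python
# Hypothetical sketch: infer a likely cost trend from a proxy series
# (e.g., a monthly commodity index) when direct supplier cost data
# is unavailable. Data and threshold are illustrative.
from statistics import mean

index = [101.2, 102.8, 103.1, 104.9, 106.0, 107.4]  # last six months

def trend_slope(series):
    """Least-squares slope of the series against time (index units/month)."""
    xs = range(len(series))
    x_bar, y_bar = mean(xs), mean(series)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, series))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

slope = trend_slope(index)
direction = "rising" if slope > 0.5 else "falling" if slope < -0.5 else "flat"
print(f"Proxy index trend: {slope:+.2f}/month -> input costs likely {direction}")
```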
We also cannot afford to limit our thinking about big data and analytics to solutions; it is a talent need as well. As Zycus’ Richard Waugh stated near the end of the webinar, analytics can no longer be considered a specialized skill. It must instead become a core competency. As their 2014 ‘Pulse of Procurement’ report found, internal and external data quality issues are a major pain point for procurement and the organization as a whole. Future goals for procurement and their use of big data and analytics should include better internal integration of systems, forward-looking or predictive analytics, and stronger ties between internal data and intelligence from external markets.
The ‘Pulse of Procurement’ report can be downloaded from Zycus’ site for free after registration.