A careful spend analysis that combines technology and consulting expertise can help convert raw, seemingly meaningless data into strategic sourcing opportunities and, ultimately, savings.
In most cases there are substantial opportunities 'hidden' in the spend data generated by typical enterprise systems. That data, however, is little more than noise if it cannot be interpreted. Spend analysis provides line-item visibility into enterprise spend by commodity, part number, geography, and business unit, allowing the enterprise to track expenses and clearly visualize its spend patterns.
Tracking spend provides the enterprise with invaluable business intelligence and insight. The process is continuous: data is constantly fed back to improve the system for more accurate future assessments. Spend intelligence can help enterprises better manage inventory and keep procurement procedures under control. Data visibility allows them to identify cost changes and account for inflation and other factors, producing more accurate predictions for budgeting and planning. It also enables informed supply management and sourcing decisions, and supports the development of sourcing strategies that maximize buying leverage and ultimately achieve savings.
Spend analysis is not simply a one-off project but an integral part of the procurement process, performed annually, or even quarterly or monthly in more cyclical businesses. The procurement spend analysis process entails analyzing structured data in order to identify opportunities for savings.
Successful spend analysis is dependent on the ability to extract, organize and analyze the data.
Ideally, companies would like to capture 100 percent of the spend data stored in ERP systems (such as SAP), accounts payable, ACH and P-Card systems. However, that is rarely the case. Before any data can be processed or analyzed, it must be centralized. The first step in spend analysis is therefore to aggregate the data from the multiple sources throughout the company.
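As a simple illustration, the aggregation step might look like the following Python sketch using pandas; the file names, source systems and column layout are hypothetical, and a real implementation would typically extract directly from the source systems rather than from flat files.

```python
import pandas as pd

# Hypothetical extracts from different source systems; the file names
# and the "source_system" label are illustrative only.
sources = {
    "erp": "erp_purchase_orders.csv",
    "accounts_payable": "ap_invoices.csv",
    "pcard": "pcard_transactions.csv",
}

frames = []
for system, path in sources.items():
    df = pd.read_csv(path)
    df["source_system"] = system  # remember where each record came from
    frames.append(df)

# Consolidate everything into one table so that all downstream cleansing,
# classification and analysis work from a single, centralized data set.
spend = pd.concat(frames, ignore_index=True)
spend.to_csv("consolidated_spend.csv", index=False)
```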
Since the data will be used to forecast savings and, ultimately, to determine which sourcing projects and strategies should be implemented, its accuracy and validity are vital. The data should first be cleansed to eliminate spelling errors, remove duplicate records and validate numeric values such as volumes and prices. The next step is to enrich and normalize the data to eliminate discrepancies between inconsistent naming conventions. By developing a universal naming convention and a structured taxonomy, the data can then be classified into the appropriate categories. Details such as acronyms in supplier names should be corrected, and supplier parent-child relationships should be recognized to maximize clarity and spend visibility.
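A minimal sketch of the cleansing and normalization step might look like the following; the column names ("supplier", "amount") and the alias and parent-child mapping tables are purely illustrative assumptions.

```python
import pandas as pd

spend = pd.read_csv("consolidated_spend.csv")

# Cleansing: drop exact duplicates and records with invalid amounts.
spend = spend.drop_duplicates()
spend["amount"] = pd.to_numeric(spend["amount"], errors="coerce")
spend = spend.dropna(subset=["amount"])

# Normalization: map spelling variants and acronyms of supplier names
# to a single canonical name (the mapping table is illustrative).
alias_to_supplier = {
    "INTL BUSINESS MACHINES": "IBM",
    "IBM CORP": "IBM",
    "H-P": "HP",
}
spend["supplier"] = (
    spend["supplier"].str.upper().str.strip().replace(alias_to_supplier)
)

# Roll child suppliers up to their parent for a holistic view of spend.
child_to_parent = {"HP": "HP Inc.", "HP ENTERPRISE": "HP Inc."}
spend["parent_supplier"] = (
    spend["supplier"].map(child_to_parent).fillna(spend["supplier"])
)
spend.to_csv("cleansed_spend.csv", index=False)
```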
With organized data at hand, additional business intelligence is necessary for meaningful analysis. By slicing and dicing the aggregated data across dimensions such as suppliers, categories and departments, in-depth insight into spending patterns is gained. Combined with consulting expertise, this granular level of visibility reveals saving opportunities and sourcing strategies, which provide answers to key cost management questions (Figure 2). Functional expertise is imperative to understand trends in the data and to identify data points that indicate sourcing opportunities. With this knowledge, enterprises can plan and budget for the next year and formulate a sourcing plan.
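For example, once the data is classified, a few lines of analysis code can produce the kind of cross-dimensional views described above; the file name and the columns ("category", "department", "parent_supplier", "amount") are hypothetical.

```python
import pandas as pd

spend = pd.read_csv("classified_spend.csv")

# Slice and dice: total spend by category and department.
by_category_department = pd.pivot_table(
    spend,
    values="amount",
    index="category",
    columns="department",
    aggfunc="sum",
    fill_value=0,
)

# Top suppliers within each category, a typical starting point for
# spotting consolidation and negotiation opportunities.
top_suppliers = (
    spend.groupby(["category", "parent_supplier"])["amount"]
    .sum()
    .sort_values(ascending=False)
    .groupby(level="category")
    .head(5)
)

print(by_category_department)
print(top_suppliers)
```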
Knowledge that is not relayed to the right individuals will have no impact on the enterprise. To obtain stakeholder support, the answers to the questions above have to be presented in a comprehensible manner that captures the stakeholders' attention.
It sounds simple, so why do enterprises not do it themselves? The whole process is rather labor intensive and time consuming, and most enterprises simply do not have the resources.
From a data standpoint, spend data spans various departments. Because the data comes from disparate sources, it is difficult to monitor consistently across those departments, and the data collected is often incomplete or inaccurate. There is also no common classification schema or taxonomy throughout the enterprise; the finance department may categorize computers as "PCs" whereas the sales department may refer to them as "Computers". From a data standpoint the two are the same, yet they are identified as two different items in a database. Manually normalizing and cleansing these discrepancies is tedious and time consuming.
Many companies lack efficient and repeatable data cleansing and classification capabilities, so they turn to technology to automate the process of consolidating and cleansing the data. However, a corporate technology department is not built for that purpose and does not have the functional expertise or bandwidth to support a comprehensive spend analysis process.
Analytics is another major concern. Each enterprise has its own core expertise, and procurement and consulting are rarely its specialty. The lack of analytical skill and category expertise makes it difficult for individuals to adequately analyze the data. Many enterprises have each department head maintain basic Excel sheets to run the analysis. Without a functional and efficient platform, however, the analysis is limited to the Excel skills and capabilities of each individual, leaving the process fairly basic and non-standardized across the company. Furthermore, the category expertise and experience of individuals across the enterprise is not leveraged or captured in a system for future analysis.
Technology is a natural way to streamline and automate these processes, offering a repeatable data cleansing and classification mechanism. There are currently numerous technology suites on the market with varying capabilities, mainly focused on the earlier stages of the spend analysis cycle: cleansing, validation and classification. Current technology efficiently normalizes and classifies data using rule-based and artificial intelligence mechanisms to obtain data visibility in a short amount of time. However, it is essential that the tools used incorporate functional intelligence that can sort data correctly according to the enterprise's specific needs.
The rule-based mechanism normalizes the data by setting constraints and rules for classification. Specialist technology suites such as GEP Spend incorporate third-party online sources or consultant expertise databases to define the criteria. Based on these rules, and using a nearest-neighbor methodology, artificial intelligence assigns each field to the closest suggested classification. This step builds a custom taxonomy structure by converting similar descriptions and acronyms to a common classification schema and identifying parent-child relationships for a more holistic view of the data.
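As an illustration of the general approach (a generic sketch, not a description of GEP Spend's internal implementation), a nearest-neighbor classifier can assign each line item the category of its most similar reference description; all descriptions and categories below are hypothetical.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

# Reference descriptions and categories stand in for the rule /
# reference database; all values here are illustrative.
reference_descriptions = [
    "laptop computer 15 inch",
    "desktop pc workstation",
    "ergonomic office chair",
    "laser printer toner cartridge",
]
reference_categories = ["IT Hardware", "IT Hardware", "Furniture", "Office Supplies"]

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(reference_descriptions)

# Each new line item is assigned the category of its nearest neighbor.
classifier = KNeighborsClassifier(n_neighbors=1)
classifier.fit(X, reference_categories)

new_items = ["pc computer", "toner for laser printer"]
predicted = classifier.predict(vectorizer.transform(new_items))
print(list(zip(new_items, predicted)))
```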
Data analysis gives meaning to the data and provides insight into optimizing buying leverage. An intuitive platform makes the process more sophisticated, allows efficient slicing and dicing, and standardizes the analytical capabilities of individuals across the enterprise. Automated mapping of spend patterns reflects the dynamics of the supply market. This enables frequent assessment and highlights irregularities and spikes in spend for the company's attention. Consistently rising prices can also be captured, raising awareness of the need for a new sourcing strategy or an alternative product.
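One simple way to flag such irregularities is sketched below, using hypothetical columns and a plain two-standard-deviation rule rather than any particular vendor's method.

```python
import pandas as pd

spend = pd.read_csv("classified_spend.csv", parse_dates=["invoice_date"])

# Aggregate spend per category per month.
spend["month"] = spend["invoice_date"].dt.to_period("M")
monthly = spend.groupby(["category", "month"])["amount"].sum()

# Flag months where a category's spend deviates from its historical
# average by more than two standard deviations: candidate spikes.
mean = monthly.groupby(level="category").transform("mean")
std = monthly.groupby(level="category").transform("std")
spikes = monthly[(monthly - mean).abs() > 2 * std]
print(spikes)
```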
The flexibility of a technology suite is important so that it can be upgraded with each refresh cycle and the scope of the program continuously expanded. The ability to incorporate inflation and industry indices allows for more precise predictions. External metrics such as performance indices, supplier risk and department compliance scores should also be factored in to further enhance the analytical capabilities of the tool.
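For instance, factoring a (hypothetical) inflation index into the analysis helps separate genuine price or volume increases from general market inflation.

```python
import pandas as pd

# Hypothetical annual inflation index (base year = 1.00); in practice
# this would come from an external industry or price index feed.
inflation_index = {2021: 1.00, 2022: 1.08, 2023: 1.12}

spend = pd.read_csv("classified_spend.csv", parse_dates=["invoice_date"])
spend["year"] = spend["invoice_date"].dt.year
spend["real_amount"] = spend["amount"] / spend["year"].map(inflation_index)

# Compare nominal and inflation-adjusted spend by year.
print(spend.groupby("year")[["amount", "real_amount"]].sum())
```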
Department heads and CFOs are busy individuals who may not have the time to dig through reports to make sense of the analysis. Analyzing the data is the difficult part, but creating presentable reports is also time consuming, and it is necessary to win stakeholder buy-in. Technology once again comes in handy in communicating information to stakeholders and senior management. Automated dashboards and standard reports help individuals efficiently reproduce uniform, professional reports.
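For example, a standard spend summary can be regenerated automatically each period; the file and column names here are, again, illustrative assumptions.

```python
import pandas as pd

spend = pd.read_csv("classified_spend.csv")

# A standard spend-by-category report, written to Excel so the same
# format can be reproduced every period without manual rework.
summary = (
    spend.groupby("category")["amount"]
    .agg(total_spend="sum", transactions="count")
    .sort_values("total_spend", ascending=False)
)
summary.to_excel("spend_by_category.xlsx", sheet_name="Spend Summary")
```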
Providers offering both specialized procurement consulting services and technology suites are rare. Hence, it is understandable that there are currently few products on the market that offer a hybrid of technology capabilities and consulting expertise. Functional expertise is imperative for exhaustive analysis. Domain expertise and accumulated experience have shaped GEP's insight into sourceable spend per category, the order in which categories should be sourced and the volume of potential savings that can be achieved. With this knowledge and category expertise, supported by in-depth spend analysis, consultants are able to draw up sourcing plans and devise strategies, delivering an enterprise-wide, customized opportunity assessment. This is essentially the missing link in a comprehensive spend analysis process. With that valuable expertise embedded into a technology suite, a unified suite can execute the entire spend analysis, from extracting data to providing intelligent assessment, identifying opportunities to save and projecting savings more accurately.
With its philosophy of marrying technology and consulting, GEP is able to take advantage of its accumulated functional expertise and integrate that knowledge into its own technology suite. GEP's business model centers on client delight. By interacting with our clients day to day, we know what they need and want. By incorporating category expertise into our highly automated spend analysis tool, we answer our clients' needs and present them with a mechanism to conduct their spend analysis efficiently.
Theme: Procurement