Finance – the home of data
Over the last 30+ years, the office of finance has become the custodian of corporate data. Initially this arose out of audit and compliance requirements coupled with accelerating computerisation. But as access to, and ownership of, the data developed, so did insight into the value of consistent, “true” data.
Throughout the latter part of the 20th century, finance teams oversaw the migration from handwritten ledgers to automated ERP systems and the rise of spreadsheets used for reporting and analysis. This brought process improvement and efficiency to data management and improved the flow of information through organisations.
As regular management accounts evolved to contain consistent historical data, business leaders gained visibility of actual results compared to plans, whether monitoring projects, customers, products or other dimensions. This fuelled the requirement for more data at more granular levels of detail, to further drive insight and decision support.
Over time, automated systems captured and made available more and more data, rather than requiring rekeying from printouts. The availability of this data enabled corporations to add value through analysis and insight using the evolving tools – a natural fit growing out of finance. For example, finance departments began to produce basic variance analysis on a regular basis. This enabled organisations to focus on outliers and areas of interest (adverse or favourable), thereby addressing non-profitable activity or expanding profitable channels. The result: organisations gained the ability to respond rapidly to emerging issues and to anticipate change to gain a competitive edge.
Consistency of reporting and analysis also builds credibility: finance departments support non-finance functions rather than just producing annual reports of little relevance to operational departments – embedding finance as a value-add to the business.
Evolution by stealth
However, this capability evolved in an ad-hoc fashion, via data extracts into spreadsheets, and became reliant on the specific skills of individuals rather than on systematically consistent, centrally managed data models. Typically, one or two individuals with an aptitude for technology would build reporting and analysis systems on an evolutionary basis, using personal productivity tools such as Excel supported by a variety of hand-cranked data sources. These systems would be undocumented and reliant on their builder; when that individual moved on, the system was put at risk and the next spreadsheet expert stepped in to build their own (undocumented) version. In large enterprises, significant investments were made in corporate performance management (CPM) tools specifically to address this. Implementations were high-level, giving consistent, quality data at corporate level, but took significant effort to drive down to the sharp end of the business – so the immediate benefit, the ability of the data to drive better real-time decision-making, was lost.
No longer just for the big guys
This process has been taking place in large enterprises for 20+ years and is a mature market. The mid-market has lagged behind, largely due to the sheer cost of implementations (hardware, software and implementation services) – but all this has now changed. Hardware costs have tumbled thanks to Moore’s law; software can be priced for volume rather than for key accounts; and the level of functionality bundled with applications enables rapid implementations (weeks or months rather than a year or more). Data volumes are rarely an issue now, given the growth in computing power – removing at least one of the barriers to CPM for the mid-market.
The Mid-Market has the most to gain
However, these organisations have the same budgeting, forecasting and reporting processes as large enterprises, so if anything they stand to benefit more from process improvement and automation. Why? Because flatter organisations and shorter lines of command make them more agile and better suited to rapid implementations: data may be accessed easily, fewer users may be required for consensus, and leadership and control may be easier to exert. The usability of modern applications means that implementations are less technical, enabling non-IT staff to participate in, and subsequently manage, their own solution.
In short, the mid-market can react more quickly to the faster availability of accurate data and can automate its flow through organisational decision-making – the traits that characterise the Modern Finance Function.
So what steps can you take today to get you on the same path?
Think about where your organisation sits on the CPM maturity model. If the points above resonate, you are probably at stage 1, or maybe edging into stage 2. Put this into the context of the following questions:
- Do you have the right systems, with the capability to automate financial processes and analysis, and to handle the data that comes out of them?
- What level of automation does your regular reporting exhibit?
- What is the audience for your reporting – senior management, operations, and/or investors?
- How long does your reporting take to produce?
- Where is the data supporting your reports (ERP system, Excel, data warehouse)?
- How quickly can you respond to requirements for plans to support financing requirements?
- How do you validate and track against these plans? Can you?
- Do you have the right people in the right roles?
Start asking your organisation these questions to see how you measure up. And stay tuned: all of these questions will be examined in more depth in this “Building a Modern Finance Function” series. If you have any immediate questions, or want to start on the route to greater CPM maturity today, please get in touch.
See IMA’s Inside Talk webcast: A Journey to Best-in-Class CPM