STP in treasury is often viewed atomistically – for instance, avoiding the rekeying of payments into e-banking systems and of foreign exchange deals into treasury management systems. To get maximum efficiency, STP has to be viewed holistically.
With the digital era more than half a century old, it is surprising how many treasuries still interface with operating companies – and even banks – via email and Excel spreadsheets. But the fact that email and Excel enable enterprise-wide data to be collected and processed proves, ipso facto, that all the needed data is available in some system or other.
The real problem is normally that the data sits in disparate systems. Some may be too old to have modern APIs, yet they are capable of outputting something that ends up in Excel. And whatever Excel can read (or have pasted into it), other systems can read.
The key point is that the data is out there, normally in some kind of structure that makes it machine readable. Machine-readable data is better handled by data processing – i.e. software – than by humans. This is especially true of the finance data that treasury normally consumes.
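As a minimal illustration (the file layout and field names here are invented), a few lines of Python can aggregate a cash-flow export that would otherwise be summed by hand in Excel:

```python
import csv
import io
from collections import defaultdict

# A hypothetical CSV export of the kind a source system might produce for Excel.
export = io.StringIO(
    "date,currency,amount\n"
    "2024-07-01,EUR,1500.00\n"
    "2024-07-01,USD,-250.00\n"
    "2024-07-02,EUR,300.00\n"
)

# Sum expected cash flows by currency -- no rekeying, no copy-paste.
totals = defaultdict(float)
for row in csv.DictReader(export):
    totals[row["currency"]] += float(row["amount"])

print(dict(totals))  # {'EUR': 1800.0, 'USD': -250.0}
```

The same structure that makes the file Excel-friendly makes it trivially consumable by software.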
In the dreams of ERP vendors, all this should be moot. After all, ERP means enterprise resource planning, and the idea is that everything is integrated in the ERP. Also, the ‘planning’ part of ERP implies business plans, including financial plans – meaning treasury can simply read the cash flow forecast from the ERP.
That’s the theory. In practice many enterprises use their ERP as a glorified general ledger. So before diving into APIs, robotic process automation (RPA) and other workarounds, it makes sense to investigate why the ERP is not being used in an integrated manner. Often it is because implementing basic accounting functionality was such a costly nightmare that no one wants to try other functionality. Or else IT is overwhelmed with the two-yearly version upgrade cycle – anyone for SaaS?
Treasury will probably not drive a full implementation of ERP across the enterprise, but it may be worth investigating whether the ERP could provide a decent solution for cash flow forecasting.
When, for whatever reasons, there is not one integrated ERP covering treasury’s information needs, some way of connecting systems is required. The bad old way of connecting systems is to manually rekey data from one system into the other.
Arguably more dangerous is to copy-paste from the source system into Excel, mess up the data in Excel with formula errors and more copy-pasting, and then copy-paste the resulting stew into the target system. Compared to the bad old way described previously, this simply allows humans to mess things up at scale.
File transfer is much more reliable and still a perfectly valid way to connect systems. Typically, the source system exports a file to a directory that the target system checks regularly; when the target system detects the file, it imports it. This approach has been used for decades and is reliable and convenient.
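The export-and-poll pattern just described can be sketched as follows – the directory, file naming and CSV layout are all assumptions for illustration:

```python
import csv
from pathlib import Path
from tempfile import TemporaryDirectory

def export_file(outdir: Path, transactions: list[dict]) -> Path:
    """Source system: write a batch of transactions into the transfer directory."""
    path = outdir / "payments_0001.csv"
    with path.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "amount"])
        writer.writeheader()
        writer.writerows(transactions)
    return path

def poll_and_import(indir: Path) -> list[dict]:
    """Target system: pick up any waiting files, import them, then mark them done."""
    imported = []
    for path in sorted(indir.glob("*.csv")):
        with path.open(newline="") as f:
            imported.extend(csv.DictReader(f))
        path.rename(path.with_suffix(".done"))  # ensure the file is not imported twice
    return imported

with TemporaryDirectory() as d:
    transfer_dir = Path(d)
    export_file(transfer_dir, [{"id": "1", "amount": "100.00"}])
    rows = poll_and_import(transfer_dir)
    print(rows)
```

Renaming (or archiving) the consumed file is the step that makes the pattern safe to run on a schedule.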
A lot of corporate-to-bank communication is done this way using SWIFT standards. The bottom line is that if the data can be put into Excel (other than by rekeying), then it can be transferred to the target system – avoiding all the risks involved in manual handling, not to mention the brain atrophy it involves.
File transfer typically handles bulk transactions. It is possible to export an individual transaction, but most file transfers comprise many transactions. Status messaging and error handling in the file transfer space is typically by return file. In other words, the source system exports transactions, the target system imports them and exports the status of the import in another file, which is in turn read by the source system. Although this may sound laborious, in practice it normally happens within seconds.
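The return-file pattern can be sketched like this – the status values and field names are invented; in practice the statuses would be written to a file that the source system reads back:

```python
# Target side: import a batch and build a per-transaction status "return file".
def import_batch(batch: list[dict]) -> list[dict]:
    statuses = []
    for txn in batch:
        try:
            amount = float(txn["amount"])
            status = "ACCEPTED" if amount != 0 else "REJECTED: zero amount"
        except (KeyError, ValueError):
            status = "REJECTED: bad amount"
        statuses.append({"id": txn.get("id"), "status": status})
    return statuses

batch = [
    {"id": "T1", "amount": "100.00"},
    {"id": "T2", "amount": "oops"},  # a deliberately malformed record
]
return_file = import_batch(batch)  # source system would read this back to close the loop
print(return_file)
```

Note that status is reported per batch round-trip, not per transaction in real time – the key contrast with APIs below.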
APIs are designed for atomic transactions – an API sends and reports status on each individual transaction. This is conducive to near real-time processing between systems.
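The atomic pattern – one call, one transaction, one immediate status – can be sketched with a stub standing in for a bank's endpoint (all names here are invented; a real integration would make an authenticated HTTP call per transaction):

```python
# Stub standing in for a bank's payments API endpoint (hypothetical names).
def post_payment(txn: dict) -> dict:
    """Accept or reject a single payment and return its status immediately."""
    if txn.get("amount", 0) <= 0:
        return {"id": txn.get("id"), "status": "rejected", "reason": "non-positive amount"}
    return {"id": txn["id"], "status": "accepted"}

# Each transaction gets its own call and its own synchronous status.
results = [post_payment(t) for t in
           [{"id": "P1", "amount": 250.0}, {"id": "P2", "amount": -10.0}]]
for r in results:
    print(r["id"], r["status"])
```

Because the status comes back on the same call, errors surface transaction by transaction rather than in a later return file.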
Although APIs are much in the news following PSD2 and other API and open banking initiatives, they are as old as computers. For example, “apps” use API calls for operating system services such as reading the keyboard and saving files. The browser is using API calls to display this text on the screen now.
APIs are used within ERPs to enable different modules to communicate. And they are extensively used on the web, for instance in ‘mash-ups’.
Using APIs between different systems requires that both source and target systems use compatible APIs. To a limited extent, middleware can translate between different API standards. Older systems that do not support APIs cannot easily be connected via API.
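What such middleware does can be sketched as a field-by-field mapping between two hypothetical JSON payload shapes (both schemas are invented for illustration):

```python
# Hypothetical translation between two payment APIs' message formats.
def translate(source_payload: dict) -> dict:
    """Map a source system's flat payment message into a nested target schema."""
    return {
        "paymentId": source_payload["id"],
        "amount": {
            "value": source_payload["amt"],
            "ccy": source_payload["currency"],
        },
        "beneficiary": source_payload["payee"],
    }

msg = {"id": "P42", "amt": "1500.00", "currency": "EUR", "payee": "Acme GmbH"}
print(translate(msg))
```

The mapping itself is usually easy; the hard part is agreeing semantics (value dates, character sets, truncation rules) on both sides – which is why middleware can only translate "to a limited extent".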
To illustrate, consider SWIFT. FileAct is a file-transfer method of corporate-to-bank communication. FIN is an API-like method – individual MT101 messages are checked and acknowledged by the SWIFT network before being sent on to the banks.
APIs – where available – are generally the best way to connect systems, and are likely to be the most future proof.
RPA is an important evolving technology for those situations where APIs and file transfers are not feasible – typically because of old systems and/or lack of budget.
RPA is at its core a way to automate manual operations on computer systems, based on screen scraping. An RPA robot can broadly repeat any sequence of actions a human would perform: click the “Add” button, key in the transaction data, then click the “Save” button.
Anyone who has faced inscrutable error messages and other computer glitches will guess that the above quickly gets more complicated in the real world. So, RPA vendors build in various degrees of flexibility and even intelligence to keep the robots running smoothly.
Advanced RPAs can be programmed to make simple decisions that humans would otherwise make, and are being rolled out for insurance and mortgage application processing, for example.
In the context of connecting systems, a typical use case for RPA would be to scrape transaction data from the screen(s) of the source system and key that data into the target system. In practice, commercial RPA software can also process the data it handles, for instance to validate or translate it.
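A sketch of that validation/translation step, assuming the robot has scraped raw text fields from the source screen (the field names and formats are assumptions, not any vendor's API):

```python
from datetime import datetime

def clean_scraped_row(raw: dict) -> dict:
    """Validate and normalise one screen-scraped transaction before keying it in."""
    amount = float(raw["amount"].replace(",", ""))            # "1,234.56" -> 1234.56
    value_date = datetime.strptime(raw["date"], "%d/%m/%Y").date().isoformat()
    currency = raw["ccy"].strip().upper()                     # " eur " -> "EUR"
    if len(currency) != 3:
        raise ValueError(f"bad currency code: {currency!r}")
    return {"amount": amount, "value_date": value_date, "currency": currency}

scraped = {"amount": "1,234.56", "date": "05/07/2024", "ccy": " eur "}
print(clean_scraped_row(scraped))
```

Anything that fails validation is exactly the kind of record the robot should route to a human rather than key in blindly.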
Example use case
Foreign exchange (FX) hedging is a process that is often not handled in a holistic manner. A typical legacy process might run to over a dozen steps. The subsidiary manually collects data from sales and procurement systems, builds an Excel of expected FX cash flows for its entity, and emails it to treasury. Treasury then has to progress it through many more steps: copy-pasting into a consolidated Excel; manually entering the new net forwards to be executed into the eFX platform; checking confirmation emails; and approving payments before finally being able to manually reconcile bank accounts.
This is all fraught with risk of errors and probably requires multiple reconciliations and manual checks to ensure reasonable accuracy. It can be much improved from operational risk and cost perspectives.
The reformed process begins with sales and procurement data being exported directly to the TMS and synchronised with the forecasts therein. The TMS uploads the new net forwards to the eFX platform; treasury manually prices and executes them there; the eFX platform uploads the executed forwards back to the TMS; confirmations are matched automatically, with only exceptions needing human intervention; and the TMS exports the required payments to the banks and auto-reconciles the bank accounts, again with only unreconciled items needing human intervention.
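The auto-reconciliation step at the end of that chain can be sketched as matching statement lines to expected payments by reference and amount, with anything unmatched routed to a human (the data shapes and tolerance are invented for illustration):

```python
def auto_reconcile(expected: list[dict], statement: list[dict]):
    """Match bank statement lines to expected payments by reference and amount."""
    matched, exceptions = [], []
    open_items = {p["ref"]: p for p in expected}
    for line in statement:
        candidate = open_items.get(line["ref"])
        if candidate and abs(candidate["amount"] - line["amount"]) < 0.005:
            matched.append(line["ref"])
            del open_items[line["ref"]]
        else:
            exceptions.append(line)  # only these need human intervention
    return matched, exceptions, list(open_items.values())

expected = [{"ref": "FX001", "amount": 1000.0}, {"ref": "FX002", "amount": 500.0}]
statement = [{"ref": "FX001", "amount": 1000.0}, {"ref": "???", "amount": 42.0}]
matched, exceptions, still_open = auto_reconcile(expected, statement)
print(matched, exceptions, still_open)
```

The point of the design is that the straight-through path handles the bulk of items, while humans see only the exception queues.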
This is much safer and cheaper, and it also frees employees for more value-added activity. It can be implemented even in the most heterogeneous system environments, including those with sclerotic legacy systems.
Current technologies enable treasurers to stitch together disparate systems to build effective processes that reduce risk and save time and money while making employees happier. It is no longer necessary to wait for new silver bullet solutions – alloys are more durable.