A bad workman always blames his tools…

Sid Jacobson, Managing Partner at Pivotal Risk Advisors, responds to our latest blog: Don’t wait until the horse has bolted – overuse of spreadsheets is creating an inaccurate picture of a firm’s credit, market, liquidity and compliance risk. We can only see the tip of the iceberg…

Read Part I here.

While prevalent spreadsheet use is highly prone to error, in any data environment those errors often go uncaught for lack of appropriate independent oversight and controls. In other words, thoughtful implementation of robust business processes, including back-testing and data verification, will mitigate these risks in both a spreadsheet and an enterprise software environment. A simple sketch of such a verification control follows.
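
To make the idea of an independent verification control concrete, here is a minimal sketch in Python. The `Trade` record, its field names, and the tolerance are hypothetical, invented purely for illustration: the point is simply that a reported figure is recomputed from source data by an independent process, and any discrepancy beyond a tolerance is flagged for review rather than silently accepted.

```python
# Minimal sketch of an independent data-verification control.
# The Trade record and field names are hypothetical illustrations,
# not a real system's schema.
from dataclasses import dataclass


@dataclass
class Trade:
    trade_id: str
    notional: float
    mtm: float  # mark-to-market value


def verify_reported_exposure(trades: list[Trade],
                             reported_exposure: float,
                             tolerance: float = 0.01) -> bool:
    """Recompute total exposure independently and compare to the reported figure."""
    recomputed = sum(t.mtm for t in trades)
    discrepancy = abs(recomputed - reported_exposure)
    if discrepancy > tolerance:
        # In practice this would raise an alert to a risk controller.
        print(f"Control breach: recomputed {recomputed:,.2f} vs "
              f"reported {reported_exposure:,.2f} (diff {discrepancy:,.2f})")
        return False
    return True


if __name__ == "__main__":
    book = [Trade("T1", 1_000_000, 12_500.0), Trade("T2", 500_000, -3_200.0)]
    # A spreadsheet that silently dropped trade T2 would report 12,500.00 here,
    # and the independent recomputation catches it.
    verify_reported_exposure(book, reported_exposure=12_500.0)
```

The same pattern applies whether the reported figure comes from a spreadsheet or an enterprise system: the control's value lies in the recomputation being independent of the process that produced the number.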

Data centralization and straight-through processing with software architecture such as CubeLogic’s can greatly mitigate data-quality and reporting risks, particularly errors related to bad data input, drag-and-drop and cut-and-paste mistakes, and algorithm breaks. CubeLogic’s software also delivers greater efficiency through its data processing and data mining features. However, data-quality issues will still exist without appropriate commodity, transaction, and risk lifecycle processes, independent checks and balances, and thoughtful forensic oversight. Bad data in. Bad data out.
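
The "bad data in, bad data out" point can be illustrated with a small sketch of validation at the point of ingestion. This is a generic illustration, not CubeLogic's actual API, and the row fields and accepted currencies are assumptions: malformed records are quarantined with a reason before they can propagate downstream.

```python
# Generic sketch of ingestion-time validation; the field names and
# accepted currency list are hypothetical, not a real system's rules.
def validate_row(row: dict) -> list[str]:
    """Return a list of validation errors for one inbound record."""
    errors = []
    if not row.get("counterparty"):
        errors.append("missing counterparty")
    notional = row.get("notional")
    if notional is None:
        errors.append("missing notional")
    else:
        try:
            if float(notional) <= 0:
                errors.append("notional must be positive")
        except (TypeError, ValueError):
            errors.append("notional is not numeric")
    if row.get("currency") not in {"USD", "EUR", "GBP"}:
        errors.append(f"unknown currency: {row.get('currency')}")
    return errors


def ingest(rows):
    """Split inbound rows into accepted records and quarantined (row, errors) pairs."""
    accepted, quarantined = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            quarantined.append((row, errs))
        else:
            accepted.append(row)
    return accepted, quarantined


if __name__ == "__main__":
    rows = [
        {"counterparty": "ACME", "notional": "1000000", "currency": "USD"},
        {"counterparty": "", "notional": "-5", "currency": "XYZ"},
    ]
    good, bad = ingest(rows)
    # good -> the first row; bad -> the second row with three recorded errors
```

Quarantining with recorded reasons, rather than silently dropping or silently accepting records, is what turns a validation step into an auditable control.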

With the emergence of rapid software-integration approaches such as Agile, and with cost cutting that squeezes budgets, timelines, and resources, organizations are systemically creating operational risk. Recognizing the importance of defining requirements, processes, and data flows up front will set everyone up for success, from organization to vendor, and reduce costs in the long run.
