09.09.2019

The problem with data and the decentralised commodity business

It remains a serious issue, and has been for many years. Although many have tried, very few have found a scalable, off-the-shelf solution that can manage global data across a decentralised commodity platform.

The commodity business is not uniform in process. There are countless different products, each with its own operational quirks in the supply chain. Setting out to standardise a trading, contract entry, operations, risk management, treasury and finance system for a global commodity group is a huge undertaking. Countless millions have been invested in this endeavour, with no notable success stories.

Some very well-capitalised technology companies can present a fantastic vision for this infrastructure, and they have good sales teams, but caveat emptor: once you enter into one of their development agreements, you are actually agreeing to build a platform yourself with the help of their business analysts and developers. Unless you are careful and have excellent project management and internal engagement, what you will obtain, in various guises, is a system completely different from the one showcased at the sales presentation. Invoices will pile up, budget requests will balloon, staff will become sceptical and disengaged, and the politics will get extremely sticky. Eventually somebody may be held responsible for wasting a lot of company money, and what is left in place may be less efficient than the legacy process you were attempting to replace. I am not claiming this will always be the case, but it has happened rather a lot.

So, is there a solution? 

I believe there is and it begins with being practical and keeping it simple.

Let’s first consider why commodity businesses require this sort of data consolidation in the first place. 
There was a time, perhaps as little as fifteen years ago, when having operations all around the globe, with a physical presence in the countries of product origin and destination, produced a distinct market advantage. Daily mails from each location describing what was happening locally gave the central trading desks enough information to find an edge in their positioning. These were the days of brash traders walking into the office feeling bullish and buying a ‘shed load’ on the open, and if the market didn’t respond, buying a ‘shed load’ more until it did. I was working at Cargill in the mid-90s and we had a green-screen wire system. There were days when I would come into the office to look at the soybean book, open up the wires and read comments from the Midwest like, “I am so bullish today, I have already grown horns”. It was a slam dunk: just hit the buy on the open and in came the money.

Today, this general market information is available to everybody; the advantage is gone and, with it, the trading edge it provided. More analysis is now required to understand what is really happening globally on the buy and sell side of a market, and to fully understand the open risk position a business is exposed to globally, so as to find the smartest way to hedge or leverage it. Commodity businesses are running large volumes on thin margins and need good control over staff compliance, operational excellence, quality control, working capital allocation, fx exposure and credit exposure. The profit on a bulk cargo can easily be destroyed by poor operation, quality issues or logistics. Mistakes in fx hedging can lead to big unexpected mark-downs when converted to the reporting currency, and non-compliant behaviour can lead not only to large losses (nearly all of the largest hits are due to rogue trader activity) but also to fines from regulatory bodies and long-term reputational damage.

You can implement all the brilliant analytical tools, smart data science teams, finance gurus and expert risk analysts you want, but these will not make much of a difference unless you have control of your data. Data is where the value is today, and managing it well is what will define the future leaders of the commodity industry. There will be disruption; there is a gaping hole through which data-smart, well-capitalised and ambitious businesses can thrive.

So, back to the question of data management. Suppose you already have a decentralised infrastructure and have allowed operational companies to grow up as separate business units, managing their own processes, policies, data systems, finance and risk, and today you find that transparency at group level is less than desirable, with a strong feeling that competitive advantage could be much improved by a better view of the global picture. Start here.

In each location, first map out what the data looks like, what systems are being used and how that data is stored and updated. This could be anything from a monthly consolidated spreadsheet to a disciplined, structured transactional database.

Work out what you need to know about each transaction at group level in order to centralise functions and analysis. This can be achieved by speaking with the risk, treasury, accounting, controlling and operations teams and the front office. Follow a few transactions of different kinds through from initiation to financial reporting and see which fields are needed in order to know everything about a transaction.
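Purely as an illustration, and with field names I have invented rather than a prescription, the kind of canonical record that exercise might produce could look something like this:

```python
from dataclasses import dataclass
from datetime import date

# A hypothetical canonical transaction record. The real field list comes out
# of the conversations with risk, treasury, accounting, controlling,
# operations and the front office described above.
@dataclass
class Transaction:
    trade_id: str          # unique reference assigned at contract entry
    business_unit: str     # the operational company that booked the deal
    counterparty: str      # needed for credit exposure
    commodity: str         # a leaf of the commodity hierarchy, e.g. "corn"
    buy_sell: str          # "buy" or "sell"
    quantity_mt: float     # standardised quantity in metric tonnes
    price: float           # price per tonne in the deal currency
    currency: str          # deal currency, needed for fx exposure
    incoterm: str          # delivery terms drive operations and logistics
    delivery_start: date
    delivery_end: date
```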
Once you know the fields you require, look at the hierarchies. What master groups are there, and what subgroups do they divide into? For example, under ‘commodities’ you may have ‘grains’, which then subdivides into ‘corn’, ‘wheat’, ‘sorghum’ and ‘oats’, and each of those divides further into type, location or quality. The structure will depend on how the business operates, but it must consider not only the commodity hierarchy but also how limits are to be allocated and sub-divided, and how working capital allocation is to be measured.
It is an important consideration and time should be taken to get it right, because once populated this will be the basis on which you store and reference your global data.
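As a minimal sketch, using the grains example above and limit figures invented purely for illustration, such a hierarchy might be represented as a simple tree in which each node carries its own limit or working capital allocation:

```python
# Illustrative only: names from the grains example, limit figures invented.
# Each node carries a limit (or working capital allocation) that is
# sub-divided across its children.
commodity_hierarchy = {
    "name": "commodities",
    "limit_usd": 500_000_000,
    "children": [
        {
            "name": "grains",
            "limit_usd": 200_000_000,
            "children": [
                {"name": "corn",    "limit_usd": 80_000_000, "children": []},
                {"name": "wheat",   "limit_usd": 60_000_000, "children": []},
                {"name": "sorghum", "limit_usd": 30_000_000, "children": []},
                {"name": "oats",    "limit_usd": 30_000_000, "children": []},
            ],
        },
    ],
}

def limits_consistent(node: dict) -> bool:
    """Check that no node hands out more limit to its children than it holds itself."""
    allocated = sum(child["limit_usd"] for child in node["children"])
    if allocated > node["limit_usd"]:
        return False
    return all(limits_consistent(child) for child in node["children"])

# limits_consistent(commodity_hierarchy) returns True for the figures above.
```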

You need to work out how to capture what is going on around the globe every day and fit it into what you have created at the central level. Some companies will claim that you need huge ambition and a standard global install of an all-singing, all-dancing solution, but I would warn against that unless you want an incredibly long and painful death. There are solution providers out there with excellent components to help in this process; rather than trying to offer a complete solution, they offer component parts. The best of these companies have a deep knowledge of the commodity business and have employed analysts and technical people with significant industry experience. I will gladly advise on some of these options.

The fast and effective way to tackle the problem is to send a data team, ideally people who understand or worked on the initial database structure, to each location in sequence to work out what is there, how it can be accessed in an automated fashion daily, and what transformations need to be applied so that the data can be standardised into the new central database format.
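In code terms, the end result of that exercise may be nothing more than a registry of per-location adapters and a daily run over them. Here is a rough skeleton, with every name invented for illustration rather than taken from any real product:

```python
from typing import Callable

# Rough skeleton, all names invented for illustration. Each location registers
# an extract function (however its data is actually held locally) and a
# transform function that maps its output onto the central format.
Extract = Callable[[], list[dict]]
Transform = Callable[[list[dict]], list[dict]]

ADAPTERS: dict[str, tuple[Extract, Transform]] = {}

def register(location: str, extract: Extract, transform: Transform) -> None:
    ADAPTERS[location] = (extract, transform)

def run_daily_load(load_central: Callable[[str, list[dict]], None]) -> None:
    """Run once a day: pull each location's data, standardise it, load it centrally."""
    for location, (extract, transform) in ADAPTERS.items():
        raw = extract()                # automated daily pull from the local system
        standardised = transform(raw)  # location-specific transformation layer
        load_central(location, standardised)
```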

In each location a transformation layer needs to deal with field conversion; this could be as simple as something being called ‘quantity’ in one place and ‘lots’ in another. After transformation, everything has to be equivalent in label, in SI unit and in currency denomination…
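A minimal sketch of that transformation for one hypothetical location that books size in ‘lots’ and prices in euros; the field names, lot size and fx rate are invented for the example and would come from reference data in practice:

```python
# Illustrative only: field names, lot size and fx rate are invented.
LOT_SIZE_MT = 50     # assumed tonnes per lot at this location
EUR_USD = 1.10       # illustrative rate; in practice use the day's official fix

def transform_local_row(row: dict) -> dict:
    """Map one local record onto the central labels, units and reporting currency."""
    return {
        "trade_id": row["deal_ref"],                        # relabel the field
        "quantity_mt": row["lots"] * LOT_SIZE_MT,           # lots -> metric tonnes
        "price_usd": round(row["price_eur"] * EUR_USD, 2),  # EUR -> reporting currency
    }

# Example: {'deal_ref': 'HH-1043', 'lots': 4, 'price_eur': 182.5}
# becomes  {'trade_id': 'HH-1043', 'quantity_mt': 200, 'price_usd': 200.75}
```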

To be continued in part 2.
