The hidden revolution in real-world monitoring

Corporate statement: Thomson Reuters

Leigh Henson (left) and Stefan Reichenbach, Thomson Reuters

Harvesting, consolidating and transforming data into actionable insight has become Thomson Reuters’ mission, as discussed in this sponsored feature by Leigh Henson, global head of energy, and Stefan Reichenbach, global head of commodities research and forecasts.

We have more data and greater transparency about the physical assets that are the foundation of our economies – such as planes, ships, mines, refineries and power plants – than ever before. The trend is irreversible and continuous, and the challenge is to harvest, consolidate and make sense of it all. At Thomson Reuters, the proud winner of the Energy Risk Data Provider of the Year 2014 award, we make it our mission to help commodities market participants overcome that challenge.

When Malaysia Airlines flight MH370 disappeared from radar screens earlier this year, it shone a rare spotlight on the rapidly developing capabilities in remote monitoring of physical assets. It now appears that many of the questions over what exactly happened to flight MH370 will remain unanswered, but it seems reasonable to speculate that no one ever intended to set the plane on a southern course just to crash into the ocean six hours later. If somebody did intentionally turn off the plane’s transponder before changing course, though, they clearly did not know about the plane’s continuing communication with the Inmarsat satellite, or about the military radar that would keep tracking it. The international search-and-rescue mission that followed led to an awkward dance of governments seeking to make progress on the search without giving away too much about the intelligence they routinely gather on each other’s airspace, or their ability to process and analyse that information. Valuable data will always be guarded, but attempts to lock it away – while possibly delaying the inevitable – will ultimately prove futile.

The drivers changing the shape of the information industry are primarily a combination of technology and regulation. We are all beneficiaries of Moore’s law and deeply conscious of the plethora of cheap computers, televisions and gadgets this has enabled. We are maybe less conscious of the revolutions cheap computing power is unleashing across a whole range of industries.

Companies working in information services can now algorithmically detect changes in production patterns using data from electronic sensing technologies: infrared cameras detect heat and allow temperatures to be calculated, for instance to observe oil refinery operations; sensors monitor magnetic and electrical fields to quantify electricity generation, transmission and consumption; ultrasonic meters detect transit times or frequency shifts to calculate flows in pipelines; and flying drones over oil storage facilities or water reservoirs to gauge levels is no longer the stuff of science fiction.
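To illustrate one of these techniques: a transit-time ultrasonic meter fires sound pulses diagonally across a pipe in both directions; pulses travelling against the flow arrive slightly later than those travelling with it, and that time difference yields the flow velocity. The sketch below uses the standard transit-time formula with illustrative geometry values – the path length, angle and pipe diameter are hypothetical, not taken from any particular meter.

```python
import math

def transit_time_velocity(t_up, t_down, path_length, angle_deg):
    """Mean axial flow velocity (m/s) from ultrasonic transit times.

    t_up / t_down: pulse travel times (s) against / with the flow.
    path_length: acoustic path length between transducers (m).
    angle_deg: angle between the acoustic path and the pipe axis.
    """
    # Upstream pulses are slowed and downstream pulses sped up by the
    # flow; the time difference is proportional to flow velocity.
    dt = t_up - t_down
    return (path_length * dt) / (
        2 * math.cos(math.radians(angle_deg)) * t_up * t_down
    )

def volumetric_flow(velocity, pipe_diameter):
    """Volumetric flow rate (m^3/s), assuming a full circular pipe."""
    area = math.pi * (pipe_diameter / 2) ** 2
    return velocity * area

# Illustrative values: a 1-microsecond transit-time difference over a
# 0.3 m path at 45 degrees implies a velocity of roughly 5.3 m/s.
v = transit_time_velocity(200.5e-6, 199.5e-6, 0.3, 45.0)
q = volumetric_flow(v, 0.5)
```

In practice, real meters average many pulse pairs and correct for the flow profile across the pipe, but the core calculation is this simple.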

Advances in technology are not only giving rise to new sources and increasing volumes of data, they are also enabling companies to process and analyse data more quickly and in ways that were not possible before. Organisations large or small can connect to an abundance of cloud solutions, where providers are offering dynamically scaled computing resources without a need for costly upfront investments. Larger corporations are building their own private clouds, and have the ability to bridge to public cloud offerings, enabling peak-hour computing without peak-hour investment. The open-source software movement has enabled participants without the research and development powers of Google, Facebook or Microsoft to build on software available to the public, fuelled by enthusiastic amateurs as well as big corporations.

The open-source feedback loop also helps commercial software vendors improve their tools, and vendors are increasingly opening up their software to utilise crowdsourcing. The last decade has seen tools such as Hadoop and MongoDB greatly simplify the collection and processing of very large, distributed data sets, whether in batch or in real time. Modelling is moving away from end-of-day batch operations towards (near) real-time processing. The move to big data demands improved modelling, as well as visualisation tools that give the end-user analysing the data easier ways to gain insight.
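The batch-processing model popularised by Hadoop boils down to a map phase that assigns each record to a grouping key and a reduce phase that summarises each group. The plain-Python sketch below mimics that pattern on a handful of made-up power-plant readings – the asset names, timestamps and megawatt figures are invented for illustration, and a real cluster would run the two phases in parallel across many machines.

```python
from collections import defaultdict
from datetime import datetime

# Simulated sensor readings: (asset_id, ISO timestamp, megawatts).
readings = [
    ("plant_a", "2014-04-01T09:05", 410.0),
    ("plant_a", "2014-04-01T09:35", 425.0),
    ("plant_b", "2014-04-01T09:10", 120.0),
    ("plant_a", "2014-04-01T10:02", 430.0),
]

def map_phase(record):
    """Emit a (key, value) pair, grouping readings by asset and hour."""
    asset, ts, mw = record
    hour = datetime.strptime(ts, "%Y-%m-%dT%H:%M").strftime("%Y-%m-%dT%H:00")
    return (asset, hour), mw

def reduce_phase(pairs):
    """Average the values collected under each key."""
    groups = defaultdict(list)
    for key, mw in pairs:
        groups[key].append(mw)
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

hourly_avg = reduce_phase(map_phase(r) for r in readings)
# hourly_avg maps ("plant_a", "2014-04-01T09:00") to 417.5, and so on.
```

Moving from batch to near real time mostly means running the same reduction incrementally as records arrive, rather than once at end of day.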

Regulation is the second major driver of increased information transparency, whether by direct intention or as a by-product. Tracking vessels as they cross the oceans carrying more than 90% of the world’s trade is already commonplace, but this capability only arose because the International Maritime Organization, under the Safety of Life at Sea convention, required ships to carry automatic identification systems (AISs) capable of providing information about the ship, its course and its speed to other ships and to coastal authorities, with the primary goal of reducing collisions.

Over the last decade, significant networks of shore-based and satellite AIS receivers have emerged and, by leveraging AIS signals together with sophisticated processing algorithms, these have yielded an unprecedented global view of vessel movements. In a similar manner, US trade legislation was drafted initially to identify trustworthy global supply partners and was extended to address homeland security through advanced electronic registration of cargo manifests. When coupled with the Freedom of Information Act, this results in very granular US cargo data being made available to the public for analysis.

As authorities become increasingly concerned about the potential for those with insight into the physical demand and supply of commodities to control markets, we see examples of more direct consequences of regulation. In Europe, the prime objective of the Regulation on Wholesale Energy Market Integrity and Transparency (REMIT) is to prohibit insider trading and market manipulation. Faced with a monitoring framework for market integrity to detect and deter abuse, many utilities have chosen to publicly release real-time operational data on their power plants and natural gas transfer points in order to comply and continue trading – creating data sets that simply were not available before.

For Thomson Reuters, these technological and regulatory trends radically change what we are able to provide to our customers in commodities. For example, predictions about how much crude oil China is importing ahead of the release of official customs figures have traditionally been no more than ‘fingers in the air’. That is no longer the case. The Thomson Reuters Commodities Research and Forecasts team has been predicting Chinese crude oil imports with more than 90% accuracy for six months running, by monitoring the flow of crude oil across the world. In April, when market consensus suggested falling imports due to a weakening Chinese economy, our analysts pinpointed a rise in exports destined for China and linked it to the stockpiling at new strategic commercial storage facilities in Tianjin.

By collecting proprietary data, aggregating data from the latest sources, and cleaning and linking it all together, Thomson Reuters is able to provide insight into the impact of events hitting the supply chain. When six vital Houston oil tanker routes were shut following a collision in late March, Thomson Reuters was able to quickly list the 75 vessels caught up in the closure. Equally, when an 8.2-magnitude earthquake and a series of strong aftershocks struck off the Chilean coast in early April, Thomson Reuters analysts were able to monitor ports, vessels and mines, and quickly determine that any impact on copper supplies from Chile would be negligible.

The publication of supply-side fundamental data on European power and gas assets under REMIT is enabling Thomson Reuters to build applications like Power Curve, which predicts fair values for future power contracts in real time, based on the marginal cost of production by region and plant against anticipated demand. Bringing this all together with market prices and news in one intuitive user interface enables market participants to drill down and understand which asset is setting the price hour by hour across the full life of a contract.
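The idea behind this kind of marginal-cost pricing is the classic merit order: capacity is dispatched from cheapest to most expensive, and the last plant needed to cover demand sets the price. The sketch below is a minimal illustration of that principle, not Power Curve itself – the fleet, capacities and costs are entirely hypothetical.

```python
def marginal_price(plants, demand_mw):
    """Price set by the last plant dispatched to meet demand.

    plants: list of (name, capacity_mw, marginal_cost) tuples.
    Returns (clearing_price, marginal_plant_name), or None if demand
    exceeds total installed capacity.
    """
    remaining = demand_mw
    # Dispatch cheapest capacity first - the "merit order".
    for name, capacity, cost in sorted(plants, key=lambda p: p[2]):
        remaining -= capacity
        if remaining <= 0:
            return cost, name
    return None

# Hypothetical fleet: (name, capacity in MW, marginal cost in EUR/MWh).
fleet = [
    ("wind", 800, 0.0),
    ("nuclear", 1200, 10.0),
    ("coal", 1000, 35.0),
    ("gas_ccgt", 900, 55.0),
]

# At 2,500 MW of demand, wind and nuclear are fully dispatched and
# coal becomes the price-setting asset at 35 EUR/MWh.
price, setter = marginal_price(fleet, 2500)
```

Re-running this calculation hour by hour, against forecast demand and the operational data published under REMIT, shows which asset sets the price at each point on the curve.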

The combination of AIS vessel-movement information, sophisticated analytical tools and Thomson Reuters analysts is giving market participants transparency on the physical supply chain, where previously they had to rely on market intelligence from their brokers. Global fleet movements by type, region and future direction can be generated in near real time to help customers position their fleets or reduce chartering costs. Likewise, voyage origins, destinations and estimated times of arrival can be aggregated to create supply and demand forecasts. Together with traffic information at local congestion points, this can be used to monitor what else is going on in a particular market that could affect prices further along the supply chain.
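An AIS position report carries latitude, longitude and speed over ground, which is enough for a first-cut ETA: great-circle distance to the destination divided by current speed. The sketch below uses the standard haversine formula; the coordinates and speed are illustrative, and a production system would correct for actual shipping lanes, weather and port congestion rather than assume a straight-line track.

```python
import math

EARTH_RADIUS_NM = 3440.065  # mean Earth radius in nautical miles

def haversine_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles between two positions."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_NM * math.asin(math.sqrt(a))

def eta_hours(lat, lon, dest_lat, dest_lon, speed_knots):
    """Naive great-circle ETA in hours from an AIS position report.

    A vessel reporting zero speed gives an infinite ETA rather than
    a division error.
    """
    if speed_knots <= 0:
        return float("inf")
    return haversine_nm(lat, lon, dest_lat, dest_lon) / speed_knots

# One degree of longitude at the equator is about 60 nautical miles,
# so a tanker making 15 knots is roughly four hours out.
hours = eta_hours(0.0, 0.0, 0.0, 1.0, 15.0)
```

Aggregating such ETAs across every laden vessel bound for a port or region is what turns individual position reports into a supply forecast.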

The emergence of new data sources, particularly those that represent real-world production levels or supply disruption events, creates additional challenges for market participants, who often have to spend time assembling several sources into an overall picture in an environment suitable for analysis and interpretation. At Thomson Reuters, we aspire to do as much of that heavy lifting as we can, and to provide innovative and intuitive ways to access the resulting view. The Eikon Interactive Map object provides a canvas on which representations of real-world physical items and events can be assembled in layers to build an aggregate picture of how they interrelate geospatially. As part of a long-term programme to meet critical needs, more recent enhancements have focused on giving market participants tools to analyse, drill into or aggregate data sets to support trading decisions; to set alerts on news and events affecting individual assets; and to add and display their own proprietary data – all providing an unrivalled ability to track individual components in the supply chain, with the analytics to estimate the likely aggregate impact on price trends.

There is more transparency about the physical demand and supply for commodities and their flow across the world than ever before. The data volumes and the diversity of sources are huge. At Thomson Reuters we put it all together and draw out the anomalies that we think you ought to be aware of.
