Instant economics & the perils and pitfalls of the real-time revolution

The real-time revolution in economics promises a new era in finance and economic planning. Evidence-based policymaking paired with instant feedback may change our lives. It sounds as if the dream of the central planning bureaus of the Socialist bloc is coming true.

The Economist devoted a briefing to “third-wave economics”. According to the newspaper, Raj Chetty is the superstar of the new movement. Prof. Chetty’s work is awesome, but to me it is not really real-time analysis; rather, it is a super exciting way to use machine learning and big-data techniques to study society. Interesting as that is on its own, the promise of using real-time data analysis to predict the (near) future and act upon that knowledge is not new. A decade ago, nowcasting was a thing. Think of Google Flu Trends. At first, everyone was excited about the new technology: it would be great if we could use search trends or other proxies to predict epidemics, economic crises, and so on. As it turned out, Google Flu Trends suffered from serious methodological flaws. Originally, nowcasting was a technical term in meteorology. As Wikipedia defines it:

Nowcasting is weather forecasting on a very short term mesoscale period of up to 2 hours according to the World Meteorological Organization and up to six hours according to other authors in the field.

Economics borrowed the term from meteorology.

Nowcasting in economics is the prediction of the present, the very near future, and the very recent past state of an economic indicator. The term is a contraction of “now” and “forecasting” and originates in meteorology. It has recently become popular in economics because the typical measures used to assess the state of an economy (e.g., gross domestic product (GDP)) are only determined after a long delay and are subject to revision. Nowcasting models have been applied most notably in central banks, which use the estimates to monitor the state of the economy in real time as a proxy for official measures.
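The idea above can be illustrated with a toy bridge model: official GDP growth arrives with a delay, so we regress past growth figures on a timely monthly proxy and use the fitted line to “nowcast” the current quarter. All numbers here are synthetic and for illustration only; real nowcasting models are far richer.

```python
# A minimal sketch of the nowcasting idea, with entirely synthetic data:
# map a timely activity proxy to quarterly GDP growth via simple OLS.
import numpy as np

rng = np.random.default_rng(0)

# Historical quarters: a proxy we observe immediately (e.g. an activity
# index) and the official growth figures that were published months later.
proxy = rng.normal(loc=100.0, scale=5.0, size=40)
gdp_growth = 0.05 * (proxy - 100.0) + rng.normal(0.0, 0.1, size=40)

# Fit the bridge model on history...
slope, intercept = np.polyfit(proxy, gdp_growth, deg=1)

# ...and nowcast the current quarter from the proxy reading we already have.
current_proxy = 104.0
nowcast = slope * current_proxy + intercept
print(f"nowcast of current-quarter GDP growth: {nowcast:.2f}%")
```

The fitted slope recovers the (synthetic) relationship between the proxy and growth, which is exactly what lets the model substitute a timely signal for the delayed official figure.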

We are collecting data on almost everything. Thousands of microsatellites orbit above us, monitoring almost every square centimetre of the Earth. Our roads are being augmented with sensors and watched by surveillance cameras. Stores are full of items with RFID chips. And everyone carries a mobile phone in their pocket. We are living in the age of data streams.

The building of the Hungarian Planning Office (or Országos Tervhivatal). Source: Wikipedia

One might think that we are living the dream of the socialist planning bureaus: we can measure everything, and we can make fancy visualizations of the data. We can plan, predict, and live our dreams.

Scientific management, aka Taylorism, will boost productivity too. Comrade Stakhanov, we follow you!

Surely, nowcasting can be beneficial to society. But it can also go wrong, as Flu Trends did. If we are to use nowcasting for intervention, we have to be aware of the limitations of machine learning. Machine learning assumes that we can predict the (near) future from historical data. As practicing machine learners know, models can drift for various reasons: the external world can change, we may apply our model in a different context, or our training sample may be biased in some way. History warns us that even Nobel laureates and top-notch professors can choose samples badly and go bankrupt.

Basically, there are two types of drift. Data drift is when the statistical properties of the input data (the predictors) change, e.g. due to seasonality. Concept drift is when the relationship between the predictors and the target variable changes. If you put intervention into the mix, it can change both the data and the target.
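Data drift of the kind described above can at least be detected. A minimal sketch, assuming we compare the distribution of one feature at training time against its live distribution with a two-sample Kolmogorov–Smirnov test (the feature and threshold here are hypothetical):

```python
# Detecting data drift: has the live feature distribution moved away from
# the one the model was trained on?
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)

# Training-time distribution of one (synthetic) feature.
train_feature = rng.normal(loc=0.0, scale=1.0, size=5000)

# Live data whose mean has shifted -- e.g. a seasonal change.
live_feature = rng.normal(loc=0.8, scale=1.0, size=5000)

def detect_data_drift(reference, current, alpha=0.01):
    """Flag drift when the two samples are unlikely to share a distribution."""
    result = ks_2samp(reference, current)
    return result.pvalue < alpha, result.statistic

drifted, stat = detect_data_drift(train_feature, live_feature)
print(f"drift detected: {drifted} (KS statistic = {stat:.3f})")
```

Concept drift is harder: the inputs can look unchanged while their relationship to the target shifts, so it only shows up once delayed ground-truth labels arrive, which is exactly the delay nowcasting tries to route around.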

A graphic by Gerd Arntz.

If we were to intervene, we would have to take the effects of our own intervention into account in our model; that is, we would have to build a reflexive model. But can we build such models? Historia est magistra vitae (“history is life’s teacher”), but history, at least according to Karl Popper, is not written in a machine-readable format, and human history is a chain of unique events which are not akin to events in the physical world.

Popper defines historicism as: “an approach to the social sciences which assumes that historical prediction is their principal aim…”. He also remarks that “[t]he belief … that it is the task of the social sciences to lay bare the law of evolution of society in order to foretell its future… might be described as the central historicist doctrine.”


High-frequency trading and other forms of algorithmic trading were celebrated at first. Today their effects look questionable at best, and lawmakers and researchers are trying to understand them and to provide a proper regulatory framework for them. One can argue that state intervention based on real-time data is different from the speculative actions of capital firms. Is it better if a civil servant intervenes for the greater good rather than for the greater profit? When should a government intervene? Who determines the desired outcomes? How can we implement algorithmic procedures in our daily life without eroding our democratic values? Is the era of libertarian paternalism coming, or is the surveillance state occupying yet another terrain of our lives?

