TCSA explores the concept of a ‘National Data Brain’ – a solution that can pioneer data-driven economic governance in today’s digital era, in which data is central
Data – the ‘new oil’
Just as the industrial revolution fundamentally transformed human productivity, data is now reshaping our economy in the digital era. Many, including the European Parliament and the Forbes Technology Council, have argued that data is the driving force, or the ‘new oil’, of our times. But as the world embarks on the digital revolution, we have not yet tapped into data’s true potential.
This article proposes a solution that can unleash the full potential of data: a National Data Brain, which empowers precise economic governance with data-driven policy-making and a hard value anchor for currencies.
Private sector limitations
When considering innovation, the private sector usually comes to mind – specifically tech giants such as Google, Amazon and Alibaba. However, there are some obvious limitations to a primarily private sector-dominated digital world:
- The current plateau of data-driven development is largely a result of the profit-oriented nature of tech giants.
- Competition to capture consumer data has led to data fragmentation, which has resulted in the formation of data silos.
- Tech giants tend to monopolise data, which has led to the crowding out of small and medium-sized internet companies. Publicly owned data infrastructure could fundamentally reinforce data equality.
- Private data has never been truly owned by citizens themselves but has instead been controlled primarily by tech giants. And it is questionable whether market-oriented business behaviour can ensure data protection.
Public sector struggles
A centralised, democratised and publicly owned data powerhouse is the best way to return data ownership to citizens. But what is the status quo of data management in the public sector?
Information silos among government entities have led to severe fragmentation, time lags and incongruent data formats, where data in one department cannot talk to data in another. In other words, each player only gets one piece of the puzzle. To integrate data across these silos – or simply borrow data from other players for single usage – would be a massive undertaking. In this regard, the public sector is already far behind in shouldering the vital mission of the times.
Lack of a data-driven view or value anchor
Now let us dive deeper into central banks’ predicaments against this backdrop of tension between public and private sectors in the data revolution.
The global economy has been stuck in a troubling pattern for the past few decades. Since the financial crisis that began in 2007–08, there has been a long-term decrease in the efficacy of central banks’ monetary policies. This systemic fragility has been exacerbated by the Covid‑19 pandemic, with clear signs such as negative interest rates, excessive quantitative easing and expanding balance sheets. Current monetary policy tools are starting to reach their limits.
The root cause is the lack of a data-driven overview of the economy and the absence of a hard value anchor. Rather than standing on firm ground, global currencies now stand on a bubble that is inherently unstable. The essential component of that equation is the value anchor of currencies, which used to be pegged to precious metals under the gold standard. This was replaced by the dollar standard following the collapse of Bretton Woods in 1971, under which currencies are backed solely by the intangible and unquantifiable credibility of individual countries. As a result, the price of our currencies deviates further and further from a true value anchor. This has opened the floodgates to overreliance on loose monetary policies and has significantly increased systemic credit risks and debt levels.
However, the digital revolution has unveiled a broader spectrum of what is possible and endowed central banks with the option of a data-empowered value standard, where currencies can be pegged to a country’s actual economic strengths derived from real-time data.
Shared challenges and current solutions
But do we have the data to realise this? Central banks and government institutions now face shared challenges with regard to data management, such as limited data sources, varied formats and granularity (see tables A, B and C).
Novel approaches have indeed been deployed to address these issues. They include introducing additional data sources (high-frequency data, web-scraped social media data, anecdotal data purchased for textual and sentiment analyses, and so on); improving transmission mechanisms through new regulatory technology solutions that simplify reporting procedures; and building unified data lakes that bridge data silos by extracting data from various sources into one location.
But there is an alternative approach that has not yet been explored: a genuinely transformative method that reconstructs the underlying data logic and resolves these challenges once and for all. A National Data Brain would enhance data management capabilities while laying the solid, trustworthy groundwork for a data standard to measure, depict and restore the actual value of a currency.
National Data Brain
The National Data Brain requires a comprehensive upgrade to a nation’s cloud data infrastructure. It is composed of the three essential elements for effective data collection, management and application, respectively:
- A data collection network
- A central processing unit (CPU)
- An open-access data platform.
It is worth noting that the construction of a National Data Brain has to start from – and must be based on – existing central bank data networks, because central banks already have direct and unfettered access to the most logical and ubiquitous source of information on human economic activity: monetary transactional data. When aggregated, this data provides a real-time snapshot of national economic dynamics. With monetary transactional data at their doorstep, central banks are already halfway to solving the issue of inadequate data sources.
1. Data collection network
The question now is how central banks should collect data. With a simple algorithmic add-on to all existing payment terminals, monetary transactional data can be captured in a single standardised data package that encompasses all dimensions of economic activity. Data would no longer be scattered into different silos, removing the risk of fragmentation.
For example, today, when a bottle of water is purchased, two pieces of data are generated. The commodity data – one bottle of water – is usually collected into an inventory system, then accounted for by a statistics bureau a few months later. The monetary data – in this case, £1 – is transmitted to settlement or commercial banks and eventually reported to the central bank. Thus, these two pieces of data are separated at the source and will remain in isolated silos. Integrating them requires tremendous investment in time and resources for data scraping, cleansing and mining.
What if these two data points were synchronised at the very beginning of the collection process? This is the most efficient method for the synchronisation and automated transmission of economic data on a real-time basis and at a large scale, while operating at 1% of the cost of traditional data infrastructure.
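As an illustrative sketch only, the synchronised capture described above could be modelled as a single standardised record emitted by the payment terminal, pairing the commodity data and the monetary data at the source. The record fields, names and the `capture` helper below are hypothetical, not a specification from the article:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass(frozen=True)
class TransactionRecord:
    """Hypothetical standardised package that pairs commodity and monetary
    data at the point of collection, so the two are never split into silos."""
    timestamp: str   # ISO-8601 capture time
    commodity: str   # what was bought, e.g. "bottled water"
    quantity: float  # units of the commodity
    amount: float    # monetary value of the transaction
    currency: str    # ISO-4217 currency code
    region: str      # reporting region, kept for later aggregation

def capture(commodity: str, quantity: float, amount: float,
            currency: str, region: str) -> TransactionRecord:
    """Emit one synchronised record instead of separate inventory
    and settlement messages."""
    return TransactionRecord(
        timestamp=datetime.now(timezone.utc).isoformat(),
        commodity=commodity, quantity=quantity,
        amount=amount, currency=currency, region=region,
    )

# The bottled-water example from the text, as one record rather than two.
record = capture("bottled water", 1, 1.00, "GBP", "London")
print(asdict(record))
```

Because the commodity and monetary dimensions travel in the same record from the outset, no downstream scraping, cleansing or reconciliation step is needed to reunite them.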
2. The CPU
Once such a massive national dataset is formed, how should central banks aggregate and manage it? The National Data Brain solution achieves a CPU-like data governance framework through a simple but powerful concept: data standardisation on a national level.
As previously discussed, at the point of collection all monetary transactional data will be organised into a standardised meta-structure that recategorises all data collected under six major socioeconomic dimensions. These metadata packages centred on individual citizens are called ‘algorithm units’.
To prevent data from being scattered across different databases, the algorithm unit unifies different dimensions of monetary transactional data – for example, commodity and monetary data – from the point of collection. Instead of extracting data at huge cost from various silos into a single centralised data lake after those silos have formed, the National Data Brain offers an alternative path that reconstructs the current data architecture from the ground up: algorithm units integrate data in myriad forms into one standardised package.
Furthermore, with a homogeneous format, these prepackaged algorithm units are ready to be accessed and computed by various agencies for a variety of uses. They act as the basic building blocks for all data architecture, laying the foundation for an extremely cost-effective computing mechanism for data on the national scale.
Collected data is automatically organised into an algorithm unit with six dimensions, represented by a cube in figure 1. These units can be stacked on top of each other from micro to macro, and offer a penetrating view from macro down to micro. They can be added, subtracted and calculated quickly and cheaply, allowing policy-makers to see at a glance how much water is consumed in each province, how many roads are needed in each city, or how much of which resource is consumed for what purpose.
3. Open-access data platform
Once all data is effectively centralised and governed, what can a country do to leverage such a powerful resource? The open-access platform, constructed upon a central bank’s cloud architecture, is designed to allow for the versatile and multidimensional sharing of data. It not only supports transparent policy-making for the public sector, but also makes data easily accessible for the private sector to spur further innovations. Such democratisation of data for all citizens will rejuvenate the digital economy in a regulated way.
For the public sector, granular data captured through the lens of monetary transactions will project a full picture of the economy – supply and demand, and the inputs and outputs of each industry – for precise decision-making. It enables the platform to monitor accounting under the System of National Accounts, the Government Finance Statistics Manual, and the Monetary and Financial Statistics Manual and Compilation Guide.1,2,3 Furthermore, this highly compatible data can empower machine learning models and numerous other toolkits, such as artificial intelligence and natural language processing, for customised in-depth analyses.
For the private sector, a unified national data engine will break the current data monopoly, establishing an efficient data infrastructure for all alongside a systematic set of regulations. Such a data powerhouse will foster a robust data industry.
An envisioned future
The National Data Brain provides a clear view of the macroeconomy, a hard value anchor for currency and a catalyst for sustainable growth.
Despite the world becoming increasingly digital, policy-makers still struggle to gain a clear understanding of economic performance in real time. Relying on traditional data techniques, policy-makers evaluate the economy within a ‘black box’ of lagging data, leaving decision-making in a semi-blind state. Imagine the possibilities of a bird’s-eye view of the entire economic landscape – a data-driven crystal ball for economic nowcasting. The National Data Brain would remove the guesswork from monetary policy-making and enable central banks to make precise adjustments based on accurate information.
Furthermore, centralisation of national economic data will become a hard value anchor for national currencies by realigning currency prices with real economic values. Currencies can be pegged to each country’s true economic strength – as quantified by real-time data. Domestically, inflation could be better managed; internationally, credibility and the status of a national currency can be improved for global trade. With accurate monetary and macroprudential policies, central banks – the stewards of currency – can become leaders of a data-empowered future for economic, monetary and financial stability.
Finally, data will be transformed into a public good – like electricity or water. This will reduce repetitive investment in database construction, creating a catalyst for accelerated digital development. With a data powerhouse and a digitalised economic governance model, each country could expect to achieve an additional average annual GDP growth rate of 0.5–1.5%. Moreover, data resources and services could be transformed into a steady source of fiscal revenue in the form of a data tax.
This is a high-level overview of what is possible in this digital epoch. It is an irreversible trend and a golden opportunity for central banks and governments worldwide to embark on this new and exciting journey.
1. UN Statistics Division (2021), System of National Accounts
2. International Monetary Fund (May 2019), Government Finance Statistics Manuals and Guides
3. International Monetary Fund (2016), Monetary and Financial Statistics Manual and Compilation Guide
Copyright Infopro Digital Limited. All rights reserved.
You may share this content using our article tools. Printing this content is for the sole use of the Authorised User (named subscriber), as outlined in our terms and conditions - https://www.infopro-insight.com/terms-conditions/insight-subscriptions/
If you would like to purchase additional rights please email email@example.com