In a study published in the journal Humanities and Social Sciences Communications, researchers propose a new theoretical framework for understanding and regulating the data economy. Their "general theory of data, artificial intelligence, and governance" aims to capture the essential dynamics of data capitalism and derive implications for digital governance and data policy.
Current economic models based on prices and quantities fail to fully capture the workings of the data-intensive economy, where data and AI play a fundamental role. The researchers highlight two key issues. First, the traditional macroeconomic equilibrium equation does not account for the data and knowledge dimensions of the economy: traditional models focus on monetary flows, while data introduces new complexities around knowledge and power.
Second, current governance tackles the monetary imbalances caused by digital giants through fines and taxes, but it does not address societal challenges such as power concentration, opacity, and unfair value capture. Regulatory approaches remain confined to the monetary realm and cannot reshape data flows in the public interest. A new theoretical lens is therefore required to rethink digital governance in light of data capitalism's transformational forces, and the study aims to provide such a perspective by modeling the data economy and its regulation.
The Data Economy
To address this need for a new theory, the study puts forward a theoretical model of a hypothetical data-intensive economy. The framework is based on seven key assumptions:
- Daily activities of households and firms generate big data flowing to data holders. Individuals and businesses produce extensive data trails in their routine digital interactions.
- Data holders use AI to extract knowledge from data to produce digital services. Platforms and tech firms analyze aggregated data to generate insights and offerings.
- Data serves as a means of payment, challenging traditional price-quantity thinking. Barter-like exchanges of data for services complicate monetary transaction models.
- Consumers maximize utility while assuming their own data has near-zero value. Individuals focus on the services they obtain rather than the potential costs of providing their data.
- Data holders treat data as a valuable asset to maximize profits. Companies recognize and leverage the wealth in their accumulated data.
- Data markets exhibit direct and indirect network effects. The more users a service attracts, the more data it gathers and the more valuable that data becomes, in a self-reinforcing loop.
- Knowledge production from big data shows increasing returns to scale. Combining more data sources enables better insights without diminishing returns.
The researchers represent this economy through a "semi-circular flow" diagram. It augments the standard circular flow model by adding big data, AI, and knowledge flows from households/firms to data holders.
Unlike money, data flows are semi-circular: they run only toward data holders, never back. The model conceptualizes the digital economy's dynamics while identifying sources of disequilibrium and market failures, providing a theoretical foundation for rethinking governance.
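To make the semi-circular flow concrete, here is a minimal sketch assuming a stylized version of the model: money flows are left out, data accumulates one way with data holders, and knowledge is produced from the data stock with increasing returns. The function names, exponent, and quantities are illustrative assumptions, not values from the study.

```python
# Stylized sketch of the semi-circular flow: data moves only from households
# and firms to data holders, and knowledge is produced from the accumulated
# data stock with increasing returns to scale (exponent alpha > 1).
# All functional forms and numbers are illustrative assumptions.

def knowledge_from_data(data_stock: float, alpha: float = 1.3) -> float:
    """Hypothetical knowledge production function with increasing returns."""
    return data_stock ** alpha

def simulate(periods: int = 5, data_per_period: float = 10.0) -> None:
    data_held = 0.0      # cumulative data accumulated by data holders
    data_returned = 0.0  # data flowing back to households/firms: none, by assumption
    for t in range(1, periods + 1):
        data_held += data_per_period  # one-way (semi-circular) data flow
        knowledge = knowledge_from_data(data_held)
        print(f"period {t}: data held = {data_held:5.1f}, "
              f"knowledge = {knowledge:6.1f}, data returned = {data_returned:.1f}")

if __name__ == "__main__":
    simulate()
```

Because the assumed exponent exceeds one, each additional tranche of data raises knowledge by more than the last, mirroring the increasing-returns assumption in the list above.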
Data Sharing and More
In this economy, knowledge production tends toward the monopoly level, which is below what is socially optimal. The study proposes "data sharing" as an intervention to address this.
Data sharing removes barriers to data access, creating leakages of data away from data holders that serve as injections of additional knowledge into the economy. Wider data circulation can counter the tendency toward knowledge concentration.
This knowledge can provide insights into tackling the societal challenges of data capitalism and guiding digital governance. More transparency and research can illuminate regulatory needs.
The study models data sharing analogously to monetary taxation. Just as an optimal tax rate maximizes revenue, an optimal data-sharing rate can maximize knowledge production. The relationship between data sharing and knowledge is represented through a "Data Sharing Laffer Curve," which visually captures the idea that more sharing does not increase knowledge linearly, because of behavioral responses.
Beyond an optimal point, higher sharing may discourage economic activity and reduce knowledge. However, limited sharing leaves knowledge monopolized. The curve elucidates this trade-off.
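As a rough illustration of this trade-off, the sketch below assumes a hypothetical inverted-U functional form (not the one used in the study): knowledge equals a monopoly baseline plus a sharing-driven gain that first rises and then falls as very high sharing rates discourage data-generating activity.

```python
# Hypothetical "Data Sharing Laffer Curve": knowledge as a function of the
# data-sharing rate s in [0, 1]. The functional form and numbers are
# illustrative assumptions, not taken from the study.

def knowledge(s: float, k_monopoly: float = 40.0, k_gain: float = 60.0) -> float:
    """Monopoly baseline plus a sharing-driven gain that peaks at an interior
    sharing rate and vanishes when sharing discourages data-generating activity."""
    return k_monopoly + k_gain * 4 * s * (1 - s)

# Scan sharing rates and find the knowledge-maximizing one.
rates = [i / 100 for i in range(101)]
best_rate = max(rates, key=knowledge)
print(f"no sharing:      knowledge = {knowledge(0.0):.1f}")
print(f"optimal sharing: s = {best_rate:.2f}, knowledge = {knowledge(best_rate):.1f}")
print(f"full sharing:    knowledge = {knowledge(1.0):.1f}")
```

Under these assumed numbers the knowledge-maximizing rate sits at an interior point; in practice its location would depend on the behavioral responses the study emphasizes.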
Consistency with Evidence
The researchers argue that the seven assumptions reasonably match the real world. They cite evidence on:
- One-way data flows to data holders. Individuals and firms provide data to platforms but get little reciprocal access.
- Monopoly-level tendencies in knowledge disclosure. A few powerful companies control and benefit from insights.
- Attraction of investment by data holders. Data access drives concentration and has become central to business valuations.
- Economies of scale in knowledge production. Combining more data enables better predictions and services.
- Market concentration and lack of competition. A shrinking number of firms own critical digital infrastructures.
- Creative destruction expanding data holders' reach. New data-driven services disrupt sectors like taxis, hotels, and banking.
- Price discrimination from information asymmetries. Personalized pricing and steering based on data advantages.
The researchers relate their model to literature streams on antitrust, intangible assets, big data business models, and AI governance. The theory elucidates the workings of data capitalism while supporting proposals such as data sharing on both efficiency and equity grounds.
Implementing Data Sharing
The paper concludes by discussing how data sharing can be implemented to produce socially beneficial knowledge. The modalities could involve norms, regulations, market solutions, or architectural changes. Relying solely on social norms is unlikely to spur sufficient criticism of data capitalism: as concentration increases, so do the incentives to maintain the status quo.
Laws such as the GDPR provide a foundation for individual data rights and portability, but citizens need greater technological literacy to exercise ownership and control; ignorance of data's value limits pressure for reform. Personal data stores are a potential architecture for managing information sharing, though they would require mass adoption to reconfigure data flows. Efficient data sharing would improve the situation of households and firms without discouraging investment, and enabling access can complement private-sector innovation.
Policymakers could promote "merit knowledge" that benefits society while avoiding "demerit knowledge" with negative externalities, though such judgments require democratic oversight. Central banks, competition authorities, inspectors, and researchers are proposed as potential merit users: their access to data could advance transparency, competition, forecasting, and the development of new theories without rivaling data holders' activities. Implementation, however, requires balancing many complex incentives and effects.
Future Outlook
Going forward, the proposed theoretical model can inform research and policymaking on digital governance in several ways. First, it provides an overarching framework to analyze the relationships between data, AI, knowledge, and power in the emerging data economy. The theory ties together the forces shaping the rise of data capitalism.
Second, it highlights societal challenges and dilemmas stemming from the increasing concentration of control over data and AI. By illuminating sources of imbalance, it suggests directions for governance reforms. Third, it conceptualizes data sharing as a policy lever to counter monopolistic knowledge flows and rebalance public benefits. Data access emerges as a critical variable shaping equity and innovation.
Fourth, it models trade-offs between openness and control in data systems via the Data Sharing Laffer Curve. The curve elucidates complex interactions arising from policy interventions. Finally, the theory can inform the design of regulations and institutions governing competition, transparency, and data rights. Its lens clarifies stakeholders, objectives, and constraints for governance architectures.
However, practical implementation will require a nuanced understanding of complex incentive structures, technological possibilities, and risks of over-regulation. As digitalization accelerates, the theory provides a valuable anchor for scholarship and policymaking seeking to maximize prosperity and equity. Its translation into effective governance, though, will be an evolving, open-ended process requiring continuous learning and experimentation.