The inevitability of tokenized data





We are heading toward an inevitable clash between big tech and regulators, with consumer data as the still-developing battlefield. In many ways, the fact that things have come to this point shows that the market has not yet produced an alternative to Google's and Facebook's paradigm as harvesters and sellers of data, or to Amazon's as the host that dominates today.

Tokenization and decentralization of data offer such an alternative. While the first generation of "utility" tokens was backed only by dreams, a new generation of tokens, explicitly tied to the value of data, will emerge.

The conversation around data has reached a new inflection point.

Presidential candidate Senator Elizabeth Warren has called for the breakup of the tech giants, including Amazon and Facebook. In many ways, this seems the inevitable outcome of the last few years, during which public opinion of the technology sector has shifted from extremely positive to increasingly skeptical.

Part of the growing skepticism is that when populist ideology sets in, all institutions of power come under increased scrutiny. But when you focus on the details, it is clear that the problem behind the loss of trust in technology companies lies in the data: what is collected, how it is used, and who benefits from it.

Facebook's Cambridge Analytica scandal, in which vast amounts of user data were used to help Russian political actors sow division and help elect Trump in 2016, and Facebook CEO Mark Zuckerberg's subsequent testimony before Congress, marked a turning point in this loss of trust around data.

Those who dismissed consumer outrage over the scandal by pointing out that virtually no one actually left the platform failed to recognize where the real impact was always more likely to land: in providing political cover for calls to break up the company.

Image courtesy of Bryce Durbin

Of course, not all of the 2020 Democratic presidential candidates agree with Warren's call. In response to Warren, Andrew Yang – the outsider candidate who has made waves by focusing on universal basic income and by appearing on Joe Rogan's popular podcast – wrote: "Agree on the fundamental problems with big tech. But we need to expand our toolkit. For example, we should share in the benefits of the use of our data. Better than just regulating. We need a new legal regime that does not rely on consumer prices as antitrust does."

Although one might expect Yang to be biased, given that he comes from the world of technology, he has been more eloquent and articulate about the coming threat of automation than any other candidate. His notion of a different arrangement of the data economy – between those who produce the data and the platforms that use it (and sell advertising against it) – deserves to be examined.

In fact, one could argue that this type of heavy-handed regulatory approach to data seems inevitable only because of a fundamental market failure in the way the data economy is organized.


Data, it has been said, is the new oil. In this analogy, it is the fuel on which the attention economy runs. Without data, there is no advertising; without advertising, there are none of the free services that dominate our social lives.

Of course, there is another side to the data market as well: where the data lives. Chamath Palihapitiya, an investor (and former head of growth at Facebook), has pointed out that 16 percent of the money he invests in companies goes directly into Amazon's coffers for data hosting.

This suggests that even though regulators – and, even more so, presidential candidates looking to score points with a populist base – might assume that all of tech is aligned in preserving today's status quo, there are in fact strong financial incentives pushing in another direction.

Enter "decentralization".

In his seminal essay "Why Decentralization Matters," A16Z investor Chris Dixon explained how incentives shift over the life of a network. Early on, network owners and participants share the same interest: increasing the number of nodes on the network. Inevitably, however, a threshold is reached at which growth from new participants is no longer achievable, and the network owner must instead extract more value from existing participants.

Decentralization, according to Dixon, offers an alternative. In short, tokenization would allow all users to participate in the financial upside of the network, eliminating the distinction between network owners and network users. When there is no separate ownership class, no one has the need (or the power) to extract value.

The essay was a brilliant articulation of an idealized state (reflected in its more than 50,000 claps on Medium). With the ICO boom, however, things did not go exactly as Dixon had imagined.

The problem, at a fundamental level, was what the token actually represented. In almost all cases, "utility tokens" were simply payment tokens – an alternative form of money usable only within that one service. Their value rested on speculation that they could achieve a monetary premium allowing them to transcend utility on that network alone – or that the network would grow so large that the value could be sustained over time.

It is not hard to understand why things were designed this way. For network builders, this type of payment token allowed a completely non-dilutive, global, and instantaneous form of capitalization. For retail buyers, it offered the chance to participate in venture-style investing in a way otherwise denied to them by accredited-investor rules.

At the end of the day, however, the truth was that these tokens were backed only by dreams.

When the market for these dream coins finally collapsed, many decided to throw the token baby out with the ICO bathwater.

But that raised a question: what if the tokens of decentralized networks were backed not by dreams, but by data? What if instead of dream coins, we had data coins?

Data is indeed the oil of the new economy. In the context of any given digital application, the value lies in the data: for the companies that are paid to host it; for the platforms that can sell advertising against it; and for the users who effectively trade their data for free or discounted services.

Data is, in other words, an asset. Like other assets, it can be tokenized and decentralized on a public blockchain. It is not difficult to imagine a future in which all the useful data in the world is represented by a private key. Linking tokens explicitly to data creates a world of new options for reconfiguring how applications are built.
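To make the idea concrete, here is a minimal Python sketch of what "linking a token to data" could mean: the token records a content hash of the underlying blob alongside an owner key, so anyone holding the data can verify which asset a token refers to. Everything here – the DataToken record, mint_token, verify – is a hypothetical illustration under simplified assumptions, not any real blockchain or protocol.

```python
# A toy model of a token explicitly tied to a piece of data. The token
# carries a SHA-256 content hash, so a holder of the data can always check
# that the token refers to it. Names and structure are illustrative only.

import hashlib
import secrets
from dataclasses import dataclass

@dataclass(frozen=True)
class DataToken:
    content_hash: str  # SHA-256 of the underlying data blob
    owner_key: str     # stand-in for the owner's public key

def mint_token(data: bytes, owner_key: str) -> DataToken:
    """Mint a token bound to one specific data blob."""
    return DataToken(hashlib.sha256(data).hexdigest(), owner_key)

def verify(token: DataToken, data: bytes) -> bool:
    """Check that a token really refers to this data."""
    return token.content_hash == hashlib.sha256(data).hexdigest()

# Usage: a user's browsing history becomes an asset tied to their key.
owner = secrets.token_hex(16)  # placeholder for a real keypair
history = b"user-42: pages visited, queries run, ..."
token = mint_token(history, owner)
assert verify(token, history)
```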

First, data tokenization could create an opportunity for the nodes of a decentralized hosting network – that is, a decentralized alternative to AWS – to speculate on the future value of the data in the applications they host, creating a financial incentive beyond the mere provision of services. When third parties such as Google want to crawl, query, or access the data, they would pay the data token back to the miners who secure and store it, as well as to the developers who acquire, structure, and tag the data so that it is useful to third parties, especially machine learning and AI companies. (A toy sketch of this fee flow appears after the third point below.)

Second, application developers could not only enjoy the benefits of more streamlined capitalization through tokens, but also experiment with new ways of organizing value flows, for example by letting users retain the value of their own data and profit from it.

Third, users could begin to have a tangible (and traceable) sense of the value of their data, and exert market pressure on platforms to include them in the upside, while gaining more control over where and how their data is used.
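To illustrate the fee flow described in the first point above, here is a small Python sketch that splits a third party's query fee between the nodes that store and secure the data and the developers who structured and tagged it. The 60/40 split, the function name, and the node names are all invented for illustration; a real network would fix such parameters in its protocol or governance rules.

```python
# A toy settlement function for the hosting network described above: a
# query fee is split between hosting nodes and data developers, pro rata
# within each group. The shares below are assumptions, not a real spec.

HOST_SHARE = 0.6       # to the miners/nodes that secure and store the data
DEVELOPER_SHARE = 0.4  # to the developers who structured and tagged it

def settle_query_fee(fee: float, hosts: list[str],
                     developers: list[str]) -> dict[str, float]:
    """Distribute one query fee across hosts and developers."""
    payouts: dict[str, float] = {}
    for host in hosts:
        payouts[host] = payouts.get(host, 0.0) + fee * HOST_SHARE / len(hosts)
    for dev in developers:
        payouts[dev] = payouts.get(dev, 0.0) + fee * DEVELOPER_SHARE / len(developers)
    return payouts

# A crawler pays 10 data tokens to run one query:
print(settle_query_fee(10.0, hosts=["node-a", "node-b"], developers=["dev-x"]))
# -> {'node-a': 3.0, 'node-b': 3.0, 'dev-x': 4.0}
```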

In other words, tokenized data could create a market mechanism for rebalancing power in technology networks without resorting to heavy-handed (even if well-intentioned) regulation such as GDPR or, worse, the sort of breakup proposed by Warren.

Even after the implosion of the ICO phenomenon, there are many, like Fred Wilson, who believe that the shift to user control of data, facilitated by blockchains, is not only possible but inevitable.

Historically, technology has swung from closed to open, back to closed, and back to open again. We are now in a closed phase, in which centralized applications and services own and control the vast majority of data access. Decentralized peer-to-peer databases – public blockchains – will open up and democratize data creation in a disruptive way, changing how value is captured and created on the internet.

Put simply, open, tokenized data can limit monopoly control over future innovation while ushering in a new era of computing.

This is how information can finally be set free.
