
Data Validation and the Problem of Standards: Why It Is Difficult and How to Fix It in a Decentralized Way

Fabian Riewe is the co-founder of the Web3 data lake solution KYVE.
__________

Is the distribution of data a hurdle to Web3 scalability? The overlooked benefits of decentralized data storage in creating a censorship-resistant space

Data validation has always been central to analyzing and making sense of data. From financial transactions completed online to approvals across computer programs, having correct data is essential.

Conducting transactions without the correct data leads to malfunctions and ultimately undermines the integrity of the entire project.

The additional issue for Web3 projects is that every piece of information produced in the process of executing functions not only needs to be stored, it also needs to be accurately validated across the entire distributed database, and all of the copies have to be kept in sync.

With the growth of Web3 and investors pouring USD 30 billion into cryptocurrency in 2021, the number of blockchains and ecosystems has expanded. However, these often operate as silos without interoperability.

While multi-chain issues are often discussed, the standardized validation of this massive expanse of data rarely is. The hurdle to achieving assured validation, especially in a decentralized ecosystem, needs to be cleared in order to enable interoperability at the most basic level.

Challenges with reaching consensus on distributed networks

Decentralization provides promises of expanded ownership and censorship resistance, but it also brings challenges for consensus and validation.

Without a single centralized entity to provide the final stamp of approval, mechanisms of validation rely on dispersed contributors. This leaves significant gaps for mistakes and malicious actors.

This is not to say that centralized entities are free of failure either. Decentralization offers a level of accountability not found in centralized spaces, where issues and bad practices can be hidden and obscured.

Finally, blockchains suffer from a shared identity issue that extends to validation. The split between proof-of-work (PoW) and proof-of-stake (PoS) means that how validation happens differs depending on which mechanism the blockchains in the wider ecosystem operate on.
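As a rough illustration, the core acceptance rule differs between the two: under PoW, a block is valid if its hash falls below a difficulty target, while under PoS, the right to validate is assigned in proportion to stake. The sketch below is a deliberately simplified model, not any particular chain's actual rules; pow_valid and pos_select_proposer are hypothetical names.

```python
import hashlib
import random

def pow_valid(header: bytes, nonce: int, difficulty_bits: int) -> bool:
    # PoW: a block is accepted if its hash falls below a difficulty
    # target, which miners search for by trying nonces.
    digest = hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < 2 ** (256 - difficulty_bits)

def pos_select_proposer(stakes: dict[str, int], seed: int) -> str:
    # PoS: the right to propose and validate a block is assigned
    # pseudo-randomly, weighted by each validator's stake.
    rng = random.Random(seed)
    validators, weights = zip(*stakes.items())
    return rng.choices(validators, weights=weights, k=1)[0]

# Two chains "validating" under entirely different rules:
print(pow_valid(b"block-header", nonce=42, difficulty_bits=8))
print(pos_select_proposer({"alice": 60, "bob": 30, "carol": 10}, seed=7))
```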

A framework to tame data corruption

Over USD 1 billion has been lost to crypto hacks in 2022 so far, and data validation standards are simply missing in Web3.

Standardization is often conflated with centralization, but this doesn’t have to be the case.

Rather, making data validation accessible without requiring conformity from participants is key. Without a standardized consensus on validation, blockchains will continue to run into problems.

As such, building configurable validation into networks is crucial to ensure incorrect and malicious data is identified early and easily. A system in which validators can check the chronology of a node's data and raise a dispute when they find a violation is incredibly useful, as the sketch below illustrates.
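The details vary by network, but a minimal sketch, assuming a simple hash-based scheme, can illustrate the mechanic: each validator independently recomputes a hash over a data bundle in its claimed order and raises a dispute on any mismatch. Bundle, bundle_hash, and raise_dispute are hypothetical names for illustration, not any specific protocol's API.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class Bundle:
    # A batch of data items in the order the uploader claims,
    # plus the hash the uploader asserts for that ordering.
    items: list[bytes]
    claimed_hash: str

def bundle_hash(items: list[bytes]) -> str:
    # Hash the items in sequence, so any reordering or tampering
    # changes the result.
    h = hashlib.sha256()
    for item in items:
        h.update(item)
    return h.hexdigest()

def raise_dispute(bundle: Bundle) -> None:
    # Placeholder: a real network would broadcast the dispute
    # for the other validators to vote on.
    print(f"dispute raised against bundle {bundle.claimed_hash[:12]}...")

def validate(bundle: Bundle) -> bool:
    # Each validator recomputes the hash independently; a mismatch
    # is a violation that triggers a dispute instead of an approval.
    if bundle_hash(bundle.items) == bundle.claimed_hash:
        return True  # positive validation: vote to accept
    raise_dispute(bundle)
    return False

items = [b"tx-1", b"tx-2", b"tx-3"]
honest = Bundle(items, bundle_hash(items))
tampered = Bundle(list(reversed(items)), honest.claimed_hash)
print(validate(honest))    # True
print(validate(tampered))  # dispute raised, then False
```

No single node's approval is final here: every participant can verify independently, and any participant can contest a claim.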

This places the emphasis on positive validations without limiting the nodes or enabling centers of decision-making power to grow.

The case for industry standards

Data validation is something that matters deeply but is often overlooked or only considered when it becomes a problem.

For example, even in discussions of the Ethereum (ETH) Merge, the impact it will have on data validation is not a focus; the conversation instead drifts towards how it will affect transactions or developers.

Since historical data won’t be required for validating the chain once this happens, nodes will no longer be incentivized to carry this data. This will make solutions providing decentralized data validation and storage more important than they have been up until this point.

This will, in some ways, push the community towards a standard, as ignoring the problem will no longer be an option for anyone who wants to ensure full decentralization.

By taking on this challenge and understanding the importance of good, decentralized data, this fundamental resource can be made open, scalable, and customizable, thereby expanding the capabilities it supports.
