With the advent of the “big data” era, data has become far more important across industries, and its collection, storage, and analysis have grown into a sector that urgently needs to advance. Existing infrastructure increasingly struggles to serve more than 3.6 billion connected users and the massive volumes of data they generate. Today’s server storage and data-access frameworks are both expensive and inefficient, and as the number of connected devices keeps growing, data access efficiency, storage cost, and performance stability all face serious challenges.
Today’s central intermediaries, the server storage and the frameworks that provide data access, are expensive and highly inefficient. Meanwhile, data centers consume between 1.5% and 2.0% of global electricity, a figure growing at 60% per year. As the Internet and industry continue to integrate, global data volume is rising exponentially. As early as 2010, global data reached the “ZB” scale (1 ZB = 1,024 EB, roughly a billion TB). IDC estimates that the total amount of global data doubles roughly every two years; only a year later, in 2011, the total data created and replicated worldwide exceeded 1.8 ZB. Today, owing to the rise of concepts such as “Internet +” and smart cities, the world has gradually recognized the application value of data. IDC predicts that global data will reach 40 ZB by 2020, equivalent to about 42.9 billion 1 TB hard drives.
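The drive-count figure above follows from binary prefixes, and can be checked directly (a sketch that assumes IDC’s 40 ZB uses the binary convention, 1 ZB = 2^70 bytes):

```python
# Check that 40 ZB corresponds to roughly 42.9 billion 1 TB drives,
# using binary prefixes (1 ZB = 1,024 EB = 1024**3 TB).
ZB_IN_TB = 1024 ** 3          # 1 ZB expressed in TB

total_tb = 40 * ZB_IN_TB      # 40 ZB in TB
drives = total_tb             # one 1 TB drive per TB of data

print(f"{drives / 1e9:.1f} billion 1 TB drives")  # → 42.9 billion 1 TB drives
```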
However, the most pressing challenges in the industry today are security and integrity. Security concerns the safety of data storage and the privacy of data, while integrity mainly concerns the compatibility of data interaction and the continuity of data.
In today’s era of big data, most industries have gradually come to understand it and begun to ask how data can be combined with business to improve corporate profitability. Data application has gradually become the core of the entire data industry. However, there is no essential difference between the industry’s current investment in big data and the previous era’s investment in hardware devices and ERP software: enterprises believe such investment will increase income, reduce expenditure, and solve business problems. Yet apart from data transactions themselves, data does not contribute directly to an enterprise’s profits; its value lies in generating greater profits for the enterprise’s main business.
According to a Wikibon report, the global big data market in 2016 was $45.26 billion, of which the two segments of industry solutions and vertical-industry applications accounted for $22.65 billion. At present, the most developed Chinese companies in the data industry invest mainly in hardware and technology, with relatively little investment in the application layer. But the application layer’s potential is undoubtedly huge; in the future, a single vertical industry could be a 100-billion market.
But the data sharing industry still fails to address the following pain points.
1.1. Incompatible Standards
At present, the fundamental reason data is difficult to share openly lies in the barriers between existing information systems and their underlying models. The designers and builders of each system fully determine its data and data structures, so data in different information systems is completely heterogeneous. These inconsistent standards waste resources and cause various problems when devices call one another.
1.2. High Storage Cost and Low Efficiency
At present, data obtained or generated by Internet-connected devices is stored on central servers. As the number of networked devices grows exponentially, storage cost, access efficiency, and stability all become serious issues.
1.3. Poor Security
The current “Internet +” sectors (including e-commerce, energy, finance, distribution, and medical care) already have concrete deployment scenarios. Across this wide variety of applications, both the volume of data transmission and the number of networked devices keep climbing, and because each scenario’s execution environment has different characteristics, traditional network security is being challenged. If defenses are breached, there is first the danger of data tampering, and second the danger of data leakage; either is likely to have a significant security impact.
1.4. Strong Intervention by Centralized Intermediaries
No matter how complete the network defenses or legal remedies, the danger posed by the data center itself can never be avoided entirely. For example, today’s network-disk sharing is controlled by the operating company behind it, and we often find that uploaded resources are suddenly banned. Whenever we share data into a centralized data center, we lose control over that data, and the center gains the power to revise our shared information.
Having seen the losses and inconvenience these problems cause, we created Datumesh.
Datumesh solves the problem of sharing users’ identity attributes by building a public chain for ecological data sharing. At the same time, it addresses industry-wide problems by compressing data to reduce retrieval cost and by sharding data storage across the public chain, finally realizing data sharing.
The blockchain itself is a decentralized database, and it guarantees ownership and privacy after separating data owners from storage. It fully returns control of data to each node, and stores, transmits, and shares network data through these decentralized nodes.
The most outstanding feature of the blockchain is the distributed ledger, which combines cryptography and advanced mathematics to achieve decentralization and consensus. The storage and transaction log of a traditional database corresponds to a small database embedded in the blockchain. Meanwhile, the blockchain actively absorbs the mature sharding technology of databases and continuously improves its handling of transactions between shards.
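The sharding idea mentioned above can be sketched as a simple hash-based shard assignment. This is an illustrative sketch only, not Datumesh’s actual scheme; the shard count and key names are hypothetical:

```python
import hashlib

NUM_SHARDS = 4  # hypothetical shard count

def shard_for(key: str) -> int:
    """Map a data key to a shard by hashing it, so each node only
    stores and serves the fraction of keys assigned to its shard."""
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS

# Keys spread deterministically across shards, so any node
# can compute where a given record lives without a central index:
for key in ["user:alice", "user:bob", "record:42"]:
    print(key, "->", shard_for(key))
```

Because the mapping is deterministic, cross-shard transactions only need the key to locate every shard they touch.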
What Datumesh is realizing is the integration of blockchain and underlying database technology in order to create a mature decentralized data sharing system. It has no global locks and no fixed log-generating nodes, maintaining data integrity and transparency while leveraging Merkle trees and mining-based checks to keep blocks secure.
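The Merkle-tree check mentioned here can be illustrated with a minimal root computation (a sketch under simplified assumptions; real chains hash binary transaction encodings and add safeguards this example omits):

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Pairwise-hash leaves upward until a single root remains.
    Changing any leaf changes the root, so one 32-byte value
    commits to every transaction in a block."""
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                # duplicate last node on odd levels
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

root = merkle_root([b"tx1", b"tx2", b"tx3"])
print(root.hex())
```

A node can verify that a single transaction belongs to a block by checking a short path of sibling hashes against the root, without downloading the whole block.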
Based on the blockchain system, Datumesh builds a reliable data sharing library through decentralization. It uses the blockchain’s consensus-based distributed ledger together with smart contracts and other assistive technologies. Datumesh can be used to build a new generation of transactional applications while ensuring the core trust mechanism, transparency, and security, and simplifying the transaction process. In Datumesh, all data exchanges by any node over a period of time are recorded and, via a cryptographic algorithm, computed into a data block; the block’s key is then generated to link it into the chain and complete the check. All participating nodes in the system then jointly determine whether the record is true and complete the consensus.
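The record–hash–link–verify flow described above can be sketched as a minimal chain of hashed blocks. This is illustrative only: the field names are assumptions, and real consensus involves far more than re-checking hashes:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically; this hash is the
    'key' that the next block embeds, linking the chain."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(records: list[str], prev_hash: str) -> dict:
    return {"records": records, "prev_hash": prev_hash}

def verify(chain: list[dict]) -> bool:
    """Any node can re-hash every block and confirm the links,
    which is how tampering with an earlier record is detected."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

genesis = make_block(["exchange A->B"], prev_hash="0" * 64)
chain = [genesis, make_block(["exchange B->C"], block_hash(genesis))]
print(verify(chain))          # True for an untampered chain
chain[0]["records"].append("forged exchange")
print(verify(chain))          # False once a past block is altered
```

Altering any earlier record changes that block’s hash, so the stored link in the next block no longer matches, and every honest node rejects the chain.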
It is hard to deny that blockchain technology may one day enable human society to achieve a truly interconnected mechanism. Datumesh has the confidence and ability to use blockchain technology as a tool to address the trust issues that exist in the data industry.
Company Name: Datumesh
Contact Person: Media Relations