The growth of cloud computing and its related technologies has been tremendous. However, centralized cloud storage still faces challenges such as latency, storage overhead, and packet loss in the network. Cloud storage attracts attention because of its huge capacity, but it must also ensure the security of secret information. Most developments in cloud storage have been positive, such as better cost models and effectiveness, yet data leakage and security remain billion-dollar questions for consumers. Traditional data-security techniques are usually based on cryptographic methods, but these approaches may not withstand an attack from inside the cloud server. We therefore propose a model called multi-layer storage (MLS) with security based on elliptic curve cryptography (ECC). The proposed model focuses on cloud storage together with data protection and the removal of duplicates at the initial level. Following a divide-and-combine methodology, the data are divided into three parts. The first two portions are stored on the local system and on fog nodes, secured with an encoding and decoding technique, while the remaining part is encrypted and saved in the cloud. The viability of our model has been examined through security analysis and experimental evaluation, and it is a powerful complement to existing cloud storage methods.
Since the beginning of the twenty-first century, communication, information, and computing technologies have developed very rapidly. Cloud computing, a developing technology of this century, was proposed and defined by NIST in 2006 [
The development and growth of cloud technology have been unexpected in factors such as storage capacity, performance, stability, and cost. Considering huge storage and cost-effectiveness, most companies prefer the cloud, as traditional data centers cannot satisfy consumers' requirements. People have therefore begun to search for new technologies and strategies that fulfill their needs, and for storage capacity and cost, more users select cloud storage. Cloud storage is a form of data storage and management service that has its own computing system. In spite of cloud storage's development, many questions remain in terms of security. Sharing data publicly leads to privacy problems; for example, Hollywood actresses preferred cloud storage for private data and family albums, but it was compromised by hackers.
In
Fog computing is a computing model that extends traditional cloud computing with resource-constrained capabilities, such as storage, networking, and computing services, at the edge. It offers an excellent solution for latency-sensitive IoT applications [
Security and effectiveness in cloud storage have attracted a lot of attention from a variety of industries, and much research on cloud storage security is still going on on a daily basis. To solve those problems, the paper [
In the Proposal [
Hou et al. [
Feng et al. [
In this paper [
Harnick et al. [
The proposed system provides cloud security through the following methodologies, applied when a user saves information on a cloud server: deduplication; divide and combine; encoding and decoding; and finally encryption and decryption.
Storage procedure:

1. Deduplication (dedup) process.
2. Divide the information into three parts.
3. The first part (5% of the data) is encoded and saved on the local machine.
4. The second part (10% of the data) is encoded and saved on the fog server.
5. The third part (85% of the data) is encrypted and saved on the cloud server.
Retrieval procedure:

1. Decrypt the user information from the cloud server and restore it to the next layer.
2. Decode the data from the fog server and restore it to the next layer.
3. Decode the data from the local machine.
4. Combine the data.
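The divide-and-combine step of the two procedures can be sketched in a few lines. The 5:10:85 ratio is taken from the text, while the byte-level slicing and the function names are illustrative assumptions; the per-part encoding and encryption are omitted here.

```python
# Illustrative sketch of the divide-and-combine step (not the authors' code).
# The 5:10:85 split ratio comes from the paper; slicing raw bytes is an assumption.

def split_5_10_85(data: bytes):
    n = len(data)
    a, b = n * 5 // 100, n * 15 // 100
    return data[:a], data[a:b], data[b:]        # local, fog, cloud parts

def combine(local: bytes, fog: bytes, cloud: bytes) -> bytes:
    return local + fog + cloud

payload = bytes(range(200))
local, fog, cloud = split_5_10_85(payload)
assert (len(local), len(fog)) == (10, 20)       # 5% and 10% of 200 bytes
assert combine(local, fog, cloud) == payload
```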
Deduplication is an important factor in both physical and cloud storage environments. When a user keeps the same file under different names, the space consumed becomes an overhead. Since a consumer who chooses cloud storage pays for the space required to store the data, a company can run one node that removes duplicate or unwanted blocks from files before using cloud storage. Deduplication (dedup) is the concept of removing such duplicate blocks.
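A common way to implement block-level deduplication is content addressing: each fixed-size block is keyed by its cryptographic digest, so identical blocks are stored only once. The paper does not specify its dedup mechanism, so the following is a hedged sketch of that standard design; the class and method names are assumptions.

```python
import hashlib

# Sketch of content-addressed block-level deduplication (an assumed design,
# not the paper's exact mechanism): identical blocks are stored once,
# keyed by their SHA-256 digest.

class BlockStore:
    def __init__(self, block_size: int = 8):
        self.block_size = block_size
        self.blocks = {}                       # digest -> block bytes

    def put(self, data: bytes):
        """Store a file; return the list of block digests (its 'recipe')."""
        recipe = []
        for i in range(0, len(data), self.block_size):
            chunk = data[i:i + self.block_size]
            d = hashlib.sha256(chunk).hexdigest()
            self.blocks.setdefault(d, chunk)   # duplicate blocks stored once
            recipe.append(d)
        return recipe

    def get(self, recipe) -> bytes:
        return b"".join(self.blocks[d] for d in recipe)

store = BlockStore()
r1 = store.put(b"12345678" * 4)                # four identical 8-byte blocks
assert len(store.blocks) == 1                  # physically stored only once
assert store.get(r1) == b"12345678" * 4
```

The recipe (list of digests) is what the file system keeps per file; the payoff is exactly the space and bandwidth savings the text describes.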
Fog storage and computing is a form of distributed computing in which many devices are connected to a cloud storage system. An added advantage is that the processed data is most likely required by the same devices that produced it, reducing latency whether processing happens remotely or locally. Fog storage has been described as "a distributed computing architecture with a resource pool which consists of one or more ubiquitously connected heterogeneous devices at the network edge that collaboratively provide elastic computation, storage, and communication." Vaquero et al. defined fog computing as "a network through which a large number of heterogeneous, universal, and decentralized devices communicate and collaborate to perform processing tasks and to manage storage without the need for third-party intervention." The significance of fog is to provide consistent latency for time-sensitive IoT applications.
The characteristics of fog technology are:

- Location awareness and low latency
- Geographical distribution
- Scalability
- Support for mobility
- Real-time interaction
- Heterogeneity
We propose an MLS architecture based on a fog-node framework to secure customer information. The MLS framework gives users powerful management control and ensures that their privacy is protected. As already noted, protection against internal attacks is difficult.
The various encryption-based security models work well against outside attacks, but the cloud service provider (CSP) remains a problem: because the CSP accesses and manages the data internally, encryption alone becomes ineffective. In the proposed model, the customer's data is divided into three parts before encryption technology is applied in cloud storage.
The first block removes duplicate data from the original data. The local machine keeps 5% of the data, and 10% is uploaded to the fog layer, where it is encoded and saved; the remaining 85% is uploaded to the cloud server, which stores it in protected form. Elliptic curve cryptography is used in these operations.
Before dividing the user data, deduplication is applied to remove duplicate blocks, mainly for space management and bandwidth savings. In this model, the user's data is split into three parts, held by the user machine, the fog machine, and, in encrypted form, cloud storage, on a scale from small to large, respectively. Because duplicates are removed first and the data is then split three ways before being stored, a hacker cannot reproduce the original form of the data at any cost. Nor can the cloud storage management obtain the exact user information, because the two layers on the user's machine and the fog server are controlled and managed by the user.
As mentioned above, the first two divisions of the data, 5% and 10%, are saved on the local and fog machines respectively, and both are encoded using the Hash-Solomon encoding algorithm. The procedure is as follows.
User information is received and deduplicated to remove duplicates. Once this initial process is completed, the data is divided into three parts in the ratio 5:10:85. The first two parts are then encoded with the Hash-Solomon encoding algorithm. First, a mapping transformation is applied to the file to be stored, so that each piece holds a number, say X. Second, a hash function is applied to the received value X to obtain the matrix Y. Finally, matrix multiplication generates the k data blocks.
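The paper does not fully specify the Hash-Solomon code, so the following is a generic Reed-Solomon-style sketch of the "multiply the mapped data by a matrix" step: k data symbols are multiplied by an n x k Vandermonde matrix over GF(p), and any k of the n coded blocks suffice to solve for the original symbols. The prime p = 257, the block counts, and all function names are illustrative assumptions.

```python
# Toy erasure-coding sketch over GF(p), illustrating the matrix-multiplication
# step of Reed-Solomon-style codes (not the authors' Hash-Solomon algorithm).
p = 257

def vandermonde(n, k):
    """n x k Vandermonde matrix over GF(p); any k rows are invertible."""
    return [[pow(x, j, p) for j in range(k)] for x in range(1, n + 1)]

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) % p for row in M]

def solve(M, y):
    """Solve M * x = y mod p by Gaussian elimination (M is k x k)."""
    k = len(M)
    A = [row[:] + [y[i]] for i, row in enumerate(M)]   # augmented matrix
    for col in range(k):
        piv = next(r for r in range(col, k) if A[r][col] % p)
        A[col], A[piv] = A[piv], A[col]
        inv = pow(A[col][col], p - 2, p)               # Fermat inverse
        A[col] = [a * inv % p for a in A[col]]
        for r in range(k):
            if r != col and A[r][col]:
                f = A[r][col]
                A[r] = [(a - f * b) % p for a, b in zip(A[r], A[col])]
    return [A[i][k] for i in range(k)]

data = [72, 105, 33, 7]                # k = 4 mapped source symbols
G = vandermonde(6, 4)                  # n = 6 coded blocks
blocks = matvec(G, data)
# Lose blocks 1 and 3; recover from any 4 surviving (row, value) pairs.
rows = [G[i] for i in (0, 2, 4, 5)]
vals = [blocks[i] for i in (0, 2, 4, 5)]
assert solve(rows, vals) == data
```

This is the property the scheme relies on: no single storage layer holds enough blocks to reconstruct the data, yet the legitimate owner, holding all layers, always can.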
Once the data (X1-X4) is received in cloud storage, the function elli_curv (data) in the pseudocode below is called. During the decoding process, the reverse action happens.
Elliptic curve cryptography is a cryptographic mechanism for securing data in both physical and cloud storage. ECC is one of the best techniques based on the theory of elliptic curves: the properties of an elliptic curve are used to generate encryption keys, instead of the large prime numbers used by existing techniques. ECC uses the elliptic curve equation for key generation. In 1985, N. Koblitz and V. Miller proposed elliptic curve cryptography for securing unpredictable data. The basic idea behind ECC is to instantiate the discrete logarithm problem on an elliptic curve. ECC rests on some of the most difficult mathematical problems in public-key cryptography. As computing power constantly improves, solving discrete logarithms in ordinary arithmetic settings becomes a comparatively simple task; as a result, the discrete logarithm problem is moved to a more difficult setting, namely elliptic curves.
The most significant feature of the ECC model is that it uses smaller keys for the same security: compared to other models with a 1024-bit key, ECC needs only a 164-bit key for the same level of security. The model is a public-key cryptosystem whose key generation rests on the mathematical difficulty of the elliptic curve discrete logarithm problem, which measures the hops needed to travel from one point on an elliptic curve to another. Elliptic curves are symmetric about the x-axis and can also be defined over binary fields.
ECC is a cryptographic scheme in which each client has two keys: a public key and a private key. Encryption and signature verification are done with the public key, while decryption and signature generation are done with the private key.
For any cryptographic technique applied for security, the first and most significant process is the generation of the public and private keys. The sender encrypts the message data with the receiver's public key, and the receiver decrypts it with its private key. The code for the elli_curv (data) function for cloud storage is given here; the key generation procedure is as follows.
## Gen_ptQ is the base point of the curve and t its order.
rand_nopA = a random number from {1, 2, …, t−1} ## sender private key
S.public_keyXA = rand_nopA * Gen_ptQ ## sender public key
rand_nopB = a random number from {1, 2, …, t−1} ## receiver private key
R.public_keyXB = rand_nopB * Gen_ptQ ## receiver public key
key = rand_nopA * R.public_keyXB ## sender security (shared) key
key = rand_nopB * S.public_keyXA ## receiver security (shared) key
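The key-generation steps above are an elliptic-curve Diffie-Hellman exchange, and a runnable toy version is sketched below. The curve y² = x³ + 2x + 2 over GF(17) with base point (5, 1) of order 19 is a standard textbook example; the tiny private keys are illustrative only, and real deployments use standardized curves such as P-256.

```python
# Toy elliptic-curve Diffie-Hellman matching the key-generation pseudocode.
# Curve: y^2 = x^3 + 2x + 2 over GF(17), base point Gen_ptQ = (5, 1), order t = 19.
p, a = 17, 2
Gen_ptQ, t = (5, 1), 19

def ec_add(P, Q):
    """Group law on the curve; None represents the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        s = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (s * s - x1 - x2) % p
    return (x3, (s * (x1 - x3) - y1) % p)

def ec_mul(k, P):
    """Double-and-add scalar multiplication k * P."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

rand_nopA, rand_nopB = 3, 7                       # private keys from {1..t-1}
S_public_keyXA = ec_mul(rand_nopA, Gen_ptQ)       # sender public key
R_public_keyXB = ec_mul(rand_nopB, Gen_ptQ)       # receiver public key
# Both sides derive the same shared key:
assert ec_mul(rand_nopA, R_public_keyXB) == ec_mul(rand_nopB, S_public_keyXA)
```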
In this proposal, the user data is divided into three parts, and the third part, 85% of the user information, is encrypted with the ECC method. Before the plain text sent to the cloud is saved, it is encrypted into points, and these encrypted points (F1, F2) are stored in cloud storage. After the first parts of the user data are saved in their respective storages, the third part is processed by calling the ellp_curve function.
## The sender sends a message point Ellp_curve, a point on the elliptic curve, to the receiver.
## The sender chooses a random number r from {1, 2, …, t−1}.
## The cipher text is the pair of points (F1, F2), where
F1 = r * Gen_ptQ and F2 = Ellp_curve + (r * R.public_keyXB)
Once the data has been stored as encrypted points (F1, F2), the points are decrypted back into user-readable plain text during retrieval.
## The receiver multiplies F1 by its private key: rand_nopB * F1
## and subtracts the result from F2 to recover the original message point:
Ellp_curve = F2 − (rand_nopB * F1)
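This encrypt/decrypt pair is elliptic-curve ElGamal, and it can be exercised end to end on the same toy curve as before (y² = x³ + 2x + 2 over GF(17), base point (5, 1) of order 19). The message point, random number, and key values are illustrative assumptions.

```python
# Toy EC-ElGamal matching the pseudocode: F1 = r*Gen_ptQ,
# F2 = M + r*R.public_keyXB, and decryption M = F2 - rand_nopB*F1.
p, a = 17, 2
Gen_ptQ, t = (5, 1), 19

def ec_add(P, Q):
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                      # point at infinity
    if P == Q:
        s = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (s * s - x1 - x2) % p
    return (x3, (s * (x1 - x3) - y1) % p)

def ec_mul(k, P):
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

def ec_sub(P, Q):
    """P - Q = P + (-Q), where -(x, y) = (x, -y mod p)."""
    return ec_add(P, None if Q is None else (Q[0], (-Q[1]) % p))

rand_nopB = 7                                  # receiver private key
R_public_keyXB = ec_mul(rand_nopB, Gen_ptQ)    # receiver public key

M = ec_mul(5, Gen_ptQ)                         # message encoded as a curve point
r = 4                                          # sender's random number
F1 = ec_mul(r, Gen_ptQ)
F2 = ec_add(M, ec_mul(r, R_public_keyXB))      # cipher text is the pair (F1, F2)
assert ec_sub(F2, ec_mul(rand_nopB, F1)) == M  # receiver recovers M
```

Note that F2 must be formed with the receiver's public key R.public_keyXB; only then does subtracting rand_nopB * F1 cancel the blinding term and recover M.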
We evaluated the efficiency and feasibility of the MLS system, based on ECC and the fog model, through a sequence of tests that measure the time consumed by encoding and decoding for data of different sizes. The algorithm splits the user data into three segments after first removing duplicate data blocks. The three segments are then moved to the local machine, the fog server, and the cloud server with different sizes of data. On the local and fog servers, data is processed with the Hash-Solomon encoding scheme. All experiments in this paper were performed in simulation, and the environmental parameters are described in
| No | Items | Parameter |
|----|-------|-----------|
| 1 | Operating system | Windows 2008 |
| 2 | Language | C |
| 3 | Processor | Intel i7 2.50 GHz |
| 4 | RAM | 8 GB |
| 5 | Storage | 1 TB |
Let us take two different types of files: an image (42.2 MB) and a video (650 MB). The one-more-block theory is used in all the experiments in this article, which implies that there are t + 1 data volumes. In this way, the method protects data privacy while also alleviating the storage burden on the lower servers.
The results below cover the various metrics and parameters considered. In the figure, we analyze the image file for local storage, where the x-axis shows the number of storage blocks and the y-axis the data volume stored on the machine; a second figure shows the corresponding result for the video file. Both graphs compare the data volumes stored on the local machine under the Reed-Solomon code and the Hash-Solomon code algorithms. According to the graphs, as the number of data volumes increases, the amount of data stored on the user's computer decreases, so large data volumes show good results in this experiment: the consumption of space is lower.
When 100 data blocks are encoded, the proposed method takes nearly 600 ms while the existing method takes more than 2500 ms. Similarly, decoding 100 blocks consumes nearly 600 ms and 1200 ms, respectively.
In this research article, a new multi-layer secure architecture based on a fog model is proposed for protecting information using an elliptic curve algorithm. The user's data is divided in the ratio 85:10:5, allocated across the devices, and protected by two different techniques, so that a hacker trying to crack the data must defeat both. The performance of the proposed architecture is evaluated in terms of execution time, encoding time, decoding time, and storage efficiency. The performance of the elliptic curve cryptography algorithm is compared with an existing technique, MRSA. The experimental analysis shows that the proposed technique is feasible and secure. The proposed MLS architecture achieves lower computational cost and faster execution, and it also allows the use of secure exchange protocols, adding to the overall security. In the future, this work may be extended with advanced cryptographic techniques to further improve the security of information stored in cloud computing.