CHAPTER 5: PROPOSED ARCHITECTURE AND METHODOLOGY

5.1 Proposed Architecture

In this section, we describe the proposed architecture of the deduplication system. The proposed architecture replaces some existing techniques with new ones, for example convergent encryption is replaced by AES encryption, and eliminates steps that are not necessary, which helps to reduce execution time. We make three main changes in the proposed architecture:
1. AES encryption is used in place of convergent encryption.
2. The secret-key encryption layer between the server and the metadata manager is eliminated.
3. Duplication is checked at the starting stage. In the old architecture, duplication was checked only after the data had been converted into another form, that is, after encryption.
These three changes in …
We therefore eliminate the secret encryption layer that was present between the server and the metadata manager. Because of that layer, the system spent considerable time on every request checking the integrity and address of each file, so removing it clearly reduces execution time. Finally, we check for duplication at the starting stage, so there is no longer any need to encrypt every piece of data just for integrity checking, which removes further overhead from the system.

5.2 Components

In this section we describe the role of each component.

A. User
The role of the user is limited to uploading files to the system, encrypting them with the AES encryption technique, signing the resulting encrypted files, and creating the storage request. In addition, the user generates the key for the AES system from his ID and password; this key also works as a file identifier for the system. For each file, the key will be used to decrypt and rebuild the file when it is retrieved, while the file identifier is needed to uniquely identify the file across the whole system. Finally, the user signs each file with a special signature scheme. To avoid applying costly signature operations to every file, the owner name is instead added to a separate metadata manager (MM). The main architecture is illustrated in Fig. 5.1.

B. Server
The server has three main roles: authenticating users during storage/retrieval requests, performing access control by verifying the block signatures embedded in the data, and encrypting/decrypting data traveling from users to the cloud
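The flow described above, deriving the key from the user's ID and password and checking for duplicates at the starting stage before any encryption, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the PBKDF2 key derivation, the SHA-256 file identifier, and the `MetadataManager` class are all assumed stand-ins (the text does not specify how the key or identifier are computed), and the actual AES encryption step is omitted.

```python
import hashlib

def derive_key(user_id: str, password: str) -> bytes:
    """Derive a per-user AES key from ID and password.

    PBKDF2-HMAC-SHA256 with the user ID as salt is an illustrative
    choice; the text only says the key is generated from ID and password.
    """
    return hashlib.pbkdf2_hmac(
        "sha256", password.encode("utf-8"),
        user_id.encode("utf-8"), 100_000, dklen=32)

def file_identifier(data: bytes) -> str:
    """Content hash used to detect duplicates before encryption."""
    return hashlib.sha256(data).hexdigest()

class MetadataManager:
    """Toy metadata manager: maps file identifiers to owner names."""
    def __init__(self):
        self.owners = {}  # identifier -> list of owner names

    def store_request(self, user: str, data: bytes) -> str:
        fid = file_identifier(data)
        if fid in self.owners:
            # Duplicate detected at the starting stage: record the new
            # owner only, skipping encryption and upload entirely.
            self.owners[fid].append(user)
            return "duplicate"
        self.owners[fid] = [user]
        return "stored"

mm = MetadataManager()
print(mm.store_request("alice", b"business plan"))  # stored
print(mm.store_request("bob", b"business plan"))    # duplicate
```

Because the duplicate check runs on the raw content before AES is applied, the second upload never reaches the encryption step, which is exactly where the proposed architecture saves execution time.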
In today’s world of instant connectivity and information at users’ fingertips, it is vital that sensitive information be safeguarded against those who seek to do personal harm or to profit from gaining access to the data. The key to keeping information safe is the method by which it is protected and encrypted. To appreciate how information is secured, users must understand the encryption concepts behind it: the current encryption standards, the trends and developments in encryption technology, the importance of securing data, government regulations pertaining to encryption, the companies involved in research and implementation, the implications of leaked or stolen data, and a brief look into
A company may allow staff in the same group or department to store and share files in the cloud. By using the cloud, staff are freed from the burden of local data storage and maintenance. However, this also poses a significant risk to the confidentiality of the stored files. Specifically, the cloud servers managed by cloud providers are not fully trusted by users, while the data files stored in the cloud may be sensitive and confidential, such as business plans. To preserve data privacy, a basic solution is to encrypt data files and then upload the encrypted data to the cloud. Unfortunately, designing an efficient and secure data-sharing scheme for groups in the cloud is not an easy task, due to the following challenging issues. First, identity privacy is one of the most significant obstacles to the wide
1.17 Consider a computing cluster consisting of two nodes running a database. Describe two ways in which the cluster software can manage access to the data on the disk. Discuss the benefits and disadvantages of each.
Objective 3 – A system that can ensure the safety of data against other possible circumstances that may result in corruption or loss of data.
Chapter 7 discusses compression algorithms. Compression is used often, sometimes without our even being aware of it: the items we download or upload may be compressed in order to save bandwidth. Chapter 8 discusses the fundamental algorithms underlying databases (MacCormick, 7), emphasizing the techniques used to achieve consistency and to ensure that databases never contradict themselves. Chapter 9 discusses the ability to ‘sign’ an electronic document digitally (MacCormick, 7). Chapter 10 discusses algorithms that would be considered great if they existed.
Therefore, the cloud server cannot learn anything about the number of documents stored or the lengths of the files. In addition, when a data user requests a particular file, he receives a random number of blocks containing the documents.
This report describes the Message Digest 6 (MD6) hashing algorithm, how it functions, and the importance of a hashing function in general. It also covers the advantages of this hash function over its predecessor, and its present weaknesses.
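Python’s standard library does not ship MD6, so the sketch below uses SHA-256 purely to illustrate the generic properties a hash function such as MD6 provides, namely fixed-length output and extreme sensitivity to small input changes; it is not MD6 itself.

```python
import hashlib

def digest(message: bytes) -> str:
    """Return a fixed-length hex digest, regardless of input size."""
    return hashlib.sha256(message).hexdigest()

a = digest(b"The quick brown fox")
b = digest(b"The quick brown fox.")  # one character appended

print(len(a), len(b))  # both 64 hex characters, whatever the input length
print(a == b)          # False: a tiny input change flips the whole digest
```

The same two properties are what make a hash function useful for integrity checking: any tampering with the message, however small, produces a completely different digest.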
Because of its wide range of applications, the cloud allows users to store their data remotely and enjoy on-demand, high-quality cloud applications, relieving them of the burden of local storage, cost, and maintenance. From the user’s perspective, both individuals and enterprises such as companies find the cloud appealing: by storing data remotely in a flexible, on-demand manner, they are relieved of the burden of storage management, gain universal data access independent of geographical location, and avoid capital expenditure on software, hardware, personnel management, maintenance, and so on.
Services such as data storage and security are provided by cloud computing over the internet. In cloud computing, users pay for what they consume (Bisong & Rahman, An Overview of the Security Concerns in Enterprise Cloud Computing, 2011). Cloud computing is an emerging information technology that can make it easier for users to manage their data. It also allows businesses to expand, as new cloud-based models are being discussed and implemented as solutions (Bamiah & Brohi, 2011).
Cloud computing is the use of remote servers, accessed via the internet, to store, manage, and process data instead of a personal computer. It is a set of information technology services with the ability to scale service requirements up or down. Most cloud services are provided by third-party service providers. With cloud computing, organizations can use IT services without up-front investment. Despite the benefits of cloud computing, organizations have been slow to adopt it because of security issues and challenges. Security is one of the major problems hindering the growth of the cloud: it is not wise to hand important data over to another company, so clients need to be vigilant in understanding the risks of data infringement in this new environment. This paper presents a detailed analysis of cloud computing security issues and challenges. (Ayoleke)
Modern computing systems rely on distributed systems for data, functions, and services. Arguably all popular software, such as Uber, Spotify, Facebook, and Fitbit, hosts its data and applications on dedicated servers so that users can access services through their devices. The challenge with server-based systems is that the integrity and security of private data are left to third parties, nowadays established corporations that offer hosting services for application databases and file storage. The main advantage of cloud-based models is that customers do not have to pay to install data storage and processing capabilities for applications (Jadeja,
The National Institute of Standards and Technology describes cloud computing as a model for enabling ubiquitous, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. The cloud comprises a collection of hardware and software that allows its infrastructure to work as a seamless, unified whole. Depending on the classification of the information and the service provider, the remote servers may be located within the same facility. The stored data is
The file data is divided into equal-sized data blocks. Before storing the data blocks on the cloud servers, the user pre-computes verification tokens. These tokens are used to check the integrity of the data stored on the cloud servers, and also to locate the cloud server on which data has been changed by an attacker. Before dividing and dispersing the file, the consumer generates tokens over the individual data blocks. When the consumer wants to check the correctness of the data, he sends the file identifier to the cloud servers; he may also issue a challenge on a particular data block. Upon receiving the challenge, each cloud server computes the token over its data blocks and sends it to the client.
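The token scheme above can be sketched as follows. The choice of HMAC-SHA256 as the token function, the 16-byte block size, and the helper names are illustrative assumptions, since the text does not specify the actual token construction used.

```python
import hashlib
import hmac
import secrets

BLOCK_SIZE = 16  # assumed block size for illustration

def split_blocks(data: bytes) -> list:
    """Divide file data into equal-sized blocks (last block zero-padded)."""
    padded = data + b"\x00" * (-len(data) % BLOCK_SIZE)
    return [padded[i:i + BLOCK_SIZE] for i in range(0, len(padded), BLOCK_SIZE)]

def make_token(key: bytes, block: bytes) -> str:
    """Verification token: a keyed MAC over one data block."""
    return hmac.new(key, block, hashlib.sha256).hexdigest()

# --- user side: pre-compute tokens before dispersing the file ---
key = secrets.token_bytes(32)  # kept secret by the user
file_data = b"example file contents for integrity checking"
blocks = split_blocks(file_data)
tokens = [make_token(key, blk) for blk in blocks]

# --- server side: answer a challenge on block i by returning it ---
def answer_challenge(stored_blocks, i):
    return stored_blocks[i]

# --- user side: verify the server's answer against the stored token ---
i = 1
assert make_token(key, answer_challenge(blocks, i)) == tokens[i]
print("block", i, "verified")
```

If an attacker had modified block `i` on the server, the recomputed token would no longer match the pre-computed one, which both detects the corruption and pinpoints the server holding the altered block.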
Encryption is a method of converting plaintext to ciphertext using some base keys and