Abstract—Cloud storage provides users with abundant
storage space and user-friendly, immediate data access.
However, there is a lack of analysis on optimizing cloud storage for
effective data access. With the growth of storage technology,
digital data occupies more and more space.
According to statistics, 60% of digital data is redundant, and
data compression can only eliminate intra-file redundancy. To
address these problems, de-duplication has been proposed.
Many organizations have set up private cloud storage with their
unused resources to improve resource utilization. Since private cloud
storage has a limited amount of hardware resources, the space must be
optimally utilized to hold the maximum amount of data. In this paper,
we discuss the flaws in existing methods for data de-duplication.
Our proposed method, namely Dynamic Whole File
De-duplication (DWFD), provides dynamic space optimization in
private cloud storage backup and increases throughput
and de-duplication efficiency.
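The whole-file de-duplication idea underlying the abstract can be illustrated with a minimal sketch: a backup store keeps each unique file content only once, keyed by a hash of the entire file, while duplicate uploads merely add an index entry. This is an illustrative simplification, not the paper's actual DWFD algorithm; the class and method names (`WholeFileDedupStore`, `backup`, `restore`) are hypothetical.

```python
import hashlib

class WholeFileDedupStore:
    """Minimal whole-file de-duplication store: each unique file content
    is physically stored once, keyed by the SHA-256 digest of the whole file."""

    def __init__(self):
        self.blobs = {}   # digest -> file content (one physical copy)
        self.index = {}   # filename -> digest (logical view)

    def backup(self, name, data):
        """Store a file; return True if its content was already present."""
        digest = hashlib.sha256(data).hexdigest()
        is_duplicate = digest in self.blobs
        if not is_duplicate:
            self.blobs[digest] = data   # new unique content: store it
        self.index[name] = digest       # duplicates cost only an index entry
        return is_duplicate

    def restore(self, name):
        """Return the original content of a backed-up file."""
        return self.blobs[self.index[name]]

store = WholeFileDedupStore()
store.backup("report.txt", b"quarterly figures")
dup = store.backup("report-copy.txt", b"quarterly figures")  # identical file
print(dup, len(store.blobs))  # True 1 -> only one physical copy kept
```

Because duplicates are detected at whole-file granularity, two files that differ by even one byte are stored separately; that limitation is precisely what chunk-level schemes address, at the cost of extra metadata.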
Index Terms—Cloud backup, cloud computing, constant-size chunking, data de-duplication, full-file chunking, private storage cloud, redundancy.
The authors are with the R.M.D Engineering College, Chennai, India (e-mail: email@example.com, firstname.lastname@example.org, email@example.com).
Cite: M. Shyamala Devi, V. Vimal Khanna, and A. Naveen Bhalaji, "Enhanced Dynamic Whole File De-Duplication (DWFD) for Space Optimization in Private Cloud Storage Backup," International Journal of Machine Learning and Computing, vol. 4, no. 4, pp. 376-382, 2014.