Journal
SYMMETRY-BASEL
Volume 8, Issue 7
Publisher
MDPI
DOI: 10.3390/sym8070069
Keywords
storage system; deduplication; duplication elimination ratio; content defined chunking
Funding
- National Natural Science Foundation of China [61572394]
- National Key Research and Development Plan of China [2016YFB1000303]
- Shenzhen Fundamental Research Plan [JCYJ20120615101127404, JSGG20140519141854753]
Deduplication is an efficient data reduction technique used to mitigate the problem of huge data volumes in big data storage systems. Content defined chunking (CDC) is the most widely used chunking algorithm in deduplication systems. The expected chunk size is an important parameter of CDC, and it significantly influences the duplicate elimination ratio (DER). We collected two realistic datasets to perform an experiment. The experimental results showed that the common practice of empirically setting the expected chunk size to 4 KB or 8 KB does not optimize DER. We therefore present a logistic-based mathematical model that reveals the hidden relationship between the expected chunk size and the DER. This model provides a theoretical basis for optimizing DER by setting the expected chunk size appropriately. We used the collected datasets to verify this model. The experimental results showed that the R² values, which describe the goodness of fit, are above 0.9, validating the correctness of the model. Based on the DER model, we discussed how to bring DER close to its optimum by setting the expected chunk size appropriately.
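To illustrate the role the expected chunk size plays in CDC, the following is a minimal sketch of content-defined chunking, not the authors' implementation: a cut point is declared wherever the hash of a trailing byte window satisfies a bit-mask condition, so the expected chunk size is roughly 2^mask_bits bytes (4 KB for mask_bits=12, 8 KB for 13). The parameters `mask_bits`, `window`, and `min_size` are illustrative assumptions, and a hash of the whole window stands in for the rolling (Rabin-style) fingerprint a real system would update in O(1) per byte.

```python
import hashlib

def cdc_chunks(data: bytes, mask_bits: int = 12, window: int = 48,
               min_size: int = 64):
    """Split data into content-defined chunks.

    A cut is declared at position i when the hash of the preceding
    `window` bytes has its low `mask_bits` bits all zero, giving an
    expected chunk size of about 2**mask_bits bytes. `min_size`
    suppresses pathologically small chunks.
    """
    mask = (1 << mask_bits) - 1
    chunks, start = [], 0
    for i in range(window, len(data)):
        if i - start < min_size:
            continue  # enforce the minimum chunk size
        # Hash the sliding window; a production system would use a
        # rolling hash instead of rehashing the window each step.
        h = int.from_bytes(
            hashlib.blake2b(data[i - window:i], digest_size=8).digest(),
            "big")
        if h & mask == 0:
            chunks.append(data[start:i])
            start = i
    chunks.append(data[start:])  # final (possibly short) tail chunk
    return chunks
```

Because cut points depend only on local content, inserting or deleting bytes shifts boundaries only near the edit, which is why CDC finds far more duplicates than fixed-size chunking; raising `mask_bits` trades fewer, larger chunks against duplicate-detection granularity, which is exactly the DER trade-off the paper models.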