Critical Analysis of Solutions to Hadoop Small File Problem

Authors

  • Prof. Shwetha K S

  • Dr. Chandramouli H

Abstract

The Hadoop big data platform is designed to process large volumes of data. The small file problem is a well-known performance bottleneck in Hadoop processing: files smaller than the HDFS block size create substantial metadata overhead at the NameNode and also waste computational resources, since each small file spawns its own map task. Various solutions, such as merging small files and mapping multiple map tasks to the same Java virtual machine instance, have been proposed to address the small file problem in Hadoop. This survey presents a critical analysis of existing works addressing the small file problem in Hadoop and related platforms such as Spark. The aim is to understand their effectiveness in reducing storage and computational overhead and to identify open issues for further research.
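As an illustration of the merging approach discussed in the survey, the sketch below packs a directory of small files into a single Hadoop SequenceFile, using each file's name as the key and its raw bytes as the value, so the NameNode tracks one large file instead of many small ones. This is a minimal sketch of the general technique, not any specific surveyed system; the class name SmallFileMerger and the command-line paths are illustrative assumptions.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.io.*;

public class SmallFileMerger {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path inputDir = new Path(args[0]);  // directory of small files (assumed argument)
        Path merged   = new Path(args[1]);  // output SequenceFile (assumed argument)

        try (SequenceFile.Writer writer = SequenceFile.createWriter(conf,
                SequenceFile.Writer.file(merged),
                SequenceFile.Writer.keyClass(Text.class),
                SequenceFile.Writer.valueClass(BytesWritable.class))) {
            for (FileStatus status : fs.listStatus(inputDir)) {
                if (status.isDirectory()) continue;
                // Read the whole small file into memory (safe here because
                // each file is, by definition, smaller than one HDFS block).
                byte[] buf = new byte[(int) status.getLen()];
                try (FSDataInputStream in = fs.open(status.getPath())) {
                    IOUtils.readFully(in, buf, 0, buf.length);
                }
                // One record per small file: key = filename, value = contents.
                writer.append(new Text(status.getPath().getName()),
                              new BytesWritable(buf));
            }
        }
    }
}
```

The JVM-reuse technique the abstract mentions was exposed in classic (MRv1) MapReduce through the mapred.job.reuse.jvm.num.tasks configuration property, where a value of -1 allows one JVM to run an unlimited number of tasks for a job instead of starting a fresh JVM per map task.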

How to Cite

Prof. Shwetha K S, & Dr. Chandramouli H. (2023). Critical Analysis of Solutions to Hadoop Small File Problem. Global Journal of Computer Science and Technology, 23(C2), 23–28. Retrieved from https://computerresearch.org/index.php/computer/article/view/102323

Published

2023-10-28