How to fix missing and underreplicated blocks in Hadoop


Sometimes you may encounter a "missing blocks" alert in your CDH cluster. Here is how to identify and fix it.

1. First, check how many corrupt file blocks there are by running the command below (fsck needs a path, so start from the root):

hdfs fsck / -list-corruptfileblocks
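To act on that list programmatically, the file paths can be pulled out with a small shell pipeline. This is only a sketch: the "blockId<TAB>path" line layout and the sample lines below are assumptions standing in for real fsck output, so verify the format your HDFS version actually prints.

```shell
# Sketch: extract the unique file paths from `hdfs fsck / -list-corruptfileblocks`
# output. The "<blockId><TAB><path>" format is an assumption; the sample lines
# below stand in for real cluster output.
sample='blk_1073741825	/user/hive/warehouse/events/part-00000
blk_1073741901	/user/hive/warehouse/events/part-00017
blk_1073741944	/user/hive/warehouse/events/part-00000'

# Keep only block-report lines, print the file-path column, de-duplicate.
printf '%s\n' "$sample" | awk -F'\t' '/^blk_/ {print $2}' | sort -u
```

On a real cluster you would replace the `printf` with the `hdfs fsck` invocation itself and pipe its output through the same `awk | sort -u` stage.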

2. The command prints each corrupt block ID together with the path of the file it belongs to, plus a summary line with the total count of corrupt files.

3. If the corrupted files are expendable (for example, they can be regenerated or restored from a backup), delete them:

hdfs fsck / -delete

If you would rather keep the damaged files for inspection, hdfs fsck / -move relocates them to /lost+found instead of deleting them.
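Under-replicated blocks, unlike corrupt ones, should not be deleted; a common approach is to re-issue the replication target with hdfs dfs -setrep for each affected file. The sketch below parses "Under replicated" lines from hdfs fsck / output; the sample line and the target replication factor of 3 are assumptions, so adjust them to your cluster.

```shell
# Sketch: collect under-replicated files reported by `hdfs fsck /` and bump
# their replication factor. The sample line mimics fsck output
# ("<path>: Under replicated ..."); verify against your HDFS version.
sample='/user/data/part-00001:  Under replicated BP-1:blk_1073741825. Target Replicas is 3 but found 2 replica(s).'

# The file path is everything before the first colon.
printf '%s\n' "$sample" | grep 'Under replicated' | cut -d: -f1

# On a real cluster, feed each path to setrep (3 is an assumed target;
# -w waits until replication completes):
#   hdfs fsck / | grep 'Under replicated' | cut -d: -f1 | \
#     while read -r f; do hdfs dfs -setrep -w 3 "$f"; done
```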

That's it. Afterwards, re-run hdfs fsck / to confirm the filesystem reports HEALTHY.
