Disk leak on Ubuntu with Hadoop and Eclipse
I'm using Hadoop with Eclipse on an Ubuntu 12 VirtualBox guest (Windows 8 host) to develop MapReduce programs. I'm using large data sets during unit testing, and apparently something is leaking disk space: after a few runs the Hadoop program throws a disk error. I closed Eclipse, but the space still doesn't come back according to my Windows Task Manager. After I restart the Ubuntu machine, I am able to run the programs again until the disk error reappears. Has anyone faced a similar issue, or does anyone know how to fix this?
When I ran with large data, the program in Eclipse ended with a disk-space error and did not clear the tmp folder. I manually deleted the Hadoop files in the tmp folder, and the disk space came back. Dropping caches did not help in this case.
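For anyone hitting the same thing, here is roughly what the manual cleanup looks like. This assumes the default `hadoop.tmp.dir` of `/tmp/hadoop-${user.name}`; if your `core-site.xml` overrides it, adjust the path accordingly, and stop any running Hadoop daemons before deleting:

```shell
# Check overall free space first, to confirm it's really /tmp filling up.
df -h /tmp

# See how much the Hadoop scratch directory is holding
# (assumption: default hadoop.tmp.dir of /tmp/hadoop-$USER).
du -sh /tmp/hadoop-"$USER" 2>/dev/null || echo "no Hadoop tmp dir found"

# Remove the leftover job/task files to reclaim the space.
rm -rf /tmp/hadoop-"$USER"
```

Note that `/tmp` is cleared on reboot by default on Ubuntu, which would explain why restarting the VM "fixed" it each time.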