Memory is not deallocated after a Giraph job is finished

I am using Apache Giraph version 1.0 on Hadoop version 0.20.203. It runs the ConnectedComponentsVertex and SimpleShortestPathsVertex jobs (the Apache Giraph examples) successfully, but there is a problem: after a job finishes, its memory is not deallocated. System Monitor shows that the Java processes created for the job are still alive. I don't understand why this happens. Is it a Giraph bug, or am I doing something wrong? I'm using Ubuntu 11.10 and Java 1.6. Any help would be appreciated.

Thanks

Answers


Yes, I believe it's a bug (see my question on the user lists: https://mail-archives.apache.org/mod_mbox/giraph-user/201403.mbox/%3COF416E2CF4.1613A751-ON86257C9F.00498FA5-86257C9F.0049D454@us.ibm.com%3E). The way I'm getting around it is by sending a kill to all the workers after a job is done:

ssh worker-name "kill -9 \$(ps aux | grep \"[j]obcache/job_[0-9]\{12\}_[0-9]\{4\}/\" | awk '{print \$2}')"
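To run that against every worker instead of one at a time, a minimal sketch is below. It assumes passwordless SSH to each worker and a plain-text file listing one worker hostname per line; the `WORKERS_FILE` path is hypothetical, so point it at your own list:

```shell
#!/bin/sh
# Sketch: kill leftover Hadoop task JVMs on each worker after a Giraph job.
# ASSUMPTION: $HOME/giraph-workers.txt (one hostname per line) is a made-up
# path -- replace it with your own worker list.
WORKERS_FILE="${WORKERS_FILE:-$HOME/giraph-workers.txt}"

# ps output for task JVMs contains the jobcache path, e.g.
# .../jobcache/job_201403010000_0001/...
# The [j] trick stops grep from matching its own process.
PATTERN='[j]obcache/job_[0-9]\{12\}_[0-9]\{4\}/'

if [ -f "$WORKERS_FILE" ]; then
  while read -r worker; do
    # Expand $PATTERN locally; single quotes protect it on the remote side.
    ssh "$worker" "kill -9 \$(ps aux | grep '$PATTERN' | awk '{print \$2}')"
  done < "$WORKERS_FILE"
fi
```

The `kill -9` is blunt; the JVMs are already orphaned at this point, so there is nothing to shut down gracefully.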
