I added a set of jars to the Distributed Cache using the DistributedCache.addFileToClassPath(Path file, Configuration conf) method, to make those dependencies available to a MapReduce job across the cluster. Now I would like to remove all of those jars from the cache so I can start clean and be sure the right jar versions are there. I commented out the code that adds the files to the cache and also deleted the jars from the location in HDFS where I had copied them. The problem is that the jars still appear to be on the classpath, because the MapReduce job is not throwing ClassNotFoundException. Is there a way to flush this cache without restarting any services?
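For reference, the registration code I commented out looked roughly like this (the jar path and class name here are just placeholders, not my actual values):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.Path;

public class CacheSetup {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // The jar was already copied into HDFS; this path is an example only.
        // addFileToClassPath registers it so task JVMs get it on their classpath.
        DistributedCache.addFileToClassPath(
                new Path("/user/hadoop/lib/example-dep.jar"), conf);
        // ... the job is then configured and submitted with this conf
    }
}
```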
Edit: I have since flushed the following folder: /var/lib/hadoop-hdfs/cache/mapred/mapred/local/taskTracker/distcache/. That did not solve it; the job still finds the references.