6 votes

I've written a Julia module with various functions which I call to analyze data. Several of these functions depend on packages, which are loaded at the start of the file "NeuroTools.jl":

module NeuroTools

using MAT, PyPlot, PyCall;

function getHists(channels::Array{Int8,2}...

Many of the functions I have are useful to run in parallel, so I wrote a driver script to map functions to different threads using remotecall/fetch. To load the functions on each thread, I launch Julia with the -L option to load my module on each worker.

julia -p 16 -L NeuroTools.jl parallelize.jl

To bring the loaded functions into scope, the "parallelize.jl" script has the line

@everywhere using NeuroTools
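
For reference, the mapping in "parallelize.jl" looks roughly like this (simplified; the inputs here are just placeholder data, and remotecall(id, f, args...) is the pre-0.5 argument order):

# placeholder inputs, one per worker
inputs = [rand(Int8, 4, 1000) for k in 1:nworkers()]

# launch one remote call per worker, then fetch all of the results
refs = [remotecall(workers()[i], NeuroTools.getHists, inputs[i]) for i in 1:nworkers()]
results = map(fetch, refs)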

My parallel function works and executes properly, but each worker spits out a bunch of warnings about the modules being replaced.

WARNING: replacing module MAT
WARNING: Method definition read(Union{HDF5.HDF5Dataset, HDF5.HDF5Datatype, HDF5.HDF5Group}, Type{Bool}) in module MAT_HDF5...
(continues for many lines)

Is there a way to load the module differently or change the scope to prevent all these warnings? The documentation does not seem entirely clear on this issue.

2
Note that the current Julia parallelization (e.g. with -p 16, @everywhere, etc.) is based on separate processes rather than separate threads. Multi-threading will start to be introduced in Julia 0.5. – Michael Ohlrogge

2 Answers

4 votes

Coincidentally, I was looking for the same thing this morning. You can silence the output by redirecting STDOUT:

(rd,wr) = redirect_stdout()

Since the warnings come from the worker processes, you'd need to call it on each worker:

remotecall_fetch(worker_id, redirect_stdout)

If you want to turn the output off completely, that is all it takes.

If you want to be able to turn it back on later, save the original stream first:

out = STDOUT
(a,b) = redirect_stdout()
#then to turn it back on, do:
redirect_stdout(out)
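
To silence every worker from the driver script, you could loop over them (same pre-0.5 remotecall_fetch(id, f) argument order as above):

for p in workers()
    remotecall_fetch(p, redirect_stdout)   # discard that worker's STDOUT
end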
0 votes

This is fixed in more recent Julia releases, and @everywhere using ... is the right approach if you really need the module in scope on all workers. This GitHub issue talks about the problem and has links to some of the other relevant discussions.

If you are still using an older version of Julia where this was the case, write using NeuroTools in "NeuroTools.jl" after defining the module, instead of executing @everywhere using NeuroTools. The Parallel Computing section of the Julia documentation for version 0.5 says,

using DummyModule causes the module to be loaded on all processes; however, the module is brought into scope only on the one executing the statement.

Executing @everywhere using NeuroTools used to tell each process to load the module on all processes, and the result was a pile of "replacing module" warnings.
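
Concretely, on those older versions the end of "NeuroTools.jl" would look something like this (the getHists signature is abbreviated here, as in the question):

module NeuroTools

using MAT, PyPlot, PyCall

function getHists(channels::Array{Int8,2})
    # ...
end

end # module NeuroTools

# Bring the module into scope in whichever process loads this file,
# so the driver script no longer needs @everywhere using NeuroTools.
using NeuroTools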