0 votes

I am trying to run a for loop in parallel which includes calling a Keras model. Following is the code:

function init_population(pop::_population)
    addprocs(16)

    @sync @parallel for i in 1:pop.size
        @everywhere ran = sample(1:202, 10, replace=false)
        @everywhere w = get_weights(ran)                  #### keras model
        @everywhere gg = _genotype(ran, w)                ### composite type
        @everywhere m, v = get_mean_variance(gg)          #### func doing calculation
        @everywhere pp = _phenotype(m, v)                 ### composite type
        @everywhere fitn = get_fitness(pp)                #### func doing calculation
        @everywhere new_guy = _individual(gg, pp, fitn)   ### composite type
        @everywhere push!(pop.individuals, new_guy)
    end
    return pop
end

The error I am getting:

ERROR: LoadError: UndefVarError: sample not defined
eval at ./boot.jl:235
eval_ew_expr at ./distributed/macros.jl:116 [inlined]
#135 at ./distributed/remotecall.jl:319
run_work_thunk at ./distributed/process_messages.jl:56
#remotecall_fetch#140 at ./distributed/remotecall.jl:344
remotecall_fetch at ./distributed/remotecall.jl:344
#remotecall_fetch#144 at ./distributed/remotecall.jl:372
remotecall_fetch at ./distributed/remotecall.jl:372
#33 at ./distributed/macros.jl:102
#remotecall_fetch#140(::Array{Any,1}, ::Function, ::Function, ::Base.Distributed.LocalProcess, ::Expr, ::Vararg{Expr,N} where N) at ./distributed/remotecall.jl:345
remotecall_fetch(::Function, ::Base.Distributed.LocalProcess, ::Expr, ::Vararg{Expr,N} where N) at ./distributed/remotecall.jl:344
#remotecall_fetch#144(::Array{Any,1}, ::Function, ::Function, ::Int64, ::Expr, ::Vararg{Expr,N} where N) at ./distributed/remotecall.jl:372
remotecall_fetch(::Function, ::Int64, ::Expr, ::Vararg{Expr,N} where N) at ./distributed/remotecall.jl:372
(::##73#75)() at ./distributed/macros.jl:102
Stacktrace:
 [1] sync_end() at ./task.jl:287
 [2] macro expansion at ./distributed/macros.jl:112 [inlined]
 [3] evolutionary_loop(::_population) at ./untitled-75c3e04a7f530386f03caa1b6d061e62:372
 [4] include_string(::String, ::String) at ./loading.jl:522
 [5] include_string(::Module, ::String, ::String) at /Users/yash/.julia/v0.6/Compat/src/Compat.jl:88
 [6] (::Atom.##112#116{String,String})() at /Users/yash/.julia/v0.6/Atom/src/eval.jl:109
 [7] withpath(::Atom.##112#116{String,String}, ::Void) at /Users/yash/.julia/v0.6/CodeTools/src/utils.jl:30
 [8] withpath(::Function, ::String) at /Users/yash/.julia/v0.6/Atom/src/eval.jl:38
 [9] hideprompt(::Atom.##111#115{String,String}) at /Users/yash/.julia/v0.6/Atom/src/repl.jl:67
 [10] macro expansion at /Users/yash/.julia/v0.6/Atom/src/eval.jl:106 [inlined]
 [11] (::Atom.##110#114{Dict{String,Any}})() at ./task.jl:80
while loading untitled-75c3e04a7f530386f03caa1b6d061e62, in expression starting on line 395

I am not sure how remote calls work. I basically want to run the for loop across 16 processes (pop.size = 100), and I need all of them to push into the same array.

Any help is greatly appreciated.


1 Answer

2 votes

Your code is missing @everywhere using StatsBase. Since each worker is a separate process, the StatsBase module (which provides sample) has to be loaded on all workers, not only on the master process.
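A minimal sketch of the fix (run it once at top level, after the workers exist and before the loop is called):

    addprocs(16)                  # or start Julia with `julia -p 16`
    @everywhere using StatsBase   # makes sample available on every worker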

If you use a @parallel loop you need neither @sync nor @everywhere inside the loop. @parallel simply divides the iteration range across the workers and executes each part on its worker. Depending on what you want to do, you are probably also missing an aggregator function, so the usual form is:

@parallel (my_agg_function) for i in 1:n
   # do something - job will be evenly split across workers
end 
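
Applied to your loop, a sketch could look like this (assuming your helper functions and composite types have already been defined on every worker with @everywhere):

    function init_population(pop::_population)
        # each iteration returns a one-element array; (vcat) glues the
        # pieces coming from all workers into one array on the master
        new_individuals = @parallel (vcat) for i in 1:pop.size
            ran = sample(1:202, 10, replace=false)
            w = get_weights(ran)
            gg = _genotype(ran, w)
            m, v = get_mean_variance(gg)
            pp = _phenotype(m, v)
            [_individual(gg, pp, get_fitness(pp))]
        end
        append!(pop.individuals, new_individuals)
        return pop
    end

Note that the workers cannot push! into pop.individuals directly - each worker has its own copy of pop - so the results have to be collected on the master, e.g. with the (vcat) reducer above.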

Please also consider using pmap instead of @parallel.
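With pmap the same idea could be written roughly like this (again assuming the helper functions and types are defined with @everywhere; make_individual is just an illustrative name):

    @everywhere function make_individual(i)
        ran = sample(1:202, 10, replace=false)
        w = get_weights(ran)
        gg = _genotype(ran, w)
        m, v = get_mean_variance(gg)
        pp = _phenotype(m, v)
        return _individual(gg, pp, get_fitness(pp))
    end

    function init_population(pop::_population)
        # pmap hands out one task per worker at a time, which is a good
        # fit when a single iteration (a Keras call) is expensive
        append!(pop.individuals, pmap(make_individual, 1:pop.size))
        return pop
    end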

@everywhere executes the given command on all workers. In parallel simulations it is usually used for things such as initializing variables/simulation state or importing libraries. Please note that if you wish to send data across workers you might want to use ParallelDataTransfer.jl.
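For example (the sendto call assumes ParallelDataTransfer.jl's exported API; weights_table is just a hypothetical variable):

    @everywhere using StatsBase          # load packages on every worker
    @everywhere const POOL_SIZE = 202    # define shared constants everywhere

    using ParallelDataTransfer
    weights_table = rand(POOL_SIZE)                    # data built on the master
    sendto(workers(), weights_table = weights_table)   # copy it to all workers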

Last but not least, addprocs(16) inside a function is usually not a good pattern - 16 new Julia processes will be spawned each time the function is invoked. Use the -p command line option instead (e.g. start Julia with the julia -p 16 command).