Hi all-

I am trying to load a file of functions on a cluster of computers. In the 
past, I used require() (now deprecated) and the sendto() function described 
here <http://stackoverflow.com/questions/27677399/julia-how-to-copy-data-to-another-processor-in-julia> 
to make a data variable available on all workers. (Note that I cannot simply 
load the data when the program initializes, because the data will change 
outside of the module, eventually arriving as a stream from another 
program; so speed and flexibility are imperative.) As recommended here 
<https://groups.google.com/forum/#!searchin/julia-users/$20require/julia-users/6zBKw4nd20I/5JLt7Ded0zkJ>, 
I defined a module containing the functions and used "using MyModule" to 
make it available on the workers. The major limitation of this approach 
seems to be that data sent with sendto() is not visible to the functions 
inside the module. I suspect this is because modules are encapsulated from 
outside variables and functions. Bearing that in mind:
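For context, the sendto() pattern from the Stack Overflow answer looks roughly 
like this (a minimal sketch, not my real code; the worker id, the variable 
name MyData, and the values are placeholders):

```julia
using Distributed   # in current Julia; older versions had these in Base
addprocs(2)

# sendto(): assign a value to a global in Main on a given worker
# (adapted from the Stack Overflow answer linked above; wait() makes
# the assignment synchronous so later calls can rely on it)
function sendto(p::Int; args...)
    for (nm, val) in args
        wait(@spawnat(p, Core.eval(Main, Expr(:(=), nm, val))))
    end
end

sendto(2, MyData = [1.0, 2.0, 3.0])

# The variable is now visible as Main.MyData on worker 2:
fetch(@spawnat 2 sum(Main.MyData))
```

The trouble is that MyData lands in Main on each worker, while the functions 
live inside MyModule.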


1. Is there a way around this problem using the module method? 

2. Alternatively, is there a way I can make the functions and packages 
available to the workers without using modules? Perhaps something akin to 
the old require method? 

3. Or is there a way to send the data via map() along with my function and 
distributed array? Essentially, my code loads stored inputs for numerous 
kernel density functions and converts them to a distributed array of 
arrays. For example:

map(EvalKDFs, MyDistArray)

Each time the above is called, "MyData" needs to be available to EvalKDFs. 
However, map(EvalKDFs, MyDistArray, MyData) does not work, because MyData 
is a single array while MyDistArray contains many arrays.
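To make the shape of the problem concrete, here is a stripped-down sketch 
(EvalKDF's body, the array sizes, and MyData are stand-ins for my actual 
kernel density code). One shape that at least type-checks is a closure that 
captures MyData, but then the captured data is serialized to the workers on 
every call, which is exactly the overhead I am trying to avoid:

```julia
using Distributed
addprocs(2)
@everywhere using DistributedArrays  # DArray lives in this package in current Julia

# Stand-in for the stored KDF inputs: a distributed array of arrays
MyDistArray = distribute([rand(5) for _ in 1:4])

# Stand-in for EvalKDFs: needs one element plus the shared MyData
@everywhere EvalKDF(x, data) = sum(x) + sum(data)

MyData = ones(3)

# map(EvalKDF, MyDistArray, MyData) fails: one MyData, many elements.
# A closure capturing MyData runs, but ships MyData with every call:
result = map(x -> EvalKDF(x, MyData), MyDistArray)
```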

I can post a stripped-down version of my code if my description does not 
suffice.

Any help would be greatly appreciated. 
