Commit f577191

update readme

1 parent 6f7573c

1 file changed: +15 −8 lines


README.md (+15 −8)
@@ -14,28 +14,35 @@ downloading datasets.
 
 ## Installation
 
-Currently it is not registred so you can install it with the url.
 ```julia
-] add https://github.com/albheim/DistributedEnvironments.jl
+julia> ] add DistributedEnvironments
 ```
 
-## Example
+## Usage
 
 Make sure the current active environment is the one that should be copied.
 
 ```julia
 using DistributedEnvironments
 
-nodes = ["10.0.0.1", "otherserver"]
-@initcluster nodes
+machines = ["10.0.0.1", "otherserver"]
+@initcluster machines # Copies environment and sets up workers on all machines
 
-@everywhere using SomePackage
+@everywhere using DelimitedFiles # Want this loaded on all machines
+@eachmachine download("somepage.com/somedata.csv") # If each worker wants same data we only need to download once per machine
+@everywhere data = readdlm("somedata.csv", ',') # Want to read the data everywhere
 ...
 ```
 
-For example, one could run hyperparameter optimization using the `@phyperopt` macro from [Hypteropt.jl](https://github.com/baggepinnen/Hyperopt.jl)
+## Example
+
+One could for example run hyperparameter optimization using the `@phyperopt` macro from [Hypteropt.jl](https://github.com/baggepinnen/Hyperopt.jl)
 ```julia
-... # Initial setup as above
+using DistributedEnvironments
+
+machines = ["10.0.0.1", "otherserver"]
+@initcluster machines
+
 @everywhere using Hyperopt, Flux, MLDatasets, Statistics
 @eachmachine MNIST.download(i_accept_the_terms_of_use=true)
 ```
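The added Example section ends at the hunk boundary, before the `@phyperopt` loop itself. Below is a minimal sketch of how that example might continue, assuming Hyperopt.jl's `@phyperopt` for-loop syntax; the `mnist_accuracy` objective, the hyperparameter ranges, and the training code are illustrative placeholders that are not part of this commit, and the MLDatasets calls follow the same pre-0.7 `MNIST.traindata` API as the `MNIST.download` call above.

```julia
# Hypothetical continuation of the README example above (not part of this commit).
# Assumes the cluster setup shown in the diff has already run:
#   @initcluster machines
#   @everywhere using Hyperopt, Flux, MLDatasets, Statistics
#   @eachmachine MNIST.download(i_accept_the_terms_of_use=true)

# Define the objective on every worker so @phyperopt can evaluate it remotely.
@everywhere function mnist_accuracy(lr, epochs)
    xtrain, ytrain = MNIST.traindata(Float32)   # pre-0.7 MLDatasets API
    xtest, ytest = MNIST.testdata(Float32)
    x = Flux.flatten(xtrain[:, :, 1:5_000])     # small subset keeps the sketch fast
    y = Flux.onehotbatch(ytrain[1:5_000], 0:9)
    model = Chain(Dense(784, 32, relu), Dense(32, 10))
    loss(a, b) = Flux.logitcrossentropy(model(a), b)
    opt = ADAM(lr)
    for _ in 1:epochs
        Flux.train!(loss, Flux.params(model), [(x, y)], opt)
    end
    mean(Flux.onecold(model(Flux.flatten(xtest)), 0:9) .== ytest)
end

# @phyperopt spreads the 20 evaluations over the workers started by @initcluster.
ho = @phyperopt for i = 20,
        lr = exp10.(LinRange(-4, -1, 100)),
        epochs = [1, 2, 5]
    -mnist_accuracy(lr, epochs)   # Hyperopt minimizes, so negate the accuracy
end

ho.minimizer, ho.minimum   # best (lr, epochs) and the corresponding objective value
```

Since `@initcluster` already copies the active environment to every machine, the only per-machine preparation the sketch relies on is the `@eachmachine MNIST.download(...)` call from the diff.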
