Commit 4fd3f99

update readme / and docker files

1 parent 7059d63

13 files changed (+164, -30 lines)

.dockerignore

+2-1
@@ -5,4 +5,5 @@ research
 tensorboard
 agents
 data/tensorboard
-data/agents
+data/agents
+data/postgres

.gitignore

+1
@@ -2,6 +2,7 @@
 **/__pycache__
 data/tensorboard/*
 data/agents/*
+data/postgres/*
 data/log/*
 *.pkl
 *.db

README.md

+73-4
@@ -3,6 +3,7 @@
 [![Build Status](https://travis-ci.org/notadamking/RLTrader.svg?branch=master)](https://travis-ci.org/notadamking/RLTrader)
 [![PRs Welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg?style=flat-square)](http://makeapullrequest.com)
 [![GPL Licence](https://badges.frapsoft.com/os/gpl/gpl.svg?v=103)](https://opensource.org/licenses/GPL-3.0/)
+[![Python 3.6](https://img.shields.io/badge/python-3.6-blue.svg)](https://www.python.org/downloads/release/python-360/)
 
 In this series of articles, we've created and optimized a Bitcoin trading agent to be highly profitable using deep reinforcement learning.
 
@@ -19,28 +20,96 @@ https://towardsdatascience.com/using-reinforcement-learning-to-trade-bitcoin-for
 
 # Getting Started
 
-The first thing you will need to do to get started is install the requirements in `requirements.txt`.
+The first thing you will need to do to get started is install the requirements. If your system has an NVIDIA GPU, you should start by using:
 
 ```bash
 pip install -r requirements.txt
 ```
 
-The requirements include the `tensorflow-gpu` library, though if you do not have access to a GPU, you should replace this requirement with `tensorflow`.
+If you have another type of GPU or you simply want to use your CPU, use:
+
+```bash
+pip install -r requirements.no-gpu.txt
+```
+
+Update your current static files, which are used by default:
+```bash
+python update_data.py
+```
+
+Afterwards you can see the currently available options:
+
+```bash
+python ./cli.py --help
+```
+
+or simply run the project with the default options:
+
+```bash
+python ./cli.py opt-train-test
+```
+
+### Testing with vagrant
+
+Start the vagrant box using:
+```bash
+vagrant up
+```
+
+The code will be located at /vagrant. Play and/or test with whatever package you wish.
+Note: with vagrant you cannot take full advantage of your GPU, so it is mainly for testing purposes.
+
+
+### Testing with docker
+
+If you want to run everything within a docker container, then just use:
+```bash
+./run-with-docker (cpu|gpu) (yes|no) opt-train-test
+```
+- cpu - start the container using the CPU requirements
+- gpu - start the container using the GPU requirements
+- yes | no - whether or not to start a local postgres container
+Note: when using yes as the second argument, use
+
+```bash
+python ./cli.py --params-db-path "postgres://rl_trader:rl_trader@localhost" opt-train-test
+```
+
+The database and its data are persisted under `data/postgres` locally.
+
+If you want to spin up a docker test environment:
+```bash
+./run-with-docker (cpu|gpu) (yes|no)
+```
+
+If you want to run the existing tests, then just use:
+```bash
+./run-tests-with-docker
+```
+
+# Fire up a local docker dev environment
 
 # Optimizing, Training, and Testing
 
 While you could just let the agent train and run with the default PPO2 hyper-parameters, your agent would likely not be very profitable. The `stable-baselines` library provides a great set of default parameters that work for most problem domains, but we need to do better.
 
-To do this, you will need to run `optimize.py`.
+To do this, you will need to run `cli.py`.
 
 ```bash
-python ./optimize.py
+python ./cli.py opt-train-test
 ```
 
 This can take a while (hours to days depending on your hardware setup), but over time it will print to the console as trials are completed. Once a trial is completed, it will be stored in `./data/params.db`, an SQLite database, from which we can pull hyper-parameters to train our agent.
 
 From there, you can train an agent with the best set of hyper-parameters, and later test it on completely new data to verify the generalization of the algorithm.
 
+# Common troubleshooting
+
+##### The specified module could not be found.
+Normally this is caused by a missing MPI module. You should install it according to your platform:
+- Windows: https://docs.microsoft.com/en-us/message-passing-interface/microsoft-mpi
+- Linux/MacOS: https://www.mpich.org/downloads/
+
 # Project Roadmap
 
 If you would like to contribute, here is the roadmap for the future of this project. To assign yourself to an item, please create an Issue/PR titled with the item from below and I will add your name to the list.
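For reference, a minimal non-Docker sketch of the workflow the updated README describes; both commands appear in the diff above, and the SQLite path is the CLI's documented default:

```bash
# default: trial results go to the local SQLite store at data/params.db
python ./cli.py opt-train-test

# alternative: point the CLI at the local Postgres container started by ./run-with-docker
python ./cli.py --params-db-path "postgres://rl_trader:rl_trader@localhost" opt-train-test
```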

Vagrantfile

+1-1
@@ -52,7 +52,7 @@ Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
     vm_config.vm.synced_folder '.', '/vagrant', disabled: false
     vm_config.vm.provision "default setup", type: "shell", inline: <<SCRIPT
 apt update
-apt install mpich
+apt install mpich libpq-dev
 DEBIAN_FRONTEND=noninteractive apt install python3-pip
 pip3 install -r /vagrant/requirements.no-gpu.txt
 SCRIPT
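A hedged sketch of the Vagrant test flow mentioned in the README; `vagrant ssh` is the standard way into the box, and using `python3` inside it is an assumption based on the provisioning above:

```bash
vagrant up               # provision the box (installs mpich, libpq-dev and the no-GPU requirements)
vagrant ssh              # open a shell inside the box
cd /vagrant              # the project is synced here
python3 ./cli.py --help  # inspect the available commands
```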

dev-with-docker

+68
@@ -0,0 +1,68 @@
+#!/usr/bin/env bash
+
+set -e
+
+SCRIPT_DIR=$(dirname "${BASH_SOURCE[0]}")
+CWD=$(realpath "${SCRIPT_DIR}")
+
+if [[ -z $1 ]]; then
+  echo "Should have 1 argument: cpu or gpu"
+  exit
+fi
+
+TYPE=$1
+shift;
+
+if [[ -n $2 ]]; then
+  docker build \
+    --tag 'trader-rl-postgres' \
+    --build-arg ID=$(id -u) \
+    --build-arg GI=$(id -g) \
+    -f "$CWD/docker/Dockerfile.backend" "$CWD"
+
+  mkdir -p "$CWD/data/postgres"
+  docker run \
+    --detach \
+    --publish 5432:5432 \
+    --tty \
+    --user "$(id -u):$(id -g)" \
+    --volume "$CWD/data/postgres":"/var/lib/postgresql/data/trader-data" \
+    trader-rl-postgres-dev
+  shift
+fi
+
+if [[ $TYPE == 'gpu' ]]; then
+  GPU=1
+else
+  GPU=0
+fi
+
+MEM=$(cat /proc/meminfo | grep 'MemTotal:' | awk '{ print $2 }')
+CPUS=$(cat /proc/cpuinfo | grep -P 'processor.+[0-7]+' | wc -l)
+
+MEM_LIMIT=$((MEM/4*3))
+CPU_LIMIT=$((CPUS/4*3))
+
+if [ $CPU_LIMIT == 0 ];then
+  CPU_LIMIT=1
+fi
+
+if [ $GPU == 0 ]; then
+  N="trader-rl-cpu-dev"
+  docker build --tag $N -f "$CWD/docker/Dockerfile.cpu" "$CWD"
+else
+  N="trader-rl-gpu-dev"
+  docker build --tag $N -f "$CWD/docker/Dockerfile.gpu" "$CWD"
+fi
+
+docker rm -fv rl_trader_dev || true
+docker run \
+  --name 'rl_trader_dev' \
+  --user $(id -u):$(id -g) \
+  --entrypoint 'bash' \
+  --interactive \
+  --memory "${MEM_LIMIT}b" \
+  --cpus "${CPU_LIMIT}" \
+  --tty \
+  --volume "${CWD}":/code \
+  "$N"

docker/Dockerfile.backend

+4-3
@@ -3,8 +3,9 @@ FROM postgres:11-alpine
 ARG ID=1000
 ARG GI=1000
 
-ENV POSTGRES_PASSWORD=rl-trader
-ENV POSTGRES_DB='rl-trader'
+ENV POSTGRES_USER=rl_trader
+ENV POSTGRES_PASSWORD=rl_trader
+ENV POSTGRES_DB='rl_trader'
 ENV PGDATA=/var/lib/postgresql/data/trader-data
 
-RUN adduser -D -u $ID btct
+RUN adduser -D -u $ID rl_trader
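A quick connectivity check against the renamed credentials above, assuming a local `psql` client and the 5432 port published by the run scripts (this command is not part of the commit):

```bash
psql "postgres://rl_trader:rl_trader@localhost:5432/rl_trader" -c 'SELECT 1;'
```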

docker/Dockerfile.cpu

+1-1
@@ -5,7 +5,7 @@ ADD ./requirements.no-gpu.txt /code/requirements.txt
 WORKDIR /code
 
 RUN apt-get update \
-  && apt-get install -y build-essential mpich
+  && apt-get install -y build-essential mpich libpq-dev
 
 # should merge to top RUN to avoid extra layers - for debug only :/
 RUN pip install -r requirements.txt

docker/Dockerfile.gpu

+1-1
@@ -5,5 +5,5 @@ ADD ./requirements.txt /code/
 WORKDIR /code
 
 RUN apt-get update \
-  && apt-get install -y build-essential mpich \
+  && apt-get install -y build-essential mpich libpq-dev \
   && pip install -r requirements.txt

docker/Dockerfile.tests

+1-1
@@ -6,5 +6,5 @@ ADD ./requirements.tests.txt /code/requirements.txt
 WORKDIR /code
 
 RUN apt-get update \
-  && apt-get install -y build-essential mpich \
+  && apt-get install -y build-essential mpich libpq-dev \
   && pip install --progress-bar off --requirement requirements.txt

lib/cli/RLTraderCLI.py

+2-2
@@ -14,6 +14,8 @@ def __init__(self):
         self.parser.add_argument('--mini-batches', type=int, default=1, help='Mini batches', dest='nminibatches')
         self.parser.add_argument('--train-split-percentage', type=int, default=0.8, help='Train set percentage')
         self.parser.add_argument('--verbose-model', type=int, default=1, help='Verbose model')
+        self.parser.add_argument('--params-db-path', type=str, default='sqlite:///data/params.db',
+                                 help='Params path')
         self.parser.add_argument(
             '--tensor-board-path',
             type=str,
@@ -35,8 +37,6 @@ def __init__(self):
         optimize_parser.add_argument('--trials', type=int, default=1, help='Number of trials')
         optimize_parser.add_argument('--parallel-jobs', type=int, default=1, help='How many jobs in parallel')
 
-        optimize_parser.add_argument('--params-db-path', type=str, default='sqlite:///data/params.db',
-                                     help='Params path')
         optimize_parser.add_argument('--verbose-model', type=int, default=1, help='Verbose model', dest='model_verbose')
 
         train_parser = subparsers.add_parser('train', description='Train model')
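Since `--params-db-path` moves from the optimize subparser to the top-level parser, every subcommand can now select the params store. A hedged sketch (as a top-level argparse option, the flag goes before the subcommand):

```bash
# default SQLite store
python ./cli.py opt-train-test

# shared Postgres store, now usable by other subcommands such as train
python ./cli.py --params-db-path "postgres://rl_trader:rl_trader@localhost" train
```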

optimize.py

-11
This file was deleted.

requirements.txt

+2-1
@@ -9,4 +9,5 @@ ta
 statsmodels==0.10.0rc2
 empyrical
 tensorflow-gpu
-ccxt
+ccxt
+psycopg2
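The new `psycopg2` dependency builds against libpq, which is why `libpq-dev` is added to the Dockerfiles and the Vagrantfile in this commit. On a bare Debian/Ubuntu host the equivalent would be (a sketch, assuming apt and pip are available):

```bash
sudo apt-get install -y build-essential libpq-dev
pip install psycopg2
```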

run-with-docker

+8-4
@@ -2,14 +2,16 @@
 
 set -e
 
-SCRIPT_DIR=$(dirname "${BASH_SOURCE[0]}")
-CWD=$(realpath "${SCRIPT_DIR}")
+CWD="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
 
 if [[ -z $1 ]]; then
   echo "Should have 1 argument: cpu or gpu"
   exit
 fi
 
+TYPE=$1
+shift;
+
 if [[ -n $2 ]]; then
   docker build \
     --tag 'trader-rl-postgres' \
@@ -20,13 +22,15 @@ if [[ -n $2 ]]; then
   mkdir -p "$CWD/data/postgres"
   docker run \
     --detach \
+    --publish 5432:5432 \
     --tty \
     --user "$(id -u):$(id -g)" \
     --volume "$CWD/data/postgres":"/var/lib/postgresql/data/trader-data" \
     trader-rl-postgres
+  shift
 fi
 
-if [[ $1 == 'gpu' ]]; then
+if [[ $TYPE == 'gpu' ]]; then
   GPU=1
 else
   GPU=0
@@ -59,4 +63,4 @@ docker run \
   --tty \
   --volume "${CWD}":/code \
   "$N" \
-  python /code/cli.py opt-train-test
+  python /code/cli.py $@
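With the last line changed to forward `$@`, anything after the first two arguments is passed straight to `cli.py` inside the container; a couple of hedged examples:

```bash
# run the default pipeline, as before
./run-with-docker cpu yes opt-train-test

# forward any other cli.py invocation, e.g. just print the help text
./run-with-docker cpu yes --help
```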
