Code installation
It is recommended to use either Linux or Mac for development. The Windows WSL 2 backend should work, but can run out of memory. Alternatively, you can develop natively on Windows instead.
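If you do use the WSL 2 backend and hit memory problems, one common mitigation (a general WSL 2 setting, not specific to Autoreduction) is to limit the memory given to the WSL 2 VM via a .wslconfig file on the Windows side, for example:
# %UserProfile%\.wslconfig on the Windows side; values are illustrative
[wsl2]
memory=8GB
swap=4GB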
Use Miniconda, mamba, or venv for environment management. You can quickly install Miniconda by running the following:
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O /tmp/Miniconda3-latest-Linux-x86_64.sh
bash /tmp/Miniconda3-latest-Linux-x86_64.sh -b -p ~/miniconda
source ~/miniconda/bin/activate ~/miniconda
You will also need to install Docker for containerisation.
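One possible way to install Docker Engine on Linux is the upstream convenience script (a sketch only; check the official Docker documentation for your distribution):
curl -fsSL https://get.docker.com -o get-docker.sh
sh get-docker.sh
sudo usermod -aG docker "$USER"   # optional: run docker without sudo (requires re-login)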
Run the following:
git clone https://github.com/autoreduction/workspace
cd workspace
bash runme.sh # will checkout all the repositories
It is recommended to keep all Autoreduction repositories in a subfolder as this makes building container images faster (due to the context sent to the daemon being smaller).
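As a rough illustration (assuming runme.sh checks the repositories out under the workspace folder, with directory names matching the install targets used below), the layout would look something like:
workspace/
├── actions/
├── db/
├── frontend/
├── queue-processor/
├── rest-api/
├── scripts/
└── utils/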
The commands below will install the packages as editable (-e). Making them editable means that development changes can be tested without re-publishing the base packages and reinstalling them. Note that switching some of them to a different branch than the one being worked on can cause errors. See the PR testing section below to work around this.
- Make a new Python environment via virtualenv, venv, conda/mamba, or any other virtual environment tool (see the sketch below).
- Go into the folder where the repositories were checked out, and install them as editable.
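For example, a fresh conda environment could be created and activated like this (the environment name and Python version are illustrative):
conda create -n autoreduce-dev python=3.10 -y
conda activate autoreduce-dev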
Note: You will need to wrap the package in quotes if you are using zsh instead of bash. e.g. pip install -e 'utils[dev]'
pip install -e utils[dev]
pip install -e db[dev]
pip install -e scripts[dev]
pip install -e queue-processor[dev]
pip install -e rest-api[dev]
pip install -e frontend[dev]
- To check all repositories have been installed as editable, run pip freeze | grep auto and compare with the following:
-e git+https://github.com/autoreduction/autoreduce-db@7a11d1ee210836f1b358baf727ef52ccec7882b4#egg=autoreduce_db
-e git+https://github.com/autoreduction/autoreduce-frontend@1606644e35b2650b821e1b19dc429c01a1cd9dc9#egg=autoreduce_frontend
-e git+https://github.com/autoreduction/autoreduce@bef2219813cfcb8c08352f9be10f67625e66e42b#egg=autoreduce_qp
-e git+https://github.com/autoreduction/autoreduce-rest-api@b432701242a575e96f1804bdc00178723e1d1261#egg=autoreduce_rest_api
-e git+https://github.com/autoreduction/autoreduce-scripts@734ba68b576a968fb250d5f8fa8e793dcea9b2dc#egg=autoreduce_scripts
-e git+https://github.com/autoreduction/autoreduce-utils@3499829d90822e545c5834d953fcb96c9de2055a#egg=autoreduce_utils
- Development of Autoreduce uses direnv for environment variables, which populate the application with development settings. Run direnv to see if it is installed already. If you receive a message similar to "command not found", you will need to install it by running:
curl -sfL https://direnv.net/install.sh | bash
Then add the following line at the end of the ~/.bashrc file:
eval "$(direnv hook bash)"
Restart your shell. Finally, in the autoreduce-workspace root folder, run:
direnv allow
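To confirm the hook is picking up the environment, you can run the following from inside the workspace folder (the path is wherever you cloned it):
cd workspace      # or wherever the workspace repository was cloned
direnv status     # should report the .envrc as allowed and loaded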
- Run make in autoreduce-frontend. This will create and apply the database migrations, creating a file at ~/.autoreduce/dev/sqlite3.db
- Run the server with autoreduce-webapp-manage runserver. By default it will run in development mode, which will put logs and the SQLite3 database file at ~/.autoreduce
The actions repo contains most of the packages needed for linting/formatting/testing (e.g. pylint-django, pytest). If you wish to install these to use locally, run the following in the workspace directory:
pip install -e actions[dev]
To avoid headaches, it is advised to do PR testing in another clone of the repositories, installed in a different virtual environment. Essentially, repeat getting the code and installing Autoreduction, but in a different folder, as sketched below.
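A rough sketch of that second checkout (folder and environment names are purely illustrative):
git clone https://github.com/autoreduction/workspace workspace-pr-testing
cd workspace-pr-testing
bash runme.sh
conda create -n autoreduce-pr python=3.10 -y
conda activate autoreduce-pr
# then repeat the editable installs from above, e.g. pip install -e 'utils[dev]'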
Before testing anything, make sure you have migrated the DB with fixtures: run the Migrate database (with PR fixtures) task, or run make migrate-with-fixtures in autoreduce-frontend/
Running the webapp is enough to test simple UI changes and webapp-only features. For bigger features that require interaction with the backend (e.g. reductions, reruns), you'll have to run Kafka and the Consumer.
In some cases you may want a clean database, without any fixtures (e.g. if you are cloning the production DB). For that, the VSCode task Migrate database (clean) or make migrate can be run.
Because the whole database is stored in the sqlite3.db file, resetting everything is as simple as deleting or renaming the ~/.autoreduce/dev/sqlite3.db file and migrating again.
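A minimal sketch of that reset (the backup filename is arbitrary):
mv ~/.autoreduce/dev/sqlite3.db ~/.autoreduce/dev/sqlite3.db.bak   # or delete it instead
cd autoreduce-frontend
make migrate-with-fixtures   # or `make migrate` for a clean database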
None of the rerun/configure new runs functionality is enabled without a reduce.py and reduce_vars.py for the instrument. With the test fixtures the instrument name is TestInstrument, and to make the required files with some sample code run:
mkdir -p ~/.autoreduce/dev/data-archive/NDXTESTINSTRUMENT/user/scripts/autoreduction
echo 'standard_vars = {"variable1":"value1"}' > ~/.autoreduce/dev/data-archive/NDXTESTINSTRUMENT/user/scripts/autoreduction/reduce_vars.py
echo 'def main(input_file, output_dir): print("Hello")' > ~/.autoreduce/dev/data-archive/NDXTESTINSTRUMENT/user/scripts/autoreduction/reduce.py
You will need the Kafka broker running locally so that the producer & consumer can connect to it. Environment variables are used for credentials.
With make:
- In autoreduce/, run make kafka
With VSCode:
- From the tasks, run Run Kafka Docker daemon
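Either way, you can sanity-check that the broker container is up (the container name depends on how it was started):
docker ps   # the Kafka broker container should appear in the list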
Once you've got the DB & fixtures installed, you can run the Queue Processor & Web App by:
- Going to the Run & Debug tab (CTRL+SHIFT+D default keybind) in VSCode and starting the Run Kafka Consumer and Run webapp tasks. All you need to do afterwards is CTRL-click the http://127.0.0.1:8000/ link in the terminal.
If not using VSCode and running from a terminal:
- Make sure to activate your environment
- Start the Queue Processor with python autoreduce_qp/queue_processor/confluent_consumer.py
- Start the webapp with python autoreduce_frontend/manage.py runserver (you will need a second terminal for the webapp)
There are a few options for getting Mantid available to run reductions:
- The recommended approach is to run the autoreduction/qp Docker container locally. The hosted autoreduction/qp image may be out of date, as updates get published before a new cycle, so it's recommended you build the autoreduction/qp image from autoreduce-containers locally.
- Using the system Python and installing Mantid following the relevant installation instructions.
  - Note: you won't be able to use a virtualenv out of the box, since installing Mantid through the system package manager will use /usr/bin/python3. To use a virtualenv you will have to build Mantid from source.
- Using a conda environment. Create a new environment based on either scipp's mantid-framework or mantid's mantid-framework; I'd suggest using whichever is newer (see the sketch after this list).
  - Once you have Mantid installed, just install the autoreduce packages into it and they should be picked up. Note that you don't have access to any GUI features, including the easier workspace plotting.
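For the conda route, a minimal sketch using the package and channel named above (pin whichever of the two sources is newer; names may change over time):
conda create -n mantid-autoreduce -c mantid mantid-framework -y
conda activate mantid-autoreduce
# then install the autoreduce packages into this environment as editable, as described above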