
Alfresco Ansible Deployment


This project provides Ansible playbooks capable of deploying Alfresco Content Services (ACS) with different configuration flavours.

The user documentation is available on GitHub Pages.

Developers guide

This page is a developers guide to the most common commands used when setting up an environment for development, testing and release.

Introduction to pipenv

The general purpose of pipenv is similar to that of pip (the package installer for Python) combined with the built-in venv module. It is an external Python library that installs packages from the command line (with a pipenv prefix in the shell, similar to pip install), from a requirements.txt or from a Pipfile. Whether you install a single package or a set of dependencies from requirements.txt, pipenv records them in a Pipfile and installs them inside a virtual environment, NOT globally. When you add a new package, pipenv adds its name to the Pipfile, generates hashes for the Pipfile.lock file and installs the package into the virtual environment, where it then becomes available for use. By default the virtual environment is associated with your working directory. For more details see https://pipenv.pypa.io/en/latest/

NOTE: Pipenv does not install packages globally but into a virtual environment
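For example, adding a new package and then using it looks roughly like this (a minimal sketch; ansible-lint is only an illustrative package and is not necessarily part of this project's Pipfile):

pipenv install ansible-lint        # adds ansible-lint to the Pipfile, updates Pipfile.lock and installs it in the virtual environment
pipenv run ansible-lint --version  # runs the tool from the virtual environment, not from a global installation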

Basic pipenv usage

Pipenv comes with a bunch of commands, but the most important ones are highlighted below. The command with the --dev flag installs the packages from the Pipfile that are needed for development purposes.

pipenv install --dev --python $(cat .python-version)
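To double-check what was created, pipenv can also report the location of the virtual environment and the installed dependency graph (shown here as a quick sketch using standard pipenv commands):

pipenv --venv   # prints the path of the virtual environment backing this directory
pipenv graph    # lists the packages installed in that environment and their dependencies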

The command below opens the virtual environment that pipenv created while installing the packages. Once inside this environment, you can use all the packages specified in the install command. This is the environment in which it is highly recommended to do your Python development.

NOTE: Right now pipenv does not support having two virtual environments in the same directory, so if you try to work with two different environments within the same directory, you will overwrite the previously created virtual environment.

pipenv shell
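A typical session looks roughly like this (a sketch only; molecule is used as an example of a package installed through the --dev flag):

pipenv shell          # spawns a new shell inside the virtual environment
molecule --version    # packages from the Pipfile are now directly available on the PATH
exit                  # leaves the virtual environment shell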

Otherwise, if you are not planning to use the virtual environment shell and simply need to run molecule (or any other package), you can use the commands below, which execute them inside pipenv's virtual environment:

pipenv run command

This runs the given command in the project's virtual environment, for example:

pipenv run molecule test

This uses the molecule package installed in the virtual environment to execute the test sequence.

NOTE: This runs the specified command inside the virtual environment. Always make sure you invoke pipenv run from the directory where you previously executed pipenv install.
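The same pattern works for any other development tool declared in the Pipfile; for instance, assuming pre-commit is among the dev dependencies (an assumption of this sketch, not a guarantee), the hooks can be run with:

pipenv run pre-commit run --all-files   # runs all configured pre-commit hooks against the whole repository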

Development

The roles developed for this playbook are tested with Molecule.

Roles tests

NOTE: REMEMBER THAT THESE COMMANDS NEED TO BE RUN INSIDE THE VIRTUAL ENVIRONMENT; IF NOT, PREFIX THEM WITH pipenv run

You can run Molecule tests on your machine if you have a Docker Engine installed locally.

Enter the role folder and run molecule <action> (see [official docs](https://molecule.readthedocs.io/en/latest/getting-started.html#run-test-sequence-commands)).

To provision the activemq role run:

cd roles/activemq
molecule converge

To execute the tests after converge has run successfully:

molecule verify

To enter the container and manually inspect its state:

molecule login

To destroy the container and release resources at the end:

molecule destroy

If you want to test a different operating system, set the MOLECULE_ROLE_IMAGE environment variable to a different Docker base image before converging:

MOLECULE_ROLE_IMAGE=ubuntu:20.04 molecule converge
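The individual steps above can also be run as one sequence with molecule test, which creates the instance, converges, verifies and finally destroys it. A minimal sketch, reusing the image from the previous example and the pipenv run prefix:

cd roles/activemq
MOLECULE_ROLE_IMAGE=ubuntu:20.04 pipenv run molecule test   # full create/converge/verify/destroy cycle against the chosen image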

Integration tests

In the root folder there are two different Molecule scenarios that run the entire playbook on EC2 instances with different operating systems (single node or multi-machine/clustered).

Some environment variables are required to execute integration tests locally; please take a look at the .envrc file as a reference.

To have those environment variables automatically loaded when entering the project folder on your dev machine, you may want to install direnv; otherwise you can configure them however you prefer.

When using direnv, you must add your secrets to the .env.credentials file in the root folder, following the standard bash export convention. direnv will automatically prompt you to do so.
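As a purely illustrative sketch (the exact variable names depend on what .envrc expects and are assumptions here), .env.credentials is just a series of bash export statements:

# .env.credentials - keep out of version control
export AWS_ACCESS_KEY_ID="..."        # credentials used to provision the EC2 test instances (assumed names)
export AWS_SECRET_ACCESS_KEY="..."
export NEXUS_USERNAME="..."           # credentials for the Alfresco artifacts repository, if the scenario needs them (assumed names)
export NEXUS_PASSWORD="..."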

Scenario-specific variables are defined in the vars-scenario.yml files inside the molecule/default folder.

To run an integration test, execute molecule with the -e molecule/default/vars-scenario.yml parameter:

molecule -e molecule/default/vars-rhel8.yml test
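During development you do not have to run the whole test sequence every time; the same -e flag works with individual Molecule actions, for example (a sketch reusing the scenario file shown above):

molecule -e molecule/default/vars-rhel8.yml converge   # only provision the EC2 instances and apply the playbook
molecule -e molecule/default/vars-rhel8.yml destroy    # tear the instances down when you are done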

Docker based tests

There is also a local Molecule scenario that uses the same approach as the roles' Molecule tests, relying on the Docker driver.

You can run it with:

molecule -s local test
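Individual actions can be run against this scenario as well (a minimal sketch using standard Molecule options):

molecule converge -s local   # apply the playbook to local Docker containers
molecule destroy -s local    # clean up the containers afterwards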

Adding support for a new distribution

We expect distribution support to be added mostly through the roles' vars files. If distro-specific tasks are needed, they should be skipped for other distros and, where possible, added in separate task files.

New distributions must be added to the supported_os variable in the group_vars/all.yml file.

If a new OS enters the officially supported matrix but is not supported by the playbook, it must be mentioned in the Versioning chapter of the documentation.

Release

Follow this quick checklist:

  1. Review the currently open dependabot/renovate PRs and merge them.
  2. For minor releases, make sure the links beginning with https://support.hyland.com/r/Alfresco are updated to reflect the latest version or the corresponding minor release documentation.
  3. In case of a new ACS major version, copy the versions from group_vars/all.yml into a new XX.N-extra-vars.yml
  4. Bump the version constraints in scripts/updatecli/updatecli_acs*.yml (the workflow will take care of the rest)
  5. Ensure that the versions table in the main readme has been updated
  6. Ensure that the Docker images and AMI IDs for the root molecule tests reflect any minor OS release (e.g. default suite)
  7. Ensure that activemq, tomcat and java versions are up to date (latest patch version)
  8. After merging every pending PR, proceed with tagging:
    • git tag -s v2.x.x -m v2.x.x
    • git push origin v2.x.x
  9. Wait for the Release workflow to go green.
  10. Draft a new release on GitHub with the tag you just pushed (see the sketch after this list for a CLI alternative). If the release is for a new ACS major version, mention the ACS release in the title, e.g. v2.x.x (ACS 23.4.0)
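If you prefer the GitHub CLI for that last step, a draft release can be created along these lines (a sketch assuming the gh CLI is installed and authenticated; adjust the tag and title to the actual release):

gh release create v2.x.x --draft --title "v2.x.x (ACS 23.4.0)" --generate-notes   # drafts a release for the tag pushed above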