Initialize a Git repo for managing your ontology the OBO Library way!
For more details, see:

- How-to guides:
    - How to create your first repository with the ODK
    - How to add license, title and description to your ontology
    - How to import large ontologies efficiently
- Reference:
    - Learn about the different kinds of release artefacts
- Community:
    - If you have issues, file them here: https://github.com/INCATools/ontology-development-kit/issues
    - We also have an active community on Slack; you can request access by making a ticket here as well.
You will likely want to customize the build process, and of course to edit the ontology.
We recommend that you do not edit the main Makefile, but instead edit the supplemental one (e.g. myont.Makefile) in src/ontology.
An example of how you can customise your imports is documented here.
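As a rough sketch of how such a customisation is then used, suppose you have added a goal to your supplemental Makefile (the goal name `my_report` below is purely hypothetical); you can run it through the ODK wrapper script that ships with your repo:

```sh
# Run a custom goal defined in src/ontology/myont.Makefile via the ODK Docker wrapper.
# "my_report" is a hypothetical goal name used only for illustration.
cd src/ontology
sh run.sh make my_report
```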
The ODK is designed for creating a new repo for a new ontology. It can still be used to help figure out how to migrate an existing git repository to the ODK structure. There are different ways to do this.
- Manually compare your ontology against the template folder and make necessary adjustments
- Run the seed script as if creating a new repo. Manually compare this with your existing repo and use `git mv` to rearrange, adding any missing files by copying them across and doing a `git add` (see the sketch below)
- Create a new repo de novo and abandon your existing one, using, for example, github issue mover to move tickets across.
Obviously the last method is not ideal, as you lose your git history. Note that even with `git mv`, history tracking becomes harder.
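A minimal sketch of the second approach, using purely hypothetical repository and file names:

```sh
# Hypothetical setup: your existing repo lives in ~/my-ontology, the freshly
# seeded ODK repo in ~/my-ontology-odk, and your edit file is myont-edit.owl.
cd ~/my-ontology

# Rearrange existing files into the ODK layout while keeping them under version control
mkdir -p src/ontology
git mv myont-edit.owl src/ontology/myont-edit.owl

# Copy across files that only exist in the seeded repo, then stage them
cp ~/my-ontology-odk/src/ontology/myont.Makefile src/ontology/
git add src/ontology/myont.Makefile

git commit -m "Rearrange repository into the ODK structure"
```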
If you have built your ontology using a previous version of the ODK, migration of your setup is unfortunately a manual process. In general you do not absolutely need to upgrade your setup, but doing so will bring advantages in terms of aligning with emerging standard ways of doing things. The less customization you do on your repo, the easier it should be to migrate.
Consult the Changes.md file for changes made between releases to assist in upgrading.
You will find additional documentation in the src/ontology/README-editors.md file in your repo.
The ODK also comes with built-in options to generate your own shiny documentation; see, for example, the PATO documentation here, which is almost entirely autogenerated from the ODK.
You can run the seed script without Docker using Python 3.6 or higher and Java. See requirements.txt for the Python requirements.
This is, however, not recommended.
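If you do go this route anyway, the steps look roughly like the sketch below; the entry point `odk/odk.py` matches the layout of the ODK source tree, but the exact invocation may differ between ODK versions:

```sh
# Not recommended: run the ODK seed script directly instead of via Docker.
# Requires Python 3.6+ and Java on the PATH.
git clone https://github.com/INCATools/ontology-development-kit.git
cd ontology-development-kit
pip install -r requirements.txt    # install the Python dependencies
python3 odk/odk.py --help          # assumed entry point; check the repository for the current one
```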
Note: this is a highly experimental feature as of ODK version 1.2.24. The display and the scores are under active development and will change considerably in the near future.
Example implementation:

- https://github.com/obophenotype/obophenotype.github.io (web)

You need two files to run the ODK dashboard generator:

- An ODK container wrapper (called `odk.sh` in the following), similar to the `run.sh` file in your typical repo's `src/ontology` directory.
- A dashboard config YAML file (called `dashboard-config.yml` in the following).
With both files, you can then create a dashboard using the following command:
```sh
sh odk.sh obodash -C dashboard-config.yml
```
The wrapper (`odk.sh`) should contain something like the following:
```sh
#!/bin/sh
# Wrapper script for ODK docker container.
#
docker run -e ROBOT_JAVA_ARGS='-Xmx4G' -e JAVA_OPTS='-Xmx4G' \
  -v $PWD/dashboard:/tools/OBO-Dashboard/dashboard \
  -v $PWD/dashboard-config.yml:/tools/OBO-Dashboard/dashboard-config.yml \
  -v $PWD/ontologies:/tools/OBO-Dashboard/build/ontologies \
  -v $PWD/sparql:/tools/OBO-Dashboard/sparql \
  -w /work --rm -ti obolibrary/odkfull "$@"
```
Note that this essentially binds a few local directories to the running ODK container. The directories serve the following purposes:

- `dashboard`: this is where the dashboard is deposited. Look at `index.html` in your browser.
- `ontologies`: this is where ontologies are downloaded to and synced up.
- `sparql`: an optional directory that allows you to add custom checks on top of the usual OBO profile.
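One possible way to prepare a working directory before the first run (creating the bound directories up front is merely a suggestion, so that they end up owned by your user rather than being created by Docker):

```sh
# Set up a local working directory for the dashboard.
# The directory names match the -v bind mounts in odk.sh above.
mkdir -p dashboard ontologies sparql

# dashboard-config.yml must exist next to odk.sh; see the example config below.
sh odk.sh obodash -C dashboard-config.yml

# When the run finishes, open the generated report in your browser
ls dashboard/index.html
```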
This is a minimal example dashboard config for a potential phenotype dashboard:
```yaml
title: OBO Phenotype Dashboard
description: Quality control for OBO phenotype ontologies. Under construction.
ontologies:
  custom:
    - id: wbphenotype
    - id: dpo
      base_ns:
        - http://purl.obolibrary.org/obo/FBcv
environment:
  ROBOT_JAR: /tools/robot.jar
  ROBOT: robot
```
The ontologies will, if they exist, be retrieved from their OBO PURLs and evaluated. There are more options that are potentially of interest:
```yaml
title: OBO Phenotype Dashboard
description: Quality control for OBO phenotype ontologies. Under construction.
ontologies:
  custom:
    - id: myont
      mirror_from: https://raw.githubusercontent.com/obophenotype/c-elegans-phenotype-ontology/master/wbphenotype-base.owl
    - id: dpo
      base_ns:
        - http://purl.obolibrary.org/obo/FBcv
      prefer_base: True
profile:
  baseprofile: "https://raw.githubusercontent.com/ontodev/robot/master/robot-core/src/main/resources/report_profile.txt"
  custom:
    - "WARN\tfile:./sparql/missing_xrefs.sparql"
report_truncation_limit: 300
redownload_after_hours: 2
environment:
  ROBOT_JAR: /tools/robot.jar
  ROBOT: robot
```
- `mirror_from`: allows specifying a download URL other than the default OBO PURL.
- `base_ns`: allows specifying the set of namespaces considered to be owned by the ontology; only terms in these namespaces will be evaluated for this ontology. The default is http://purl.obolibrary.org/obo/CAPITALISEDONTOLOGYID.
- `report_truncation_limit`: allows truncating long (sometimes huge) ontology reports to make them easier on GitHub version control.
- `redownload_after_hours`: allows you to specify how long to wait before trying to download an ontology again (which can be a time-consuming process).
- `environment`: currently a required parameter, but it will be made optional in future versions. It allows adding environment variables directly to the config rather than passing them in as `-e` parameters to the docker container (both are equivalent, though).
- `profile`: an optional parameter that allows specifying your own profile for the quality control (ROBOT) report. By default, the ROBOT report default profile is used. You can either specify your own profile from scratch, or extend the current default with additional tests by using the `baseprofile` parameter. Find out more about ROBOT profiles here.
A fully working example can be found here.