
Tag unit testing failures for optional dependencies as known failures #124

Closed
GoogleCodeExporter opened this issue Apr 4, 2015 · 17 comments


@GoogleCodeExporter

When running a test that exercises an optional dependency that is not installed on the testing machine, it is assumed that the test will fail because the dependency is missing (and not because of a possible bug). The idea is to tag this kind of failure as a "known" failure, while still allowing the test to fail properly when the failure is unrelated to the missing dependency.

By doing so, a user who does not use the optional dependency will be able to run the tests without the output being polluted by false positives.

Still, a warning should be raised in the case of a known failure to remind developers that they should take care of it.
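
For concreteness, a minimal sketch of the requested behaviour (the decorator name, warning text, and use of unittest.SkipTest are illustrative assumptions, not existing MDAnalysis API):

import functools
import warnings
from unittest import SkipTest


def known_failure_without(module_name):
    """Tag a test as a known failure when an optional module is missing."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                __import__(module_name)
            except ImportError:
                # Dependency missing: warn so developers still see it,
                # then record a skip instead of a spurious failure.
                warnings.warn("known failure: optional dependency %r "
                              "is not installed" % module_name)
                raise SkipTest("%s not installed" % module_name)
            # Dependency present: let the test fail properly on real bugs.
            return func(*args, **kwargs)
        return wrapper
    return decorator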

Original issue reported on code.google.com by [email protected] on 20 Dec 2012 at 8:46

@richardjgowers
Member

So now that we've got Travis we should be able to include this as part of the build matrix, so we build with

  • everything (the current build: MDA with all the optional stuff)
  • only MDA + required deps (and have all the optional stuff known-fail)

I think our only optional dependencies are netCDF4 and matplotlib. I thought scipy was too, but it's in the required list in setup.py... so maybe that's a mistake.

@orbeckst
Member

The full functional core (i.e. not analysis or visualization) needs

  • numpy
  • biopython
  • netCDF4

Analysis wants for full functionality

  • networkx
  • GridDataFormats
  • matplotlib
  • scipy

Visualization wants

  • scipy
  • matplotlib
  • mayavi

Not all of these are easy to install with pip, so we went with a compromise to make the most functionality immediately available while also reducing entry hurdles and installation pain: we currently require all the packages that install easily with pip:

  • numpy
  • biopython
  • networkx
  • GridDataFormats

and make others, like netCDF4, optional (it is only used for the binary AMBER reader, and if it is not available, the user gets a detailed error message pointing to netCDF4 installation information). matplotlib used to be a pain to install with pip, so we leave that to the user and their package manager.
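
Illustratively, that pattern might look like the following (the class name and error message are a hypothetical sketch, not the actual MDAnalysis source):

# Guarded import: the module may be absent without breaking MDAnalysis.
try:
    import netCDF4
except ImportError:
    netCDF4 = None


class NCDFReader(object):
    """Hypothetical sketch of a binary AMBER trajectory reader."""
    def __init__(self, filename):
        if netCDF4 is None:
            # Fail only when the feature is actually used, with a
            # pointer to installation instructions.
            raise ImportError(
                "netCDF4 is required to read binary AMBER trajectories; "
                "see the MDAnalysis installation docs for how to install it.")
        self.trjfile = netCDF4.Dataset(filename)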

@richardjgowers
Member

https://github.com/MDAnalysis/mdanalysis/blob/develop/package/setup.py#L267

Does this mean scipy is required? I think you're right in that it's not used anywhere but analysis... if memory serves.

@orbeckst
Member

Hm, I am confused (read: too lazy to look up at the moment) about what requires does compared to install_requires in setup.py. If requires just advertises all the packages that you might want, then scipy, matplotlib, and mayavi should all be in there. If pip tries to pull those in, then scipy should not be in there.
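
For reference: requires is purely informational distutils metadata that pip ignores, while install_requires is the setuptools field that pip actually resolves and installs. A minimal setup.py sketch (the package lists are illustrative, not a proposal):

from setuptools import setup

setup(
    name="MDAnalysis",
    # distutils metadata only; pip never installs these automatically
    requires=["numpy", "scipy", "matplotlib", "mayavi"],
    # pip resolves and installs these
    install_requires=["numpy", "biopython", "networkx", "GridDataFormats"],
)

On that reading, scipy (and matplotlib, mayavi) can stay in requires without pip ever pulling them in.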

@richardjgowers
Member

I'm going to bump this because installing netcdf isn't trivial, and running all the tests without it gives a lot of Error results.

@orbeckst
Member

orbeckst commented Oct 8, 2015

So you want a decorator that decides that if netcdf is not installed then it should be a knownfailure? I think @knownfailure_if (or similar) already exists; it is used for the HOLE and CLUSTALW tests.

@richardjgowers
Member

Yeah, I was thinking along those lines; I didn't know it already existed. I'll look into those.

@orbeckst
Member

orbeckst commented Oct 8, 2015

Btw, the conda install magically installs netcdf (it seems), at least that was my experience when I tested conda-based installation on a pristine VM.

@richardjgowers richardjgowers self-assigned this Oct 8, 2015
@hainm
Contributor

hainm commented Oct 8, 2015

I'm going to bump this because installing netcdf isn't trivial, and running all the tests without it gives a lot of Error results.

with conda, it's trivial. :D

@orbeckst
Member

orbeckst commented Oct 8, 2015

Something along the lines of executable_not_found_runtime(); in fact, it should be used with @dec.skipif.
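
For illustration, that could look like the following (module_not_found is a hypothetical helper, named by analogy with executable_not_found_runtime):

import importlib

from numpy.testing import dec


def module_not_found(module):
    """Return True if the named module cannot be imported."""
    try:
        importlib.import_module(module)
    except ImportError:
        return True
    else:
        return False


@dec.skipif(module_not_found('netCDF4'),
            "Test skipped because netCDF4 is not available.")
def test_ncdf_reader():
    pass  # the real test body would exercise the netCDF4 reader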

@richardjgowers
Member

It was possible with our existing knownfailure decorator if you specify that ImportError is the expected failure. I think I'll put a layer around knownfailure called knownfailure_if(test), so that if test returns True the test is expected to fail. Then test can be something simple like

def check_ncdf():
    try:
        import netCDF4
    except ImportError:
        return True
    else:
        return False

@knownfailure_if(check_ncdf)
def test_ncdfthing():
    pass
I think I can get that bundled with 0.12.1.

Then going forward we want our Travis build to have another dimension: testing both minimal and full installations.
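
One possible shape for that layer, as a hedged self-contained sketch (the SkipTest-based reporting is an illustrative choice, not the actual MDAnalysis knownfailure implementation):

import functools
from unittest import SkipTest


def knownfailure_if(predicate, exc_type=ImportError):
    """If predicate() is True, treat exc_type raised by the test as expected."""
    def decorator(func):
        if not predicate():
            return func  # dependency present: run the test normally
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                func(*args, **kwargs)
            except exc_type:
                # The expected failure: report it as a skip, not an error.
                raise SkipTest("known failure (missing optional dependency)")
        return wrapper
    return decorator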

@mnmelo
Member

mnmelo commented Oct 9, 2015

That's a cool idea @richardjgowers. We can even overload the already existing knownfailure to take an argument only_if that defaults to True.

In this particular case it'd become:

@knownfailure(only_if=check_ncdf)  # the check could use a better name...
def test_ncdfthing():
    pass

What do you think? I'll be glad to make a PR for it.

@richardjgowers
Member

Yeah, a kwarg for knownfailure might be smarter, actually. If you want to take this, that would be great.

@mnmelo
Member

mnmelo commented Oct 9, 2015

In the end I think @orbeckst's skipif way is cleaner. I had to decorate each and every test under the netCDF test classes; these numpy decorators can't be applied to classes.

@richardjgowers
Member

I had luck just decorating the setUp method?
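
That works because raising SkipTest from setUp skips every test method in the class, so one decorator on setUp covers them all. A self-contained sketch (the class name and file are hypothetical):

from unittest import TestCase

from numpy.testing import dec

try:
    import netCDF4
    HAS_NETCDF = True
except ImportError:
    HAS_NETCDF = False


class TestNCDFReader(TestCase):
    @dec.skipif(not HAS_NETCDF, "netCDF4 is not available")
    def setUp(self):
        # setUp runs before every test method; skipping here skips
        # each test in the class without decorating them individually.
        self.trajectory = netCDF4.Dataset('example.nc')

    def test_opened(self):
        self.assertTrue(self.trajectory is not None)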

@mnmelo
Member

mnmelo commented Oct 9, 2015

Ah, should have tried that! ;)

@mnmelo mnmelo closed this as completed in 1bf2db0 Oct 12, 2015
richardjgowers added a commit that referenced this issue Oct 12, 2015
netCDF tests now skipped if library missing (closes #124)
jbarnoud added a commit to jbarnoud/mdanalysis that referenced this issue Dec 7, 2015
Some tests require scipy. Scipy is an optional dependency, so it can be missing, causing the tests to fail.

This commit decorates the tests that need scipy with the dec.skipif decorator. This follows the guideline discussed in MDAnalysis#124.