Test suite fails with 1 value error and 8 assertion errors.
TheChymera opened this issue · 6 comments
Having built pybedtools with the following full build.log, I get a few assertion errors, such as the one below (the full list is toward the end of the log).
___________________________________ test_all[shuffle: {'seed': 1, 'genome': 'hg19', 'chrom': True} gzip] ___________________________________

tests = {'bedtool': 'a.bed', 'convert': {}, 'kw': {'chrom': True, 'genome': 'hg19', 'seed': 1}, 'method': 'shuffle', ...}

    def test_all(tests):
>       run(tests)

lib/pybedtools/test/test_iter.py:241:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

d = {'bedtool': 'a.bed', 'convert': {}, 'kw': {'chrom': True, 'genome': 'hg19', 'seed': 1}, 'method': 'shuffle', ...}

    def run(d):
        method = d['method']
        bedtool = d['bedtool']
        convert = d['convert']
        kwargs = d['kw'].copy()
        expected = d['test_case']['expected']
        bedtool_converter = convert.pop('bedtool')
        bedtool = (
            converters[bedtool_converter](pybedtools.example_bedtool(bedtool))
        )
        for k, converter_name in convert.items():
            kwargs[k] = (
                converters[converter_name](pybedtools.example_bedtool(kwargs[k]))
            )
        result = getattr(bedtool, method)(**kwargs)
        res = str(result)
        expected = fix(expected)
        try:
>           assert res == expected
E           AssertionError: assert ('chr1\t123081365\t123081464\tfeature1\t0\t+\n'\n 'chr1\t243444570\t243444670\tfeature2\t0\t+\n'\n 'chr1\t194620241\t194620591\tfeature3\t0\t-\n'\n 'chr1\t172792873\t172792923\tfeature4\t0\t+\n') == ('chr1\t46341498\t46341597\tfeature1\t0\t+\n'\n 'chr1\t45615582\t45615682\tfeature2\t0\t+\n'\n 'chr1\t102762672\t102763022\tfeature3\t0\t-\n'\n 'chr1\t17293432\t17293482\tfeature4\t0\t+\n')
E           - chr1 46341498 46341597 feature1 0 +
E           - chr1 45615582 45615682 feature2 0 +
E           - chr1 102762672 102763022 feature3 0 -
E           + chr1 123081365 123081464 feature1 0 +
E           + chr1 243444570 243444670 feature2 0 +
E           + chr1 194620241 194620591 feature3 0 -
E           - chr1 17293432 17293482 feature4 0 +
E           ? --- ---
E           + chr1 172792873 172792923 feature4 0 +
E           ? + +++ + +++

lib/pybedtools/test/test_iter.py:96: AssertionError
Can you help me figure out what's wrong?
Yes, it turns out different versions of bedtools produce different output for tools that use random numbers, like `shuffle` here, even though the seed is set to be the same.
I see one failure in that log due to no internet connection (`test_chromsizes`), and the rest are tests for `shuffle` and `sample`, which are both tools that have a random component.
The latest expected output is from bedtools v2.30, so those should match, at least on Linux 64-bit.
BTW, the fact that these tests are running implies that #329 can be closed, is that correct?
> BTW, the fact that these tests are running implies that #329 can be closed, is that correct?
Yes, thank you.
> and the rest are tests for `shuffle` and `sample`, which are both tools that have a random component.
Shouldn't the tests account for such randomness, though?
The one requiring an internet connection I can mark as expected-to-fail by default, since connecting to the internet during the build of a system-wide install can be used to execute arbitrary code, but I'd need to figure out what to do about the rest.
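One conventional way to disable a network-dependent test by default is an opt-in gate. A minimal sketch using stdlib `unittest` (the `PYBEDTOOLS_NETWORK_TESTS` environment variable is hypothetical, not an existing pybedtools option):

```python
import os
import unittest

# Hypothetical opt-in switch: network tests only run when the packager
# explicitly sets PYBEDTOOLS_NETWORK_TESTS=1; offline/sandboxed builds
# skip them by default.
NETWORK_TESTS_ENABLED = os.environ.get("PYBEDTOOLS_NETWORK_TESTS") == "1"

@unittest.skipUnless(NETWORK_TESTS_ENABLED, "network tests are opt-in")
class TestChromsizes(unittest.TestCase):
    def test_chromsizes(self):
        # Would fetch chromosome sizes from UCSC here.
        pass
```

With pytest the equivalent would be a `skipif` marker keyed on the same variable; either way the default build never touches the network.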
The tests do set a random seed, and for a given version of bedtools these are stable with the set random seed. Pybedtools is a wrapper around bedtools, and the specific output of these tests depends on the specific version of bedtools currently installed on the system.
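Since the expected output tracks the installed bedtools, the first step for any version-aware test is detecting which version it is running against. A minimal sketch (the helper name is mine; the `bedtools vX.Y.Z` format matches what `bedtools --version` prints):

```python
import re

def parse_bedtools_version(version_output):
    """Extract (major, minor) from output like 'bedtools v2.30.0'.

    Hypothetical helper: in a real test suite this string would come
    from running `bedtools --version` in a subprocess.
    """
    m = re.search(r"v(\d+)\.(\d+)", version_output)
    if m is None:
        raise ValueError("unrecognized version string: %r" % version_output)
    return (int(m.group(1)), int(m.group(2)))
```

For example, `parse_bedtools_version("bedtools v2.30.0")` yields `(2, 30)`, which can then drive skips or expected-output selection.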
I think the proper way to handle this would be to set up separate testing environments, each with a different version of bedtools, and write separate `shuffle` and `sample` tests that run in a version-specific manner. Maybe use the last 3 or 4 bedtools versions?
I had done something like this for `shuffle` in previous versions where there was a change in output, so the mechanism is there; see for example:
- https://github.com/daler/pybedtools/blob/master/pybedtools/test/test_shuffle215.yaml
- https://github.com/daler/pybedtools/blob/master/pybedtools/test/test_shuffle227.yaml
- https://github.com/daler/pybedtools/blob/master/pybedtools/test/test_iter.py#L13
I just haven't set it up for subsequent versions yet and don't have the bandwidth at the moment.
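Extending that mechanism could be as simple as mapping a detected version onto the newest recorded expected-output file. A sketch under that assumption (the helper name and version list are hypothetical; only the `test_shuffle215.yaml` / `test_shuffle227.yaml` naming convention is from the repo):

```python
def expected_yaml_for(version):
    """Pick the newest expected-output file recorded at or below `version`.

    `version` is a (major, minor) tuple; the `known` list below is a
    hypothetical stand-in for whichever versions have recorded output.
    """
    known = [(2, 15), (2, 27), (2, 30)]
    candidates = [v for v in known if v <= version]
    if not candidates:
        raise ValueError("no expected output recorded for bedtools %d.%d" % version)
    major, minor = max(candidates)
    return "test_shuffle%d%d.yaml" % (major, minor)
```

So a system with bedtools v2.29 would fall back to the v2.27 expected output, and new bedtools releases only require adding one YAML file and one tuple.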
> but I'd need to figure out what to do about the rest.
I would phrase it as follows: if bedtools v2.30 is installed, these tests should pass; if not, they are expected to fail.
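That policy can be sketched with stdlib `unittest` by gating each test on the exact version its expected output was recorded with (every name below, and the hard-coded detected version, is hypothetical):

```python
import unittest

# Hypothetical: in a real suite this would come from parsing
# `bedtools --version` at collection time.
INSTALLED_BEDTOOLS = (2, 29)

def requires_bedtools(major, minor):
    """Skip a test unless the bedtools version its expected output
    was recorded with is the one installed (hypothetical helper)."""
    return unittest.skipUnless(
        INSTALLED_BEDTOOLS == (major, minor),
        "expected output recorded for bedtools v%d.%d" % (major, minor),
    )

class TestShuffle(unittest.TestCase):
    @requires_bedtools(2, 30)
    def test_shuffle_hg19(self):
        # Would compare shuffle output against the v2.30 expected strings.
        pass
```

On a v2.29 system the test is reported as skipped with an explicit reason rather than as a confusing assertion failure, which is friendlier for downstream packagers.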
@daler the thing is, I have bedtools 2.30 specifically installed:
* Contents of sci-biology/bedtools-2.30.0:
/usr
/usr/bin
/usr/bin/annotateBed
/usr/bin/bamToBed
/usr/bin/bamToFastq
/usr/bin/bed12ToBed6
/usr/bin/bedToBam
/usr/bin/bedToIgv
/usr/bin/bedpeToBam
/usr/bin/bedtools
/usr/bin/closestBed
/usr/bin/clusterBed
/usr/bin/complementBed
/usr/bin/coverageBed
/usr/bin/expandCols
/usr/bin/fastaFromBed
/usr/bin/flankBed
/usr/bin/genomeCoverageBed
/usr/bin/getOverlap
/usr/bin/groupBy
/usr/bin/intersectBed
/usr/bin/linksBed
/usr/bin/mapBed
/usr/bin/maskFastaFromBed
/usr/bin/mergeBed
/usr/bin/multiBamCov
/usr/bin/multiIntersectBed
/usr/bin/nucBed
/usr/bin/pairToBed
/usr/bin/pairToPair
/usr/bin/randomBed
/usr/bin/shiftBed
/usr/bin/shuffleBed
/usr/bin/slopBed
/usr/bin/sortBed
/usr/bin/subtractBed
/usr/bin/tagBam
/usr/bin/unionBedGraphs
/usr/bin/windowBed
/usr/bin/windowMaker
/usr/share
/usr/share/bedtools
/usr/share/bedtools/genomes
/usr/share/bedtools/genomes/human.hg18.genome
/usr/share/bedtools/genomes/human.hg19.genome
/usr/share/bedtools/genomes/human.hg38.genome
/usr/share/bedtools/genomes/mouse.mm10.genome
/usr/share/bedtools/genomes/mouse.mm8.genome
/usr/share/bedtools/genomes/mouse.mm9.genome
/usr/share/doc
/usr/share/doc/bedtools-2.30.0
/usr/share/doc/bedtools-2.30.0/README.md.bz2
Ah, @TheChymera, I did a new release (v0.8.2) the other day and mistakenly assumed this new issue's build log was from it. Looking at the top of the build log, though, this is using pybedtools v0.8.0.
v0.8.2 has the edits to the test files that reflect output from bedtools v2.30.0 (see e.g., a2642bb).
So:
- if you want to package pybedtools v0.8.0 as in the build.log, the tests in that version expect output from bedtools v2.29.2 (see this travis-ci log line; thanks, historical travis-ci logs!)
- if you want to package the latest pybedtools v0.8.2, then keep using bedtools v2.30.0; the v0.8.2 tests should then pass (see here)