The statement: `Applying a mark to a fixture function never had any effect, but it is a common user error` is False
pokowaka opened this issue · 11 comments
The warning: https://docs.pytest.org/en/stable/deprecations.html#applying-a-mark-to-a-fixture-function is incorrect. If the statement were true, the test would succeed.
See example below:
cat pyproject.toml
dependencies = [
"pytest==8.3.3",
"alt-pytest-asyncio==0.7.2",
]
cat fail.py
import pytest
import asyncio

@pytest.fixture()
@pytest.mark.async_timeout(0.1)
async def my_amazing_fixture():
    try:
        await asyncio.sleep(1)
        yield 1
    finally:
        await asyncio.sleep(1)

async def test_my_amazing_fixture(my_amazing_fixture):
    assert True
Next, run the test:
pytest fail.py
================================================================================================================ test session starts ================================================================================================================
platform darwin -- Python 3.10.6, pytest-8.2.2, pluggy-1.5.0
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /Users/jansene/Downloads/example
configfile: pytest.ini
plugins: benchmark-3.4.1, timeout-2.1.0, alt-pytest-asyncio-0.7.2, rerunfailures-10.2, dependency-0.5.1, repeat-0.9.1, flakefinder-1.0.0, devpi-server-6.7.0, cov-3.0.0
collected 1 item
fail.py E [100%]
====================================================================================================================== ERRORS =======================================================================================================================
_____________________________________________________________________________________________________ ERROR at setup of test_my_amazing_fixture _____________________________________________________________________________________________________
    def raise_error():
        if info["cancelled"]:
>           assert False, f"Took too long to complete: {fle}:{lineno}"
E           AssertionError: Took too long to complete: /Users/jansene/Downloads/example/fail.py:4
E           assert False
../../.pyenv/versions/3.10.6/lib/python3.10/site-packages/alt_pytest_asyncio/async_converters.py:77: AssertionError
================================================================================================================= warnings summary ==================================================================================================================
fail.py:6
  /Users/jansene/Downloads/example/fail.py:6: PytestRemovedIn9Warning: Marks applied to fixtures have no effect
    See docs: https://docs.pytest.org/en/stable/deprecations.html#applying-a-mark-to-a-fixture-function
    async def my_amazing_fixture():
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
============================================================================================================== short test summary info ==============================================================================================================
ERROR fail.py::test_my_amazing_fixture - AssertionError: Took too long to complete: /Users/jansene/Downloads/example/fail.py:4
============================================================================================================ 1 warning, 1 error in 1.17s ==========================================================================================
The warning is true as far as pytest is concerned. The alt-pytest-asyncio plugin is to blame here, as it manually discovers the mark when patching pytest fixtures, in a way that is confusing and inconsistent.
I don't think there is anything pytest should do here; IMHO the plugin should change how it applies timeouts to fixtures.
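For context, this is roughly how a plugin can "manually discover" a mark that pytest itself ignores (a sketch of the general mechanism, not alt-pytest-asyncio's actual code): marks applied with @pytest.mark.<name> to a plain function accumulate on its pytestmark attribute, which a plugin can inspect by hand.

def find_timeout(func, default=5.0):
    # marks applied via @pytest.mark.<name> on a function end up in its
    # `pytestmark` attribute; a plugin can read them back manually
    for mark in getattr(func, "pytestmark", []):
        if mark.name == "async_timeout":
            return mark.args[0]
    return default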
Ok, let me close this and open an issue against alt-pytest-asyncio. We unfortunately rely on the timeout behavior in some of our tests and would be very sad if a newer version of pytest turned this warning into an error.
Filed delfick/alt-pytest-asyncio#19 against the plugin.
Hi @The-Compiler :)
in a way that is confusing and inconsistent.
I don't doubt that; do you have ideas for a less confusing/inconsistent way of doing this?
My immediate thought is to make my own decorator that records the timeout on the fixture; do you have anything off the top of your head that would sound more appropriate?
I imagine when I have time I'll have a look at pytest-asyncio again, see how they do things these days, and see whether there are ideas there.
we currently explicitly avoid passing marks from fixtures to nodes, as the edge cases around fixture overrides/decorators and dynamic fixture lookup make for errors most users won't be able to sanely track down once they happen
we have yet to work out a reasonably explicit fixture dependency graph that can support those features across fixture overrides
a workaround may be a pytest plugin that handles transferring markers via a new decorator, but also warns when a fixture is set up that was not considered for transfer on a particular test item (see the sketch below)
a special caveat is that this still misses already-active higher-scope fixtures, and there's no safe way to transfer to higher-scope collection nodes, as those are not part of parametrization
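As a rough illustration of that workaround (the transfer_marks decorator and the attribute name are hypothetical, not an existing plugin): record the marks on the fixture function, then copy them onto the requesting node when the fixture is set up. It inherits exactly the caveats above: setup time is too late for collection-time uses of marks, and already-active higher-scope fixtures are missed.

import pytest

def transfer_marks(*marks):
    # hypothetical decorator: stash the marks on the fixture function itself
    def decorator(func):
        func._transfer_marks = marks
        return func
    return decorator

@pytest.hookimpl(hookwrapper=True)
def pytest_fixture_setup(fixturedef, request):
    # copy any stashed marks onto the node that triggered this fixture's setup
    for mark in getattr(fixturedef.func, "_transfer_marks", ()):
        request.node.add_marker(mark)
    yield

# usage on the fixture side:
# @pytest.fixture()
# @transfer_marks(pytest.mark.async_timeout(0.1))
# async def my_amazing_fixture(): ...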
I don't doubt that; do you have ideas for a less confusing/inconsistent way of doing this?
(I never used either of [alt-]pytest-asyncio, so take this with a grain of salt, as I might be missing context.)
pytest-asyncio seems to have its own pytest_asyncio.fixture decorator. That approach seems to make a lot of sense to me, as you could do @alt_pytest_asyncio.fixture(timeout=0.1) and even add more arguments in the future if needed.
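For illustration, such a decorator could be a thin wrapper that records the timeout on the function and then applies the ordinary pytest.fixture (a minimal sketch; alt_pytest_asyncio.fixture did not exist at this point, and the attribute name is made up):

import pytest

def fixture(*, timeout=None, **fixture_kwargs):
    # hypothetical alt_pytest_asyncio.fixture: record the timeout as a plain
    # attribute for the plugin to read later, instead of abusing a mark
    def decorator(func):
        func._async_timeout = timeout
        return pytest.fixture(**fixture_kwargs)(func)
    return decorator

# usage:
# @fixture(timeout=0.1)
# async def my_amazing_fixture(): ...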
ah yeah, that certainly aligns with my immediate idea :)
I imagine the difference is that pytest-asyncio has required decorators from the beginning for everything that uses the async keyword, whereas I made different tradeoffs so that isn't necessary. So it may be more of a happy coincidence.
I've ended up going with a solution that uses fixtures instead of decorators, delfick/alt-pytest-asyncio#20
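(I haven't verified the exact API that landed in that PR, but the general "fixtures instead of decorators" shape could look something like the sketch below: the timeout comes from an ordinary fixture that a test module or class can override, so it follows pytest's normal override rules rather than needing mark transfer. All names here are illustrative.)

import asyncio
import pytest

@pytest.fixture()
def async_timeout():
    # default timeout; override this fixture in a module/class to change it
    return 0.1

@pytest.fixture()
async def my_amazing_fixture(async_timeout):
    # the timeout value arrives through ordinary fixture resolution; running
    # the async fixture itself still requires an async pytest plugin
    await asyncio.wait_for(asyncio.sleep(1), timeout=async_timeout)
    yield 1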
The fix is in version 0.9.0 of alt-pytest-asyncio :)
thanks for the update; closing this, as the plugin now implements the behavior and the statement holds true for pytest itself