GLOB sdist-make: /tmp/tmpc5u3en0o/pytest-dolphin-0.3.9/setup.py
py37 create: /tmp/tmpc5u3en0o/pytest-dolphin-0.3.9/.tox/py37
py37 installdeps: pytest==6.0.1
py37 inst: /tmp/tmpc5u3en0o/pytest-dolphin-0.3.9/.tox/.tmp/package/1/pytest-dolphin-0.3.9.zip
py37 installed: ansi2html==1.5.2,apipkg==1.5,attrs==20.1.0,coverage==5.2.1,Django==1.10.8,execnet==1.7.1,importlib-metadata==1.7.0,iniconfig==1.0.1,more-itertools==8.5.0,packaging==20.4,pluggy==0.13.1,py==1.9.0,pyparsing==2.4.7,pytest==3.0.4,pytest-cov==2.4.0,pytest-django==3.1.2,pytest-dolphin @ file:///tmp/tmpc5u3en0o/pytest-dolphin-0.3.9/.tox/.tmp/package/1/pytest-dolphin-0.3.9.zip,pytest-html==1.12.0,pytest-pythonpath==0.7.1,pytest-splinter==1.7.7,pytest-travis-fold==1.2.0,pytest-xdist==1.15.0,selenium==3.141.0,six==1.15.0,splinter==0.14.0,toml==0.10.1,urllib3==1.25.10,zipp==3.1.0
py37 run-test-pre: PYTHONHASHSEED='2729399978'
py37 run-test: commands[0] | pytest --trace-config --help
PLUGIN registered: <_pytest.config.PytestPluginManager object at 0x7f3a0f3bbc88>
PLUGIN registered: <_pytest.config.Config object at 0x7f3a0ee24390>
PLUGIN registered: [reprs of the remaining built-in and installed plugins were stripped from the captured output]
PLUGIN registered: <_pytest.capture.CaptureManager object at 0x7f3a097de4a8>
PLUGIN registered: <_pytest.cacheprovider.LFPlugin object at 0x7f3a098002e8>
PLUGIN registered: [repr stripped]
PLUGIN registered: <_pytest.terminal.TerminalReporter object at 0x7f3a09800630>

usage: pytest [options] [file_or_dir] [file_or_dir] [...]

positional arguments:
  file_or_dir

general:
  -k EXPRESSION         only run tests which match the given substring expression. An expression is a python evaluatable expression where all names are substring-matched against test names and their parent classes. Example: -k 'test_method or test_other' matches all test functions and classes whose name contains 'test_method' or 'test_other'. Additionally keywords are matched to classes and functions containing extra names in their 'extra_keyword_matches' set, as well as functions which have names assigned directly to them.
  -m MARKEXPR           only run tests matching given mark expression. example: -m 'mark1 and not mark2'.
  --markers             show markers (builtin, plugin and per-project ones).
  -x, --exitfirst       exit instantly on first error or failed test.
  --maxfail=num         exit after first num failures or errors.
  --strict              run pytest in strict mode, warnings become errors.
  -c file               load configuration from `file` instead of trying to locate one of the implicit configuration files.
  --continue-on-collection-errors
                        Force test execution even if collection errors occur.
  --fixtures, --funcargs
                        show available fixtures, sorted by plugin appearance
  --fixtures-per-test   show fixtures per test
  --import-mode={prepend,append}
                        prepend/append to sys.path when importing test modules, default is to prepend.
  --pdb                 start the interactive Python debugger on errors.
  --pdbcls=modulename:classname
                        start a custom interactive Python debugger on errors.
                        For example: --pdbcls=IPython.terminal.debugger:TerminalPdb
  --capture=method      per-test capturing method: one of fd|sys|no.
  -s                    shortcut for --capture=no.
  --runxfail            run tests even if they are marked xfail
  --lf, --last-failed   rerun only the tests that failed at the last run (or all if none failed)
  --ff, --failed-first  run all tests but run the last failures first. This may re-order tests and thus lead to repeated fixture setup/teardown
  --cache-show          show cache contents, don't perform collection or tests
  --cache-clear         remove all cache contents at start of test run.

reporting:
  -v, --verbose         increase verbosity.
  -q, --quiet           decrease verbosity.
  -r chars              show extra test summary info as specified by chars (f)ailed, (E)error, (s)skipped, (x)failed, (X)passed, (p)passed, (P)passed with output, (a)all except pP. The pytest warnings are displayed at all times except when --disable-pytest-warnings is set
  --disable-pytest-warnings
                        disable warnings summary, overrides -r w flag
  -l, --showlocals      show locals in tracebacks (disabled by default).
  --tb=style            traceback print mode (auto/long/short/line/native/no).
  --full-trace          don't cut any tracebacks (default is to cut).
  --color=color         color terminal output (yes/no/auto).
  --durations=N         show N slowest setup/test durations (N=0 for all).
  --pastebin=mode       send failed|all info to bpaste.net pastebin service.
  --junit-xml=path      create junit-xml style report file at given path.
  --junit-prefix=str    prepend prefix to classnames in junit-xml output
  --result-log=path     DEPRECATED path for machine-readable result log.
  --html=path           create html report file at given path.
  --self-contained-html
                        create a self-contained html file containing all necessary styles, scripts, and images - this means that the report may not render or function where CSP restrictions are in place (see https://developer.mozilla.org/docs/Web/Security/CSP)

collection:
  --collect-only        only collect tests, don't execute them.
  --pyargs              try to interpret all arguments as python packages.
  --ignore=path         ignore path during collection (multi-allowed).
  --confcutdir=dir      only load conftest.py's relative to specified dir.
  --noconftest          Don't load any conftest.py files.
  --keep-duplicates     Keep duplicate tests.
  --doctest-modules     run doctests in all .py modules
  --doctest-report={none,cdiff,ndiff,udiff,only_first_failure}
                        choose another output format for diffs on doctest failure
  --doctest-glob=pat    doctests file matching pattern, default: test*.txt
  --doctest-ignore-import-errors
                        ignore doctest ImportErrors

test session debugging and configuration:
  --basetemp=dir        base temporary directory for this test run.
  --version             display pytest lib version and import information.
  -h, --help            show help message and configuration info
  -p name               early-load given plugin (multi-allowed). To avoid loading of plugins, use the `no:` prefix, e.g. `no:doctest`.
  --trace-config        trace considerations of conftest.py files.
  --debug               store internal tracing debug information in 'pytestdebug.log'.
  -o [OVERRIDE_INI [OVERRIDE_INI ...]], --override-ini=[OVERRIDE_INI [OVERRIDE_INI ...]]
                        override config option, e.g. `-o xfail_strict=True`.
  --assert=MODE         Control assertion debugging tools. 'plain' performs no assertion debugging. 'rewrite' (the default) rewrites assert statements in test modules on import to provide assert expression information.
  --setup-only          only setup fixtures, don't execute the tests.
  --setup-show          show setup fixtures while executing the tests.
  --setup-plan          show what fixtures and tests would be executed but don't execute anything.
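For reference, a few invocations combining the selection and reporting options listed above might look like this (the -k expression and the tests/ path are placeholders, not taken from this project):

    pytest -k 'dolphin and not slow' -x --tb=short -q
    pytest --lf --showlocals
    pytest --collect-only tests/

Every flag used here appears in the help text above; only the keyword expression and directory name are illustrative.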
distributed and subprocess testing:
  -n numprocesses       shortcut for '--dist=load --tx=NUM*popen', you can use 'auto' here for auto detection CPUs number on host system
  --max-slave-restart=MAX_SLAVE_RESTART
                        maximum number of slaves that can be restarted when crashed (set to zero to disable this feature)
  --dist=distmode       set mode for distributing tests to exec environments. each: send each test to each available environment. load: send each test to available environment. (default) no: run tests inprocess, don't distribute.
  --tx=xspec            add a test execution environment. some examples: --tx popen//python=python2.5 --tx socket=192.168.1.102:8888 --tx ssh=user@codespeak.net//chdir=testcache
  -d                    load-balance tests. shortcut for '--dist=load'
  --rsyncdir=DIR        add directory for rsyncing to remote tx nodes.
  --rsyncignore=GLOB    add expression for ignores when rsyncing to remote tx nodes.
  --boxed               box each test run in a separate process (unix)
  -f, --looponfail      run tests in subprocess, wait for modified files and re-run failing test set until all pass.

Travis CI:
  --travis-fold=[{never,auto,always}]
                        Fold captured output sections in Travis CI build log

splinter integration for browser testing:
  --splinter-webdriver=DRIVER
                        pytest-splinter webdriver
  --splinter-remote-url=URL
                        pytest-splinter remote webdriver url
  --splinter-wait-time=SECONDS
                        splinter explicit wait, seconds
  --splinter-implicit-wait=SECONDS
                        pytest-splinter selenium implicit wait, seconds
  --splinter-speed=SECONDS
                        pytest-splinter selenium speed, seconds
  --splinter-socket-timeout=SECONDS
                        pytest-splinter socket timeout, seconds
  --splinter-session-scoped-browser=false|true
                        pytest-splinter should use a single browser instance per test session. Defaults to true.
  --splinter-make-screenshot-on-failure=false|true
                        pytest-splinter should take browser screenshots on test failure. Defaults to true.
  --splinter-screenshot-dir=DIR
                        pytest-splinter browser screenshot directory. Defaults to the current directory.
  --splinter-webdriver-executable=DIR
                        pytest-splinter webdriver executable path. Defaults to unspecified in which case it is taken from PATH

django:
  --reuse-db            Re-use the testing database if it already exists, and do not remove it when the test finishes.
  --create-db           Re-create the database, even if it exists. This option can be used to override --reuse-db.
  --ds=DS               Set DJANGO_SETTINGS_MODULE.
  --dc=DC               Set DJANGO_CONFIGURATION.
  --no-migrations       Disable Django 1.7+ migrations on test setup
  --migrations          Enable Django 1.7+ migrations on test setup
  --liveserver=LIVESERVER
                        Address and port for the live_server fixture.
  --fail-on-template-vars
                        Fail for invalid variables in templates.

coverage reporting with distributed testing support:
  --cov=[path]          measure coverage for filesystem path (multi-allowed)
  --cov-report=type     type of report to generate: term, term-missing, annotate, html, xml (multi-allowed). term, term-missing may be followed by ":skip-covered". annotate, html and xml may be followed by ":DEST" where DEST specifies the output location.
  --cov-config=path     config file for coverage, default: .coveragerc
  --no-cov-on-fail      do not report coverage if test run fails, default: False
  --no-cov              Disable coverage report completely (useful for debuggers) default: False
  --cov-fail-under=MIN  Fail if the total coverage is less than MIN.
  --cov-append          do not delete coverage but append to current, default: False

[pytest] ini-options in the first pytest.ini|tox.ini|setup.cfg file found:

  markers (linelist)    markers for test functions
  norecursedirs (args)  directory patterns to avoid for recursion
  testpaths (args)      directories to search for tests when no files or dire
  usefixtures (args)    list of default fixtures to be used with this project
  python_files (args)   glob-style file patterns for Python test module disco
  python_classes (args) prefixes or glob names for Python test class discover
  python_functions (args)
                        prefixes or glob names for Python test function and m
  xfail_strict (bool)   default for the strict parameter of xfail markers whe
  doctest_optionflags (args)
                        option flags for doctests
  addopts (args)        extra command line options
  minversion (string)   minimally required pytest version
  rsyncdirs (pathlist)  list of (relative) paths to be rsynced for remote dis
  rsyncignore (pathlist)
                        list of (relative) glob-style paths to be ignored for
  looponfailroots (pathlist)
                        directories to check for changes
  python_paths (pathlist)
                        space separated directory paths to add to PYTHONPATH
  site_dirs (pathlist)  space separated directory paths to add to PYTHONPATH
  DJANGO_CONFIGURATION (string)
                        django-configurations class to use by pytest-dja
  DJANGO_SETTINGS_MODULE (string)
                        Django settings module to use by pytest-django
  django_find_project (bool)
                        Automatically find and add a Django project to the
  FAIL_INVALID_TEMPLATE_VARS (bool)
                        Fail for invalid variables in templates.

environment variables:
  PYTEST_ADDOPTS        extra command line options
  PYTEST_PLUGINS        comma-separated plugins to load during startup
  PYTEST_DEBUG          set to enable debug tracing of pytest's internals

to see available markers type: pytest --markers
to see available fixtures type: pytest --fixtures
(shown according to specified file_or_dir or current dir if not specified)
___________________________________ summary ____________________________________
  py37: commands succeeded
  congratulations :)
write json report at: /tmp/tmpc5u3en0o/pytest-dolphin-0.3.9/result.json
Time: 76.1 seconds
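As a rough sketch of how the [pytest] ini-options listed above fit together, a project could declare something like the following in pytest.ini; the marker name, test path, and Django settings module are invented for illustration:

    [pytest]
    addopts = -rsxX --tb=short
    testpaths = tests
    python_files = test_*.py
    markers =
        slow: marks tests as slow to run
    DJANGO_SETTINGS_MODULE = myproject.settings

Each key maps to an entry in the ini-options list above (addopts, testpaths, python_files, markers), with DJANGO_SETTINGS_MODULE contributed by pytest-django.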
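The tox session at the top of this log (a py37 env, pytest==6.0.1 in installdeps, and `pytest --trace-config --help` as commands[0]) could plausibly come from a minimal configuration along these lines; this is an assumption, not the project's actual tox.ini:

    [tox]
    envlist = py37

    [testenv]
    deps = pytest==6.0.1
    commands = pytest --trace-config --help

Note that although installdeps requests pytest==6.0.1, the installed-package list above reports pytest==3.0.4; the sketch only mirrors what the log shows was requested, not what the resolver ultimately installed.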