Closed
Labels
plugin: pytester (related to the pytester builtin plugin), plugin: warnings (related to the warnings builtin plugin), type: enhancement (new feature or API change, should be merged into features branch), type: proposal (proposal for a new feature, often to gather opinions or design the API around the new feature)
Description
While writing some other bits and pieces, I had a use case for checking the warnings emitted by a test run. RunResult has an assert_outcomes() that doesn't offer warnings= yet, even though the information is already available in there; I suspect there is a good reason why we don't have assert_outcomes(warnings=...). So I propose some additional capabilities on RunResult to handle warnings in isolation.
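To illustrate "the information is already available in there": RunResult.parseoutcomes() turns pytest's final summary line into a dict of outcome counts, which includes a warnings entry when any warnings were captured. The parse_outcomes function below is only an illustrative sketch of that idea, not pytest's actual implementation:

```python
import re

def parse_outcomes(summary_line):
    """Illustrative sketch of what RunResult.parseoutcomes() conceptually
    does: turn pytest's final summary line into a dict of outcome counts.
    Singular nouns like "warning"/"error" are normalized to plural keys."""
    plural = {"warning": "warnings", "error": "errors"}
    outcomes = {}
    # match "<count> <word>" pairs, e.g. "1 passed", "1 warning";
    # the elapsed time ("0.12s") has no space and is skipped
    for count, name in re.findall(r"(\d+) (\w+)", summary_line):
        outcomes[plural.get(name, name)] = int(count)
    return outcomes

outcomes = parse_outcomes("== 1 passed, 1 warning in 0.12s ==")
assert outcomes["warnings"] == 1
```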
With assert_outcomes(), the full dict comparison may get a bit intrusive as far as warning capture is concerned. Something simple like:
result = pytester.runpytest(...)
result.assert_warnings(count=1)
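A minimal sketch of what such a helper could look like, assuming it reads the count out of a parsed-outcomes dict (the assert_warnings name and signature here are the proposal's, not an existing pytest API):

```python
def assert_warnings(outcomes, count):
    """Hypothetical helper: assert the warning count in isolation,
    without constraining any of the other outcomes the way a full
    assert_outcomes(...) comparison would."""
    actual = outcomes.get("warnings", 0)
    assert actual == count, f"expected {count} warnings, got {actual}"

# e.g. against a dict shaped like RunResult.parseoutcomes() output:
assert_warnings({"passed": 3, "warnings": 1}, count=1)
```

The point of the design is that only the warning count is checked; a run with 3 passed or 30 passed tests is treated the same.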
Thoughts?
Metadata
Metadata
Assignees
Labels
plugin: pytesterrelated to the pytester builtin pluginrelated to the pytester builtin pluginplugin: warningsrelated to the warnings builtin pluginrelated to the warnings builtin plugintype: enhancementnew feature or API change, should be merged into features branchnew feature or API change, should be merged into features branchtype: proposalproposal for a new feature, often to gather opinions or design the API around the new featureproposal for a new feature, often to gather opinions or design the API around the new feature