Question
According to capybara's README:
The two following statements are functionally equivalent:
page.should_not have_xpath('a')
page.should have_no_xpath('a')
However, when trying this out, that does not appear to be true. This seems to work fine when using capybara-webkit:
visit dashboard_accounting_reports_path
click_link 'Delete'
page.should_not have_css('#reports .report')
But when using poltergeist, it often fails with the error below, which seems to indicate it's using #has_css?
under the covers and therefore won't actually wait for the given element to disappear:
Failure/Error: page.should_not have_css('#reports .report')
expected #has_css?("#reports .report") to return false, got true
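In other words, the expectation seems to reduce to roughly the following (a sketch of my understanding, not the actual Capybara/Poltergeist internals):

# Hypothetical sketch: RSpec's dynamic predicate matcher turns have_css
# into a single call to #has_css?, which waits for the selector to APPEAR
# and then returns a boolean.
still_there = page.has_css?('#reports .report')
# should_not just negates that one boolean; nothing keeps polling while
# waiting for the element to DISAPPEAR.
still_there.should be_false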
If I change the assertion to read like this instead, it seems to succeed every time:
page.should have_no_css('#reports .report')
Am I crazy, or is this a bug in poltergeist? I'm using PhantomJS 1.8.2, poltergeist 1.1.0, and capybara 2.0.2.
Here's the debugging output from the should_not have_css
example: http://pastebin.com/4ZtPEN5B
And here's the one from the should have_no_css
example: http://pastebin.com/TrtURWcZ
Answer 1:
I think I found the problem - I had been requiring 'capybara/dsl' in my spec_helper.rb instead of 'capybara/rspec', so the proper should_not
behavior and error messages from Capybara::RSpecMatchers were not included in that spec.
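A minimal sketch of the fix in spec_helper.rb (assuming an otherwise standard RSpec setup):

# spec/spec_helper.rb
require 'capybara/rspec'  # brings in Capybara::DSL and Capybara::RSpecMatchers
# require 'capybara/dsl'  # requiring only this leaves out the RSpec matchers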
EDIT: It turns out that if you're requiring 'rspec/rails', it will automatically set up the correct Capybara integration for you. But if you're doing something non-standard, like using Capybara in request specs, you'll still need to manually include Capybara::DSL and Capybara::RSpecMatchers there. See also: https://github.com/rspec/rspec-rails/blob/master/Capybara.md
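If you do end up needing the manual includes (e.g. for request specs), something along these lines should work; the :type => :request filter is just an assumption about how your specs are tagged:

# spec/spec_helper.rb (or a support file)
RSpec.configure do |config|
  config.include Capybara::DSL,           :type => :request
  config.include Capybara::RSpecMatchers, :type => :request
end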
Source: https://stackoverflow.com/questions/15252111/does-poltergeist-support-capybaras-should-not-rspec-matchers-correctly