I need to access a few HTML pages through a Python script. The problem is that I need cookie functionality, so a simple urllib HTTP request won't work.
Any ideas?
Why don't you try Dryscrape for this:
import dryscrape as d

d.start_xvfb()              # start a virtual X display (needed on headless servers)

br = d.Session()
br.visit('http://URL.COM')  # open the web page

# find the inputs by id and fill in the credentials
br.at_xpath('//*[@id="email"]').set('user@email.com')
br.at_xpath('//*[@id="pass"]').set('password')

# find the submit button by id and click it
br.at_xpath('//*[@id="submit_button"]').click()
You don't need a cookie library to store cookies; dryscrape's session keeps them for you. Just pip install dryscrape and do it your way.
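If you want to confirm the cookies are really there, here is a minimal sketch: it revisits a page in the same session (the '/account' path is just a hypothetical protected URL on the same site) and dumps what dryscrape stored. I'm assuming dryscrape's Session exposes body() and cookies(); check your installed version.

# same session, so the login cookies are sent automatically;
# '/account' is a hypothetical protected page, not part of the example above
br.visit('http://URL.COM/account')
print(br.body())     # HTML of the now-authenticated page
print(br.cookies())  # list of cookie strings the session stored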