Why doesn't scrapy accept more than one cookie?
Hello.
The gist of the question: there is a certain site; I log in to it with PhantomJS and get a JSON file with the cookies.
After that I want to hand these cookies to Scrapy, so that Scrapy is logged in and can fetch data from the site.
When I test the code below against httpbin.org/cookies, no cookies come through.
If I pass a single element, for example cookies[0], then httpbin.org/cookies does show that one cookie (which is expected), but I need to pass everything from the JSON file.
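For context, the cookie.json that ph.js produces (and that the code below loads) looks roughly like this; the field names follow PhantomJS's usual cookie export, and the values here are made up:

# A list of cookie dicts as PhantomJS typically exports them
# (hypothetical names and values; the real file is written by ~/ph.js)
cookies = [
    {"name": "sessionid", "value": "abc123", "domain": ".site.ru",
     "path": "/", "httponly": True, "secure": False, "expiry": 1493000000},
    {"name": "csrftoken", "value": "xyz789", "domain": ".site.ru",
     "path": "/", "httponly": False, "secure": False, "expiry": 1493000000},
]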
import json
import os

from scrapy.http import Request
from scrapy.spider import BaseSpider


class MySpider(BaseSpider):
    name = 'MySpider'
    start_urls = ['http://site.ru']

    def get_cookies(self):
        # Run the PhantomJS login script, which writes cookie.json
        os.system("phantomjs ~/ph.js")
        with open('cookie.json') as data_file:
            data = json.load(data_file)
        print data
        return data

    def parse(self, response):
        cookies = self.get_cookies()
        return Request(url="http://httpbin.org/cookies", cookies=cookies,
                       callback=self.after_login)

    def after_login(self, response):
        print response.body_as_unicode().encode('utf-8')
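From the Scrapy documentation, the cookies argument of Request accepts two forms, sketched below with made-up cookie names and values; whether extra PhantomJS fields such as "httponly" or "expiry" are tolerated in the second form may depend on the Scrapy version:

# Form 1: a plain dict mapping cookie names to values
Request(url="http://httpbin.org/cookies",
        cookies={"sessionid": "abc123", "csrftoken": "xyz789"},
        callback=self.after_login)

# Form 2: a list of dicts with "name" and "value" keys
# (optionally also "domain" and "path")
Request(url="http://httpbin.org/cookies",
        cookies=[{"name": "sessionid", "value": "abc123",
                  "domain": ".site.ru", "path": "/"}],
        callback=self.after_login)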