Python
sddvxd, 2017-01-17 23:52:58

How to “wind up” the visitor counter using urllib?

Good evening
While learning the urllib library, I immediately wanted to try out my "knowledge" and inflate the view counter on some random site (I read up on proxies, and everything seemed clear).
Here is the code:

import time
from urllib import request as urlrequest

proxy = ["120.52.73.97:87"]
url = "http://example.com"
req = urlrequest.Request(url)
iterator = 0

while True:

    if len(proxy) == iterator:
        print("retrying in 5 seconds")
        time.sleep(5)
        iterator = 0

    req.set_proxy(proxy[iterator], "http")
    response = urlrequest.urlopen(req)

    print("sent the request and got a response")
    iterator += 1
    #print(response.read())

The loop does make several request/response rounds, but after a series of attempts an exception is raised:
urllib.error.HTTPError: HTTP Error 302: The HTTP server returned a redirect error that would lead to an infinite loop.
The last 30x error message was: Moved Temporarily

The view count does not increase, even when I use many proxies in the list (10).
Please help.
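For what it's worth, a 302 loop like this usually means the free proxy itself is redirecting every request (dead or captive proxies often bounce traffic to their own page). A minimal sketch of a more defensive version, wrapping `urlopen` so a misbehaving proxy is reported and skipped instead of crashing the whole script (the helper names are my own, not from the question):

```python
import time
from urllib import request as urlrequest
from urllib.error import HTTPError, URLError

def fetch_via_proxy(url, proxy, timeout=10):
    """Try one request through `proxy`; return True on success, False on failure."""
    req = urlrequest.Request(url)
    req.set_proxy(proxy, "http")
    try:
        urlrequest.urlopen(req, timeout=timeout)
        return True
    except (HTTPError, URLError, OSError) as exc:
        # a broken proxy (302 loop, timeout, refused connection) ends up here
        print("proxy %s failed: %s" % (proxy, exc))
        return False

def run_forever(url, proxies, pause=5):
    """Cycle through the proxy list endlessly, pausing between passes."""
    while True:
        for proxy in proxies:
            fetch_via_proxy(url, proxy)
        print("retrying in %d seconds" % pause)
        time.sleep(pause)
```

Calling `run_forever("http://example.com", ["120.52.73.97:87"])` reproduces the original loop, except that one bad proxy no longer kills the run.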


2 answer(s)
Alexey Yarkov, 2017-01-18
@sddvxd

This is how I'm currently inflating votes on a site for my nephew, from a VPS ))))

#!/usr/bin/python3
# -*- coding: utf-8 -*-

import sys
import requests

if len(sys.argv) < 2:
    sys.exit("usage: votes.py <proxy list file>")
proxy_file = sys.argv[1]

url = "http://foto.konkurs.ru/like.php?id=123"  # fake URL )))
counter = 0

with open(proxy_file, 'r') as proxyFile:
    for proxy_line in proxyFile.read().splitlines():
        pr_dict = {"http": "http://%s" % proxy_line}
        try:
            result = requests.request(
                "GET", url, timeout=(5, 10), proxies=pr_dict)
            if result.status_code == requests.codes.ok:
                counter += 1
                print(proxy_line)
        except requests.exceptions.ConnectTimeout:
            print("Error: %s" % proxy_line)
        except requests.exceptions.ReadTimeout:
            print("Error: %s" % proxy_line)
        except requests.exceptions.ConnectionError:
            print("Error: %s" % proxy_line)
        except requests.exceptions.RequestException:
            pass
print("Total successful requests: %d" % counter)

Proxies from here:
1.txt - free.proxy-sale.com/?port%5B%5D=http&type%5B%5D=an
2.txt - awmproxy.com/freeproxy.php
3.txt - www.prime-speed.ru/proxy/free-proxy-list/all-worki...
I concatenate them into one file, deduplicate it, and run the script above:
$ cat 1.txt 2.txt 3.txt | sort | uniq > proxies.txt
$ nohup ./votes.py ./proxies.txt &
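The `sort | uniq` step can also be done inside the script itself, which additionally preserves the original order of the lists. A small sketch (the function name is mine):

```python
def load_unique_proxies(*paths):
    """Merge several proxy-list files, dropping blank lines and
    duplicates while keeping the first-seen order."""
    seen = set()
    merged = []
    for path in paths:
        with open(path) as fh:
            for line in fh:
                proxy = line.strip()
                if proxy and proxy not in seen:
                    seen.add(proxy)
                    merged.append(proxy)
    return merged
```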

Dimonchik, 2017-01-17
@dimonchik2013

the site may have bot protection in place
also, try pycurl
