Requests.get() runs multiple times on a single request

mark barton
· Feb 13, 2019

I have a simple Python proxy script that uses a GET request to download the HTML of a website and display it inside an HTML iframe. However, I've noticed a strange issue when running the script multiple times.

It seems that every time I run the script, the GET request runs again on top of the previous requests, e.g.:

GET /background_process?txtAddress=facebook.com HTTP/1.1" 200

GET /background_process?txtAddress=youtube.com HTTP/1.1" 200
GET /background_process?txtAddress=youtube.com HTTP/1.1" 200

GET /background_process?txtAddress=reddit.com HTTP/1.1" 200
GET /background_process?txtAddress=reddit.com HTTP/1.1" 200
GET /background_process?txtAddress=reddit.com HTTP/1.1" 200

It looks like a leak of some kind but I can't figure out how to stop it.

#Python 2.7
from flask import Flask, request, jsonify
import requests

app = Flask(__name__)

@app.route('/background_process')
def background_process():

    try:
        address = request.args.get("txtAddress") #Address supplied by the AJAX call
        resp = requests.get(address) #Request the target website's content
        return jsonify(result=resp.text) #Send the HTML back to the page

    except Exception, e:
        return str(e)

EDIT:

I've done a bit of digging around and I think it all stems from my AJAX call. I believe it isn't resetting the previous value, so it runs again and again, incrementing by one each time. Does anyone know how to stop this?

//RUNS PROXY SCRIPT
$('a#process_input').bind('click', function() {
    $.getJSON('/background_process',
        { txtAddress: $('input[name="Address"]').val() },
        function(data) {
            $("#web_iframe").attr('srcdoc', data.result);
        });
    return false;
});
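
For reference, here is a rough sketch of the workaround I'm considering. It assumes the cause is that this binding snippet gets executed again on each run, so .bind() keeps stacking click handlers and each click fires one GET per handler; I haven't confirmed that this is actually what's happening. Detaching any handler left over from a previous run before binding again should leave a single handler:

//SKETCH (unverified): clear any previously bound click handler before rebinding
$('a#process_input').unbind('click').bind('click', function() {
    $.getJSON('/background_process',
        { txtAddress: $('input[name="Address"]').val() },
        function(data) {
            $("#web_iframe").attr('srcdoc', data.result);
        });
    return false;
});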