Multiple requests

Discussion in 'Visual Basic .NET' started by christoss1959, Aug 31, 2012.

  1. christoss1959

    christoss1959 Senior Member

    Joined:
    Nov 25, 2010
    Messages:
    894
    Likes Received:
    1,150
    Home Page:
    Hi guys, I've made a program for a client and everything should be working fine, except for one thing: after the program has crawled a number of links (all on the same website), it cannot crawl another one, even though that same link works fine on its own and there is no IP ban (if I just close the program and start it again, it works). I think it has something to do with the connections, but for the life of me I cannot figure out the problem. Any help would be greatly appreciated.

    This is the code with all the tricks I knew that could help with this problem
    Code:
    Dim request2 = DirectCast(WebRequest.Create(tempurl), HttpWebRequest)
    request2.KeepAlive = False
    
    ' Reflection hack: force the ServicePoint's non-public HttpBehaviour
    ' property to 0 so it stops pipelining/reusing the connection
    Dim sp = request2.ServicePoint
    Dim prop = sp.[GetType]().GetProperty("HttpBehaviour", BindingFlags.Instance Or BindingFlags.NonPublic)
    prop.SetValue(sp, CByte(0), Nothing)
    
    request2.CookieContainer = mycookiecontainer
    request2.UserAgent = "Mozilla/5.0 (Windows NT 6.1; rv:14.0) Gecko/20100101 Firefox/14.0.1"
    request2.Timeout = 5000
    
    Dim response2 As WebResponse = request2.GetResponse()
    Dim stream2 As Stream = response2.GetResponseStream()
    Dim reader2 As New StreamReader(stream2)
    html2 = reader2.ReadToEnd()
    response2.Close()
    request2.Abort()
    
    Christos
     
  2. gimme4free

    gimme4free Executive VIP Jr. VIP Premium Member

    Joined:
    Oct 22, 2008
    Messages:
    1,879
    Likes Received:
    1,931
    Perhaps the site is setting a cookie to track the process, and that's why it works again when you restart the program?

    Also, why do you call Abort()? That should be used when you want to stop a request mid-way, not after you've already closed the response. Shouldn't you be using request2.Close?

    Also make sure to check your program's flow: use Try/Catch blocks, skip bad links, etc.
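    A rough sketch of the flow I mean, with a Try/Catch around each link so one bad URL gets skipped instead of stalling the whole crawl (crawlqueue and FetchHtml here are hypothetical placeholder names for your own link list and request code):

    Code:
    For Each tempurl As String In crawlqueue
        Try
            ' FetchHtml would wrap the request code from your first post
            Dim html2 As String = FetchHtml(tempurl)
            ' ... process html2 ...
        Catch ex As WebException
            ' Log and skip the bad link instead of retrying forever
            Debug.WriteLine("Skipping " & tempurl & ": " & ex.Message)
        End Try
    Next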
     
  3. christoss1959

    christoss1959 Senior Member

    Joined:
    Nov 25, 2010
    Messages:
    894
    Likes Received:
    1,150
    Home Page:
    There is no request2.Close() method, and since some connection remains live for no apparent reason, I tried Abort() as a failsafe, but that didn't work.
    I have a Try loop that forces a retry until success, but it doesn't help; after 179 retries I had no luck. Also, the program clears the cookies on any error, so cookies shouldn't be the problem.
     
  4. christoss1959

    christoss1959 Senior Member

    Joined:
    Nov 25, 2010
    Messages:
    894
    Likes Received:
    1,150
    Home Page:
    The problem seems to go away when you close and dispose both the stream and the StreamReader.
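    For anyone hitting the same issue, here's a minimal sketch of that fix: wrapping the response, stream, and reader in Using blocks so each one is disposed even if an exception is thrown, which releases the underlying connection back to the pool (tempurl, mycookiecontainer, and html2 are assumed from the code in the first post):

    Code:
    Imports System.IO
    Imports System.Net

    Dim request2 = DirectCast(WebRequest.Create(tempurl), HttpWebRequest)
    request2.KeepAlive = False
    request2.CookieContainer = mycookiecontainer
    request2.UserAgent = "Mozilla/5.0 (Windows NT 6.1; rv:14.0) Gecko/20100101 Firefox/14.0.1"
    request2.Timeout = 5000

    ' Using guarantees Dispose() runs on each object, even on error,
    ' so the connection is freed for the next request
    Using response2 As WebResponse = request2.GetResponse()
        Using stream2 As Stream = response2.GetResponseStream()
            Using reader2 As New StreamReader(stream2)
                html2 = reader2.ReadToEnd()
            End Using
        End Using
    End Using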