
Target URL list causing issues?

I purchased a couple of lists the other day and have noticed an issue with the Left Target URLs count, but I'm not sure if it's causing problems with the project.

Basically, I have turned off the search engines, so the project is not scraping at all, just feeding from two separate lists. When I was setting up the project to start with the lists, I did clear the URL cache and target URL history (but didn't delete the accounts). At the time, the Left Target URLs count was somewhere around 33 million, which I thought was unusual.
Since clearing it and running with the two lists, it's been just over a day. After noticing it running quite slowly this morning, I checked and saw the Left Target URLs count is now at 46 million.

This is my kitchen sink project, but I noticed a Blog Comment project showing something similar (around 26 million) the other day too.

So how does the list get so large in just a couple of days? The lists had a combined total of under 200,000 URLs.
And is this causing an issue with the projects? Surely such a large number of URLs will cause it to slow down.
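
For a sense of scale, here is a rough back-of-envelope model (in Python; the processing rate, the per-page link count, and the function itself are pure guesses on my part, not anything from SER) of how an option that re-feeds extracted links into the queue can turn a 200k seed list into tens of millions of targets within a couple of days:

    # Purely illustrative sketch: SER's internals aren't public, and both
    # rates below are assumptions. It only shows how feeding extracted
    # links back into the queue can outpace consumption.
    def simulate_left_targets(seed=200_000, processed_per_hour=10_000,
                              links_per_page=100, hours=48):
        """Queue size after `hours`, assuming each processed target
        re-feeds `links_per_page` newly extracted URLs on average."""
        queue = seed
        for _ in range(hours):
            done = min(processed_per_hour, queue)
            queue += done * links_per_page - done  # added minus consumed
        return queue

    print(simulate_left_targets())  # ~47.7 million, the observed order of magnitude

With those made-up numbers the queue gains almost a million targets an hour, which would explain why the count climbs even while links are being submitted.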

Anyone else getting a similar problem?

Comments

  • SvenSven www.GSA-Online.de
    You probably have other options checked below that search engine box.
  • I have checked 'Use URLs linking on same verified URL (supported by some engines only)', but I haven't checked 'Analyse and post to competitor backlinks'.

    I have the same bits checked on most of my projects, and it seems to only happen on the Blog Comment and kitchen sink projects.


    If I uncheck that, would the list stop growing so large?
  • SvenSven www.GSA-Online.de
    Yes, uncheck this.
  • Okay, I unchecked this yesterday but noticed this morning that the list is still growing.

    Yesterday I made the change when the list was at 46,254,475.
    I left the project running and it's submitted around 20k links, but checking the Left Target URLs this afternoon I can see the list is at 46,255,553 - so it's grown by over 1,000 even though it's processed at least 20,000 URLs (quick arithmetic on that below).

    Should I just keep it running and hope it starts to decrease soon, or would it be worth trying to clear out the list and start again?
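
    Doing the arithmetic on those numbers (just a back-of-envelope check, nothing from SER itself), the queue must still be taking in new targets faster than it drains:

        # Net growth plus what was consumed = what was added in the window.
        before, after = 46_254_475, 46_255_553
        processed = 20_000                    # approximate submissions only
        added = (after - before) + processed  # 1,078 net growth + 20,000
        print(added)                          # 21078 new targets re-fed

    So even with that option unchecked, something added roughly 21,000 new targets while 20,000 were being processed.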
  • I had the same problem with my own scraping list that I wanted GSA to identify. I unchecked "Continuously try to post to a site even if it fails" and my numbers started to drop.
  • I did look at that, but it's already unchecked, so I don't think it's that one, for me at least.

    It's locking up for a minute or so every once in a while, but I'm going to give it another 24 hours and see if it's still on the rise.
  • Okay, just checking up on it, and it's definitely still on the rise, currently showing 46,255,787.

    Is there another setting that could be causing this? I have a feeling processing over 46 million URLs is quite a big ask for the project.