
Request: Remaining Targets in List for GSACF

@Sven could you add a remaining-targets number, either as a right-click option or next to the submissions count at the bottom of the tool? I scrape (separately from CF) and import lists, but if the tool gets through them during the night, I'm left with it sitting idle for hours. I've noticed that lists of around 200k are the right amount to import; anything more and it slows down, but depending on my proxies and server the performance is erratic, so it's difficult to judge when to add another list.

If there's a better way of doing this process, I'm keen to hear.

Comments

  • Sven www.GSA-Online.de
    Right click on the header...you have plenty of columns there that should show you what you need.
  • loopline autoapprovemarketplace.com
    My 2 cents: I have a couple of servers that do nothing but run identification on domains. I load them in 1-million-URL chunks and tend to run 1 to 2 identification projects at 350 to 500 threads (depending on the server), and I use no proxies for this process.

    Then I post on other servers, also loading in 1-million-URL chunks. I run about 60K to 100K per campaign per day; going faster tends to lower click-through rates due to spam filters. Also, sending too many to a given domain in a day tends to land you on that domain's blacklist.

    That's maybe more than you asked for, but perhaps it will spark an idea.
  • londonseo London, UK
    @loopline - what spam filters and which domain blacklist are you referring to?
  • londonseo London, UK
    @sven - I do not see a way to view remaining target URLs.
  • Sven www.GSA-Online.de
    Sorry, I thought you were talking about Search Engine Ranker.
  • loopline autoapprovemarketplace.com
    Email spam filters. I mean, these may be contact forms, but the end websites "typically" email that response to someone, so it still passes through spam filters.

  • londonseo London, UK
    @loopline - if the message were in spin format, thereby making each message 100% unique, would it still pass through email spam filters?

    Do you see the option to view remaining target URLs?

    Thanks
  • loopline autoapprovemarketplace.com
    I don't think there is a column that specifically shows how many of the project's targets are unprocessed.

    Spam filters pick up on all sorts of things: the message body, the subject, the URLs. So even if you spin the message to death but only spin out 6 different domains for your links, and you send 60K messages, each message might be unique, but you're still going to have roughly 10K messages referencing each domain (60K spread evenly across 6 domains).

    So they are all going to be read by the spam filters, but as to how many make it through, you would need to split test to know for sure.
  • @everyone there isn't a remaining/unprocessed figure - that's what this whole thread is about...

    I'll try again - Request: Can we get an unprocessed stat please?
  • londonseo London, UK
    @sven - can we have this added?
  • Sven www.GSA-Online.de
    I still don't get it...do you mean what keyword was used for searching and what has to be done?
  • loopline autoapprovemarketplace.com
    If I understand correctly, and someone can correct me if I am wrong, I believe they want to see a count of how many URLs are left to process in a loaded list.

    Where it gets tricky is that you have multiple moving parts.

    So if I load in a list - which I believe is the use case they are speaking of - and it's 1 million URLs, then it's easy enough to add a column with a counter that says XXXX complete and XXXX left to process. You're simply subtracting the filtered, successful, and failed counts from the total loaded, and that's what's left unprocessed (see the sketch at the end of this comment). This would be handy, of course, as you could predict what day a project will end, or thereabouts.

    However, that assumes you're loading a list. If you're loading keywords, it gets entirely trickier, because you can't predict how many results you will scrape from each keyword/footprint, so you would then almost have to have a counter with the quantity of keywords left, I guess.

    The issue I see is when scraping is involved, or scraping and a loaded list together. Because you can't predict how many results a keyword will return, you would need to give a count of keywords left to harvest. But then if someone is using 32 different engines, you run into tracking stats across every combination of keyword/footprint and engine.

    The next issue I see is that the entire premise here is being able to see status and predict when it's done. Things could go smoothly and then hit a group of keywords that produce little to no results across 32 engines, so one minute you're halfway done and the next you're 80% done, and then someone is potentially going to be annoyed because their prediction was off. It might be fine, but I always like to think through the worst-case scenario when building a feature that could be used by thousands of people with different use cases.


    At any rate, I think that's the premise of what people are after. @londonseo @JudderMan - are you talking about a use case where you load a list and post directly to the URLs you loaded, are you talking about using keywords and letting WC scrape engines for results, or both mixed? Or something else?
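
    For illustration, here is a minimal sketch of the counter math described above, in Python. The counter names, the processing rate, and the remaining_and_eta() helper are all hypothetical, not actual WC internals; this only shows the subtraction and the naive ETA it would enable.

        from datetime import datetime, timedelta

        def remaining_and_eta(total_loaded, filtered, success, failed,
                              processed_per_hour):
            # Remaining = total loaded minus everything already handled.
            processed = filtered + success + failed
            remaining = total_loaded - processed
            # Naive ETA: assumes the recent processing rate holds steady,
            # which (as noted above) it often will not.
            if processed_per_hour <= 0:
                return remaining, None
            eta = datetime.now() + timedelta(hours=remaining / processed_per_hour)
            return remaining, eta

        # Illustrative numbers only.
        remaining, eta = remaining_and_eta(
            total_loaded=1_000_000, filtered=120_000,
            success=300_000, failed=80_000,
            processed_per_hour=40_000)
        print(f"{remaining:,} targets left, done around {eta:%Y-%m-%d %H:%M}")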
  • I'm talking about loading lists. I don't 'check' first - I don't have time for that, and it doesn't save much time compared to running it off the bat.

    So I don't use keywords; it all comes from the lists.

    My problem is this: I load in 50k or 100k lists on top of a project that may have already run 1m. I can't use huge lists as they throttle and slow the tool down, so 50k chunks are the best I've found. I often find that it has blasted through them and is sitting idle at peak times of the day. I can't babysit it all day; I have far too many other things to do. Sometimes it takes hours to go through 100k, sometimes under an hour. If I had an idea of what's remaining, I could plan and import lists once it's nearly finished.

    Orrrr - is it possible to import lists from Dropbox once it clears one txt file? That might be a better idea.

    All of these little tweaks and functions are available in SER, so I think it may be easy to implement.

    Obviously, @Sven, I'm not demanding anything; I'm just keen to make the tool better and more hands-free, which in turn will probably help it sell more.
  • loopline autoapprovemarketplace.com
    You can already import from a file. That was added about 20 versions ago.

    Under Start, go to "import urls + send + delete".

    But it looks like Sven added stats in the most recent updates; it has all kinds of stats now.
  • @loopline :p I know that; I meant auto-importing once one text file has been processed.

    Sweet about the stats - I'll check it out.
  • loopline autoapprovemarketplace.com
    You mean like dynamically scanning a folder looking for new text files to import when the current ones are done?  So you can always just be dropping new files in there to import?
  • Yep. Auto import. So I can just drop files into Dropbox.
  • @Sven - as always you da man! Thanks buddy, that % of list thing is perfect.
  • loopline autoapprovemarketplace.com
    @Sven Yes, I have to say I thought my system was fine, but I have found that having a percentage of targets left to post to is very handy. Thanks for that!
  • AlexR Cape Town
    @JudderMan - this would be awesome:
        "Auto import. So I can just drop files into Dropbox."
    That would be amazing. You could set a Dropbox folder for each project and just dump txts in there, and it would work through them as they get added and delete them when done (see the sketch at the end of this comment).

    That's how I'd use it.
        "My problem is: I load in 50k or 100k lists, on top of a project that may have ran 1m before. I can't use huge lists as it throttles and slows the tool down, so 50k chunks are best I've found."
    Is this because of the rendering?
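
    For what it's worth, the drop-folder idea could be prototyped outside the tool. Here is a minimal polling sketch in Python, assuming a hypothetical watched folder and a stand-in import_list() hook; WC exposes no public API for this, so the hook is purely illustrative.

        import time
        from pathlib import Path

        WATCH_DIR = Path("~/Dropbox/wc-project-1").expanduser()  # hypothetical folder

        def import_list(path: Path) -> None:
            # Stand-in for whatever mechanism actually feeds the tool.
            print(f"importing {path.name}")

        def watch(poll_seconds: int = 60) -> None:
            # Poll the folder forever; import and delete each txt as it appears.
            while True:
                for txt in sorted(WATCH_DIR.glob("*.txt")):
                    import_list(txt)
                    txt.unlink()  # delete when done, per the idea above
                time.sleep(poll_seconds)

        if __name__ == "__main__":
            watch()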
  • looplineloopline autoapprovemarketplace.com
    You can kind of already do this. You put a bunch of files in the folder and use the function that already exists; then, when it's done, you just add a bunch more files. You still have to touch it once in a while, but really, I wouldn't want to leave it running for weeks and weeks anyway.

    I check weekly that captcha solving is working, and you need to deal with opt-outs, update WC from time to time, make sure your proxies are still working, etc.

    I want my stuff to run for one to two weeks tops, but after that point I want to touch it to make sure everything is working. Having more than a dozen servers, I find that from time to time something will always go wrong; it's the way of things.

    I'm not saying Sven shouldn't build it or that you guys are wrong; I'm just giving my 2 cents.