
Miss the old times when GSA was running well


Comments

  • I just stopped GSA SER (using ver 8.32) and clicked on all these download failed
    errors in the campaigns. I checked a whole bunch of them and the websites' links
    open right up in my browser using the VPS IP address. So I went over to Firefox, loaded
    the IP address that was used for the download failed on that particular URL, and the IP
    is not blocked. It opens right up.

    19:06:45: [-] 014/160 download failed (proxy:!!!!!!!!!) - http://palestradeportoalegre.com.br/index.php/time/noticias-todas-as-novidades/entry/palmeiras-confia-na-permanencia-de-felipao-goalcom
  • Well, it could be just a temporary ban of your IP. Was it a very recent "download failed"?
  • Hey!
    With the last 2 updates I installed today, my LPM increased to more than 100, and the VPM to more than 50.
    There are still some "download failed (aborted)" errors, but far fewer than before :)>-
  • Brandon Reputation Management Pro
    9.33 is awesome!
  • rad
    edited December 2014
    I updated earlier with the recent updates and it seems to be much better!

    Just added the Gscraper proxy service in Gscraper. Wow! 1.5 billion URLs scraped
    in 5 hrs. :)
  • magically
    I use GreenCloudVPS as host and am still facing issues with low verified counts, even after 9.34.
    Not sure what to do really; maybe I should consider investing in Xrumer, as suggested by peterperseo, to see if that improves things.

    Remember that when someone says it runs wonderfully, they don't state which engines that applies to...
    Could be URL redirects, blogs etc., which are worthless in my opinion.

    In my case, even premium lists have an enormous amount of fails.
    Will run some more tests, perhaps add more private proxies. If that fails, well, I'm about to look into Xrumer or SEnuke.

  • Brandon Reputation Management Pro
    @magically are you sure it's not your setup? Even if you had good success with old lists, that won't stay the same. Sites go down on a daily basis. I expect about a 50% loss after a week for my newly verified lists.
  • magically
    @Brandon
    Well, I'm pretty sure it's not the setup. You are welcome to take a look if you have some spare time...
    It should be possible to create a new user so you can check up on things...

    Yep, I'm aware of the loss too; things are not static, especially not sites :P

    However, the performance is pretty bad regardless of premium list or scrapings.
    In terms of scrapings, I'm also pretty sure they are decent, as I use an ini tool to extract footprints from GSA SER (roughly what the sketch below does), and my keywords are optimized as well.
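
    A minimal sketch of that kind of footprint extraction, assuming the engine definitions are plain .ini files under SER's Engines folder and that the stock footprints sit on "search term=" lines; the install path, key name, and output file are assumptions for illustration, not something confirmed in this thread:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.*;
import java.util.*;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class FootprintExtractor {
    public static void main(String[] args) throws IOException {
        // Assumed location of SER's engine definitions -- adjust to your install.
        Path enginesDir = Paths.get("C:/Program Files (x86)/GSA Search Engine Ranker/Engines");

        Set<String> footprints = new TreeSet<>();
        List<Path> iniFiles;
        try (Stream<Path> files = Files.list(enginesDir)) {
            iniFiles = files.filter(p -> p.toString().toLowerCase().endsWith(".ini"))
                            .collect(Collectors.toList());
        }
        for (Path ini : iniFiles) {
            // ISO-8859-1 accepts any byte sequence, so odd characters in an
            // engine file won't abort the run.
            for (String line : Files.readAllLines(ini, StandardCharsets.ISO_8859_1)) {
                if (line.toLowerCase().startsWith("search term=")) {
                    // Several footprints can share one line, separated by "|".
                    for (String fp : line.substring("search term=".length()).split("\\|")) {
                        if (!fp.trim().isEmpty()) {
                            footprints.add(fp.trim());
                        }
                    }
                }
            }
        }
        // One footprint per line, ready to feed into a scraper.
        Files.write(Paths.get("footprints.txt"), footprints);
    }
}
```

    This is not necessarily how the ini tool mentioned above works; it just shows the idea of pulling the stock footprints out so they can be paired with your own keywords.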



  • @magically

    I have GreenCloud VPS as well. I have been too busy with Gscraper. I updated
    yesterday but not with today's update; I am using ver 9.32. At the same time
    I contacted GreenCloud VPS and sent them a ticket to give my VPS a hard
    reboot, and since then I ran GSA SER for 5 hrs last night with no more errors.
    I also noticed a big decrease in download failed. I'm sure the engines will improve
    again.

    Update to the latest GSA SER version, then send a ticket inside the GreenCloud hosting
    panel and ask them to give your VPS server a HARD REBOOT!
  • magically
    @rad
    Thank you for your advice, I will surely try that out :)
    Just bought some keywords and footprints, so I will focus on Gscraper for a couple of hours as well.

    Once ready, I will report back on whether there are improvements.
  • rad
    edited December 2014
    @magically

    One thing I did notice inside my VPS: my clock kept changing after I set it, even
    after restarting inside the VPS. After the hard reboot from GreenCloud, the clock is
    working fine. Also, I still haven't updated GSA SER because I am scraping with Gscraper.
    I cracked the TCP limit and the thing is a beast; I'm up to 45k URLs scraped a min. My final
    scrape will be done in 4 hrs. After the 24 hr run I should have 6 to 7 billion URLs. Then GSA SER
    goes back on at full speed with the new updates as I sort out the 6-7 billion URLs (a rough sketch of that clean-up step follows this post). lol


    OHHHHHHH YEAAAA, also, if you don't ask for a hard reboot in the next few hours, they probably
    won't get to it till tomorrow.
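
    Sorting a scrape like that usually starts with de-duplicating by domain, so only one URL per site gets fed back to SER. A minimal sketch of that pass; the file names are made up for the example:

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.net.URI;
import java.nio.charset.StandardCharsets;
import java.nio.file.*;
import java.util.HashSet;
import java.util.Set;

public class DedupeByDomain {
    public static void main(String[] args) throws IOException {
        // Hypothetical file names, just for the example.
        Path in  = Paths.get("scraped_urls.txt");
        Path out = Paths.get("deduped_urls.txt");

        Set<String> seenHosts = new HashSet<>();
        // ISO-8859-1 tolerates whatever raw bytes a big scrape contains.
        try (BufferedReader reader = Files.newBufferedReader(in, StandardCharsets.ISO_8859_1);
             BufferedWriter writer = Files.newBufferedWriter(out, StandardCharsets.ISO_8859_1)) {
            String line;
            while ((line = reader.readLine()) != null) {
                String url = line.trim();
                if (url.isEmpty()) {
                    continue;
                }
                try {
                    String host = URI.create(url).getHost();
                    // Keep only the first URL seen for each host.
                    if (host != null && seenHosts.add(host.toLowerCase())) {
                        writer.write(url);
                        writer.newLine();
                    }
                } catch (IllegalArgumentException malformed) {
                    // A big scrape always contains junk lines; skip them.
                }
            }
        }
    }
}
```

    Streaming line by line keeps memory bounded by the number of distinct hosts rather than the number of URLs; for a list anywhere near the size mentioned above you would still want to sort or shard on disk, but the idea is the same.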
  • magically
    @rad
    Could you please tell me how to crack the TCP limit?
    I have Server 2008... I bet that's the problem I have!
  • @magically
    Win 2008 doesn't have any limits. I run 15-20k connections with one piece of software on a server with Windows 2008 without problems. Gscraper has a "crack tcp limits" button; that's how you can crack the limits, but it won't make any difference.
  • magically
    @satyr85
    Yeah, exactly as I expected... I did read up on it; it's something in a file that has no impact at all.
    Well, even after buying footprints and keywords, a small test run still looks pretty bad.
    I'm able to scrape at approx. 26K per min stable.
    However, still no signs of luck. Damn, it's pissing me off.


  • edited December 2014
    @magically
    26k per min is nothing big, as you can see. I don't think anyone will sell you good keywords or footprints. I saw a few premium footprint/keyword lists and they were nothing special.

    Good footprints and keywords: you will find some targets.
    Low-quality footprints/keywords: you can scrape for a very long time and you won't find anything special.
    Self-made, high-quality footprints: priceless.
  • magically
    @satyr85
    I'm afraid you are right here... At some point I hope to get some decent ones that actually work.
    I will study the 'art of footprints' during Christmas and refine and develop some great ones.
    Afterwards, I will program a tool in Java that will be able to generate exactly what one is after (something along the lines of the sketch below).
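
    At its simplest, such a tool boils down to crossing every footprint with every keyword to build search queries. A minimal sketch under that assumption; the input/output file names and the quoting style are illustrative, not a description of any existing tool:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.*;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class QueryBuilder {
    public static void main(String[] args) throws IOException {
        // One footprint / keyword per line; file names are just for the example.
        List<String> footprints = Files.readAllLines(Paths.get("footprints.txt"), StandardCharsets.UTF_8);
        List<String> keywords   = Files.readAllLines(Paths.get("keywords.txt"), StandardCharsets.UTF_8);

        List<String> queries = new ArrayList<>();
        for (String fp : footprints) {
            for (String kw : keywords) {
                if (fp.trim().isEmpty() || kw.trim().isEmpty()) {
                    continue;
                }
                // Footprint plus quoted keyword, e.g. "powered by ..." "my niche keyword".
                queries.add(fp.trim() + " \"" + kw.trim() + "\"");
            }
        }
        // Shuffle so the scraper doesn't hit one footprint thousands of times in a row.
        Collections.shuffle(queries);
        Files.write(Paths.get("queries.txt"), queries, StandardCharsets.UTF_8);
    }
}
```

    The quality point made above still applies: the cross product only multiplies whatever you feed it, so weak footprints just produce more weak queries.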
  • rad
    edited December 2014
    Use Gscraper and use EXTRACT FOOTPRINTS on a long list, then sort. You can always use GSA SER footprints
    with a ton of keywords, then scrape for a day or days, then extract footprints, then sort out the good ones.

    @magically You need the Gscraper proxy service, or a ton of proxies.

    Cracking the TCP limit gets you 1500 threads. Make sure you have plenty of proxies.

  • magically
    @rad
    Awesome advice, many thanks. I will try to extract footprints within Gscraper :)

    I'm sure that will be better. Awesome, rad!
  • @magically I spent months getting the footprints right... sorry, not gonna give those away. However, you can find about 3-5k footprints on BHW quite easily.

    @Tim89 how do you verify your URLs? I stop all projects once a day or every two days and run 50 projects verifying for 1-3 hours at a time. This works so much better than letting SER verify automatically. Yes, it is a problem for those with multiple servers, but I have those too, and TBH I'm just used to doing it this way as it works. I also re-verify every 4-7 days, but that's the one that takes forever, and I can't be sure whether the ones being removed actually need to be or not, but that's for a different thread.
  • Tim89 www.expressindexer.solutions
    Hey @JudderMan, I have always let SER verify automatically on the fly, making use of resources, and also, as you said, I have multiple servers, which would make it a little tedious...
  • magically
    edited December 2014
    @JudderMan
    Thanks for the tip. Actually, I did try many, many of those... Unfortunately, same shitty results.
    I feel like I have wasted a lot of money on nothing really. Merry Christmas up front btw ;)
  • @Tim89 try it manually: stop all projects, in fact restart SER, then set them to Active V. Once it's finished, restart SER again and let it rip. I don't know why, but it just works better this way. Yes, it's a little tedious, but the results, for me anyway, have been worth it.

    @magically use Santos's SER Footprint Tool Editor; then you can see how many targets each footprint will give. I find it better than the built-in one in SER.
  • magically
    @JudderMan
    Thanks for the tip. I will surely try this out and see if I have some more luck :)
    Highly appreciated; I will run a scrape now and some small tests to see.
  • Hey! Today I came back to the old days: after 3 weeks of low LPM, I'm getting around 180 LPM / 40 VPM.
    Not bad in comparison with the last few days.

    Part of this change may be due to the new Loopline list and sync that went live last night, but probably something also got fixed in SER.

    =D>

    Hope everything is better for all of you too!