
7.80 Update has once again broken saving verified engines to list


Comments

  • Sven (www.GSA-Online.de)

    One reason why sites are not saved: they were pulled from that same list before, so they will not get saved to it again. That behavior was added in the latest version.

    So if a project gets a new target from the verified site list, it does not add it to the identified/successful/verified site lists again, because it assumes that entry was already added there, given that the target came from the verified site list in the first place.
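
    In other words, the rule boils down to something like the following (an illustrative Python sketch of the behavior described above, not SER's actual code; all names are invented):

        def save_verified(url, source, verified_list):
            # A target that was pulled from the verified site list is assumed to
            # already be in the lists, so it is skipped instead of saved again.
            if source == "verified_site_list":
                return
            verified_list.add(url)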

  • @Sven, if a project gets a new target from the "identified" site list, would it add it to the verified site list?...
  • Sven (www.GSA-Online.de)
    yes
  • Thanks for the confirmation...

    I usually Import URLs (identify platform and sort in), let SER run for a few days, then delete the identified folder. SER will then load URLs from the verified list. It's much faster...
  • Sven, assuming I import from the verified site list: not pulling automatically, but right-clicking the project and importing from verified. Will the newly created verified links be added to the verified list?
  • edited March 2014
    Another thing: what if a URL is imported from the failed list and a post to it gets verified? Asking since a lot of guys (including me) tend to save lists in the failed folder.
  • Trevor_Bandura
    Even 7.83 is not saving the verified URLs properly.

    I took note of my verified URL count from the stats, waited until about 200 verified links showed in the right-side verified window, then checked the verified stats again and it only showed an extra 30.

    I even manually searched a verified file for a domain name that showed as verified in the right-side window, and that domain was not even in the file. The engine was Zendesk.
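
    That kind of manual check can be reproduced outside of SER with a short script (an illustrative Python snippet; the verified folder path and the plain-text .txt layout of the sitelist files are assumptions, and the domain is a placeholder):

        import glob

        def find_domain_in_verified_files(domain, verified_dir):
            # Scan every sitelist file in the verified folder for the domain.
            for path in glob.glob(verified_dir + "/*.txt"):
                with open(path, encoding="utf-8", errors="ignore") as f:
                    if any(domain in line for line in f):
                        return path
            return None

        print(find_domain_in_verified_files("some-zendesk-domain.com",
                                            r"C:\GSA_SER\site_lists_verified"))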
  • I don't think it was either, which is why I reverted to a previous version. I programmed a fairly complex PHP script that I run on WAMP: it records every URL I've already posted to with another program into a MySQL database, and when I find new WordPress sites through GSA, I run them through what I call a dupe checker. It removes URLs I've already posted to and outputs the ones that have not been posted to yet. With 7.80, GSA was not recording a single site, supposedly, as Sven says, because it had already identified them previously. However, that can't be true, because after I reverted and it saved all identified sites again, only 50%-70% of them were found to be duplicates by the script I wrote.

    For my purposes, I just paid $24.85 for a list of every single domain that is currently registered; I think it is somewhere around 100,000,000 sites. I now import them into GSA as target URLs for my wordpressfind2.ini engine and let it go through and search the source code for /wp-content/. This works perfectly for me, as it is fast enough on an 8-core AMD fx9320 to output enough WordPress sites to keep two overclocked Core 2 Quad 9300s running full time 24/7 with the contact form submitter program I use.
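
    Not the poster's actual script, but the same dedupe idea (plus the /wp-content/ footprint check) can be sketched in a few lines of Python; sqlite3 stands in for the MySQL database so the example is self-contained, and the table, column, and file names are made up:

        import sqlite3
        from urllib.request import urlopen

        def load_posted_urls(db_path):
            # Illustrative schema: one table "posted" with a single "url" column.
            conn = sqlite3.connect(db_path)
            try:
                rows = conn.execute("SELECT url FROM posted").fetchall()
            finally:
                conn.close()
            return {row[0].strip().lower() for row in rows}

        def dedupe(new_urls, posted):
            # Keep only targets that are not already recorded as posted to.
            return [u for u in new_urls if u.strip().lower() not in posted]

        def looks_like_wordpress(url):
            # Rough footprint check: WordPress pages normally reference /wp-content/.
            try:
                html = urlopen(url, timeout=10).read().decode("utf-8", "ignore")
            except Exception:
                return False
            return "/wp-content/" in html

        if __name__ == "__main__":
            posted = load_posted_urls("posted.db")       # hypothetical database file
            with open("new_targets.txt") as f:           # hypothetical export of new URLs
                new_urls = [line.strip() for line in f if line.strip()]
            fresh = [u for u in dedupe(new_urls, posted) if looks_like_wordpress(u)]
            with open("not_yet_posted.txt", "w") as f:
                f.write("\n".join(fresh))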
  • Experiencing the same: I imported URLs from a .txt file and it's not saving them properly.

    Assuming a URL I manually import into a project from a .txt file is already in identified / successful / verified (one of those), it does not get saved to verified again after a new verified post has been made to it?