Whenever I start a project to re-verify links, it always finds fewer verified links, even if I run the check only an hour later. Is there something wrong with the re-verification process, or are the verified links really removed from the pages that quickly?
If I want a first-tier-only project to verify all its links once a day, which status do I have to choose for the project: Active (Verify Only) or Active (Re-Verify)?
If you have a mixed list:
1. Do you re-verify the links? Is it recommended to do so? (I tried verifying some old links and most of them were dead.)
2. How do you verify it separately before using it / importing it into the existing verified list?
3. How long, or what percentage of the time, will those GSA links normally stay? What's the point if the links fail to stick?
1. If you build a second tier, I think it's good to re-verify links once a day. If you don't build a second tier, I think it isn't needed.
2. Status > Active (Re-Verify)
3. Some engines, such as guestbooks, don't keep the links on the same page, and some links are removed by moderators. If a link can't be verified, that doesn't necessarily mean it isn't there.
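To illustrate point 3 (this is a standalone sketch, not SER's actual verification code): a verifier can only report whether it found the link on the exact page it fetched, so a timeout, a moderated post, or a guestbook entry that moved to another page all make a live link look "failed". A minimal presence check using only Python's standard library might look like this:

```python
# Sketch of a link-presence check: "not found on this page right now"
# is all it can tell you -- it cannot distinguish a deleted link from
# one that moved to page 2 of a guestbook.
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collects every href value from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def link_present(html, target_url):
    """True if target_url appears as an <a href> anywhere on the page."""
    finder = LinkFinder()
    finder.feed(html)
    return any(target_url in href for href in finder.hrefs)
```

A result of `False` here only means the link wasn't on the fetched page, which is exactly why a failed verification is weaker evidence than it looks.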
From your replies...
1. For the 2nd tier? Does that mean GSA SER is not good for tier 1 (or only good with a big verified list)?
2. If the mixed list is bought, how do I import it so I can use Status > Active (Re-Verify)? Do I create a new project? Then how do I import the list to verify it?
1. I mean that if you build a second tier, it's good to re-verify the links on your first tier once a day, so you make sure you aren't building second-tier links to first-tier links that no longer exist. GSA SER is good for the first tier, but it all depends on how it's used.
2. If you want to use your own list, create a new project, then right-click the project and select Import Target URLs. Next, uncheck all search engines: in Project Options > Search Engines to Use, right-click and select Check None. If you leave the search engines checked, GSA SER will first go through your imported list of URLs and then start scraping for new targets.
If you haven't built any links yet, you don't need to use the Active (Re-Verify) option.
GSA SER constantly fails to re-verify existing URLs and moves them to the failed list.
@Sven, how does SER verify or re-verify URLs? Does it look only for naked URLs, or for contextual links as well? Most of the failed pages contain contextual links, not naked URLs.
And after they are moved to failed, there is no way to re-verify them again (only to export them). Is it possible to add that feature?
What am I missing? And how can I automatically move existing (but failed, according to SER) URLs back into the verified folder?
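For anyone else puzzled by the naked-vs-contextual distinction the question raises, here is a rough sketch of it (the semantics are my assumption, not SER's internals): a naked URL shows the address itself as the visible text, while a contextual link hides the same href behind anchor text.

```python
# Sketch: classify how a target URL appears on a page -- as a naked URL
# (visible text is the URL itself), as a contextual link (anchor text is
# other words), or not at all. Not SER's code; just the distinction.
from html.parser import HTMLParser

class LinkClassifier(HTMLParser):
    """Records (href, visible text) pairs for every anchor tag."""
    def __init__(self):
        super().__init__()
        self.links = []      # list of (href, anchor text)
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

def classify(html, target_url):
    """Return 'naked', 'contextual', or 'missing' for target_url."""
    parser = LinkClassifier()
    parser.feed(html)
    for href, text in parser.links:
        if target_url in href:
            return "naked" if text == target_url else "contextual"
    return "missing"
```

A verifier that only searches the rendered text for the bare URL would report the contextual case as missing, which would match the symptom described above.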
OK, @Sven, maybe it really is a proxy issue: it has just occurred to me that the proxies from ReverseProxies.org are hard-bound to an IP, and my IP is dynamic. I will see.
But how can I import the URLs back into the "Re-Verified URLs" list automatically, or at least manually but with the right engines?
Hmm, that's not much and should not affect your bandwidth that badly. Can you check whether the connection in general is down when you see re-verification failing (open a browser and surf)? If that's the case, you are facing the problem of a weak/low-cost router that cannot handle many open ports.
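Sven's browser test can also be automated: when re-verification starts failing, probe a host that is almost certainly up; if even that fails, the problem is your connection (or router), not the target pages. A minimal sketch, where the probe URL is just a placeholder for any reliable host:

```python
# Sketch of a connectivity probe: distinguishes "my line is down" from
# "the target sites are down". The probe URL is only an example.
import urllib.request

def connection_is_up(probe_url="http://example.com/", timeout=10):
    """True if a known-good host answers, i.e. the connection itself works."""
    try:
        with urllib.request.urlopen(probe_url, timeout=timeout) as resp:
            return resp.status < 500
    except OSError:
        # Covers refused connections, DNS failures, and timeouts alike.
        return False
```

If this returns False at the same moments re-verification fails, the router/ISP theory below becomes much more likely.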
@Sven, I'm watching my bandwidth constantly, and I think it's not the router (it isn't a bad one) but the ISP! I pay for a 50 Mbps flat rate, but they have reduced it to practically ZERO! My bandwidth is between 0 and 9 kbps for 23.75 hours a day!
Hey, I noticed a lot of links being deleted from my projects, so I unchecked the re-verify option. Now I check links "manually", and SER is flagging links to articles/blogs that actually exist. It's hurting my tier campaigns a bit, and manual checking takes forever. :[
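If anyone else ends up re-checking links by hand, a thread pool takes most of the pain out of a long list. This sketch is my own helper, not anything built into SER, and it only tests whether each page still answers at all; it does not confirm the link is on the page (combine it with a presence check for that):

```python
# Sketch of a parallel batch checker for a long list of verified URLs.
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def page_reachable(url, timeout=15):
    """True if the page still answers with a non-error status."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except OSError:
        return False

def check_all(urls, workers=20):
    """Check many URLs in parallel; returns {url: reachable}."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(zip(urls, pool.map(page_reachable, urls)))
```

Note that some sites reject HEAD requests or block non-browser user agents, so a False here is, again, only a hint and not proof the link is gone.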
Comments
You may want to watch some of the video tutorials if you are just starting with GSA SER: https://forum.gsa-online.de/discussion/11/gsa-search-engine-ranker-video-tutorials-updated-2014-01-14#latest
I have a problem with re-verifying as well.
OK, how does SER verify? All the links are alive (I checked manually), but SER is unable to re-verify them. Why?
@Sven, how can I get SER to re-verify links correctly and automatically?
@Sven, I've found out why SER couldn't re-verify: it was simply too busy, probably with proxy testing, since that's the only task keeping SER that busy.
usually 20