Re-verifying Links & Indexing
Hi, do you guys re-verify your T1, T2 & T3? Using the GSA "right-click > modify project > verify all links" option, or exporting everything to Scrapebox for checking?

I'm trying to decide how to send the links in for indexing. I have an account with Indexification. I was using the API at first, but after trying to bump my LPM up to a decent number, I read @ron mentioning not to use the API but to do them manually. I just don't want to send non-verified links to it. To tell you the truth, it would be easier to just use the API, but I don't know how much it would slow things down.

Also, since it's tens of thousands of links, should I use only public proxies for Scrapebox? Or for that matter, the same if using the GSA remove feature instead? This tool and all its features are starting to get to me.
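Something like this is what I had in mind for the API route - a minimal sketch of pushing only *verified* URLs to an indexer over HTTP. The endpoint, parameter names, and batch size below are placeholders I made up, not Indexification's documented API, so check their docs for the real values:

```python
# Sketch: push only verified URLs to an indexing service over HTTP.
# Endpoint and parameter names are hypothetical, NOT Indexification's
# documented API -- consult their docs for the real values.
import requests

API_KEY = "your-api-key"                      # hypothetical
ENDPOINT = "https://example-indexer.com/api"  # hypothetical endpoint

def submit_verified(urls, batch_size=100):
    """Send verified URLs to the indexer in small batches."""
    for i in range(0, len(urls), batch_size):
        batch = urls[i:i + batch_size]
        resp = requests.post(ENDPOINT, data={
            "apikey": API_KEY,
            "urls": "\n".join(batch),
        }, timeout=30)
        resp.raise_for_status()

# Feed it a verified list exported from SER (one URL per line):
with open("verified_t1.txt") as f:
    submit_verified([line.strip() for line in f if line.strip()])
```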
Comments
You completely, 100%, *misunderstood* what I was saying.
I send *everything* to Lindexed with my API - you couldn't have misunderstood me any more than you have. The API is your *best* option in all cases. It doesn't slow down SER one nanosecond.
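To illustrate why it costs nothing - this is a toy sketch, not SER's actual internals: API submissions can sit on a background queue, so the posting side never waits on the indexer.

```python
# Toy illustration (not SER's real internals): API submission runs on a
# background thread, so the main posting loop never blocks on the indexer.
import queue
import threading
import time

link_queue: "queue.Queue[str]" = queue.Queue()

def indexer_worker():
    """Drain the queue and 'submit' links without blocking the poster."""
    while True:
        url = link_queue.get()
        # ... HTTP call to the indexer API would go here ...
        time.sleep(0.1)  # simulate network latency
        link_queue.task_done()

threading.Thread(target=indexer_worker, daemon=True).start()

# The 'posting' side just enqueues and moves on -- effectively zero cost:
for url in ("http://example.com/a", "http://example.com/b"):
    link_queue.put(url)

link_queue.join()  # wait for the worker to finish (demo only)
```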
What I was referring to were two things I don't do: 1) use the SER pinger, which I think is a waste of time, *especially* if you are sending your links to an API indexer; and 2) send links to GSA Indexer on autofeed, because SER + GSA Indexer running at the same time will bring everything to a crawl - that has been well documented. They are both software beasts, and you shouldn't run two beasts of any kind of software on the same PC at the same time.
For the record, I have my T1 on automatic verification daily, and I have verification disabled on all other tiers. Why? Because the T1 is your lifeline - those links directly hit your moneysite. You want that tier's verified list kept as up to date as possible.
Once a week or so, I stop SER, make all projects inactive, highlight all projects, then right-click > Show URLs > Verified > Verify. That is how you verify all links immediately.
Now why do I only have T1's set to verify daily, and have all other projects disabled for verification? Because I want SER posting, not wasting time verifying lower level links or junk tier links. A total waste of time in my opinion. Plus, I get a faster LPM. So that is what I was talking about - not the API.
Lastly, Scrapebox plays no role in the verification process. SER does a great job of verifying links, and SER gives you all sorts of ways to do it - automatically, at custom times, and manually. If you want to scrape targets in SB and use it to find targets for SER, more power to you. I have never needed to do that, and I honestly haven't had a need for SB since I bought SER a year ago. It is still a great tool, and there are all sorts of uses for it, but not for what you suggested. Only use public proxies with Scrapebox for scraping, PR checking, and index checking, and use private proxies with SB for posting.
No problem. What I feared was somebody reading what you said, and then saying that ron said to do this - when I didn't say that.
140 LPM is stellar. And in 5 hours, that's probably a world record.
Because it takes time for even a fast indexer to process a bunch of links, I always prioritize so that contextual tiers get first priority, and then junk tiers if I have the time and/or PC resources to do it.
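If you're feeding an indexer from exported lists, the idea is simply to queue contextual exports ahead of junk. A rough sketch - the filenames here are examples only, not anything SER produces by default:

```python
# Sketch: submit contextual-tier exports before junk-tier exports, so the
# indexer works on the links that matter first. Filenames are examples only.
from pathlib import Path

PRIORITY = ["t1_contextual.txt", "t2_contextual.txt", "junk_tiers.txt"]

def urls_in_priority_order(folder="exports"):
    """Yield URLs from the export files in priority order."""
    for name in PRIORITY:
        path = Path(folder) / name
        if path.exists():
            for line in path.read_text().splitlines():
                if line.strip():
                    yield line.strip()

# Feed this generator to whatever submits links to your indexer:
for url in urls_in_priority_order():
    print(url)  # replace with the actual submission call
```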
@kiosh - Yes, "Verified Links must have exact URL" should be checked. That triggers verification throughout the day. That way, your T1 is as up-to-date as possible, and then your lower tiers start building links to those new T1 links immediately - which is what you want.
And yes, I have the indexer option box checked, and then I have my indexer services set up in Main Options with their respective API keys.
With 500 threads and a list, all content sources, in just under 11 hours of runtime, using one of my machines at home with a 70Mb connection, I have achieved over 300,000 verified contextual links from one SER install @ 1,100 threads. Just another day at the office!
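For anyone doing the math on those figures:

```python
# Back-of-the-envelope check on the numbers above: 300,000 verified links
# in just under 11 hours works out to roughly 455 links per minute.
verified = 300_000
hours = 11
lpm = verified / (hours * 60)
print(f"{lpm:.0f} LPM")  # -> ~455 LPM
```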