To help make it go away, I would disable verification on all junk tier projects like T1A, T2A, etc. You can always run verification on those tiers over the weekend.
Have verification on contextual tiers like T1 and T2 set to automatic every 1440 minutes.
The other thing I would do is come up with a very large list of keywords, break it up into smaller files, and use a spin folder macro in the keyword field. I have tested having this folder on my desktop vs. Dropbox, and it is noticeably faster if the folder is in Dropbox.
Forget the global site list, and do not scrape keywords using SER. Scour the internet and find a big-ass list of the most common words/phrases/search terms, etc. I mean 100,000 for starters, but ultimately you want 1,000,000 or more. Put them in files of about 1,000 keywords each; with 1,000 files in that folder, you have 1,000,000. You need to feed the beast to keep it going fast. The more 1-word and 2-word terms you have, the more new targets you will find.
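If it helps, here is a minimal sketch of how to chop a master list into those files, assuming one keyword per line in a file called keywords.txt (the file and folder names are just placeholders):

```python
# Split one big keyword list into files of 1,000 keywords each,
# one keyword per line (the vertical .txt format SER wants).
# "keywords.txt" and "keyword_files" are placeholder names.
import os

CHUNK = 1000
os.makedirs("keyword_files", exist_ok=True)

with open("keywords.txt", encoding="utf-8") as f:
    keywords = [line.strip() for line in f if line.strip()]

for i in range(0, len(keywords), CHUNK):
    name = f"keyword_files/keywords_{i // CHUNK:04d}.txt"
    with open(name, "w", encoding="utf-8") as out:
        out.write("\n".join(keywords[i:i + CHUNK]) + "\n")
```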
Then look at your advanced tools, and see what the engine ratios are for verified vs. submitted. Kill the inefficient engines.
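If you want to track those ratios outside of SER, here is a quick sketch; it assumes you copy the per-engine numbers by hand into a CSV called engine_stats.csv with columns engine, submitted, verified (that file and its layout are my own, not something SER spits out):

```python
# Rank engines by verified-to-submitted ratio, worst first.
# "engine_stats.csv" is a hand-made file: engine,submitted,verified
import csv

rows = []
with open("engine_stats.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        submitted = int(row["submitted"])
        verified = int(row["verified"])
        ratio = verified / submitted if submitted else 0.0
        rows.append((ratio, row["engine"], submitted, verified))

# The engines at the top of this list are the ones to consider cutting.
for ratio, engine, submitted, verified in sorted(rows):
    print(f"{engine}: {verified}/{submitted} verified = {ratio:.1%}")
```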
Always set up dummy projects where you run the engines you think are bad so that you always have data on all engines. Sometimes Sven changes an engine, and suddenly it becomes a performer. Maybe because he tweaked the script. So never completely discard an engine because today it is failing you. Always compile data on the "bad" engines because some do change for the better.
Go here to understand contextual (T1,T2,T3) vs. junk (T1A,T2A,T3A): https://forum.gsa-online.de/discussion/2930/ser-tiers/p1 - Hopefully you are not mixing junk links with good contextual links.
I would put the T1 and T2 at automatic 1440 minutes. You only need to verify those once per day. Otherwise you slow down the posting.
If you stick a million keywords into SER, you will definitely slow it down. You don't want to throw extra weight on the shoulders of the program. By using a macro (look at the help button in SER), you can direct the keyword field to look at a folder stuffed full of keyword files. It makes things go much faster.
I never use public proxies for anything. Terrible for performance.
Try to make sure you don't have filters for PR or OBL, as those will slow things down terribly.
And for articles I use KM spins with the anchor text tokens embedded in the spins. I would change them every month or two, probably two.
You stick this in the keyword field:
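%spinfolder-C:\Users\Administrator\Dropbox\Keywords%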
Note: The path above is the path to where my keyword files are. Obviously, your path may not be Administrator, you may not have Dropbox and instead use a folder on your desktop, etc. In that folder, you might create 1,000 different files (in .txt vertical format, meaning no commas or pipes, just a vertical list) with 1,000 different keywords in each file. SER will randomly select different files in that folder, and you will never run out of targets.
@sven just mentioned in another thread that a good way to get SER to not overshoot your daily linkbuilding target is to have more frequent verification. So I won't tell you not to follow Sven's advice on that. My opinion is to only do that on the T1 project. Who cares if you overshoot on a T2 or T3, and again, who cares if you overshoot on the junk tiers T1A, T2A, T3A. It just doesn't matter because NONE of those link directly to your moneysite. I have verification disabled on junk tiers, and set to automatic every 1440 minutes on the contextual tiers T1, T2, T3 - but that's what works for me.
Scrapebox gives you too many long-tail search terms. Start searching the internet for one-word and two-word lists. Hell, if you think about it, a dictionary is one big-ass one-word list. That should give you some ideas.
I can't answer the mechanics of how it grabs the keywords within the file. I just know that it grabs a random file from the folder each time.
My assumption (if you are using scheduler) is that it will grab a different file each time the project is turned on by the scheduler. But once it grabs the file, I would think it would start from the top - but I could be wrong on that.
I never do that. I think it sounds great in theory, and I have tried it, but I have never witnessed higher speed after I did it. I think some people did that to decrease RAM usage if they were having freezes, but not to gain speed.
Speed doesn't come all at once. It comes in steps, like a staircase. You experiment, you figure something out, it goes up. Then you stay there until you experiment with other stuff.
Things that help are removing inefficient footprints from engine files and constantly assessing verified-to-submitted ratios on engines, then making cuts or additions based on performance.
Then you also test other stuff like your settings: which boxes you check, which ones you don't, and so on. You literally need to think about what SER is doing with each little box you check - i.e., does checking that little box mean that SER will have to work harder to find links?
When you get to the point that you have tested everything, then I guarantee that you will have terrific speed.