If you delete that, GSA will try to create new accounts, but the problem is it will use the same email, so most of them will just fail because the email is already registered.
Only use that option if you are creating a new project that you duplicated based on an old one, not for these maintenance routines.
I never do that, except when I duplicate projects. I sometimes log in to profiles and change some weird anchor text, right - there should be a security "do you really want to" question asked.
Most software has too many "do you really want to..." questions, but here I miss it.
10 private proxies and 230 threads - that sounds unbalanced to me, but as long as it's working, why not. Maybe monitor your log for a while to see if that's the case.
You could also try not using any proxies for verification.
I have 100k keywords loaded into 35 Projects all T3 deep. Could that be the problem?
I'll buy more proxies if that would speed things up, but I can't see that jumping me to 5x the speed. I don't know how these guys are pulling in 200+. I did everything they've written.
Do you use those 100k keywords on each project for the same engines you target? That doesn't make sense to me, but I fear many people do that. Splitting that keyword list into different parts, plus adding some self-created lists to other projects, makes more sense to me.
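A minimal Python sketch of that kind of split - the file names and project count are placeholders, and shuffling is just one way to avoid every project getting the same slice:

```python
# Split one master keyword list into per-project chunks so each project
# searches with different keywords instead of the same 100k everywhere.
import random

def split_keywords(master_file, num_projects, out_prefix="keywords_project"):
    with open(master_file, encoding="utf-8") as f:
        keywords = [line.strip() for line in f if line.strip()]

    random.shuffle(keywords)  # so no project gets the same alphabetical block

    chunk_size = len(keywords) // num_projects + 1
    for i in range(num_projects):
        chunk = keywords[i * chunk_size:(i + 1) * chunk_size]
        with open(f"{out_prefix}_{i + 1}.txt", "w", encoding="utf-8") as out:
            out.write("\n".join(chunk))

if __name__ == "__main__":
    split_keywords("keywords_100k.txt", num_projects=35)  # assumed file name
```

Each output file can then go into a different project, ideally mixed with some self-created keywords as suggested above.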
@LeeG there are significantly more locations, and the search results are measurably higher, with blog comments than with Elgg installs, for example. I know you know this and you're being facetious, but someone will come along later and be confused.
@LeeG keep your insults to yourself, or post some LpM screenshots - that's more helpful.
Quote Ozz" posting a comment to a blog is for sure faster than registering to a social network and post after account got verified. the less steps to take, the faster the submission. but i don't know how much faster it is overall. "
thats what I mean. And I dont insult people, I help them and they help me back.
@LeeG - "A good tool for checking the results a footprint can return is gscraper pro" - just want to say a big thanks. This was on my todo tomorrow and was going to see if I could get a coder to automate something like this...now there is no need. Cheers! :-)
Can someone help a noob find this "tool by sant0s", so I can extract footprints, then load them into Gscraper Pro and keep only the high-yield ones?
Posting a comment to a blog is for sure faster than registering to a social network and posting after the account gets verified. The fewer steps to take, the faster the submission. But I don't know how much faster it is overall.
1) Click the search button and load the directory with the .ini files (usually located at C:\Program Files (x86)\GSA Search Engine Ranker\Engines).
2) Check which engines/.ini's you want to extract footprints from.
3) Click "Extract Search Terms".
4) Click "Save" and choose where to save the .txt.
*extra option* Check the "Append %KW%" box if you want %KW% added at the end of every footprint. That makes merging keywords with Scrapebox easier.
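For anyone who would rather script the extraction, here is a rough Python sketch of what such a tool might do. It assumes the footprints sit on lines whose key begins with "search term" inside the engine .ini files - check your own engine files, as the exact key names may differ - and the %KW% append mimics the option above:

```python
# Pull footprints out of the SER engine definition files and dump them to a
# text file, optionally appending %KW% for keyword merging in Scrapebox.
import glob
import os

ENGINE_DIR = r"C:\Program Files (x86)\GSA Search Engine Ranker\Engines"
APPEND_KW = True

footprints = set()
for path in glob.glob(os.path.join(ENGINE_DIR, "*.ini")):
    with open(path, encoding="utf-8", errors="ignore") as f:
        for line in f:
            key, _, value = line.partition("=")
            # assumption: footprint lines look like "search term=..."
            if key.strip().lower().startswith("search term") and value.strip():
                footprints.add(value.strip())

with open("footprints.txt", "w", encoding="utf-8") as out:
    for fp in sorted(footprints):
        out.write(fp + (" %KW%" if APPEND_KW else "") + "\n")

print(f"extracted {len(footprints)} footprints")
```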
I have a backup of my edited files, then just paste them back after each SER update.
A better method would be to simply rename the edited files. That way they are not overwritten on each update, and if any engines are updated, you can easily spot the new ones by date.
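A minimal sketch of that rename approach, assuming the default install path and made-up engine file names - adjust both to whatever you actually edited:

```python
# Rename customised engine files so the next SER update cannot overwrite them;
# freshly updated stock engines then stand out by their file date.
from pathlib import Path

ENGINE_DIR = Path(r"C:\Program Files (x86)\GSA Search Engine Ranker\Engines")
EDITED = ["Some Engine.ini", "Another Engine.ini"]  # placeholder names

for name in EDITED:
    src = ENGINE_DIR / name
    dst = ENGINE_DIR / f"Custom_{name}"
    if src.exists() and not dst.exists():
        src.rename(dst)
        print(f"renamed {src.name} -> {dst.name}")
```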
Some engines I'm dropping down to one single footprint - BlogSpot and BlogSpot.es, for example. No point going after footprints that only return thousands of results when you can be pulling a lot more results and cutting down the "already parsed" messages.
And AlexR, I know one of the admins over there; I already told him about you, mate. He asked if I knew a good online retailer of valium (joke).
That's where I got most of my extra footprints from
And a few that have been shared on here
Gscraper Pro has a built-in feature that shows you the number of results a footprint returns.
That's when Gscraper is working and not under a DDoS attack like today.
So you can see which footprints are better to use - the ones that return a lot of results - which in turn reduces the number of calls to the search engines, and so reduces slaps to your proxies for too many searches too quickly.
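If you export the footprint/result-count pairs from Gscraper Pro into a simple tab-separated file (that file format is an assumption for this example), a small filter like this keeps the high-yield footprints and bins the rest:

```python
# Keep footprints that return millions of results, bin the ones that only
# return thousands. The threshold is an arbitrary example value.
MIN_RESULTS = 1_000_000

kept, binned = [], []
with open("footprint_counts.txt", encoding="utf-8") as f:
    for line in f:
        try:
            footprint, count = line.rsplit("\t", 1)
            count = int(count.replace(",", "").strip())
        except ValueError:
            continue  # skip malformed lines
        (kept if count >= MIN_RESULTS else binned).append(footprint.strip())

with open("footprints_keep.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(kept))

print(f"kept {len(kept)} footprints, binned {len(binned)}")
```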
Just watch the "download failed" messages.
But he has a high timeout.
... with low re-verified.
If you do profile-based links like social bookmarks, forums, etc., it's much slower.
Give yourself some time.
I also use edited engines to bring in a lot more results, which most people never take the time to do
Kill off low yield footprints, add high ones
Try and keep footprints that will return millions of results, rather than thousands
A good tool for checking the results a footprint can return is gscraper pro
Extract them with the tool by Santos, then import them into Gscraper and you can easily see what to keep and what to bin
So Alex, I'm holding back the sniggers and belly laughs on your above post
Please answer me this - I'm trying not to laugh at the stupidity of your comment
How much longer does it take to post to an article site or wiki, than a blog comment or forum profile?
Enough to slow down 200 LpM to 2 LpM?
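For a rough feel of the numbers, a back-of-the-envelope sketch - every figure in it is an illustrative assumption, not a measurement:

```python
# Crude model: LpM ~ threads * 60 / average seconds per submission.
THREADS = 20
SECONDS_PER_BLOG_COMMENT = 6    # assumed: fetch page, post comment
SECONDS_PER_ARTICLE = 60        # assumed: register, verify email, post article

def lpm(seconds_per_link, threads=THREADS):
    return threads * 60 / seconds_per_link

print(f"all blog comments:  ~{lpm(SECONDS_PER_BLOG_COMMENT):.0f} LpM")
print(f"all articles/wikis: ~{lpm(SECONDS_PER_ARTICLE):.0f} LpM")
# Even a 10x slower platform costs roughly 10x LpM, nowhere near 200 -> 2.
```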
Great point @LeeG on editing the engines. +1
Yeah, I want to see the stats from @thisisalex backing up how those platforms slow down SER.
So is it the racecar or the driver?
Quote Ozz"
posting a comment to a blog is for sure faster than registering to a
social network and post after account got verified. the less steps to
take, the faster the submission. but i don't know how much faster it is
overall. "
thats what I mean. And I dont insult people, I help them and they help me back.
http://rapidupload.net/f/5136
thisisalex, I'm not insulting you, I just want to know how you have come to the conclusions about the drop in LpM etc. that you claim to know about
You're constantly giving advice at present that's always way off the mark
The footprint extractor - I can't find the link to the thread on BHW, but this is the link to the download for it
In the thread by Santos on BHW, he gives precise info on how to use the tool in conjunction with Scrapebox
http://www.mediafire.com/?0ca7er18azzlw67