Small GSA SER research (global site list vs. normal active) — old Scrapebox vs. new Scrapebox
Doon
Netherlands
Guys, I want to give a little insight into some research I've done on GSA SER.
Basically, I'm running into some issues, and I hope someone can clarify a few things I'm currently experiencing.
Before I start with everything: I previously used a nulled Scrapebox version for my links (yes, I know, I'm evil, blah blah).
When I scraped with it, out of a couple million links (1-2 million), only around 100-300k were duplicates.
The second thing I want to add: when I ran GSA PI on the list, almost 80-90% of the platforms were detected.
Meaning I had a good list to use in GSA SER for building backlinks.
Another thing is the LpM/VpM... it was just insane: between 800 and even 1,200 LpM (500 threads gave me 800, and when I used 800+ threads it went up to 1,200 LpM).
so here's what happened after that.
I bought the original Scrapebox because a friend told me he scraped 100 times faster than me. Seriously, the nulled version scraped 200 links per second max, while he said he was scraping 6k links per second.
Ever since I bought Scrapebox, I get literally garbage lists.
I scraped 78 million links in one night. GSA PI's duplicate removal removed 60+ million, and I was left with 11 million links.
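For anyone who wants to sanity-check duplicate counts like these outside of GSA PI, here's a rough sketch of deduplicating a scraped URL list. This is plain illustrative Python, not GSA PI's actual logic; it shows both exact-URL dedup and the harsher domain-level dedup, which give very different numbers on the same list:

```python
from urllib.parse import urlparse

def dedup(urls):
    """Remove exact duplicate URLs, keeping the first occurrence."""
    seen = set()
    out = []
    for u in urls:
        if u not in seen:
            seen.add(u)
            out.append(u)
    return out

def dedup_domains(urls):
    """Keep only one URL per domain (like a 'remove duplicate domains' pass)."""
    seen = set()
    out = []
    for u in urls:
        dom = urlparse(u).netloc.lower()
        if dom not in seen:
            seen.add(dom)
            out.append(u)
    return out

scraped = [
    "http://example.com/post?id=1",
    "http://example.com/post?id=1",   # exact duplicate
    "http://example.com/post?id=2",   # same domain, different URL
    "http://other.org/forum",
]
print(len(dedup(scraped)))          # 3
print(len(dedup_domains(scraped)))  # 2
```

Whether a tool reports "duplicates" per URL or per domain makes a big difference when comparing scrape runs.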
that's not the only issue..
When I ran GSA PI on those 11 million links, over 9.8 million were undetected platforms.
So I was left with 2-3 million links for GSA.
Hold on, Hold on.. that's not all!
I created 10 projects (I always do that: duplicate my projects with fresh emails).
I grouped those 10 projects as "search active".
I made another 10 projects and grouped them as "scrapebox list".
So basically I've got 20 projects, working on the exact same URL list, exact same keywords, everything exactly the same (except fresh emails/proxies).
The only difference is that I set one group to "active" and the other group to "global site list" with my 2-3 million links from Scrapebox.
So here are the results after 2 days :
So the image explains everything.
GSA searching for links gets better, faster, and more results than my list.
Which, before, with my own list, was the exact opposite: in the same number of days (2-3) I had built 2.3 million links to my site.
Can anyone clarify what could be the issue of this?
Is GSA SER stuck on something?
Is the original new version of Scrapebox a piece of garbage compared to the old nulled one?
What am I doing wrong here?
Hope someone can clarify...
Who am I?
Sir Doon, at your service, bringing you personal tests, stats and results for the lazy fucks who don't want to do them themselves.
Comments
Today my test results:
[image: Nulled Scrapebox]
[image: Original Scrapebox]
What the hell is going on here...
Cause that's exactly what those original vs. nulled Scrapebox numbers look like. If the nulled version was pulling only 100 results per keyword, you would get far fewer duplicates, but also fewer links at the end.
That would also explain the GSA PI part: when you're pulling 1,000 results per keyword, a much smaller percentage of links gets identified (but more identified links in total).
Now, it certainly wouldn't explain the SER part, because that one just doesn't make sense.
2.3 mil vs. 4k?
There's no way in hell that scraping with different versions of SB would have such a drastic impact on the number of verified links. Hell, even if you did two scrapes with completely different keywords and completely different footprints, you still wouldn't get anywhere close to that much difference in verified links at the end.
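The results-per-keyword point above can be sketched with toy numbers. These rates are purely illustrative assumptions, not measurements: search results overlap more the deeper you scrape, so pulling more results per keyword raises the duplicate rate, yet can still leave you with more unique links overall:

```python
# Toy model with ASSUMED numbers, just to illustrate the trade-off.
keywords = 100_000

# "Nulled SB" capped at ~100 results/keyword: low overlap across keywords.
shallow_raw = keywords * 100          # 10M raw links
shallow_dupe_rate = 0.15              # assumed ~15% duplicates
shallow_unique = shallow_raw * (1 - shallow_dupe_rate)

# "Original SB" pulling ~1,000 results/keyword: heavy overlap across keywords.
deep_raw = keywords * 1000            # 100M raw links
deep_dupe_rate = 0.85                 # assumed ~85% duplicates
deep_unique = deep_raw * (1 - deep_dupe_rate)

print(f"shallow: {shallow_raw:,} raw -> {shallow_unique:,.0f} unique")
print(f"deep:    {deep_raw:,} raw -> {deep_unique:,.0f} unique")
# The deep scrape discards a far bigger SHARE as duplicates,
# but still ends up with MORE unique links in total.
```

That matches the observation: the 78M scrape looked "garbage" by percentage, while potentially still yielding more usable links than the smaller scrape.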
As far as your first post/test goes, try directly importing the list into all your projects rather than using the global site list. I found I got much better performance that way in the past.
Also make sure you use a different site to build links to in each project (my suggestion is a made-up URL).
I know there are two ways of importing a list, and I don't know exactly which one you mean by that comment.
@hinkys
but when it opened, I saw 3.8 million links... So could it be that GSA SER just isn't doing what it's supposed to do? Because to me that looks like the entire list.
I meant importing them by selecting all projects and then Right Click -> Import Target URLs (rather than using the "Use URLs from global site list" option)
Well, as far as I know, on Active it's going to go through the remaining target URLs first and, after that, try to get/find more target URLs if you have any options for that enabled (the "How to get target URLs" section under the project's options).
When set to Global site list, it's going to ignore the current remaining target URLs and just use... well... the global site list. It's the same thing (or at least that's how I understand it) as setting the project to "Active" but only enabling the "Use URLs from global site list if enabled" option (and disabling everything else under the "How to get target URLs" section) as the way to get target links.
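That priority can be sketched roughly like this. This is my own illustrative Python, not GSA SER's actual internals; all class, function, and option names here are made up:

```python
class Project:
    """Toy stand-in for a SER project (hypothetical, for illustration only)."""
    def __init__(self, status, remaining_targets, global_list, options=None):
        self.status = status                          # "active" or "global site list"
        self.remaining_targets = list(remaining_targets)
        self.global_list = list(global_list)
        self.options = options or {}

def next_target_url(p):
    """Assumed target-selection order, as described above."""
    if p.status == "active":
        # 1. Drain already-queued/imported target URLs first.
        if p.remaining_targets:
            return p.remaining_targets.pop(0)
        # 2. Then fall back to whichever "How to get target URLs" options are on.
        if p.options.get("use_global_site_list") and p.global_list:
            return p.global_list.pop(0)
        return None
    if p.status == "global site list":
        # Skips the remaining/imported targets entirely.
        if p.global_list:
            return p.global_list.pop(0)
        return None
    return None

active = Project("active", ["http://imported.example"], ["http://global.example"],
                 {"use_global_site_list": True})
gsl = Project("global site list", ["http://imported.example"], ["http://global.example"])
print(next_target_url(active))  # the imported URL comes first
print(next_target_url(gsl))     # the global-list URL; imported targets are ignored
```

If this reading is right, it would explain why a directly imported list and a global site list behave differently even when they contain the same URLs.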