GSA SEO Forum
the_other_dude
About
User Name: the_other_dude
Joined: July 2014
Visits: 331
Last Active: September 2023
Roles: Member, VIPs
Location: United States
Thanked: 53
Discussions: 38
Comments: 357
s4nt0s said: …
Thanked by 1: s4nt0s
sickseo said: "What's 75 LMP? Mind sharing what new tools you got? I'm still makin…"
Thanked by 1: sickseo
sickseo said: "Yup, I'm still around!"
Hey!! Not…
Thanked by 1: AliTab
Amazing contributions you have made to this community. Thank you for being so generous!
Thanked by 1: rastarr
solidseovps said: …
Thanked by 1: solidseovps
hardcorenuker said (quoting an earlier post): …
Thanked by 2: hardcorenuker, draculax
You know what - the math I shared is incorrect for this application. I apologize. That math is for finding the number of connections to make simultaneously for a faster style of scraping. …
Thanked by 1: jonny23
I don't like to do that because it will likely start getting other proxies banned. That interferes with the math. If you need to do proxy retry then you need to adjust connections, timeout, or both.
Thanked by 1: jonny23
If you have 20 proxies you need to reduce connections to 1. Set the delay to 10 seconds. This should keep you from getting banned.
I came up with this number because…
Thanked by 1: jonny23
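The pacing numbers above work out like this. A rough model in Python, assuming queries rotate evenly across the proxy pool; this is an illustration of the arithmetic, not SER's actual scheduler:

```python
# Rough model of the advice above: 1 connection, 10 s delay, 20 proxies.
# Illustrative only; SER's internal scheduling may differ.

def seconds_between_hits(proxies: int, connections: int, delay_s: float) -> float:
    """Average time between queries through any single proxy,
    assuming queries rotate evenly across the pool."""
    queries_per_second = connections / delay_s
    return proxies / queries_per_second

print(seconds_between_hits(20, 1, 10.0))  # 200.0 -> each proxy hit every 200 s
```

With those settings each individual proxy only sees a search every ~3 minutes, which is why it stays under the ban threshold.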
Dedicated proxies are fine for scraping. Post these things:
- How many proxies you have
- Your settings, such as connections numbe…
Thanked by 1: jonny23
Thanks for the share. I totally agree with you on all fronts, esp about the (laughable) gurus on BHW. Every single person that says SER dOeSnT wOrK has a pbn backlink service in their signature, OR i…
Thanked by 3: ruffy, Deeeeeeee, hardcorenuker
I don’t notice a difference in website speed when using any browser anymore. I recall Internet Explorer being slow, but that had to have been 15 years ago lol. I use Firefox and Safari every day.
Thanked by 1: hardcorenuker
verydima said: …
Thanked by 1: seodamage
Only way you’ll really know is to try it. My opinion is that you should focus on sending traffic to the website for a couple of weeks at a minimum before you start building links, otherwise those lin…
Thanked by 1: hardcorenuker
You need to decrease scraping threads and increase wait time between searches until your proxies stop getting banned. You are searching too frequently.
Thanked by 1: huggies12
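The advice above (fewer threads, longer waits between searches) is plain rate limiting. A minimal sketch of such a throttle in Python, not SER's implementation:

```python
import time

# Minimal search throttle: enforce a minimum interval between queries
# instead of hammering engines until proxies get banned. Illustrative only.

class SearchThrottle:
    def __init__(self, min_interval_s: float):
        self.min_interval_s = min_interval_s
        self._last = None  # monotonic timestamp of the previous search

    def wait(self) -> float:
        """Sleep just long enough to honor the interval; return the pause taken."""
        pause = 0.0
        if self._last is not None:
            elapsed = time.monotonic() - self._last
            pause = max(0.0, self.min_interval_s - elapsed)
            if pause:
                time.sleep(pause)
        self._last = time.monotonic()
        return pause

throttle = SearchThrottle(min_interval_s=0.05)
first = throttle.wait()   # no previous search, so no pause
second = throttle.wait()  # sleeps roughly 0.05 s
```

Raising `min_interval_s` is the "increase wait time" knob; running fewer throttles in parallel is the "decrease threads" knob.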
Have a great time, Sven! Hope you come back refreshed and happy!
Thanked by 1: Sven
Yes, the first option sends the identified URLs to identified engine lists. It would be the same if you were using GSA PI and told it to add identified URLs to the identified sitelist folder, per engine. …
Thanked by 1: googlealchemist
SvetoslavStoilov said: …
Thanked by 1: SvetoslavStoilov
I am beginning to learn scripting engines in SER. Right now I am learning with the docuwiki engine.
In the engine script under [login_st…
Thanked by 1: cherub
Public proxies are the best and most affordable. If you can find a provider of port scanned public proxies with two lists daily you can scrape at max threads all day and night and not have to worry a…
Thanked by 1: hardcorenuker
That’s exactly what I was talking about, yes.
Thanked by 1: jonny23
Shortened links used to work great for indexing spam (2014-2016). What I used to do was build my tiers, then create shortened links from the lowest tier's verified links. Import the shortened URLs into …
Thanked by 1: jonny23
Yes. It’s the same thing as buying an expired domain and redirecting it to your domain. Everything passes through the 301 to the new destination. Links, anchor text, and even DA.
Thanked by 1: jonny23
Yes.
Thanked by 1: jonny23
Add them in the url box where you put your domain url.
Thanked by 1: jonny23
That’s not really how 301 redirects or shortened links work. They don’t typically get indexed in search like a web page gets indexed, because it is a status code that tells googlebot (and other bots)…
Thanked by 1: jonny23
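The point above can be demonstrated with a self-contained local server: a 301 is just a status code plus a Location header, with no page body for a crawler to index. The URL and handler below are made up for illustration:

```python
import http.server
import http.client
import threading

# A 301 response carries only a status line and a Location header pointing
# elsewhere; the bot is told where to go, there is nothing here to index.

class Redirect(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", "https://example.com/new-home")
        self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

# Bind an ephemeral local port and serve in the background.
server = http.server.HTTPServer(("127.0.0.1", 0), Redirect)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Fetch without following redirects, as a crawler inspecting the hop would.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/old-page")
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))  # 301 https://example.com/new-home
server.shutdown()
```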
You can use URL RD Pro to automatically make shortened links. You can then use those shortened links in SER alongside or in place of your domain URLs.
Thanked by 1: jonny23
Web 2.0 engines in SER are hardcoded for specific websites. Article engines are not. That's why you need a list for article engines etc.
Unless you are asking f…
Thanked by 1: draculax
You do not need more than 4 GB of RAM if all you are running is SER, as it is a 32-bit application. If you will be running more than SER on the VPS, you need to take into account the amount of RAM you'…
Thanked by 1: shizzledizzleeee
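The 4 GB figure above follows directly from 32-bit addressing, sketched here for the arithmetic (in practice Windows caps a 32-bit process lower still, around 2-3 GB depending on build flags):

```python
# A 32-bit process uses 32-bit pointers, so its address space tops out
# at 2**32 bytes; extra physical RAM beyond that is unreachable by it.

ADDRESS_BITS = 32
max_addressable_bytes = 2 ** ADDRESS_BITS
print(max_addressable_bytes // 2 ** 30)  # 4 (GiB)
```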
Gscraper does not work.
Scrapebox is the choice to make unless you want to drop a bunch of cash on A-Parser. A-Parser is the best scraper, but it is expensiv…
Thanked by 1: coral99
Send me the discount for 3-month access, please.
Thanked by 1: AliTab
I agree with 4 GB. 4 GB is good for new users. Very comfortable.
If you're going to be scraping with Scrapebox (or something else) and running GSA PI to sort your raw scrapes, you might want some…
Thanked by 1: PLM
I use Hetzner or OVH.
Thanked by 1: PLM
https://reproxy.network/
This is what I am using with SER and ZennoPoster simultaneously to solve reCAPTCHA affordably.
Thanked by 1: wts3849
I scrape my own lists. So I can’t comment from experience on a good list provider. But I saw another member mention this website yesterday. Probably worth a shot.
https://serverifiedlists.com
Thanked by 1: jaboli
You will not get around the need to solve captchas. You must have captcha-solving software and/or a service.
You should buy a verified link list. Then turn off …
Thanked by 1: jaboli
Yes. You can also use GSA URL redirect pro. I have it. Very good software for creating redirects on high DA websites automatically.
Thanked by 1: SahilKumar
Great suggestions. Thanks for supporting this product (and all GSA products) so well, @Sven!
Thanked by 1: wts3849
Article Forge and WordAi are the easiest that I know of. I am using both of those currently.
Thanked by 1: hardcorenuker
I believe you do this by right clicking the project > modify project > export > account data
Thanked by 1: googlealchemist
googlealchemist said (quoting an earlier post): …
Thanked by 1: googlealchemist
I use ZennoPoster for Web 2.0 registration and automation. It's not a plug-and-play solution. You must use ZennoPoster to create the scripts. Because it is under your control, you will always have good…
Thanked by 1: PLM
It makes sense to have protected words per project to me. Every project is different for those of us that work with several different niches and thousands of topics/keywords at a time. The words I ne…
Thanked by 1: wts3849
We all need to buy @sickseo a beer or a coffee.
Thanked by 1: sickseo
Agree this would be very useful.
Thanked by 1: AliTab
Skin in 2018 is so thin. Pathetic.
I'm sure we are all doomed now.
Here's his engine selection for anyone else that was interested, since OP wants to take his ball and go home for peo…
Thanked by 2: 2Take2, JudderMan