
White hat settings

Updated today and got this

Please note that it is your responsibility what sites you submit to
and what data you enter for your project.
GSA cannot be held responsible for any damage that might happen
when using this software.
This software can be used for good (white hat SEO) or bad (black
hat SEO), and GSA is not the one to judge
what you do with this software. You, the customer, are responsible
for anything you do with this software.

 

What are the best settings for white hat SEO vs black hat SEO? We want Penguin- and Panda-safe links.

What are the best settings for this?

Thanks in advance for your time and expertise.

Comments

  • @kccan don't take this the wrong way, but if you are asking questions such as this, you probably shouldn't be using SER.

    On a serious note, @Sven, how come the disclaimer?
  • @davbel, I am new to SER and am learning something new every day, but sometimes the best way to learn is to ask direct questions. It's simple for you to make a statement; why not say, hey, it's over here in this forum or over here on the web? This forum is to help other people, not to bash someone for asking a question. I'm quite sure you had a bit of a learning curve before you became a SER expert.

  • LeeG Eating your first bourne

    Any building of links to manipulate your search rankings is blackhat

     

    The best way to learn something is to experiment.

    If you don't make mistakes, you can't learn from them

  • davbel UK
    edited July 2013
    OK, my bad

    I understand why you might have asked after reading the disclaimer, but you need to understand this:

    "White Hat" SEO = links that others build to your site *naturally* without any intervention from you
    "Black Hat" SEO = everything else

    Owning something like SER gives you the potential to screw things up massively if you don't know what you're doing, so you need to spend some time understanding SEO, SER and what SER can or can't do - There's no such thing as a White Hat / Black Hat switch :D

    Both @ozz and @ron, as well as others, have created some fantastic posts explaining how to get the most out of SER, but I think you need to spend some time getting a more rounded understanding of how SEO works and how that translates to SER

    And on that point, as I've been doing this for far too long, I can't give you a recommendation, but I'm sure someone else on the forums will be able to help.
  • Yeah, SER definitely has a learning curve. Not nearly as big as Xrumer's, but it still has one; I know the first 3-4 sites I put into GSA got penalized hard. It's all about trying and seeing what works. But literally it's all here on the forums; just read the posts from Ron and Ozz and you'll get more than enough info on what you need to start a successful campaign.
  • @davbel thank you kindly, it's all a bit overwhelming at times, and seeing that disclaimer as a newbie freaked me out.

    Google hasn't been the kindest to my ecommerce sites lately for major keywords. It appears the people/companies I hired to do SEO were doing black hat work, and our website has paid the price dearly.

    So I figured I would work 2 hours per day and have been reading posts in this forum, sometimes easy to find and other times frustrating.

    Thanks to @Hunar, Borderlands is a great game: cool graphics and a great story...

    and thanks to @LeeG, all input welcome and appreciated.

     

  • LeeG Eating your first bourne

    When I taught people to push SER to 200 LpM, I was testing seven days a week, 12 to 14 hours a day, for three months

     

    Those that get results with SER put in a lot of work and a lot of hours of testing to get things right

    I do daily submissions of a quarter million links and average over 50k verified daily

    You need to spend a lot of time testing and tweaking to hit those kinds of submissions

    And those that have spent the time won't hand you how to do it on a plate
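For context, LpM (links per minute) is just submissions divided by minutes elapsed. A quick back-of-the-envelope check of the daily figures above, as a sketch assuming SER runs the full 24 hours of the day:

```python
# Back-of-the-envelope check of the daily figures above, assuming
# SER runs the full 24 hours (1440 minutes) of the day.
daily_submissions = 250_000   # "a quarter million links" submitted per day
daily_verified = 50_000       # "over 50k verified daily"
minutes_per_day = 24 * 60     # 1440

submitted_lpm = daily_submissions / minutes_per_day
verified_share = daily_verified / daily_submissions

print(round(submitted_lpm, 1))   # 173.6 submissions per minute
print(f"{verified_share:.0%}")   # 20% of submissions end up verified
```

So a quarter million daily submissions works out to roughly 174 LpM sustained, in the same range as the 200 LpM figure quoted earlier.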

  • No worries, I don't want to be spoon-fed. A few tips here and there would help, and I research the heck out of it. We all learn by researching, and the only dumb question is the one not asked.

    I run SER about 12 hours a day at most and gain about 300-600 verified links for our website. I run different campaigns, keeping track of what seems to be working and what doesn't. I do not want to add thousands of links overnight, as I believe Google would frown on this.

    :(

    Thanks again

  • @LeeG do you use URL shorteners to reach high LpM?
  • Do you mean like bitly and the Google URL shortener? If yes, then no, I don't.

    And what is a high LpM?

    Thank you again.

  • ron SERLists.com

    I'm trying not to laugh too hard, because the white hat settings on SER are roughly in the same spot where you find them on Xrumer =))

    All joking aside...

    If you really want to go white hat, which in a perverse way is part of the method to make this all work better, the best approach is to only build great, high quality links (probably manually or highly selectively) direct to your money site, and then build GSA SER links to those high quality links. That's probably the best way to stay out of trouble.

  • @davbel "how come the disclaimer?"

    Maybe people shot down their own pages and complained to Sven. lol.

    If you play with fire, you'd better be prepared to get burned (sooner or later) ;)
  • Sven www.GSA-Online.de

    @davbel just to get people's attention that this is, after all, a tool that people use and that we cannot be held responsible for.

    I see it like a compiler. You can code a word processing application for the office, or a trojan/virus. It's you who controls it, and you can't go to the compiler producer and blame him because someone made a trojan/virus with it.

    And yes, it's the same with our GSA SER program. You can script things and control almost anything with the script language and settings. It's the customer who must understand this.

  • LeeG Eating your first bourne
    edited July 2013

    @phiziks The URL shorteners have only recently been added

    The method I use to determine which engines to use is simple.

    Choose a bunch of engines, e.g. articles.

    Run them for a week. Check the submissions and verified.

    Disable any that don't give any results.

    Move on to the next batch, e.g. web 2.0s.

    Do the same.

    Move on to the next, e.g. social networks.

    Do this for all engine types you want to target.

    Eventually, you end up with a good set of working engines that give good results.

    It all takes time to do, but the results are worth it
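The weekly culling pass above can be sketched in a few lines. This is a hypothetical illustration, not SER itself; the engine names and counts are made up:

```python
# Hypothetical sketch of the weekly engine-culling method described above:
# run a batch of engines for a week, then disable any that produced no
# verified links. Engine names and counts here are made up for illustration.

def cull_engines(stats):
    """Split engines into keep/disable based on verified-link counts.

    stats maps engine name -> (submitted, verified) totals collected
    over the one-week test run.
    """
    keep, disable = [], []
    for engine, (submitted, verified) in stats.items():
        if verified > 0:
            keep.append(engine)
        else:
            disable.append(engine)
    return keep, disable

# One week of (invented) article-engine results:
week_stats = {
    "Article Engine A": (1200, 85),
    "Article Engine B": (900, 0),    # no verified links -> disable
    "Article Engine C": (400, 12),
}
keep, disable = cull_engines(week_stats)
print(keep)     # ['Article Engine A', 'Article Engine C']
print(disable)  # ['Article Engine B']
```

Repeat the same pass for each engine type (web 2.0s, social networks, ...) and the surviving set is your working engine list.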

     

    Add to that being selective with the search engines.

    I also run a small selection of random Googles so I'm not pulling the same search results time and time again, which slows pulling results and submissions.

     

    All the above has been shared on here time and time again

  • Sorry for the off-topic...

    @LeeG ... I read your first post about random Google search engines... you must use some really good proxies to choose random Googles per project, don't you?... For me it wasn't a good solution, because my proxies got banned in Google around the world regardless of the language...
    Maybe random Bing or Ask works better...
    How does it look with your settings?
  • LeeG Eating your first bourne

    I use 40 proxies from proxy hub

    Support is crap there, but the proxies work well and don't need a dedicated support team just to swap them out 24/7, like another proxy supplier does

    I use the same settings on search engines that I have told and recommended people to use for a long time

    Use global site lists (submitted and verified), so if you're on a slap, you still have places to submit to

     

    What a lot of people overlook is simple common sense

    They blame the search engine settings for Google blocking their proxies

    They use shared proxies

    How do you know what the other people sharing those proxies are doing when they run them?

    They could be running the same software as you and have poor settings on SER

    You run around in circles altering your settings, blaming SER and the dog for poor results, when it's nothing to do with your own settings, but with the others sharing those proxies

    Fewer engines, fewer repeat search results

    Most of the engines are google clones

    More places found to post links

  • edited July 2013
    I actually use 20 private proxies from buyproxies, and even if they test successfully, most of them stay blocked (in Google).
    Now I am trying to use one language per search engine per project from these:
    1. aol
    2. ask
    3. duckduckgo
    4. euroseek
    5. excite
    6. google
    7. lycos
    8. msn=bing
    9. startpage
    10. yahoo
    ...and I will see how it works for me. BTW, I really have to go through that long topic about increasing LpM. I am satisfied with my current (but low) LpM, and I want more hehe ;-)
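The "one search engine per project" rotation described above is just a round-robin assignment over the ten engines listed. A small sketch, with project names invented for illustration:

```python
# Hypothetical sketch of "one search engine per project": round-robin
# through the engines listed above so each project scrapes a different
# engine instead of every project hammering Google. Project names are
# made up for illustration.
from itertools import cycle

ENGINES = ["AOL", "Ask", "DuckDuckGo", "Euroseek", "Excite",
           "Google", "Lycos", "Bing", "Startpage", "Yahoo"]

def assign_engines(projects):
    """Assign one engine per project, wrapping around if needed."""
    rotation = cycle(ENGINES)
    return {project: next(rotation) for project in projects}

assignments = assign_engines(["project-1", "project-2", "project-3"])
print(assignments)
# {'project-1': 'AOL', 'project-2': 'Ask', 'project-3': 'DuckDuckGo'}
```

With more than ten projects the rotation simply wraps back to the first engine, so the query load stays spread evenly.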
  • @LeeG Thanks for the comment. I read the "how to get better LpM" discussion and your comments helped out a lot. My LpM hovers around 70-80 right now.

    My question was mostly directed at the URL Shortener engines: they get great submission and verified rates. I just don't know if it's worth the effort to enable these engines. What do you guys think?