
The contextual links disappear

Hi 

I am creating contextual backlinks, both dofollow and nofollow, and the PageRank of the pages is 2 and above. But as time goes on I am watching my links disappear. Why is that?

How do you maintain your contextual links?

Comments

  • VMart Natural SEO
    :(
  • shaun https://www.youtube.com/ShaunMarrs
    There's a bunch of reasons.

    Are you using spun content?
  • VMart Natural SEO
    @shaun
    Can Google detect spun articles?
  • shaun https://www.youtube.com/ShaunMarrs
    No idea. I would imagine they could though.

    Months, maybe years ago, I pasted a spun article into Grammarly and it detected loads of problems with the article. If they can do that, I would imagine Google can do so much more, not to mention potential duplicate content detection, as they have a duplicate snippet patent, but I haven't looked into it much.

    On the other hand, months after I did that I was still using SER successfully.

    My main point for the OP would have been: if he is using spun content, and you were the webmaster of a higher PR site manually checking articles, you would delete the article too.
  • redrays Las Vegas
    edited March 2017
    Beyond spun content, you need to keep in mind that your posts on the vast majority of decent quality sites will be off topic and removed by a moderator for that reason. I had a friend who tested using good content and human captcha solving on decent sites with a good CMS (stuff like BuddyPress). Tons of his links still got taken down because the sites would be about something like fashion while his niche was something like finance.
  • VMart Natural SEO
    edited March 2017
    I have already informed Sven about the SER bug where contextual links disappear (link loss).

    The links generated in tier 1 are stored in a text file.
    Within 2-3 days SER had deleted the verified links from that campaign.

    But those verified links are still alive. I imported them back into my project again: some links were accepted by the campaign and some show "unknown status" even though they are alive.

    Please solve this issue @sven
  • Sven www.GSA-Online.de
    There is no bug. All that might be buggy are your proxies here. SER will only remove verified URLs once the link is no longer present on the page and there was content delivered when checking that URL... and all of that, of course, only once you enable the re-verify option.
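    A rough sketch of the kind of check being described here, in Python. This is not SER's actual code; the use of requests, the proxy handling and the function name are just assumptions for illustration:

    ```python
    import requests

    def should_remove(verified_url, my_link, proxies=None):
        """Only treat a verified link as lost if the page actually
        delivered content AND our link is missing from that content."""
        try:
            resp = requests.get(verified_url, proxies=proxies, timeout=30)
        except requests.RequestException:
            return False  # nothing delivered (proxy or site problem): keep the link
        if not resp.ok or not resp.text.strip():
            return False  # empty or error response: keep the link
        return my_link not in resp.text  # content delivered but link gone: remove
    ```

    If a proxy gets served something like a block page that loads fine but does not contain the link, a check like this would wrongly flag the link as gone, which fits the point that the proxies are the likely culprit.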
  • VMart Natural SEO
    This happens for me even without enabling re-verification.

    I'm using buyproxy for submission. Links that SER removed two months ago are still alive for me. I'm not technically strong, so if I say anything wrongly please don't take it the wrong way.

    The verified links which I have stored are still alive. That is what confused me so much.

    If I re-verify using proxies, will this issue be cleared?
  • Sven www.GSA-Online.de
    No problem, of course... I see you have issues with this, but I can't really see any bugs in this.

  • redrays Las Vegas
    Proxies could be buggy, the sites could be going down for a while because they're getting slammed by too many SER users, etc etc etc. If you're doing automated link building it's going to be very difficult to check which links are still alive without a significant error rate.
  • shaun https://www.youtube.com/ShaunMarrs
    Pretty much what redrays said.

    VMart, imagine you have 10 verified links and your re-verify option kicks in.

    • 8 of the sites are fine and 2 are offline, for what could be any of a number of reasons. SER removes those two links.
    • 8 total URLs now; 6 are online and 2 are offline, so they are removed.
    • 6 total sites shown in SER now, but one site from the first re-verification has come back online. SER has no way to check that natively.
    One workaround I was using is to have a master project that never gets re-verified. Right click and duplicate everything on it, then re-verify the duplicate. Open its verified URLs and paste them into your T2 projects' website field to build links to.

    This is not ideal and has its problems, but it's the best I have been able to come up with.
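    Here is a rough sketch of a more forgiving re-check along those lines, in Python. None of this is SER functionality; the threshold, the function name and the use of requests are just assumptions. The idea is to only drop a URL after the link has been missing on several consecutive checks, so a site that is briefly offline does not get culled:

    ```python
    import requests

    STRIKES_BEFORE_REMOVAL = 3  # assumed threshold, tune to taste

    def recheck(targets, strikes, proxies=None):
        """targets maps verified_url -> the backlink expected on that page.
        strikes maps verified_url -> consecutive 'link missing' count.
        Returns the URLs whose link has been missing several checks in a row."""
        dead = []
        for url, my_link in targets.items():
            try:
                resp = requests.get(url, proxies=proxies, timeout=30)
            except requests.RequestException:
                continue  # proxy or site hiccup: don't count it against the link
            if not resp.ok or not resp.text.strip():
                continue  # no real content delivered: skip this round
            if my_link in resp.text:
                strikes[url] = 0  # link is still there, reset the counter
            else:
                strikes[url] = strikes.get(url, 0) + 1
                if strikes[url] >= STRIKES_BEFORE_REMOVAL:
                    dead.append(url)
        return dead
    ```

    You would persist the strikes dict between runs and only prune your T2 target lists from the returned dead list.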

    Another thing that could be causing it is your link output per proxy per domain, as I think this is what was causing almost 90% of my links to be removed a few months back.

    For ease of example, say you have 10 proxies and 100 targets, with you posting to each of them once per day, and everything works fine. Now you have scaled up your projects but not your proxies or target list. To meet your new daily link requirement you are now having to hit each of your targets 50 times per day, for example, still on 10 proxies. The webmaster simply needs to install plugins that detect the submissions and their IPs and delete entries like this.

    Thankfully, if this is the problem, just scale down your link output per proxy and your link retention goes back up.
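    Rough numbers for that example (my arithmetic, not anything SER reports):

    ```python
    proxies, targets = 10, 100

    # Before scaling: 1 post per target per day
    per_proxy = targets * 1 / proxies            # 10 submissions per proxy per day
    per_proxy_per_domain = per_proxy / targets   # ~0.1 hits on any one domain from one proxy

    # After scaling: 50 posts per target per day, same 10 proxies
    per_proxy = targets * 50 / proxies           # 500 submissions per proxy per day
    per_proxy_per_domain = per_proxy / targets   # 5 hits per day on the same domain from the same proxy
    print(per_proxy, per_proxy_per_domain)       # 500.0 5.0
    ```

    Going from roughly one hit every ten days to five hits every day from the same IP on the same domain is trivial for a webmaster's plugin to spot.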

    I am only focusing on my white hat site right now, as I wanted a break from black hat stuff and I am really busy moving back home, but I have been planning some black hat stuff for next month. I still plan to use SER on T1, but it will not have a T2 built to it as the links are too unreliable. I will use SER on T1 to increase the referring domains and dilute anchor text ratios. I also plan to use either RX or SERE, depending on their state by the time I start up again, for a set of T1 web 2.0s. These will then have a T2 of SER built to them to power the web 2.0s up.
  • redrays Las Vegas
    As usual @shaun does a much better job with the explanation :)