
Garbage - Don't waste your money!

TheCreeper Turkey
edited August 10 in SEREngines
It submits only 7-8 links. I have 500 dedicated proxies and bulk mail.ru and Google accounts, but no luck: only 7-8 links in total. I sent many emails to the owner but got no reply at all. I have tried everything for 15 days. I did exactly what the videos show, tried other settings too, but got 8 links maximum. Something is wrong with SEREngines, and the main thing is that he doesn't reply to emails. People pay you money; maybe you are busy, but you need to answer or fix your tool. $15 is nothing for me, but in some countries $15 is a lot of money. Please don't rip people off. @Sven, you are a king and a very good person; you should kick him from this forum so that he can't rip anyone off anymore. So bros, don't waste your money on only 7-8 garbage links, like Evernote and Diigo links.

Comments

  • Sven www.GSA-Online.de
    Sorry to hear that, really, and yes, you have a point here as well. He should fix his stuff! :neutral:
    Thanked by 1 gh0stmichael
  • Good morning,
    he doesn't even have the courtesy to answer; this has happened several times now.
    He should just hand the project over to you, @Sven, and you could monetize it; it is well within your abilities.
  • sickseo London, UK
    I tried his service as well last month. It seemed like a good, cheap replacement for RankerX to get those web 2.0 links, but I had the same experience: hardly any links from the service, and I cancelled after days of trying and testing. I didn't bother asking for a refund, as he's been AWOL on the forums for ages. Even in his sales thread people complain about his absence.
    Thinking about it now, he gets $14.95 each time someone tries out his service, and it is poop. It's not right. He should either fix it, or it should stop being advertised so other users can avoid the same experience.
  • Everything depends on your settings and the URL base that you have.
  • sickseo London, UK
    Everything depends on your settings and the URL base that you have.

    Are you kidding? How many links do you get from the service?
  • I wasn't too happy either last time I used them a few months ago: dedicated proxies, captcha breakers, Gmail emails, the whole nine yards.
    It only created a few links on 3 engines out of the whole list.
  • Although I must say when I emailed him in the past, he did answer quite fast and was helpful.
  • Yamiraan Manchester
    Guys, is it a good idea to use custom Gmail catch-all emails?
  • @Sven I tried to upload some para sites template but got an "Unhandled archive type" error.
    Please, any ideas on how to rectify this?
  • @loopline with the new Scrapebox features now added, how do we scrape for target URLs with the GSA SER footprints? And how do we apply these footprints with the keywords without having to use the Scrapebox platform?
  • @loopline with the new Scrapebox features now added, how do we scrape for target URLs with the GSA SER footprints? And how do we apply these footprints with the keywords without having to use the Scrapebox platform?


    This is how 
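The question above (pairing footprints with keywords) comes down to a cross join: every footprint combined with every keyword becomes one search query, which is what Scrapebox does when you merge the two lists. A minimal sketch in Python, using made-up placeholder footprints and keywords rather than SER's actual engine files:

```python
# Sketch: pair search-engine footprints with keywords into queries,
# the same cross-join Scrapebox performs when merging the two lists.
# The footprints and keywords below are illustrative placeholders.
from itertools import product

footprints = ['"powered by wordpress"', 'inurl:guestbook']
keywords = ["gardening", "fitness"]

def build_queries(footprints, keywords):
    """Combine every footprint with every keyword into one query each."""
    return [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]

for q in build_queries(footprints, keywords):
    print(q)
```

The point is only that footprints and keywords multiply: 1,000 footprints against 1,000 keywords already yields a million queries, so trim both lists before scraping.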
  • @Sven please, how do I solve the issue "download failed - Socks connection refused"?
  • @Sven I have been getting a lot of those messages recently, and I just bought 30 private proxies.
  • sickseo London, UK
    edited December 4
    Are you running a scraped list or a new verified list?

    That error basically means the site was inaccessible. It usually shows up when either the site is down or it is running something like Cloudflare and your IP is being blocked.

    I see that message all the time, so it's perfectly normal.


    Or are you talking about SEREngines? The short answer is that it doesn't work.
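For anyone who wants to see which of those two failure modes a given URL falls under (truly down versus answering but blocking you), here is a quick stdlib-only sketch. The status labels and the "blocked" heuristic are illustrative, not anything GSA SER itself reports:

```python
# Sketch: distinguish a site that is down from one that answers but
# blocks the request (e.g. a Cloudflare 403). Labels are illustrative,
# not statuses that GSA SER itself reports.
import urllib.request
import urllib.error

def check_url(url, timeout=10):
    """Return a short status label for a target URL."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return f"ok ({resp.status})"
    except urllib.error.HTTPError as e:
        # The server answered but refused the request (403 is typical
        # of Cloudflare-style blocking).
        return f"blocked ({e.code})"
    except (urllib.error.URLError, OSError):
        # No answer at all: site down, DNS failure, or refused connection.
        return "unreachable"

print(check_url("http://no-such-host.invalid/"))
```

A "blocked" result with clean proxies usually means the IP range is flagged; "unreachable" usually means the target is simply dead.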
  • sickseo said:
    Are you running a scraped list or a new verified list?

    That error basically means the site was inaccessible. It usually shows up when either the site is down or it is running something like Cloudflare and your IP is being blocked.

    I see that message all the time, so it's perfectly normal.


    Or are you talking about SEREngines? The short answer is that it doesn't work.
    Yes, I am running a scraped list from Scrapebox. What, in your opinion, would be the remedy?
  • sickseo London, UK
    You need to process that list first with one project. Place your scraped list in the identified folder first and set one project to use that identified folder only. This list contains a mixture of working and non-working sites, so you will see those errors whilst processing it.

    You should also set your verified folder to an empty folder location. Any additional projects should be run using only the verified folder. This folder will automatically save any verified links that are found whilst processing your first project. So any projects that use it will only be running working sites, hence minimising the number of errors you'll see.

    Make sure to do this also: 



    Make sure the verified box is ticked; this ensures new verified URLs will be saved to the verified folder.
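Before dropping a Scrapebox export into the identified folder, it can also help to de-duplicate the list by host, so the sorting project doesn't retry the same dead site hundreds of times. A small sketch (the example URLs are placeholders):

```python
# Sketch: de-duplicate a scraped URL list by host before feeding it
# to the identified folder, keeping the first URL seen per host.
from urllib.parse import urlparse

def dedupe_by_host(urls):
    """Keep the first URL per host, preserving input order."""
    seen, out = set(), []
    for url in urls:
        host = urlparse(url.strip()).netloc.lower()
        if host and host not in seen:
            seen.add(host)
            out.append(url.strip())
    return out

urls = [
    "http://example.com/page1",
    "http://example.com/page2",   # same host, dropped
    "http://blog.example.org/post",
]
print(dedupe_by_host(urls))
```

One URL per host is enough for the sorting project, since SER identifies the platform from the site itself rather than from any particular page.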
  • sickseo said:
    You need to process that list first with one project. Place your scraped list in the identified folder first and set one project to use that identified folder only. This list contains a mixture of working and non-working sites, so you will see those errors whilst processing it.

    You should also set your verified folder to an empty folder location. Any additional projects should be run using only the verified folder. This folder will automatically save any verified links that are found whilst processing your first project. So any projects that use it will only be running working sites, hence minimising the number of errors you'll see.

    Make sure to do this also: 



    Make sure the verified box is ticked; this ensures new verified URLs will be saved to the verified folder.
    Thanks for your help!