
Ability to Filter Redirects

edited October 2014 in Feature Requests
After extensive testing I've found the cause of SER creating duplicate contextual links even when I tell it not to. The only way to lessen this is to uncheck "continuously try to post", uncheck "allow posting on same site", and apply filters, but it still results in some extra duplicate links.

The reason for this is the URLs from link lists created with URL shorteners, other 301-type links, and the n.a.m.e@examplesite.com type redirects. SER creates a duplicate link because it does not have the initial redirect link in the project's verified list. For example:

1) SER loads http://urlshortnersite.com/example123
2) SER does not have any http://urlshortnersite.com links in the project's verified URLs
3) SER continues with trying to submit a link
4) SER processes the link, which redirects to http://example.com
5) SER creates a duplicate link
6) SER saves the initial REDIRECT link to the site list (if enabled)
7) SER repeats steps 1-6 every time it comes across http://urlshortnersite.com/example123
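
To make the request concrete, here is a rough Python sketch of the check I'd like SER to do (the function names and the use of the requests library are just illustration, not SER's actual code): resolve the redirect chain first, then compare the final domain against the project's verified domains.

```python
from urllib.parse import urlparse

import requests


def final_domain(url, timeout=10.0):
    """Follow the redirect chain (301s, shorteners, etc.) and return
    the domain the URL ultimately lands on."""
    resp = requests.head(url, allow_redirects=True, timeout=timeout)
    return urlparse(resp.url).netloc.lower()


def should_submit(target_url, verified_domains):
    """Skip the target if its *final* domain is already in the
    project's verified list, even when the raw target is a
    shortener link."""
    try:
        domain = final_domain(target_url)
    except requests.RequestException:
        return False  # unreachable target, nothing to submit
    return domain not in verified_domains


# The shortener URL resolves to example.com, which is already in the
# verified list, so this logic would skip it instead of posting a
# duplicate.
verified = {"example.com"}
print(should_submit("http://urlshortnersite.com/example123", verified))
```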

Even with those options unticked and domain filters of "@" and many URL shorteners, I still get a few duplicates, but I also miss out on a lot of links, since I can't risk deleting the target URL history on already-parsed URLs unless I want more duplicate links.

Everyone who does not want duplicate links on domains should check their projects' verified links. SERList.com lists, while fantastic, are a good example of link lists that have many of the same targets but with a lot of shortened-URL versions.

I'm very tired, so sorry if I rambled or didn't quite make sense, but pretty much any type of URL that redirects is at risk of creating duplicates. I don't even know if this feature could be done, but it would definitely be nice.

Comments

  • SvenSven www.GSA-Online.de
Hmm, indeed this is something I will look into and fix.
  • @Sven - Thanks. I (as well as many others, I'd imagine) would absolutely love for this issue to be fixed. I get so damn frustrated seeing 20-30 links from the same domain when I tell it to only post once. I thought the amount of emails might be a bug in the way it works, but it's not; it all stems from links which redirect to something else. I've had to miss out on a lot of links because I can't retry the parsed ones unless I want to start the duplicate-link cycle all over again.
  • @Sven - Thanks for applying a fix so quickly. I've done quite a bit of testing and haven't been able to replicate any more duplicate links, which is fantastic! It stops any of the URL shortener redirects and the http://example.com=http://example2.com type redirects.

    I still have my filter of "@" (without quotes) to stop the n.am.e@example.com URLs some list providers like to use, so I'm not sure if these types of URLs would result in a domain change --> error or not. But it doesn't matter to me either way, since the filter saves bandwidth by not trying to load the URL at all.
  • SvenSven www.GSA-Online.de
    it should take any @ in front of the domain out and do a compare (see the sketch after these comments).
  • Okay. I wasn't sure if it did or not, since before the update it would create duplicate links with those blabla@example.com type links.
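
To illustrate the comparison Sven describes above (purely a sketch, not SER's implementation): drop any name@ userinfo ahead of the host before comparing, so n.am.e@example.com and example.com count as the same site.

```python
from urllib.parse import urlparse


def comparable_domain(url):
    """Return the hostname with any name@ userinfo (and port) stripped."""
    parsed = urlparse(url if "//" in url else "http://" + url)
    # .hostname already excludes userinfo and port, unlike .netloc
    return (parsed.hostname or "").lower()


assert comparable_domain("http://n.am.e@example.com/page") == "example.com"
assert comparable_domain("http://example.com/") == "example.com"
```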