Inconsistent Link parsing



  • Issue:
    In 2 of the 3 ways to post a link to the forums (Markdown and BBCode), if the hyperlink is not complete (e.g. missing the protocol), Discourse will neuter the hyperlink and make it inert, yet the third way (plain text) will correct the link to make it valid.

    Example (as typed):
    Plain text:  www.google.com
    Markdown:  [www.google.com](www.google.com)
    BBCode:  [url=www.google.com]www.google.com[/url]

    Example (as rendered):
    Plain text:  www.google.com  (a working link)
    Markdown:  www.google.com  (link text with an inert href)
    BBCode:  [url=www.google.com]www.google.com[/url]  (left as literal text)


    Expected:
    Consistent behavior: either the plain-text URL wouldn't be parsed (and wouldn't show as a link), or, preferably, all 3 would be corrected into valid URLs.



  • Posted it over on meta.d with a link back here.



  • https://meta.discourse.org/t/inconsistent-link-parsing/21083/4

    So the inconsistency is a bug, I'm just hoping "all work" wins out over "let's break the currently functional".



  • *sigh* Looks like we are going to get the plain text "fixed" by no longer working.

    https://meta.discourse.org/t/inconsistent-link-parsing/21083/6

    https://meta.discourse.org/t/inconsistent-link-parsing/21083/9



  • Whaaaaaat?!

    That doesn't even ma... oh, nevermind. Yay Discourse.



  • The non-Jeff answer of "should be broken" even has a like from a third person. I don't think either of them is a DiscoDev (but could be wrong).

    Main point being that the DiscoDevs may have people clamoring for their style of fixes for broken things. Perchance we shouldn't be dumping on them so hard if they are being asked to build broken things (they are still broken and shitty, but if people are going "hey can you give me a shit sandwich" then you can't really hold it against someone for making said sandwiches).



  • It's not as crazy as a typical JDGI, and it's not completely obvious to me that it's wrong. I mean, www is pretty easy to detect, but the regex happy discodevs will end up linkifying stuff like i.e. if you go much further. It might be simpler to be consistent and only linkify stuff with a proper protocol prefix. It would be a step towards consistency and away from discoursistency at least.
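
    Boomzilla's slippery-slope point can be illustrated with a toy pair of patterns (hypothetical, not Discourse's actual regexes): a check anchored on `www.` stays precise, but a broader bare-domain pattern starts matching abbreviations that merely look like host.tld.

```javascript
// Hypothetical patterns, not Discourse's actual code.
// A check anchored on "www." is easy to keep precise:
const WWW_ONLY = /^www\.\S+$/i;

// A broader bare-domain pattern gets greedy:
const BARE_DOMAIN = /\b[a-z0-9-]+\.[a-z]+\b/i;

console.log(WWW_ONLY.test("www.google.com"));      // true
console.log(WWW_ONLY.test("i.e."));                // false
console.log(BARE_DOMAIN.test("google.com"));       // true
console.log(BARE_DOMAIN.test("i.e. for example")); // true -- "i.e" looks like host.tld
```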



  • I don't think there are very many software suites out there that detect links but ignore ones that start with just 'www.'

    The fact that www.google.com automatically redirects to https://www.google.com is just icing on the cake.



  • @boomzilla said:

    but the regex happy discodevs will end up linkifying stuff like i.e. if you go much further

    OK, fair enough on that point. It just feels wrong to take something that works without the protocol (which most users don't care about and browsers tend to hide) and break it instead of making the other two work without a protocol.



  • @boomzilla said:

    only linkify stuff with a proper protocol prefix

    The check to see if they have a protocol prefix could likely be used either to go "invalid URL, don't bother making it a link" (this is what it looks like it already does for Markdown and BBCode) or "oh, they forgot the protocol, let's just tack http:// on for them" (this is what it looks like it does for plain text).

    That's making an assumption about how they built the piece that looks for that in the first place.
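
    A minimal sketch of the two policies described above (assumed logic; the function names and the regex are mine, not Discourse's):

```javascript
// Hypothetical sketch: one shared protocol check that every input path
// (plain text, Markdown, BBCode) could run.
const HAS_PROTOCOL = /^[a-z][a-z0-9+.-]*:\/\//i;

// Policy 1: "invalid URL, don't bother" -- refuse to linkify without a protocol.
function linkifyStrict(url) {
  return HAS_PROTOCOL.test(url) ? url : null; // null = render as plain text
}

// Policy 2: "they forgot the protocol" -- tack http:// on for them.
function linkifyLenient(url) {
  return HAS_PROTOCOL.test(url) ? url : "http://" + url;
}
```

    Either policy, applied uniformly to all three input paths, would resolve the inconsistency; the disagreement in the thread is only over which one.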



  • @locallunatic said:

    Perchance we shouldn't be dumping on them so hard if they are being asked to build broken things

    True - but they should also be using their 'years of experience' with forums to know that your average forum user doesn't care about protocols, they just want to be able to type www.google.com and have it become a link. It isn't just forums that do it like this, plenty of sites or applications let you miss out the protocol and have a working link.
    @boomzilla said:

    It would be a step towards consistency and away from discoursistency at least.

    Yes, I agree there. Consistency is an improvement, even if it is in the wrong direction.



  • @loopback0 said:

    True - but they should also be using their 'years of experience' with forums to know that your average forum user doesn't care about protocols, they just want to be able to type www.google.com and have it become a link. It isn't just forums that do it like this, plenty of sites or applications let you miss out the protocol and have a working link.

    The problem is not that the link isn't working. The problem is that it looks like it's working, when it's really an empty href. They already linkify www.google.com. But [test](www.google.com) results in a link without an href; it renders as "test" but goes nowhere.

    Also, while we're at it, what the fuck happened here?

    http://www.10minutemail.com => [image missing; per the next reply, it showed a onebox lookup firing for nearly every character typed]



  • @Maciejasjmj said:

    Also, while we're at it, what the fuck happened here?

    Looks like it's trying to see if your link is onebox-able, character by character as you type it. Because a slight delay in showing the onebox preview after you finish typing the link is unacceptable, but spamming the server 25 times before that is a-ok.
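
    For what it's worth, the standard fix for that kind of keystroke spam is a debounce: delay the onebox lookup until typing pauses. A generic sketch (fetchOneboxPreview and the editor wiring are stand-in names, not Discourse's actual API):

```javascript
// Generic debounce: collapses a burst of calls into one call after
// the burst goes quiet for delayMs milliseconds.
function debounce(fn, delayMs) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn.apply(this, args), delayMs);
  };
}

// Hypothetical wiring -- fetchOneboxPreview and the "keyup" hookup are
// stand-ins for whatever Discourse actually does:
// const requestPreview = debounce(fetchOneboxPreview, 300);
// editor.on("keyup", () => requestPreview(currentUrl));
```

    With a 300 ms delay, typing a 25-character URL costs one server round trip instead of 25, at the price of a barely perceptible lag before the preview appears.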



  • @hungrier said:

    Looks like it's trying to see if your link is onebox-able, character by character as you type it. Because a slight delay in showing the onebox preview after you finish typing the link is unacceptable, but spamming the server 25 times before that is a-ok.

    Yeah, it was a rhetorical what-the-fuck, as opposed to explanation-demanding what-the-fuck.

