So, here's a question



  • No, I get that it gets baked twice - once on the client in JavaScript and once on the server (in Ruby, I assume, since IIRC that's what Discourse uses). The problem is that the two ovens (for lack of a better term) cook the posts slightly differently, leading to Discoursistency in rendering.

    And I get DRY, but if that's your concern, they should use Node on the server to bake it in the same oven. I have no problem with them doing it twice.


  • :belt_onion:

    @rad131304 said:

    No, I get that it gets baked twice - once on the client in JavaScript and once on the server (in Ruby, I assume, since IIRC that's what Discourse uses). The problem is that the two ovens (for lack of a better term) cook the posts slightly differently, leading to Discoursistency in rendering.

    Nope - it's actually baked in JS on the server as well. Why it gives different results? Well, f**k if I know :P

    From my (albeit limited) Ruby experience, they're literally running the same code.
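
    Roughly the shape of that "one oven" setup, sketched (the names and the embedding mechanism here are illustrative, not Discourse's actual code): one script defines the cook function, and both the browser and a JS engine embedded in the Ruby process load the same file.

    ```js
    // cook.js - one markdown/BBCode-ish pipeline, loadable from the browser
    // (where `global` is `window`) or from a JS engine embedded in Ruby
    // (e.g. V8 via therubyracer). Purely illustrative, not the real baker.
    (function (global) {
      function cook(raw) {
        return "<p>" + raw
          .replace(/\[b\](.*?)\[\/b\]/g, "<strong>$1</strong>")
          .replace(/\n\n+/g, "</p><p>") + "</p>";
      }
      global.cook = cook; // same symbol, two hosts, one oven
    })(this);
    ```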



  • @sloosecannon said:

    @rad131304 said:
    No, I get that it gets baked twice - once on the client in JavaScript and once on the server (in Ruby, I assume, since IIRC that's what Discourse uses). The problem is that the two ovens (for lack of a better term) cook the posts slightly differently, leading to Discoursistency in rendering.

    Nope - it's actually baked in JS on the server as well. Why it gives different results? Well, f**k if I know :P

    Well then, yes, that's a huge :WTF:. I thought they were 2 different libraries for 2 different languages that implemented the "spec" of the HTMarkdownBBCODEWhateverSink++ differently.


  • BINNED

    @sloosecannon said:

    Well, how exactly would this affect them? The text is still there, just not HTML-ified... IANASEE so I'm legitimately asking

    1. Crawlers are designed for HTML, not random HTMLDownBBCode. They know how to parse HTML to get pure text output; they have no idea whether [something] is actual content, a custom BBCode-style tag, or whatever. They will most likely assume it's legitimate content. Which it is not. (See the snippet after this list.)
    2. Semantics. Search engines understand (proper use of) <h1> ... <h6>, for example. Or lists. Or tables. HTML5 has <article> and <aside> - <article> should, reasonably, be prioritized. Check what quotes are rendered as here, btw.
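
    A hypothetical before/after to make point 1 concrete (made-up content, nothing Discourse-specific):

    ```js
    // What a crawler sees if you serve the raw markup: bracket soup it
    // has to guess about, so it likely indexes the markup as content.
    const raw = "[quote user=alice]Some earlier reply[/quote] My actual answer";

    // What it sees if you serve the cooked HTML: <blockquote> tells it
    // exactly which part is quoted and can be de-prioritized.
    const cooked = "<blockquote>Some earlier reply</blockquote><p>My actual answer</p>";
    ```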

    @sloosecannon said:

    Also, they actually do do that. How they end up with different content, I have NO F**KING clue

    Do you have a reliable source on that? Because I heard it both ways already and I don't know any more. And no, I don't feel like poking around the hellstew that is Discosource.



  • @Onyx said:

    Do you have a reliable source on that? Because I heard it both ways already and I don't know any more. And no, I don't feel like poking around the hellstew that is Discosource.

    It's actually a sensible idea - you don't want to throw AJAX requests back and forth on keydown, and otherwise you lose the live preview. How they use the same code and end up with different results I don't know, and don't even know if I want to know.
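
    The client-side half of that, sketched (hypothetical names, not the actual composer code): re-cook locally on input, debounced, so the preview updates without a single server round-trip.

    ```js
    // Live preview without AJAX: re-render locally, debounced so we don't
    // re-cook on literally every keydown. `editor`, `preview` and `cook`
    // are stand-ins, not Discourse's real API.
    let timer;
    editor.addEventListener("input", () => {
      clearTimeout(timer);
      timer = setTimeout(() => {
        preview.innerHTML = cook(editor.value); // same cook() the server runs
      }, 150);
    });
    ```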


  • BINNED

    Oh, the idea is fine. There was just much contention about what renders stuff serverside; I'm sure sam said they were using JavaScript, but I couldn't find the post, and Discoursistent rendering results started making me doubt my memory.

    Maybe it's not the parser? Again, I found some stuff that got sanitized serverside which looked utterly broken in preview. And vice-versa, really. Maybe they are not running the sanitizer clientside and that's what is mucking with rendering results, rather than the parser itself?



  • They are sanitizing... something. <script> and stuff won't show up in preview either.

    I think they muck with the cooked post further server-side in Ruby, but the cooker itself is the same. You can find random HTML-parsing snippets in the Ruby code, and whether they're exposed to the client-side editor... well, finding that out would require work.
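
    For reference, the kind of sanitizing being described usually looks something like this whitelist sketch (browser-flavored and illustrative, not the actual Discourse sanitizer, which also filters attributes):

    ```js
    // Keep only whitelisted tags; unwrap everything else, except <script>
    // and <style>, which get dropped contents and all.
    const ALLOWED = new Set(["p", "a", "strong", "em", "blockquote", "code"]);

    function sanitize(html) {
      const doc = new DOMParser().parseFromString(html, "text/html");
      doc.body.querySelectorAll("*").forEach(el => {
        const tag = el.tagName.toLowerCase();
        if (tag === "script" || tag === "style") {
          el.remove();                      // content and all
        } else if (!ALLOWED.has(tag)) {
          el.replaceWith(...el.childNodes); // keep the text, drop the tag
        }
      });
      return doc.body.innerHTML;
    }
    ```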



  • @sloosecannon said:

    That doesn't address the question though... I'm asking why we even need to do any work server-side at all... why can't we just pass the raw when you post, then pass it to the client when the post is requested?

    I don't think the baker is currently passed to most mobile clients. There isn't a need for it, since there isn't even a preview.

    Also, doing the official baking on the server side allows whitelists and blacklists (admit it, we know Discourse has both) to be updated immediately, without worrying about needing to update client scripts. You just have to pull the *list from the DB on the server side, but you actually have to bundle it into a script for the client side.
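
    The operational difference, sketched (hypothetical names - we obviously don't know Discourse's actual schema or build setup):

    ```js
    // Server side: the list is one query away, so an edit applies on the
    // very next bake, no redeploy needed.
    async function allowedTags(db) {
      const res = await db.query("SELECT tag FROM allowed_tags");
      return res.rows.map(r => r.tag);
    }

    // Client side: the list is frozen into the shipped bundle at build
    // time, so changing it means rebuilding and pushing a new script.
    const CLIENT_ALLOWED_TAGS = ["p", "a", "strong", "em", "blockquote"];
    ```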



  • @sloosecannon said:

    Other way. When I submit a post, the post:cooked param has my post in it...

    That is certainly ignored.

    @loopback0 said:

    It's just the oven on the server and the oven on the client are different.

    @Maciejasjmj said:

    How they use the same code and end up with different results I don't know, and don't even know if I want to know.

    The post gets passed through an XML parser on the server (one the client doesn't have) AFTER running through the JS, which is the same on both sides. This is responsible for the "tags automatically closed" discrepancies.
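
    You can watch a forgiving parser do exactly that in miniature - if only one side of the pipeline runs this step, the two bakes diverge:

    ```js
    // Feed an unclosed tag through a lenient HTML parser and it comes
    // back auto-closed.
    const doc = new DOMParser().parseFromString("<b>unclosed", "text/html");
    console.log(doc.body.innerHTML); // "<b>unclosed</b>"
    ```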

    @sloosecannon said:

    @RaceProUK said:
    You mean, rebake the post client-side every time it's requested? Sure, why not? Run a mobile phone battery flat in 30 minutes, but sure!

    I... don't think it would be that big of an issue... Most webpages run a fair amount of JS anyways. I do see your point though. FWIW, Discourse kills my phone's battery anyways so.... meh

    You're forgetting the part where baking oneboxes makes network requests.
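
    To spell that out: building a onebox means actually fetching the linked page for its metadata, something like this hypothetical sketch (not the real onebox engine):

    ```js
    // One network round-trip per bare URL in a post - exactly what you
    // don't want to redo on a phone every time the post is displayed.
    async function onebox(url) {
      const res = await fetch(url);
      const html = await res.text();
      const title = /<title>([^<]*)<\/title>/i.exec(html)?.[1] ?? url;
      return `<aside class="onebox"><a href="${url}">${title}</a></aside>`;
    }
    ```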


  • FoxDev

    @riking said:

    You're forgetting the part where baking oneboxes makes network requests.

    Which would, of course, increase battery drain even more



  • @RaceProUK said:

    Which would, of course, increase battery drain even more

    And you think TCotCDCK cares?

