The javascript support solution, circa 2015

  • We're building a lightweight MVC replacement for a site primarily aimed at low-end "feature phones"; the site was originally coded in ASP.NET WebForms (I know...)

    The target market is Nigeria, Kenya and Tanzania now, with expansion into the other large countries in the area. The users' phones range from crappy black & white barely-can-browse-the-internet "feature phones" up to the more affluent iPhone-type. Our team's plan to approach this wide variety is to build a very basic, lightweight site that relies on server-side validation for the crap phones, and to "progressively enhance" using JavaScript (making use of jQuery Mobile for feature detection) so the high-end users get a decent experience.
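    As a rough sketch of that progressive-enhancement idea (illustrative only; none of these names are from the actual project): the baseline page stays plain HTML with server-side validation, and a small capability check decides whether any enhancement runs at all.

```javascript
// Hypothetical "cuts the mustard" check: feature-detect a few basics
// instead of sniffing user-agent strings. Phones that fail the check
// (or never execute JS at all) simply keep the plain server-rendered site.
function cutsTheMustard(win) {
  return !!(win.document &&
            win.document.querySelector &&
            win.addEventListener &&
            win.XMLHttpRequest);
}

if (typeof window !== 'undefined' && cutsTheMustard(window)) {
  // Capable browsers get a class hook that enhancement scripts key off;
  // everything else behaves exactly as if this file never loaded.
  document.documentElement.className += ' enhanced';
}
```

    The point of the guard is that failure is harmless: a phone with no JS, or broken JS, never sees the enhanced code path.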

    This is what our "Solutions Architect" had to say on that:

    Hi guys. The requirement for XXXXX is no javascript. Even for those feature phones that do "support" JS we dont know what subset is implemented and whether the JS implementation or dynamic DOM works well. Since the requirement is for maximum compatibility even for obscure phones we will never test on, the safest option is to have no JS.

    Ja. That is a great idea. Because shitty UX on a feature phone should definitely affect high end users.

    Oh great.

    Followed up with this fucking uneducated directive from the guy in charge of the project:

    Guys, please no Javacript what so ever.

    I'm sorry. This fucking problem was solved in 199-fucking-8 when not all browsers supported the same JS. The fact that it has come back and on mobile makes no fucking difference to the concept. Back then it was called "graceful degradation". Now the buzzword is "progressive enhancement". Throughout my entire career as a web developer I've been catering to the lowest target browser while enhancing the higher ones.


    This is a project we picked up for another team (while management argues about our upcoming project), so we are doing the guy in charge a favour. Thank god we don't have to work with them all the time.

  • SockDev

    Going for maximum compatibility, I can understand a policy of zero JS; even with modern libraries and frameworks, it's still often more trouble than it's worth.

    …on the other hand, seems a shame to serve an 80s BBS level UI to phones that can process more than one calculation a year…

    <!-- Emoji'd by MobileEmoji 0.2.0-->

  • Well, they have a point. A phone might support JavaScript but run it extremely slowly or with lots of bugs, and there's sadly no way to tell.

    Actual example: Discourse has a "no javascript" version that only lets you read, not post (mostly for search engines to scrape). Until very recently, if you tried to load this forum in Opera mobile classic, you got the full version, but so buggy it was completely unusable.

  • SockDev

    @anonymous234 said:

    Until very recently, if you tried to load this forum in Opera mobile classic, you got the full version, but so buggy it was completely unusable.

    Discourse Mobile isn't exactly a rock-solid product on modern smartphones… 😛


  • jQuery Mobile is what I'd use.

    It grades phones into A, B and C categories by means of feature detection (or, if the phone can't handle JS at all, obviously no category, as the JS won't even run).

    Category A can handle anything pretty much. B is iffy, C can do pretty much nothing.

    What we are arguing about is the use of JS for client-side validation (nothing close to Discourse!), which in a data-frugal mobile market* is a very important usability concern. The site really doesn't need much more than that to become way better to use.

    If I can prevent a class B phone from having to post back (with the resultant data use on an EDGE connection) with a simple piece of JavaScript, I do not see any reason why I should not do so.

    Class A phones can have a decent experience with Ajax etc.; class C will get little or nothing. It's the class B ones I care about (e.g. the Spice M6800 Flo).

    * In Africa data is expensive, and connections are slow. We have way higher mobile internet coverage than cable coverage, but the mobile network is overloaded and stretched to capacity. Most connections use low-speed, low-data EDGE or 3G.
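    To make the class B case concrete, this is the sort of thing meant by "a simple piece of Javascript": a sketch, not the project's actual code, with a hypothetical form id and made-up field rules. The server validates regardless; the script only spares the phone an obviously wasted round trip.

```javascript
// Pure validation rules, kept separate from the DOM so the same checks
// can be mirrored server-side. Field names and rules are invented here.
function validateSignup(fields) {
  var errors = [];
  if (!fields.phone || !/^\+?[0-9]{7,15}$/.test(fields.phone)) {
    errors.push('phone');
  }
  if (!fields.name || fields.name.replace(/\s/g, '') === '') {
    errors.push('name');
  }
  return errors; // empty array: let the postback go ahead
}

// DOM wiring, only when a browser is actually running this.
if (typeof document !== 'undefined') {
  var form = document.querySelector('#signup'); // hypothetical form id
  if (form) {
    form.addEventListener('submit', function (e) {
      var errors = validateSignup({
        phone: form.elements.phone.value,
        name: form.elements.name.value
      });
      if (errors.length) {
        e.preventDefault(); // skip the round trip; the server still validates when JS is absent or broken
        alert('Please check: ' + errors.join(', '));
      }
    });
  }
}
```

    If the script never runs, the form posts back and the server catches the same errors; the enhancement is purely a bandwidth and latency saving.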

  • I survived the hour long Uno hand

    @RaceProUK said:

    80s BBS level UI


    "No javascript" does not mean "80s BBS UI". Seriously, I think if Chrome dropped JS support all the web devs would simultaneously quit their jobs and cry like babies. "What will I do without JQUERY?!"

  • SockDev

    I was thinking more because the non-JS phones would also have crap displays, so they wouldn't be able to show the whole of the Discorainbow.

    …and I think I just coined a new Discoterm? Maybe two?


  • SockDev

    Now to work out the definition… done. Discoterm too.


  • When used correctly, JS can reduce bandwidth by a shitton, which people in countries with metered Internet would really appreciate.
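    The arithmetic behind that claim is easy to sketch (the numbers below are made-up round figures, purely to show the scale):

```javascript
// If an enhanced page swaps a full-page postback for a small JSON reply,
// the per-session saving is roughly:
function bytesSaved(fullPageBytes, jsonBytes, interactions) {
  return (fullPageBytes - jsonBytes) * interactions;
}

// e.g. a 60 KB page vs a 300-byte JSON response, over 20 interactions:
// (60000 - 300) * 20 = 1,194,000 bytes, which is over a megabyte per
// session that a user on a metered plan never pays for.
```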

  • @Yamikuronue said:

    all the web devs would simultaneously quit their jobs and cry like babies.

    Would anyone actually notice if this happened?
    Filed under: can we use jQuery for this?


  • @tar said:

    Filed under: can we use jQuery for this?

    $('.web-developers').on('hissy-fit', function(e) { /* your code here */ });

  • @tar said:

    Would anyone actually notice if this happened?

    Remember the web ca. 1990?

  • You sure you mean "the web"? Gopher was a thing in 1990, but I think Mosaic didn't exist yet...

  • @tar said:

    Gopher was a thing in 1990, but I think Mosaic didn't exist yet.

    Mosaic was developed at the National Center for Supercomputing Applications (NCSA) at the University of Illinois Urbana-Champaign beginning in late 1992. NCSA released the browser in 1993, and officially discontinued development and support on January 7, 1997. However, it can still be downloaded from NCSA.

  • That basically confirms my hypothesis, I believe...

  • Yes, it does.

  • Alright then. @Ragnax, are you sure you didn't mean "the net ca. 1990"?

  • Discourse touched me in a no-no place

    @HardwareGeek said:

    NCSA released the browser in 1993,

    I remember using the early Mosaic. Using HTTP sucked a lot because networks were very slow and nothing would show until everything had been downloaded. (Like some modern JS-heavy sites…) It was particularly bad because the Mosaic help pages were hosted in the US, and the transatlantic network bandwidth was pitiful at the time.

    But Mosaic was still good, as it was a much better Gopher client than the nasty things we'd been using before.

  • I remember Mosaic being the 'new thing', probably around my second year of university. Prior to that, internet was all gopher and telnet-based BBSs/MUDs, Usenet and filesharing over weird ftp sites. (Literally typing ftp at the TTY and going through all that palaver.)

    Then again, even the old text-based stuff was fascinating, seeing as it was like nothing else we'd ever seen at the time (plus you could fantasize that you were somehow involved in the movie WarGames...)

  • @tar said:

    I remember Mosaic being the 'new thing'

    I was a bit late to the game, using Mosaic only briefly before Netscape left it in the dust. I wasn't all that much interested in the Web, considering Usenet and email adequate, especially since my ISP charged extra for SLIP connections, until I got access at work. Of course, I changed my mind quickly once I finally saw the "new thing" for myself.

  • @tar said:

    You sure you mean "the web"? Gopher was a thing in 1990, but I think Mosaic didn't exist yet...

    That would be the point I was making: watch the web crumble back into the primordial state of the old 'net' if there are no specialized developers to produce or maintain modern applications for it.
