Apple Swift



  • Apple announced a new programming language today called Swift.

    After a brief review of their highlights, here are some of the things that stood out to me:

    Inferred types make code cleaner and less prone to mistakes.

    Because who wants strongly typed variables?

    you don’t even need to type semi-colons.

    We don't need any clear indication of statement endings. Let's throw out one of the clear indicators that this is a C-based language and make it look more Basic.

    Create new tests, verifying they work before promoting into your test suite.

    Umm, other environments have been doing this for years. Case in point: Visual Studio. I use 2005 and 2008 at work, and both give me the ability to do this. Making a feature that is well behind the curve one of your highlights makes it feel like you're stretching.

    I'll be honest, I've avoided developing for Apple products in the past for 2 reasons:

    1. The entry price (face it, their hardware is expensive, and I won't release something that's only been virtually tested).
    2. The monstrosity of Objective-C.

    Maybe now that Swift is out, I'll be down to one reason not to develop for Apple.


  • ♿ (Parody)

    Seems like everyone else is turning to JavaScript, but that's not cool enough for Apple, so we have iJavascriptSwift.



  • So did they branch Python?


  • Discourse touched me in a no-no place

    @abarker said:

    I'll be honest, I've avoided developing for Apple products in the past for 2 reasons:

    1. The entry price (face it, their hardware is expensive, and I won't release something that's only been virtually tested).

    2. The monstrosity of Objective-C.

    Maybe now that Swift is out, I'll be down to one reason not to develop for Apple.

    Look on the bright side. Maybe you'll just add one more reason to not like them?


    Filed under: Testing stuff is ✨ NEW! ✨ and I've only been doing it my whole career


  • Discourse touched me in a no-no place

    But now we've got a blend of JS and Obj-C!!!!!!!

    😃 🔫



  • @dkf said:

    Look on the bright side. Maybe you'll just add one more reason to not like them?

    😕 Hmmm ... I should have considered that possibility. After all, I can find no indication that Swift will replace Objective-C, only that it is another option.



  • We just need Ben to contrast it with Go.


  • FoxDev

    @abarker said:

    Inferred types make code cleaner and less prone to mistakes.
    Because who wants strongly typed variables?

    Type inference != weak typing. C# is strongly typed, and has support for type inference with the var keyword.
    @abarker said:
    you don’t even need to type semi-colons.
    We don't need any clear indication of statement endings. Let's throw out one of the clear indicators that this is a C-based language and make it look more Basic.

    Why is this a problem? Plenty of languages use newlines to terminate statements; there's no reason you have to always use a semi-colon.
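
    For what it's worth, the same seems to hold for Swift itself. A trivial sketch based only on the announcement (I haven't compiled it against the real toolchain):

    // Types are inferred, but still fixed and checked at compile time.
    let answer = 42           // inferred as Int
    var greeting = "Hello"    // inferred as String
    greeting += ", Swift"     // fine: String += String
    // greeting += answer     // won't compile: can't append an Int to a String
    print(greeting, answer)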


  • ♿ (Parody)

    @RaceProUK said:

    Plenty of languages use newlines to terminate statements; there's no reason you have to always use a semi-colon.

    And so plenty of languages have a WTF feature. Either use them or don't.



  • Technically, you only need a single semicolon per program in languages that support a C-like preprocessor.


  • ♿ (Parody)

    Obfuscated. But fair.



  • @RaceProUK said:

    Type inference != weak typing. C# is strongly typed, and has support for type inference with the var keyword.

    You got me there. I inferred from their wording - possibly incorrectly - that they were eliminating strong typing in Swift. On a second reading, I'm still leaning toward my initial interpretation, but there is a chance that I'm incorrect. In either case, this is another indication of "falling behind the curve" on Apple's part.

    @RaceProUK said:

    @abarker said:
    you don’t even need to type semi-colons.

    We don't need any clear indication of statement endings. Let's throw out one of the clear indicators that this is a C-based language and make it look more Basic.

    Why is this a problem? Plenty of languages use newlines to terminate statements; there's no reason you have to always use a semi-colon.

    This one probably comes down to preference. I like being able to easily wrap a long line of code onto the next line for readability. In most C-family languages, it's easy: just add a line break, and the semi-colon indicates when the statement is finished. In Basic-style languages, not so much. You have to remember to add a special character to tell the compiler that the statement continues on the next line (I think it's an underscore in VB). It just doesn't feel as clean. Even when you don't wrap your long lines, the semi-colon helps readability: once you see it, you know you've reached the end of that statement. Incidentally, the semi-colon also makes line-breaks unnecessary.
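
    If I'm reading the Swift write-up correctly (I haven't actually run any of this), wrapping there hinges on leaving the operator dangling at the end of the line, which is exactly the kind of implicit rule I'd rather trade for a plain semi-colon:

    // The trailing '+' is the only thing telling the compiler that the
    // statement continues on the next line.
    let message = "a long string that " +
        "spills onto a second line"
    print(message)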



    Visual Basic 2010 has implicit line continuation, which, I have to say, isn't that hard to read.

    That said, I rarely use VB.NET these days, and am instead forced to use \ for line continuation in Python.



  • @boomzilla said:

    Either use them or don't.

    What's puzzling to me is when (as in this press release) the lack of semicolons is used as some sort of selling point. There appears to be some interesting stuff in the article and some "hmm..." stuff, but I wonder who's getting excited at the fact that they can avoid that one extra keystroke on 60-70% of lines? The readability impact is so minuscule either way...



  • The stronger (i.e., more expressive) a type system is, the more the compiler can infer.

    It remains to be seen how expressive Swift's type system is. I don't see any theoretical papers out. If Apple is smart, they'll make it as strong as System F.
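
    For a small, concrete taste (my own sketch, using Swift's syntax; nothing to do with System F proper): even plain parametric polymorphism already lets the compiler infer quite a bit at the call site.

    // The type parameter T is never written at the call site;
    // the compiler infers it from the argument.
    func firstElement<T>(_ items: [T]) -> T? {
        return items.first
    }

    let n = firstElement([3, 1, 4])        // T inferred as Int, so n is Int?
    let s = firstElement(["a", "b", "c"])  // T inferred as String, so s is String?
    print(n ?? 0, s ?? "")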



  • @subscript_error said:

    What's puzzling to me is when (as in this press release) the lack of semicolons is used as some sort of selling point.

    Every time someone uses a semicolon God kills a kitten.


  • BINNED

    @subscript_error said:

    What's puzzling to me is when (as in this press release) the lack of semicolons is used as some sort of selling point. There appears to be some interesting stuff in the article and some "hmm..." stuff, but I wonder who's getting excited at the fact that they can avoid that one extra keystroke on 60-70% of lines? The readability impact is so minuscule either way...

    I don't understand the semicolon hate either. The number of stupid JavaScript hacks I've seen around the internet whose only purpose is to force semicolon insertion in the right place is just ridiculous.


  • Considered Harmful

    @da_Doctah said:

    Every time someone uses a semicolon God kills a kitten.

    ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;



  • @abarker said:

    I'm still leaning toward my initial interpretation, but there is a chance that I'm incorrect.

    From hacking around with Swift this morning, I can confirm Swift is strongly typed - it's all checked at compile time. 'Legacy' Objective-C calls return AnyObject for stuff like collections and dictionaries, and anything originally returning id.
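
    For instance (a toy example of my own, not anything from Apple's docs): once a value comes back typed as Any/AnyObject, the compiler won't let you treat it as a String until you downcast it.

    // A value typed as Any (or AnyObject, as legacy `id` APIs are) can't be
    // used as a String until it's explicitly downcast with as?.
    let legacy: Any = "checked at compile time"
    if let text = legacy as? String {
        print(text.uppercased())
    } else {
        print("not a string after all")
    }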


  • 🚽 Regular

    Another reason I like semicolons is that they disambiguate (and possibly deter bugs in) stuff like this, when any expression is considered a statement:

    somethingHere()     // Is this a single statement
    + somethingElse();  // or two statements, with this one having a unary + for some stupid reason
    

    or this:

    str = "this is a long string" +
          "I'm breaking into several lines" +
          "oops, forgot plus sign here -->"
          "I hope a single string literal" +
          "isn't accepted as a statement";
    

    I've been bitten by NodeJS happily accepting "where id = ?"; as a separate statement after a big sql = "select ..." +\n "..." +\n "...".

    What are the odds of them having considered these cases in their grammar and duly reporting on them?
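
    Just to have something concrete to poke at, here's the Swift-flavoured version of that trap (untested; whether the compiler flags the broken variant is exactly what I'm wondering about):

    // Hypothetical test: with the trailing '+' signs in place this is one
    // statement. Drop the '+' after "from somewhere " and the last line
    // becomes its own statement; ideally the compiler would at least warn
    // about the orphaned string literal.
    let sql = "select something " +
        "from somewhere " +
        "where id = ?"
    print(sql)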


  • ♿ (Parody)

    @subscript_error said:

    What's puzzling to me is when (as in this press release) the lack of semicolons is used as some sort of selling point. There appears to be some interesting stuff in the article and some "hmm..." stuff, but I wonder who's getting excited at the fact that they can avoid that one extra keystroke on 60-70% of lines? The readability impact is so minuscule either way...

    Except that making them optional makes stuff less readable, because you have to figure out whether the missing semicolon is really a terminator, whether the statement is supposed to keep going, or whether it's a bug.


  • Discourse touched me in a no-no place

    Some languages do this by making newlines always be a statement terminator (and requiring the programmer to explicitly mark places where that isn't true). Experience suggests that people adapt rapidly.

    Or hate it and go off and use some other language. 😉


  • ♿ (Parody)

    @dkf said:

    Some languages do this by making newlines always be a statement terminator (and requiring the programmer to explicitly mark places where that isn't true). Experience suggests that people adapt rapidly.

    Yes. I'm not a fan of that, but at least it's consistent, so you know what you're looking for.



    Kudos to them for finally getting rid of all those fuckin' semi-colons. A pity their example screenshot looks like it was painted by a drunk 5-year-old.



  • @dkf said:

    Some languages do this by making newlines always be a statement terminator

    No. It's the other way around. Some crazy languages give you two ways to terminate a statement (e.g. by automatically inserting semicolons). That's obviously a bad idea. Normal languages use either a semicolon or a newline.



  • Discourse touched me in a no-no place

    @anonymous234 said:

    Normal languages either use a semicolon, or a newline.

    “Normal languages are teh lamez!”
              — Apple (well, perhaps)



    I actually read a quote in a Wired article from some programmer asking why they didn't just use Ruby - and I threw up a little in my mouth.


  • BINNED

    @DrakeSmith said:

    I actually read in a Wired article a quote of some programmer asking why they didn't just use Ruby - and I threw up a little in my mouth.

    In that case you probably don't want to know about this.



  • @antiquarian said:

    No, but that would have been a lot less work and there was already a demand for it.

    Python (and Ruby, for that matter) is an interpreted language, so it is probably much slower than native Objective-C or Swift.
    Also PyObjC is already provided with OS X (not sure about iOS, though) so you already have the choice.


  • BINNED

    @VinDuv said:

    Also PyObjC is already provided with OS X (not sure about iOS, though) so you already have the choice.

    Yes, and that was basically my point. There are already plenty of suitable languages for Cocoa development.



    Earlier today, a friend of mine (who does iOS development full-time) was gushing over Swift and its impressive new features. He mentioned Generics (despite not fully knowing what they are), "Optional" types (a.k.a. 'this integer can also be null'), and non-nullability by default for some types (except when you mark them as Optional).

    So I welcomed him to 2005, back when Microsoft added Generics and Nullable Types to the .NET Framework 2.0.
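
    In case anyone hasn't seen the .NET version a thousand times already, here's roughly what he was gushing about, rendered in Swift (my own toy example, going by the announcement):

    // Optionals: an Int? can be nil, and the compiler makes you deal with it.
    var maybeCount: Int? = nil
    maybeCount = 3
    if let count = maybeCount {
        print("count is \(count)")
    }

    // Non-nullability by default: a plain Int can never hold nil.
    // var plain: Int = nil      // won't compile

    // Generics: type-checked at compile time, just like .NET's since 2005.
    func wrap<T>(_ value: T) -> [T] {
        return [value]
    }
    print(wrap("hello"), wrap(42))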



  • I've done Objective-C programming before. I would be surprised if that language has had any updates since about 1989.



  • That was his response as well. :D



  • @abarker said:

    Inferred types make code cleaner and less prone to mistakes.

    Because who wants strongly typed variables?

    you don’t even need to type semi-colons.
    We don't need any clear indication of statement endings. Let's throw out one of the clear indicators that this is a C-based language and make it look more Basic.

    You ought to have a look at Boo. It's strongly, statically typed (except when you use the built-in dynamic type, called Duck), but type declarations are optional. Put them in when it makes sense to; leave them out and let the compiler infer types when it's obvious. And it uses Python style indentation and line-ending rules, but semicolons are optional. (And not like in JS, with all its horrible automatic-semicolon-insertion hackery.)



  • @Mason_Wheeler said:

    Python style indentation

    Gah. And it looked so nice.



  • @Maciejasjmj said:

    Gah. And it looked so nice.

    Technically, that's optional too. There's a compiler switch that lets you change the block formatting rules to Ruby-style: every block is terminated with the end keyword, and indentation is irrelevant. I haven't seen any actual Boo code that uses this, though.



    It looks like a cross between ActionScript and JavaScript with some Pythonic sugar. Not that bad actually, certainly better than the garbage known as Objective-C.


  • Discourse touched me in a no-no place

    @ObiWayneKenobi said:

    Not that bad actually, certainly better than the garbage known as Objective-C.

    I see you don't like Smalltalk.



  • @antiquarian said:

    Yes, and that was basically my point. There are already plenty of suitable languages for Cocoa development.

    Python doesn't run on the iPad and iPhone, AFAIK.



  • @TGV said:

    Python doesn't run on the iPad and iPhone, AFAIK.

    Chalk up another win for Android.



  • @Mason_Wheeler said:

    You ought to have a look at Boo. It's strongly, statically typed (except when you use the built-in dynamic type, called Duck) but type declarations are optional. Put them in when it makes sense to, leave them out and let the compiler infer types when it's obvious. And it uses Python style indentation and line-ending rules, but semicolons are optional. (And not like in JS, with all its horrible automatic-sem-insertion hackery.)

    That sounds like Haskell. Granted, Haskell doesn't exactly promote its unityped sublanguage. But it's in there, in the guise of Data.Dynamic.

    Like I said, the more expressive a type system is, the more the compiler can infer. If you make sure to put types on all your top level declarations, you will catch about 90% of all the possible blunders, including the garbage Zecc was talking about.

    If you actually define types with semantic meaning, so you can't mix up, say, strings with different purposes, you get rid of another 9% or so. This is a very low-cost way to eliminate bugs and increase robustness.
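
    To make that concrete in Swift terms (my own sketch; the type names are made up for illustration): wrap each kind of string in its own type and the compiler stops you from passing one where the other belongs.

    // Hypothetical wrapper types with semantic meaning.
    struct UserID { let raw: String }
    struct EmailAddress { let raw: String }

    func sendReceipt(to address: EmailAddress, for user: UserID) {
        print("Sending receipt for \(user.raw) to \(address.raw)")
    }

    let id = UserID(raw: "u-1234")
    let email = EmailAddress(raw: "someone@example.com")

    sendReceipt(to: email, for: id)
    // sendReceipt(to: id, for: email)   // won't compile: the arguments are swapped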



    Android pre-4.0 also doesn't run Python, due to a lack of Unicode support.



  • @riking said:

    Android pre-4.0 also doesn't run Python, due to a lack of Unicode support.

    It probably doesn't run on a TI-82, either.



  • @riking said:

    Android pre-4.0 also doesn't run Python, due to a lack of Unicode support.

    Unicode support does not depend on the Android version but on the NDK version. I am not sure which version added it, but it was either 5 or 6. When compiling with that version or newer, the binary can use Unicode and will run on older Android. I have tested this back to 2.1, but that's not the earliest version possible.



  • @abarker said:

    Apple announced a new programming language today called Swift.

    Bonus points for the marketing fail.

    All the cool new languages these days have their home page at language-name-lang.org. We have http://d-lang.org/, http://go-lang.org, http://rust-lang.org, http://swift-lang.org. Guess what: the last one is a different Swift.



    Can't they just adopt C# syntax? :/ Or go back to C++? Objective-C is so bloody verbose it's almost like writing complex Windows apps with the procedural (not OOP!) WinAPI...



    Oh my god... Swift is worse than I thought.


  • 🚽 Regular

    By allowing Unicode characters in constant and variable names, Apple’s new Swift programming language will allow programmers whose native languages don’t use the Roman alphabet to write code that makes more sense to them.


    So the one-shit object of class four-shits is calling the three-shits function with the chicken-head variable and the tiger-furry-head variable's guy-with-jizz-on-his-face member?
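
    Which, if the identifier rules really do allow arbitrary Unicode (I haven't tried compiling this), would look something like:

    // Emoji identifiers: entirely hypothetical names, same idea as the article.
    let 🐔 = "chicken head"
    let 🐯 = "tiger"
    func 💩💩💩(_ a: String, _ b: String) { print(a, b) }
    💩💩💩(🐔, 🐯)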

