UI programming for the chronically distracted


  • BINNED

    Hmm... this actually warrants some investigation. There is another bit that always did bother me about the installer: it doesn't pick up my widget style properly. Now, I assumed that it might be the nature of the beast, it being statically linked so the installer works without Qt libraries actually being installed, but it might be that it's written in something else completely.

    For reference, this is what a typical dialog looks like on my system:

    And this is the Qt installer:

    Notice the background color and the buttons.

    Once I install it and start Qt Creator, which was written in Qt:

    Yup, completely fine. So yeah, that installer is fishy as all hell.



  • I guess nobody even told them that first impressions matter.



  • The installer isn't the first impression. Never was.

    The first impression is the box, website or salesman.

    As far as I can tell most commercial software is actively hostile to the user during installation, and the more expensive it is the worse it is to install.

    Case in point - SAP.

    SAP appears to have been designed by a team of homicidal maniacs who have bet their careers on a high suicide rate among their users, and a high bankruptcy rate among their customers.


  • Java Dev

    I believe our product is one of the easiest to install of our company's offerings. And pre-acquisition, we were on an appliance model.



  • Have you ever tried to install or update the Windows SAP-ERP client?

    It's apparently utterly impossible to do a silent update or install, and when IT Support updated my SAP client last year the poor sod burnt an entire afternoon running the multiple manual installers needed to do it - apparently no 'roll-up' or 'uninstall old/reinstall new' option was available. (Support's internal, so no daft billing excuses)

    I felt bad about it because my machine was about to be replaced and thus get the new standard image anyway, but Manglement had so decreed it...

    I could list the client-end UI and process WTFs in SAP until the heat death of the universe, but this isn't the thread for that.
    Not to mention the number of businesses seriously damaged by it. Seriously.

    That said, the two SAP client UI things that probably most annoy me and might be vaguely relevant to the thread:

    • I don't care whether it's Enter or an arbitrary F* key that makes the transaction happen, but please just ****ing pick one of them and stick with it.

    • If the item can't be edited in the current context, disable editing of the item. Equally, if it can be edited, don't ****ing disable it - especially if it's the one containing the invalid value the error message complained about.

    Pretty sure the majority of the issues aren't because Java, though the nothing-like-native Java GUI of our install certainly makes it harder to use.

    [size=10]Then my new machine had the old client anyway. Screw it, Support have enough shit to deal with and it's not like I actually need it much, if at all.[/size]


  • Discourse touched me in a no-no place

    @tar said:

    I did say "might", but if you've only ever experienced 5 second compiles, then I need to know what line of business you're in. I think 2 minutes is the quickest turnaround I've seen in 15 years (and that was at the start of a project, and didn't last). I've worked on "hit F7, go make a cup of coffee or two" types of projects...

    It depends on the size of the project and the language you're using. Bigger projects take longer (duh!) and C++ is a slow language to compile. C is faster. Java (and probably C# too) is much faster, despite all the VM overhead.


  • FoxDev

    @dkf said:

    Java (and probably C# too) is much faster, despite all the VM overhead.

    Mainly because it's compiled to a bytecode/intermediate language; the rest of the compilation is JIT on execution.



  • No, its because there's better encapsulation of compilation units.



  • @RaceProUK said:

    Mainly because it's compiled to a bytecode/intermediate language; the rest of the compilation is JIT on execution.

    Mostly because there's no #include, everything goes into sensibly-organized assemblies.

    I don't think compiling C# into bytecode is necessarily any faster than compiling C/C++. But the C# compilation is done in nice bite-sized chunks and can be easily parallelized to boot.



  • @tar said:

    I did say "might", but if you've only ever experienced 5 second compiles, then I need to know what line of business you're in. I think 2 minutes is the quickest turnaround I've seen in 15 years (and that was at the start of a project, and didn't last). I've worked on "hit F7, go make a cup of coffee or two" types of projects...

    I do mostly C#. If I have time to get impatient waiting for the compiler, something is wrong with my computer. I typically work on projects in the range of tens of thousands to hundreds of thousands of lines of code.



  • @blakeyrat said:

    no #include,

    That right there is probably a huge speed win. To the backend of the C++ compiler after the preprocessor's done its work, a 1k .cpp file is transformed into about 1M of library header declarations.

    (There's a counterintuitive trick for large C++ projects to speed up builds: make a single .cpp file for each library in your solution, and have that file #include all of the library's .cpp files. Then you're only paying the preprocessor overhead once per-library rather than once-per file. Even though an incremental build is building more actual code on average, it's still a net speed win in most cases.)
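    Concretely, the trick amounts to one "unity" file per library. This is only a sketch; the file names are invented for illustration:

```cpp
// mylib_unity.cpp -- the only file from this library handed to the compiler.
// The real .cpp files are excluded from the build and pulled in here instead,
// so the library's headers are preprocessed once rather than once per file.
#include "lexer.cpp"
#include "parser.cpp"
#include "codegen.cpp"
```

    One caveat: file-local names (statics, anonymous namespaces) from different .cpp files now share a single translation unit and can clash.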



  • Hey that's what I said!



  • So I watched this video (which won't onebox, so I quoted a single frame in an anchor to the video):

    It was moderately interesting, although clearly geared at people who already know what MVVM is. There's some stuff about design-time data, which is an interesting idea, and you can get a sense of how essential the "View Designer" part of the process is. The actual "app" he's building is pretty (amazingly) basic, so you don't need to get worried about getting bogged down in unnecessary detail. There was a slide towards the end where he tries to address downsides of MVVM, which was interesting as well.

    (I think his take on this is that his MVVM Light tries to address the issues around needing a lot of boilerplate code, which may be why he's calling that out...) He does go on to address some common "MVVM myths" as well, such as "Code-behinds can't contain code", "MVVM is slow/complex/etc".

    This felt like a moderately productive use of an hour.



    Got the Qt source code installed, so had a very brief look at that as well. There seem to be about 66,000 files under Qt\5.4\Src\—that's a lot of code.

    So far, I found a file called Qt\5.4\Src\qtbase\qmake\main.cpp which seems to indicate they have their own in-house build system. This file starts off with (some #includes then) a QT_BEGIN_NAMESPACE macro and then immediately launches into a single-function implementation of sed (static int doSed(int argc, char **argv)), so this codebase is already off to a promising start! :D

    I will probably put the code to one side and have a play with Qt Creator tomorrow...


  • BINNED

    @tar said:

    seems to indicate they have their own inhouse build system.

    qmake is Qt's additional preprocessor, basically. It generates standard makefiles in the end and Creator will just run make for you after it's done, but it does some additional magic first which is needed for Qt's meta-object system to work. You can hook it up into cmake as well but I never really bothered doing that myself.



    Basically, Qt has a few non-standard 'macros' that are processed by "moc" before passing the results on to the actual compiler, primarily for the signals/slots implementation. They could have done it the same way as Boost, but Qt supports compilers where that's not possible, and moc is faster.
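    For illustration, here's a minimal sketch of the kind of class moc processes (class and member names are made up, and this needs Qt to actually build):

```cpp
#include <QObject>

// moc scans for Q_OBJECT and the signals/slots keywords (which expand to
// nothing or to plain access specifiers for the real compiler) and emits
// the supporting meta-object code into a separate moc_*.cpp file.
class Counter : public QObject {
    Q_OBJECT
public:
    int value() const { return m_value; }

public slots:
    void setValue(int v) {
        if (v != m_value) {
            m_value = v;
            emit valueChanged(v);  // wired up via moc's generated code
        }
    }

signals:
    void valueChanged(int newValue);

private:
    int m_value = 0;
};
```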

    qmake automates the whole build: it compiles the UI files, runs moc, and generates the makefile. You can do it all manually, but that's boring.

    Under Windows/Visual C++, Qt's "jom" replaces nmake and is much, much faster.

    Qt Creator (or plugins for other IDEs if preferred) makes it all totally transparent.

    They have been working on "qbs" as a general replacement for qmake, but it's not ready yet and I don't know whether it would really be better.


  • Banned

    @tar said:

    (There's a counterintuitive trick for large C++ projects to speed up builds: make a single .cpp file for each library in your solution, and have that file #include all of the library's .cpp files. Then you're only paying the preprocessor overhead once per-library rather than once-per file. Even though an incremental build is building more actual code on average, it's still a net speed win in most cases.)

    It's not. The only situation where it makes sense to do so is to get a more optimized binary. For reducing compilation time, there are precompiled headers.
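    For reference, a precompiled header is just an ordinary header collecting the stable, rarely-changing library includes; the contents below are purely illustrative:

```cpp
// pch.h -- only stable standard/third-party headers belong here;
// project headers that change often would defeat the purpose,
// since touching this file forces a full rebuild.
#pragma once
#include <algorithm>
#include <map>
#include <memory>
#include <string>
#include <vector>
```

    The compiler then parses this once and reuses the result for every translation unit that includes it (e.g. MSVC with /Yc and /Yu, GCC by precompiling it into a .gch file).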

    @Onyx said:

    qmake is Qt's additional preprocessor, basically. It generates standard makefiles in the end and Creator will just run make for you after it's done, but it does some additional magic first which is needed for Qt's meta-object system to work. You can hook it up into cmake as well but I never really bothered doing that myself.

    Actually, qmake doesn't deal with Qt magic - it just generates the makefiles. For Qt magic, there's MOC which is invoked through the generated makefile. And FWIW, qmake is the easiest C/C++ build system I have ever dealt with.


  • BINNED

    @Gaska said:

    Actually, qmake doesn't deal with Qt magic - it just generates the makefiles. For Qt magic, there's MOC which is invoked through the generated makefile.

    Point. I keep equating them in my head since qmake just fucking works and I never had the reason to go into depth with it so far. I just fed it the .pro file and off it went.


  • Discourse touched me in a no-no place

    @Gaska said:

    The only situation it makes sense to do so is to get more optimized binary.

    You'd be better off going for Link-Time Optimization in that case. Which is akin to the sort of stuff that the C# and Java JITs have been doing all along. :-)


  • Banned

    While LTO makes it possible to optimize heavily and still benefit from the incremental build, the result is always non-strictly worse than compiling one giant blob.



  • @Gaska said:

    While LTO makes it possible to optimize heavily and still benefit from the incremental build, the result is always non-strictly worse than compiling one giant blob.

    I probably build Debug 100 times for every one time I build Release.


  • Banned

    Do you LTO in debug?


  • FoxDev

    @Gaska said:

    Do you LTO in debug?

    Shouldn't do really. Well, maybe sometimes, but not all the time.



  • Maybe I misunderstood what you were saying. Nevermind.

    I just hit the play button and trust C# and Visual Studio will do the right thing, which they invariably do.


  • Banned

    I think you assumed I'm advocating for include-everything-into-single-cpp, which is almost the total opposite of what I was saying (I meant you should never, ever, ever do it, unless you're making a release build for a gaming console where every cycle matters).


  • FoxDev

    @Gaska said:

    unless you make a release build for gaming console where every cycle matters

    Which hasn't really been the case ever since the PlayStation came out.



  • No, what 90% of the readers of this thread are thinking is "If you are using a language where compiling in order to debug is an actual problem, then you should consider writing in a different language". Most of the rest of the world doesn't give this problem a second of thought.

    If this is a general C++ thing, then it's yet another strike against the language.


  • Banned

    @RaceProUK said:

    Which hasn't really been the case ever since the PlayStation came out

    You must have never developed for Playstation.

    @Jaime said:

    No, what 90% of the readers of this thread are thinking is "If you are using a language where compiling in order to debug is an actual problem, then you should consider writing in a different language".

    And how does it relate to anything I've said?


  • ♿ (Parody)

    @lightsoff said:

    SAP-ERP

    For when you need to make Lotus Notes look good.


  • FoxDev

    @Gaska said:

    You must have never developed for Playstation.

    tru.cs
    Never written a game either. But I guess cycle-pinching stopped happening from the PS2 generation onwards.


  • BINNED

    I don't know... apparently new consoles barely push 720p these days.

    Granted, that's mostly on the GPU, but still.



  • @Gaska said:

    It's not. The only situation it makes sense to do so is to get more optimized binary. For reducing compilation time, there's precompiled header.

    It is faster. It's been measured to be faster several different times on several different projects. The fact that you've never measured it doesn't invalidate other people's results.

    Precompiled headers are great if you know that you'll never want to port your codebase to another compiler, but they're still a non-standard hack.



  • @RaceProUK said:

    Never written a game either. But I guess cycle-pinching stopped happening from the PS2 generation onwards.

    PS2 was still a bit slow; people would still drop into assembler for it. The PS2 had such a demented architecture in general that it was almost impossible to write performant code for it, what with needing to get one CPU to tell another CPU to DMA memory to a third CPU...
    fucking PS2...


  • FoxDev

    Oh yeah, that whole Emotion Engine bullshit…


  • Banned

    @tar said:

    It is faster. It's been measured to be faster several different times on several different projects. The fact that you've never measured it doesn't invalidate other people's results.

    I tried. And it wasn't significantly faster than PCH on a full rebuild, but it was significantly slower with minor changes because incremental builds become impossible. Remember that the majority of the build time is Boost.

    @tar said:

    Precompiled headers are great if you know that you'll never want to port your codebase to another compiler, but they're stilll a non-standard hack.

    They have nothing to do with standard. If your compiler doesn't support PCH, the code still compiles and works fine because it's just plain old include file. It won't benefit from the speedup, yes - but you can use the same argument for optimizing compilers.



  • @Gaska said:

    I tried... Boost.

    Well, we found your problem, then.

    @Gaska said:

    They have nothing to do with standard. If your compiler doesn't support PCH, the code still compiles and works fine because it's just plain old include file. It won't benefit from the speedup, yes - but you can use the same argument for optimizing compilers.

    If your compiler doesn't support PCH, you now have a "pch.h" header included everygoddamnwhere which is going to cause a full rebuild every time you touch it. (And if you don't touch it to add new headers as they're created, your PCH build is now fucked, but you won't know because your compiler won't tell you...)


  • Banned

    @tar said:

    If your compiler doesn't support PCH, you now have a "pch.h" header included everygoddamnwhere which is going to cause a full rebuild every time you touch it.

    Same for compilers that do support PCH. That's why you put only libraries in it.

    @tar said:

    And if you don't touch it to add new headers as they're created, your PCH build is now fucked, but you won't know because your compiler won't tell you...

    Adding new libraries to the project doesn't happen as often as that sentence might suggest.

    It all boils down to the right tool for the right job.


  • Discourse touched me in a no-no place

    @tar said:

    If your compiler doesn't support PCH, you now have a "pch.h" header included everygoddamnwhere which is going to cause a full rebuild every time you touch it. (And if you don't touch it to add new headers as they're created, your PCH build is now fucked, but you won't know because your compiler won't tell you...)

    If you can't write a script to auto-generate that, you should hang your head in shame. :)



    That actually is a great point; it's exactly what I'd do in a personal project. I've only ever used PCHs at work though, where anything outside of VS's comfort zone is immediately frowned upon (by the 'Senior' engineers, for shame...) and they can hire an intern to maintain the pch.h's...

    Fun trivia: If I hit Alt instead of Shift on my phone, PCH is 0_&


  • ♿ (Parody)

    @tar said:

    If I hit Alt instead of Shift on my phone, PCH is 0_&

    And if I read PCH on my screen, it's always Pacific Coast Highway. That's less helpful than your Senior Engineers' sense of adventure.



  • @Gaska said:

    For reducing compilation time, there's precompiled header.

    Precompiled headers speed up the part that doesn't take much time -- parsing. At least in my experience, the real killers for compilation time are things resulting from template instantiation. These include the instantiation process itself, searching for the correct instantiation to use, and often repeated compilation of the same functions across multiple compilation units. These happen at later stages of compilation than what PCH helps with; at least for the code base I somewhat work on, PCH was not very helpful. (I didn't try plopping everything into one .cpp file, though, so I can't compare.)



  • @Jaime said:

    No, what 90% of the readers of this thread are thinking is "If you are using a language where compiling in order to debug is an actual problem, then you should consider writing in a different language"

    Most people don't realistically have that choice. If I were to start writing in another language, I'd presumably be fired before too long.

    I may not be a big fan of C++ and there are some things I could submit to the front page (but won't) about our code base, but it's made up for by the fact that what I'm actually working on at a higher level is really cool and interesting.


  • Discourse touched me in a no-no place

    @EvanED said:

    Precompiled headers speed up the part that doesn't take much time -- parsing.

    I was playing around with COM a couple of years ago, and I added PCH to a hand-rolled project (I used a makefile instead of a project file because I felt like it.) Adding PCH cut a ~10s build time down to essentially nothing. I never tried it on a really large project, though.



  • @tar said:

    Probably a good tutorial on MVVM (or related paradigms) would be the #1 takeaway from this topic for me.

    Oh, hey @tar, I found just the thing for you. It's a walkthrough of about the most basic MVVM app it's possible to conceive of, and even better, the app is in an obviously incomplete state, so you'll feel compelled to start adding the 'missing' functionality, and applying the concepts you picked up in the article:

    And you can download the source code from here. It's a bit old, but the solution opens and upgrades in VS2013 easily enough (just had to faff with a reference to some UnitTesting DLL). I guess the next thing to try is building it on Mono/Linux...



  • Hmm...

    [code]
    $ sudo apt-get install mono-devel
    $ cd MvvmDemoApp
    $ xbuild MvvmDemoApp.sln
    [...]
    DataAccess/CustomerRepository.cs(6,22): error CS0234: The type or namespace name `Resources' does not exist in the namespace `System.Windows'. Are you missing an assembly reference?
    App.xaml.cs(9,32): error CS0246: The type or namespace name `Application' could not be found. Are you missing an assembly reference?
    MainWindow.xaml.cs(3,54): error CS0234: The type or namespace name `Window' does not exist in the namespace `System.Windows'. Are you missing an assembly reference?
    RelayCommand.cs(14,33): error CS0246: The type or namespace name `ICommand' could not be found. Are you missing an assembly reference?
    View/AllCustomersView.xaml.cs(3,60): error CS0234: The type or namespace name `Controls' does not exist in the namespace `System.Windows'. Are you missing an assembly reference?
    View/CustomerView.xaml.cs(3,56): error CS0234: The type or namespace name `Controls' does not exist in the namespace `System.Windows'. Are you missing an assembly reference?
    App.xaml.cs(28,43): error CS0246: The type or namespace name `StartupEventArgs' could not be found. Are you missing an assembly reference?
    ViewModel/CommandViewModel.cs(11,53): error CS0246: The type or namespace name `ICommand' could not be found. Are you missing an assembly reference?
    ViewModel/CommandViewModel.cs(20,16): error CS0246: The type or namespace name `ICommand' could not be found. Are you missing an assembly reference?
    ViewModel/WorkspaceViewModel.cs(33,16): error CS0246: The type or namespace name `ICommand' could not be found. Are you missing an assembly reference?
    ViewModel/CustomerViewModel.cs(196,16): error CS0246: The type or namespace name `ICommand' could not be found. Are you missing an assembly reference?
    [...]
    4 Warning(s)
    11 Error(s)

    Time Elapsed 00:00:00.7160020
    [/code]

    Well that was slightly less of a total failure than I thought it was going to be...

