I was Doing It Wrong™ and Visual Studio 2015 crashed
-
What did I do? I changed a .tt file into .cs.
-
-
Oh man I bet there's a killer Chuck Norris joke about this...
-
Get your broken heart outta my topic!
-
I actually found VS2015 to be the least stable I've ever used. I stumbled upon several 100% reproducible compiler crashes in C++, and they have stayed unfixed since launch. I must say - this version is shit.
-
As is the new version of Microsoft Word. I've used it for a day and already found 3 super-annoying bugs.
In an attempt to imitate Apple, Microsoft seems to have stopped testing their software.
-
and they have stayed unfixed since launch.
Natch. Fixing them would break backwards compatibility.
-
In an attempt to imitate Apple, Microsoft seems to have stopped testing their software
Software complexity always ratchets upward. Year on year, there is never a reduction in general levels of expectation about what software ought to be able to do; on the contrary, once it's become normal for a certain class of software to be able to do a certain set of things reasonably well and reasonably reliably, the only way any given package can avoid being buried by the competition is to attempt to do all the things the competition does as well, even if there are loads of things the competition doesn't do as well as the package in question.
For example: Google Docs arrived, and it did some of the things that Microsoft Office did, and did them almost as well as Office did them. It was missing a heap of existing Office features, and some of the Office features it did have were implemented really poorly, but it had this killer new thing: real-time Internet-based collaboration on any given document. Microsoft's reaction was completely typical of the software industry: make Office do that too. So Office's complexity ratcheted upward another couple of notches.
This kind of thing has been going on for decades. In my opinion, it's now reaching something akin to critical mass: we're starting to see the effects of complexity that exceeds what's manageable in any market-compatible amount of time by software engineering. The industry seems intent on trying to sweep this under the rug by building framework upon framework upon framework, but the whole thing is now so multi-layered that the main effect of that is simply to give bugs more places to hide.
We will never reach The Singularity, because by the time we have software that's as good as we are at designing software, the tools it has at its disposal will be the same bug-infested over-complicated piles of crap that we used when we built it, and it will itself be no less buggy than we are. The software industry, like all technology, is in the process of becoming crushed by the unintended consequences of its own successes.
-
Software complexity always ratchets upward. Year on year, there is never a reduction in general levels of expectation about what software ought to be able to do; on the contrary, once it's become normal for a certain class of software to be able to do a certain set of things reasonably well and reasonably reliably, the only way any given package can avoid being buried by the competition is to attempt to do all the things the competition does as well, even if there are loads of things the competition doesn't do as well as the package in question.
A good example is CD burners.
-
-
It's hard to measure its deadness, but the graph on the right of this page (Poland's most popular software download site) suggests that it still has quite a few users. Anyway, I mostly referred to the situation circa 2005, not now.
-
I mostly referred to the situation circa 2005, not now.
I think Nero was pretty dead then, too.
-
I think Nero was pretty dead then, too.
I bought three DVD-RWs in the last two years for case builds I was doing (one personal, two for a friend (well, two different friends... (anyway, that's off topic)))
All three came with a Nero Burning ROM install CD.
All three CDs went straight into the shredder, but I can confirm at least that the software is still being sold (or at least bundled with new drives).
-
I think Nero was pretty dead then, too.
In Poland, it was still #1 thing for some reason.
-
was just as dead in 2005 as he is now. :)
-
Little do you know...
-
Little do you know...
Nero Redivivus does not invalidate my statement; he is equally dead now as he was 10 years ago.
-
The first time I tried Visual Studio, many years ago, it was magical. I loved how the auto-complete helped me choose the right member variable, and go-to-definition could help me discover any new code base. It has now become a must-have in my toolbox, so much so that I simply cannot live with simple editors, e.g. just
vim
(unless it is with ycm).
I have since used Eclipse, KDevelop, PyCharm and Xcode (the most annoying one), but frankly I do not need anything but auto-complete and go-to-definition from any IDE. If a problem is complicated, debugging is difficult anyway, and if it is simple any IDE is good enough already.
-
-
I have since used Eclipse, KDevelop, PyCharm and Xcode (the most annoying one), but frankly I do not need anything but auto-complete and go-to-definition from any IDE.
IDEs are great if you don't want to write your own makefile. I think that's the most important thing to me, next to go-to-definition.
-
They are great even if you have makefiles. Eclipse works fine with makefile-based code, and indexes the code base just fine. KDevelop by default targets CMake, which is really much better than plain makefiles, and is cross-platform.
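For comparison, a minimal CMakeLists.txt for a small C++ project might look something like this (the project and file names here are made up for illustration):

```cmake
# Minimum CMake version and project declaration
cmake_minimum_required(VERSION 3.10)
project(hello CXX)

# Build one executable from two source files;
# CMake figures out the dependency graph itself
add_executable(hello main.cpp util.cpp)

# Request at least C++11 for this target
target_compile_features(hello PRIVATE cxx_std_11)
```

CMake then generates native build files for the platform (Makefiles on Linux, Visual Studio solutions on Windows) from the same input, which is where the cross-platform claim comes from.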
-
None of them can cooperate with the makefile abomination at my work, though...
-
Yes, makefiles are horrible. With Eclipse you can basically ask it to just index everything, and use it to write code.
I do not compile from inside the IDE, nor use its line-by-line debugging (even though you can attach with gdb).
-
Yes, makefiles are horrible. With Eclipse you can basically ask it to just index everything, and use it to write code.
I did something like that with Netbeans. It was taking two hours to reindex things, and it still was mostly wrong. The fact you can't compile locally doesn't help either.
-
It was taking two hours to reindex things
That must be a huge code base. I used Eclipse to index the Linux kernel (only the x86 arch) and it took about that much time; fortunately I only needed to do that once. I am not surprised though, Eclipse needs a lot of memory and uses CDT, which is not used by anything but Eclipse.
If your code base is huge you should look into something that uses llvm to do indexing; even though ycm does that, it is much easier to use kdevelop 5.
-
That must be a huge code base.
Not huge per se. Just heavy use of Boost. And 100x more libraries in the include path than are actually used.
If your code base is huge you should look into something that uses llvm to do indexing; even though ycm does that, it is much easier to use kdevelop 5.
Our code is incompatible with Clang.
-
-
I actually found VS2015 to be the least stable I've ever used.
- Install Visual Studio 6 (the one released in 1998) on a Windows 7 machine.
- Open it using a limited (non administrator) account
- Open a code editing window
- Click on the window to put keyboard focus on that
- Press Home
- Enjoy your crashed Visual Studio
-
Do you even write Makefile or Makefile.in?
-
Fuck autoconf in every available orifice.
-
I tried to replace the Makefile with a CMakeLists.txt.
Now I have to maintain both. :-(
-
You stole the line I had prepared in case you said you use autoconf.
I tried to replace the Makefile with a CMakeLists.txt.
Now I have to maintain both.
Why? Do you call make in your CMakeLists.txt in a custom target or something? The only thing worse than a Makefile is a mishmash of build systems
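To illustrate the mishmash being warned against: wrapping an existing Makefile in a CMake custom target looks something like this (the directory and target names are invented for the sketch):

```cmake
# Anti-pattern: CMake merely shells out to the legacy Makefile,
# so you now maintain two build systems instead of one.
add_custom_target(legacy_build ALL
    COMMAND ${CMAKE_MAKE_PROGRAM} -C ${CMAKE_SOURCE_DIR}/legacy
    COMMENT "Delegating to the old Makefile"
)
```

Because CMake never sees the legacy sources or their dependencies, incremental builds and IDE indexing both degrade; porting the targets to native add_executable/add_library rules avoids the split.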
-
Our code is incompatible with Clang.
That is most likely a problem with your code. Even if you never intend to run binaries compiled with clang, the sheer fact that it compiles there with no warnings is a good sign that it is high-quality code. Compile your code with as many compilers as possible (as long as you have a proper build system it will be simple) and benefit from the diversity.
-
I bought three DVD-RWs in the last two years for case builds I was doing (one personal, two for a friend (well, two different friends... (anyway, that's off topic)))
All three came with a Nero Burning ROM install CD.
You don't buy OEM drives? That's the route I take, and I never get burning software.
-
Why? Do you call make in your CMakeLists.txt in a custom target or something? The only thing worse than a Makefile is a mishmash of build systems
Branches that should be merged and never are, platforms that should use the same build file but do not, and other wacky stuff
-
-
Yes, makefiles are horrible.
There's a rather famous story, which appears in the Unix-Haters Handbook, about make and the effect of backwards compatibility on one's freedom to fix bugs:
According to legend, Stu Feldman didn't fix make's syntax *[srl: specifically, the requirement to use tabs instead of spaces]*, after he realized that the syntax was broken, because he already had 10 users.
Keep in mind that make, like a lot of early Unix utilities, was written with the expectation that it would never see the light of day outside the Labs. Which explains a lot about Unix in general, I think. Ten users would have been about as many as he'd ever expected to have to support.
-
specifically, the requirement to use tabs instead of spaces
Wait, you consider that a bug? You'd rather your makefiles had 9-byte newlines instead of 2-byte newlines?
-
@ScholRLEA said:
specifically, the requirement to use tabs instead of spaces
Wait, you consider that a bug? You'd rather your makefiles had 9-byte newlines instead of 2-byte newlines?
I'm not entirely sure you're joking, so I'll bite: first, that was me giving the context of the quote, rather than my own assertion. However, yes, I would say that not being able to accept either a tab or a run of spaces (though not both on the same line, presumably) was a bug, if not from a technical standpoint then from a UX one. While UHH's rants about how confusing this is (see pages 178-181) were way over the top, the basic point, that this one error has caused a lot of headaches over the years, is a valid one.
In any case, even in 1976, a difference of seven bytes per line would not have been a significant one on any system capable of running Unix.
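For anyone who hasn't been bitten by it, the rule under discussion is that every recipe line in a Makefile must begin with a hard tab character; a sketch of both cases:

```make
# Works: the recipe line below begins with a tab character.
hello:
	echo "building hello"

# Fails: if the same recipe line were indented with spaces instead,
# GNU make would stop with its famously unhelpful message:
#   Makefile:7: *** missing separator.  Stop.
```

The error message says nothing about tabs, which is a large part of why the UHH singled this out as a UX disaster.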
-
a bug, if not from a technical standpoint then from a UX one
You know what happens to UX bugs around here :P
-
You are aware that @ben_lubar is the Go guy here, right?
-
-
I can't believe that there are still people who actually like Makefiles. I mean, apart from the syntax, they're kind of OK for small projects (around 20 C files) that only ever need to build on one platform. As soon as your project gets bigger than that, though, you should use an actual build system unless you want to create a horrible mess.
Yes, CMake is a huge collection of hacks as well, but at least CMakeLists.txt is readable and maintainable.
-
Node.js aficionados get really hipster about their tech: they're all moving to vim and make for some reason, like using older tools makes them more 'legit'.
-
I can't believe that there are still people who actually like Makefiles.
Our architect at work loves them. Almost everything in our project is run from makefiles (almost, because these makefiles are out of date in places). And it's horrible to everyone else. Also, he invented the most ridiculous IPC protocol imaginable.
-
Node.js aficionados get really hipster about their tech
When they're not using every brand new framework that's been around for 54 seconds and will get 'replaced' with another in about 20 minutes.
-
Also, he invented the most ridiculous IPC protocol imaginable.
Sounds like you should post that here.
-
every brand new framework
It's the same thing, isn't it? It's either so new you've never heard of it, or so old you'd never seriously consider using it.
-
I might have quoted the wrong part of the post.
You're right though, it is the same.
-
You don't buy OEM drives?
When they're cheaper I do; Newegg had a sale on non-OEM drives that made them cheaper, so I went that route.