This guy reminds me of a fellow who stubbornly kept arguing with everybody that in C, there was never any reason at all to use a pointer to a pointer (e.g. int**), and that everything should be done with only one level of indirection.
Best posts made by seaturnip
-
RE: Using clipboard for application communication
-
RE: Are you saying that this linux can run on a computer without windows underneath it, at all ?
This is someone's joke character, not something someone really believes. The line about OS/2 proves it. Anyone computer-literate enough to have heard of OS/2 couldn't make this mistake.
Latest posts made by seaturnip
-
RE: The new memcpy!
If this approach really gives a substantial performance boost, why not just use a preprocessor macro to restrict it to (certain versions of) GCC? I've certainly done this kind of thing before, though admittedly on proprietary projects where we knew there was essentially no chance of using a different compiler; I don't know if portability standards are different in the open-source world.
-
RE: C++0x
Many of the new features in C++0x are things that you would already expect C++ to be able to do at some level, but then it turns out it can't and you're screwed. So you kludge up some horrible hack to fill in the gap. Used judiciously, the new standard could be a net boon to clarity and simplicity in practice, I think.
-
RE: Boss thinks there isn't an ANSI standard for C
@Maciej said:
Many kernel developers use pure C for other reasons. One I have heard often is the more transparent relationship between code and the machine instructions produced: if I type a+b in C, I can be fairly sure that this is a straightforward operation that takes a few cycles. In C++, there could be a giant, expensive function call behind this. C is often referred to as "the world's only portable assembler" for good reason, and that is a valuable asset when coding close to the hardware.
That's a good point. Ultimately, though, it's not that hard to avoid unexpected repercussions in C++: you only need to be aware of, and avoid when necessary, the fairly constrained set of situations that leads to overly fancy stuff. It's easy to show how abuse of operator overloading can produce insane results, but it's equally easy simply not to overload operators beyond the simple and straightforward uses of the feature. Implicit object creation is the biggest gotcha, but again, a bit of care in passing and returning objects by value, plus use of the 'explicit' keyword, largely eliminates the problem. Forsaking all the conveniences of C++ to avoid these few gotchas is throwing out the baby with the bathwater.
Ultimately this is just another way of phrasing the more common criticism of C++, which is that it's full of arcane quirks and complexity. Well yes, but you would think smart guys like kernel programmers could manage it as well or better than the rest of us.
-
RE: Boss thinks there isn't an ANSI standard for C
Kernels being written in C has more to do with historical reasons: Linux, Windows NT and BSD all date back at least 15 years, to when C++ was pretty crap in many ways. Device drivers are written in C because they have to interface with the kernel, and because programmers in that domain have more training in pure C. If one were to write a new kernel now for a non-embedded system, there would be little reason not to do it in C++.
C++ does not perform worse than C: that is unarguably true in theory, and nowadays it is true in practice as well. Only a few features carry a performance cost: virtual methods, exceptions, iostreams, RTTI and STL containers. All are entirely optional, and all were carefully designed to impose no more of a hit than necessary (with the exception of exceptions, which admittedly really are slow). In fact, the C++ features can sometimes be faster than the C equivalent: a virtual function call can beat a switch statement; a chain of cout << operations can be inlined by the optimizer and needs no format-string parsing step; an STL sort can outrun a C qsort() call because the comparison calls can be inlined. Whether these gains are actually realized is implementation-dependent, but either way no one can accuse the C++ standard of specifying a bloated language.
As for the proof that implementations are up to standard: in the game industry, another application domain where performance is critical but which evolves more quickly than the don't-fix-what-ain't-broke kernel domain, everyone is using C++ nowadays -- even the C holdout John Carmack has now switched over.
-
RE: Ubuntu WTFs
Okay, yeah, come to think of it, I was too quick to rant here and should cut Ubuntu a little slack. I've never known WiFi to be perfectly reliable anyway, and it's not like other desktop operating systems have a package manager at all.
-
RE: Ubuntu WTFs
Hmm, okay, that's not nearly as WTFy as what I had inferred had happened, then. I guess "failed to verify" must just mean it didn't have the proper flags to be enabled (maybe even something I specified at one point in the install but forgot about). I guess at worst the WTF here is unclear error messages, then.
-
Ubuntu WTFs
So I have this old Pentium 3 laptop whose hard disk I had wiped clean with the intention of putting it into the trash. Then I realized that for my move to the other end of the country, at least in the first few days before I have time to buy something new, I would need some kind of computer to e-mail all the people I need to e-mail, and my desktop does not fit in my checked luggage. So I reconsidered and brought it with me. I figured since it was wiped anyway I might as well install Linux on it, and chose Ubuntu since I heard it was the best for desktop-style use.
Now, I haven't really used Linux since, oh, 2002 or so. I used it as my main desktop operating system from about 1998-2000, then relegated it to a fileserver/firewall role from 2001-2002, before getting rid of it entirely and switching to Windows XP. Back in those days you had to do everything by editing config files, since all the autoconfiguration tools were utter rubbish. I've got more of a life now and don't have time for that kind of nonsense anymore. But I'd heard Linux had gotten so much easier now.

I installed Ubuntu from a LiveCD while the laptop was not connected to the net (important point: you'll see why later) and everything went nice and smoothly with very little input on my part. Well, it got stuck at one point for a stupid reason (I forget exactly what, but it was stupid) and required me to click OK to continue, but that's a very minor flaw. The installed Ubuntu booted fine. Then I tried what I considered an acid test: while the system was running, I hotplugged my D-Link USB WiFi adapter into a USB port which was itself on a PCMCIA card. Then I opened the Network control panel and, lo and behold, a "Wireless Connection" was present. Impressive! So, happy with my new Ubuntu, I left with it for the Bay Area.
Arrived at my new apartment a thousand miles away, [i]now[/i] the WTFs begin to rear their ugly heads. My new roommate has a bog-standard Linksys G router configured at 192.168.1.1 with DHCP and WPA-PSK. In other words, almost the factory defaults for one of the most commonplace home NAT routers in the world. My roommate's Macbook Pro has no difficulty using the WiFi, nor does my just-purchased BlackBerry Curve. But when I try to connect from my Ubuntu laptop using the graphical interface only, the connection works for about 30 seconds, then mysteriously drops forever. Running ifconfig reveals that it loses the IP address for some reason. This happens whether I use DHCP or a static IP address (although switching between the two can cause a reset that makes it work again -- but only for another 30 seconds). I spent an hour reading help wikis and trying to get it to work with command-line utilities, but in vain. So I'm sitting here with a too-short Cat-5 wire plugged straight into the router.
Secondly. I tried running svn to get some code of interest to me from a public repository, and Ubuntu told me to run apt-get to install it. Fair enough. But then, apt-get mysteriously fails with "E: Couldn't find package subversion". Which raises the question of why the advice to run apt-get was there at all, if the package doesn't exist. Anyway, long story short, I did some poking around, and it turns out that the problem was in /etc/apt/sources.list. It was full of lines like this:
# Line commented out by installer because it failed to verify:
#deb http://ca.archive.ubuntu.com/ubuntu/ gutsy main restricted
Screw you, Ubuntu. So because I didn't have Internet access when I initially installed you, you assume those servers have ceased to exist and knock them out [i]permanently[/i]? (unless the user is savvy enough to go edit that file with a text editor) Ever hear of a little thing called testing, Ubuntu team? Or simple logic for that matter?
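(For anyone who hits the same thing: the fix is just to strip the leading "#" from those deb lines in /etc/apt/sources.list, so the entry above becomes the following, and then re-run apt-get update.)

```
deb http://ca.archive.ubuntu.com/ubuntu/ gutsy main restricted
```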
But, these are just minor kinks right? Later in the year I'm sure they'll have it all sorted out. 2009: the year of desktop Linux!
-
RE: VS2005 says oranges != oranges
Awaiting deanonymization based on bottoms and tops of letters in 3... 2... 1...
-
RE: Traveler's Online Security WTF
AbbydonKrafts: You sued your own brother over some petty theft? Your dysfunctional family relations sound like the real WTF in all this...
-
RE: Do you want to permanently remove source control bindings from the project?
@asuffield said:
@Cap'n Steve said:
I'm still not convinced that emacs and vi aren't just running jokes that no one will let me in on. I've only used Visual Studio a couple times, but I like some of its features, like being able to exit the program without reading a tutorial.
This sounds like a variation on the Blub effect...
Yeah.
What I find hard to understand is that Cap'n Steve, who is presumably a computing professional like most of us, complains about having to spend a few hours reading a tutorial to learn a tool that can be used daily for a decade or more afterward. Even a small productivity improvement caused by a switch to a superior but cryptic editor should more than pay off the initial investment.