@Applied-Mediocrity said in In other news today...:
as OSS fanbois like to remind everyone as if splitting arguments was some Nobel Peace Prize shit
That is offensive to commercial Unix fanbois.
@remi said in Programming Memes Thread:
@PleegWat on a related note, one module I was developing was integrated some years back in one software that we sold to external users. So the support team was in charge of writing some doc about it (which they actually did! It wasn't "great" but it was there, and not entirely shitty either). They also regularly asked me more technical questions about the theory behind it, and I usually answered their emails with as much info as I could (I don't write Wall'O'Text only here!), in the semi-formal tone of speaking to a coworker.
Then at some point the branch selling the software got itself sold, including that module (I had a thread about it). I recently had a look at their website and all the docs they have. And I could see that for this module, they had a bunch of varied "methodology" documents that were... my emails, with at most the greeting/signature lines removed!
Well, that is kinda in line with the "religious text" concept mentioned in this thread.
Bonus points if your name is Paul.
@remi said in (are (arguments for (using lisp)) (still valid?)):
You can't do that natively in C++. You can emulate it with various workarounds (some of them are described in the Wiki page). But they're not direct features of the language, like it apparently is in C#. Those workarounds are going to be either more (boilerplate or weird) code, or worse performance. Likely both at the same time.
You actually can do it in C++, because it does have function overloading, although only at compile time. The same is true in C#, except that the "compiler part" is available at runtime (intended for scripting languages, which is why the usage is so awkward in C#).
Basically, Stroustrup decided that overriding is something completely different and unrelated to overloading and must never be confused (which is why they have similar names, because that's the C++ design philosophy for you). Otherwise, he would have to implement CLOS multiple dispatch.
But, back to the C# thing, the part that bothers me is that it is up to the caller to ensure they say "hey, this variable is `dynamic`, don't forget to dispatch based on it", which sounds wrong to me. The workaround (shown in this subthread and in the Wiki page) is to use another wrapper function around it, but that's ugly boilerplate, which makes the feature less attractive.
As said above - this `dynamic` part should not be in the language at all, as it is basically just a .NET implementation internal. IMHO it is exposed only to make C# the "ultimate" .NET language that is a superset of every other .NET language.
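For reference, the "only at compile time" part is easy to see in Java too (same Simula/C++ lineage, so the behaviour matches what's described above); a tiny self-contained sketch:

```java
// Overloads are resolved statically, on the declared type of the argument;
// only the method receiver gets dynamically dispatched.
class DispatchDemo {
    static String describe(Object o) { return "Object"; }
    static String describe(String s) { return "String"; }

    public static void main(String[] args) {
        Object o = "hello";                    // runtime type String, static type Object
        System.out.println(describe(o));       // prints "Object": chosen at compile time
        System.out.println(describe("hello")); // prints "String"
    }
}
```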
@BernieTheBernie said in WTF Bites:
@error That's a deliberate feature, not an accidental ♭♯♮.
I call that a "Bear Trap" pattern.
@dkf said in (are (arguments for (using lisp)) (still valid?)):
The one pattern I miss from Java is doing dynamic dispatch by the real types of the arguments (as you can do with CLOS); using reflection to do it is very tricky (unless you're using a sealed type tree) and there isn't anything like the trick you can do in C# to achieve it (which I believe is nasty).
May I ask what that trick is?
You end up having to write a whole load of bloated observer stuff just to work around the lack of a feature. I've written that in the past, but it's ever so nasty.
Yeah... with a little bit of foresight, this can be made part of the classes used as parameters, but it is bloat. And what's worse: it's hard to read/understand, and I have a very hard time explaining what it does and even why. Some well-established term for the concept would help a lot (and even better, a good explanation from someone who can actually explain stuff).
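To make the bloat concrete, here is a minimal sketch of that double-dispatch workaround (all names illustrative, nothing from a real codebase): the language only dispatches dynamically on the receiver, so the first call has to bounce off the argument to let its runtime type pick the method.

```java
// Double dispatch by hand: collideWith() bounces the call back off the
// argument, so the second call dispatches on the argument's runtime type.
interface Shape {
    void collideWith(Shape other);
    void collideWithCircle(Circle c);
    void collideWithSquare(Square s);
}

class Circle implements Shape {
    public void collideWith(Shape other) { other.collideWithCircle(this); } // second dispatch
    public void collideWithCircle(Circle c) { System.out.println("circle meets circle"); }
    public void collideWithSquare(Square s) { System.out.println("circle meets square"); }
}

class Square implements Shape {
    public void collideWith(Shape other) { other.collideWithSquare(this); }
    public void collideWithCircle(Circle c) { System.out.println("square meets circle"); }
    public void collideWithSquare(Square s) { System.out.println("square meets square"); }
}
```

Every new implementation forces another method onto the interface, which is exactly the part that is hard to justify to anyone who has not hit the problem yet.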
@DogsB said in In other news today...:
*edit after reading the comments I think we need to coin a law where every discussion about fantasy will eventually steer into complaining about Song of Fire and Ice never going to get an ending.
Without delving into the cesspit myself: did anyone actually mention that GRRM is the original author of "Githyanki"?
@dkf said in (are (arguments for (using lisp)) (still valid?)):
@Kamil-Podlesak said in (are (arguments for (using lisp)) (still valid?)):
The best illustration is the ultimate Java thing, the worst thing about it and an endless source of woe: Design Patterns. The GOF Book was published in 1994!
They were things that were always rather abused, mostly by people not ~~bothering~~ understanding them ~~much~~ at all.
FTFY
Just to be sure: I mean the whole concept of Design Pattern. The specific ones are kinda meh anyway.
What they should have been: here's a common name for a particular shape of solution in this type of programming language, with common consequences described. What they became for too many: a quest to shove as many buzzwords as possible into the code, whether or not doing so was either correct or made sense, and without regard for the consequences.
Yeah, almost everyone missed the "common name" part and considered it "metaprogramming", which leads to the argument that "LISP and C++ don't need that because they have templates"
@Bulb said in (are (arguments for (using lisp)) (still valid?)):
@Kamil-Podlesak said in (are (arguments for (using lisp)) (still valid?)):
I blame Simula and strongly object to the usual "it's a Java fault", when in fact Java just jumped into already established mainstream bandwagon.
Smalltalk and Simula are the real culprits, but are you sure the bandwagon was mainstream when Java jumped onto it?
Yes, absolutely. Simula was, indeed, always a mostly obscure thing outside the simulation niche (hence the name), but it clearly served as an inspiration to many.
Notably, Turbo Pascal 5.5 (1989) and of course ~~C with Classes~~ C++ (1982!)
In 1996 (release year of Java 1.0), OOP was absolutely the hot shit.
The best illustration is the ultimate Java thing, the worst thing about it and an endless source of woe: Design Patterns. The GOF Book was published in 1994!
@Bulb said in (are (arguments for (using lisp)) (still valid?)):
@dkf said in (are (arguments for (using lisp)) (still valid?)):
I don't know CLOS well enough to comment on what it does.
The thing that makes CLOS different is that it came up with the concept of multimethods: it dispatches dynamically on the types of all arguments, not only on the designated invocant.
While many statically typed languages do that statically, including C++ and Java, the only other language that I know of that does dynamic dispatch on multiple argument types is Julia.
I definitely recommend getting acquainted with CLOS.
It used to be so much fun to pull it out in a discussion with a proponent of the "OOP modeling" ~~cult~~ philosophy and see how the removal of "method belongs to object" melts his brain.
Ah, good times.
Btw, to make my point clear: I blame Simula and strongly object to the usual "it's a Java fault", when in fact Java just jumped into already established mainstream bandwagon.
@Mason_Wheeler said in (are (arguments for (using lisp)) (still valid?)):
@Bulb said in (are (arguments for (using lisp)) (still valid?)):
@Mason_Wheeler Well, yes, it does. Because you know C and you've probably seen how object oriented programming is done in it. But the early proponents of object-oriented languages pretended that it's something completely new and different.
Dude. "Object-oriented languages" != "Java." Many of "the early proponents of object-oriented languages" were venerable graybeards by the time Sun's contribution to the genre arrived on the scene. The first proto-OO language, Simula, was an ALGOL variant developed in 1962. The follow-up five years later, Simula 67, introduced classes, inheritance, and virtual methods, among other common features we are familiar with today. Encapsulation in the form of `public`/`protected`/`private` was added in a branched version during the 70s, and integrated into the main Simula standard in 1986. C++ and Delphi both took these ideas and implemented them (in very different ways!) on top of C and Pascal, respectively, and were enjoying a great deal of mainstream success at the time Java burst on the scene.
What is this blakeyrant about, and how is this relevant? Java is actually a language from the Simula/C++ family, where methods were just functions with a special `this` parameter. Or, rather, a function pointer.
The "methods are absolutely nothing like function" is a trademark of Smalltalk (and maybe Objective C) crowd. And I dare to say that is what @Bulb have run into if he studied on one particular university
Although TBH there was some overlap in the 90s (I was there, too) and Sun tried to pander to the Smalltalk crowd (but then again, Sun tried to pander to pretty much everyone).
@Carnage said in Nope, you eat it:
Too bad it doesn't have weiner würstchen.
You shall not mix Lower and Further Austria.
@HardwareGeek said in Nope, you eat it:
@Zerosquare said in Nope, you eat it:
I find regular ones delicious. But those gluten-free ones...
I make, or used to make, before I had to give up sugar, chocolate chip cookies with almond flour. They don't have the same texture as regular, wheat flour cookies (which can be quite varied, depending on the recipe), but not bad at all. Assuming you're not allergic to tree nuts, of course.
You can very easily make gluten-free peanut butter cookies (with chocolate chips or without) and they are just perfect. Unless you're allergic to peanuts, of course.
@Gurth said in Rem(a)inders of a Previous War:
https://en.wikipedia.org/wiki/American_Service-Members'_Protection_Act
SEC. 2008. of the Act authorizes the President of the U.S. "to use all means necessary and appropriate to bring about the release of any person described in subsection (b) who is being detained or imprisoned by, on behalf of, or at the request of the International Criminal Court". The subsection (b) specifies this authority shall extend to "Covered United States persons" (members of the Armed Forces of the United States, elected or appointed officials of the United States Government, and other persons employed by or working on behalf of the United States Government) and "Covered allied persons" (military personnel, elected or appointed officials, and other persons employed by or working on behalf of the government of a NATO member country, a major non-NATO ally including Australia, Egypt, Israel, Japan, Argentina, the Republic of Korea, and New Zealand).
Ok, a little late, but I have just noticed...
If a Dutch man from Haag commits war crimes and is apprehended, is POTUS obliged to invade the Netherlands to free him from Haag and ensure his safe return to Haag?
I don't know if I'm reading too much into it or something, but lots of these points are actually for -10x managers.
@Mason_Wheeler said in Delphi 2024:
@Kamil-Podlesak said in Delphi 2024:
@Mason_Wheeler said in Delphi 2024:
@Kamil-Podlesak said in Delphi 2024:
@cheong said in Delphi 2024:
@dkf said in Delphi 2024:
@PleegWat said in Delphi 2024:
@dkf said in Delphi 2024:
strings had much smaller limits
IIRC Pascal strings use a length prefix instead of the null terminator which is common in C. Thus the string length is bounded by the size of the length field, which would probably have been (and might still be) 1 byte, meaning strings cannot be longer than 255 bytes.
Exactly that. Counted strings are a (very) good idea (especially if you also track the buffer size for modifiable strings), but using a single byte for the length isn't; a max length of 255 was always minuscule. C, for all its faults, at least didn't impose a small artificial upper limit on string sizes.
Btw, .NET strings are also Pascal strings internally (because C# and Delphi share the same father), with a size field of int32 (i.e. max. 2,147,483,647 characters)
No, that's because .NET 1.0 is 90% clone of Java. Or because, as @dkf said, it's an obvious and good idea.
`delegate` is the Delphi part of .NET
Not really. They've diverged quite a bit since, but .NET 1.0 was, if anything, "a 90% clone of Delphi" rewritten to look like Java, because Microsoft wanted something that would compete with Java and be palatable to Java developers, but the guy they got to do it was the inventor of Delphi.
Yeah, yeah, I remember these arguments from that time. Nobody ever explained what this "Delphiness" actually means (except the `delegate`, as I have already mentioned), but apparently it was very real. Trust me bro.
For the big obvious one, have a look at WinForms and how it looked and functioned so very much like the VCL, and not at all like anything Java was doing at the time.
Ok, fair enough, GUI libraries were nothing like those in Java. It was before "suicide by copy" became Microsoft's motto.
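Back to the counted-strings subthread above: a minimal illustrative sketch of the layout, simulated in a Java byte array (this is not any real runtime's representation), showing where the single length byte bites:

```java
import java.nio.charset.StandardCharsets;

// A classic Pascal-style counted string: one length byte, then the payload.
// The single length byte is exactly where the 255 limit comes from.
class CountedString {
    static byte[] encode(String s) {
        byte[] payload = s.getBytes(StandardCharsets.ISO_8859_1);
        if (payload.length > 255) {
            throw new IllegalArgumentException("one length byte caps the string at 255 bytes");
        }
        byte[] out = new byte[payload.length + 1];
        out[0] = (byte) payload.length;  // length prefix instead of a NUL terminator
        System.arraycopy(payload, 0, out, 1, payload.length);
        return out;
    }

    static String decode(byte[] b) {
        int len = b[0] & 0xFF;           // read the length byte as unsigned
        return new String(b, 1, len, StandardCharsets.ISO_8859_1);
    }
}
```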
@cvi said in In other news today...:
@Kamil-Podlesak said in In other news today...:
For comparison, rice consumption is about 7kg/person/year. I don't know about you, but I find that hilarious.
You should look into the idea of tröstäta.
Not much consolation to be found in rice. It's good for actual meals, but it doesn't have the effect of binging on a bar of chocolate.
Actually, I'm not sure what the latter does, but I do it anyway. It seems to be socially accepted to do so when you're on a downer. You can always follow up with alcohol later.
Looks like the famous Nordic prohibition actually does work
@topspin said in In other news today...:
Guys, this is not a drill!
I need my chocolate fix!
Trivia time!
Germany and Switzerland top the chocolate consumption chart both in Europe and in the whole world, at 9 (Germany) and even 11 (Switzerland) kg per person per year.
For comparison, rice consumption is about 7kg/person/year. I don't know about you, but I find that hilarious.
@Mason_Wheeler said in Delphi 2024:
@Kamil-Podlesak said in Delphi 2024:
@cheong said in Delphi 2024:
@dkf said in Delphi 2024:
@PleegWat said in Delphi 2024:
@dkf said in Delphi 2024:
strings had much smaller limits
IIRC Pascal strings use a length prefix instead of the null terminator which is common in C. Thus the string length is bounded by the size of the length field, which would probably have been (and might still be) 1 byte, meaning strings cannot be longer than 255 bytes.
Exactly that. Counted strings are a (very) good idea (especially if you also track the buffer size for modifiable strings), but using a single byte for the length isn't; a max length of 255 was always minuscule. C, for all its faults, at least didn't impose a small artificial upper limit on string sizes.
Btw, .NET strings are also Pascal strings internally (because C# and Delphi share the same father), with a size field of int32 (i.e. max. 2,147,483,647 characters)
No, that's because .NET 1.0 is 90% clone of Java. Or because, as @dkf said, it's an obvious and good idea.
delegate
is the Delphi part .NETNot really. They've diverged quite a bit since, but .NET 1.0 was, if anything, "a 90% clone of Delphi" rewritten to look like Java, because Microsoft wanted something that would compete with Java and be palatable to Java developers, but the guy they got to do it was the inventor of Delphi.
Yeah, yeah, I remember these arguments from that time. Nobody ever explained what this "Delphiness" actually means (except the `delegate`, as I have already mentioned), but apparently it was very real. Trust me bro.
At the same time, I had a job at a "Windows+AS/400 shop" and spent some time converting C# code to Java (and vice versa) with a simple `sed` script.
TBH, since that time, C# has indeed diverged a lot, and mostly for the better.
@cheong said in Delphi 2024:
@dkf said in Delphi 2024:
@PleegWat said in Delphi 2024:
@dkf said in Delphi 2024:
strings had much smaller limits
IIRC Pascal strings use a length prefix instead of the null terminator which is common in C. Thus the string length is bounded by the size of the length field, which would probably have been (and might still be) 1 byte, meaning strings cannot be longer than 255 bytes.
Exactly that. Counted strings are a (very) good idea (especially if you also track the buffer size for modifiable strings), but using a single byte for the length isn't; a max length of 255 was always minuscule. C, for all its faults, at least didn't impose a small artificial upper limit on string sizes.
Btw, .NET strings are also Pascal strings internally (because C# and Delphi share the same father), with a size field of int32 (i.e. max. 2,147,483,647 characters)
No, that's because .NET 1.0 is 90% clone of Java. Or because, as @dkf said, it's an obvious and good idea.
`delegate` is the Delphi part of .NET
@da-Doctah said in In other news today...:
Finally, a useful suggestion from one of this red state's politicians:
Oh, somebody discovered Rick&Morty...
@Applied-Mediocrity said in WTF Bites:
I’m English, I have no idea legitimately how to put accents on anything
I seem to recall you folks had that language professor guy, I forget the name, who wrote stuff like Númenor, Khazad-dûm and Namárië
Born in South Africa. Which reminds me of another man from that land...
Tolkien at least spared his children.
@dkf said in Fun with maps:
@HardwareGeek said in Fun with maps:
@boomzilla I kinda half wish Chicago really was part of Texas. There'd be a lot less BS going on there.
Or there would be a lot more BS going on in Texas...
And BIGGER!
@Arantor said in Different countries are different:
@Bulb the odd rows do this: they have the end brick across rather than along, so you then get the benefit of the alternating half lengths atop one another.
Having an entire row of across seems like it would be structurally deficient against lateral forces.
The original Dutch image is just plain wrong. If you look carefully, every second row consists of square bricks.
@Atazhaia said in Infinite Craft:
Sometimes it makes sense, like "Pokemon" + "Emperor" gives "Arceus". But then it fails with "Pokemon" + "Penguin" = "Pikachu". And it can also go full troll, like when "Israel" + "Hamas" = "Peace". Or into depraved fanfic with "Bowser" + "Princess Peach" = "Baby" or "Luke" + "Leia" = "Incest". And to annoy every Italian with "Pie" + "Pineapple" = "Pizza".
Is it an Atlus game?
@MrL said in The Cooking Thread:
@Polygeekery said in The Cooking Thread:
@Kamil-Podlesak said in The Cooking Thread:
It's not absurd, it's a very common issue with parallel execution. In one thread, you prepare and consume dumplings. In the other thread, you prepare and consume vodka with pickles. It's obvious which one goes first.
Now swap out the dumplings with ice cream and now you get to the real issue with parallel execution as your process vomits out the result of incompatible inputs.
If you want a more dramatic example feel free to swap out the dumplings with Diet Coke and the vodka and pickles with Mentos.
There was this one time that we were out of any sensible chasers. So what do? You can just not drink... ahahahah haha ha. ha. We used what was at hand of course, which in this instance was milk. 3.2% milk.
Interesting, so vodka->dairy is a valid order?
I know that in the case of beer, it is not. There is even an easy-to-remember rule: a cow can carry a brewer, but a brewer cannot carry a cow!
@MrL said in The Cooking Thread:
@Zecc said in The Cooking Thread:
@MrL Are you sure you can tell from the picture?
Well, cabbage can be in the dumplings, true. That would make them the best dumplings too.
Absence of pickles and vodka could be explained by them being already consumed. But that's absurd of course, sausage/dumplings go first, then vodka with pickles.
It's not absurd, it's a very common issue with parallel execution. In one thread, you prepare and consume dumplings. In the other thread, you prepare and consume vodka with pickles. It's obvious which one goes first.
Besides an empty document, they even abused their templates to put a fucking tutorial in it (2nd entry). Or extremely useful things like printing your own fucking calendar, which I'm sure is a thing normal people do regularly.
Hmm, JANUAHR... What language is that?
@boomzilla said in In other news today...:
@Kamil-Podlesak said in In other news today...:
@Watson said in In other news today...:
@Benjamin-Hall said in In other news today...:
In other news, Pikachu's shocked face could not be reached for comment.
Busy cloning sufficient additional Pikachus to provide sufficient shocked Pikachu faces.
See, your problem is that you are doing it yourself instead of letting an AI do it. Use PokeGPT!
Be a rebel and use PalGPT.
If the legally-not-a-Pikachu face is enough for you...
@Watson said in In other news today...:
@Benjamin-Hall said in In other news today...:
In other news, Pikachu's shocked face could not be reached for comment.
Busy cloning sufficient additional Pikachus to provide sufficient shocked Pikachu faces.
See, your problem is that you are doing it yourself instead of letting an AI do it. Use PokeGPT!
@Arantor said in The absolute state of web storage protocols:
@Kamil-Podlesak I dunno if it’s a “brainworm”, but there are definitely cases where it feels like the appropriate way to sanely reflect the behaviours you’re trying to embed into the code.
Not that it should always do that, mind, I just think it can be a useful approach, just not the only approach.
Yes, of course there are valid cases. And less valid cases.
The real problem, however, lies in the reverse direction: objects/classes that do not represent anything in the problem domain and exist solely to organize code. In my experience, the majority of OOP-raised developers do not really grasp that concept. Even in the Java world, home of the (in)famous AbstractSingletonProxyFactoryBean. Which, TBH, is quite a bad name; it should be anonymous, or have a non-descriptive name. Like "Jeff" or "Laszlo" (for the fans of Hungarian notation).
@Mason_Wheeler said in The absolute state of web storage protocols:
@Bulb said in The absolute state of web storage protocols:
@Mason_Wheeler said in The absolute state of web storage protocols:
In the 21st century, OO is table stakes. And Rust just looks ridiculous for not including it.
Actually, in the 21st century, OO is dead as a doornail and language designers are realizing how big of a mistake it was.
Most of the languages designed in 21st century don't have inheritance. Just some of them have some delegation crutch to cover the “simple” cases, but in Rust they never agreed on which cases are simple enough for a generic tool while being non-trivial enough to actually need it.
Don't be ridiculous. There's a reason OOP conquered the world while languages without it remain relegated to minor niches: The class/inheritance model is the single best tool we've ever developed for managing complexity.
Is it perfect? No. But it's significantly better than anything else out there, and people pointing to problems and saying "we should be using my preferred language because it doesn't have this problem" reek of sour grapes. OOP has triumphed decisively in the marketplace of ideas.
Uhm, no. It's not the worst, but definitely not the best, and the triumph was not that long-lived. It is better than the critics see it, but sadly most people have (or at least had, in my generation) a very wrong idea of how to actually use it.
Which stems directly from the "Simula brainworm": the idea that OOP objects are models of Problem Domain (or even Real-World) entities. Which is a very, very bad idea. With some exceptions like GUI, I suppose.
@Arantor said in In other news today...:
@DogsB the only fighting I’m doing over scones is getting mine before everyone else eats them all. It’s clearly pronounced scone, and the people who pronounce it scone are wrong. Jam, cream, whichever first would make you happiest - I too am a cream first person but jam first is fine
Could you please reach some conclusion? I'm using the word in a translation of a common saying in my mother tongue: "some men like hoes, other men like scones"
Do I need to start over? The closest current English idioms, "whatever floats your boat" or "no kink shaming", do not have the same edge...
Edit: Also, I am definitely a "scone" guy
@Benjamin-Hall said in Azure bites:
@Arantor what about naming things?
Has been renamed to "cache invalidation", obviously.
@Benjamin-Hall said in Techniques for sharing limited resources in a partially remote environment?:
If we were in person, we could do it with physical objects--you check out a "dev environment marker" and put it back when you're done with it.
It's called "shingle" https://thedailywtf.com/articles/The-Source-Control-Shingle
@Bulb said in The absolute state of web storage protocols:
@Kamil-Podlesak said in The absolute state of web storage protocols:
`log4j2` does, which is a completely different library implemented from scratch with lots of new, shiny, "useful" features like special processing of the messages.
Apropos useful features—the only feature I want from a logging library is turning messages on or off by a category and severity and maaaybeee adding timestamp.
Well, yes, log4j (1.x) still exists, and if someone really wants something newer, there are at least two projects created by people with similar expectations.
Then just dump it to stderr, either journald or containerd will pick it up from there and send it wherever, that's none of the application's business.
I would also add files, that is the safest. Right now, I am doing support for some systems where logging goes to syslogd, which then forwards it to journald... and all the messages are either duplicated (ok, annoying, but I can live with that) or missing completely (that is worse...). TBH it's journald from 2015, because "enterprise" (read: the head cover with longer-wavelength reflective properties).
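For what it's worth, even the JDK's built-in logging covers that short wishlist (the category names here are made up); a minimal sketch:

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Per-category on/off by severity, timestamps for free, and the default
// ConsoleHandler already writes to stderr.
class MinimalLogging {
    public static void main(String[] args) {
        Logger noisy = Logger.getLogger("com.example.noisy");
        Logger app = Logger.getLogger("com.example.app");

        noisy.setLevel(Level.WARNING);  // turn one category down

        app.info("shows up, timestamped, on stderr");
        noisy.info("filtered out by the category level");
        noisy.warning("still shows up");
    }
}
```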
@topspin said in In other news today...:
@Kamil-Podlesak my reaction to reading influencer is the same as if they’d call themselves “manipulator”, or “propaganda officer”. Don’t know why everyone else isn’t put off by them.
Yes, kinda, but I don't really see the political connotation (also, that word already exists: "pundit").
I have always seen it more as a "one-man Advertisement Agency". Kinda like "blacksmith" is "one-man thyssenKrupp" or "baker/confectioner" is "one-man Nestlé".
@MrL said in In other news today...:
@DogsB said in In other news today...:
I think we have to start associating the term influencer with advertisement platform.
I worked at a company that used influencers. A company small enough that the details of this cooperation were known to employees.
Influencers are an advertisement platform and literally nothing else.
Seriously, does anyone actually think anything else? I mean, the word "influencer" itself is pretty clear!
@dkf said in The absolute state of web storage protocols:
@Bulb said in The absolute state of web storage protocols:
@Gustav Fortunately this time they are implementing it in Rust, where that kind of thing does not tend to be on by default (and is usually not even implemented).
There will still be the other root cause of the log4j problem potentially about: reparsing things that shouldn't be reparsed. That's not a language fault (unless the language is especially bad) but is definitely possible in user code or library code. SQL injection is an example of this sort of thing.
Obligatory note: `log4j` does not have such a problem at all. `log4j2` does, which is a completely different library implemented from scratch with lots of new, shiny, "useful" features like special processing of the messages.
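And since SQL injection got named as the canonical instance of the reparsing problem, a minimal Java sketch of the usual fix (table and column names are made up): keep the statement text fixed so the input is never reparsed as code.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// String concatenation hands user input to the SQL parser; a bound
// parameter keeps it as data.
class UserLookup {
    void printUserIds(Connection conn, String name) throws SQLException {
        // BAD: name gets reparsed as SQL ("' OR '1'='1" and friends):
        //   "SELECT id FROM users WHERE name = '" + name + "'"

        // OK: the statement text is fixed; name travels as a parameter.
        try (PreparedStatement ps =
                 conn.prepareStatement("SELECT id FROM users WHERE name = ?")) {
            ps.setString(1, name);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getLong("id"));
                }
            }
        }
    }
}
```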
@dkf said in In other news today...:
@Kamil-Podlesak said in In other news today...:
I wonder what kind of bugs are desirable in former British South Caelid.
Do the bugs want to kill you? If not, they're unsuitable for Australia.
I see. Another challenge for the global community of software developers!
@PleegWat said in In other news today...:
Australia doesn't need no stinking teslas.
Ship was contaminated with undesirable bugs.
I wonder what kind of bugs are desirable in former British South Caelid.
@Applied-Mediocrity said in Florida Man goes to...:
“It’s crazy -- the schemes people come up with to get money out of people,” she said.
I suppose she won't be "investing" in Skittles coins, then.
This kind of scam is based on the same principle: using technology from a completely wrong era, so people don't understand how it works. The only difference is the era: 1800s instead of 2000s.
Quoting from the second article (emphasis mine):
The scammer will ask that the extra money be refunded. Shortly after the contractor sends the cash and occasionally after work begins, the scammer’s original check will bounce from the contractor’s bank account.
@Arantor said in Exit the cloud:
@Bulb everyone I’ve spoken to, including the people teaching K8s workshops, and people running large applications in the cloud, utterly insist that Helm is necessary to “effectively” manage K8s. “Best practices” or some such.
You definitely need something. It might be something you have written yourselves (like shell scripts running `kubectl`), or it might be someone else's tool (ultimately also running `kubectl`), but "just use k8s" is not even a thing.
Unfortunately, Helm is the only widely accepted tool, despite being what it is. We can only hope that the "Godot" phase will eventually end.
@Bulb said in SQL Server transaction deadlocking with itself?:
@iKnowItsLame This merge is just an `INSERT OR REPLACE` of other databases. It's a bit verbose for the purpose, but seems sound.
Obligatory note: MERGE originates from DB2.
Which actually makes the point stronger, because "Microsoft trying to emulate IBM" is a special class of... "fun".
@Gern_Blaanston said in The unofficial offical bad pun of the day thread:
For New Year's Eve I threw an Erectile Dysfunction Party.
.
Nobody came.
It was an obvious fake - the US Constitution clearly says that Erection Day must be a Tuesday.
Just an idea: maybe some of the (sub-)queries are actually run on the original database? I am pretty sure that running queries on a remote DB is possible, and I would not be surprised if the cloning process somehow created a database link...
I don't think people code this way out of "purism", ~~they~~ we simply do it out of laziness.
Oh, absolutely, that is the most common case.
What people think is: "If the INSERT fails, a database error just occurred and we log it as an error with full stacktrace (and, if you're good, the failing query and its parameters)." The code doesn't know that this error was an input error, because it doesn't bother to check (and it would be a pain to implement in some cases, especially if you have to handle multiple DBMSs).
This is (surprisingly) quite well standardized by the SQLSTATE/SQLCODE codes, and handling those is not that hard (see the sketch below). But yes, it's extra work.
So the result is, "Any error not caught in the front-end returns HTTP code 500 even when it should be 4xx".
Raise your hand if you have had to write a regexp parser to detect common validation cases from a (REST) API, accepting both Oracle and PostgreSQL exception messages.
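A minimal sketch of the SQLSTATE route, assuming the JDBC driver fills in standard codes (both Oracle and PostgreSQL do for constraint violations): the first two characters of SQLSTATE are the standardized class, and class "23" means "integrity constraint violation", so no message regexps are needed.

```java
import java.sql.SQLException;

// Map database exceptions to client vs. server errors by SQLSTATE class
// instead of regexp-matching vendor messages. The status codes are just
// one possible mapping.
class SqlErrorMapper {
    static int httpStatusFor(SQLException e) {
        String state = e.getSQLState();
        if (state != null && state.startsWith("23")) {
            return 400; // bad input: UNIQUE, FOREIGN KEY, NOT NULL, CHECK, ...
        }
        return 500;     // genuinely the server's problem
    }
}
```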
It's up to the maintainer (me again) to check the log and tell the customer to stop wasting my time and get their shit together.
It helps if the developer is the maintainer.
I mean, it helps to improve the software.
Having separate roles helps to create the right enterprise/corporate atmosphere.
@PleegWat No, old and cynical. The young and naïve want to fix all errors and polish their software to perfection. Then they find everybody else produces crap and doesn't give a fuck, so they stop giving a fuck too and silence all errors until someone starts to complain.
See, we were recently setting up monitoring for the software that we are deploying. The architect came up with some conditions we should set up alerts for, and one of them was when an error-severity message is logged. So I set it up … and it promptly went off and kept going off almost every detection cycle. While the software worked just fine.
The main source was that RabbitMQ—which is just a standard third party component in the system, nothing that we'd touch ourselves—logs an error every time a connection attempt fails and, well, any port opened to the internet gets its fair share of port scans and other bogus connection attempts.
So I restricted it to our own components only, but there was still a bunch of false positives in there. I ended up dropping that alert and just putting a last 10 errors query on the dashboard. Because even if something failed, it is not an error until you know it was supposed to succeed, and the server usually doesn't know that.
So there are some cases where software should log an error—e.g. failing to connect to the database is an error, because nothing will work in such a case—but most things shouldn't be errors, just info or a warning that something didn't succeed; you'll have to decide whether it's interesting after the customer calls you that something ain't no workey.
Yeah, this is a really tricky question that confuses most developers.
At the basic level, we have validation of input data (user-entered, maybe), which of course displays an error... so some people log it as an error.
Which is obviously wrong, and easy to explain.
...but then we have something like a constraint error (FOREIGN KEY, UNIQUE, etc.), which is quite often the same as above: an error in the input data record, just validated at a different level. But in this case, so many people just insist that "it's a database error, it must be logged as an error". With full stacktrace, of course, which makes the log very "fun" to read (in Java, at least).
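To spell out the level discipline I mean, a sketch (java.util.logging for neutrality; `insertRow` and `RecordRejectedException` are hypothetical): a constraint violation is bad input and gets one warning line without a stack trace, while only unexpected failures get the full treatment.

```java
import java.sql.SQLException;
import java.util.logging.Level;
import java.util.logging.Logger;

// Constraint violations (SQLSTATE class 23) are input errors: log a short
// warning and surface them as a 4xx. Everything else is a real error.
class RecordWriter {
    private static final Logger log = Logger.getLogger(RecordWriter.class.getName());

    void insert(String record) {
        try {
            insertRow(record); // hypothetical JDBC insert
        } catch (SQLException e) {
            String state = e.getSQLState();
            if (state != null && state.startsWith("23")) {
                log.warning("rejected record: " + e.getMessage());  // no stack trace
                throw new RecordRejectedException(e.getMessage());  // maps to HTTP 4xx
            }
            log.log(Level.SEVERE, "insert failed", e);              // full trace
            throw new IllegalStateException(e);
        }
    }

    private void insertRow(String record) throws SQLException { /* ... */ }

    static class RecordRejectedException extends RuntimeException {
        RecordRejectedException(String msg) { super(msg); }
    }
}
```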