<spoiler>pre madonna</spoiler> on a ¡Jedi! hunt



  • @cartman82 said:

    The least he can do is sound a bit grateful for that.

    Ok, good point. In trivia, this raises questions about the transport system, but then my country stretches maybe 600 miles from coast to coast at the most.

    @cartman82 said:

    Instead of whining how no one patted him on the head for the little homework he was asked to complete. Or how he was expected to actually demonstrate the knowledge he will need on the job (python stdlib).

    I think the actual issue he has is this part:

    Feel free to apply again in another six months *when you have more experience*.
    Admittedly this is not well explained, and I have no idea if he actually applied for a job when he didn't know python.

    If he has 10-odd years commercial experience, this is simply unprofessional and insulting. In that regard, HR should know better.

    @Yamikuronue said:

    Horribleness: Implied interest then never followed up again.

    Here I think he is taking issue with the way communication works. We are taught that there is a request and a response, and that is the format for communication. We are never taught that what people actually do is simply stop talking when they lose interest. Based on my own experiences, I don't like being told it's my fault when I don't get back to people and that it's up to me to put up with it when people don't get back to me.

    I get the impression the issue you guys actually have is that he's not a good writer. He doesn't explain or dismantle the actual issues at hand. His stories are anecdotal examples of what is wrong with our economic model, our society and our style of communication. He never covers any of that, and I don't think he understands that's what he's angry about. All he knows is that he's angry and these are the stories that make him angry.



  • @Shoreline said:

    If he has 10-odd years commercial experience, this is simply unprofessional and insulting. In that regard, HR should know better.

    Until you meet a guy who has 10-odd years commercial experience, except it's the same year over and over again, and he's a walking disaster.



  • @cartman82 said:

    Until you meet a guy who has 10-odd years commercial experience, except it's the same year over and over again, and he's a walking disaster.

    Heh. In trivia, I think I had the same 9 months of experience 3 times in one of my previous jobs. Mostly PHP/MySQL/SugarCRM. It was pretty obvious when I had to move on that I wasn't up to date with modern technology. I think I removed PHP from my CV altogether.

    I actually produced some run-of-the-mill WTFs just because I thought I'd get into trouble for producing code properly. The environment was pretty toxic.



  • Trigger warning: I'm responding to this post without reading the topic to the end first. I may have been :hanzo:'d.

    @Groaner said:

    @FrostCat said:
    Sure, but lots of things don't need that granularity, either. If you're storing an expiration date, or an effective date, or a birth date, you probably don't care about the time, so why store it?

    Truncating to the date loses precision and creates ambiguity.

    If the date is all the precision you need, then adding the time (and maybe offset) creates ambiguity.

    Use the data type that gives you the precision required for the data you're storing. If you need more precision than days, by all means use a DateTime. If all you need is days, then use a Date (and please don't fake it by using a DateTime with the Time part set to midnight).

    Likewise, if you're counting things, you'd use an (unsigned) integer, not a float. The extra precision provided by a float doesn't make any sense in that scenario, introduces ambiguity (someone reading your code will wonder why you're using a data type with more precision than whole numbers need) and may introduce errors in your calculations¹.

    ¹ This is because of the implementation of IEEE 754, so you might say this is a strawman. I think it's analogous to errors that may be introduced in date calculations when the underlying type has a Time part.
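
    To make footnote 1 concrete, here's a minimal sketch (the loop count is arbitrary) of how a float drifts while an int count stays exact:

        using System;

        float total = 0f;
        for (int i = 0; i < 1_000_000; i++)
            total += 0.1f;              // 0.1 has no exact binary representation
        Console.WriteLine(total);       // noticeably different from the expected 100000

        int count = 0;
        for (int i = 0; i < 1_000_000; i++)
            count += 1;                 // exact
        Console.WriteLine(count);       // exactly 1000000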

    @Groaner said:

    Expiration dates for physical products are naturally going to be ambiguous (and the actual date might vary by several days in either direction), but what about an expiration date that, say, controls whether an account can log into a website? When exactly does access expire - at 12:01 AM on the date? 5:00PM? 12:01AM on the following date?

    I was thinking about the expiration date on food, so a date would be precise enough for what I had in my mind.

    Your other examples may or may not require the extra precision provided by a DateTime over a plain Date.
    When exactly access expires is probably determined by business rules. If you need a DateTime to implement those business rules, then by all means use a DateTime!
    If you don't need that precision², use a Date.
    All I'm saying is: use the appropriate type that gives the appropriate precision for what you need to implement.

    ² For example: "Accounts expire at the set date, at 17:00 UTC." You don't need a DateTime to store expiration dates, since the Time part doesn't provide required precision. If I give you just an expiration date in this scenario, you'd still be able to say exactly when during the day the account expires.
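
    As a sketch of that footnote in code (using .NET 6's DateOnly/TimeOnly purely for illustration; the helper name is made up), the stored Date plus the business rule pins down the exact instant:

        using System;

        // Hypothetical helper; business rule: accounts expire on the stored date, at 17:00 UTC.
        DateTimeOffset ExpiryInstant(DateOnly expiryDate) =>
            new DateTimeOffset(expiryDate.ToDateTime(new TimeOnly(17, 0), DateTimeKind.Utc));

        // A stored expiry date of 2016-06-30 expires at exactly 17:00 UTC that day.
        Console.WriteLine(ExpiryInstant(new DateOnly(2016, 6, 30)));
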
    @Groaner said:

    As for birth dates, my birth certificate has a timestamp on it. Why not store the extra precision if it's available? What is there to lose by doing so?

    Why not indeed? If you have the timestamp and if it makes sense and/or it is required for the particular thing you're talking about, by all means use a DateTime!
    I'm not advocating changing all occurrences of DateTimes to Dates, because who needs sub-24-hour precision anyway. I'm advocating using the appropriate type for the specific scenario you're looking at.

    @Groaner said:

    @OffByOne said:
    If the data type you use doesn't give the precision you need, then you're using the wrong data type.

    DateTime gives more precision than needed for a date. Why is this a bad thing? Is the project in question on an embedded system with kilobytes of storage?

    It doesn't only give more precision, it also changes the calculation rules. If dates are sufficient and you use a DateTime and if you're not careful, you'll get "interesting" results around DST or when you're doing cross-timezone stuff.
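
    A minimal sketch of one way that bites (the +01:00 offset is just an example): a date faked as a midnight DateTime comes back as the wrong day once it's round-tripped through UTC.

        using System;

        // The calendar date 2016-12-25, "stored" as local midnight in a UTC+1 zone.
        var asMidnight = new DateTimeOffset(2016, 12, 25, 0, 0, 0, TimeSpan.FromHours(1));

        // Round-trip through UTC, as a serializer or database driver might do.
        DateTime utc = asMidnight.UtcDateTime;   // 2016-12-24 23:00Z
        Console.WriteLine(utc.Date);             // 2016-12-24 -- the "date" is now off by one day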

    @Groaner said:

    @OffByOne said:
    Christmas is an all day long event. The appropriate data type is Date.

    Or, it's an event that begins at 2016-12-25T00:00:00.000 and ends at 2016-12-26T00:00:00.000. What about events that don't necessarily begin and end at midnight?

    You could abuse DateTimes to represent Dates, like you can use floats to represent integers. You would be using the wrong³ data type though.

    Events that don't begin and end at midnight are not "all day long events". They are events that begin and end at a specific time and happen to have a duration of 24 hours.

    ³ Wrong as in: added precision for no reason at all and creating ambiguity in the process.

    @Groaner said:

    @OffByOne said:
    Why indeed? As @FrostCat mentioned, that granularity doesn't make sense for an invoice due date (or an expiration date, as was his example).

    What if invoice payments are submitted electronically? What if payers are in multiple timezones? Are you going to note receipt by just storing the date? Are you going to determine timeliness by truncating the receipt timestamp to the date before performing the comparison?


    I brainfarted on the invoice due date. You're right that a timestamp makes sense in that scenario.

    @Groaner said:

    @OffByOne said:
    Subtracting two Date/Time/DateTime things from each other doesn't give you a new Date/Time/DateTime. It gives you yet another data type: you're looking for an Interval (or a Duration or a Period or whatever you'd like to call it).

    You'll note that I said nothing about the result of subtracting two DateTimes, so being pendantic about what type the result is is just being pendantic for pendantry's sake.

    You didn't? *scrolls back*

    @Groaner said:

    Why measure shipping times in days when you can subtract the time a package was scanned at the warehouse from the time the recipient signs for the package?

    You didn't say anything about the type of the result of subtracting two DateTimes, but you did mention subtracting them.
    I wrote the bit about the result type of Date/Time/DateTime arithmetic for completeness' sake.

    @Groaner said:

    Whatever you want to call it, it gives much more useful results than just comparing day intervals. If you've upgraded the routing algorithms for your shipping trucks to save, on average, half an hour in transit times, how else will you know if the upgrade is working?

    In that case, it does make a lot of sense to track things with a precision of minutes or even seconds.

    If I ask you how old you are, on the other hand, I wouldn't expect an answer with day precision, let alone hours, minutes or seconds precision.



  • This feller really should take that page down during his next job-hunting attempt.

    Whilst I've been extremely fortunate in my career, and not suffered too many difficult interviews, I just struggle to believe that this isn't a parody. He can't be serious!

    What he doesn't say is that unless he is independently wealthy, or has wealthy/generous relatives, or has an extremely good reason as to why he wants to "settle down", his penchant for frequent, ad-hoc travel would already be a big red flag for any hiring manager looking to fill a mid-to-senior-level role.

    I don't know about the US, but in Australia, recruiters charge large, non-refundable, up-front sums for placing a candidate, regardless of the duration of their tenure. So given that he's a proven flight risk, there's no guarantee that he doesn't just wake up one morning and decide he wants to move to Yemen for 2 weeks with a brief stop-over in Antarctica.

    As a hiring manager, this alone would be enough for me to reject his application.

    Being able to recall in intimate detail the obscure quirks of the interviewer's favorite weird language feature is just stupid; however, requiring an expert-level working knowledge of such isn't an unreasonable ask for a senior-level position.

    For example, one question we recently asked in a technical interview for C# coding candidates is "What is the underlying implementation of the lock(object) { ... } construct?" Bear in mind that, at the beginning, we tell all candidates that they are allowed to look things up on the internet during the interview.

    If you think that is an oddly specific question: the job advertisement listed "5+ years commercial experience, or Expert Level knowledge of implementing multi-threaded systems in C#".
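
    For the curious, the answer we're after is roughly the following expansion (modulo the exact compiler-generated temporaries); since C# 4 the compiler turns lock into Monitor.Enter/Exit wrapped in a try/finally:

        using System.Threading;

        object gate = new object();

        // lock (gate) { /* critical section */ } compiles down to approximately:
        bool lockTaken = false;
        try
        {
            Monitor.Enter(gate, ref lockTaken);
            // ... critical section ...
        }
        finally
        {
            if (lockTaken)
                Monitor.Exit(gate);
        }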

    Some of his other difficulties could be easily explained by his dislike of phone calls despite being constantly in transit. Why complain about having an interview on a train if you want to spend your life traveling? He should be grateful to both the interviewer and the other passengers that he was even allowed to do that.

    He has one valid point though.
    Any interview requiring you to write syntactically correct code on whiteboards, pencil and paper, Windows Notepad, etc. without access to the internet is just pointless. Making use of available reference material and research techniques is a valuable skill. In my book, someone who can use Google to find and evaluate an algorithm or technique will be a lot more useful than someone who has memorized the exact implementation of a red-black tree. Would anyone want to work with a guy like Sheldon Cooper?



  • @cartman82 said:

    The guy who is in some way behind redis (didn't research the details) rants about his awful job hunt experiences.

    Even if this guy founded the Redis project (which he didn't - some Italian guy whose blog is here did), it's not particularly novel, good, easy to use, or anything else remarkable. Linus Torvalds' childish attitude is tolerated, largely because his work as "Emperor Penguin" for the Linux world has resulted in something of huge notoriety.

    This guy made 20 or 30 commits to a glorified hash-map.


  • I survived the hour long Uno hand

    @caffiend said:

    Any interview requiring you to write syntactically correct code on whiteboards, pencil and paper, windows notepad, etc. without access to the internet is just pointless.

    Fair, but if you can't write a simple, mostly correct script without Google, you're not going to be very productive. Whiteboarding a pseudocode algorithm, I think, is a much better interview question than whiteboarding perfect code.


  • BINNED

    @caffiend said:

    He has one valid point though. Any interview requiring you to write syntactically correct code on whiteboards, pencil and paper, Windows Notepad, etc. without access to the internet is just pointless.

    Welcome to my college exams. C on paper. Using the original standard, so none of these fancy new-fangled declarations of variables when you need them, no no no, every variable has to be declared on the top of the function. Oh, you realized you need another one midway through the assignment? Fuck you, go declare it on the top! And then be lambasted by the examiner for writing something like

    int x; double z;
    

    in a single line, because it's "unreadable".

    Fun fact: I lost a bunch of points on my first exam because, writing on paper, I instinctively used ≤ instead of =<. No, appeals to common sense did not help.



  • @Onyx said:

    Fun fact: I lost a bunch of points on my first exam because, writing on paper, I instinctively used ≤ instead of =<. No, appeals to common sense did not help.

    Yeah, we all had to do it. But I bet you were in the top 5% of the class. The examiners just used this as a method of fixing the bell-curve.

    If it makes you feel any better, I lost marks for not drawing an ampersand properly.

    But that was college, and we all know how much like the real world that is. I used to be able to drink during the day at college; I'd take a paper-based job interview if they guaranteed to let me do that ;)


  • BINNED

    @caffiend said:

    Yeah, we all had to do it. But I bet you were in the top 5% of the class. The examiners just used this as a method of fixing the bell-curve.

    Nah, that was the first exam out of 3, first year. Unless you mean they were fixing the curve for each exam individually, which I doubt. They were just :pendant:s.

    @caffiend said:

    I used to be able to drink during the day at college, I'd take a paper-based job interview if they guaranteed to let me do that 😉

    That reminds me, need to get some liquor to pour into my coffee for "special" occasions.



  • @ben_lubar said:

    So how do you store the date unambiguously?

    Like this. I really don't know when people will learn.

    The underlying storage implementation shouldn't be important; however, it should be required not to do anything retarded, like having an epoch date and not being able to support dates before it, being liable to run out before the heat-death of the universe, not being able to sort properly, or using floating-point maths for arithmetic operations. Basically, anything that introduces weird side-effects.
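
    As an illustrative sketch of what I mean (values made up): a round-trip / ISO 8601 representation keeps the offset, survives parsing, and sorts sensibly once everything is normalised to UTC.

        using System;
        using System.Globalization;

        var when = new DateTimeOffset(2016, 12, 25, 17, 0, 0, TimeSpan.Zero);

        // "O" is .NET's round-trip (ISO 8601) format specifier.
        string stored = when.ToString("O");       // 2016-12-25T17:00:00.0000000+00:00
        var parsed = DateTimeOffset.ParseExact(stored, "O", CultureInfo.InvariantCulture);

        Console.WriteLine(parsed == when);        // True: nothing was lost in the round trip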



  • @Onyx said:

    Nah, that was the first exam out of 3, first year

    What I meant is that they only pulled that pedantic stuff on the stronger students, to make the grade distribution more closely fit that of a normal subject.
    Those who were struggling never copped deductions for it.

    One of my professors admitted to it, saying that otherwise coding subjects would have a bell-curve so horribly skewed that it would confuse the rest of the academic community.

    IMHO, whilst coding can be learned, I would say that it certainly requires a natural aptitude to learn, and a lot of people in first year CS degrees just don't have it. They usually drop out for other reasons after first year anyway.



  • @blakeyrat said:

    I'm not asking for them to REMOVE DateTime, just add a separate Date and Time.

    Well, they didn't do that. But they did add DateTimeOffset, which despite looking really useful, causes all manner of problems if you actually try and use it for anything other than getting DateTimeOffsets out of SQL Server databases.

    Also problematic is when people write functions which return DateTime objects where only the time part is important, and then put some cute value (like their first child's birthday or some shit like that) in the Date part.
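
    A quick sketch of the alternative (the method names are made up): if only the time of day matters, return a TimeSpan (or, in newer .NET, a TimeOnly) instead of a DateTime with a fictitious Date part.

        using System;

        // Hypothetical example: only the time of day is meaningful here.
        TimeSpan OpeningTime() => new TimeSpan(9, 30, 0);                   // 09:30, no made-up date

        // The anti-pattern described above:
        DateTime OpeningTimeWtf() => new DateTime(2009, 4, 1, 9, 30, 0);    // why 1 April 2009?

        Console.WriteLine(OpeningTime());      // 09:30:00
        Console.WriteLine(OpeningTimeWtf());   // drags a meaningless date along with it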

    Blakey, if you want to set up a think-tank or symposium to genuinely solve the problem of representing dates using computers, I'll be the first to sign up. I think Swatch tried it once, but unfortunately the idea didn't catch on.



  • @caffiend said:

    Blakey, if you want to set up a think-tank or symposium to genuinely solve the problem of representing dates using computers.

    Between SQL Server's Date and Time types, and C#'s DateTimeOffset and TimeSpan types, it's solved.

    The only problem is getting all FOUR of those types into ONE programming language (and making DateTimeOffset the default for everything.)

    Most annoying to me is that virtually nothing (even brand-new databases and programming languages like Rust, MongoDB, etc.) implements TimeSpan. Even after C# has proven how goddamned fucking useful it is.
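
    If you haven't used it, here's a quick sketch (values made up) of why it's so handy, tying back to the shipping example earlier: subtracting two DateTimeOffsets gives you a TimeSpan, i.e. a duration rather than a point in time.

        using System;

        var scannedAtWarehouse = new DateTimeOffset(2016, 3, 1, 8, 15, 0, TimeSpan.Zero);
        var signedByRecipient  = new DateTimeOffset(2016, 3, 3, 14, 45, 0, TimeSpan.FromHours(-5));

        TimeSpan transit = signedByRecipient - scannedAtWarehouse;   // offsets are accounted for
        Console.WriteLine(transit.TotalHours);                       // 59.5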



  • @OffByOne said:

    It doesn't only give more precision, it also changes the calculation rules. If dates are sufficient and you use a DateTime and if you're not careful, you'll get "interesting" results around DST or when you're doing cross-timezone stuff.

    I've had the pleasure of working with a few reports that compile statistics over long processes (a really fast one might go through in two weeks, most take about six months, some linger on for several years before someone decides to cancel them). Times are long enough that the business people want to have durations expressed in days.

    The guy who originally wrote the reports didn't think about timezones or daylight savings.
    The result was that the report gave different results depending on whether it was run in winter or summer and, according to the code, it was possible to get negative durations in some cases if the timezones were just right and someone was in a hurry, making it look like a task was completed before it was started. I don't know if that ever happened in practice, but it was possible.

    What the original programmer intended to do was to create timestamps of the midnights of the relevant days (in seconds since whatever) and then divide their difference by 86400 and round down to get the number of days. But a combination of timezones all around the world, DST, and a database where some values were stored as Dates and some as DateTimes ensured that he ended up with figures that were only sort of correct. And some people's bonuses were tied to those figures.
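
    For comparison, a sketch of the calendar-day arithmetic he was after (assuming .NET 6's DateOnly is available; the helper name is made up), with no timezones or DST anywhere in sight:

        using System;

        // Hypothetical helper: whole calendar days between two dates; no timezone or DST involved.
        int WholeDaysBetween(DateOnly started, DateOnly completed) =>
            completed.DayNumber - started.DayNumber;

        // Spanning a European DST change still gives a whole number of days:
        Console.WriteLine(WholeDaysBetween(new DateOnly(2016, 3, 25), new DateOnly(2016, 3, 28)));   // 3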


  • Grade A Premium Asshole

    @Buddy said:

    Are null checks really so costly that it's worth it to disregard the possibility of someone doing something like this:

    A couple of things:

    1. I guess we shouldn't use things like true because someone could do

      #define true false

    and ruin our day. ;)

    2. The first commit is in core Redis. That second commit is in hiredis. From hiredis's README: "Hiredis is a minimalistic C client library for the Redis database." Client libs generally don't require that you pull in and link against an entire server. Hiredis really looks like it doesn't buck this tradition.

    3. The second commit was present ~6 months before the first commit and, from what I can tell, the code touched remains unchanged today. Either the code has never been properly reviewed, or the removal of the NULL check was the right thing to do. I know which one I think is more likely. 😄



  • @bugmenot said:

    A couple of things

    Words for Small Sets



  • @bugmenot said:

    Either the code has never been properly reviewed, or the removal of the NULL check was the right thing to do. I know which one I think is more likely. 😄

    Yeah, I think we all know which of those two things is more likely :P

    But the point is, who does that? Who removes null checks from a code base? What value could that possibly add?


  • Notification Spam Recipient

    @Buddy said:

    Who removes null checks from a code base?

    Jerks that think "This cannot be null, obviously. I am so clever. Why do I have to work with all those morons?".

    When the thing blows up, they blame others for 'making it nullable'.



  • Running his own shop, he could always blame his employees.


  • Grade A Premium Asshole

    @RandomStranger said:

    Running his own shop, he could always blame his employees.

    Maybe, if he wants to try and feel better about himself. But he is still to blame for the hiring decisions and setting the course of the business.

    If your startup/business fails, then you failed.



  • @MrL said:

    When the thing blows up, they blame others for 'making it nullable'.

    A dev was telling me how another dev (who had since left, fortunately) was telling them off for changing the HTML in some piece of code. The change wasn't outside of spec; the other dev just didn't like a piece of code they'd designed being made to do something different.

    I would love for someone to tell me off for changing their code. I'd just be like "how fat do you want your character to be in my sit-com?".

