Not? Shortcutting logic? Never heard of those!



  • @boomzilla said:

    Worst IDE ever.

    No, WordPad is worse.



  • Pretty darn sure that WordPad has a hope of saving it as a plain-text document.



  • @riking said:

    Pretty darn sure that WordPad has a hope of saving it as a plain-text document.

    Technically, so does Word.

    The fact that both, invariably, fail to achieve such...?



  • @riking said:

    Pretty darn sure that WordPad has a hope of saving it as a plain-text document.

    Just write a macro to load the file into a VBA module, and display it in the VBA editor.
    As an added bonus, you will get broken syntax highlighting.


  • Discourse touched me in a no-no place

    @antiquarian said:

    I saw it suggested somewhere that := should be used for assignment, == for comparison, and = should be a syntax error.

    I wrote a language like that once (it was a strange hardware language, specialized for a very high degree of asynchronous processing). While it had =, it was for variable/constant initialization and not for assignment as normally used (which used :=). In particular, it was not an operator but more like a straight keyword.

    It was a long time ago (I don't remember why we didn't use VHDL), but it went something like this:

    let X:signed_int[16] = A+B*C
    # The grammar was something like this
    # {let} {var_name} {type_separator} {type} {be_defined_as} {expression}
    

    (Yes, the language supported arbitrary-width integer types natively, but no recursive structures. That sort of thing makes sense with hardware. I'm not entirely sure if the language was really Turing-complete or whether it just approximated it; bounding the types like we did can have some odd implications.)



  • SAS gives you "." for "missing value"

    Actually, SAS gives you ".A" through ".Z", so you can assign different values for different reasons the data might be missing.

    So it works like NULL, but you can sort and filter on it.



  • @chubertdev said:

    Made a ton of WTFs, left the company, got re-hired, I fired up Word.

    Is this some sort of street gangsta lingo? What did you do to that poor guy?



  • @Keith said:

    Is this some sort of street gangsta lingo? What did you do to that poor guy?

    It's front-page gangsta lingo.



  • @HardwareGeek said:

    In Verilog, some boolean data types can be true, false or a couple different invalid/unknown states.

    You mean a language actually exists where FILE_NOT_FOUND is a syntactically valid boolean value?
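
    (For what it's worth, it really is syntactically valid. A minimal Verilog sketch, with made-up names, that a simulator will happily accept:)

    module four_state;
        reg  maybe  = 1'bx;   // "unknown" is a legal value to write
        wire nobody = 1'bz;   // so is "high-impedance / nobody is driving this"
        initial #1 $display("maybe=%b nobody=%b", maybe, nobody);  // prints x and z
    endmodule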





  • @chubertdev said:

    @antiquarian said:
    I saw it suggested somewhere that := should be used for assignment, == for comparison, and = should be a syntax error.

    This is how you get Perl, people!

    Except := isn't a thing in Perl, which is surprising. Since := is perfectly acceptable line noise, it should at least do something in Perl.



  • @OffByOne said:

    @chubertdev said:
    This is how you get Perl, people!

    Except := isn't a thing in Perl, which is surprising. Since := is perfectly acceptable line noise, it should at least do something in Perl.

    Yeah, that was mostly just a comment on Perl's operator soup.


  • Discourse touched me in a no-no place

    @OffByOne said:

    You mean a language actually exists where FILE_NOT_FOUND is a syntactically valid boolean value?

    It makes sense at the hardware level, where you're peeking beneath the simple boolean abstraction. Though it's not so much FILE_NOT_FOUND as CURRENT_SOURCE_AND_SINK_NOT_FOUND. Real hardware designers think of digital circuits in terms of currents anyway; voltages are for the slow…



  • @dkf said:

    It makes sense at the hardware level, where you're peeking beneath the simple boolean abstraction. Though it's not so much FILE_NOT_FOUND as CURRENT_SOURCE_AND_SINK_NOT_FOUND.

    There are a number of reasons why the boolean abstraction may fail at the hardware level, although it's perfectly valid the vast majority of the time. (Note that unless specified otherwise, these reasons apply to software models of the hardware in, e.g., Verilog or VHDL, rather than the actual circuits.)

    Many modern low-power designs turn off power to parts of the hardware when they're not being used at the moment. If the power is shut off, the logic gates aren't functional, and the voltage levels are not guaranteed to be valid. Signals in this part of the hardware are given an unknown (X) value.
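
    In simulation, something like this minimal (and very hand-wavy) sketch mimics the idea; real flows use UPF/CPF and library cells, and all the names here are invented:

    module power_gate_demo;
        reg pwr_on = 1'b1;                    // hypothetical power switch for a domain
        reg state  = 1'b1;                    // some signal inside that domain
        always @(pwr_on)
            if (!pwr_on) force state = 1'bx;  // power off: the value can't be trusted
            else         release state;       // power back on: normal drivers take over
        initial begin
            #1 pwr_on = 0;  #1 $display("powered down: state=%b", state);  // x
            #1 pwr_on = 1;  #1 $display("powered up:   state=%b", state);  // still x until re-initialized
        end
    endmodule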

    A wire that is not being driven has a special, high-impedance, unknown (Z) value. A wire might not be driven for any of several reasons. It might be a design bug; a connection is missing. It might be part of a bidirectional data bus, which almost always has some dead time where neither agent is driving it while changing direction. It might be a connection to a removable device of some sort that is not currently connected. In any case, this high-impedance unknown may turn into a regular unknown when it goes through a logic gate, which brings us to the next reason the boolean abstraction may fail.
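
    A minimal Verilog sketch of that bidirectional-bus case (names invented), showing the dead time as z and the z-through-a-gate effect:

    module tristate_demo;
        reg  drive_a = 0, drive_b = 0;            // neither side enabled yet
        reg  a_val   = 1, b_val   = 0;
        wire bus;
        assign bus = drive_a ? a_val : 1'bz;      // driver on side A
        assign bus = drive_b ? b_val : 1'bz;      // driver on side B
        wire through_gate = bus & 1'b1;           // z at a gate input comes out as x
        initial begin
            #1 $display("bus=%b gate=%b", bus, through_gate);  // z and x: dead time
            drive_a = 1;
            #1 $display("bus=%b gate=%b", bus, through_gate);  // 1 and 1: side A is driving
        end
    endmodule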

    The boolean abstraction also fails (and this is the most common case for logic designers and verification engineers) when an input to a logic gate is unknown for any of the reasons given here. Depending on the type of gate and the state of its other inputs, a logic gate may be unable to determine what its output should be if one of its inputs is unknown. For example, the output of an AND gate will be unknown if one or more inputs are X and all other inputs are true. (If any input is false, of course, the output is false regardless of any other inputs, known or unknown.)
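
    In Verilog terms (just expression evaluation, nothing design-specific assumed):

    module and_unknown_demo;
        initial begin
            $display("x & 1 = %b", 1'bx & 1'b1);  // x: the result genuinely can't be known
            $display("x & 0 = %b", 1'bx & 1'b0);  // 0: the 0 decides, the unknown is irrelevant
        end
    endmodule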

    In modeling hardware, for pretty much all practical purposes, all of the variables that represent the hardware state have static lifetime; they spring into existence when the program is loaded. (Dynamic allocation often occurs in the test harness, but not in the hardware.) It is quite possible for such a variable to exist before anything has assigned a valid boolean value to it; until then, it has the X value. (This is often, but not always, a bug.)
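
    A minimal sketch of that "exists but never assigned" situation (variable name invented):

    module uninit_demo;
        reg state;                                          // static lifetime, no initializer
        initial begin
            $display("t=0: state=%b", state);               // x: nothing has assigned it yet
            state = 1'b0;                                   // e.g., what a reset would do
            $display("after assignment: state=%b", state);  // 0: now it's a real boolean
        end
    endmodule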

    Some logic devices, like flip-flops or RAMs, store the state of their data inputs. When one input changes from false to true (or vice versa), the device captures the state of another input, stores it, and sends the stored value to its output (in the case of a RAM, later, when the relevant address is read). What happens if the data input is changing at the same time the value is being captured? Which value does it store (or in the case of a RAM, if an address input is changing, into which address does it store)? The answer is "unpredictable," so the software models this by setting the value to unknown.
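
    Simulation models typically handle this with a timing check and a "notifier"; a rough sketch of the pattern (the limits and names are invented, real ones come from the cell library):

    module dff_with_checks (input wire clk, d, output reg q);
        reg notifier;                        // toggled by the simulator when a check fails
        always @(posedge clk) q <= d;        // normal capture of the data input
        always @(notifier)    q <= 1'bx;     // data changed too close to the clock: unknown
        specify
            $setuphold(posedge clk, d, 1, 1, notifier);  // 1-unit setup and hold windows
        endspecify
    endmodule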

    One way that real hardware can be unknown is that it takes finite time for the circuit to change from the "false" voltage to the "true" voltage. While it is doing so, it may temporarily be at a voltage that is neither true nor false. (Or, very possibly, some of the other logic it is driving may treat the voltage as true and some as false.) This is usually abstracted into a simplified "propagation delay" in logic design and simulation, but the people who design the actual gates at the transistor level care about this.
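
    In Verilog terms, all of that analog behavior gets collapsed into a single number on the gate (the delay value here is invented):

    module delay_demo (input wire a, b, output wire y);
        and #2 g1 (y, a, b);   // y follows a & b, but only 2 time units later
    endmodule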

    @dkf said:

    Real hardware designers think of digital circuits in terms of currents anyway; voltages are for the slow…

    I'm going to disagree with you on this, at least somewhat. The on/off state of a MOSFET1 is controlled by the voltage on the gate. The current in an ideal device is zero, except when it is in the process of turning on or off. There is current in a real device, but this is undesirable leakage, not part of the digital logic operation. The reason I disagree only somewhat is that hardware designers have to consider how long it takes gates to change between true and false, and at a sufficiently detailed level this depends on the current available to change the voltage on the gate of the downstream MOSFETs (and the gate capacitance and parasitic capacitance and resistance of the wires between them). Ultimately, however, it is the difference between the gate voltage and the VT of those transistors that matters.

    1These are the transistors that make up nearly all modern logic gates.


  • Discourse touched me in a no-no place

    @HardwareGeek said:

    Ultimately, however, it is the difference between the gate voltage and the VT of those transistors that matters.

    It's faster if you can detect the current that is charging/discharging the gate capacitor and act on that, rather than waiting for the voltage threshold to be reached. (I used to know something — perhaps too much — about memory circuitry design. I've forgotten a lot.)



  • +A interesting would read again.

    I should also finish reading What Every Programmer Should Know About Memory.



  • @HardwareGeek said:

    >dkf:
    Real hardware designers think of digital circuits in terms of currents anyway; voltages are for the slow…

    I'm going to disagree with you on this


    Thinking further about this (while in the shower this morning), and without having read the replies yet: depending on @dkf's definition of "real hardware designers," older TTL logic circuits were based on current rather than voltage. Logic levels were defined in terms of input and output voltages, but the BJT1s that make up the logic circuits operate on a ratio between currents.

    1Bipolar Junction Transistors, an older type of transistor than the MOSFETs used in modern CMOS logic circuits. BJTs are/were used in older digital chips such as the 7400 series common in the '80s.



  • @dkf said:

    I used to know something — perhaps too much — about memory circuitry design.

    Ah, memory circuitry. That's a whole other ball of wax. I've only dealt with logic, and don't really know much of the intricacies of memory design (and I'm OK with that).

