Curriculum issues in Computer 'Science' courses



  • Continuing the discussion from No StackOverflow for you! Or: How to get away with ~~murder~~ ~~cheating~~ public masturbation!:

    @cheong said:

    Btw, you do realize the teacher's job is based on the "topics covered in course", don't you? The student may be a C# whiz, but if the course included topic of "linked list" or even older rarely used now "selection sort", "bubble sort", etc. and he somehow can't get it, it doesn't mean he'd not got a problem.

    This sort of brings up a side issue I have regarding how 'Computer Science' (feh) is usually taught. There are a lot of problems with it, both in vocational training and in degree programs, but the one that I specifically have in mind right now is the order in which they cover subjects.

    First off, most CS curricula have a serious problem with pacing, and part of the reason is that, unlike in a lot of other courses, they have to handle students who have varying degrees of exposure to the subject already. You can have one freshthing with years of coding under their belt sitting next to another one who has never turned a computer on. Most will fall into a sort of middle ground where they have used computers casually before, for web surfing and word processing and so forth, but have no real knowledge of programming, or some small amount with a lot of misunderstandings and probably some bad information. So assessment is a first priority, but it doesn't fit well with the one-size-fits-all first year courses most universities start out with (I am assuming a university level education; high school CS classes are hit and miss at best, while from what I understand vocational schools generally get people on the lower end of experience, regardless of talent or inclination).

    This is further complicated by the fact that there isn't any smooth, fixed progression of skill; a student may have a lot of experience in programming while still having significant lacunae in their knowledge of (for example) source control, or common data structures. This means that some material will have to be covered regardless, just because even the experienced students might never have run across some important issue.

    Worst of all, the professors themselves are often crappy teachers, assuming that it is a professor teaching the intro classes at all. Most professors see underclass courses as a distraction from their research, so the departments tend to foist the courses off on whoever didn't show up for the staff meetings that year, or even push it off onto some grad student. To say that this isn't conducive to good education is to put it mildly.

    I guess what I am saying is that making a good underclass curriculum is harder than it seems, yet most universities seem to treat introductory courses (in most fields, not just CS) as something to be slapped together at the last moment.

    OK, that was a bit of a tangent... where was I? Oh, yeah, the order in which things get presented. OK, assuming we have no choice but to push everyone through the meat grinder in order to make sure that the basics get covered, we need to consider where to start. Most CS programs start with a course in some programming language, which sounds reasonable - if you are teaching programming, you want to have a class on a programming language, right? - but actually isn't really a good idea. Why not? Well, first off, as I already mentioned, not all of the students have a lot of computer background. That's less of an issue today, but it still comes up a lot more than you'd expect. More to the point, starting with a course focused on a programming language puts too much focus on learning the language. It is like starting off in microbiology with a lab course on using microscopes and petri dishes before explaining what bacteria and protozoa are. While there have been some attempts to solve this with courses that use pseudo-code, or flowcharts, or various mathematical notations, that reverses the problem, like having an English Lit class that studies the structure of the sonnet without ever reading any actual sonnets. Finding a way of teaching how to write a program without getting caught up in the details of the language is damn hard. In the US at least, more and more universities are using Python, in part because it is so close to an algorithmic pseudo-code, so that at least is an improvement in my view. I still like the idea of using Scheme, which is even simpler, but that's a personal bias and in any case has problems of its own.
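
    To make the pseudo-code point concrete, here is a quick sketch of selection sort - one of the textbook staples from the quote above - in plain Python. It's my own toy example, not lifted from any actual curriculum; the only point is how close it reads to the pseudo-code you'd scribble on a whiteboard.

    ```python
    # Selection sort in plain Python. Purely illustrative: in real code you'd
    # just call sorted(). Note how little language ceremony sits between the
    # idea and the code.
    def selection_sort(items):
        for i in range(len(items)):
            # find the index of the smallest remaining element
            smallest = i
            for j in range(i + 1, len(items)):
                if items[j] < items[smallest]:
                    smallest = j
            # swap it into place
            items[i], items[smallest] = items[smallest], items[i]
        return items

    print(selection_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
    ```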

    But the fact remains that jumping right into programming might not be the best starting point in the first place. By teaching to the language, not only are you going to have problems with students getting caught up in the syntax at the expense of understanding the concepts, but you also sort of sweep a lot of practical issues of programming under the rug. Too many times, critical issues like version control, documentation or even just using the editor and compiler, get dropped out completely, or at most get mentioned in passing without any real explanation. This leads to students seeing writing a program as something done in isolation, without any real ideas of how things fit together.

    I'm not sure where to take this little rant from here, I'll get back to it later I guess.



  • @ScholRLEA said:

    not only are you going to have problems with students getting caught up in the syntax at the expense of understanding the concepts, but you also sort of sweep a lot of practical issues of programming under the rug.

    That stuff under the rug is mostly software engineering, though, not CS as such.



  • The situation is worse with C++. It doesn't help that many professors learned C++ before it was standardized and haven't paid any attention to the development of the language at all. Over at the cplusplus.com forums, lots of people asking for help with their code are using ancient compilers and writing non-standard code. I actually took a C++ class at my college and the professor was just teaching C with classes. I asked him which standard of C++ he knew best and he didn't know what I was talking about.

    With other languages that are more "recent" in terms of being taught, at least the information being taught is more up to date.



  • I was just discussing this with my friends the other night. The first class a CS student takes should be one teaching basic algorithm design and problem solving. They should understand how to decompose and rigorously solve a problem in an algorithmic way before they even touch a programming language. Computer science is math based, so teach the math right away and then apply it to programming just a little later.

    That being said, the CS program at my college is incredible. Best in the state by far.



  • My college taught C++ as a language just to get people used to a programming language at all.

    Then, they went full abstract and taught design, discrete math, logic, finite state machines, then ended it with classes that let you pick whatever technology to solve a problem.

    On the side I picked up ASP.NET elective and did some frontend website programming.

    It was a much better degree than at other, bigger schools, where they kinda taught about computers and the intro CS class used Microsoft Excel?

    Also, my degree put me two classes short of a math MAJOR.... Could have double majored with one more semester.


    Big point here.

    ### College is not an education

    ### College is a tool to get an education


    I didn't expect that school to teach me the skills to get a job. I expected it to lay out opportunities for me to learn skills.


  • ♿ (Parody)

    @LB_ said:

    using ancient compilers and writing non-standard code

    👋

    $ g++ --version
    g++ (GCC) 4.1.2 20080704 (Red Hat 4.1.2-46)
    Copyright (C) 2006 Free Software Foundation, Inc.
    


  • Hah, that's nothing. Anyone who has spent time on SO or Daniweb will tell you that Turbo C++ is still alive and well and being rammed down the throats of students from Karachi to Manila. The national university systems in India and Pakistan both standardized on TC++ as the one IDE to rule them all decades ago and refuse to budge on the matter, last I heard. A lot of professors elsewhere - especially in the Philippines, Indonesia, and some African countries - have followed suit.

    I haven't heard of any in the US or Europe using it lately, but at least as of 2007, I knew of a data structures professor who refused to accept any code that wouldn't compile in Visual Studio 6, and would mark points off if you used C++ I/O instead of the stdio library. He would use 'fill in the blanks' problems for homework (that is, he would write most of the program out and leave just part of it to be filled in by the students), and detailed all the algorithms using flowcharts, too. What is this, 1977?



  • @boomzilla said:

    👋

    $ g++ --version
    g++ (GCC) 4.1.2 20080704 (Red Hat 4.1.2-46)
    Copyright (C) 2006 Free Software Foundation, Inc.
    
    Wow, that's old even by Debian stable standards.
    On wheezy (7) I get 4.7.2

  • ♿ (Parody)

    @ScholRLEA said:

    Hah, that's nothing.

    Yeah, I'm sure. Still, it feels pretty old.

    @TimeBandit said:

    Wow, that's old even by Debian stable standard.

    RHEL 5. We may upgrade to 6 "soon."



  • We're talking Turbo C and void main ancient here, not to mention #include <conio.h>


  • ♿ (Parody)

    whew

    I checked. We have int main.


  • Winner of the 2016 Presidential Election

    @flabdablet said:

    That stuff under the rug is mostly software engineering, though, not CS as such.

    The problem with SE is that it's impossible to teach without at least two or three real projects. Our professor really tried (he gave us a shitty piece of software to which we had to add a new feature before refactoring it completely), but I still felt like half of the students did not learn anything in the course.



  • @asdf said:

    The problem with SE is

    people who think that engineering is not at least as much about art as about process and metrics. ISO9001 has done incalculable damage to the software business.



  • @ScholRLEA said:

    First off, most CS curricula have a serious problem with pacing, and part of the reason is that, unlike in a lot of other courses, they have to handle students who have varying degrees of exposure to the subject already.

    I used to teach a lot of computer classes - but the agenda was never set by me. My biggest problem was classes that started out basic and delved so deep that no person could possibly exist who both learned from the first quarter of the class and wasn't totally lost by the final quarter. We're talking stuff like "What is a Web Browser" in the same one-week class as "Covariance and Contravariance".



  • @flabdablet said:

    people who think that engineering is not at least as much about art as about process and metrics.

    Absolutely true.

    IIRC, it's one of the world's most renowned artists who phrased this best. I believe it was Pablo Picasso who once stated (translated to English, of course) that one should "learn the rules like a professional, so that one may break them like an artist."

    Engineering at the level of abstraction at which software engineering takes place requires a certain kind of finesse and feeling for the subject matter, as well as a rigorous understanding of the rules - a combination that you typically find in artisan occupations. It's this combination that allows you to engineer solid products while also being able to break out of ill-defined boxes to arrive at better solutions to situational problems.



  • @ScholRLEA said:

    First off, most CS curricula have a serious problem with pacing, and part of the reason is that, unlike in a lot of other courses, they have to handle students who have varying degrees of exposure to the subject already.

    The problem isn't the pacing or how to fix the pacing. The problem is the one-size-fits-all approach itself. Attempting to handle all students the same just doesn't work.

    If you don't have the basic programming experience to carry you through a simple entrance exam, then you should be sent to prep-courses first, before enrolling in the main curriculum.

    If you graduated with a bachelor's degree from another institution and it turns out that this institution handed it to you on a silver platter without instilling the expected level of subject knowledge in you, then you should not expect to be given any kind of special treatment during classes for your master's degree. That would eat into the time a professor could be spending on students that actually have the requisite knowledge at hand to grasp the material. Instead you should double down on intermediary courses to fill that knowledge gap first.

    etc. etc.

    Also, we should bear in mind that not everyone is cut out for this field. And we should definitely take care not to lower standards to the lowest common denominator, lest you wish a return to the status quo of the 80s and 90s with 'self-educated' hackjobs saturating the market. (We have enough of them left over and still lingering in our field as is.)



  • I agree with you that most programming courses need a redesign, as I myself have ranted about this a lot before.

    When I was studying for a higher diploma at university, the tutor simply assigned me to help teach the Java class because she hadn't been exposed to the language before. (This is an interesting matter: our department head asked us which programming language we wished to learn for the programming class at the beginning of the semester, and we chose Java, so the course changed to use Java. While it's unfortunate that the tutor had not programmed much in Java before, I'd say it's not her fault for not knowing it.) And I've seen classmates struggle with ordinary data structures and programming constructs taught in "O-level" CS classes.



  • At least you have g++; we only had cc on Solaris (SunOS 5) when learning C.


  • Winner of the 2016 Presidential Election Banned

    My university's CS program starts with java in the first intro course, then switches over to C for the second intro course, and also uses gcc for the compiler. And they have a debugger, but it works about as often as Dischorse, so it's practically worthless. Oh, and most of the computer science courses in general share a small bank of computers that we all have to SSH into to do our work. All of the compiling for all the classes in the department has to be done on one of about eight different machines. And half the tools we use (such as the C debugger) aren't even installed on some of them.



  • One thing I liked about my university is the intro class to CS wasn't about programming. It was simplifications of the CS basics.

    So we learned binary arithmetic, a high-level explanation of the Von Neumann architecture, what the definition of an algorithm is, and programming a "Z-Machine" in z-assembly (really simple push, pop, add, move stuff with registers).
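
    For the curious, something with roughly this flavour - a rough Python sketch I just made up, not the actual z-assembly or toy machine from the course (all the names and instructions here are invented):

    ```python
    # Invented toy machine in the push/pop/add/move spirit described above:
    # a couple of registers, a stack, and a handful of instructions.
    def run(program):
        regs = {"A": 0, "B": 0}
        stack = []
        for op, *args in program:
            if op == "MOVE":        # MOVE reg, value
                regs[args[0]] = args[1]
            elif op == "PUSH":      # PUSH reg
                stack.append(regs[args[0]])
            elif op == "POP":       # POP reg
                regs[args[0]] = stack.pop()
            elif op == "ADD":       # ADD dst, src  (dst := dst + src)
                regs[args[0]] += regs[args[1]]
        return regs

    program = [
        ("MOVE", "A", 2),
        ("MOVE", "B", 3),
        ("PUSH", "A"),
        ("ADD", "A", "B"),   # A = 5
        ("POP", "B"),        # B = 2
    ]
    print(run(program))      # {'A': 5, 'B': 2}
    ```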


  • Winner of the 2016 Presidential Election Banned

    God, that would be beautiful. /jealous



  • @cheong said:

    I agree with you most programming courses needs redesign

    That's not CS, though. That's software engineering, which is an entirely different thing. SE is to CS what Auto Shop is to Applied Mathematics.



  • @ScholRLEA said:

    the professors themselves are often crappy teachers

    Hey! :sadface:

    @ScholRLEA said:

    push it off onto some grad student. To say that this isn't conducive to good education is to put it mildly.

    I actually disagree here. Grad students have the will and the drive to actually do a good job of the day-to-day execution of a course. They are also much more likely to have written serious code in modern times, and they have the added benefit of remembering how it was to be a student. IME, the professors are usually the worst instructors. They have been around for a while, though, so it is a good idea to include them in the planning of a course.

    @ScholRLEA said:

    if you are teaching programming, you want to have a class on a programming language, right?

    Yes. You don't teach Chomsky's Theory of Language by waving your hands in the air. Yes, it adds one more obstacle in that the students have to get past the syntax of the language you choose, so you are better off with a high-level language. But they are still learning a new way of expressing themselves. Everything is new, no matter which abstraction level you pick. They are going to struggle with syntax. They will learn.

    As for using the editor and the compiler -- this should be an integral part of the programming course, including using state-of-the-art IDEs and CM tools. Granted, this risks the students getting caught up in using the IDE more than in learning programming, but that is also a valuable lesson for them. In some ways, knowing how to efficiently use the IDE is more important than knowing how to implement your own linked list.

    The differing entry levels, and the progression you can expect from each student: no argument here. This is a real challenge. A lot of learning programming is about "spending time in the saddle", with the teacher providing individual assistance. If a student already knows the topic, it is a matter of half an hour to complete the tasks for a lecture, and I (when I have filled in as a programming teacher) see it as my job as a teacher to keep challenging them for the rest of the time. This is usually not a big problem. There is always some language construct, algorithm, data structure, framework, or tool lurking around the corner that they have not yet seen. If all else fails, have them help their less skilled classmates (and recruit them as TAs for next year).

    But this leads to the strange situation that not all students that complete the course are on the same level. What we can hope to certify is that the student at least knows the curriculum. This ought to take care of the:

    @ScholRLEA said:

    a lot of experience in programming while still having significant lacunae in their knowledge

    If the students who think they know it all slack off, this is very visible in the assignments and the exam, and they fail the course. The challenge is to make them realise this early on so they do put in the required effort to pass with flying colours.

    Summary: You said it yourself:

    @ScholRLEA said:

    Finding a way of teaching how to write a program without getting caught up in the details of the language is damn hard.

    ... And we all have good and bad experiences about how to do it. Being WTDWTF, we also have strong opinions on the matter.

    I once read an experience article (sorry, can't find the complete reference anymore) from some uni where the programming 101 class was mandatory for all students, regardless of which faculty they belonged to. Their executive summary read something along the lines of:

    We tried everything. We had extra tutoring. We reasoned with them. We pleaded with them. We offered them tea. All to no avail. In the end we came to the conclusion that there are three types of students: Those who already know programming, those whom we have a chance at teaching how to program, and those who will never be able to learn programming.



  • Actually, that's something I meant to bring up. Even in schools where the trade of 'Management Information Systems' is treated separately, most 'Computer Science' degrees actually are an umbrella for at least six different disciplines: Computer Programming, which is a practical artisanal craft; Algorithmics, Process Modelling, and Data Modelling, which are different branches of mathematics; Software Engineering, which is an engineering discipline (or one day will be - it isn't really there yet and probably won't be in our lifetimes); and Requirements Determination, which is an interdisciplinary study involving a combination of mathematics, engineering, cognitive science, and communication psychology. The problem is that the craft of programming is dependent on all of the others in order to function, and a working programmer needs at least a rudimentary understanding of all of them in order to be effective. Trying to stuff all that into a four year engineering program isn't practical; trying to do it alongside the sort of liberal arts degree a well-rounded person 'ought' to be seeking out first and foremost is virtually impossible.

    I've honestly been wondering if CS should be a BS curriculum at all. Medicine, Law, and several other Engineering branches have 'Pre-X' requirements instead, maybe Software Engineering should, too?

    Or should we be going in the other direction? I mean, it is possible to learn to write programs without all of this; it's when you try to do the rest of it without them that we end up with :wtf:-ery. Maybe we need a way to factor out the coding part into a trade and specialize the rest of it. We haven't really been able to so far, but is it possible? I'm not sure. Some other engineering fields do it as a matter of course - architects, for example, don't need to learn carpentry, masonry and riveting before they learn how to design a house or a skyscraper - but even then, a knowledge of what is practical is needed. Other engineering fields, such as chemical engineering, are more interdependent with their scientific and technical partners, and it is no accident that those are the fields that don't have baccalaureate engineering courses. Programming, CS, and SE? Those are intertwined to such a degree today that you can hardly separate them at all.

    We need to work on this as a field, and not just for our field. This is just the edge of the knife blade; these problems are beginning to arise all over the place, not just in IT. As Asimov intuited more than fifty years ago ('Profession', 1957), in an interconnected, technological society, the learning requirements for every sort of work - even ditch digging (how often do you hear of power or sewer lines getting cut because of a poorly planned dig?) - get more and more complex, eventually reaching the point where an ordinary person would have to spend half a lifetime on general studies in order to accomplish anything, and would need to continue their formal education throughout their career. We aren't at that point yet, not even close, but it is eventually coming and IT is ahead of the curve on this. The solutions we come up with for this - if we find any at all - will be the trail other fields will eventually follow.

    In the short term, I think we need to consider two possibilities, either separately or together: one, pushing SE into graduate studies; and two, setting up a system of apprenticeships for computer programmers, either as purely vocational studies (no college required), or more likely, following a CS degree. Those may not be long-term answers, but they are probably the best we can do today.

    Yes, I know it is possible now to work in the field without a degree (I'm still trying to get the last two courses of my BS out of the way myself, and I've been in the field since the mid 1990s), but that probably won't last. It has only worked so far because the CS and SE degree programs are such a mess, the demand for programmers so high, and the barriers to entry so low, that a determined auto-didact can often outperform some bored college grad who took CS because it looked like easy money and now has no idea what he's up against in the field.


  • ♿ (Parody)

    I remember getting a pile of FORTRAN77 code dumped in my lap in...uh...probably 2002? It came with a rat's nest of a build system based on self modifying .bat files.

    It was a program that we dearly depended on, normally run by a different organization (in my company). We wanted to be able to do "unofficial" runs. But of course, I didn't have access to a FORTRAN77 compiler, so I had to port it to a more modern dialect (don't remember what, probably 95).

    And then make the batch files work.


  • ♿ (Parody)

    @ScholRLEA said:

    As Asimov intuited more than fifty years ago ('Profession', 1957), in an interconnected, technological society, the learning requirements for every sort of work - even ditch digging (how often do you hear of power or sewer lines getting cut because of a poorly planned dig?) - get more and more complex, eventually reaching the point where an ordinary person would have to spend half a lifetime on general studies in order to accomplish anything, and would need to continue their formal education throughout their career. We aren't at that point yet, not even close, but it is eventually coming and IT is ahead of the curve on this.



  • Hmmn, OK, there is that, but there is more than one way to look at it. On the one hand, it does show how complex the interactions really get, and how tied together even the act of making a simple pencil is; on the other, it is meant to point out as well that this complexity comes out of individuals doing different jobs, each of whom need only be concerned with their own part of the process, and that each is working only out of their own interests yet producing something that is basic to modern society. It is a great example of the equivalence of cooperative and competitive economics (e.g., that neither 'pure' capitalism nor 'pure' socialism are actually possible, since they are two aspects of the same phenomenon).

    The problem in IT today is that the complexity is there, but not the division of labor, nor, really, the self-organization aspect, either. We are an industrial field working through artisanal methods (at best).


  • ♿ (Parody)

    @ScholRLEA said:

    The problem in IT today is that the complexity is there, but not the division of labor, nor, really, the self-organization aspect, either. We are an industrial field working through artisanal methods (at best).

    I would disagree there, too. Most people, even programmers, aren't going to understand the full stack of stuff they're using. Start with 3rd party libraries. Operating system. Networking. Database. etc, etc, etc.

    I totally agree that there's a strong artisanal aspect to building software.



  • @boomzilla said:

    self modifying .bat files

    To be fair, given the limitations of command.com, that pattern can sometimes be the least obscure way available for accomplishing even some quite straightforward tasks. The cmd.exe improvements that arrived with NT go only so far toward altering the truth of that.

    CP/M's CCP and the DOS command.com descended from it were always CLIs first and glue languages way distantly second. Like the CP/M tools, the DOS tools were designed to produce output that looked OK on a line printer rather than stripped back for easy parsing by other tools. PowerShell goes the other way: it's a capable if somewhat untidy-looking scripting language, but it's a dog of a thing for interactive command entry.

    It's hard for those of us who grew up with all this stuff to get the GUI generation to appreciate exactly what it was about the Bourne shell and the Unix CLI toolset that made them so much nicer than contemporary CLIs on other systems. Without fairly extensive experience in using them, all CLIs look much alike; but the balance that Stephen Bourne's little language and the tools that grew around it achieved between the competing pressures of usability at the command prompt and capability in scripting has, in my opinion, still never been bettered.



  • @ScholRLEA said:

    We are an industrial field working through artisanal methods (at best).

    In my view, that's almost entirely due to the rapidity with which expectations about what software can do are changing. Not even materials science advances as quickly as the capabilities of software systems. There simply isn't time to formulate, let alone codify and promulgate, standard and well-tested ways of dealing with everything that software gets applied to.

    That said, the software industry has actually done an astoundingly good job at attempting that Augean task. Software engineering as a discipline has grown tremendously over the thirty-five years that I've had a professional reason to pay attention to it; we're way better at doing loads of things these days.

    But that doesn't change the fact that software system complexity is still growing pretty much exponentially; the amount of newness for which only native cunning and gut feel is of any use continues to outstrip our attempts to order and contain it, and my sense is that the resulting aggregate technical debt is already quite perilously close to the limits of what we, as a culture, will be able to sustain.

    Or maybe everybody approaching retirement age has always felt that way. Hard to say.



  • @boomzilla said:

    Most people, even programmers, aren't going to understand the full stack of stuff they're using.

    That, in particular, is something that has definitely changed over the years I've been working in this business. When I was getting my start, full-stack understanding was something that any competent project manager would certainly have been expected to have.



  • True, but that's sort of like discussing Merlin engines in fighter aircraft - it was the pinnacle of development for its line, but appeared at a time when a whole new approach to the problem was on the horizon. The difference is, Bourne shell stuck around mainly because the new solution (GUIs) was often mishandled in such a way that it didn't readily cover all cases. Specifically, most GUIs never handled automation right, so there remained a need for scripting languages that dropped down into a text shell to handle that scenario. Given that at least three GUIs did solve it - MacOS (at least while they still had HyperCard), Viewpoint (which had Smalltalk underneath it), and LispM Dynamic Windows - this failure had less to do with GUIs and more to do with a lack of foresight on the part of the implementors.


  • ♿ (Parody)

    @flabdablet said:

    @boomzilla said:
    Most people, even programmers, aren't going to understand the full stack of stuff they're using.

    That, in particular, is something that has definitely changed over the years I've been working in this business. When I was getting my start, full-stack understanding was something that any competent project manager would certainly have been expected to have.

    But there's a difference, I think, in having a familiarity or understanding of the basic principles vs understanding it well enough to implement it. Just like the process of mining ore and refining it before making it into the little cylinder that holds the eraser on a pencil. Watching a show like How It's Made shows me how little I really know about so many things that go on.

    But, yeah, with the growth over the years, I agree that people used to have detailed knowledge of a much larger percentage of the system.



  • @ScholRLEA said:

    this failure had less to do with GUIs and more to do with lack of foresight on the implementors.

    ... and nothing to do with the qualities (if any) or failings of the Bourne shell.



  • @ScholRLEA said:

    Specifically, most GUIs never handled automation right

    I'd argue that this is inherent in the form. GUIs are by their nature event-driven and user-controlled: the machine itself has far less say over why things happen in a typical GUI session compared to a CLI session, because the control actions seen by the machine are so much less rich.

    In a CLI, the typical atom of control is something akin to a whole English sentence. There's a verb (the command), a bunch of nouns (the arguments) and a bunch of adjectives and adverbs (options) all tightly bound together. The output produced by any given command is constrained by the construction of the command itself, and can be specifically restricted in order to make it suitable as fodder for something else. Automating a CLI is a relatively simple matter of adding some control flow constructs and a bit of general-purpose inter-command glue, and the resulting scripts are easily comprehensible to people already familiar with the CLI as CLI.
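
    To make that concrete, here's a small sketch - in Python rather than sh, purely so each moving part is spelled out - of the kind of glue I mean: each command is a verb plus nouns plus options, its output is plain text, and ordinary control flow stitches them together. The choice of grep, wc and the "TODO" pattern is arbitrary; they're just stand-ins for whatever commands you'd actually chain.

    ```python
    # Composing CLI commands with ordinary control flow: run one command,
    # parse its plain-text output, and feed the pieces to another command.
    import subprocess

    # verb: grep, options: -r -l, nouns: the pattern and where to look
    result = subprocess.run(
        ["grep", "-rl", "TODO", "."],
        capture_output=True, text=True,
    )

    files = [f for f in result.stdout.splitlines() if f]
    if files:
        for f in files:
            # feed each matching file to another command: wc -l
            count = subprocess.run(["wc", "-l", f], capture_output=True, text=True)
            print(count.stdout.strip())
    else:
        print("no TODOs found")
    ```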

    From the machine's point of view, instructions arriving via a GUI are almost completely formless. To the greatest extent possible, decisions about what to do next are pushed back to the user. User gestures that do stuff result in graphical updates designed to make sense to users, making them hard to machine-parse; even the rather gussied-up output from something like DOS's DIR command is trivially easy by comparison. There's just not much there to hook into control flow constructs with.

    This means that automation of the actual GUI is not really terribly useful. For an automation capability to be of any use, you need to automate the controller behind the view, and that has to involve exposing a fair amount of both that controller and the model it controls to the person doing the automating; unsurprisingly, the resulting aggregation of controls generally yields something resembling a scripted CLI more than anything else.



  • @Mikael_Svahnberg said:

    Yes. You don't teach Chomsky's Theory of Language by waving your hands in the air.

    Waving your hands in the air would be far more useful to the students, though.

    Protip: any theory which contains the phrase "deep deep structure" probably isn't as well thought-out as it should be.



  • @ScholRLEA said:

    Given that at least three GUIs did solve it - MacOS (at least while they still had Hypercard), Viewpoint (which had Smalltalk underneath it), and LispM Dynamic Windows - this failure had less to do with GUIs and more to do with lack of foresight on the implementors.

    AppleScript would be a much better example than HyperCard, but either works.

    You're also missing macro recorders common in Office products of a certain era (both ClarisWorks/AppleWorks and Microsoft Works had very good GUI macro recorders), and while VBScript/JScript is more "command line-y" it's worth a mention, too, since at one point they also shipped with GUI macro recorders.



  • @flabdablet said:

    I'd argue that this is inherent in the form.

    It's not. Apple proved so when they implemented AppleEvents and AppleScript way back in the early-90s.

    Pro-tip: once your AppleScript was written, executing it didn't even require drawing windows on the screen. Not like HyperCard, or a Microsoft Works macro.

    The key is to add an abstraction level between the user's action in the application and the application's response, make the events in the abstraction layer scriptable, and add a sort of "dictionary" defining the operations and what parameters they can use.

    For example, a proper MacOS 7.x application, when the user selected "Bold" from the menu bar, would send an AppleEvent to itself instructing itself to bold the selection. Since the OS routed AppleEvents, the Macro Recorder could simply record this user action to later create a script from it. Since the application got a high-level instruction like "bold selected text" instead of "simulate a mouse click at x:316, y:126", the script could be later executed without even bothering to draw the window on the screen, and without worrying that a user's rogue mouse movements during script execution would ruin everything.

    ... in short, it's a solved problem.
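
    If it helps, here's a toy sketch of the shape of that pattern - invented names throughout, nothing like the real AppleEvents or AppleScript APIs - just to show how the GUI, the recorder, and a script can all go through the same high-level events:

    ```python
    # Toy version of the pattern described above: the GUI handler and a script
    # both send the same named, high-level events; a recorder can capture them
    # and replay them against the model without any window at all.
    class Document:
        def __init__(self):
            self.text = "hello"
            self.bold = False

    # the "dictionary": event name -> operation on the model
    EVENTS = {
        "set-bold": lambda doc, on: setattr(doc, "bold", on),
        "set-text": lambda doc, s: setattr(doc, "text", s),
    }

    recorded = []

    def send_event(doc, name, *args):
        recorded.append((name, args))   # the macro recorder hooks in here
        EVENTS[name](doc, *args)        # then the operation runs on the model

    # "User clicks Bold in the menu" and "a script sets bold" are the same call:
    doc = Document()
    send_event(doc, "set-bold", True)
    send_event(doc, "set-text", "hello, world")

    # replaying the recording against a fresh document needs no window at all
    doc2 = Document()
    for name, args in list(recorded):
        EVENTS[name](doc2, *args)
    print(doc2.text, doc2.bold)         # hello, world True
    ```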



  • @blakeyrat said:

    The key is to add an abstraction level between the user's action in the application and the application's response, make the events in the abstraction layer scriptable, and add a sort of "dictionary" defining the operations and what parameters they can use.

    Quite so. As I said, automating the view is far less useful than automating the controller, which generally requires exposing details of both controller and model.

    And of course what you end up with from doing that is scripts in an ordinary text-based programming language that bears no resemblance at all to the GUI itself. Writing those scripts then becomes a completely separate skill from operating the interface, which means that most users never do actually write any scripts.

    The CLI certainly takes more effort to learn than a GUI, but the incremental amount of learning required to implement scripts on top of a good CLI is so small that most CLI users can and do automate stuff as a matter of course.



  • @flabdablet said:

    And of course what you end up with from doing that is scripts in an ordinary text-based programming language that bears no resemblance at all to the GUI itself.

    Not true, that's where the macro recorder comes in handy.

    @flabdablet said:

    Writing those scripts then becomes a completely separate skill from operating the interface, which means that most users never do actually write any scripts.

    Possibly true. But simple recorded scripts can be created/used by anybody. And it only takes a small amount of knowledge to edit a recorded script to, for example, add a loop to it.

    @flabdablet said:

    The CLI certainly takes more effort to learn than a GUI,

    Part of that is because the popular ones around now are really, really shitty. Not because that's a property inherent to CLIs.

    @flabdablet said:

    but the incremental amount of learning required to implement scripts on top of a good CLI is so small that most CLI users can and do automate stuff as a matter of course.

    Unless you want to script something dealing with audio, video, browsing the web, or any of a billion tasks the CLI is fucking terrible at.

    PowerShell makes an effort to be better at it (at least it throws away the moronic notion that every type of data is text or can be meaningfully converted to text), but there's still a long, long, long way to go before any CLI is going to be useful to, say, a DJ.



  • @blakeyrat said:

    at least it throws away the moronic notion that every type of data is text or can be meaningfully converted to text

    Except when it doesn't.



  • Except that blakey is right. scripting stuff under System 7/8/9 was exactly as simple as clicking on the "record" button and going about doing what you wanted to do. You could then manually edit your script if needed, but most of the time you didn't need to. See also Automator on OSX, which is the more or less "up to date" version.

    Also, almost anyone who resorts to using sh for scripting when they don't have to worry about running their script on a system where they can't install or use something more sane is a fucking retard.

    [edit] - "use". fuck this keyboard.



  • @flabdablet said:

    Except when it doesn't.

    Oh well that one WTF post which frankly I don't even understand is a good enough reason to completely ignore all PowerShell innovations and give up entirely and become a caveman.

    Seriously, is that your point in posting that? If not, what was?


    Look, let's cut to the chase: have you even used MacOS 7, 8, 9? If so, did you use the Macro Recorder in them? How about VBScript? And VBScript's macro recorder? Do you even have the foundational knowledge needed here to talk about GUI scripting?

    Because I don't believe you do.



  • Just as a gentle reminder that moronic notions cannot be handwaved out of existence by innovation. Moronic notions are actually among the most resilient features of the software industry, and more and more of them appear every year.



  • @blakeyrat said:

    have you even used MacOS 7, 8, 9? If so, did you use the Macro Recorder in them? How about VBScript? And VBScript's macro recorder? Do you even have the foundational knowledge needed here to talk about GUI scripting?

    Yes, occasionally, yes, occasionally, yes and yes.



  • Uh ok? So you've given up on the conversation entirely, then? Or...?



  • I don't think it was too bad; the guy is just really ~~stupid~~ bad at understanding anything. If he did, he would have written something like the code you posted.

    @blakeyrat said:

    one WTF post which frankly I don't even understand

    Are you the guy sitting 20 feet away from me, about whom said WTF was written?

    Seriously, blakey, second time today about the whole "operate a comptuer" thing...



  • @rc4 said:

    Are you the guy sitting 20 feet away from me, of whom said WTF was written?

    I dunno. I'm anonymous and so are you. But probably not, since I haven't done any PowerShell recently.

    @rc4 said:

    Seriously, blakey, second time today about the whole "operate a comptuer" thing...

    Look, I read the post, there's no explanation of what's wrong, so I don't understand what's wrong. Why is that so surprising? I barely even know PowerShell, other than some playing around with it. If you want to explain it, please be my guest and do so. Otherwise, posting that shit is completely unhelpful.



  • The guy was attempting to:

    1. Run a command that searches for "A"
    2. Store that output in a variable, and search it to see if it contains "A"
    3. If it matches, do a thing

    The thing about that command is that it will always output what you searched for, so it will always match A, and therefore his "if" was broken.
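
    If it helps, here's a tiny self-contained re-creation of that mistake in Python (the real thread involved a PowerShell script wrapping a Windows command; the command is just simulated with a plain function here, so all the names are made up):

    ```python
    # Minimal re-creation of the broken "if" described above: the command's
    # output always echoes the search term, so checking the whole output for
    # the term always succeeds.
    def simulated_search(term):
        """Stand-in for a command that echoes its own search term in a header,
        even when it finds nothing."""
        return "Searching for: {}\n0 results found.".format(term)

    term = "A"
    output = simulated_search(term)

    # The broken check: the echoed header always contains the term.
    if term in output:
        print("'match' found (spuriously - this branch always runs)")

    # A sounder check looks only at the result lines, not the echoed header.
    results = [l for l in output.splitlines() if not l.startswith("Searching for:")]
    if any(term in l for l in results):
        print("real match found")
    else:
        print("no real match")
    ```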



  • And the other thing, the one that made me laugh, was that MS has apparently not provided a cmdlet that does what cmdkey does, instead advising people who need that functionality to invoke cmdkey itself from their PowerShell scripts and fartarse about with parsing its not-at-all-machine-friendly output.

