A level IT



  • Hmm, just found this in my A level IT book; it just shows how fake the subject is:

    Step 3: Retrieving the web page from the web server

    THe server where the website is located receives the request. The server can tell wherer the request originated by looking at information contained in the request header. The server sends a message back indicating: "Website Found. Waiting for Reply". ????
    Somewhere on the server the web page itself will be stored as an HTML text file. The code contained in this text file will define the content and appearance of the web page. It may contain references to URLs where graphics, video or sound files are stored. The server will put the request in a a queue. When it is in a position to repson, the server sends the HTML file to the requester's browser. It also sends a message to those locations where other files e.g. graphics are stored to send the appropriate files to the same browser. ???

    Step 4: Opening the web page

    AS the different parts of the web page arrive at the requester's PC or server, they have to be stored before being reassembled. They are therefore held in the computer's cache memory, a particular type of memory that holds required data.
    As the data is being collected the browser begins putting the page together. It uses the HTML code as a guideline for this process. The different element of the website arrive at different times - larger graphics / animation files arriving later than simple text. When the page is complete , the user generally given a message tyo this effect, "Done".


    WTF. This is supposed to be advanced level IT. How wrong and over-simplified can you get?
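    For contrast, here's roughly what actually crosses the wire (host and paths are illustrative only): the browser sends an HTTP request, the server replies with a status line, headers, and the HTML body, and the browser itself then issues separate requests for any referenced files. A minimal sketch:

    ```python
    # What actually happens: the browser sends an HTTP request, and the
    # server replies with a status line, headers, and the HTML body.
    # There is no "Website Found. Waiting for Reply" message, and the
    # server does not contact other hosts on the browser's behalf -- the
    # browser parses the HTML and requests each referenced file itself.

    request = (
        "GET /index.html HTTP/1.1\r\n"
        "Host: www.example.com\r\n"
        "User-Agent: Mozilla/5.0\r\n"
        "\r\n"
    )

    response = (
        "HTTP/1.1 200 OK\r\n"
        "Content-Type: text/html\r\n"
        "\r\n"
        "<html><body><img src='/logo.png'></body></html>"
    )

    # The real "message back" is a status code, not prose.
    status_line = response.split("\r\n", 1)[0]
    print(status_line)

    # The browser now parses the body, finds /logo.png, and sends a
    # second GET request for it -- the original server never "sends a
    # message" to wherever the image is stored.
    ```

    Even this sketch glosses over DNS, TCP, and caching, but at least the roles of client and server are the right way round.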



  • "advanced" is all relative.

     



  • Well, simplifying is one thing, but most of this is just plain misleading and incorrect terminology.

    I don't think anything like that can be called "advanced".



  • @dhromed said:

    Well, simplifying is one thing, but most of this is just plain misleading and incorrect terminology.

    I don't think anything like that can be called "advanced".

    I understand your frustration, but it's a semantic question; for a layperson this text can be "advanced".

    Whatever the case, it seems to be a WTF anyway.



  • @mgrisoli said:

    @dhromed said:

    Well, simplifying is one thing, but most of this is just plain misleading and incorrect terminology.

    I don't think anything like that can be called "advanced".

    I understand your frustration, but it's a semantic question; for a layperson this text can be "advanced".

    Whatever the case, it seems to be a WTF anyway.



    Yes, I understand and agree.
    Rereading the text does make it seem less weird to me than it did at first glance.

    But certain things can be simplified to the point where they are wrong.

    For example:
    cache memory


    The term "memory" is usually reserved for the actual RAM memory.
    When we speak of browser cache, we refer specifically to space on the hard drive.

    That's the kind of thing that should not, [i]could not[/i], be simplified because it already is as fundamental as possible.
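    To illustrate the distinction (a toy sketch, not how any real browser is implemented): a browser cache is essentially a mapping from URLs to files on the hard drive, so cached pages survive a reboot, which RAM contents would not.

    ```python
    import hashlib
    import os
    import tempfile

    # Toy disk-based cache: entries live as files on the hard drive, not
    # in RAM. Real browsers are far more elaborate (expiry headers,
    # validation, eviction), but the storage medium is the disk.

    cache_dir = tempfile.mkdtemp()

    def cache_path(url):
        # Hash the URL to get a stable on-disk filename.
        return os.path.join(cache_dir, hashlib.sha256(url.encode()).hexdigest())

    def store(url, body):
        with open(cache_path(url), "w") as f:
            f.write(body)

    def fetch(url):
        # Cache hit: read from disk instead of re-downloading.
        path = cache_path(url)
        if os.path.exists(path):
            with open(path) as f:
                return f.read()
        return None  # cache miss: a real browser would fetch over the network

    store("http://example.com/", "<html>hello</html>")
    print(fetch("http://example.com/"))
    ```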


  • @dhromed said:

    @mgrisoli said:

    @dhromed said:

    Well, simplifying is one thing, but most of this is just plain misleading and incorrect terminology.

    I don't think anything like that can be called "advanced".

    I understand your frustration, but it's a semantic question; for a layperson this text can be "advanced".

    Whatever the case, it seems to be a WTF anyway.



    Yes, I understand and agree.
    Rereading the text does make it seem less weird to me than it did at first glance.

    But certain things can be simplified to the point where they are wrong.



    I would agree if the descriptions were accurate but just a little weird and over-simplified.  Except that at least the "Step 3: Retrieving the web page from the web server" section is grossly inaccurate.  Some of the actions ascribed to the server are actually performed by the client (i.e. browser).

    Whoever wrote them did not understand the technology well enough to explain it.

          -dZ.


  • @dhromed said:

    @mgrisoli said:

    @dhromed said:

    Well, simplifying is one thing, but most of this is just plain misleading and incorrect terminology.

    I don't think anything like that can be called "advanced".

    I understand your frustration, but it's a semantic question; for a layperson this text can be "advanced".

    Whatever the case, it seems to be a WTF anyway.



    Yes, I understand and agree.
    Rereading the text does make it seem less weird to me than it did at first glance.

    But certain things can be simplified to the point where they are wrong.

    For example:
    cache memory


    The term "memory" is usually reserved for the actual RAM memory.
    When we speak of browser cache, we refer specifically to space on the hard drive.

    That's the kind of thing that should not, [i]could not[/i], be simplified because it already is as fundamental as possible.

    The computer-proficient (or at least computer-aware) people can differentiate between hard disk and RAM.

    90% of the people I deal with, however, think that it's all "memory". They have 512 memory, and 40 GB's of memory. (Yes, they're the same people who don't know that RAM is usually measured in megabytes, nor do they know what "GB" stands for.)



  • @alias said:

    Hmm, just found this in my A level IT book; it just shows how fake the subject is:


    (For the Americans reading, A levels are a two-year course in the UK, which is normally done between the end of compulsory education and the start of university, nominally age 17 to 18; your grades in them normally form the basis of your university acceptance. They are "advanced" as compared to the GCSEs, which are done at the end of compulsory education. Nothing in the US education system is quite the same - we study three to five subjects specifically, and drop everything else, instead of continuing general education all the way to university)

    The "IT" and "Computing" A levels are notoriously bad. They contain little information of consequence, they are years out of date (often based on 1980s technology), and often they are just plain wrong. Yes, that's right - in order to pass the course, you must write things on the exam paper which are actually incorrect, and which you will have to be untaught if you then take a university degree in the same subject. It looks good on your CV but you won't learn anything on the course.

    The universities are all aware of this. That's why, if you go to university to get a computer related degree, they don't look at your grades in these subjects as being worth any more than another random subject. Instead, they look at your grades in the mathematics courses.



  • @asuffield said:

    @alias said:
    Hmm, just found this in my A level IT book; it just shows how fake the subject is:


    (For the Americans reading, A levels are a two-year course in the UK, which is normally done between the end of compulsory education and the start of university, nominally age 17 to 18; your grades in them normally form the basis of your university acceptance. They are "advanced" as compared to the GCSEs, which are done at the end of compulsory education. Nothing in the US education system is quite the same - we study three to five subjects specifically, and drop everything else, instead of continuing general education all the way to university)

    The "IT" and "Computing" A levels are notoriously bad. They contain little information of consequence, they are years out of date (often based on 1980s technology), and often they are just plain wrong. Yes, that's right - in order to pass the course, you must write things on the exam paper which are actually incorrect, and which you will have to be untaught if you then take a university degree in the same subject. It looks good on your CV but you won't learn anything on the course.

    The universities are all aware of this. That's why, if you go to university to get a computer related degree, they don't look at your grades in these subjects as being worth any more than another random subject. Instead, they look at your grades in the mathematics courses.


    I just sat the exam for A level IT; it is a complete joke compared to the other, more intense A level courses I take. I haven't actually learnt anything, except to disregard anyone who considers it a proper A level. Thank God I can drop it.



  • @Manni said:

    The computer-proficient (or at least computer-aware) people can differentiate between hard disk and RAM.

    90% of the people I deal with, however, think that it's all "memory". They have 512 memory, and 40 GB's of memory. (Yes, they're the same people who don't know that RAM is usually measured in megabytes, nor do they know what "GB" stands for.)



    Let them learn!

    Even more because this is course material.


  • I was told to stay away from IT at A level and go for the maths instead, 'cause, as you say (correctly), it's not worth it. Most places aren't interested in IT; they look for Computing, and it's been like that for the last few years. (I'm just adding this in, in case anyone was still in doubt about what a bad idea IT is.)

    And most courses teach VB. gaaaaah. Brain cringe.

    Of course, if you want to "improve the nation's schools", you can spend some money on actually improving them, or you can change the metric by which schools are rated. So we get subjects like IT that are ridiculously easy to pass, often incorrect, and just click-click-click-do-this-do-that without needing to understand what you're doing; they're often only chosen because people want to piss about during the classes. But the government can point at them and say "See, students are now passing exams, so we must be doing something right!" and use nice figures above 90% for pass rates, whatever those are meant to mean these days.

    I sure am annoyed these days. Maybe it's the Further Maths...



  • Some schools have a computer science A level course, is that worth it?



  • So, in short, the A levels are the UK's equivalent of bad Community/Junior colleges in the US.



  • @merreborn said:

    So, in short, the A levels are the UK's equivalent of bad Community/Junior colleges in the US.

    Not all of the A-levels are worthless. Maths is still OK, for example - but it's probably not long until that gets dumbed down into uselessness...



  • @merreborn said:

    So, in short, the A levels are the UK's equivalent of bad Community/Junior colleges in the US.


    They did at least say that it's only the computer-related A-levels that are that bad; thank goodness for that.

    But even some of our bad community/junior colleges are better than that. The one I went to only had utterly false information in the A+ Certification course. (Leading me to believe that A+ is the Certification equivalent of the GED. Even Linux+ is less shite than that.)

    For instance: Did you know that heavier power supplies are better than lighter ones because they have more transistors? It's got to be true, it's in print! ;)



  • I guess I should be glad I'm not in the UK.



  • I wouldn't call it advanced, but it seems about right for someone just out of high school. I don't know if I believe this is a word-for-word copy of the original text (because there are many really bad typos and sentences that aren't sentences); I don't think a published book would have typos that bad. The only thing I thought was really stupid was that it referred to "cache memory", which is a completely different cache from the browser's cache. As far as step 3 goes, it all seems right to me except that last sentence, which I can't even tell what it is trying to say. Are you sure you typed it exactly the same?



  • @asuffield said:


    The universities are all aware of this. That's why, if you go to university to get a computer related degree, they don't look at your grades in these subjects as being worth any more than another random subject. Instead, they look at your grades in the mathematics courses.


    So are employers. For a junior-level post we'd generally be looking at A-levels with reasonable grades. Pure maths, applied maths or physics are a good sign; statistics is another good one (especially for the financial arena); computing is largely speaking worthless. After that, you start looking at the extra-curricular and non-scientific stuff: philosophy is a very good sign, and good marks in modern (and especially "classical", although the number of students studying Latin these days is lamentably low) languages are another good sign.

    For a slightly more advanced junior post we'd be looking at someone with a degree, preferably something analytical but not necessarily computing.

    The majority of computing students, at both A and degree level, are taught so many antipatterns that it's necessary to make them unlearn pretty much everything they have been taught before you can let them rip on production code without heavy mentoring.  At least A level students don't have expectations that they will be earning 100K within 3 months.

    There's room for apprenticeships in this business.

    Simon

