So this ancient DEC Alpha server hadn't been backed up lately...



  • ...and by lately, I mean like nearly ever, except an old DDS-3 tape that probably succumbed to bit rot long ago.

    This server's running OpenVMS, connected to an internal network, and has FTP enabled with no path restrictions, but the bizarre format it uses for paths means that automated FTP stuff won't do much with it. It's rather a pain to actually do anything on it; much nicer when the files are copied to Windows -- assuming, that is, that they're not in proprietary formats that only it understands. (Luckily it's not really used anymore, except when we occasionally need to pull some historical data that only it understands off of other equally old DDS-3 tapes that may or may not still work. Believe you me it was a huge whoopee day when I discovered that it had FTP, so that after I exported the old data to .CSV -- assuming the tape could be read, that is -- I then could FTP it onto my Windows PC. Before that, I was using copy-and-paste to transfer data a screen at a time.)

    Did I say the format was bizarre? It's not so much bizarre as it's just completely different from anything else you've ever seen. The root is named 000000. Folders show up as FOLDER.DIR files (oh, and the root contains 000000.DIR -- a link to itself). Paths are given as one or more folder names (not including .DIR), separated by . and enclosed in square brackets -- CD [000000.FOLDER], or just CD [FOLDER] -- the root folder is the default. Paths can also be relative to the current directory: CD [.FOLDER], or to the parent directory: CD [-.FOLDER] (or to the grandparent, CD [--.FOLDER]... etc.). Files have extensions: HELP.README, CONFIG.TXT, and the OS supports automatic versioning (CONFIG.TXT;1, CONFIG.TXT;2, CONFIG.TXT;3), but this one's only configured to keep 1 version so that's irr:elephant:.
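    Just to pin down the syntax rules above, here's a tiny illustrative sketch in Python. Nothing like this runs on the server -- the helper names are made up -- it just encodes the rules as described: [000000] is the root, folder names are joined with dots, a leading dot means "relative to here", and each leading - steps up one parent.

```python
# Hypothetical helpers that encode the OpenVMS directory syntax
# described above. Purely illustrative.

def vms_dir(*folders):
    """Build an absolute directory spec, e.g. [000000.FOO.BAR]."""
    names = ("000000",) + tuple(f.upper() for f in folders)
    return "[" + ".".join(names) + "]"

def vms_relative(up, *folders):
    """Build a relative spec: up=0 -> [.FOO], up=1 -> [-.FOO], up=2 -> [--.FOO]."""
    names = ".".join(f.upper() for f in folders)
    return "[" + "-" * up + "." + names + "]"

print(vms_dir("alpha", "beta"))    # [000000.ALPHA.BETA]
print(vms_relative(0, "folder"))   # [.FOLDER]
print(vms_relative(2, "folder"))   # [--.FOLDER]
```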

    Enter DOS batch script, pounded out yesterday and left to run overnight...

    @echo off
    setlocal enabledelayedexpansion
    
    set "local_root=%userprofile%\My Documents\Druidia - Backup"
    set "remote_root=000000"
    set "errors=%userprofile%\Desktop\errors.txt"
    
    set "ftp_server=192.168.0.100"
    set "ftp_username=roland"
    set "ftp_password=12345"
    set "ftp_commands=%temp%\commands.ftp"
    
    if exist "%errors%" del "%errors%"
    if not exist "%local_root%" mkdir "%local_root%" 2>nul 
    cd "%local_root%" 2>nul || (
        echo Unable to create folder %local_root%>>"%errors%"
        notepad "%errors%"
        exit /b
    )
    set "local_root=%cd%"
    
    echo open %ftp_server%>"%ftp_commands%"
    echo %ftp_username%>>"%ftp_commands%"
    echo %ftp_password%>>"%ftp_commands%"
    echo bin>>"%ftp_commands%"
    echo cd [%remote_root%]>>"%ftp_commands%"
    echo mget *>>"%ftp_commands%"
    echo bye>>"%ftp_commands%"
    
    ftp -i -s:"%ftp_commands%"
    if exist "%remote_root%.dir" del "%remote_root%.dir"
    
    for /r %%f in (*.dir) do (
        if /i "%%~xf" == ".dir" (
            cd "%%~dpf"
            set "f=!cd!"
            set "f=!f:%local_root%\=!"
            set "f=!f:%local_root%=!"
            if "!f!" neq "" set "f=!f:\=.!."
            
            del "%%~nxf"
            mkdir "%%~nf" 2>nul && (
                cd "%%~nf"
                
                echo open %ftp_server%>"%ftp_commands%"
                echo %ftp_username%>>"%ftp_commands%"
                echo %ftp_password%>>"%ftp_commands%"
                echo bin>>"%ftp_commands%"
                echo cd [!f!%%~nf]>>"%ftp_commands%"
                echo mget *>>"%ftp_commands%"
                echo bye>>"%ftp_commands%"
                
                ftp -i -s:"%ftp_commands%"
            ) || echo Failed to create %%f>>"%errors%"
        )
    )
    
    del "%ftp_commands%"
    if exist "%errors%" notepad "%errors%"
    

    So this morning it seems to have run successfully, and the destination folder has a bit over 11 GB of data. :)

    It does have one potential issue: if both a FOO.DIR folder and a FOO file existed in the same directory, then locally it'd try to create a FOO folder and fail because there's already a file by the same name, but I wrote it to log the error and move on. (I got no such errors logged.) I could've fixed this by having it include .DIR on the local folders it created, but then I'd have A.DIR\B.DIR\C.DIR locally where the original path had just A.B.C. Or I suppose I could've had it rename the conflicting file. Anyway, considering that it didn't actually run into that situation, I left it alone.
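    For comparison, here's what the same traversal might look like as a Python ftplib sketch, with the rename-on-conflict fix mentioned above (appending .DIR locally only when a plain file has already claimed the name, so clean paths stay clean). This is untested against a real VMS box -- the assumption that folders show up in listings as names ending in .DIR is taken from the description earlier in this thread, and the server address and credentials are the anonymized ones from the script.

```python
# Hedged sketch: recursive FTP mirror of a VMS tree, in Python instead
# of batch. Untested against a real OpenVMS FTP server.
import ftplib
import os

def safe_local_name(name, taken):
    """If a plain file already claimed this name locally, store the
    folder as NAME.DIR instead of failing -- only on actual conflict."""
    return name if name not in taken else name + ".DIR"

def mirror(ftp, vms_path, local_dir):
    os.makedirs(local_dir, exist_ok=True)
    ftp.cwd("[" + vms_path + "]")
    names = ftp.nlst()
    files = [n for n in names if not n.upper().endswith(".DIR")]
    dirs = [n[:-4] for n in names if n.upper().endswith(".DIR")]
    for f in files:
        with open(os.path.join(local_dir, f), "wb") as out:
            ftp.retrbinary("RETR " + f, out.write)
    for d in dirs:
        if d.upper() == "000000":   # the root's self-link -- skip it
            continue
        sub = safe_local_name(d, set(files))
        mirror(ftp, vms_path + "." + d, os.path.join(local_dir, sub))

# usage sketch (addresses/credentials as anonymized in the batch script):
# ftp = ftplib.FTP("192.168.0.100", "roland", "12345")
# mirror(ftp, "000000", r"C:\Backup")
```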

    Yes, of course I anonymized it...



  • So have hackers from Planet Spaceball stolen your air supply yet?



  • @mott555 said in So this ancient DEC Alpha server hadn't been backed up lately...:

    So have hackers from Planet Spaceball stolen your air supply yet?

    Not yet. Why do you think I wanted to make the backup? Sheesh.



  • @anotherusername 11 GIGABYTES on that ancient DEC server? How?

    Or is it like 1 GB repeated 11 times because of how the folders were traversed?



  • @blakeyrat It's not that ancient. The hard drive appears to be about 17 GB.

    Actually it looks like it probably skipped all of the files that were in use...



  • @anotherusername We have very different definitions of "ancient".



  • @blakeyrat well, it's really not as ancient as it feels... the obscure, peculiar OS makes it feel more like something from the 80s than something that was produced shortly before Y2K. Still, 15+ years is a long time in computer years.



  • Will someone please close the wormhole leading to 1998?



  • @ScholRLEA Quick, warn them about the things to come first!


  • Discourse touched me in a no-no place

    @theBread said in So this ancient DEC Alpha server hadn't been backed up lately...:

    Quick warn them about the things to come first!

    Just tell them to be very wary of giving any money to AOL or Yahoo!


  • Impossible Mission - B

    @dkf said in So this ancient DEC Alpha server hadn't been backed up lately...:

    Just tell them to be very wary of giving any money to AOL or Yahoo!

    I see this and the first thing I think of is the movie Frequency...



  • @anotherusername said in So this ancient DEC Alpha server hadn't been backed up lately...:

    Did I say the format was bizarre? It's not so much bizarre as it's just completely different from anything else you've ever seen.

    Interesting. I’ve never seen VMS up close, but I decided to look it up. In short, filenames follow the pattern node::device:[root.][directory-name]filename.type;version where the square brackets don’t mean “optional” (as they do in Unix documentation) but “directory name”.

    I wouldn’t say “bizarre” but “unusual to modern eyes”, because it’s just different enough from the modern Unix–CP/M way that we’re probably all used to. It also kind of reminds me of when I was tinkering with RISC OS a few years ago and kept getting slightly baffled by the filenames.
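    A rough regex sketch makes the pieces of that filespec pattern visible. This is simplified -- it ignores some legal VMS forms (logical names, the <> directory delimiters, relative version numbers) and is only meant to illustrate the node::device:[directory]name.type;version shape quoted above.

```python
# Simplified, illustrative parser for the VMS filespec pattern
# node::device:[directory]name.type;version. Not a complete grammar.
import re

FILESPEC = re.compile(
    r"^(?:(?P<node>[A-Z0-9$_]+)::)?"    # NODE::
    r"(?:(?P<device>[A-Z0-9$_]+):)?"    # DEVICE:
    r"(?:\[(?P<directory>[^\]]*)\])?"   # [DIR.SUBDIR]
    r"(?P<name>[A-Z0-9$_]+)"            # NAME
    r"(?:\.(?P<type>[A-Z0-9$_]*))?"     # .TYPE
    r"(?:;(?P<version>\d+))?$",         # ;VERSION
    re.IGNORECASE,
)

m = FILESPEC.match("HUEY::DISK1:[HOME.GURTH]LOGIN.COM;3")
print(m.group("node", "device", "directory", "name", "type", "version"))
# ('HUEY', 'DISK1', 'HOME.GURTH', 'LOGIN', 'COM', '3')
```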



  • @Gurth ...uh... weird. Okay, I'm uncertain on the difference between the "root" and the "directory" portions.

    Within the root component, a period (.) separates subdirectory names.

    Within the directory component, a period (.) separates subdirectory names.

    The root component contains a series of root subdirectory names

    The directory component contains a series of subdirectory names

    So is there a difference between "root subdirectory names" and "subdirectory names"? It seems not to really answer that.

    I'm thinking, as a wild guess, that it might be a way to specify the "working directory" within the filename... but... yeah, I dunno.



  • @Gurth said in So this ancient DEC Alpha server hadn't been backed up lately...:

    I wouldn’t say “bizarre” but “unusual to modern eyes” because it’s somewhat different enough from the modern Unix–CP/M way that we’re probably all used to.

    I did kind of say that...



  • @Gurth oh, you know what's even more fun? You can CD to folders that do not exist -- there's zero checking for it actually existing, just that it parses okay. Only at the point where you actually try to -- say -- list the files in the folder, will it actually tell you that the current working directory does not in fact exist.



  • @anotherusername said in So this ancient DEC Alpha server hadn't been backed up lately...:

    So is there a difference between "root subdirectory names" and "subdirectory names"? It seems not to really answer that.

    I’m guessing it’s somewhere else in the documentation, but I’m not going to dig through it. However, Wikipedia to the rescue:

    In the VMS operating system, the term "root directory" is used to refer to the directory in which all the user's files are stored, which is what Unix calls the "home directory". The equivalent of a MS-DOS per-disk "root directory" in VMS is referred to as a "Master File Directory" and is specified as [000000]

    So once again, we’re fooled by VMS using a different meaning to what we expect from current systems. If I got this right, disk1:[home.gurth][Documents.Stuff] would be equivalent to /home/gurth/Documents/Stuff on a *NIX system or C:\Users\gurth\Documents\Stuff on Windows. I get the impression that [000000] is used to represent that you’re at the level of / or C:\.
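    That mapping could be sketched like this -- purely illustrative, with the caveat that real VMS-to-Unix path translation has many more rules (logical names, rooted directories, and so on) than this toy handles:

```python
# Toy sketch of the mapping described above: device -> leading path
# component, bracketed directory lists -> slash-separated folders,
# [000000] -> "the top". Illustrative only.
def vms_to_posix(spec):
    device, _, rest = spec.partition(":")
    dirs = []
    while rest.startswith("["):
        inside, _, rest = rest[1:].partition("]")
        if inside != "000000":          # [000000] just means the top level
            dirs += inside.split(".")
    tail = [rest] if rest else []       # whatever follows is the filename
    return "/" + "/".join([device] + dirs + tail)

print(vms_to_posix("disk1:[home.gurth][Documents.Stuff]"))
# /disk1/home/gurth/Documents/Stuff
```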

    @anotherusername said in So this ancient DEC Alpha server hadn't been backed up lately...:

    I did kind of say that...

    Yes, you did.

    @anotherusername said in So this ancient DEC Alpha server hadn't been backed up lately...:

    @Gurth oh, you know what's even more fun? You can CD to folders that do not exist -- there's zero checking for it actually existing, just that it parses okay. Only at the point where you actually try to -- say -- list the files in the folder, will it actually tell you that the current working directory does not in fact exist.

    I could see that having some use if there’s a command to create the directory you’re currently in (so you can cd to a non-existing directory and then create it). If there isn’t, I’m somewhat puzzled about why it wouldn’t just check when you change directories.



  • In its day, OpenVMS was one of the most advanced OSes available and blew just about everything else out of the water. It set several standards and principles that are still valid today, as did the hardware and architecture it was built on. DEC as a company failed because it was not commercially motivated. The machines and technology they developed died because Microsoft stopped supporting the Alpha CPU after Windows NT. As a service engineer for DEC Alpha systems, I witnessed millions of dollars' worth of equipment being abandoned, literally, overnight.



  • @Gurth said in So this ancient DEC Alpha server hadn't been backed up lately...:

    the modern Unix–CP/M way

    CP/M gave us drive letters and 8.3 filenames. It gave us neither hierarchical directories nor the /path/to/file convention for representing them. Those didn't even arrive in MS-DOS until version 2.0, by which time Unix had been a thing for over ten years.



  • @loose The rise of inexpensive PCs running MS-DOS, and somewhat more expensive workstations running Unix, pretty much blindsided everyone, including IBM. They didn't even have the IBM PC as part of their strategic planning until after it took off, and when they did start to take it seriously, they assumed they could steer home users back towards using their PCs as glorified remote terminals once they'd stomped out all the other home computer manufacturers; this led directly to the PS/2 series, and cost them their market lead pretty much permanently.

    Also, their primary market was universities, with their lower range being aimed at running control systems for research equipment (such as the ones used for the 'diving monkey' project, which infamously ended when the DEC maintenance tech failed to mount a scratch monkey), while their mainframe line was mostly used for programming classes and computer science research (a few universities also used them for their record keeping and accounting, but that was unusual, and those who did often used a separate system for it).

    They didn't want to go into the home system market (this isn't speculation; Ken Olsen said as much repeatedly in the late 1970s), and assumed that the main area for workstations would be LispMs (most of which were designed to run attached to a PDP-10 or a VAX, conveniently enough for DEC); they didn't notice what Sun, Apollo, and HP were doing with Unix workstations until after the LispM companies tanked.

    When they did go into the markets for PCs and workstations, it was already pretty late in the day (around 1984), and their entry products for those markets - especially the Rainbow and their various diskless workstations - were dismal failures.

    The Alpha was an attempt to leapfrog the competition with a bold new RISC design, but they failed to realize that most of the other workstation companies had already shifted to RISC designs by then, and completely failed to realize that the RISC systems already on the market were getting their lunch eaten by the cheaper x86 systems that Intel had - much to everyone's amazement, including Intel's - managed to push performance up on enough to keep up with systems which, while using far superior architectures, were often manufactured with trailing-edge fab techniques.



  • @ScholRLEA Good story - the monkey thing. And I refuse to be triggered by my memories of just what I have seen the LSI series (03's, 05's, 23's, 73's etc) of PDP's hooked up to in "research environments" because the standard issue boards would have probably been rewired or had the FPLA's (where a lot of the "hardware" was) reprogrammed. And there would almost always be a home grown or severely bastardized IF card.

    Yes the PC has a lot to answer for. :trolleybus:


  • Discourse touched me in a no-no place

    @ScholRLEA said in So this ancient DEC Alpha server hadn't been backed up lately...:

    diskless workstations

    I used those. Swapping over the network is entirely appropriate for the Bad Ideas Thread :arrows:

    Why was it a Bad Idea? Well, imagine you've got a room full of 50 of these things, and a class of 50 undergraduate students comes in and starts using them all at once. Even with fast (for the day) 10-Base-2 networking, you're still going to be stuck with pegging the network traffic for a long time while all those systems thrash to a usable state. Having the swap partition locally makes a big difference.

    Also, as I learned a few years later, having the full OS and applications installed locally (instead of pulling them in via NFS or other networked filesystem) is a good idea as it means that if the fileserver keels over, you don't bring an entire department of 400–500 people to a screeching halt…



  • @dkf said in So this ancient DEC Alpha server hadn't been backed up lately...:

    Swapping over the network is entirely appropriate for the Bad Ideas Thread

    FTFY.

    I agree with Seymour Cray: if you need more RAM, buy more RAM.


  • Discourse touched me in a no-no place

    @flabdablet That's what I do now. But when we were in that long period where technology and economics kept most people below the 32-bit limit…



  • @loose Apparently, I hadn't realized that there were two such incidents around that time, in different projects at different universities. The one I had in mind was the U. of Western Ontario case involving the death of Mabel the Swimming Monkey, but I linked to the case at U. of Toronto by mistake. Both stories are recounted here.



  • @flabdablet said in So this ancient DEC Alpha server hadn't been backed up lately...:

    @Gurth said in So this ancient DEC Alpha server hadn't been backed up lately...:

    the modern Unix–CP/M way

    CP/M gave us drive letters and 8.3 filenames. It gave us neither hierarchical directories nor the /path/to/file convention for representing them. Those didn't even arrive in MS-DOS until version 2.0, by which time Unix had been a thing for over ten years.

    That’s why I called it the “Unix–CP/M” way: most computer users these days are accustomed to CP/M-style drive indicators followed by Unix-style paths.


  • Impossible Mission Players - A

    @dkf said in So this ancient DEC Alpha server hadn't been backed up lately...:

    @ScholRLEA said in So this ancient DEC Alpha server hadn't been backed up lately...:

    diskless workstations

    I used those. Swapping over the network is entirely appropriate for the Bad Ideas Thread :arrows:

    Why was it a Bad Idea? Well, imagine you've got a room full of 50 of these things, and a class of 50 undergraduate students comes in and starts using them all at once. Even with fast (for the day) 10-Base-2 networking, you're still going to be stuck with pegging the network traffic for a long time while all those systems thrash to a usable state. Having the swap partition locally makes a big difference.

    Also, as I learned a few years later, having the full OS and applications installed locally (instead of pulling them in via NFS or other networked filesystem) is a good idea as it means that if the fileserver keels over, you don't bring an entire department of 400–500 people to a screeching halt…

    Yeah. I managed to experiment with iSCSI enough to install Windows 7 on a target and get an arbitrary machine to boot it. It's not terrible if only one machine is on, but more than that...

    Good news is that it's somewhat resilient to network interruptions, so long as the disk queue doesn't fill up before connectivity is restored...

    Actually, there might be some code reuse as that same resiliency is present on a Windows to Go installation...



  • @Gurth said in So this ancient DEC Alpha server hadn't been backed up lately...:

    the “Unix–CP/M” way

    Ah. Now I see what you did there. Well played. Carry on.


  • Discourse touched me in a no-no place

    @anotherusername said in So this ancient DEC Alpha server hadn't been backed up lately...:

    but the bizarre format it uses for paths

    dir I$dont:[understand.what.you]mean.jpg;23

