Linux is SO MUCH MORE superior!



  • Okay, so I grew up on Linux - grew up in the sense that it was my OS of choice up until a few years ago. For work I have a Dell (TRWTF?) laptop that's been happily running Ubuntu 10.04 since before I started work here back in May. Last week I had an issue shutting down the laptop which forced me to power off the machine before it could be shut down properly. It should also be noted that I have a 3TB external plugged in that is used for backups of both the laptop and our servers. Ever since that hard power off, my laptop would not shut down properly - always having to resort to a hard power off. Yesterday I got to work and powered up my machine, and to my surprise my wireless didn't work. The kernel modules were being loaded, and dmesg offered very little as to why the driver was being marked "disabled" by the network applet. I did some googling, followed many suggestions, and eventually realized that dpkg was hanging (it wouldn't even respond to a kill -9 - for those not in the immediate "know", -9 is the terminate-with-prejudice signal, and the kill command is used to, um, kill a process).
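
    A concrete sketch of that last bit - a sleeping process stands in for the hung dpkg, and all names here are invented for illustration:

```shell
# SIGKILL ("kill -9") cannot be caught, blocked, or ignored by the target.
sleep 300 &                # stand-in for the hung dpkg process
PID=$!
kill -9 "$PID"             # equivalent to: kill -KILL "$PID"
wait "$PID" 2>/dev/null    # reap it; a killed process exits with status 137 (128+9)
echo "pid $PID is gone"
```

    (The one case even a -9 can't reach is a process stuck in uninterruptible disk sleep - state "D" in ps - until its I/O completes, which is pretty much what a dpkg wedged on a sick disk looks like.)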

    So I did some more googling, and even more googling. I finally found that the unattended-updates init script was hanging on the shutdown - but not specifically why (at least, not immediately). I assume that the unattended-updates script utilizes dpkg, or some facility of it, and there was some vague correlation between the two. It should also be noted that I have not been able to fix my wireless, because I have been unable to update the wireless driver/firmware - and I still do not know why it mysteriously stopped working. Finally, I stumbled across this post, which got me to thinking... mostly about my USB drive that sat quietly in the corner. I opened up the file manager, and to my disgust I couldn't get a directory listing of the device. I dropped down to the terminal (think DOS shell for you Windows guys), and found that the drive was not even mounted. I took another look at dmesg (btw, dmesg is a utility that displays OS messages, mostly from boot, but not exclusively) and found that the filesystem was corrupted. Apparently it was so bad that just unplugging the device did not release the OS's hold on it (to free up the sync call), so I had to reboot (without it being plugged in).

    My recourse? Upgrading the OS and hoping that whatever was broken gets magically fixed. As for the external? I haven't gotten to a point where I can try anything - because I'm waiting on the 1,670 packages to download for the OS upgrade - but I fear that I'm going to have to re-format (losing all prior backups) and start over again. My gripe? Linux boasts being such a superior OS, yet something as simple as a corrupted external hard drive can bork a system up so bad that the wireless driver stops working (yes, that's an assumption, but I have little else to go on).

    Sigh. I miss my Macbook.



  •  I've had the same (or a closely similar) prob with Ubuntu 10.04 - though in my case, I have no external drive to blame (I do use thumb drives, so there might be a connection). Suddenly it went from a 30-sec shutdown to 5+ minutes. The thing that had happened most recently before was that I had run some Recommended Updates. I tried Googling, and there was very little being reported. The reports that existed were quite old and babbled about ACPI (which was deffo Not The Problem, because I disabled it at boot without any change in behaviour resulting). No fixes being reported.

    After several weeks, and applying every fix that came through, with no results, I added up all the stuff about Ubuntu that I don't like (being forced on to Unity or Gnome 3), and installed Mint 11 (that is, Ubuntu done properly) instead. Since my /home is on its own partition, that was a painless 20 mins.

    All the probs went away, and I am much happier.



  • @dohpaz42 said:

    Linux boasts being such a superior OS, yet something as simple as a corrupted external hard drive can bork a system up
     

    What filesystem are you using on the external hard disk? Oh, and if you'd just update your fsck and kernel packages (and their dependencies) it might be a simple improvement - with only a one- or two-digit number of packages ...

    Or did you just want sympathy and no help? Sorry, then, about that.

     



  • @flop said:

    @dohpaz42 said:

    Linux boasts being such a superior OS, yet something as simple as a corrupted external hard drive can bork a system up
     

    What filesystem are you using on the external hard disk? Oh, and if you'd just update your fsck and kernel packages (and their dependencies) it might be a simple improvement - with only a one- or two-digit number of packages ...

    Or did you just want sympathy and no help? Sorry, then, about that.

     

    No, no sympathy. As for help, I've found the solution in that the problem is the corrupted USB drive. As for the number bump - eh, if I have to update, and there is an upgrade available, I might as well upgrade. My complaint is centered around how two completely unrelated things can break each other (the USB filesystem corruption has nothing to do with the wireless driver; or, it shouldn't), and around the fact that the problem was buried so deep under the covers (found using dmesg, but only when specifically looking for it) and not reported to me in a more meaningful way (i.e., a popup that says the drive could not be mounted or whatever). But yes, as you infer, I just wanted to complain. :)



  • Thanks for explaining the Linux-specific terms in your post, much appreciated.

    Nothing else to add, except: I don't like Linux much when it's working right, I can't imagine what it's like when it's barfing up its own guts. Remember Mac Classic? The entire OS was like 4 files? Those were the days... none of this 56,000 tiny text and .exe files scattered all over God-knows-where shit...



  • @blakeyrat said:

    Thanks for explaining the Linux-specific terms in your post, much appreciated.
     

    Damn, man, you beat me to it.



  • Sync operations on Linux have been a little off for the last couple of years. For me the problem was usually that ongoing writes in another process would cause sync() to stall indefinitely, waiting for all writing to stop instead of just the writes that were pending when sync was called. So if you had some sort of backup or other large file copy going on, even at low I/O priority, dpkg, Firefox, and such would often freeze for very long periods of time. dpkg calls sync once after every file written, as if that would help much if the power went out when a package was only half installed because it took too long, and Firefox would sync after every sqlite transaction, as if the user would care about losing the last couple seconds of browsing history in an outage.
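
    The scope difference is visible from the shell (the file name is invented; dd's conv=fsync flushes just the one file, while sync waits on everything):

```shell
# Per-file flush: dd's conv=fsync calls fsync(2) on just this one file,
# so it only waits for these 16 KB to reach the disk.
dd if=/dev/zero of=/tmp/demo.bin bs=1k count=16 conv=fsync 2>/dev/null

# System-wide flush: sync(1) waits on *all* pending writeback,
# including any unrelated bulk copy that happens to be running.
sync
echo flushed
```

    That system-wide scope is exactly why an unrelated low-priority backup job can stall every program that calls plain sync().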



  • FWIW: Dell wireless drivers appear to be TRWTF. Mine (Dell wireless 1490 Dual Band WLAN running on XP) pukes from time to time. WLTRAY.EXE bloats up to about 500 MB of RAM (don't worry, Dell will release an updated driver to fix it really soon, like early 2009 or so...). Had another Dell at work ("newer" Vostro, running XP, similar Wireless card), that one would just randomly refuse to contact the network. Had the wireless card swapped out = no dice. Had the motherboard swapped out (Per Dell tech support, was still under warranty) = Ok, seems to work now (and for the last two years). WTF?

    The external drive issue is odd... wondering if `fuser` would have revealed anything about what was hanging (I've had luck with that approach when kill -9 didn't work, in my case it was a hung NFS mount due to my stupidity).

    Depending on what filesystem you used on the external (and assuming the physical drive is ok, of course), fsck should save 99% of your data. Odds are the hard shutdown corrupted the FS as opposed to a corrupt FS causing the hang.

     



  • @blakeyrat said:

    Thanks for explaining the Linux-specific terms in your post, much appreciated.

    Nothing else to add, except: I don't like Linux much when it's working right, I can't imagine what it's like when it's barfing up its own guts. Remember Mac Classic? The entire OS was like 4 files? Those were the days... none of this 56,000 tiny text and .exe files scattered all over God-knows-where shit...

    Haha! Yeah, I remember working with OS 8 and 9, and yes it was so much easier to troubleshoot. Doesn't boot? It's an extension. Program doesn't launch? It's an extension. Fonts messed up? Remove the font. Simpler times. But even with OS X and all of its internal guts, it's still a much better user experience. At least Apple has taken the time to make sure that if it does barf all over itself, it does so in an intuitive manner.



  • @RichP said:

    The external drive issue is odd... wondering if fuser would have revealed anything about what was hanging (I've had luck with that approach when kill -9 didn't work, in my case it was a hung NFS mount due to my stupidity).

    Ugh, I had completely forgotten about fuser. That's another thing, and it's not Linux's fault, but because it's "open source", and doesn't really have the same kind of structure as a corporate product (like Windows and Mac OS X), very useful tools like this are often either unknown or easily forgotten. You're right, that probably would have uncovered the real problem.

    @RichP said:

    Depending on what filesystem you used on the external (and assuming the physical drive is ok, of course), fsck should save 99% of your data. Odds are the hard shutdown corrupted the FS as opposed to a corrupt FS causing the hang.

     

    I don't doubt that the hard shut down caused the FS to become corrupted - btw, it's ext4 - but I would have hoped that it would have been recoverable, or at least flagged dirty and ignored by the OS (sync() specifically).



  • @dohpaz42 said:

    Ugh, I had Suppressed all the traumatic memories of needing to use fuser.

     

    FTFY.

    Besides, other OSes have the same problem with specialized tools being forgotten about (in Windows, for instance: boot logging, the SYSTEM-level command prompt, Process Explorer, command line utilities, etc.)

     



  • @dohpaz42 said:

    Ugh, I had completely forgotten about fuser. That's another thing, and it's not Linux's fault, but because it's "open source", and doesn't really have the same kind of structure as a corporate product (like Windows and Mac OS X), very useful tools like this are often either unknown or easily forgotten. You're right, that probably would have uncovered the real problem.

    You kids and your Linux complaints. fuser's been around in multiple OSes for ... a couple years now. Crap, I remember it from the last time I was a 'real' unix admin, so... '02, '03? Then again, my co-workers accuse me of telling them to get off my lawn (I'm 7 years older than the co-workers who do what I do, sort of.) dmesg and syslog are the first places I'd have looked for more info. And after a hard power failure, an fsck of any disks that spun down would've been an automatic task. Then again, I'm an old-school unix admin, and those are things that you just learn to f'ing do. In my day ... (cane-waving)

    @dohpaz42 said:

    @RichP said:

    Depending on what filesystem you used on the external (and assuming the physical drive is ok, of course), fsck should save 99% of your data. Odds are the hard shutdown corrupted the FS as opposed to a corrupt FS causing the hang.

     

    I don't doubt that the hard shut down caused the FS to become corrupted - btw, it's ext4 - but I would have hoped that it would have been recoverable, or at least flagged dirty and ignored by the OS (sync() specifically).

    And this is one of the reasons why I still think of Linux as "not a real OS". What happened to the OS going through /etc/mnttab (or equivalent) and making a few sanity checks before mounting a partition? Not "It looks like you're mounting a partition" with animated paperclip help, but "hey, dumbass, this partition is too screwed for me to mount. You might want to look at it."



  • Well, I'm only going to offer sympathy and no solutions. If I were to write a "list of things that annoy me in linux and I'd fix right now with vim and the source, but I'm in the middle of browsing funny cat pictures", I'd lose days of cat picture browsing. Maybe more than if I worked to resolve the problems in the first place. But the thing is, there's a small and finite set of such troubles. By far the biggest is braindead handling of i/o corner cases which don't really happen on servers, but that's mostly it. You live and learn and can rest assured it won't really get worse. And I have seen windows as well as mac os crap itself, all in different ways - every OS sucks, it's just that I'm used to linux and it works for me.
    Well, and the community seems somewhat better.



  • I'd be heading off to pull data from that hard drive, myself. The cause of the initial hard-power-off could have been that drive beginning to fail. When a hard drive starts taking 30 seconds to respond to a read request, all OSes fail with a lack of grace.



  • @tweek said:

    And this is one of the reasons why I still think of Linux as "not a real OS". What happened to the OS going through /etc/mnttab (or equivalent) and making a few sanity checks before mounting a partition? Not "It looks like you're mounting a partition" with animated paperclip help, but "hey, dumbass, this partition is too screwed for me to mount. You might want to look at it."
     

    I don't have thousands of dollars to spend for casual to minor-business use of a UNIX®-type operating system.  But I prefer the overall feel of the command line, being able to actually process data, etc.

    Seriously, what would you recommend as an alternative?  One of the BSDs?  OpenSolaris (is it still available)?  I have no particular allegiance to a Linux-based distro save for the fact that I cut my teeth in the UNIX®-type world running awk, grep, sed, vim, writing Perl and bash scripts, etc. under Linux.  I'm all ears for something different.

     



  • @nonpartisan said:

    @tweek said:

    And this is one of the reasons why I still think of Linux as "not a real OS". What happened to the OS going through /etc/mnttab (or equivalent) and making a few sanity checks before mounting a partition? Not "It looks like you're mounting a partition" with animated paperclip help, but "hey, dumbass, this partition is too screwed for me to mount. You might want to look at it."
     

    I don't have thousands of dollars to spend for casual to minor-business use of a UNIX®-type operating system.  But I prefer the overall feel of the command line, being able to actually process data, etc.

    Seriously, what would you recommend as an alternative?  One of the BSDs?  OpenSolaris (is it still available)?  I have no particular allegiance to a Linux-based distro save for the fact that I cut my teeth in the UNIX®-type world running awk, grep, sed, vim, writing Perl and bash scripts, etc. under Linux.  I'm all ears for something different.

     

    Back when i cared... I ran one of the BSDs. OpenBSD, iirc. It ran on my lil' SparcStation and was our firewall. (Goddamn, I'm old.) At work, I'd rather they spring for the hardware for a more real unix with support contracts, but I have an Ubuntu box which does unixy things, and I manage windows servers running IIS, run win7 on my work laptop, and run mac os and have several ios devices at home. If I'm at home, I'm generally decompressing from dealing with windows servers all f-ing day, so I don't do much with my mac that requires a command line, but I have used it a couple of times. It's a decent BSD-ish thing underneath the shiny candy coating. Powershell is NOT a replacement for my unix tools on windows, but for simple things, it works nicely.



  •  TRWTF is anyone trying to use Ubuntu; that's a shitty distro.  (Yes, I know it's been the Flavor of the Moment for a while, but that doesn't change the fact that it is a shitty distro.)

    Use a better distro.  I've never seen Windows or OS X get this borked by something so simple.

    Though there is a disk-related stupidity that both Windows (even 7) and Linux have... when the drive cannot remap a bad sector (usually because it's out of realloc blocks), instead of trying 5 times and giving up, they both keep retrying infinitely - essentially bringing the system to a standstill.  Just shutting down Windows gracefully takes 20 minutes when it is in this state; I don't remember if Linux makes forward progress at all, but it isn't any better in this situation.



  • @nonpartisan said:

    Seriously, what would you recommend as an alternative?  One of the BSDs?  OpenSolaris (is it still available)?  I have no particular allegiance to a Linux-based distro save for the fact that I cut my teeth in the UNIX®-type world running awk, grep, sed, vim, writing Perl and bash scripts, etc. under Linux.  I'm all ears for something different.

    In all seriousness, depending on what exactly it is that you're doing, you may find that you can do it all in Excel on a Windows system a lot less painfully, once OS-related hassle is factored in. If you use Windows, you may feel a slight sting. That's pride fucking with you. Fuck pride. Pride only hurts, it never helps.



  •  Ok here's what you do... you install Win 7 on your machine because, as much as it pains me to say it, the bloody thing just works. Then you install your favourite linux flavor on the local server/VMware and only visit when you want to use all those linux command line goodies. You can just tie the two via Samba/SSH. That way you don't need to go blind reading man pages just to get a movie to play on linux, and you don't have to install 5 billion separate programs on Windows just to do basic dev tasks (not even telnet is enabled by default on Win 7? Really?).



  • @DOA said:

    (not even telnet is enabled by default on Win 7? Really?)

    99% of telnet use is malware talking to botnets. 1% is legit. Those 1% who legitimately need telnet can easily find out how to enable it. (It's on the disk, just not installed by default.)

    Frankly, it was stupid of Microsoft to ship Telnet.exe by default as long as they did.



  • @blakeyrat said:

    99% of telnet use is malware talking to botnets. 1% is legit.
    ...ok, that makes sense. I forgot for a moment what the typical windows user is like.



  • @blakeyrat said:

    99% of telnet use is malware talking to botnets.

    I thought malware almost exclusively used in-built IRC clients to talk to botnets, with custom P2P protocols coming a distant second and really nothing else in the running.  Can you point to any bot that relies on an actual external telnet client exe for C&C?




  • @blakeyrat said:

    Remember Mac Classic? The entire OS was like 4 files? Those were the days... none of this 56,000 tiny text and .exe files scattered all over God-knows-where shit...

    And applications, properly written applications, were one file. Want to move it somewhere else? OK, just drag the app file to wherever you want it. Want to get rid of it? Delete the file. I remember getting annoyed at seeing a game which decided it needed a whole folder to live in and had dozens of files. Why can't you just be a single file sitting in the Applications or Games folder like everything else?



  • I stand [somewhat] corrected. (Re: Linux is SO MUCH MORE superior!)

    Okay, I just finished the upgrade to 11.10. Apparently the firmware for my wireless changed to a different package in Ubuntu (oddly enough, I had avoided updating anything precisely because of this type of scenario, where something critical would break; I digress). First off, WOW. So about 95% of my Linux gripes just got wiped out with their updated interface (an actual, true-to-$deity launcher, the menus are centralized similar to Mac OS X, they have an honest-to-goodness virtual folder for common system stuff a la Windows, etc). Mind you, this is all eye candy and superficial, but I am impressed (yeah, it doesn't take much... oh, shiny!). I also admit that during the install, whenever one of the system configs (i.e., apache, vimrc, etc) was about to be replaced, it actually asked me what to do (that is almost enough to forgive everything else in and of itself). But alas, the wireless still doesn't work. Off to Google I go. Sigh. TRWTF is having to use wireless at work because we work in a very old office that only has one hard-wired port. Damn startups. BUT, SHINY! :)



  • @dohpaz42 said:

    TRWTF is having to use wireless at work because we work in a very old office that only has one hard-wired port.

    Wouldn't your time be better-spent running cables?



  • @MascarponeRun said:

    @nonpartisan said:
    Seriously, what would you recommend as an alternative?  One of the BSDs?  OpenSolaris (is it still available)?  I have no particular allegiance to a Linux-based distro save for the fact that I cut my teeth in the UNIX®-type world running awk, grep, sed, vim, writing Perl and bash scripts, etc. under Linux.  I'm all ears for something different.

    In all seriousness, depending on what exactly it is that you're doing, you may find that you can do it all in Excel on a Windows system a lot less painfully, once OS-related hassle is factored in. If you use Windows, you may feel a slight sting. That's pride fucking with you. Fuck pride. Pride only hurts, it never helps.

     

    I try to use the right tool for the right job.  Pride doesn't get in the way of me getting something done.  If I'm in error, fine, let's do it right and I'll learn from it.

    In this context, data processing, a typical task might include grabbing the names, serial numbers, and models of all of our network gear.

    We have a monitoring system that automatically walks the network on a regular basis.  I go to one of its config files, grep out the comments (^#), then awk the second column to get the list of names.  I use tr to delete the apostrophes surrounding the name, then sort while ignoring case and run uniq just for good measure.  This is all a single command line, and it runs just about instantaneously:  `grep -v "^#" syscfg.cfg | awk '{print $2}' | tr -d "'" | sort -f | uniq -i`
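
    For anyone following along at home, here is that one-liner run end to end against a made-up stand-in for syscfg.cfg (the file contents are invented; real entries would have more columns):

```shell
# Build a toy config in the same shape: comments plus quoted node names.
cat > /tmp/syscfg.cfg <<'EOF'
# monitoring config - invented sample data
node 'router-A'  10.0.0.1
node 'router-A'  10.0.0.1
node 'switch-B'  10.0.0.2
EOF

# Strip comments, take column 2, drop the apostrophes, then sort and
# de-duplicate case-insensitively.
grep -v "^#" /tmp/syscfg.cfg | awk '{print $2}' | tr -d "'" | sort -f | uniq -i
```

    The output is the two unique names, router-A and switch-B, one per line.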

    I have a Perl script that uses this list to SNMP touch every entry on the list, grabbing serial number and model, and perhaps a few others things too.  I output it all to a TSV file.

    I copy the file over to my Windows machine and import the data into Excel.  From there, I use its filtering capabilities as needed.  Perhaps I need to grab all of the Cisco 2821 routers.  I filter on 2821 and I get a list of devices.  I'll copy that list into an e-mail, into a different spreadsheet, etc.  I change the filter and move on to the next category.

    The UNIX® paradigm works well and much faster on the initial data grab.  Excel works better for narrowing down the list on demand and being able to move the data from one program to another (since we use the full Microsoft suite of products -- Office, Outlook, Sharepoint, etc.).

    Can I install comparable tools on my Windows machine to run this?  I'm sure I could.  But most of our monitoring systems run in the UNIX® paradigm and portability between them is much easier than it potentially would be under Windows.  They've all got these standard processing tools already installed.  If I need to move scripts between machines, I just do it, and they just work.

    The right tool for the right job.

    (grep - searches a file for data or patterns; awk - pattern scanning and text processing language; tr - translate or delete characters; sort - sorts the given input; uniq - eliminates duplicate lines; Windows - an operating system maintained by Microsoft with a GUI; Excel - a spreadsheet program that is part of the Microsoft Office suite; Cisco 2821 router - an IP-based router with various hardware options, such as VoIP processing and encryption)



  • @blakeyrat said:

    @dohpaz42 said:
    TRWTF is having to use wireless at work because we work in a very old office that only has one hard-wired port.

    Wouldn't your time be better-spent running cables?

    Probably not. It's an old building, I'm inexperienced with running cables in that capacity, and the cost (both in my time, and upgrading the router for the added ports) could not be justified when I can simply use the wireless. Hence TRWTF. :)



  • @nonpartisan said:

    @MascarponeRun said:

    If you use Windows, you may feel a slight sting. That's pride fucking with you. Fuck pride. Pride only hurts, it never helps.
     

    I try to use the right tool for the right job.  Pride doesn't get in the way of me getting something done.  If I'm in error, fine, let's do it right and I'll learn from it.

    Sorry, maybe you missed that reference - it's from Pulp Fiction, when Marsellus Wallace is telling Butch (Bruce Willis) to take a dive in his final fight. No dickweedery intended :)

    @nonpartisan said:

    I go to one of its config files, grep out the comments (^#), then awk the second column to get the list of names.  I use tr to delete the apostrophes surrounding the name, then sort while ignoring case and run uniq just for good measure.  This is all a single command line.  It runs just about instantaneously.  (grep -v "^#" syscfg.cfg | awk '{print $2}' | tr -d "'" | sort -f | uniq -i)

    That looks sane and sensible - I'm not knocking Linux as a data handling system, because that would be daft. The point was more that you can do all that stuff pretty trivially in Excel as well, so if your os-maintenance overhead is high with Linux, then the MS solution may be simpler. If you're tied-in to the unix-family for other reasons, then it's likely not a sensible thing to do.

    @nonpartisan said:

    Can I install comparable tools on my Windows machine to run this?

    I don't think you need anything other than Excel - people don't realise how good Excel is at importing/working with data. Just off the top of my head, I'd import only the unique records in the column you want from the config file directly - I'd have to check if that's one step or two - and then sort (by clicking the sort button...). Not sure if Excel would automatically trim the apostrophes, but if not that's the work of seconds - and an example of where Excel can leave you feeling a bit dirty is that you may be able to cut out that step by treating the apostrophe (') as a column delimiter.

    Come to think of it, you could probably just make the original config file a linked data source in Excel, so the workbook would update automatically/as desired when the source data has changed.

    @nonpartisan said:

    The right tool for the right job.

    I absolutely agree - but what we're looking at here is something of a disconnect between the MS/Unix mindset. Where Unix tends to have separate tools - grep, awk, and so-on - MS would tend to bundle those into an application of some kind which brings together related tools. Unix users on Windows tend not to realise that all these tools exist (generally speaking) but are tucked away somewhere. Almost any tool considered necessary for the unix-family is also extremely likely to be necessary - and so present - in Windows/Office in some form or another for the simple reason that ultimately people are all trying to do the same things with their PCs.



  • @nonpartisan said:

    Can I install comparable tools on my Windows machine to run this? I'm sure I could.

    You can install those exact tools on Windows. They're all ported. They even made a handy installer program for them; it's not like you have to build from source or anything Unix-y.



  • @blakeyrat said:

    @nonpartisan said:
    Can I install comparable tools on my Windows machine to run this? I'm sure I could.

    You can install those exact tools on Windows. They're all ported. They even made a handy installer program for them; it's not like you have to build from source or anything Unix-y.

    ja; then you get all the fun of linux in your windows box. You can even get X for your windows box if you like to have GNOME or KDE access to your linux boxen. Somehow, it gives me the twitches (I'm thinking it's a UI thing; my brain can't quite parse linux and its tools under windows unless it's in a PuTTY window -- like my inability to write anything relying on ActivePerl), but most people are fine with Cygwin.



  • @blakeyrat said:

    @nonpartisan said:
    Can I install comparable tools on my Windows machine to run this? I'm sure I could.

    You can install those exact tools on Windows. They're all ported. They even made a handy installer program for them; it's not like you have to build from source or anything Unix-y.

     

    But guess what?  It's an extra step.

    All those tools are already available to me on the systems I use without having to install anything extra. Not just grep, awk, sed, sort, tr, uniq, but also snmpwalk/snmpget, Perl, the Perl modules I need, etc.  It's all ready to go, whether it be on a Linux distro, Solaris, or FreeBSD (which is the OS hosting the monitoring system for all of our network ports . . . which, coincidentally, happens to be the system from which I assemble the list of devices to which I originally referred).


  • Garbage Person

    @blakeyrat said:

    99% of telnet use is malware talking to botnets. 1% is legit. Those 1% who legitimately need telnet can easily find out how to enable it. (It's on the disk, just not installed by default.)

    Frankly, it was stupid of Microsoft to ship Telnet.exe by default as long as they did.

    What kind of shitass malware uses telnet.exe? That takes more code than just opening a damned socket yourself. It's not like telnet has a fucking protocol to it.


  • @Weng said:

    What kind of shitass malware uses telnet.exe? That takes more code than just opening a damned socket yourself.
    Surely as a malware author your concern is less LOC and more whether Windows firewall is configured to let your traffic through by default?



  • @dohpaz42

    You should try booting a live disc or live USB of Ubuntu (or any other OS). If you're already using Ubuntu I'm assuming you already have one lying around.

    If your wireless works in the live boot, then you know the problem is specific to your configuration.

    In my experience wireless adapters just die sometimes. This is a quick way to rule that out.



  • @MascarponeRun said:

    @Weng said:
    What kind of shitass malware uses telnet.exe? That takes more code than just opening a damned socket yourself.
    Surely as a malware author your concern is less LOC and more whether Windows firewall is configured to let your traffic through by default?

    I thought Windows Firewall only allows/denies incoming connections?

    And in any case, even with a real firewall that does handle outgoing connections, you'd be a lot more likely to get past the blocks on $J_RANDOM_USER's machine by leveraging iexplore.exe rather than telnet.exe, since the browser is more likely to have a "permanently allow" status on most home users' machines.




  •  @Weng said:

    What kind of shitass malware uses telnet.exe? That takes more code than just opening a damned socket yourself. It's not like telnet has a fucking protocol to it.

    Hey Weng, is this you?


  • :belt_onion:

    @DaveK said:

    I thought Windows Firewall only allows/denies incoming connections?
     

    Correct. It can be set up to block outbound connections, but isn't. (I've never seen even the most secure of environments set it up that way; if they want to block outbound connections, they do it at the gateway. Windows Firewall can theoretically be centrally managed via Group Policy, but it's not the easiest of solutions.)

     



  • @heterodox said:

    Correct. It can be set up to block outbound connections, but isn't. (I've never seen even the most secure of environments set it up that way; if they want to block outbound connections, they do it at the gateway. Windows Firewall can theoretically be centrally managed via Group Policy, but it's not the easiest of solutions.)

    There's a (kind of rough) UI for the whole shebang on Server versions, but the desktop version's UI is simplified. The actual firewall is the same, though, so... it has the features on the desktop version, just no UI for them.



  • @nonpartisan said:

    But guess what?  It's an extra step.

    You aren't using this tired argument, are you?

    It's ironic that anybody would say that Windows requires "extra step(s)" over *nix, especially when almost everything else requires extra steps on *nix: writing the script to begin with, for example. I know there are still people in this world who long for the days when the only people allowed to actually work with the computers were an elite priesthood, and writing Perl scripts and shell scripts and repeating the holy trinity of awk, sed, and grep to whoever will listen can sometimes recapture a few moments of that elitism, but those days are over. Installing *nix tools on Windows is an extra step, but it's an unnecessary one, because any decently configured Windows machine in an office will already have the capability you'd otherwise get by downloading those tools. Do you get them free? No. But then again, the people paying for it are also paying the people working with the machines, and would probably prefer that they do what they are paid to do as efficiently as possible, and that typically doesn't mean fiddling with Perl scripts. Also, they can hire people cheaper, because it doesn't take as much training, and the people don't need to have much of an aptitude for computers. This is good because the people that do know what they're doing with regard to the kind of stuff you are mentioning can actually do it for "more important" things than just printing out reports. At least, that's the ideal case.

     

     



  • @BC_Programmer said:

    on any decently configured Windows machine in an Office it will already have the capability that is desired by downloading those tools. Do you get them free? No. But then again, the people paying for it are also paying the people working with the machines and would probably prefer that they do what they are paid to do as efficiently as possible,
     

    The "perfect world" argument is just as tired.



  • @dhromed said:

    The "perfect world" argument is just as tired.

    My butt's tired.



  • @blakeyrat said:

    @dhromed said:
    The "perfect world" argument is just as tired.

    My butt's tired.

    That you, Zunesis?  Give Blakey his account back right now!




  • @blakeyrat said:

    My butt's tired.

    Sorry dude, nearly done.



  • @BC_Programmer said:

    @nonpartisan said:

    But guess what?  It's an extra step.

    You aren't using this tired argument, are you?

    It's ironic that anybody would say that Windows requires "extra step(s)" over *nix, especially when almost everything else requires extra steps on *nix: writing the script to begin with, for example. I know there are still people in this world who long for the days when the only people allowed to actually work with the computers were an elite priesthood, and writing Perl scripts and shell scripts and repeating the holy trinity of awk, sed, and grep to whoever will listen can sometimes recapture a few moments of that elitism, but those days are over. Installing *nix tools on Windows is an extra step, but it's an unnecessary one, because any decently configured Windows machine in an office will already have the capability you'd otherwise get by downloading those tools. Do you get them free? No. But then again, the people paying for it are also paying the people working with the machines, and would probably prefer that they do what they are paid to do as efficiently as possible, and that typically doesn't mean fiddling with Perl scripts. Also, they can hire people cheaper, because it doesn't take as much training, and the people don't need to have much of an aptitude for computers. This is good because the people that do know what they're doing with regard to the kind of stuff you are mentioning can actually do it for "more important" things than just printing out reports. At least, that's the ideal case.

     

     

    Dude . . . you gotta be trolling.  I'm sure I'm being trolled.  But just because I've got nothing better to do while waiting for my bus that has been interrupted by Occupy Portland . . .

    I'm a network engineer.  I have a cube.  I have my primary desktop machine.  I have a laptop.  I spend a fair amount of my time in my cube, but a significant portion outside it as well.  Whether it be in the data center, or a building on a different campus, or whatever.  I find myself using foreign machines (foreign = not mine) on a reasonably regular basis.  You want me to install those tools every time I need them?  Why?

    The OS build for our machines has PuTTY built into it.  I just PuTTY into one of our network monitoring or general purpose boxen, and boom, I have everything I need, from wherever I need it.  And since most of our monitoring occurs on UNIX®-based boxen, it's the right way to do it.  This isn't a Windows-vs-UNIX® flame.  It's simply a practicality.  For me, installing the Windows versions of the tools is not the right tool for the job because of my environment.  I'm guessing that, if I were primarily a software developer, I'd install said tools on my machine, the one I would be using 99% of the time, and be done with it.  But that's not the environment I work in.

    Writing the script is not an extra step.  Under Windows or UNIX® it's a step, but it's not an extra step.

     



  • @MascarponeRun said:

    I don't think you need anything other than Excel - people don't realise how good Excel is at importing/working with data. Just off the top of my head, I'd import only the unique records in the column you want from the config file directly - I'd have to check if that's one step or two - and then sort (by clicking the sort button...). Not sure if Excel would automatically trim the apostrophes, but if not that's the work of seconds - and an example of where Excel can leave you feeling a bit dirty is that you may be able to cut out that step by treating '' as a column delimiter.

    There's one thing I'm missing: if I want to do this kind of thing in Excel, I would have to go through the "import data from text file" wizard, click lots of buttons to make it behave the way I want it, import the file, and repeat the procedure the next time I want to import a file. With grep+awk+sed (also head+tail to eliminate useless headers and trailers) I can write a script and use it for any file I want without wasting any more time. That really makes a difference if I have to import the files often, doesn't it? I'm saying this because I don't really like going through the crazy logic of awk and stuff, and I would much prefer to use Excel, but it doesn't really seem a viable option to me.
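The write-once-reuse-forever pipeline dargor17 describes might look something like the sketch below. The input format (lines like `hostname 'router-a'`, with names in apostrophes) is an assumption for illustration, not taken from a real config.

```shell
# extract_devices: a minimal sketch of the grep+sed/tr+awk+sort pipeline
# described above. Reads a config on stdin, emits a sorted, de-duplicated
# list of device names.
extract_devices() {
  grep '^hostname' |   # keep only the hostname lines
    tr -d "'" |        # strip the apostrophes
    awk '{print $2}' | # keep just the name column
    sort -u            # sorted, duplicates removed
}

# Usage against an inline sample (a real run would pipe in the config file):
printf "hostname 'router-b'\ninterface eth0\nhostname 'router-a'\nhostname 'router-a'\n" \
  | extract_devices
# prints: router-a
#         router-b
```

Once this lives in a script, the per-file cost drops to `extract_devices < some.cfg`, which is the point being made against re-running an import wizard.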



  • Even without doing anything tricky: set up the import once. After that, you can right-click on any portion of the data itself and select refresh, re-pick the file name, and it is refreshed. Make sure you always leave your original data alone and do your manipulation in adjacent cells; that way you don't keep overwriting your stuff.



  • @mahlerrd said:

    Even without doing anything tricky: set up the import once. After that, you can right-click on any portion of the data itself and select refresh, re-pick the file name, and it is refreshed. Make sure you always leave your original data alone and do your manipulation in adjacent cells; that way you don't keep overwriting your stuff.
     

    Something I realized a couple of the responses are missing . . . I use the command line just to get the list of equipment that I need to poll.  So that list doesn't need to go into Excel, it needs to be fed to the Perl script.  The output from the Perl script is what I end up importing into Excel for further processing.  And yes, it is easier at that point to manipulate the data in Excel than it would be on the command line.  But to generate the list in the first place, I'd have to import the data, do a global search and replace to remove the apostrophes from the device names, select the column with the device names, then copy that into another text file for use as input to Perl.  CLI beats those steps hands down, faster than the first mouse click to try to import the source config file into Excel.
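The "config file to Perl input in one command" step nonpartisan describes could be sketched as below. The apostrophe convention, the `hostname` field layout, and the `poll.pl` script name are all hypothetical stand-ins, not details from the actual setup.

```shell
# device_names: pull the quoted device name out of each hostname line,
# dropping the apostrophes in the same pass. Input format is assumed.
device_names() {
  sed -n "s/^hostname '\(.*\)'.*/\1/p"
}

# One command replaces the import / search-and-replace / copy round trip:
#   device_names < devices.cfg | perl poll.pl     # poll.pl is hypothetical
printf "hostname 'sw-01'\nhostname 'sw-02'\n" | device_names
# prints: sw-01
#         sw-02
```

This is the "faster than the first mouse click" claim in concrete form: the de-quoting and column extraction happen inside the same pipe that feeds the poller, so no intermediate file ever needs hand-editing.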



  • @nonpartisan said:

    @mahlerrd said:

    Even without doing anything tricky: set up the import once. After that, you can right-click on any portion of the data itself and select refresh, re-pick the file name, and it is refreshed. Make sure you always leave your original data alone and do your manipulation in adjacent cells; that way you don't keep overwriting your stuff.
     

    Something I realized a couple of the responses are missing . . . I use the command line just to get the list of equipment that I need to poll.  So that list doesn't need to go into Excel, it needs to be fed to the Perl script.  The output from the Perl script is what I end up importing into Excel for further processing.  And yes, it is easier at that point to manipulate the data in Excel than it would be on the command line.  But to generate the list in the first place, I'd have to import the data, do a global search and replace to remove the apostrophes from the device names, select the column with the device names, then copy that into another text file for use as input to Perl.  CLI beats those steps hands down, faster than the first mouse click to try to import the source config file into Excel.

    I think you're confusing two different things here - repeating the same task, or setting it up for the first time. If you have to type out your command-line instruction each time you use it, working out the parameters as you go along, it'll take you much longer than if you merely copy-and-paste a stored command. Similarly, if you redo it from scratch each time in Excel it'll take you much longer than if you set it up as a one-click scripted action, or linked-data spreadsheet, or whatnot.

    I think, although I've never tried, that with some very basic Excel scripting you could even have a workbook you double-clicked to open, which would automatically update the data from the source, reformat as necessary, and then export to a text file and close Excel.

    We agreed before, though, that your use-case probably doesn't make this a particularly great option - I'm mainly expanding on my points in case anyone else finds it useful. Whilst it apparently makes little sense for you to do them in Excel, the kind of things you're talking about are every bit as quick and easy for a skilled Excel user as they are for you as a skilled CLI/unix user.



  • @dargor17 said:

    @MascarponeRun said:
    I don't think you need anything other than Excel - people don't realise how good Excel is at importing/working with data. Just off the top of my head, I'd import only the unique records in the column you want from the config file directly - I'd have to check if that's one step or two - and then sort (by clicking the sort button...). Not sure if Excel would automatically trim the apostrophes, but if not that's the work of seconds - and an example of where Excel can leave you feeling a bit dirty is that you may be able to cut out that step by treating '' as a column delimiter.

    There's one thing I'm missing: if I want to do this kind of thing in Excel, I would have to go through the "import data from text file" wizard, click lots of buttons to make it behave the way I want it, import the file, and repeat the procedure the next time I want to import a file. With grep+awk+sed (also head+tail to eliminate useless headers and trailers) I can write a script and use it for any file I want without wasting any more time. That really makes a difference if I have to import the files often, doesn't it? I'm saying because I don't really like going through the crazy logic of awk and stuff, and I would much prefer to use excel, but it doesn't really seem a viable option to me.

    In the first place, you could write a script to do the same in Excel. But the bigger point is that the more you use Excel, the more you realise that there must be a solution to that, and google it to find out what it is. It's a common enough problem that Excel has to have some way to deal with it.

    I suspect the functionality you should actually be using is data-linking - if all the files are in the same format, you'd just change the source and be done, having set up the import parameters the first time.

