Linux ain't free, y'all!



  •  @belgariontheking said:

    I assure you that I just want shit to work no matter what OS I'm on.  I gave up on a bunch of things that didn't "just work" on both Windows and Linux.

    I agree, but I'm wondering what these things are exactly?  I run probably the widest array of software among the computer-savvy people I know; graphics software like Photoshop, Flash and Maya, music software like Reason and Ableton Live, a wide selection of development tools from Visual Studio for C# to Notepad2 and E for Python, and games like Team Fortress 2 and the Steam client.  All that software installs easily, runs flawlessly and never leaves me wanting.

    The impression I get from a lot of people who get fed up is that something small bothered them, maybe the popup bubbles or something, and rather than disabling them, they threw up their hands and hit uninstall.  Not saying this is your case though.

     



  • @ComputerForumUser said:

    Finding out how any application does something is. Children are best prepared for the inevitable and often uncontrollable change in the real world with the knowledge that there is more than one way of achieving something.

    What they'll take away from it is that there is no consensus on anything, and that software is apparently supposed to be incompatible within your own ecosystem.



  • @amischiefr said:

    we are to only teach Windows because it is the most popular...
     

    What's the deal with this "teaching Windows"?  Nobody's teaching Windows; they're teaching basic computing skills like working with word processors, spreadsheets, the web, etc.  Two things make this more compelling on the Windows platform: Windows is what students are most likely to encounter, so it'll be more familiar when they do, and Office is, by a huge margin, much better than OpenOffice.

    In neither situation are they "teaching Windows"; they're simply teaching in an environment most people will consider familiar. Sure, you could argue that it's a chicken-and-egg scenario where more Windows exposure perpetuates more Windows usage, but that's life.  It could also be argued that OpenOffice is "good enough" for your limited day-to-day document needs, but I'm sure some Google fanboy would argue that Google Docs is good enough for that and any installed office suite is bloatware.



  • @shakin said:

    If Linux was trying to work like Windows then why would it need to exist? We already have Windows. Linux is doing its own thing.


    Couldn't we have a Linux distro that would be exactly like Windows except that it would be free? Or do I have to build it myself if I want it?

    @shakin said:

    Many people manage Linux systems all the time, and the fact that you think it's difficult to use just goes to show your own ignorance about it. Personally, I think Windows is easier to use in some ways (setting up multiple monitors, installing new drivers) and Linux is easier to use in other ways (software updates, automated configuration). I'm sure there are plenty of examples both ways, but they don't really matter, because either way you need to learn a skillset to become an expert. The particular skillset you need to become an expert in Windows differs from the skills needed for Linux, but I don't think either one is more difficult than the other.

    Ubuntu was advertised as being so easy to install that everybody could do it. That means that I'm either more ignorant than the average person or that nobody expected that anybody would want to use a 64-bit processor and an ATI graphics card at the same time or that anyone would want to use the extra buttons on their mouse. At least that's what it was like a couple of years ago. Maybe it's easier now.

    @shakin said:

    While there is some overlap in government and education, Linux is not generally trying to compete in the same markets as Windows or OS X. It's a different operating system with different purposes.

    O RLY? So Linux isn't for home use and I shouldn't have been installing Linux on my home computer in the first place?


  • @julmu said:

    At least that's what it was like a couple of years ago.

    I don't have a 64-bit CPU, nor does my Linux box have an ATI card, but I've never had any trouble with the mouse buttons in Ubuntu.

    Software moves fast. You can't comment on the state of things if the only state you know is a couple of years old.



  • @Soviut said:

    @belgariontheking said:

    I assure you that I just want shit to work no matter what OS I'm on.  I gave up on a bunch of things that didn't "just work" on both Windows and Linux.

    I agree, but I'm wondering what these things are exactly?

    The big one that I can think of off the top of my head is getting Apache to play nice with Tomcat.  It's been a couple of years since I tried it, but try though I might, and no matter how many websites I consulted, I just couldn't get Apache to fire up Tomcat for me, so I just ran Tomcat on port 8080.  Really, all I wanted was to learn J2EE in my own environment, so the 8080 solution was sufficient.

    Also, svn threw me for a loop.  Got that settled with morbius' help.  
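For what it's worth, the usual fix for this is Apache's mod_proxy (a ProxyPass directive pointing at Tomcat on port 8080) or mod_jk over AJP. The idea itself is simple enough to sketch in a few lines of Python: a toy front-end server that relays requests to a backend on another port. The ports and the `serve` helper here are illustrative, not any real tool's API:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

BACKEND = "http://localhost:8080"  # where Tomcat would be listening

class ProxyHandler(BaseHTTPRequestHandler):
    """Relay every GET request to BACKEND and copy the response back."""

    def do_GET(self):
        with urllib.request.urlopen(BACKEND + self.path) as resp:
            body = resp.read()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

def serve(handler, port):
    """Start an HTTP server on localhost:port in a background thread."""
    srv = HTTPServer(("localhost", port), handler)
    threading.Thread(target=srv.serve_forever, daemon=True).start()
    return srv
```

Run `serve(ProxyHandler, 80)` (as root, since 80 is privileged) and requests to the front port get answered by whatever listens on 8080 - essentially what mod_proxy does, minus headers, POST bodies, and everything else that makes the real thing hard to configure.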



  • @Soviut said:

    @amischiefr said:

    we are to only teach Windows because it is the most popular...
     

    What's the deal with this "teaching Windows"?  Nobody's teaching Windows; they're teaching basic computing skills like working with word processors, spreadsheets, the web, etc.  Two things make this more compelling on the Windows platform: Windows is what students are most likely to encounter, so it'll be more familiar when they do, and Office is, by a huge margin, much better than OpenOffice.

    In neither situation are they "teaching Windows"; they're simply teaching in an environment most people will consider familiar. Sure, you could argue that it's a chicken-and-egg scenario where more Windows exposure perpetuates more Windows usage, but that's life.  It could also be argued that OpenOffice is "good enough" for your limited day-to-day document needs, but I'm sure some Google fanboy would argue that Google Docs is good enough for that and any installed office suite is bloatware.

     

    Once again you have missed the point: the argument made here was that ONLY Windows should be used in schools, and skills should ONLY be taught on Windows.  I am trying to say "sure, teach on Windows machines, have 90% of the computers on Windows, but show them other tools that are out there.  Don't simply close their eyes and pretend that there isn't another way to do things."  I have no problem with 100% of the computers in schools running Windows, but telling a teacher that he or she doesn't have the right to "show" the students another way of doing something, possibly by handing out free copies of Linux, is just plain stupid.



  • @julmu said:

    @shakin said:

    If Linux was trying to work like Windows then why would it need to exist? We already have Windows. Linux is doing its own thing.

    Couldn't we have a Linux distro that would be exactly like Windows except that it would be free? Or do I have to build it myself if I want it?

    No more than you can make Windows into OS X running on normal PC hardware. You can install themes and programs that make certain things work like OS X, but Windows can never clone it exactly. It's the same thing with Linux. There are plenty of Windows workalike programs and themes for Linux, but none of them are really like Windows. There are fundamental technical and philosophical differences between the operating systems that can't be eliminated with configuration or third-party programs.

    @julmu said:


    @shakin said:

    Many people manage Linux systems all the time, and the fact that you think it's difficult to use just goes to show your own ignorance about it. Personally, I think Windows is easier to use in some ways (setting up multiple monitors, installing new drivers) and Linux is easier to use in other ways (software updates, automated configuration). I'm sure there are plenty of examples both ways, but they don't really matter, because either way you need to learn a skillset to become an expert. The particular skillset you need to become an expert in Windows differs from the skills needed for Linux, but I don't think either one is more difficult than the other.

    Ubuntu was advertised as being so easy to install that everybody could do it. That means that I'm either more ignorant than the average person or that nobody expected that anybody would want to use a 64-bit processor and an ATI graphics card at the same time or that anyone would want to use the extra buttons on their mouse. At least that's what it was like a couple of years ago. Maybe it's easier now.

    It is easy to install and use, some hardware driver problems notwithstanding. My mom, my wife, and my son use it. Ignorant doesn't mean stupid; it means you lack knowledge of Linux. Like I said above, Linux is fundamentally different from Windows, so being an expert at Windows doesn't make you an expert at Linux, but that doesn't mean Linux is difficult. For example, at work I used an iMac for a month while testing software on it. I found the Mac software installation process incredibly confusing: a diagram appeared when I clicked on the downloaded file, and a coworker had to tell me that the diagram was showing me to drag the icon into the Applications folder. After having heard how easy Macs are to use, I was shocked that this appeared to be so difficult. But it wasn't really difficult; it was just different, and I was ignorant about how to do a very simple task. I had a ton of problems trying to get OS X working the way I wanted (the delete key didn't work properly, I couldn't figure out how to mount remote sftp directories, etc.). It must have been complete ignorance on my part, because lots of people with less computing experience than I have are able to use OS X.

    @julmu said:

     

    @shakin said:

    While there is some overlap in government and education, Linux is not generally trying to compete in the same markets as Windows or OS X. It's a different operating system with different purposes.

    O RLY? So Linux isn't for home use and I shouldn't have been installing Linux on my home computer in the first place?
     

    Home is a broad term, but generally you don't see Red Hat or SUSE marketing to home users. Ubuntu does, but it's still relatively knowledgeable users who really want to learn Linux that use it. Your average computer user is never going to install an operating system on their own, so Linux is definitely not marketed towards those people. Of course, anybody who has a desire to learn a Unix-like operating system will find the barrier to entry is quite low.



  • @julmu said:

    Couldn't we have a Linux distro that would be exactly like Windows except that it would be free? Or do I have to build it myself if I want it?

    I'm guessing you mean "is fully compatible with Windows hardware and software", and the fact is that's a pretty difficult task.  Microsoft even had some trouble with Vista compatibility, and that's with the same kernel.

     

    @julmu said:

    O RLY? So Linux isn't for home use and I shouldn't have been installing Linux on my home computer in the first place?

    Honestly?  Probably not.  Linux is not, and almost certainly never will be, a suitable alternative as a general-purpose desktop OS.  For simple stuff like web browsing and email it can suffice, and for servers or developer workstations it's great, but it can't compete with Windows or OS X.  For most people, Linux will never be a good choice of desktop OS.  I recommend sticking with Windows.



  • @Soviut said:

     @belgariontheking said:

    I assure you that I just want shit to work no matter what OS I'm on.  I gave up on a bunch of things that didn't "just work" on both Windows and Linux.

    I agree, but I'm wondering what these things are exactly?  I run probably the widest array of software among the computer-savvy people I know; graphics software like Photoshop, Flash and Maya, music software like Reason and Ableton Live, a wide selection of development tools from Visual Studio for C# to Notepad2 and E for Python, and games like Team Fortress 2 and the Steam client.  All that software installs easily, runs flawlessly and never leaves me wanting.

    The impression I get from a lot of people who get fed up is that something small bothered them, maybe the popup bubbles or something, and rather than disabling them, they threw up their hands and hit uninstall.  Not saying this is your case though.

    I'm perfectly happy with my selection of Linux software too.  They have all the features I want, and I can't remember the last time I encountered a bug.  I've helped fix a few in the past though, even provided source code patches.  I do a lot of software development at home, and I find the Linux tools for that to be a lot better than Windows tools of comparable price.  Games work pretty well under Wine too, and I can accept a few glitches in return for not having to switch to another OS every time I want to play a game.

    I can't remember why I originally switched to Linux anymore - it's been more than 7 years since that event.  It was probably a mixture of better dev tools, having a perfectly legal OS free of charge (I was experiencing an awakening about piracy), being able to know exactly what's on my computer and what each component does, and getting annoyed at Windows.  Back then Windows XP was barely out the door and since I had heard that Windows 2000 was not good for games, I was sticking to Windows 98SE.  That one certainly did have its share of problems, which I have blissfully forgotten by now. 



  • @julmu said:

    @shakin said:

    Many people manage Linux systems all the time and the fact that you think it's difficult to use just goes to show your own ignorance about it. Personally, I think Windows is easier to use in some ways (setting up multiple monitors, installing new drivers) and Linux is easier to use in other ways (software updates, automated configuration). I'm sure there are plenty of examples both ways, but they don't really matter because either way you need to learn a skillset to become an expert. The particular skillsets you require to become an expert in Windows differs from those skills needed for Linux, but I don't think either one is more difficult than the other.

    Ubuntu was advertised as being so easy to install that everybody could do it. That means that I'm either more ignorant than the average person or that nobody expected that anybody would want to use a 64-bit processor and an ATI graphics card at the same time or that anyone would want to use the extra buttons on their mouse. At least that's what it was like a couple of years ago. Maybe it's easier now.

    You mean no one at ATI thought of that possibility?  It's not like the Linux community hasn't pestered both ATI and Nvidia about bugs and begged them to release the specs so that they could write better drivers themselves.  AMD opened some specs (2D acceleration?) a year ago, but 3D graphics at least are still largely non-functional with the free drivers.

    As someone already mentioned, software moves fast.  Vista was released a bit under two years ago - exactly how well did 64-bit stuff work on it back then?  I recall even 32-bit Vista being plagued by problems, as was 64-bit XP.



  • @Soviut said:

     @belgariontheking said:

    I assure you that I just want shit to work no matter what OS I'm on.  I gave up on a bunch of things that didn't "just work" on both Windows and Linux.

    I agree, but I'm wondering what these things are exactly?

    Wireless networking. I've had no end of trouble with that, on both Windows and Linux. It's working on the Ubuntu Hardy machine I'm using now, but even having bought an adapter specifically sold as Linux-compatible, it was still a PITA sorting out the WPA whatnot and getting it to come up properly at boot. A couple of years ago I couldn't manage it at all; when XP worked, Linux didn't, and vice versa - I can't remember the details. I had to run unencrypted, with only MAC authentication as a defense.

    My solution to all wireless networking problems remains as it has always been - use a wire.



  • @tdb said:

    @Soviut said:

     @belgariontheking said:

    I assure you that I just want shit to work no matter what OS I'm on.  I gave up on a bunch of things that didn't "just work" on both Windows and Linux.

    I agree, but I'm wondering what these things are exactly?  I run probably the widest array of software among the computer-savvy people I know; graphics software like Photoshop, Flash and Maya, music software like Reason and Ableton Live, a wide selection of development tools from Visual Studio for C# to Notepad2 and E for Python, and games like Team Fortress 2 and the Steam client.  All that software installs easily, runs flawlessly and never leaves me wanting.

    The impression I get from a lot of people who get fed up is that something small bothered them, maybe the popup bubbles or something, and rather than disabling them, they threw up their hands and hit uninstall.  Not saying this is your case though.

    I'm perfectly happy with my selection of Linux software too.  They have all the features I want, and I can't remember the last time I encountered a bug.  I've helped fix a few in the past though, even provided source code patches.  I do a lot of software development at home, and I find the Linux tools for that to be a lot better than Windows tools of comparable price.  Games work pretty well under Wine too, and I can accept a few glitches in return for not having to switch to another OS every time I want to play a game.

    I can't remember why I originally switched to Linux anymore - it's been more than 7 years since that event.  It was probably a mixture of better dev tools, having a perfectly legal OS free of charge (I was experiencing an awakening about piracy), being able to know exactly what's on my computer and what each component does, and getting annoyed at Windows.  Back then Windows XP was barely out the door and since I had heard that Windows 2000 was not good for games, I was sticking to Windows 98SE.  That one certainly did have its share of problems, which I have blissfully forgotten by now. 

     

     >>>I can't remember the last time I encountered a bug.  I've helped fix a few in the past though, even provided source code patches

    hahahaha, you remember that you fixed bugs but do not remember the last one?

    I have met some of the Linux users that Soviut describes, who had a tiny issue and dumped the whole OS.  You now find them engineering massive workarounds just to get some simple job done under Linux instead of Windows.  I have also been told by these people that they virtualize Windows if needed.  Then I am told in a very, very nerdy tone "with the added benefit that you can easily load a fresh copy because after a few reboots XP gets slow...." followed by "oh, but I recommend doubling the memory and processors in the PC" (!)

    I run a very wide range of software under Windows and have not had any issues, let alone having to fix bugs in source code and send patches back to the OS developers.  I also use Linux, but I would never consider it for home or educational use.  I mainly use Linux for server/embedded use, i.e. dedicated tasks - yes, Linux has lost me a huge amount of time with poor documentation and bugs, but it suits these dedicated jobs well.

     

    Pick the right tool for the right job, I say.  Asking teachers to spend time implementing their own educational software is just not productive.  Just because anyone can write code does not mean that everyone should.





  • @Helix said:

     >>>I can't remember the last time I encountered a bug.  I've helped fix a few in the past though, even provided source code patches

    hahahaha, you remember that you fixed bugs but do not remember the last one?

    Obviously that sentence was somewhat poorly thought out.  I tend to easily dismiss minor bugs and use some trivial workaround.  Now that I think of it, I do remember the last time I encountered a major bug: last April, a bug in the kernel NFS server caused my file server to lock up when I tried to install UT3 with Wine on my workstation.  I upgraded the kernel and it has been fine ever since - 244 days of uptime and counting.

    @Helix said:

    I have met some of the Linux users that Soviut describes, who had a tiny issue and dumped the whole OS.  You now find them engineering massive workarounds just to get some simple job done under Linux instead of Windows.  I have also been told by these people that they virtualize Windows if needed.  Then I am told in a very, very nerdy tone "with the added benefit that you can easily load a fresh copy because after a few reboots XP gets slow...." followed by "oh, but I recommend doubling the memory and processors in the PC" (!)

    As I stated previously, there were more reasons for my switch to Linux than the problems with Windows.  And remember that it was 98SE back then, which was a lot less stable than XP.  It was not unusual to have to reboot after a day or two of heavy use because of leaked resource handles.  I don't know what sort of simple jobs you are talking about, or what you consider a massive workaround, but in my opinion I'm not using any.  If by "workaround" you mean "using my own script instead of sticking to existing software", then sure, I have several.  If a task is simple enough, I may not even bother looking for existing software and instead spend the 15 minutes to write a script.  And sometimes I write larger stuff just for the enjoyment I get from it, such as my firewall manager, which reads a simple rules file and syncs iptables state to it with minimal commands.  I wouldn't run a virtualized Windows though; it's too slow for games, which are the only thing I may need Windows for these days.  Most of them are fully playable under Wine anyway.
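A minimal sketch of that rule-syncing idea (not the actual tool - the rule strings, chain name, and function are illustrative): treat the current and desired rule sets as lists of strings, delete what shouldn't be there, and append what's missing, so untouched rules generate no churn.

```python
def sync_commands(current, desired, chain="INPUT"):
    """Compute minimal iptables commands turning `current` into `desired`.

    Rules are plain strings such as "-p tcp --dport 22 -j ACCEPT".
    This sketch ignores rule ordering, which a real firewall manager
    must preserve, since iptables matches rules first-hit-wins.
    """
    cmds = []
    for rule in current:
        if rule not in desired:
            cmds.append(f"iptables -D {chain} {rule}")  # drop stale rule
    for rule in desired:
        if rule not in current:
            cmds.append(f"iptables -A {chain} {rule}")  # add missing rule
    return cmds

# Example: close telnet, open ssh -- only two commands are emitted.
print(sync_commands(
    ["-p tcp --dport 23 -j ACCEPT", "-p udp --dport 53 -j ACCEPT"],
    ["-p tcp --dport 22 -j ACCEPT", "-p udp --dport 53 -j ACCEPT"],
))
```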

    @Helix said:

    I run a very wide range of software under Windows and have not had any issues, let alone having to fix bugs in source code and send patches back to the OS developers.  I also use Linux, but I would never consider it for home or educational use.  I mainly use Linux for server/embedded use, i.e. dedicated tasks - yes, Linux has lost me a huge amount of time with poor documentation and bugs, but it suits these dedicated jobs well.

    And I run a wide range of software under Linux and have no issues that bother me.  Blender may get stuck if I forget it in the background for a week, but it's not like Windows software never hangs.  Commercial and free software are two very different worlds.  The former is produced by big companies that spend large amounts of money on developing and testing it, while the latter is written largely by individuals in their free time at near-zero cost, who hope to get feedback from users if something doesn't work.  I dare say that the quality per cost for free software is better, even if the bleeding-edge versions have more bugs.

    @Helix said:

    Pick the right tool for the right job, I say.  Asking teachers to spend time implementing their own educational software is just not productive.

    I hope you are not claiming that my choice of Linux for my workstations is the wrong one.  I don't know about the selection of specifically educational software for Linux, but programs like AbiWord and Gnumeric are certainly good enough for teaching the principles of word processing and spreadsheets.  They are even similar enough to other office software packages that by applying just a little bit of thought, the students will be able to use any of them.

    @Helix said:

    Just because anyone can write code

    This seems to be the aim of programming language developers, yes.  All sorts of training wheels are added that make it easier to hide bugs.  Software was much better when only the truly skilled could create it.  I wouldn't count being able to copy-paste helloworld.cpp into Visual Studio and press F5 as being able to write code, though.

    @Helix said:

    does not mean that everyone should.
     

    Certainly not.  There are plenty of examples on this site of people who shouldn't write code. 



  • @belgariontheking said:

    I just couldn't get apache to fire up tomcat for me, so I just ran tomcat on port 8080.  Really, all I wanted was to learn J2EE in my own environment, so the 8080 solution was sufficient.

    This sounds more like an Apache/module issue than an issue with Windows.  Couldn't you get the exact same issue on Linux, or any other OS running Apache, for that matter?

    That actually brings up another good point: Apache.  WTF.  Anyone who enjoys configuring an Apache server is masochistic beyond belief.  I remember trying to get a basic WAMP stack configured.  I'm no slouch, but after 3 days I gave up and downloaded a pre-built WAMP stack installer.  After a day of the supposedly automatic solution not even configuring Apache correctly, I gave up yet again, turned on IIS, downloaded the Windows PHP installer, and was up and running in 2 minutes.



  • @tdb said:

    Software was much better when only the truly skilled could create it.

    Haha.  And I'll bet music was way better when only Mozart and Beethoven were writing it?  It doesn't matter how many people are writing code, making music, painting pictures, etc.; the cream will naturally rise to the top in one way or another.  Painters were once quoted as saying that photography would be the end of any need to paint.  After all, why would anyone want a painting when they could have a perfect photograph?

    What I'm saying is, new mediums don't introduce new ways to hide bugs, they introduce new ways to solve larger problems.  No longer is it an issue of CPU registers and bits; it's about bringing half the world together on a social network or linking police forces' databases to combat inter-city crime.  If Facebook was written in assembly it would still be 200 years away from launch.  That's about the number of human-years of research and progress that have gone into software development to make it possible for a globally recognized application to be developed by some college kids in their dorm room between classes.

    The next time someone says "they don't make it like they used to", consider the fact that school gyms used to be lined with asbestos, and knob-and-tube wiring was responsible for millions of house fires.



  • @Soviut said:

    turned on IIS and downloaded the Windows PHP installer and was up and running in 2 minutes.

    Sounds much like a LAMP installation on a Debian system.  Just install the required packages and you're good to go.


  • @Soviut said:

    @tdb said:

    Software was much better when only the truly skilled could create it.

    Haha.  And I'll bet music was way better when only Mozart and Beethoven were writing it?

    As a matter of fact, I do like classical music better than a lot of more recent so-called "music".  Particularly death metal, which I categorize as the drummer, guitarist and "singer" having a competition over who can make the biggest noise.

    @Soviut said:

    It doesn't matter how many people are writing code, making music, painting pictures, etc.; the cream will naturally rise to the top in one way or another.  Painters were once quoted as saying that photography would be the end of any need to paint.  After all, why would anyone want a painting when they could have a perfect photograph?

    Perhaps you have noticed a decline in the demand for portraits, for example?  Particularly when compared to the population of the Earth.  Paintings can still express things photographs cannot - there's no way you could take a photograph of a fantasy scene which does not exist.  Or an abstract mess of lines (who decided to call that art anyway?).

    The problem with programming is that with easy-mode programming tools, incompetent programmers can produce marginally functional code, which their equally incompetent managers will accept.  So perhaps this is partially a management WTF too.  The good coders are still out there, writing high-quality code for their well-deserved salaries, but when there are a hundred times more bad ones, they get drowned in the mass.

    @Soviut said:

    What I'm saying is, new mediums don't introduce new ways to hide bugs, they introduce new ways to solve larger problems.

    Not only do they enable hiding bugs, they create completely new possibilities for creating more bugs.  Consider garbage collection, for example.  You no longer have to worry about memory management, which certainly makes programming easier, doesn't it?  But forget a reference somewhere, and your application starts leaking memory.  Because GC systems generally make no guarantees about when they collect the garbage, verifying that everything gets properly freed is somewhere between hard and impossible.
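The forgotten-reference problem is easy to demonstrate in Python (any garbage-collected language behaves the same way); the names here are made up for illustration. The collector only frees unreachable objects, so one overlooked long-lived container defeats it completely:

```python
import gc

cache = {}  # long-lived "bookkeeping" dict that nobody ever clears

class Session:
    """Stand-in for some per-request object holding real resources."""
    pass

def handle_request(request_id):
    session = Session()
    cache[request_id] = session  # the forgotten reference
    return session

for i in range(10_000):
    handle_request(i)

gc.collect()        # collects nothing: every Session is still reachable
print(len(cache))   # 10000 Session objects still alive
```

No memory is ever "leaked" in the malloc-without-free sense, yet the process grows without bound - which is exactly why the bug is so hard to verify away.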

    @Soviut said:

    No longer is it an issue of CPU registers and bits; it's about bringing half the world together on a social network or linking police forces' databases to combat inter-city crime.  If Facebook was written in assembly it would still be 200 years away from launch.  That's about the number of human-years of research and progress that have gone into software development to make it possible for a globally recognized application to be developed by some college kids in their dorm room between classes.

    The next time someone says "they don't make it like they used to", consider the fact that school gyms used to be lined with asbestos, and knob-and-tube wiring was responsible for millions of house fires.

    I'm not saying that we should still use assembly, or that all advances in software development have been bad.  I do like C++ more than C, and there are some great tools like Valgrind that didn't exist ten years ago.  But things like memory management and flow control are essential to efficient programming.  Making things too easy, and especially marketing them as easier than they are, attracts the sort of people that write bad code.  Imagine what would happen if someone started selling kits to "build your own house in two weeks".  How many electrical, structural and other hazards would there be within the first month?


  • @tdb said:

    I'm not saying that we should still use assembly, or that all advances in software development have been bad.  I do like C++ more than C, and there are some great tools like Valgrind that didn't exist ten years ago.  But things like memory management and flow control are essential to efficient programming.  Making things too easy, and especially marketing them as easier than they are, attracts the sort of people that write bad code.  Imagine what would happen if someone started selling kits to "build your own house in two weeks".  How many electrical, structural and other hazards would there be within the first month?
     

    But that hasn't stopped millions of children from building treehouses, clubhouses and pillow forts. The difference is that nobody expects them to be fully functional, equivalent houses. The same should hold true for software development; the difference is that managers, online advertisers, etc. expect or claim that it is the same.

    See it like this: a wheelbarrow greatly increases house-building speed while reducing the chance of failures (people dropping things, people injuring their backs). It does bring some new problems (the wheel breaking off, the barrow tipping over), but generally it's considered very useful. It just isn't "the ultimate solution that reduces house-building to a child's task", something programming languages and/or tools claim a little too much.



  • @tdb said:

    @Soviut said:

    turned on IIS and downloaded the Windows PHP installer and was up and running in 2 minutes.

    Sounds much like a LAMP installation on a Debian system.  Just install the required packages and you're good to go.

    Maybe you didn't read my previous sentence about the WAMP stack that wouldn't work?  We're talking about a version-controlled, pre-configured package, and it still didn't manage to get Apache running.  With IIS I clicked the "go" button, double-clicked the PHP installer, which automatically sets up an IIS hook, and was on my way.



  • @tdb said:

    But things like memory management and flow control are essential to efficient programming

    If garbage collectors have taught us anything, it's that these issues are typically best left out of the hands of programmers unless you love leaks.  Hell, a good modern GC can automatically adjust itself during the lifecycle of an app to optimize for several different memory scenarios.  Yes, I know, GCs are written in C/C++/ASM, but that's exactly my point; the heavy lifting has been done for us, so leverage it.  Stop reinventing the wheel in the name of "efficient programming"; it's time to bring a new term to the front line: "productive programming".

    I want to think like a problem solver when I'm developing, I don't want to think like a computer.



  • @Soviut said:

    Maybe you didn't read my previous sentence about the WAMP stack that wouldn't work?  We're talking about a version-controlled, pre-configured package, and it still didn't manage to get Apache running.  With IIS I clicked the "go" button, double-clicked the PHP installer, which automatically sets up an IIS hook, and was on my way.
     

    Strange. In my early web-development days I never had any problems with XAMPP, on both Windows and two distros of Linux.



  • @Soviut said:

    @belgariontheking said:

    I just couldn't get apache to fire up tomcat for me, so I just ran tomcat on port 8080.  Really, all I wanted was to learn J2EE in my own environment, so the 8080 solution was sufficient.

    This sounds more like an apache/module issue than an issue with Windows.  Couldn't you get the exact same issue on Linux, or any other OS running apache for that matter?

    That actually brings up another good point. Apache.  WTF.  Anyone who enjoys configuring an apache server is sadistic beyond belief.  I remember trying to get a basic WAMP stack configured.  I'm no slouch, but after 3 days I gave up and downloaded a pre-built WAMP stack installer.  After a day of supposed automatic solution not even configuring apache correctly, I gave up yet again, turned on IIS and downloaded the Windows PHP installer and was up and running in 2 minutes.

     

     

    OMFG! I hope you don't call yourself a "computer expert", as it sounds like you really have no idea at all.



  • @jay019 said:

    OMFG! I hope you don't call yourself a "computer expert", as it sounds like you really have no idea at all.

    I never claimed to be a "computer expert". I'm detailing "Apache was hard to set up" and "IIS was not".  Just because I can write code doesn't mean I'm a guru with web servers, or that I have the time and patience to read 300 pages of forum posts detailing what exact version conflict or minute configuration issue I'm running into that's preventing things from just working, so I can get some work done.

    I sort of resent the fact that just because my job is a technical one, that I'm somehow no longer an ordinary user anymore.  If a mechanic had to drive to the store to get milk, do you think he'd take the high performance kit car that requires him to manually install the engine before he can leave, or the minivan he's got sitting in his driveway ready to go?



  • @dtech said:

    Strange. In my early web-development days I never had any problems with XAMPP, on both Windows and two distros of Linux.

    I don't doubt that the xampp/wamp/lamp pre-configured bundles work for most people, but it's tough for any of them to compete with the fact that the IIS setup was one click and the PHP install was about 4 clicks.  I pretty much loathe PHP at this point, but I tip my hat to whoever was responsible for building that installer; it did its job perfectly.



  • @Soviut said:

    @tdb said:

    @Soviut said:

    turned on IIS and downloaded the Windows PHP installer and was up and running in 2 minutes.

    Sounds much like a LAMP installation on a Debian system.  Just install the required packages and you're good to go.

    Maybe you didn't read my previous sentence about the WAMP stack that wouldn't work?  We're talking about a version-controlled, pre-configured package, and it still didn't manage to get Apache running.  With IIS I clicked the "go" button, double-clicked the PHP installer, which automatically sets up an IIS hook, and was on my way.

    I did, and the point of my comment was that there are platforms on which Apache works as easily as IIS does on Windows.  Just because the automatic setup fails on Windows does not mean Apache is bad as a whole.  A lot of web servers do run Linux, after all.

    Was that an official package from apache.org or some third-party one, btw?



  • @Soviut said:

    @tdb said:

    But things like memory management and flow control are essential to efficient programming

    If garbage collectors have taught us anything, it's that these issues are typically best left out of the hands of programmers unless you love leaks.  Hell, a good modern GC can automatically adjust itself during the lifecycle of an app to optimize for several different memory scenarios.  Yes, I know, GCs are written in C/C++/ASM, but that's exactly my point; the heavy lifting has been done for us, so leverage it.  Stop reinventing the wheel in the name of "efficient programming"; it's time to bring a new term to the front line: "productive programming".

    I want to think like a problem solver when I'm developing, I don't want to think like a computer.

    With proper use of automatic storage (= stack variables) and encapsulation in C++, there is not much need to worry about memory management.  It is not hard to use the proper tools to find and fix the small amount of leaks that may remain.  Some of them can even point out other potential errors in the program.  Having control of object destruction can be especially important when file or device handles are involved.  I seem to recall that in Java you need to manually close files or risk leaking handles.

    Being able to ignore the details of how computers work may be nice for small projects, but fact is, if you want to write efficient code, you have to know how the hardware works.  Sometimes down to analyzing assembly output and optimizing memory accesses for caching.  No wonder games are still written in C++. 



  • @tdb said:

    @Soviut said:

    @tdb said:

    @Soviut said:

    turned on IIS and downloaded the Windows PHP installer and was up and running in 2 minutes.

    Sounds much like a LAMP installation on a Debian system.  Just install the required packages and you're good to go.

    Maybe you didn't read my previous sentence about the WAMP stack that wouldn't work?  We're talking about a version-controlled, pre-configured package, and it still didn't manage to get Apache running.  With IIS I clicked the "go" button, double-clicked the PHP installer, which automatically sets up an IIS hook, and was on my way.

    I did, and the point of my comment was that there are platforms on which Apache works as easily as IIS does on Windows.  Just because the automatic setup fails on Windows does not mean Apache is bad as a whole.  A lot of web servers do run Linux, after all.

    Was that an official package from apache.org or some third-party one, btw?

     

    I wanted to play around with PHP the other day (never used it before.. sorry lol) so I read up about this WAMP, downloaded  and installed it.  I created an index.php with a form (from a tutorial, very simple), placed it into a folder under /www, clicked on the little speed dial thingy on my toolbar, clicked 'start all processes' and bam! I was viewing my web page.  Mind you this was all done on Win XP.  I am not sure what kind of special configurations to WAMP people are doing, but I found it to be the easiest thing in the world to do. 

    Basically I downloaded WAMP (19 MB) in 2 minutes.  Installed the software: 1 minute.  Wrote the .php files (two, an action and a form): 3 minutes.  Started the server: 2 seconds.  Opened my browser: 4 seconds.  So in 6 minutes and 6 seconds I was able to download a web server, install it, write the web page and have everything up and running and viewable.  Seemed pretty easy to me.



  • @tdb said:

    With proper use of automatic storage (= stack variables) and encapsulation in C++, there is not much need to worry about memory management.  It is not hard to use the proper tools to find and fix the small amount of leaks that may remain.  Some of them can even point out other potential errors in the program.  Having control of object destruction can be especially important when file or device handles are involved.  I seem to recall that in Java you need to manually close files or risk leaking handles.

    Being able to ignore the details of how computers work may be nice for small projects, but fact is, if you want to write efficient code, you have to know how the hardware works.  Sometimes down to analyzing assembly output and optimizing memory accesses for caching.  No wonder games are still written in C++.

    C++ is a godawful collision of low-level and high-level paradigms.  It should not exist, and just because you can come up with a ton of workarounds and band-aids to compensate for its shittiness does not mean it is a sensible tool.  Why in the name of God would you want to manage memory in this day and age?  Why??



  • @amischiefr said:

    @tdb said:

    @Soviut said:

    @tdb said:

    @Soviut said:

    turned on IIS and downloaded the Windows PHP installer and was up and running in 2 minutes.

    Sounds much like a LAMP installation on a Debian system.  Just install the required packages and you're good to go.

    Maybe you didn't read my previous sentence about the WAMP stack that wouldn't work?  We're talking about a version-controlled, pre-configured package, and it still didn't manage to get Apache running.  With IIS I clicked the "go" button, double-clicked the PHP installer, which automatically sets up an IIS hook, and was on my way.

    I did, and the point of my comment was that there are platforms on which Apache works as easily as IIS does on Windows.  Just because the automatic setup fails on Windows does not mean Apache is bad as a whole.  A lot of web servers do run Linux, after all.

    Was that an official package from apache.org or some third-party one, btw?

     

    I wanted to play around with PHP the other day (never used it before.. sorry lol) so I read up about this WAMP, downloaded  and installed it.  I created an index.php with a form (from a tutorial, very simple), placed it into a folder under /www, clicked on the little speed dial thingy on my toolbar, clicked 'start all processes' and bam! I was viewing my web page.  Mind you this was all done on Win XP.  I am not sure what kind of special configurations to WAMP people are doing, but I found it to be the easiest thing in the world to do. 

    Basically I downloaded WAMP (19 MB) in 2 minutes.  Installed the software: 1 minute.  Wrote the .php files (two, an action and a form): 3 minutes.  Started the server: 2 seconds.  Opened my browser: 4 seconds.  So in 6 minutes and 6 seconds I was able to download a web server, install it, write the web page and have everything up and running and viewable.  Seemed pretty easy to me.

     

     

    And to think that was on Windows. lol. On Linux I had my web server up and running after my first boot. Seriously Soviut, what the hell makes Apache so hard for you? Even configuring virtual hosting is a piece of piss. If a retard like me (someone whose job is collecting trolleys for $5 an hour) can handle it, anyone who works in the computer field and cannot get it up and running in less than 30 minutes should hang their head in shame.



  • @morbiuswilters said:

    @tdb said:

    With proper use of automatic storage (= stack variables) and encapsulation in C++, there is not much need to worry about memory management.  It is not hard to use the proper tools to find and fix the small amount of leaks that may remain.  Some of them can even point out other potential errors in the program.  Having control of object destruction can be especially important when file or device handles are involved.  I seem to recall that in Java you need to manually close files or risk leaking handles.

    Being able to ignore the details of how computers work may be nice for small projects, but fact is, if you want to write efficient code, you have to know how the hardware works.  Sometimes down to analyzing assembly output and optimizing memory accesses for caching.  No wonder games are still written in C++.

    C++ is a godawful collision of low-level and high-level paradigms.  It should not exist, and just because you can come up with a ton of workarounds and band-aids to compensate for its shittiness does not mean it is a sensible tool.  Why in the name of God would you want to manage memory in this day and age?  Why??

    I'm a control freak and a perfectionist.  C++'s destructors are absolutely fabulous, and I can't offhand think of any other language that has them, although I'm not quite sure about C#.  If the core language or standard library doesn't have some feature, that doesn't mean that implementing it myself is a workaround.  I like (re)implementing all sorts of stuff myself; it's fun.

    At work I write code for embedded Linux devices.  In that world it's necessary to maintain tight control of the resulting machine code, sometimes down to optimizing memory fetches in tight loops.  We use mostly C (with glib - I hate debugging leaked references), but some C++ too.



  • @tdb said:

    C++'s destructors are absolutely fabulous, and I can't outright think of any other language that has them, although I'm not quite sure about C#.

     you can get deterministic cleanup in .NET through the IDisposable interface (which is not quite a destructor, but serves the same purpose), and you can control exactly when it happens with the "using" statement, which calls Dispose() on exit from the block.  For instance:

     

    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        // do stuff with the connection
    } // connection.Dispose() is called here, releasing the underlying handle

     

    The great thing about this is that Dispose() still runs if you throw an exception or return from within the using braces.



  • @jay019 said:

    @amischiefr said:

    @tdb said:

    @Soviut said:

    @tdb said:

    @Soviut said:

    turned on IIS and downloaded the Windows PHP installer and was up and running in 2 minutes.

    With IIS I clicked the "go" button, double-clicked the PHP installer, which automatically sets up an IIS hook, and was on my way.

    Seemed pretty easy to me.

    Seriously Soviut, what the hell makes apache so hard for you?

     

    Maybe Soviut didn't turn IIS off properly before trying to start Apache? I once tried to get Apache running on a Windows SBS 2003, but IIS still takes port 80 on all IPs even if you tell it otherwise ("well behaved" software?), and various SBS features broke when I tried workarounds. I ended up installing PHP into IIS (at extra cost, as the project used mod_rewrite extensively), which was easy enough; still not as easy as apt-get though ;-)

     



  • @tdb said:

    Was that an offial package from apache.org or some third-party one btw?

    This was a while ago, I think I was using PHP4 at the time.  However, my first attempt was to grab the Windows version of Apache, which, as I recall, even had an installer.  It installed fine, but suddenly I'm faced with the manual task of setting up the PHP modules.  I know if you're a web admin and you work with Apache all day, this is probably a trivial task, but roll in the whole "by the way, make sure you get this version of PHP as the 0.0.0.1 release before it was incompatible with the 1.0.2.1 release of mod_php which can't run on apache 3.1.2 on Windows XP SP1 when the moon is full" and it gets tricky for someone who just wants to write some code.  Like I said, I'm not a web server admin, just a developer who happens to be using the web as a platform.

    As I recall, I spent 3 days researching how to get the PHP module to actually run under Apache, how to get requests piped to it, and how to actually return its responses.  I pored over the configs, version-conflict docs, etc. and I was sure I'd got every single setting just right, yet Apache wouldn't serve the phpinfo() test page when I tried to browse to my localhost.  Port 80 was clear; I checked that.  Pinged the server; it appeared to be running.  Browsed to localhost and got the Apache test page.  But PHP just wouldn't hook.  So I finally gave up, and almost cried when IIS "just freakin' worked".  But now I was 3 days behind on my project.
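
    For reference, the hook that never worked boils down to a few httpd.conf lines like these (a rough sketch for Apache 2.x with PHP 5; the exact module and DLL names varied by release, which was exactly the problem):

```apache
# httpd.conf -- hypothetical paths; exact module/DLL names varied by PHP release
LoadModule php5_module "C:/php/php5apache2.dll"
AddHandler application/x-httpd-php .php
PHPIniDir "C:/php"
```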



  • @Zemm said:

    Maybe soviut didn't turn IIS off properly before trying to start Apache?

    I checked that.  It was off.  Hell, the service wasn't even installed by default.  So I guess I technically lied when I said it was 2 clicks to get IIS running, it was about 4 since I had to install the service from the Windows CD.



  • @jay019 said:

    On linux I had my web server up and running after my first boot. Seriously Soviut, what the hell makes apache so hard for you?

    I don't run a Linux server.  You do.  That about sums it up right there.  You're familiar with it, I'm not.  Just like I'm sure you'd have a tough time writing tools for Autodesk Maya and pipelining a 3D animation studio, but to me it seems "retardedly simple".

    To clarify, it wasn't difficult to get Apache running.  It was difficult to get the PHP module to be recognized, configured properly, and take requests.



  • @Soviut said:

    This was a while ago, I think I was using PHP4 at the time.  However, my first attempt was to grab the Windows version of Apache, which, as I recall, even had an installer.  It installed fine, but suddenly I'm faced with the manual task of setting up the PHP modules.  I know if you're a web admin and you work with Apache all day, this is probably a trivial task, but roll in the whole "by the way, make sure you get this version of PHP as the 0.0.0.1 release before it was incompatible with the 1.0.2.1 release of mod_php which can't run on apache 3.1.2 on Windows XP SP1 when the moon is full" and it gets tricky for someone who just wants to write some code.  Like I said, I'm not a web server admin, just a developer who happens to be using the web as a platform.

    As I recall, I spent 3 days researching how to get the PHP module to actually run under Apache, how to get requests piped to it, and how to actually return its responses.  I pored over the configs, version-conflict docs, etc. and I was sure I'd got every single setting just right, yet Apache wouldn't serve the phpinfo() test page when I tried to browse to my localhost.  Port 80 was clear; I checked that.  Pinged the server; it appeared to be running.  Browsed to localhost and got the Apache test page.  But PHP just wouldn't hook.  So I finally gave up, and almost cried when IIS "just freakin' worked".  But now I was 3 days behind on my project.

     

    That doesn't really sound to me like a pre-configured WAMP that didn't work...

    Also, I don't know if this was available when you tried it, but PHP has a standard installation manual for Apache on Unix (1.3 and 2.0) and Windows (and a whole lot of other things). If you follow that, it shouldn't be too hard.

    Alternatively you have, as I said, a lot of pre-configured WAMP installers available (of which I think XAMPP and WampServer are the most used). You even have a few pre-configured LAMP installers, but most people just use the pre-configured binaries from their distro's package system (e.g. apt), because those also "just work" most of the time and have the advantage of auto-updating.



  • @dtech said:

    That doesn't really sound to me like a pre-configured WAMP that didn't work...

    Like I said, I spent 3 days trying to manually configure Apache to run PHP, gave up and spent another day trying to get a WAMP installation to work, then gave up again and went with IIS.



  • @Soviut said:

    @jay019 said:

    On linux I had my web server up and running after my first boot. Seriously Soviut, what the hell makes apache so hard for you?

    I don't run a Linux server.  You do.  That about sums it up right there.  You're familiar with it, I'm not.  Just like I'm sure you'd have a tough time writing tools for Autodesk Maya and pipelining a 3D animation studio, but to me it seems "retardedly simple".

    To clarify, it wasn't difficult to get Apache running.  It was difficult to get the PHP module to be recognized, configured properly, and take requests.

     

    Firstly, I don't run a Linux server. I run a Linux desktop which has Apache/PHP/MySQL installed. It's for testing purposes, and I am in no way an administrator, nor have I ever claimed to be.

    OK, so is it Apache that's a WTF, or the Windows PHP installer, or what? This all started because you claimed that Apache is a WTF because you failed at making it work. That would be as stupid as me claiming Autodesk Maya is a WTF if I failed at writing tools for it. I would just admit I was the WTF, not the tool I had difficulty understanding.

     

    @Soviut said:

    As I recall, I spent 3 days researching how to get the PHP module to actually run under apache, how to get requests piped to it, and actually return its responses.  I poured over the configs, version conflict docs, etc. and I was sure I'd got every single setting just right, yet Apache wouldn't do the phpinfo() test page when I tried to browse to my localhost.  Port 80 was clear, I checked that.  Pinged the server, it appeared to be running.  Browsed to the localhost and got the Apache test page.  But PHP just wouldn't hook.  So I finally gave up, and almost cried when IIS "just freakin' worked".  But now I was 3 days behind on my project.


    So it seems that you are TRWTF. You spent 3 days trying to get something working for a project of yours, yet you didn't even have the prerequisite knowledge of how to set up the tools you needed for the project. A bit overambitious maybe? 3 days behind, my eye! Sounds like a project you shouldn't have been a part of in the beginning. Sorry if it sounds harsh, but reality usually is.



  • @jay019 said:

    Firstly, I don't run a Linux server. I run a Linux desktop which has Apache/PHP/MySQL installed.

    Still more experience than I had at the time.  Who knows, maybe these days it's easier, and I'm certainly a lot more experienced and might have less trouble.  I just remember being in config hell, and I'm not alone when I say this.

    @jay019 said:

    Sounds like a project you shouldn't have been a part of in the beginning.

    It was a personal project and I wasn't on a deadline, but regardless of whether I was competent enough to get Apache/PHP running, I still wasted 3 days I could have spent doing what I know how to do (write code, design, CSS, HTML), when IIS got it running more automatically than even the automatic solutions.  I went into this figuring that I was technically savvy enough to figure it out.  I don't love wading through config files, but typically I can follow instructions and get things working.  Not this time; Apache buried me.



  • @jay019 said:

    That would be as stupid as me claiming Autodesk Maya is a WTF if I failed at writing tools for it.

    Believe me, Maya IS a WTF.  It's huge, opaque, and generally requires both an artistic background and a technical background to even understand what's going on.  If someone complained that Maya was giving them issues, the last person I'd blame would be the user.

    It's got poor defaults, reams of settings, non-standard UIs, two interpreted languages, DAG hierarchy systems, node network systems, fragile referencing, etc. etc. etc.  Don't even think for a second you wouldn't blame Maya when you had to figure out all this shit; that's exactly how I felt when dealing with Apache and module configuration.

