Home Web Server



  • I'm interested in creating a web server that serves a small community (< 5000 people, though peak load is likely going to be < 100 concurrent users). This is mainly prototyping for a new service I'm working on; the final version will be hosted on dedicated hardware.

    I used to (7-10 years ago) make a lot of websites using things like PHP, MySQL, and Apache.

    But

    I haven't done anything like that in years and years. These days, I typically create web services that get loaded into ASP.NET MVC applications, and permissions are managed by somebody else. Frequently it's a request/reply setup.

    Get To The Point
    I'm not an IIS sysadmin - I am vulnerable to default configuration errors

    This service will:

    • Host a web site that will likely include the usual bells and whistles of an MVC 3 application
    • Host a queryable web API (base64'd JSON data)
    • Connect to a local database on my LAN using properly permissioned reader/writer views and stored procedures (though I could use recommendations for securing connection strings, etc. for an MVC application through IIS)
    • Use ASP.NET MVC 3 / IIS 7.5
    • Likely talk to a PHP wiki - provider TBD (Apache/nginx) - which may or may not be hosted on the same server.

    I'm looking for the following:

    • Security best practices for hardening a web server (IIS 7.5), so that I'm unlikely to leave dangerous default configurations in place (e.g. web.config debug="true" - see the sketch after this list). (Links are good - I can read.)
    • Routing recommendations: the current plan is to point a domain's A record at a no-ip style dynamic DNS service, and have that point at my local server. Yes, I realize this is less than perfectly reliable; no, that's not a big deal, as long as some futzing can bring it back online.
    • Caveats I can address for connection pools, server threads, scheduled tasks, etc. (I don't need spoon-feeding if you know some search terms to get me started.)
    • Any other helpful feedback you wish to provide
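
    For concreteness, here's the sort of production override I already know to look for - a sketch with illustrative values, and exactly the kind of thing I'd like confirmed or corrected:

        <!-- sketch only: production web.config fragments, values illustrative -->
        <configuration>
          <system.web>
            <!-- never ship debug="true" -->
            <compilation debug="false" targetFramework="4.0" />
            <!-- hide stack traces from remote visitors -->
            <customErrors mode="RemoteOnly" defaultRedirect="~/Error" />
            <!-- stop advertising the ASP.NET version -->
            <httpRuntime enableVersionHeader="false" />
          </system.web>
          <system.webServer>
            <httpProtocol>
              <customHeaders>
                <!-- IIS adds X-Powered-By by default; removing it is harmless -->
                <remove name="X-Powered-By" />
              </customHeaders>
            </httpProtocol>
          </system.webServer>
        </configuration>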

    Basically, I normally write the 'guts' of applications to get things done, and hand them off to others to integrate into the actual IIS server and set up permissions for the application layer. Probably sounds like a weird setup, but it's what I do.

    I have a basic knowledge of how to get an application up and running, but I have VERY limited knowledge in securing that application at the administration level. (I normally handle user permissions inside of an application)

    Again, this is for prototyping. I will be able to control how many people have access to this server, but I'd like to make it public to the small community as a whole. If it turns out not to be feasible, I'll upgrade to managed hosting. This is both for the learning experience and to make information available earlier than it otherwise would be.



  • Why not contract the work out to someone? You could work with them so that you learn along the way, but if it's going to be a production system, there may be too much involved to set it up properly without bringing experienced people into the direct management of the server.



  • Consider a topology where you set up nginx as a caching proxy for your app servers. That's what nginx is best at (though it's no slouch at just serving pages or even spawning apps).

    Keep the proxy apart from the app servers, though. So that would mean having 2 nginx servers -- the wiki app server and the proxy.
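
    Roughly this shape, as a sketch - the addresses and timings below are made up, tune to taste:

        # nginx as a caching proxy in front of the app servers (sketch)
        proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=appcache:10m max_size=1g;

        server {
            listen 80;
            server_name example.com;

            location / {
                proxy_pass http://10.0.0.2:80;         # your IIS / app box
                proxy_cache appcache;
                proxy_cache_valid 200 302 5m;          # briefly cache good responses
                proxy_set_header Host $host;
                proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            }
        }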

    It's 2014. A web server that needs "hardening" isn't worth using. Seriously. If IIS needs hardening, run away.

    Use configuration management and deployment automation software. I don't know what the options are on Windows, but setting up servers is boring. You only want to have to do it once.



  • What's your upstream bandwidth like? Residential ISPs tend to cut corners there, and being able to send things quickly is rather a requirement for a web server.



  • I consistently have 35 Mbps down and 15 Mbps up. I've also proven that web services can be supported for my target community, though persistent connections are still a bit TBD.

    For what I'm planning, it should be able to handle the load, but a caching server is a good idea, and I have the hardware for it. I have two reasonably beefy machines that will be part of the network, and two lower-end machines, either of which would make for a decent gaming PC if I gave it a graphics card.

    As far as contracting goes, I'd rather learn how it all fits together, and which pieces need to change from an out-of-the-box install.

    I would guesstimate I'd start noticing network degradation (for home services) once usage passes around 20 Mbps down / 10 Mbps up. The odds of my (primarily text-based) service exceeding those thresholds seem exceedingly slim - back of the envelope, even 100 concurrent users each pulling a 50 KB page every 10 seconds is only about 4 Mbps of upstream. (This isn't a game, or a game server.)



  • I'm by no means a 'security' or 'deployment' guy, but I've seen my share of IIS production environments and have set up a few myself. My take on it is: there's nothing to it. No special security setup, no secret voodoo sauce. It's usually just: take the dev web.config, plop in the production DB connection string, and you're (more or less) good to go. Whatever security theater the guys in your company are pulling that got you so scared, they're probably full of it.

    That said, like I said, I'm far from an expert. And my experiences are mostly tied to small intranet apps, without huge security concerns (no money or private data). I would also be curious to learn if there's some "best practices" handbook I'm missing.



  • IIS is secure by default. That's why it's such a PITA to use for dev and everybody uses IIS Express. Your dev web.config probably is not. Use a fresh .config with only the changes you absolutely need.



  • I see we're back to green hearts.

    Re: running scared - not really. I just don't want to pretend I'm an IIS expert. I know some basics, but I figured this is the place to hear 'you're doing it wrong, idiot.'


  • :belt_onion:

    @blakeyrat said:

    IIS is secure by default. That's why it's such a PITA to use for dev and everybody uses IIS Express. Your dev web.config probably is not. Use a fresh .config with only the changes you absolutely need.

    Agreeing with blakeyrat - IIS 7.5 does fine security-wise for us (and we host a fairly major website). The most insecure part is what you set up in your custom web.config, or what happens if you go bungling around in the 7.5 control panel. That said, I've seen plenty of third-party contractors do exactly what blakeyrat implies: use the dev web.config in production, with full error messages, debugging, everything turned on, because they're too lazy to configure a proper production one.

    Also, there is one bit of security that should be done: if you need to put database connection strings or other passwords in your application/web .config files, you should encrypt those sections of the config. See http://msdn.microsoft.com/en-us/library/vstudio/zhhddkxy(v=vs.100).aspx or find a link appropriate to whatever your .NET version is.
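
    The usual incantation, from an elevated prompt, looks something like this (the paths are examples; use the Framework64 folder if your app pool is 64-bit):

        rem encrypt the connectionStrings section in-place (path illustrative)
        cd %WINDIR%\Microsoft.NET\Framework\v4.0.30319
        aspnet_regiis -pef "connectionStrings" "C:\inetpub\wwwroot\MyApp"

        rem decrypt again when you need to edit it
        aspnet_regiis -pdf "connectionStrings" "C:\inetpub\wwwroot\MyApp"

    Your application keeps reading the section as if nothing happened; decryption is transparent at runtime.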



  • Glad the two things I touched on (web.config and encrypting connection strings) appear to be the only real concerns brought up in this topic so far.


  • Discourse touched me in a no-no place

    @Matches said:

    I see we're back to green hearts.

    Huh? I've not touched them...


  • Discourse touched me in a no-no place

    @PJH said:

    Huh? I've not touched them...

    I don't see green hearts either. Is @Matches colour-blind? (It's quite a common thing, so no shame in admitting it…)


  • Discourse touched me in a no-no place

    I've seen green hearts too, but they're red today.

    Let me just check something...

    ...yup, that's it. They're red on desktop and green on mobile.

    Chrome on Windows 7

    Chrome on Android 4.4.4 (Nexus 5)


    Filed under: ...probably for no particular reason. Consistency is a bitch!

  • Discourse touched me in a no-no place

    @DoctorJones said:

    They're red on desktop and green on mobile.

    What next, blue hearts on print media? What colour are they when using a screen reader?


  • Discourse touched me in a no-no place

    @DoctorJones said:

    They're red on desktop and green on mobile.

    Oh bollocks. Forgot about that...

    Edit: Fixed.


  • Discourse touched me in a no-no place

    @PJH said:

    Oh bollocks. Forgot about that...

    Edit: Fixed.

    Do I take it that this is some custom stuff that we've done?


    Filed under: I wish more people responded to support requests with "Oh bollocks"

  • Discourse touched me in a no-no place

    @dkf said:

    What colour are they when using a screen reader?

    Mumble-gurgle-babble.


  • Discourse touched me in a no-no place

    @DoctorJones said:

    Do I take it that this is some custom stuff that we've done?

    Yup.

    http://what.thedailywtf.com/t/the-like-icon-wut/1694/69


  • Discourse touched me in a no-no place

    @PJH said:

    Edit: Fixed.

    I can verify; we now have red hearts on mobile.


    Filed under: PJH: rapid response technician

  • Discourse touched me in a no-no place

    @DoctorJones said:

    Filed under: PJH: rapid response technician

    Only to stuff I break. And if I'm online when people notice it...



  • Blue hearts were only around for 20 minutes or so; they didn't look good. /Not kidding


  • Discourse touched me in a no-no place

    If you get wounded or killed during your service to the TDWTF forums, you'll be awarded a Purple Heart.



  • Being killed is a barrier to reading.



  • Anybody have recommendations for an OS for the NGINX server?

    The current plan is:

    • Install the main OS (likely XenServer)
    • Install VM1 (nginx + Mono on CentOS 7 - this should allow MVC 3 hooks and PHP processing (such as the wiki))
    • Install VM2 (IIS 7.5 on Win7 Pro** - this will be the main server for MVC 3 pages)

    Users would only be able to access the main configuration from the primary OS, which would issue the proper delegates to VM1/VM2 (though, knowing me, I'm probably thinking about this backwards).

    The concept is that this prototype stages out how the actual servers would behave: the main OS would act as a caching/load-balancing server that manages worker delegation, and VM1/VM2 would each be physical hardware in the end product.
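
    If the front box ends up doing the delegation in nginx, I'm picturing something like this (names and addresses hypothetical):

        # sketch: front proxy delegating to the VMs
        upstream wiki_backend { server 10.0.0.1:80; }   # VM1: nginx + PHP (wiki)
        upstream mvc_backend  { server 10.0.0.2:80; }   # VM2: IIS 7.5 (MVC 3)

        server {
            listen 80;
            server_name example.com;

            location /wiki/ { proxy_pass http://wiki_backend; }
            location /      { proxy_pass http://mvc_backend; }
        }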

    ** A real buildout of this would use Windows Server 2012.

    [Edit] Decided to simplify life.


  • BINNED

    @Matches said:

    Install the main OS (likely linux, OS recommendations welcome, probably going to use a UI for active administration [on demand] and ssh for the quick log in type items.)

    For virtualization? Proxmox seems fine. Has a Web UI to administer shit. Debian-based.

    @Matches said:

    Install VM3 (Mono Linux OS - probably CentOS 5

    CentOS 5? CentOS 6 is behind the curve on the packages, CentOS 5 is ancient. You sure that wasn't a typo?

    And no, not requoting your ninja.



  • I'd just go with a Debian server image, or an Ubuntu Cloud Image. If you go Ubuntu, make sure to use the long-term support release (14.04 at this point).



  • Oh shit, it is.

    I was going by the Mono documentation. Well then, updating that to CentOS 7.


  • BINNED

    @Captain said:

    I'd just go with a Debian server image. Or an Ubuntu Cloud Image. If you go Ubuntu, make sure to use the long term release (14.04 at this point)

    Also, as a midpoint, Linux Mint Debian Edition has an interesting thing going on: it's based on Debian stable, with selected updated packages from testing delivered in an upgrade pack every 6 months.

    It's not intended for server use though, so it comes with a desktop and gubbins, but I like the idea. I'm running it on my work machine, though we did go with Debian for servers.



  • Made some edits after thinking about it.



  • As another question,

    If I have a domain pointed at a no-ip configuration, which in turn points at my local machine, and I'm testing locally (from a different machine on the same LAN), will requests properly go out to the net and come back as if I were a public viewer? Or will things be smart enough to resolve the DNS entry to an IP that matches my own, and automagically convert it to a local LAN IP?

    (Yes, I realize this is an odd question. But I'd rather not have to test everything from mobile when I want to test the public page.)
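
    From what I've read, the search terms for this are "NAT loopback" or "hairpin NAT" - router support varies. If mine doesn't do it, I assume the fallback is a hosts-file override on the test machine (made-up names below):

        # C:\Windows\System32\drivers\etc\hosts on the LAN test box
        # point the public name straight at the server's LAN address
        192.168.1.10    example.no-ip.org

    That would test the name-based routing, but obviously not the actual WAN path.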



  • @Onyx said:

    For virtualization? Proxmox seems fine. Has a Web UI to administer shit. Debian-based.

    We are using Proxmox on one server. It's pretty neat.

    As for CentOS, I'm using it on a bunch of machines. Not a big fan. Prefer Debian myself.



  • I'm honestly not a big fan of any Linux distro. I have some experience with CentOS 4, some older Ubuntu installs (12), and some Debian (much more recent, but I'm definitely not going back to that).

    Honestly, the Knoppix live disc I had back in 2006 felt a lot more user-friendly than most of the Linux distros I've seen. I really hate most Linux package managers and how they throw shit wherever they please.

    Granted, this is largely ignorance of the Linux environment, but god dammit, why do I have to work so hard to get basic things to work?



  • Coming from "natural" desktop environment, like Windows or Mac, there's like a barrier you need break through. Barrier to linux, ha ha. Ahm..

    At first, you're working inside whatever desktop environment they gave you and desperately hope nothing goes wrong. But something always does. Then you go online and look for magic incantations you need to type in to make the damn thing work again. Underneath it all is the bubbling rage, that stems from helplessness, that stems from ignorance. "Why the fuck do I have to deal with this? Why can't the fucking thing just work?"

    You don't have time for this shit! You have real work to do, not play around with an OS like some fucking neckbeard!

    So you fix the thing and move on.

    Then you fix it again. And again.

    You try different distros. "Maybe this one will just work, like Windows or Mac, like they always promise it would." But it won't. None of them do.

    Underneath the shiny surface, there's always the ugly mishmash of packages and daemons that seems to just barely hold together, as long as you don't shake it too hard. Sooner or later something happens and you're thrown back into the terminal. Back to angrily googling for magical console commands and cursing stupid OSS.

    But maybe this time, you understand a bit more. You can adapt the solution to your own needs.

    "Hmm, what if I pipe these two commands together... Neat!" That fixed it. This whole *nix thing is starting to make sense. Ok, whatever, back to work.

    Soon, you find yourself keeping one terminal open. Maybe just to paste in a quick command or run kill -9 if some desktop widget hangs. You are still googling for fixes when the shiny facade breaks, but the rage is lessened. You feel more in control, like you can deal with this shit. No matter how bad it gets, you can recover. You will never again need to format an HD and start over.

    Fast forward. One terminal isn't enough anymore. Neither are two. How about an entire workspace? And do you even need to start this desktop utility? Isn't it faster to type in a quick command, when you already have a terminal waiting? Cleaner? Maybe even... cooler?

    This is about where I am now. I was at your level maybe two years ago.

    So, stay strong. If you keep at it, it will get better. A neck without a beard is a sad thing.

    (Pictured: each beard level is 5 years of Linux experience)


  • @cartman82 said:

    Coming from "natural" desktop environment, like Windows or Mac, there's like a barrier you need break through. Barrier to linux, ha ha. Ahm..

    At first, you're working inside whatever desktop environment they gave you and desperately hope nothing goes wrong. But something always does. Then you go online and look for magic incantations you need to type in to make the damn thing work again. Underneath it all is the bubbling rage, that stems from helplessness, that stems from ignorance. "Why the fuck do I have to deal with this? Why can't the fucking thing just work?"

    You don't have time for this shit! You have real work to do, not play around with OS like some fucking neckbeard!

    So you fix the thing and move on.

    Then you fix it again. And again.

    You try different distros. "Maybe this one will just work, like Windows or Mac, like they always promise it would." But it won't. None of them do.

    Underneath the shiny surface, there's always the ugly mishmash of packages and daemons that seems to just barely hold together, as long as you don't shake it too hard. Sooner or later something happens and you're thrown back into the terminal. Back to angrily googling for magical console commands and cursing stupid OSS.

    But maybe this time, you understand a bit more. You can adapt the solution to your own needs.

    "Hmm, what if I pipe these two commands together... Neat!" That fixed it. This whole *nix thing is starting to make sense. Ok, whatever, back to work.

    Soon, you find yourself keeping one terminal open. Maybe just to paste in a quick command or run kill -9 if some desktop widget hangs. You are still googling for fixes when the shiny facade breaks, but the rage is lessened. You feel more in control, like you can deal with this shit. No matter how bad it gets, you can recover. You will never again need to format a HD and start over.

    Fast forward. One terminal isn't enough anymore. Neither are two. How about an entire workspace? And do you even need to start this desktop utility? Isn't it faster to type in a quick command, when you already have a terminal waiting? Cleaner? Maybe even... cooler?

    This is about where I'm now. I was at your level maybe two years ago.

    So, stay strong. If you keep at it, it will get better. Neck without a beard is a sad thing.

    <img src='/uploads/default/4535/2e8a8fca183967fa.jpg'>
    <small><i>(Pictured: each beard level is 5 years of Linux experience)</i></small>

    +1 because I have no likes left.

    I have a fair amount of Linux server admin experience. The stuff I know how to do is now fairly easy, except for when it isn't. It's the learning part that sucks, thanks to so much inconsistent, outdated, and often plain incorrect documentation. The learning process is about as much fun as repeatedly sodomizing yourself with an 8-foot cactus.



  • @cartman82 said:

    At first, you're working inside whatever desktop environment they gave you and desperately hope nothing goes wrong. But something always does. Then you go online and look for magic incantations you need to type in to make the damn thing work again. Underneath it all is the bubbling rage, that stems from helplessness, that stems from ignorance. "Why the fuck do I have to deal with this? Why can't the fucking thing just work?"

    Desktop environment, CLI, tomato.

    But this paragraph is the most beautifully crafted poetry I've ever read; it describes my feelings perfectly.

    @blakeyrat would be proud.


  • BINNED

    I don't know if I've just had almost no problems with the basic stuff recently, or if I finally know what I'm doing. But apart from "stuff is falling apart" (again, maybe I just don't notice any more), I agree completely.

    Have +1 because all my likes are gone.

    Also, I suffer from TMT (Too Many Terminals) as well, but I think it's mostly because I have to ssh into stuff all the time, and I keep forgetting I already have a session open on some workspace, on some monitor, buried under some other window...


  • Considered Harmful

    @Matches said:

    The final solution

    You monster.


  • I survived the hour long Uno hand

    I finally hit the level where I opened a VM with a new distro, stared at the menu for a minute, then popped open a command line and proceeded to do my work there where I understood what I was looking at.

    (Seriously, WTF is up with Fedora's quickstart menu or whatever it's called?)



  • Maybe a year ago, I turned my old PC into a half-assed Linux server, just because. At the time, I was very careful to select a proper lightweight desktop environment so I could do my work on it. I selected some LXDE variant (maybe Lubuntu) and then spent the whole day setting up Samba and some kind of remote desktop analogue, so I could treat it as a crappy version of Windows Server.

    These days, I would just do the barebones Debian install and do my stuff through ssh.
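
    The barebones version really is just a couple of commands after the base install (a sketch, assuming a stock Debian netinstall and a made-up address):

        # fresh Debian base install, no ssh task selected
        apt-get update
        apt-get install -y openssh-server

        # then from any other machine on the LAN:
        ssh user@192.168.1.20

    Everything else can wait until you actually need it.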


  • I survived the hour long Uno hand

    I had the exact same process, but I picked XFCE over a headless Ubuntu install, and now, after that box blew, I have one with Debian. It's not even in the same room as a monitor anymore; I stuck it in the living room so it and the consoles could avoid wireless.


  • Discourse touched me in a no-no place

    @cartman82 said:

    Why the fuck do I have to deal with this? Why can't the fucking thing just work?

    That's what I feel like with selected parts of the Windows API - particularly the horrible mess that is running subprocesses, where different runtimes have different weird rules. Yay. :angry: Not claiming that Linux or OSX are great, but they don't fuck this part up so thoroughly. And yes, I know why. I know why. It was wrong in CP/M, so it's wrong now.

    And both Win and Linux suck when it comes to GUI drawing primitives. But I try to avoid writing GUI apps these days. BTDTGTTS.



  • You want to know the truly awful WTF? Some bastard bridged PHP and GTK. That's right: the language you know and hate fused with the desktop toolkit you know and hate to build apps that you'll avoid and hate.



  • @Arantor said:

    You want to know the truly awful WTF? Some bastard bridged PHP and GTK. That's right: the language you know and hate fused with the desktop toolkit you know and hate to build apps that you'll avoid and hate.

    I wanted to dislike this but then I remembered I'm out of likes. A moment after that, I remembered Discourse doesn't even have a dislike feature. So here's a -1 for you.

    Yes I'm shooting the messenger again.





  • We should get @codinghorror here to give you guys more likes. Here's one for you, also.


  • :belt_onion:

    @Arantor said:

    That's right: the language you know and hate fused with desktop toolkit you know and hate to build apps that you'll avoid and hate

    At least it's not PHPQT.

    Its goal is being a base and supplement for further bindings such as Akonadi, Plasma and other KDE related software as well as enabling PHP developers to write desktop applications

    What has been seen cannot be unseen. Seriously though, if "PHP developers" want to write desktop applications, why not take a few days and learn a language that is actually useful for writing desktop applications, instead of wasting those days trying to wrangle something that was never meant to exist, with next-to-zero reusability whatsoever?


  • Possibly because fusing PHP to GTK (not so sure about Qt) is really not the headache you might think it is. PHP, for all its faults, is really just a kludge of fake C on top of real C, protecting people like me from really messing it up. Since you can bridge the gap to real C without too much effort, it's really not that intensive to provide a PHP-GTK binding and just address everything as though it were C functions, including the event loop that you would run in GTK.

    The thing is, we all take the piss out of most PHP developers - and I speak from the other side of the fence as I say this - as it is rightly deserved. Most PHP programmers are API gluers, not programmers. Most PHP 'programmers' would struggle to implement simple sorting if PHP didn't make it easy for them in the first place. I don't implement my own, because PHP provides me a native one that works faster than mine would - but that doesn't mean I can't do it. It just removes the need.

    I've seen the shit most PHP programmers turn out. The thought of them growing up to do real programming is absolutely frightening. You think software quality is poor now? Them suddenly having to work triple-hard while floundering far deeper than they can cope... it'd weed out the real programmers. But I stick with PHP, because there is money to be made fixing that kind of shit.


  • Discourse touched me in a no-no place

    @darkmatter said:

    Seriously though, if "PHP developers" want to write desktop applications, why not take a few days and learn a language that is actually useful for writing desktop applications? Instead of wasting a few days trying to wrangle something that was never meant to have ever existed, with next-to-zero reusability whatsoever.

    Are 'Thin Clients' still a thing these days?


  • BINNED

    @darkmatter said:

    At least it's not PHPQT.

    Fucking WHY? Seriously, WHY? OK, yes, it's still C++, but freaking hell, if you do it the right way (instead of going out of your way to make your own life complicated) it's so easy it might as well be PHP. GTK is at least just an API on top of C / C++ / C#; Qt is a fucking framework where you don't have to touch a single bit of native C++, ever.

    And how the fuck do they intend to implement signals and slots? No, don't tell me, I don't want to know. I wouldn't be surprised if they replaced the moc with .htaccess or something.


  • BINNED

    @PJH said:

    Are 'Thin Clients' still a thing these days?

    Yes. I still see a lot of Citrix, and all the desktop and application virtualisation tricks are also a way of using thin (or thinner) clients - e.g. a normal client PC/laptop, but underpowered and running almost no local software.

    The headaches they produce are only increasing:

    "I have problems running your VMware-virtualized software in our Citrix environment. I connect to Citrix from my laptop, running an out-of-date XP version, through the VPN, over a 3G connection. Your application fails to connect properly to my aging serial barcode reader/scanner, which is hooked up through a serial-USB converter. Your application is so buggy and slow."

