Last week I spent a surprisingly small amount of time setting up a new Ubuntu Server virtual machine, which will run ircd and a few other minor internal services, replacing an existing FreeBSD “server” (it’s running on some old, crappy spare hardware). I decided to use Ubuntu this time because Java support on Linux seems to be much better than on FreeBSD, and I wanted to see what Ubuntu was like to administer when armed with only a CLI (so far, awesome!).
During the process, I realized a major difference between retiring Unix servers and retiring Windows servers. Namely, retiring a Unix box to replace it with another one always feels like the end of an era rather than an upgrade. For instance, the previous server was originally set up by employees who no longer work with me, on mish-mash hardware slapped together primarily from an old eMachines “computer”, yet it’s been surprisingly reliable:
[brandonh@hilo ~]$ uname -a
FreeBSD hilo 5.2.1-RELEASE FreeBSD 5.2.1-RELEASE #0: Mon Feb 23 20:45:55 GMT 2004
[brandonh@hilo ~]$ uptime
6:07PM up 231 days, 8:17, 1 user, load averages: 0.00, 0.00, 0.00
The last time it was taken down was to move it to another location in the building. The worst problem I had with it, other than trying to get Java 1.5 on it, was explaining to the Security Department what cvsup is and why I needed firewall ports opened so I could update the system. I’ve had the same experience many times, as I’ve been using FreeBSD to host sites at home since 1999 or so; in fact, I still have an infamous old 466 Celeron running FreeBSD sitting on a shelf, unplugged.
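For anyone who never had to make that argument to a security team: cvsup pulled the FreeBSD source tree from a remote CVSup mirror over an outbound TCP connection (port 5999 by default), which is why a firewall exception was needed at all. A typical supfile for a 5.x-era box looked roughly like this (the host, paths, and branch tag here are illustrative, not the ones I actually used):

```
# Illustrative cvsup supfile for tracking a FreeBSD 5.x security branch.
# cvsup connects OUT to the mirror on TCP port 5999 by default -- the
# reason the Security Department had to open a firewall port.
*default host=cvsup.FreeBSD.org
*default base=/var/db
*default prefix=/usr
*default release=cvs tag=RELENG_5_2
*default delete use-rel-suffix compress
src-all
```

You would then run something like `cvsup -g -L 2 /path/to/supfile` to sync `/usr/src` before rebuilding world and kernel.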
By contrast, old Windows servers always seem to get worse with age: the longer they are around, the more ready you are to get rid of them. I recently worked on a major enterprise application migration that saw many, many new Windows-based servers deployed at a new datacenter and existing applications moved onto them. There were several old servers I was glad to never work with again. As a Windows server ages, it seems to get slightly more erratic, to the point of needing to be pulled completely out of production, wiped clean, and reimaged. A friend of mine works for IBM as a Sr. Administrator, supporting a major corporation that needs reliable real-time transactions, and time and time again the Windows servers are the ones that are unreliable and cause expensive downtime, even with IBM’s near-limitless resources behind them. Once a Unix box is configured properly, it needs minimal maintenance over a long period of time, while Windows needs frequent reboots and other maintenance, which apparently leads to OS rot.
I could go on and on with other reasons I prefer Unix in almost all situations, but when it comes down to it, I’ve never been reluctant to see a Windows server replaced, while I’ve always felt at least a twinge of regret about retiring Unix servers, even ones two to four years old or more. Have you had similar experiences?