Ziggy Admin and the OS from Mars (Part Two)

Part Two: Making the move from Windows to Unix involves a few changes. Here's the second part of a diary from someone making that move.

By Deann Corum | Posted Dec 28, 2005

Editor's Note: This is the second installment in a periodic series ENP regular Deann Corum writes on her experiences moving from a Windows-centric environment to a new job with an emphasis on Linux.

Some Weird Sin
First of all, I want to clarify a point I made in the first part of this journal. One reader took me to task for writing that "Unix-based operating systems don't evolve as quickly (save Fedora Core) as Windows-based OS's do".

I was not referring so much to timelines as to the frequency with which Windows system administrators have to almost completely relearn and retrain on an operating system after it has been upgraded. That is to say, Microsoft operating systems have more of a tendency to be reborn than to evolve.

Even with very frequent kernel updates to open source operating systems, it's rare that Unix system administrators have to completely re-learn how to support the OS due to updates, even major ones.

In contrast, an update of a Microsoft operating system typically means a completely redesigned interface to learn, different file locations, different locations of configuration information in the registry, new snap-ins, wizards and different services (or obfuscations of 'standard' services and protocols) to learn to configure, troubleshoot and support. When Microsoft "evolves", it's a whole new operating system, confounding system administrators and costing them time, money and the frustration of relearning it all. I will not miss this. And that was what my statement, however badly written, meant.

Always Crashing in the Same Car
In the past few weeks, I've run into some interesting differences between Unix and Microsoft operating systems and the way each is administered and supported.

One of the people I spoke with recently stated that one reason for their organization's preference for software rather than hardware RAID was that with software RAID, they "knew what was going on," referring to everything being a file in Unix. They've had issues with RAID controllers and prefer the control they have over their software RAID partitions using LVM. Of course, they don't run any really disk-intensive applications either. What they support is mostly file sharing and serving up static Web content, so they can afford this choice.

This was still a little shocking to me because in a Microsoft world, we always used hardware-based RAID: software RAID was considered too slow, too difficult (read: impossible) to reconfigure, and too prone to causing issues with the applications and environments we supported. Windows 2003 has vastly improved software RAID reconfiguration capabilities, but I've never heard anyone say they'd prefer Windows software RAID for its openness and configurability. And when a Windows software-based RAID system crashed, recovery was too often a crap shoot (no pun intended). I haven't had the "opportunity" to deal with this on an LVM-configured system yet, nor have I had to deal with it on a Windows 2003 system, in all fairness to Microsoft. But the differences between the two operating systems, in terms of which type of RAID administrators are most comfortable with and why, are quite stark.
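For readers who haven't set up Linux software RAID, the arrangement this organization described is commonly built from the kernel's md driver with LVM layered on top. The device names, RAID level and volume sizes below are hypothetical; this is only a sketch of one typical mdadm-plus-LVM setup, not their exact configuration:

```shell
# Mirror two (hypothetical) spare disks with the kernel md driver,
# then put LVM on top so volumes can be resized later.
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb1 /dev/sdc1

# Layer LVM over the mirror.
pvcreate /dev/md0
vgcreate vg_data /dev/md0
lvcreate -L 20G -n lv_share vg_data

# "Everything is a file": the array's state is plain readable text,
# not a screen buried in a controller's BIOS utility.
cat /proc/mdstat
mdadm --detail /dev/md0
```

The "knowing what's going on" they mentioned is visible in the last two commands: the array's health and layout are exposed as ordinary text you can watch, grep and script against.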

Last week, I attempted to upgrade Firefox to the latest version on a CentOS system. On a Windows workstation, this is simple: download, run the installer, and all profiles and preferences are left intact. On a Unix system it wasn't nearly as simple. I was told to just wait until "someone" rolled an RPM for the upgrade rather than doing it myself, because file locations, preferences and settings are not automatically upgraded or preserved on a Unix system without some finessing on the administrator's (note: not the end-user's) part. Upgrading your browser to get the latest security fixes and features is a minor task. But if you're a Unix user, you either muddle through doing it manually, roll your own RPM, or wait for "someone else" to roll one.
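For the record, the manual "muddling through" looks roughly like the following. The version numbers, file names and paths here are illustrative, not the exact packages involved:

```shell
# One manual route: install the upstream tarball alongside the
# distribution's copy rather than over it, so package-managed files
# are left alone.
cd /usr/local
tar -xzf /tmp/firefox-1.5.tar.gz            # unpacks into ./firefox
ln -sf /usr/local/firefox/firefox /usr/local/bin/firefox

# Or roll your own package from a source RPM instead of waiting
# for "someone" to do it:
rpmbuild --rebuild firefox-1.5-1.src.rpm
rpm -Uvh /usr/src/redhat/RPMS/i386/firefox-1.5-1.i386.rpm
```

Neither route is hard, but both are squarely administrator work, which is exactly the point: on Windows the installer does this for the end-user.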

And if you have a laptop with a docking station, USB devices, or a wireless network interface, Microsoft's operating systems will deal with all of these pretty seamlessly much of the time. Not quite so with Unix. At least twice in as many weeks, I've had to download and install the latest wireless drivers and manually configure network interfaces to use them. And X Window still comes to a screeching halt when people undock their Thinkpads to run off to a meeting. Not that Microsoft Windows ever crashes on a laptop. Of course it doesn't. I also have a country farmhouse and 20 acres in the middle of Manhattan to sell you.
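For the curious, the manual wireless dance went roughly like this. The driver module, interface name and network details are placeholders; the exact commands depend on the chipset:

```shell
# Load the driver for the card (module name is illustrative;
# ipw2200 was a common Thinkpad-era wireless chipset driver).
modprobe ipw2200

# Associate with the access point and bring the interface up.
iwconfig eth1 essid OFFICE key s:notmyrealkey
ifup eth1

# To make it survive a reboot, the settings also have to be written
# into the Red Hat-style interface config by hand:
#   /etc/sysconfig/network-scripts/ifcfg-eth1
```

Every step of this is something Windows does behind a couple of dialog boxes, which is why "minor" glitches like these add up.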

But all these are relatively minor glitches. Nothing I mind dealing with myself, but they do add up to the common sentiment that "Unix is not quite ready for the desktop" because software upgrades, uninstalls, wireless network interfaces, laptop docking and undocking, dealing with USB interfaces and devices, and other "minor" functionality isn't quite as seamless as it is in Windows. Even in some environments where the servers, network applications and services are primarily Unix-based, the clients remain on Windows because it's still easier for both the administrators and the end-users to work with and support. I'm sure I'm not alone in the sentiment that I'd love to see Unix overtake the desktop.

Try Some, Buy Some (or don't)
Speaking of Windows, it's a relief that the X Window system in Unix is separate from the operating system. Ctrl-Alt-Backspace restarts X Window: not the machine, just the graphical interface. And you don't have to install a windowing system at all to use the machine. Most people do these days, but it's a separate component rather than a noose. On a server, it's often not necessary; it's nice to have installed, but on a daily basis it's not a requirement. Try that with a Microsoft operating system while still maintaining a usable server or workstation with all necessary services running and fully administrable. I cannot say I miss having to use Windows Terminal Services (now called "Remote Desktop") to remotely administer servers and install software or services. Plain old 'ssh me@someremotemachine' is just fine.
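A couple of concrete examples of that GUI-free administration, with the host name kept from above and the package and service names made up:

```shell
# Install a package and bounce a service on a remote box, no
# Terminal Services session required:
ssh me@someremotemachine 'rpm -Uvh /tmp/httpd-2.0.52-9.i386.rpm'
ssh me@someremotemachine 'service httpd restart'

# And on a workstation where X has wedged, restart only the
# graphical layer, not the machine: Ctrl-Alt-Backspace at the
# console kills and respawns the X server.
```

Everything an administrator needs to do day to day fits down that one encrypted channel, which is why a GUI on the server is a convenience rather than a dependency.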

With regard to GUIs, administrator expertise and best practice, the Assistant Dean of IT in a school at a large university made the following comment to me recently:

"One of my principles of good systems administration is the ability to back out of whatever change you've made. I think it's harder to back out of anything when you're interfacing with a GUI because it's harder to know just exactly what you did when you checked that box. This is a problem now for both Linux and Windows administration, but it used to be more of a Windows-only problem. Now, Linux has more GUIs, but system administrators typically don't use them, or don't use them after a few tries."

Even experienced Windows system administrators too often run into check boxes and radio buttons, in the operating system or in various Microsoft applications, whose purpose they don't know, or what exactly it means when they are checked or unchecked. As soon as Microsoft releases its new operating system, there will be more of them scattered all over the interface for Windows system administrators to (re)learn or leave alone, until there are issues or they are forced to deal with them to enable or disable something.
