Moving Beyond Dual Boot



Starting with dual boot

For as long as I can remember I’ve had a dual boot setup on my computer, which effectively means two different operating systems live on the same machine. I would run Windows for all of the mainstream applications like Microsoft Office, Adobe Photoshop, and Macromedia Dreamweaver (yes, I know they got bought by Adobe, but it’s still Macromedia to me). I would then boot into Linux for MySQL, better PHP tools, and to keep my knowledge of that space current. We had one Linux server on the Dreamweaver team, UltraQA8, and somewhat ironically it was the most stable server in the whole farm.

When I started using Linux I began with Red Hat, and as the releases went by I became more and more comfortable with its way of doing things. About six years later a coworker of mine at Macromedia introduced me to Ubuntu. At first I brushed it off as yet another thing to learn, even though it was much easier to install. Finally I broke down, gave it a spin, and found it refreshingly novel.

What made Ubuntu different is that it was easy. I could boot into a fully functional Linux system from just a CD, and once that live environment was up, installing the system to the hard drive was a couple of clicks away. At that time, getting even Windows installed that painlessly was no small feat. I’d often use Ubuntu’s live CD to rescue a computer that was having issues, along the lines of the sketch below.
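
To give a flavor of that rescue workflow, here is a minimal sketch of the classic live-CD repair session. The device names are assumptions for illustration (your partition layout will differ), and the exact steps varied from rescue to rescue:

    # From a terminal in the live CD session, assuming the broken
    # install lives on /dev/sda1 (adjust to your own layout):
    sudo mount /dev/sda1 /mnt            # mount the installed system
    sudo mount --bind /dev  /mnt/dev     # expose devices to the chroot
    sudo mount --bind /proc /mnt/proc
    sudo mount --bind /sys  /mnt/sys
    sudo chroot /mnt                     # now working "inside" the broken install
    # ...repair packages, edit configs, or put the boot loader back:
    grub-install /dev/sda
    exit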

Fast-forward another couple of years and I’m now a hard-core Ubuntu user. The people at Canonical, the commercial arm of Ubuntu, along with their open source counterparts split the release strategy into two branches: a long-term support (LTS) release that arrives every two years and is supported for up to five years, and a short-term release that ships every six months and is supported for 18 months. I’ve always followed the short-term releases, as features tended to matter more to me than stability, and they had always been good enough that I never really thought twice about it.

Quality erodes with Ubuntu

I will say with some regret, though, that 10.04 was their last great release. Fortunately, it’s a long-term release, meaning they will support it until 2015. Now that Ubuntu has reached a certain level of popularity and market saturation, I think it’s becoming harder for them to evolve. They’re trying to do too many things at the same time: traditional desktop, server, notebook, and mobile. Unity, their new user interface designed around the netbook and notebook experience, is absolutely infuriating on the desktop. I’ve learned to live with it for the small amount of time I have to use it at work, but it’s a clear step backwards from the original design in 10.04.

I decided to move off of 10.04 and upgrade to the current short-term release, as there was press about Unity getting better. 10.04 became 10.10, which became 11.04, then 11.10, and finally 12.04. During the last upgrade, from 12.04 to 12.10, the boot loader failed to install. I restarted the system and was greeted with a boot loader failure. Now my computer was inaccessible to me and the state of the hard drive was unknown. MAJOR failure! Bad Ubuntu!

Dan went on vacation for a week

Dual boot is replaced with a virtual machine

When I came back I decided that Linux needed a bit more abstraction from my PC. Boot loader installation and hard drive partitioning had always been scarier than I was comfortable with. I’ve traditionally thought of Linux as an alternative to Windows, but having run Ubuntu on top of Mac OS for a number of months at work, my feeling on the matter has changed. On the desktop, Linux for me is a complement to Windows.

As with all software vendors, updates change what people know and force them to learn something new for no apparent reason. Usually one could just run FDISK /MBR to remove the Linux boot loader and reinstall the Windows one. Not so with Windows 7; Microsoft now has a new command for that. I forget exactly what it was, but I had to boot into recovery mode explicitly and then drop out to a command line. There is a command in the system32 folder beginning with "boot" that refreshes the boot loader and the MBR.
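
If memory serves, the tool in question is most likely bootrec, which lives in the Windows Recovery Environment. A hedged sketch of what that session looks like after booting a Windows 7 install disc and choosing "Repair your computer" and then Command Prompt:

    REM From the Windows Recovery Environment command prompt:
    bootrec /fixmbr     REM rewrite a standard Windows MBR (removes GRUB)
    bootrec /fixboot    REM write a fresh boot sector to the system partition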

After getting all of the hard drive issues resolved, I was left with Windows fully occupying the drive and no remnants of the Linux boot loader. The next step was to install a virtual copy of Ubuntu so I could sandbox it in its own environment. I downloaded Oracle’s VirtualBox first; I didn’t think I would be able to install into VMware’s Player, so I went the VirtualBox route. VirtualBox was missing two major features out of the box: the ability to resize the screen dynamically, and copy and paste with the host operating system.
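
As far as I know, both of those features are supplied by VirtualBox’s Guest Additions rather than the base install, so they can be bolted on after the fact. A sketch of enabling them inside an Ubuntu guest of that era (the mount point is an assumption and varies by release):

    # Inside the Ubuntu guest, after choosing Devices > Install Guest
    # Additions in the VirtualBox window to mount the Additions CD:
    sudo apt-get install build-essential linux-headers-$(uname -r)
    sudo sh /media/cdrom/VBoxLinuxAdditions.run   # mount path varies by release
    # Reboot the guest, then enable Devices > Shared Clipboard > Bidirectional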

I then downloaded VMware’s Player anyway, and the experience could not have been easier. I dropped in the CD, walked through a simple wizard, and about 15 minutes later Ubuntu was running in a window, ready for me to use. The advantages of virtualization are clear: more efficient use of hard drive space, a clean abstraction between host and guest operating systems, and the ability to iterate and experiment without major consequences to unrelated technology. The downsides are that it requires more memory, takes some performance hit, and the guest operating system always needs its host to be healthy. The argument for dual booting is that the operating systems are decoupled: if a virus brought Windows down, you could quickly reboot into Ubuntu. Now that live CDs are so common, that argument holds a lot less water.

That being said, I’m happy with my setup. I don’t push the virtual machine all that hard, so this arrangement works for me.
