Getting Started with VMware vSphere 5.5: Part I

Much of my research to date has involved the use of virtualization technology, especially Oracle VM VirtualBox. Lately, however, I've found that my aging five-year-old iMac has been struggling to keep up with my testing and research, despite attempts to boost performance by adding physical memory (16GB) and SSD disks. I could try other desktop virtualization software such as VMware Fusion or Parallels for OS X (I have both), since other users have had better experiences with them. However, I find the alternatives lacking some features I currently rely on in VirtualBox (such as shared disks for clustering). Also, Oracle publishes several virtual appliances for VirtualBox, which make it easy to learn a plethora of Oracle products and features. For more on Oracle with VirtualBox you can read my paper and presentations on Slideshare.

With that said, I've been wanting to build a dedicated lab machine for database research for a while now. I've been using VMware server virtualization products such as ESX/ESXi and vSphere for the past eight years, primarily for Oracle databases in my daily job. I decided it was time to finally take the plunge and build a dedicated VMware server.

Oracle VM VirtualBox vs VMware vSphere
Oracle VM VirtualBox is a desktop virtualization product, commonly referred to as a type 2 hypervisor. In a nutshell, it allows you to create guest virtual machines with operating systems similar to or different from the host on which the product runs. For example, you can install Oracle VM VirtualBox on a Mac OS X computer (host) and then create Windows or Linux virtual machines (guests).
In contrast, VMware ESXi is a server virtualization product, or type 1 hypervisor. It doesn't require an additional host operating system to install the hypervisor (the layer of code that manages virtual machines). This is an oversimplification on my part, but it gets the basic idea across.

The Hardware
One of the first things I had to figure out was what hardware to go with - "whitebox" or OEM. A whitebox is a DIY system in which you purchase all the hardware parts separately and build the machine to your specification. The advantages are full control over the configuration and a typically lower cost than OEM systems. The disadvantage is that you have to buy all the parts yourself, and this can take some time if you need to make sure everything is compatible. An OEM system is one purchased from vendors like Dell, HP, and Lenovo. The advantage is that you get a complete system with a warranty (1-3 years). The disadvantages are that you have limited customization and the cost is usually significantly higher than that of a comparable whitebox system.
Everyone will have their own criteria for determining which path to choose, but mine came down to two things - time and money. Although I consider myself a hands-on techie, I haven't had to do much with hardware lately besides the occasional memory, HDD, or SSD upgrade. I built my fair share of whitebox systems back in the day, but I never found it enjoyable or entertaining. In fact, I recall it being quite time-consuming and frustrating at times. Times may have changed, but if I could find an OEM system in the same price range with similar specifications, that would be my preference.
My specifications were as follows:

  • Intel Core i7 or newer processor
  • 32GB physical memory expandable to 128GB
  • RAID controller
  • 2x2TB 7200 RPM HDD
  • 256GB SSD

Target price: $1,000

I know this seems quite doable with a whitebox system, but most OEM systems with these specs are currently out of this price range. To cut to the chase, I decided on purchasing a Lenovo ThinkStation S30 for $834 + tax. Since this came with 16GB of physical memory installed, I purchased an additional 16GB (2x8GB) of ECC RDIMM for $140 on Amazon. I also added a single 2TB Seagate Constellation ES 7200RPM SATA 6Gbps drive (128MB cache) for $90 on Amazon.
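One thing worth checking on any candidate CPU before an ESXi build: the processor must be 64-bit x86, and running 64-bit guests requires Intel VT-x (or AMD-V) to be present and enabled in the BIOS/UEFI. As a rough sketch, you can verify this from any Linux live environment by looking at the CPU feature flags (the exact messages below are my own, not from any tool):

```shell
#!/bin/sh
# Look for hardware virtualization feature flags in /proc/cpuinfo:
#   'vmx' = Intel VT-x, 'svm' = AMD-V.
# If neither flag appears, the CPU lacks the feature or it is
# disabled in the BIOS/UEFI firmware.
if grep -q -E 'vmx|svm' /proc/cpuinfo; then
    echo "Hardware virtualization flag present (VT-x/AMD-V)"
else
    echo "No VT-x/AMD-V flag found - check BIOS settings or CPU model"
fi
```

Even when the flag is present, it is sometimes shipped disabled in firmware, so it pays to glance through the BIOS setup screens before starting the install.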

In part II I will cover the VMware ESXi 5.5 installation.
