Virtualization, Cloud, Infrastructure and all that stuff in-between

My ramblings on the stuff that holds it all together

vTARDIS Cloud

Following on from my recent VMworld Europe user award, I mentioned that I've been working on a scaled-out version of the vTARDIS. This post will act as the index for that project; there is a lot of ground to cover in terms of its configuration.

Disclosure/Disclaimer – I am a VMware employee, but this project is not an official VMware effort, project, fling or even a thing. It's my private-time work, documented for the community.

Very little of this is an officially supported configuration, particularly the use of nested ESX. To reiterate: this is not a VMware-supported, recommended or blessed configuration, but it works well enough for my own needs. Your mileage may vary and no warranty is granted, expressly or otherwise.

This is not a solution for production use; it's suitable for lab/study work, and actual performance is limited by the laws of physics.

If you run into difficulties with any of this, please feel free to drop me a line via the comments section of this post. I do have a full-time day job at VMware, however, so I'll help where I am able.

What is vTARDIS? – see this post for details of the original vTARDIS project

What is vTARDIS.cloud?

A small, low-cost physical infrastructure capable of supporting several multi-node ESX clusters. Through heavy over-subscription of the physical hardware it provides an infrastructure representative of enterprise-grade vSphere/vCD deployments, while also providing "production" home services such as media streaming, data storage, DNS and DHCP.

Why?

My original home lab has been scaled out to support my new position at VMware and my VCDX/VCAP studies. A core part of my work is VMware vCloud Director (herein referred to as vCD), so my lab reflects that. Additionally, my wife is trying to continue her IT studies, so it's helpful to have a self-service portal for building out virtual machines for learning.

You very rarely have a large number of ESX hosts and shared storage to experiment with – testing scripts, rebuilding hosts, changing configurations. This lab provides a representation of a large vSphere/vCD deployment so you can carry out such work to support studies or pre-production projects.

What does it look like? – photo

The vTARDIS.cloud lives in my geek-cabin which is my home office (more info on that here) and now takes up most of a full rack.

image

What does it look like? – High-level architecture

The following diagram illustrates the layout of the vTARDIS.cloud:

image

The key configurations and components of the design, which I will post further details on, are as follows (more to follow):

  • Stateless ESXi deployment – using an Auto Deploy VM to PXE-boot and configure large numbers of (virtual) ESXi hosts
  • Script to deploy large numbers of VMs and create DHCP reservations
  • Using the Distributed Virtual Switch with nested ESX – share a single dvSwitch between physical and nested ESX hosts (complicated virtual wiring!)
  • Remote Access to your home lab with a virtual appliance
  • vMotion between nested VMs
  • vMotion between nested ESX and physical hosts
  • Configuring the Cisco 3500 XL switch with VLAN, trunk ports for ESX
  • HA Layer 3 routing for the lab using Vyatta virtual appliance and FT
  • Using Distributed Power Management (DPM) with your home lab
  • Enabling Self-service with vCD
  • Backup on a budget
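To give a flavour of the DHCP-reservation item in the list above, here's a minimal Python sketch of the idea. Note the VM names, MAC prefix and IP range are illustrative placeholders, not the values or the script actually used in the lab:

```python
# Hypothetical sketch: generate ISC dhcpd.conf "host" reservations for a batch
# of nested ESXi VMs, so each vESX host always PXE-boots with the same address.
# Names, MAC prefix and base IP below are made-up examples.
import ipaddress


def make_reservations(count, base_ip="192.168.10.50", mac_prefix="00:50:56:a1"):
    """Return dhcpd 'host' stanzas for `count` VMs with sequential IPs/MACs."""
    start = ipaddress.IPv4Address(base_ip)
    stanzas = []
    for i in range(count):
        name = f"vesx{i + 1:02d}"                      # vesx01, vesx02, ...
        mac = f"{mac_prefix}:{i >> 8:02x}:{i & 0xff:02x}"  # sequential MACs
        ip = start + i                                  # sequential fixed IPs
        stanzas.append(
            f"host {name} {{ hardware ethernet {mac}; fixed-address {ip}; }}"
        )
    return stanzas


if __name__ == "__main__":
    print("\n".join(make_reservations(3)))
```

The same loop could equally emit a CSV for a vCenter bulk-deploy script; the point is that reservations and VM creation are driven from one sequence so names, MACs and IPs stay in step.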

How much did you spend?

I cannot say, as my wife will probably kill me :) I've acquired most of the hardware over the years from eBay and factory-outlet stores, so it's been a gradual expansion rather than an upfront cost. Still, it's all been out of my own pocket – there are no sponsors or generous donations of kit. (If you are reading this and would like to donate some equipment, read the disclaimer at the start and, if you'd still like to talk, drop me a line.)

Item – approx cost (£GBP) – status:

  • Cisco 3500 XL 100Mb switch (48 ports) – £100 (eBay) – in-use; VLAN trunks from ESX hosts and office workstation connectivity
  • Netgear GS487T 48-port gigabit switch – £100 (eBay) – spare (decent switch but too noisy for use in the office)
  • Linksys SLM2008 8-port gigabit switch – £90 (Amazon) – in-use; vMotion/vStorage networks
  • Iomega IX4-200d 8TB NAS in RAID5 configuration – £1,000 (online, ouch!) – in-use, critical; like it a lot but very expensive
  • Multiple USB2 drives, 500GB–1TB – varies – in-use, plugged into the IX4 for backup
  • 2 x HP ML110 G4, Intel Xeon, 8GB – £200 each (special online deals, now defunct :( ) – in-use (management cluster)
  • 3 x HP ML115 G5, AMD quad-core, 8GB RAM, dual-port GbE Intel NIC – £200–300 per server with RAM (varying deals), plus £80–100 for 8GB RAM and £40 for a dual-port Intel GbE NIC (job lot on eBay) – in-use (resource cluster); now EoL – hopefully they won't die!
  • 42U rack (no-brand) – free – holding up servers :)
  • 1 x HP D530 SFF desktop PC, 4GB RAM, 500GB SATA – £90 (eBay) – in reserve; was an ESX 3.5 host (non-x64 CPU)
  • HP TFT 15" rack-mount monitor – free, from a skip at a customer – in-use
  • HP 4-port PS/2 KVM – free, from a skip at a customer – in-use
  • 128GB Kingston SSD – £200 (Amazon) – UberVSA virtual SAN storage (was in the original vTARDIS project; since cannibalised)
  • 64GB Transcend SSD – £100 (Amazon, a while ago) – UberVSA virtual SAN storage
  • Compaq ML570 G1, quad Xeon CPUs, 12GB RAM, external disk array with multiple 18GB SCSI disks, SmartArray – £400 (eBay, 4 or 5 years ago) – retired; non-x64 and too power-hungry (was the power-sucking cluster) [open to offers!]; spider refuge
  • Compaq DL360 G1, single Xeon CPU, 4GB RAM, 2 x 18GB HDD – £500 (eBay, a long time ago) – retired; non-x64 and too power-hungry (was the power-sucking cluster) [open to offers!]; spider refuge
  • Compaq DL320 G1, unknown spec – free, from a customer refresh a long time ago – retired and faulty; spider refuge
  • Sun Netra – free, from a customer refresh a long time ago – retired; was an old firewall; spider refuge
  • Compaq 2-drive DLT tape loader – free, from a customer refresh a long time ago – retired, and probably faulty by now; spider refuge

How much does it cost to run?

This uses approx 600W of power 24/7. Electricity is not that cheap here in the UK, and I estimate about £600–700 per year. DPM certainly helps reduce the power consumption of the resource cluster when it's less busy, and as a side-benefit the vTARDIS acts as passive heating for my garden office during the winter – that's "green", right?
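The arithmetic behind that estimate is straightforward; here it is as a small Python check. The unit price of roughly £0.12/kWh is an assumed figure for illustration (actual tariffs vary), while the 600W draw is as stated above:

```python
# Back-of-the-envelope annual running cost for a continuously-powered lab.
# price_per_kwh is an assumed illustrative tariff, not a quoted rate.
def annual_cost_gbp(watts=600, price_per_kwh=0.12, hours=24 * 365):
    """Return (kWh consumed per year, cost in GBP) for a constant load."""
    kwh = watts / 1000 * hours      # 600 W for 8,760 hours = 5,256 kWh
    return kwh, kwh * price_per_kwh


if __name__ == "__main__":
    kwh, cost = annual_cost_gbp()
    print(f"{kwh:.0f} kWh/year, approx £{cost:.0f}")
```

At that assumed tariff the result lands right in the £600–700 band quoted above, which is also why DPM powering hosts down off-peak makes a visible difference to the bill.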

11 responses to "vTARDIS Cloud"

  1. Tom November 4, 2010 at 3:18 pm

    Regarding support, I suggest using a wiki and putting everything there; then people can add more information and you don't have to do it all. There are also sites in vsphere-land that you could link to, though I can't think which ones... HTH

  2. Mike Tang December 5, 2010 at 11:56 am

    I failed to install Windows 2008 R2 (64-bit) in a nested VM hosted by a vESXi 4.1.

    So a 64-bit guest OS is not supported in a nested VM, right?

  3. vinf.net December 7, 2010 at 11:49 pm

    @Mike – yes, 32-bit nested guests only at the moment, no way round that I’m afraid.

  4. Virtually Just That Simple January 5, 2011 at 2:04 am

    Great Blog

    I am in the process of building a lab at home and found your article. It works perfectly. I am running the Celerra VSA, VC and AD on my Dell XPS, and have ESXi installed on the Shuttle running vMA, vShields and two nested ESXi hosts with some Windows 7 and CentOS VMs. vMotion, Storage vMotion, vDS, DRS and HA are all working well. Booting from the SD card is great; the Celerra VSA is using a RAID 1 disk on my Dell XPS. I can tear down and redeploy the lab very fast with your setup. I am in the process of building a vCloud Director lab, to become comfortable with the install and configuration.

    Thanks Again
    Bill Roberts
    https://virtuallyjustthatsimple.wordpress.com/

  5. Pingback: Building a Fast and Cheap NAS for your vSphere Home Lab with Nexentastor « Virtualization, Cloud, Infrastructure and all that stuff in-between

  6. jitesh January 29, 2011 at 12:50 pm

    How did you set up Vyatta? I'm having issues.

  7. Pingback: vSoup the USB deviation #5

  8. Pingback: ALVARO ANTON » Sample VMware Lab

  9. Pingback: Latest addition to my homelab : Dell PowerEdge T410 | Timo Sugliani

  10. Pingback: Welcome to vSphere-land! » Home Lab Links
