Tag Archives: vCO

Running free VeeamZip directly from the vSphere Web Client

There are a lot of times I find myself needing to take a one-off backup of a VM – prior to software upgrades or patching I always like to take a backup of the affected VM(s) in the event that, well, you know, I mangle things.  VeeamZip is great for this – it allows me to process a quick backup of my VM that is separate from its normal backup and replication routines.  Since I work in an environment running paid Veeam licenses I have access to the Veeam Plug-in for the vSphere Web Client – and this plug-in does exactly what the title of this blog post says – it allows us to perform VeeamZips of our VMs without having to leave the vSphere Web Client and/or log into our Veeam Backup and Replication console.

What if I’m using Veeam Backup and Replication FREE?

So this is all great for me, but I got to thinking – what if I wasn't running a paid version of Veeam Backup?  What if I was simply running the free version?  It doesn't come with Enterprise Manager, therefore it doesn't come with a means of installing the Veeam Backup and Replication Web Client plug-in – therefore no VeeamZip from the Web Client, right?  Wrong!  Ever since Veeam Backup and Replication v8 U2 came out, Veeam has included PowerShell cmdlets around the VeeamZip functionality.  I wrote about how to use it last year in Scheduling Veeam Free Edition Backups.  Well, since we have PowerShell, we can use vRealize Orchestrator to build a workflow around it – and we have the ability to execute workflows directly from within the vSphere Web Client – so without further ado, here's how to run the free VeeamZip functionality directly from the vSphere Web Client.

First up the script

I didn't get too elaborate with the script, as you can see below.  It is simply a handful of lines that take in a few parameters: the VM to back up, the destination to store the backup in, and the retention, or auto-deletion, of the backup.

#Parameters: the VM to back up, the destination folder, and the auto-delete retention
param(
    [string]$VM,
    [string]$destination,
    [string]$Autodelete
)
#Load Veeam Toolkit
& "C:\Program Files\Veeam\Backup and Replication\Backup\Initialize-VeeamToolkit.ps1"
#Get the VM Veeam Entity.
$vmentity = Find-VBRViEntity -Name $VM
#VeeamZip it!
Start-VBRZip -Entity $vmentity -Folder $destination -AutoDelete $Autodelete -DisableQuiesce

That's it for the script – simple, right?  Feel free to take this and adapt it however you see fit to suit your needs 🙂

The Orchestrator Configuration

Before we get to creating our workflow there are a few things we need to do within Orchestrator, mainly adding the server that hosts our Veeam Free instance as a PowerShell host within vRO.  But even before we run the 'Add a PowerShell host' workflow we need to run a few winrm commands on the Veeam Free instance.  I have a complete post about setting up a PowerShell host here, but will include the commands you need to run below for quick reference.

First up, on the Veeam server run the following in a command shell…

  • winrm quickconfig
  • winrm set winrm/config/service/auth @{Kerberos="true"}
  • winrm set winrm/config/service @{AllowUnencrypted="true"}
  • winrm set winrm/config/winrs @{MaxMemoryPerShellMB="2048"}

Next, from within vRO (as shown below) we can run the 'Add a PowerShell host' workflow…


As you can see, my Veeam server is actually the same as my vCenter server – I don't recommend doing this, but hey, it's a small lab!  Just be sure to use the FQDN of your Veeam server for the Host/IP column.


Ensure that the remote host type is WinRM, and that the Authentication method is set to Kerberos.


And be sure that we are passing our username in the ‘username@domain’ format, along with ‘Shared Session’ for the session mode.  Once you are done go ahead and click ‘Submit’.  If everything goes as planned your Veeam Backup and Replication server should be added as a PowerShell host within vRealize Orchestrator.

And now, the workflow!

Finally we can get to actually building our workflow.  If you remember, our script takes in three parameters: VM, Destination and AutoDelete – so we will mimic the same with our workflow, only calling them input parameters within vRO (shown below).


Now, since we will be using the built-in PowerShell workflow 'Invoke an external script' we will also need to have some workflow attributes set up in order to pass to that workflow.  Below you can see how I've set up mine…


Your configuration may vary a little from this one, but as you can see we simply add a PowerShell host attribute and map it to our newly added host, as well as assign the ScriptPath attribute to the path where we saved our little VeeamZip script earlier.  The arguments attribute can remain empty as we will only use it to build the arguments string to pass to the script.


The first element we want to add to our workflow schema is a Scriptable task – go ahead and drag that over into your workflow.  This is where we will create our arguments string.


As far as what goes into the scripting, you can see I've simply brought in the arguments attribute, along with our three input parameters, and chained them together into one string (arguments = '"'+VM.name+'" "'+Destination+'" "'+AutoDelete+'"';), then ensured that my arguments attribute was included in the output as well.
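If the quoting in that one-liner looks dense, here is the same concatenation as a stand-alone sketch – the sample values are made up; in the workflow the values come from the VM, Destination and AutoDelete bindings:

```javascript
// Hypothetical stand-ins for the vRO input parameters
var vmName = "web01";            // would be VM.name in the workflow
var destination = "C:\\Backups"; // would be the Destination input
var autoDelete = "In1Week";      // would be the AutoDelete input

// Wrap each value in double quotes so paths or names containing spaces
// survive the hand-off to the external PowerShell script as one argument each
var args = '"' + vmName + '" "' + destination + '" "' + autoDelete + '"';
console.log(args); // "web01" "C:\Backups" "In1Week"
```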


Next, drag the 'Invoke an external script' workflow into your schema (you can see I've renamed mine 'Run VeeamZip').  Ignore all of the prompts regarding the setup of parameters that pop up – the easiest way I like to do this is by editing the workflow (the pencil above it) and using the 'Visual Binding' tab as shown below.


Simply drag and drop your in-attributes to their corresponding inputs on the external script workflow, along with mapping your output to output.  Easy peasy!

At this point you can go ahead and save and close your workflow – we are done with Orchestrator.  If you want to run the workflow a few times to test from within vRO go ahead – but the point of this post was to run it from within the Web Client so let’s move on to that step.

vRO and vSphere Web Client

I love vRealize Orchestrator and I love the fact that I can contextually execute custom workflows from within the vSphere Web Client.  To do this you need to first register your vRO instance with vCenter – this should be done automatically for you depending on how you set everything up – I'm not going to get into that configuration today.  To get to our context mappings we need to click on Home->vRealize Orchestrator.  With the vRO home in context, select the 'Manage' tab and then 'Context Actions'.  We then are going to want to hit the little green + sign to add a new workflow context mapping.


The next steps are pretty self-explanatory – navigate through your vRO inventory to your workflow, click 'Add', and select 'Virtual Machine' from the types box.  This is what will allow us to right-click on a VM and run our VeeamZip, passing the contextually selected VM to the workflow's VM input parameter.  Click 'OK' and it's time to VeeamZip!

Now when you want to run the workflow you can simply right-click a VM and navigate to (in my case) All vRealize Orchestrator Actions->VeeamZipVM.


As you can see our workflow will start, using our selected VM as the VM input, and simply prompt us for the Destination and AutoDelete settings.


And there you have it!  We can now use the Free version of Veeam Backup and Replication to VeeamZip our VMs directly from within the vSphere Web Client.  In fact, our workflow will even show up within our vSphere tasks so we can monitor the status of the job.  Now, there is no error checking or anything like that…yet!  Let me know if you have any questions, concerns, etc… always happy to read the comments!

Silently installing Veeam v8 Update 2 – at scale with vRO

It's no surprise that most Veeam customers live with one or two Veeam Backup and Replication consoles – hey, it's easier to manage, easier to report on, and easier to upgrade come update time if you keep your footprint to a minimum.  That said, I'm not most customers 🙂  I have multiple Veeam B&R consoles – not necessarily because I have to; it's a preference.  It allows me to split out functionality and services, I can have a self-contained DR site, and I can also instantiate local backup functionality at multiple sites, leaving them not reliant on the main datacenter.  It's working for me, but there's one caveat – come update time it's a real big pain having to go out and click through a wizard 20 or so times.

Needless to say I was quite thrilled when I saw the following within the v8 update 2 release notes….


w00t!  No more wizard-driven day of click craziness for me.  After getting a little bit of help from Anton Gostev in the forums (if you have been in the Veeam forums you know this guy) I was off to the races with my automated install.

The command

Installing Veeam Update 2 silently is basically done in two steps: first we unpack the files from the installer, and second we execute the HotFixRunner.exe file with the proper arguments.  Anton suggested placing both steps within a batch file, so in the end I was left with something like this (setup.bat)…

mkdir c:\VeeamData\v8\unpacked
"c:\VeeamData\v8\VeeamBackup&Replication_8.0.0.2021_Update2.exe" /Q /C /T:c:\VeeamData\v8\unpacked
c:\VeeamData\v8\unpacked\HotFixRunner.exe silent noreboot log C:\VeeamData\v8\patch.log VBR_AUTO_UPGRADE=1

Basically, save all of the above as setup.bat, obviously changing the directory names to where you copied the VeeamBackup&Replication_8.0.0.2021_Update2.exe file and where you want the files unpacked.  From there you simply execute setup.bat and sit back and relax while Veeam updates…

But that’s not enough for me

What about copying the update file out to the B&R servers?  What about copying setup.bat out?  We still need to do this, and I for one don't want to do anything manually 🙂  This could very easily be achieved with PowerShell or your tool of choice – but in this case I decided to lean on my good ol' friend vRealize Orchestrator to do the trick.  Mainly because I'm sitting in the vSphere Web Client for a good chunk of the day and having the functionality right at my fingertips just seemed proper.  That, and every single one of my Veeam B&R servers is virtualized.  Another reason is that I'd like to expand on the workflow someday, giving it the ability to cancel any currently running Veeam jobs, report back on success, etc.  vRO, through the use of the PowerShell and REST plug-ins, gives me all of this building-block functionality.  If your Veeam B&R console isn't virtualized or you don't want to automate with vRO, go ahead and copy the files out using whatever tool you like and execute them – it'll work just the same.

But if you want to dabble around in vRO, or just want something a little more challenging, go ahead and create a new workflow.  The workflow I built performs three main functions: it copies the update file, copies the setup file, and then executes the setup file – pretty simple.

As far as inputs and attributes, this is the way I went about it.  For inputs I used only three: the VM (this is the VBR instance I'll be upgrading) and a username and password with permission to copy and execute on its file system.


The rest of the information that the workflow needs will be stored in workflow attributes as it will always be static throughout all the installs and upgrades I’ll perform. In my case I used four attributes (shown below) defining the file paths on both the vRO server and the Veeam server for the setup.bat file and the Veeam update executable.


Once these are defined it's time to set up our workflow schema – drag two scriptable tasks onto the schema editor and label them "Copy Files to VBR Server" and "Execute Setup.bat", or something more to your liking.


The Copy Files scriptable task handles the copying of both the Veeam update and the setup.bat file.  Thankfully most of the scripting for copying a file has already been written and is stored inside a default vRO workflow titled "Copy file from vCO to guest".  I simply copied the script out of that workflow, pasted it into my scriptable task and modified it slightly to suit my needs.  You can see my final script below, along with a screen cap of the bindings so you can get a better understanding of which attributes/parameters need to be mapped into the scriptable task.  If you run into trouble, mainly permission issues, have a look at this post by Nick Colyer which greatly helps with that issue.


//Authenticate against the guest OS of the target VM
var host = vm.sdkConnection;
var guestOperationsManager = host.guestOperationsManager;
var guestAuth = new VcNamePasswordAuthentication();
guestAuth.username = vmUsername;
guestAuth.password = vmPassword;
var fileManager = guestOperationsManager.fileManager;
result = false;
//Copy setup.bat into the guest
var attr = new VcGuestFileAttributes();
var srcFile = new File(vroPathToSetup);
var uri = fileManager.initiateFileTransferToGuest(vm, guestAuth, guestPathToSetup, attr, srcFile.length, true);
result = fileManager.putFile(vroPathToSetup, uri);
//Copy the Veeam update executable into the guest
var attr2 = new VcGuestFileAttributes();
var srcFile2 = new File(vroPathToVeeam);
var uri2 = fileManager.initiateFileTransferToGuest(vm, guestAuth, guestPathToVeeam, attr2, srcFile2.length, true);
result = fileManager.putFile(vroPathToVeeam, uri2);

From here we move on to the "Execute Setup.bat" scriptable task.  Again, this script was borrowed and modified slightly from the "Run program in guest" workflow that ships with vRO – the script and a screencap of the attributes/parameters are shown below.


//Authenticate against the guest OS of the target VM
var host = vm.sdkConnection;
var guestOperationsManager = host.guestOperationsManager;
var guestAuth = new VcNamePasswordAuthentication();
guestAuth.username = vmUsername;
guestAuth.password = vmPassword;
guestAuth.interactiveSession = false;
//Describe the program to run - setup.bat with no arguments
var guestProgramSpec = new VcGuestProgramSpec();
guestProgramSpec.programPath = guestPathToSetup;
guestProgramSpec.arguments = "";
guestProgramSpec.workingDirectory = "";
//Kick off the process inside the guest
var processManager = guestOperationsManager.processManager;
result = processManager.startProgramInGuest(vm, guestAuth, guestProgramSpec);

And we are done…

Pretty simple, right?  Once you have saved your workflow you can go ahead and execute it right away.  Or, if you prefer, map the workflow within the vSphere Web Client to your VM inventory object – allowing you to simply right-click a Veeam B&R server and execute the script, all without having to leave the web client.  Either way you are left with a quick, easy, and consistent way to upgrade all of your B&R servers without ever having to log into them – achievement unlocked!

Keep in mind

  • You can only use this for virtualized Veeam servers – any physical servers could be automated as well, but you may need to choose another tool to do the copying and executing of the files
  • You need to ensure that no jobs are running when you perform the upgrade – This is something I’d love to build into the workflow but just need time (story of my life) – for now, manually cancel any currently running Veeam jobs before executing the workflow
  • The workflow reports success right away, before the upgrade is complete – again, I need time for this one.  For now, you can monitor the patch.log file that setup.bat creates – it should say something along the lines of "exit code 0 returned" when the upgrade has completed…
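Until that check is built into the workflow, the idea can be sketched in a few lines of vRO-style JavaScript – note the exact wording of the log line is an assumption here, so match the pattern to what your patch.log actually shows:

```javascript
// Returns true once the log text contains the installer's completion marker.
// The "exit code 0" marker is assumed from the HotFixRunner output described above.
function upgradeFinished(logText) {
    return /exit code 0/i.test(logText);
}

console.log(upgradeFinished("HotFixRunner finished: exit code 0 returned")); // true
console.log(upgradeFinished("Unpacking payload..."));                        // false
```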

Happy update day!

Last weekend to shape your VMworld experience!

Every year VMware takes the thousands of session abstracts submitted by community members and puts them out there to the masses so YOU can have your say, whether it be a yay or a nay!  This year is no different, so if you haven't already cast your votes for the sessions you want to see you'd better hurry, as voting ends this weekend, Sunday, May 18th @ 11:59 PDT.

Now there are a ton of great sessions available and by no means am I going to tell you which ones to vote on (wait, yes I am, and I'm going to do it right now!).  If you are having trouble wading through them all, or just want to get that clicking finger warmed up, why not just search for Session 1683 and give that one a vote.


For real though, I have high hopes for this session – I've teamed up with a couple of the brightest and smartest guys I know when it comes to vCO (James Bowling [blog/twitter] and Joerg Lew [blog/twitter]) to bring an informative panel session revolving around best practices, pitfalls, tips and tricks when it comes to vCenter Orchestrator.  We are hoping to turn this session into a real discussion with a lot of interaction from the audience.  So, if that sounds appealing to you then go ahead and give us the 'Thumbs Up'.  If not, then leave it unchecked – actually, don't, just thumbs it up anyway 🙂

Either way, VMworld 2014 is quickly approaching and I hope to see you there!

To The Point – Removing stranded vCenter Orchestrator servers from vCenter

Ever have an issue where, no matter what you do, you can't seem to get rid of an old vCenter Orchestrator server that's registered with vCenter?  No?  Well, you might want to just stop reading then!  Either way, I have – and no matter how many times I un-configured the vCenter plug-in, removed the vCenter plug-in, or reloaded the vCenter plug-in, the entry just would not go away.  After a slew of curse words and a half dozen restarts of vCO and SSO I finally prevailed – and this is how I did so.

Take the “mob” approach and “dispose” of the vCO server.

Don't worry, there's no illegal activity here – I'm not talking about taking Soprano-style action, but what we will do is get a little Al Capone on the vSphere MOB (Managed Object Browser).  The MOB is basically a graphical representation, or hierarchy, of our vSphere objects which allows us to view properties and invoke methods on different objects to achieve the respective results.  That's about the best way I can explain it – anyways, let's cut to the point and get rid of that registered vCO server.

Access the mob by opening up your favorite browser and navigating to https://IP_OF_VCENTER/mob/

Once authenticated let’s click our way to our vCenter extensions by hitting ‘content’, then ‘Extension Manager’


Here we can see all of the extensions that are currently registered with our vCenter Server, including our vCenter Orchestrator extension.


Before we can go ahead and unregister this extension we need to obtain the extension key, or unique identifier.  To do so – you guessed it – click on it.


If you have more than one vCO extension and can't figure out which one is which, try drilling further down into the object by clicking 'server' in order to get the URL associated with the object.  That should help you figure out which object key you need.
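The same disambiguation can be sketched in plain JavaScript – the extension entries below are made-up stand-ins for what the MOB shows, not real data:

```javascript
// Each MOB extension entry exposes a key plus (via 'server') a URL.
// Given those, pick the key whose URL points at the stale vCO host.
var extensions = [
    { key: "com.vmware.vim.sms", url: "https://vcenter.lab.local/sms" },
    { key: "com.vmware.vco",     url: "https://old-vco.lab.local:8281/vco" }
];

function findExtensionKey(exts, hostFragment) {
    for (var i = 0; i < exts.length; i++) {
        if (exts[i].url.indexOf(hostFragment) !== -1) {
            return exts[i].key;
        }
    }
    return null;
}

console.log(findExtensionKey(extensions, "old-vco")); // com.vmware.vco
```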

Copy that object key and head back to the 'Extension List' screen.  Here we need to invoke a method located in the bottom section called 'UnregisterExtension'.  Simply paste in the extension key you obtained above and click 'Invoke Method'.


Booyah!  All should be well in the world again!  Hope someone finds this short, but to the point, post helpful!  As always, comments, concerns and threats are all welcome through the outlets provided in the comments section.  Happy Friday and #GOHABSGO.


Quick To The Point – No port options on vCO ssh plug-in.

Do you run ssh on its standard, default port?  I, for one, do not.  That's why I was surprised when I found that the built-in workflows for the ssh plug-in in vCO have no input parameter for port and simply ride along on the standard ssh port of 22.

Thankfully, for those of us that modify our sshd_config, the underlying methods and constructs of the ssh plug-in allow us to specify a port – it's just that the workflows built around them don't reflect that.  So let's go through an example of modifying the "Run SSH Command" workflow to accept a port.

First up, duplicate the workflow. You don't want to modify the original as you have no idea what might be using it.

Next, we need to add an input parameter (type string) called – you guessed it – port.


On the Presentation tab let's drag our port parameter up to the Host selection display group so things look nice and pretty


Now let’s edit the first scriptable task in the schema (Execute SSH Command).  We will need to ensure that we have access to our port parameter by mapping it in our ‘In’ tab.


And on the Scripting tab of that same element, we can see our session gets established on Line 2.  Here we can simply change the options we send to that construct from "var session = new SSHSession(hostName,username);" to "var session = new SSHSession(hostName,username,port);"
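Since the new port input arrives as a string and may be left blank by existing callers, a small guard (my own addition – not part of the stock workflow) keeps the old default of 22 intact:

```javascript
// Parse the port input; fall back to the standard ssh port when it is
// empty or not a number, so existing callers of the workflow keep working.
function resolveSshPort(port) {
    var parsed = parseInt(port, 10);
    return isNaN(parsed) ? 22 : parsed;
}

// Inside the scriptable task you would then establish the session with:
// var session = new SSHSession(hostName, username, resolveSshPort(port));
console.log(resolveSshPort(""));     // 22
console.log(resolveSshPort("2222")); // 2222
```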

Save and close your workflow. Go ahead and test it now. You should now be executing your ssh commands on whichever port you have specified.  Now you can go ahead and add a port parameter to any workflow that references new SSHSession();

Migrating workflows from one vCO server to another!

Although vCenter Orchestrator does offer an upgrade path for its appliances, there are times where I have found myself needing to deploy a new one and migrate all of my configuration and workflows to it.  vCO has some great built-in workflows that can configure itself, but nothing that really deals with workflows.  Sure, you can export and import workflows one at a time using the client, which may be OK if you have 10 workflows, but what if you have 20, 40, or 100 that you need to migrate?  That could get pretty monotonous.

The sheer energy of a mouse click.

That's right – who wants to exert all that energy of clicking the mouse to get these workflows migrated when we can just orchestrate or automate it – after all, this is vCenter Orchestrator we are talking about.  vCO has a REST plug-in that allows us to create workflows around any application that offers a REST API, but did you know that vCO also has its own REST API available for us to use?  So that's where I started with my task of migrating workflows, and by the time I got to writing this post it truly ended up being a community effort.

“Steal from the best”

This was a quote that I saw on one of Alan Renouf's slides during a VMworld presentation on PowerCLI.  "Steal from the best", "Don't re-invent the wheel" – sayings that have resonated with me for my entire career – why re-do something if it has already been done?  So when I set out on this small project I ended up using two main sources: this post by William Lam on VirtuallyGhetto on how to use curl to export a single vCO workflow, and this forum reply by igaydajiev who "showed me the light" on how to import a workflow back in!  Without these two guys I wouldn't have been able to do any of this.

Enough already let’s get to it!

So I chose to go the PowerShell route to accomplish this, as I'm not too familiar with the REST plug-in for vCO.  As well, I am targeting only a certain category – so basically the script takes in the following parameters…

  • OldvCOServer – the old vCO appliance
  • OldvCOUser/OldvCOPass – credentials for the old appliance
  • OldvCOCategory – Category Name to export workflows from
  • TempFolder – Location to store the exported workflows
  • NewvCOServer – The new vCO appliance
  • NewvCOUser/NewvCOPass – credentials for the new appliance
  • NewvCOCategory – Category name on the new server where you would like the workflows imported.

As far as an explanation goes, I'll just let you follow the code and figure it out.  It's basically broken into two different sections: the export and the import.  During the import routine there is a little bit of wonky code that gets the ID of the targeted category.  This is the only way I could figure out how to get it and I'm sure there is a more efficient way of doing so, but for now, this will have to do.  Anyways, the script is shown below and is downloadable here.

# Script parameters (see the list above); 8281 is the default vCO API port
param(
    [string]$OldvCOServer, [string]$OldvCOUser, [string]$OldvCOPass, [string]$OldvCOCategory,
    [string]$TempFolder,
    [string]$NewvCOServer, [string]$NewvCOUser, [string]$NewvCOPass, [string]$NewvCOCategory,
    [int]$vCOPort = 8281
)
# Type to handle self-signed certificates
add-type @"
    using System.Net;
    using System.Security.Cryptography.X509Certificates;
    public class TrustAllCertsPolicy : ICertificatePolicy {
        public bool CheckValidationResult(
            ServicePoint srvPoint, X509Certificate certificate,
            WebRequest request, int certificateProblem) {
            return true;
        }
    }
"@
[System.Net.ServicePointManager]::CertificatePolicy = New-Object TrustAllCertsPolicy
[byte[]]$CRLF = 13, 10
function Get-AsciiBytes([String] $str) {
    return [System.Text.Encoding]::ASCII.GetBytes($str)
}
function ConvertTo-Base64($string) {
   $bytes  = [System.Text.Encoding]::UTF8.GetBytes($string);
   $encoded = [System.Convert]::ToBase64String($bytes);
   return $encoded;
}
##############  EXPORT OLD WORKFLOWS ##############################
Write-Host "Beginning Export Routine" -ForegroundColor Black -BackgroundColor Yellow
Write-Host ""
# build uri
$uri = "https://"+$oldvCOServer+":"+$vCOPort+"/vco/api/workflows/?conditions=categoryName="+$oldvCOCategory
# Authentication token for old server
$token = ConvertTo-Base64("$($oldvCOUser):$($oldvCOPass)");
$auth = "Basic $($token)";
$header = @{"Authorization"= $auth; };   
#execute API call
$workflows = Invoke-RestMethod -URi $uri -Method Get -ContentType "application/xml" -Headers $header
Write-Host "Exporting $($workflows.total) workflows from $oldvCOCategory"
Write-Host "-------------------------------------------------------------------------------------------"
#loop through each workflow and export to TempFolder
foreach ($href in $workflows.link.href)
{
    #retrieve information about the specific workflow
    $header = @{"Authorization"= $auth; };
    $workflow = Invoke-RestMethod -URi $href -Method Get -ContentType "application/xml" -Headers $header
    #strip characters that aren't filename-safe from the workflow name
    $workflowname = [System.Text.RegularExpressions.Regex]::Replace($($workflow.name),"[^0-9a-zA-Z_]","")
    $filename = $TempFolder + $workflowname + ".workflow"
    #setup new header requesting a zip export
    $header = @{"Authorization"= $auth;
                "Accept"="application/zip"; };
    Write-Host "Exporting $($workflow.name) to $filename - " -NoNewline
    Invoke-RestMethod -URi $href -Method Get -ContentType "application/xml" -Headers $header -OutFile $filename
    Write-Host "Done" -ForegroundColor Green
}
Write-Host ""
Write-Host "Export Routine Complete" -ForegroundColor Black -BackgroundColor Yellow
##############  IMPORT WORKFLOWS ##############################
Write-Host ""
Write-Host ""
Write-Host "Import Routines to new server" -ForegroundColor Black -BackgroundColor Yellow
Write-Host ""
#Generate auth for new vCO Server
$token = ConvertTo-Base64("$($newvCOUser):$($newVCOPass)");
$auth = "Basic $($token)";
#Get Category ID
$header = @{"Authorization"= $auth; };   
$uri = "https://"+$newvCOServer+":"+$vCOPort+"/vco/api/categories/"
$categories = Invoke-RestMethod -URi $uri -Method Get -Headers $header -ContentType "application/xml" 
#walk the category links looking for the target category's id attribute
foreach ($att in $categories.link)
{
    if ($att.attributes.value -eq $newvCOCategory)
    {
        foreach ($newatt in $att.attributes)
        {
            if ($newatt.name -eq "id")
            {
                $categoryID = $newatt.value
            }
        }
    }
}
$impUrl = "https://$($newvCOServer):$($vcoPort)/vco/api/workflows?categoryId=$($categoryId)&overwrite=true";
$header = @{"Authorization"= $auth;
            "Accept"= "application/zip";
            "Accept-Encoding"= "gzip,deflate,sdch";};    
$workflows = Get-ChildItem $TempFolder -Filter *.workflow
Write-Host "Importing $($workflows.count) workflows to $newvCOCategory"
Write-Host "-------------------------------------------------------------------------------------------"
foreach ($workflow in $workflows)
{
    Write-Host "Importing $($workflow.name) - " -NoNewline
    $body = New-Object System.IO.MemoryStream
    $boundary = [Guid]::NewGuid().ToString().Replace('-','')
    $ContentType = 'multipart/form-data; boundary=' + $boundary
    #first multipart section: the category id
    $b2 = Get-AsciiBytes ('--' + $boundary)
    $body.Write($b2, 0, $b2.Length)
    $body.Write($CRLF, 0, $CRLF.Length)
    $b = (Get-AsciiBytes ('Content-Disposition: form-data; name="categoryId"'))
    $body.Write($b, 0, $b.Length)
    $body.Write($CRLF, 0, $CRLF.Length)
    $body.Write($CRLF, 0, $CRLF.Length)
    $b = (Get-AsciiBytes $categoryId)
    $body.Write($b, 0, $b.Length)
    $body.Write($CRLF, 0, $CRLF.Length)
    #second multipart section: the workflow file itself
    $body.Write($b2, 0, $b2.Length)
    $body.Write($CRLF, 0, $CRLF.Length)
    $b = (Get-AsciiBytes "Content-Disposition: form-data; name=`"file`"; filename=`"$($workflow.Name)`";")
    $body.Write($b, 0, $b.Length)
    $body.Write($CRLF, 0, $CRLF.Length)
    $b = (Get-AsciiBytes 'Content-Type:application/octet-stream')
    $body.Write($b, 0, $b.Length)
    $body.Write($CRLF, 0, $CRLF.Length)
    $body.Write($CRLF, 0, $CRLF.Length)
    $b = [System.IO.File]::ReadAllBytes($workflow.FullName)
    $body.Write($b, 0, $b.Length)
    $body.Write($CRLF, 0, $CRLF.Length)
    #closing boundary
    $body.Write($b2, 0, $b2.Length)
    $b = (Get-AsciiBytes '--');
    $body.Write($b, 0, $b.Length);
    $body.Write($CRLF, 0, $CRLF.Length);
    $header = @{"Authorization"= $auth;
            "Accept"= "application/zip";
            "Accept-Encoding"= "gzip,deflate,sdch";};
    Invoke-RestMethod -Method POST -Uri $impUrl -ContentType $ContentType -Body $body.ToArray() -Headers $header
    Write-Host "Done" -ForegroundColor Green
}
Write-Host ""
Write-Host "Import Routine complete" -ForegroundColor Black -BackgroundColor Yellow

Kerberos authentication for the PowerShell plugin in vCO 5.5

The ability to have vCO kick off PowerShell scripts is pretty awesome!  And the fact that you can kick these off contextually inside of the vSphere Web Client is even more awesome!  Even more awesome than that – yes, that's a lot of awesome – are the new features offered with vCenter Orchestrator 5.5.  So, I've taken the plunge in one of my environments and upgraded.  Since then I've been slowly migrating workflows over – one of which utilized the PowerShell plug-in.  Now, since the appliance mode of vCO requires a rip and replace rather than an upgrade (because I'm using the embedded database), I had to reinstall the PS plug-in, therefore forcing me to reconfigure the Kerberos settings on vCO.  During this I realized that things are a little bit different than when I first blogged about vCO and PowerShell here.  Below is how I got it to work…

First up is the WinRM setup on your PowerShell host.  This process hasn't changed from 5.1, however I'll still include the steps and commands that need to be run below.  Remember, these are to be executed on the Windows box that you wish to run the PowerShell scripts from.

  • To create a winrm listener and open any required firewall ports: winrm quickconfig
  • To enable Kerberos authentication: winrm set winrm/config/service/auth @{Kerberos="true"}
  • To allow transfer of unencrypted data: winrm set winrm/config/service @{AllowUnencrypted="true"}
  • To up the max memory per shell (I needed to do this to get things working): winrm set winrm/config/winrs @{MaxMemoryPerShellMB="2048"}

Now on to the krb5.conf file – this is where things get a bit different.  In vCO 5.1 we were required to edit the krb5.conf file located in /opt/vmo/jre/lib/security/ – well, if you go looking for that directory on 5.5 you won't find it.  Instead, we need to create our krb5.conf file in /usr/java/jre-vmware/lib/security/.  As far as what goes in the file, it is the same as before and is listed below… (obviously substituting your own domain for lab.local and your own domain controller in the kdc definition).

[libdefaults]
    default_realm = LAB.LOCAL
    udp_preference_limit = 1

[realms]
    LAB.LOCAL = {
        kdc = dc.LAB.LOCAL
        default_domain = LAB.LOCAL
    }

[domain_realm]
    .lab.local = LAB.LOCAL
    lab.local = LAB.LOCAL

After you have saved the file in the proper directory we need to modify the permissions.  The following line should get you the proper permissions to get everything working.

chmod 644 /usr/java/jre-vmware/lib/security/krb5.conf

Just a few other notes!  You might want to modify your /etc/hosts file to be sure that you are able to resolve the FQDNs of both your domain controller and the PowerShell host you plan to use.  Also, when adding the PowerShell host be sure to select Kerberos as your authentication type and enter your credentials using the 'user@domain.com' format.
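For reference, the /etc/hosts entries might look something like the following – the IP addresses and host names here are purely illustrative, so substitute your own:

```
192.168.1.10   dc.lab.local        dc
192.168.1.20   pshost.lab.local    pshost
```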

For now, that should get you automating like a champ!

Why Orchestrate?

As you all can probably tell by reading my blog lately, I have gone head first down a path that led me directly to vCenter Orchestrator.  Honestly, since doing that I haven't looked back.  I've known about the application for quite some time but could never find a use case for it in my environment.  Sure, getting the same results every time is great – that's the most obvious reason to script anything – but I had already been achieving this with Perl, PowerCLI and PowerShell, so why orchestrate?

This is an attitude I had for quite some time.  I'll just keep plugging away at all of the tasks I need to do throughout my day, finding efficiencies here and there, scripting some things, manually pounding away on the keyboard for others – no biggie!  Then something happened – and by something I mean our environment grew…substantially.  We more or less doubled in size over the course of a few months and things started getting really crazy really fast.  Those daily tasks, or one-off things, that I had been doing started to become a hold up to the rest of the team and the business.  Let's take a simple example of deploying a new application or VM into your environment…

Wait, I thought VMware was supposed to improve provisioning time?

Well it certainly has, I can deploy servers in a matter of minutes now, have them on the right network, with our base load of software – and even with some of the Software Defined Datacenter pieces implemented I can have security and compliance built right into this deployment process as well.  But, the fact of the matter is I still have a lot of other things I need to do in order to call my server or VM completely deployed.  That's where vCenter Orchestrator comes in.

So I'm secure, provisioned and have a base software load installed and configured, what else is there?

Backup/Replication/DR – Some products will point to a datastore and/or cluster as their target, which means this may already be set up for you when a new VM is deployed.  However, I don't have my backup solutions configured that way.  I like to add my VMs to jobs which are based on RPO, so this is something I need to do manually after a VM has been provisioned.

Monitoring/Reporting – Again, some products will automatically pick up new VMs and begin monitoring them.  I do have vCOPs set up; however, there are many other tools I use to monitor specific application-level statistics and performance, which I need to go and set up manually after I deploy the VM.

Active Directory placement and group policy – For the Windows VMs, I like these to be sitting in the proper OU after I deploy them; without this they will never receive the proper group policy – again, this needs to be set up after the fact.

So how does vCO help with this?

vCenter Orchestrator by itself doesn't – but coupled with all the plug-ins available it becomes a pretty powerful tool.  If any of the services that perform those additional tasks offer some way to work programmatically – an API, PowerShell cmdlets, a SQL backend, etc. – you should be able to find a vCO plug-in, or simply use straight-up JavaScript, to interact with these programs and automate all that needs to be done.  In fact, you could use the vCenter plug-in and design out the whole deployment process from start to finish using nothing but vCenter Orchestrator.  And for some icing on the cake, you can still initiate these workflows directly inside the vSphere Web Client.

So this is just one small example of how I've been using vCenter Orchestrator.  Honestly, I'm finding more and more use cases for vCO every day and reaping the benefits of orchestration and automation – which usually involves me, a coffee, and watching scripts run 🙂  So, my question to you is…

Do you orchestrate?

My First vCenter Orchestrator Workflow – Part 6 – Resources

OK!  Finally, the end of this series!  Honestly, without the existence of the following resources there is no chance in hell that I would have been able to develop even a simple workflow, let alone start scripting and whatnot.  I was a couple of weeks into vCO before I even learned that you can switch to 'design' mode 😉

So, if you are looking for some awesomeness on vCO, have a look at the following…

Official vCenter Orchestrator Documentation – Should really be your first go to reference for all that is vCO

Automating vSphere with VMware vCenter Orchestrator – Although official docs SHOULD be your first go to item – I find that this book written by Cody Bunch actually IS my first go to reference.

www.vcoteam.info – Great blog with a ton of script examples and whatnot!  Bookmark this…

www.vcoportal.de – Ditto to the above – might as well bookmark this one while you're at it…This is a fabulous blog!

VMware vCenter Orchestrator blog – the official blog from VMware centered around vCO.

VMware API and SDK documentation – this always helps when trying to determine what type of objects or properties any given function requires.

Good Ol’ Twitter – Follow @cody_bunch, @josh_atwell, @vCOteam, @technicalvalues – there are a ton more but these are the ones that I can think of off the top of my head – just search for #vCO and find someone to ask your question to – community seems to be always willing to help.

Thanks for reading, and hopefully you will find vCO as useful as I did!  I encourage everyone to explore what it has to offer – with the same results, every time!  Check out the full series below…

My first vCenter Orchestrator Workflow

My First vCenter Orchestrator Workflow – Part 5 – A little bit of SQL

Thus far in this series of posts we have installed and configured vCenter Orchestrator, as well as set up and utilized a couple of plug-ins: the vCenter plug-in and the PowerShell plug-in.  Before the series ends I wanted to go through one more plug-in – the SQL plug-in.  The SQL plug-in is used for, well, you guessed it: running SQL queries against a database.  If you remember, in Part 4 we set up a workflow that took two input parameters, a host and a location code.  The script then went on to create a new port group on a vSwitch named locationcode-VM_Network.  The logic in itself works fine and the workflow validates; the only problem I see with it is that the user needs to enter the 'location code' manually as an input.  Now, I know you are all very careful and take your time when doing things, but I'm not, and more than likely after doing this 40 or 50 times you can count on me keying in that location code wrong 🙂 – enter the SQL plug-in.

So the setup is as follows: I have a database containing a table with two columns, locationcode and deployed (a boolean).  What I would like to do is have an input parameter that allows me to select the location code from a list of non-deployed locations in a drop-down box (rather than manually entering this information), and in turn pass the location code to my PowerShell script.  Once the script has run, I'd like to update the deployed column for that specific location, ensuring it isn't displayed in the list again – in turn ensuring I can't deploy the same location twice.  Make sense?  I know it's a lot to follow, but the concepts of setting it up are pretty much the same no matter what you are looking to do.
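For reference, a minimal sketch of what such a table could look like – the table and column names match the queries used later in this post, but the exact data types are an assumption on my part (MS SQL types shown here):

```sql
CREATE TABLE Locations (
    locationcode varchar(50) NOT NULL PRIMARY KEY,  -- e.g. 'LOC001'
    deployed bit NOT NULL DEFAULT 0                 -- 0 = not yet deployed
);
```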

Alright – enough background – let's get into it.  I'm not going to go through the installation of the SQL plug-in – it's exactly the same installation process as the PowerShell plug-in, which we covered in Part 4.  Similar to how we set up our PowerShell host, we need to set up our SQL Server database connection inside of vCO.  To do this, fire up your vCO client and navigate through the workflow tree to Library->SQL->Configuration, select the 'Add a database' workflow, right click and Start Workflow.  As you can see, there are a few parameters that we need to pass to the workflow in order for it to run successfully.  First, give this connection a name and select your desired Database Type – in my case MS SQL.  We also need to pass a Connection URL here.  Now, if you don't know anything about JDBC connection URLs, no fear – it's not that difficult.  Simply enter it in the following format…


So, for a SQL Server named DC.lab.local running on the default port 1433 and a database named ServerDeploy, you would use the following…
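The original screenshot of the URL didn't survive here, so treat this as an assumption and check your plug-in's documentation for the exact driver prefix – with the jTDS driver commonly used by the SQL plug-in, the URL would look something like:

```
jdbc:jtds:sqlserver://DC.lab.local:1433/ServerDeploy
```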



After clicking Next we are once again presented with our Shared/Per User session mode – again, I chose Shared to use one simple set of credentials rather than per user authentication.  When you are ready, click 'Submit' to add your new database to vCO's inventory.  One thing to note here is that this step is not strictly necessary.  If we wanted, we could perform all of this at run time inside code; however, for tutorial and learning purposes it's sometimes easier to do it this way.

Alright, nuff config!  It's time now to get started on our task at hand: querying the database for non-deployed servers and pushing the result into a drop-down box as an input parameter to the workflow we created in Part 4.  First off, there is a simple change we need to make to our existing workflow.  Here's a tip – if you don't feel like buffalo-ing your original workflow, simply right click on it and select 'Duplicate Workflow' to copy it.  OK, first off we need a new attribute.  We originally had locationcode set up as an input parameter of type string – and we still need this; however, the result we get back from our database will be an array of strings.  So, on the General tab of your workflow create a new attribute called databaseParameter of type SQL:Database and assign it the value of the database we created earlier (you should be able to browse the inventory to do so).  Once you are done that, simply Save & Close and continue on through any validation errors.


So here comes the real magic!  We need to take that database attribute and pass it to a newly created action, which will in turn spit out an array of string attributes (the locations in our database).  Again, you could do the following all in script embedded within your workflow, but you never know when you are going to need to reuse something, so I've decided to create a new action to do so.  To create a new action, be sure you are in the 'Design' view within your client and click on the Actions tab in the left hand side menu.  Where you place your action doesn't really matter; I chose to right click com.vmware.library.sql and create my action inside that module.  Just remember where you put it and what you named it 🙂


OK, you should now be in the Action editor.  This is where we are going to place the code that does all the querying of the database.  As I said earlier, we need to pass this action a database parameter and it will return an array of strings.  The setup of input parameters and return types, along with all the other work we are going to do, is done on the Scripting tab.  First off, define your return type as an Array/string.  Secondly, add an input parameter of type SQL:Database.  You can see all of this outlined in the capture below…


Hey!  That was easy!  Now it's time for a little scripting.  vCO script is nothing but JavaScript which calls the rich set of APIs that vSphere provides, along with the functions and properties of all the plug-ins installed in vCO.  The script I used is displayed below…

var resultingArray = new Array();
var query = "SELECT locationcode FROM Locations WHERE deployed = 0";
var resultingActiveRecords = databaseParameter.readCustomQuery(query);
for each (var resultRecord in resultingActiveRecords) {
    var locationcode = resultRecord.getProperty("locationcode");
    resultingArray.push(locationcode);
}
return resultingArray;

Simply paste this into the script pane of your action.  As you can see it's a pretty simple script: it creates an array, queries our database, pushes the locationcode column value of each record returned into the array, and finally returns the array.  So – time to Save and Close and head back to our workflow.
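If you want to sanity-check the record-to-array logic outside of vCO, the same mapping can be exercised in plain JavaScript – here the records array is just hypothetical stand-in data for what readCustomQuery would return:

```javascript
// Stand-in for the result of databaseParameter.readCustomQuery(query);
// the location codes here are hypothetical sample data.
var resultingActiveRecords = [
    { locationcode: "LOC001" },
    { locationcode: "LOC002" }
];

var resultingArray = [];
for (var i = 0; i < resultingActiveRecords.length; i++) {
    // Equivalent of resultRecord.getProperty("locationcode") in vCO
    resultingArray.push(resultingActiveRecords[i].locationcode);
}

console.log(resultingArray); // → [ 'LOC001', 'LOC002' ]
```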

So at this point we are left with two tasks, the first being the creation of the drop-down box as an input.  To do this we will need to change the way our original input parameter, locationcode, is displayed.  This is done on the Presentation tab of our script editor.  Be sure you have selected locationcode in the Presentation tree at the top of the screen and select the icon to add a new property to it.  There are several different properties listed, but the one we are after is called 'Predefined list of elements'.  Under the value field of our new property, select the icon to assign an action.  In the popup, find/filter for the action we created earlier, assign the action's parameter to our database attribute, and click Apply.


So we're done, right?  Well, kinda.  You could run the workflow now and it would populate our input, go ahead and run the PowerCLI script, and the VM Network would be created.  However, if you remember, it was my intent to go back at the end of the workflow and update our database to ensure that the same locationcode cannot be selected twice.  To do this we will need to drag a new Scriptable task element in to run after we invoke our PowerCLI script.  Inside this scriptable task we will need to import a couple of local attributes in order to run the SQL we need: the locationcode and the databaseParameter.


As for the script, it's going to look very similar to the syntax that we placed inside of our action, with the exception of the executeCustomQuery function in place of readCustomQuery, and the actual query itself being different.  The following is what I used…

var query = "UPDATE locations SET deployed = 1 WHERE locationcode = '" + locationcode + "'";
databaseParameter.executeCustomQuery(query);

And now, at last, we are done!!  Go ahead and run your workflow, either from within the vCO client or the Web Client, and you should now see a drop-down selection for the locationcode.  Once a location is selected and the workflow runs, it will create the VM Network, then update our database to ensure that the selected locationcode is not shown again in the drop-down.
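One thing worth noting: because the UPDATE statement is built by plain string concatenation, a location code containing a single quote would break the query.  Here's a minimal plain-JavaScript sketch of a cheap guard – doubling any single quotes in the value before concatenating (the sample value is hypothetical):

```javascript
var locationcode = "LOC001"; // hypothetical sample value

// Double any single quotes so the value can't break out of the SQL string literal
var safeCode = locationcode.replace(/'/g, "''");
var query = "UPDATE locations SET deployed = 1 WHERE locationcode = '" + safeCode + "'";

console.log(query); // → UPDATE locations SET deployed = 1 WHERE locationcode = 'LOC001'
```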


So that, my friends, is the end of the line for me on creating my first vCenter Orchestrator workflow, but it is most definitely not the end of the line with vCO.  With access to a huge set of vSphere APIs, all of the functionality and properties provided by its plug-ins, and some sweet integration with the vSphere Web Client thrown in, I'm beginning to see my usage of vCO ramp up within my current role.  This series has hardly even grazed the surface of vCO's functionality, so I urge you all to go out there and learn as much as you can about this product.  I'll do one more post in the series outlining some of the resources that I found during the creation of this workflow, so expect to see that soon.

My first vCenter Orchestrator Workflow

My First vCenter Orchestrator Workflow – Part 4 – A look at the Powershell Plug-in

Alright!  So far throughout this series of posts we have installed, configured and set up vCenter Orchestrator, as well as created and tested the integration between vCO and the vSphere Web Client.  So what's next in my venture?  Well, if you remember back to Part 1 of the series, I set out on a quest to develop a workflow that would take an unconfigured host and go through a series of tasks to create datastores, rename networks, set up the hostname, etc… basically configure it the way I wanted.  Now, there are built-in workflows in vCO to do this; however, I had already done this work inside of a PowerShell script – so my main goal was to get vCO to execute that script.  And this leads us to Part 4!

So, first off, as I said above there are a ton of great built-in workflows inside of vCO that can do pretty much anything.  Some things, however, such as the ability to run a PowerShell script, are missing!  Thankfully there is a thriving developer community built around vCO and its SDKs, allowing third party vendors to release their own sets of workflows in the form of a plug-in.  So, as you probably guessed by now, we will need to install a plug-in to handle our PowerShell script execution.

This plug-in is called the vCenter Orchestrator Plug-in for Microsoft Windows PowerShell and can be found over on VMware's site.  You will need to download this file.  Once you have your hands on the vmoapp file, we need to go into the vCenter Orchestrator Configuration page (https://IP_OF_vCO:8283/) to install it.  Select Plug-ins from the left hand menu, then browse down to the 'Install new plug-in' section.  Upload your file, click 'Upload and install', accept the EULA and away you go… If you browse back to the main Plug-ins page you will notice that it states the plug-in will be installed at the next server startup; if you wish, you can trigger this manually by going to Startup Options -> Restart Service.


At this point we can fire up our vCO client.  Having a look in the Library of workflows, we can now see a PowerShell folder.  Expanding this out further lets us see some of the workflows that come packaged with the plug-in.  We now have a few configuration steps to go through before executing scripts with this plug-in.

First off we need to set up a PowerShell host.  A PowerShell host is simply a Windows machine with PowerShell installed, located somewhere in your environment, that will handle the storing and execution of the scripts.  There is a great article covering all of the steps here, but I will go through and explain them all as well.  Keep in mind this might not be the most secure way to go about everything – but this is simply a lab so I'm not too worried – for those concerned w/ security implications, check out the full, lengthy, thrilling WinRM read on the MSDN 🙂

So, once you have chosen which machine you would like to be your Powershell host go ahead and drop to a command prompt on that machine and run the following list of commands.

  • To create a WinRM listener and open any required firewall ports
  • winrm quickconfig
  • To enable Kerberos authentication
  • winrm set winrm/config/service/auth @{Kerberos="true"}
  • Allow transfer of unencrypted data – told you it wasn't the most secure 🙂
  • winrm set winrm/config/service @{AllowUnencrypted="true"}
  • Up the max memory per shell – I needed to do this to get things working
  • winrm set winrm/config/winrs @{MaxMemoryPerShellMB="2048"}

And there we are!  Things should now be set up on the host end to allow connections from our vCO server.  Next we need to configure Kerberos on the vCO end of things.  To do this you will need to console into your vCO server and get on the command line there – hope you're comfortable with vi 🙂

Basically we need to create a file under /opt/vmo/jre/lib/security called krb5.conf – to do so simply

cd /opt/vmo/jre/lib/security
vi krb5.conf

This file will need to look as follows – obviously change it to fit your environment – if you happen to be using a default configuration of the AutoLab then you are good to just copy the following…

[libdefaults]
default_realm = LAB.LOCAL
udp_preference_limit = 1

[realms]
LAB.LOCAL = {
kdc = dc.lab.local
default_domain = LAB.LOCAL
}

[domain_realm]
.lab.local = LAB.LOCAL

Once you are done simply hit :wq to save and exit.  In order to grant vCO permission to access our newly created krb5 file you will also need to change ownership of the file using the following command

chown vco.vco krb5.conf

And, in order for the changes to take effect, restart the vCO service – you can do this from the configuration page under Startup Options, or simply

service vcod restart

Whew!  OK!  Getting somewhere…  Now we need to add the host into the configuration of the PowerShell plug-in.  I find the easiest way to do this is to simply run the 'Add a PowerShell host' workflow; however, you can develop the script/code on the fly to add the host during your own workflow if you want.  The way I have documented is as follows…

Browse to the PowerShell plug-in configuration by expanding Library->PowerShell->Configuration in the workflows view of your vCO Client.  Right click ‘Add a PowerShell host’ and select Start Workflow.


Here we need to add a few key pieces of information.  Enter a name for your PowerShell host – this can be whatever you wish – then enter the HOSTNAME of the host you set up (Kerberos will not work with an IP) and click Next.


The only item that needs to be changed on the Host Type screen is Authentication; be sure Kerberos is selected and click ‘Next’.


In the User Credentials screen you have a couple of options in regards to Session mode.  Again, since this is a lab environment I selected Shared Session; however, if you were looking at a more production-type deployment you might want to explore the Per User setting, which executes commands under the credentials of the user currently connected to vCO.  Enter some credentials with access to the PowerShell host and click 'Submit'.  The workflow should kick off and you can watch it flow through the different elements.  When done, hopefully you have a nice green check mark beside it!

Now that we have successfully installed the PowerShell plug-in and added a PowerShell host, let's go ahead and create a script on our PowerShell host to run, in order to make sure everything works.  Although my end goal is to completely configure a host, for simplicity's sake I'll use a small snippet of the complete script.  So, I have placed the following script on my PowerShell host to simply add a new VM Network to vSwitch0.  As you can see the script is pretty simple: it takes two parameters, a host and a location code.  The location code is used to represent uniqueness in my environment and is actually used multiple times in the complete script for the hostname, datastores, etc…

param ($hosttoconfigure, $locationcode)
Add-PSSnapin VMware.VimAutomation.Core
$username = "Host Username"
$password = "Host Password"
$success = Connect-VIServer $hosttoconfigure -User $username -Password $password
Get-VirtualSwitch -Name vSwitch0 | New-VirtualPortGroup -Name "$locationcode-VM_Network" -Confirm:$false

So let’s get started on our workflow to call this script.  Again, create a new workflow and call it what you will.   Once you are in the workflow editor move to the Inputs tab.  Here we will get the information from the user that we require in the script.  So go head and add a couple of inputs to the workflow, one being hosttoconfigure (Type VC.HostSystem) and locationcode (Type String).


By selecting our host parameter and clicking the eye icon we can set the 'Show in inventory' presentation property on the host input.  This will allow the user to browse through their vCenter inventory to select the host when running the workflow, so you might as well go ahead and do that as well.


The PowerShell workflow that we will be using is called 'Invoke an external script' and requires three parameters: a PowerShell host, which we set up previously; the script to execute; and the arguments to pass to the script.


Once on the Schema tab, go ahead and drag the 'Invoke an external script' workflow into yours.  When you see the parameter setup black bar pop up, click 'Setup' and we can have a look at the parameters.


Through this screen we can promote or link the parameters from the script workflow to our main workflow.  You can see that I've done a few things.  One, I've set the host parameter to a value (same as creating an attribute) and browsed through my inventory to select the PowerShell host that we created previously.  Secondly, I manually entered the path to my PowerShell script in the externalScript parameter.  Thirdly, I selected to skip the arguments parameter at this time.  I've done this because I need to pass the hostname as well as the locationcode parameter from my main workflow to this parameter, which we will do with a separate element called a Scriptable task.  Go ahead and click 'Promote'.


So now we have our main input parameters set up (locationcode and hosttoconfigure) as well as our script ready to go.  We just need to tackle that arguments parameter.  As I said, we will do this with a separate Scriptable task element, so go ahead and drag that into your schema, just before your Invoke an external script element.  A scriptable task is just that – kind of a blank canvas for any sort of scripting (JavaScript) that you desire.


So let’s go ahead and click the pencil icon to edit our scriptable task.  There are few IN parameters that we are going to need inside of our script; The first is hosttoconfigure and the second is locationcode.


Now move on to the OUT parameters.  We will need to create a new out parameter to store our arguments, so click the link for a new parameter and create a new workflow attribute called scriptArgs of type string.

Now we need to move to the Scripting tab.  This is where we concatenate a property of the hosttoconfigure parameter with the locationcode to populate our newly created scriptArgs variable.  Honestly, it's a one-liner.  I'm sure there is a way to create this workflow without even requiring this whole Scriptable task, but this is the way I got it to work (again, my FIRST vCO workflow).  Copy/paste the following into your scripting tab.

scriptArgs = hosttoconfigure.Name + " " + locationcode;
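Outside of vCO, you can see what this one-liner produces with a quick standalone sketch – the host name and location code below are hypothetical, and the object simply mimics the Name property used above:

```javascript
// Mimics the VC:HostSystem input's Name property (hypothetical values)
var hosttoconfigure = { Name: "esx01.lab.local" };
var locationcode = "LOC001";

// Same concatenation as in the scriptable task
var scriptArgs = hosttoconfigure.Name + " " + locationcode;

console.log(scriptArgs); // → esx01.lab.local LOC001
```

The resulting string is exactly what gets handed to the script's two positional parameters.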

OK, you can close out of your Scriptable task now.  We are almost there.  All that is left to do is map the external script workflow's arguments parameter to our Scriptable task element's scriptArgs attribute.  Thankfully this is an easy thing to do.


Once again go into your External Script element and click on the Visual Binding tab.  Here, as shown above, we simply need to drag the scriptArgs in attribute over to the arguments in parameter.


Guess what – you're done!  If everything has been set up properly, and I've done a good job explaining what to do, then you should be able to Save and Close your workflow and actually run it on a host (a test host, of course).  The result: you should get a VM Network set up on vSwitch0 – oh, and a pretty green check mark beside your workflow 🙂

At this point I’m going to cut off this post since it’s getting pretty long.  That being said I’m not going to end the series!  I’m always looking at ways to better automate things and you know, having users type in that location code just isn’t good enough for me, a lot of room for error there; they could type it in wrong, do the same location twice, on and on and on…  And hey, I have a database already setup with my location codes in it and whether or not they have already been deployed.  So, stay tuned for Part 5 of my series where we will go through configuring the SQL plug-in and setting up our location input parameter to be populated by an SQL table – as well as update that same table on completion of our script.

My first vCenter Orchestrator Workflow