Monthly Archives: October 2014

Get the cobwebs out of your Data Center with CloudPhysics

Just the other day I was thinking to myself: you know what this world needs – more Halloween-themed infographics relating to IT.  Thankfully, CloudPhysics, with their analytic powers, pulled that thought out of my head and did just that!  With a vulnerability dubbed Heartbleed, how can you resist it?

On a serious note, some of the data that CloudPhysics has should really scare you.  22% of our vSphere 5.5 servers remain vulnerable to Heartbleed!  41% of clusters do not have admission control enabled!  These are definitely some spooky stats and shouldn’t be ignored!


But what’s Halloween without some tricks and treats, right?

CloudPhysics has you covered there as well!   Be sure to grab their Halloween Cookbook – a collection of tips and tricks on how you can remediate issues within your data center and stay out of the spooky stats that they are collecting.  For an even bigger treat, be sure to sign up and let CloudPhysics help find the data center goblins for you – oh, and it’s free!  Better yet, sign yourself up for the community edition of CloudPhysics – it’s completely free and can give you some great insight into what’s going on inside your virtual data center!  Be sure to head to their official blog post to get all the information you need to fill up your bag!

Google Admin SDK APIs and .net integration – Changing a users password

I know, weird right?  Coding, Google APIs, WTF!?!?  This is a virtualization blog.  Truth is I was a developer before ever moving into the infrastructure space – not much of one, but I was one.  Honestly, this is probably the reason why products like vRealize Orchestrator (more weirdness, calling that vCO) that mix both development and infrastructure together appeal to me so much!  More truth – as much as I try I can’t quite get away from development – it’s a part of what I do (I guess).

Anyways, cut to the chase – I’ve recently been working on a project revolving around integrating Google Apps with a homegrown .net application I’ve written.  Now, there is such a thing as the Provisioning API, which is super easy to use and worked great – but it is deprecated and Google could simply stop providing it whenever they want.  Google suggests people move to the Admin SDK – which, of course, is much harder!  Either way, I needed to provide a way for users to reset other users’ Google Apps passwords from within a .net application, without those users having to provide access or permission to their accounts.  My Google-Fu was strong on this one, and by the time I finally got it to work I had about 27 tabs open, so I thought it might be nice for the next person to stumble upon one page containing all the info they need to make this happen – therefore – this!

To the developers console

The first step to getting everything to mesh with the Admin SDK involves setting up a project in the Google Developers Console.  Simply log in and select ‘New Project’ and give your project a name and an id.  Easy enough thus far.  Once the project has been created we need to enable the Google Admin SDK.  Select APIs from the APIs & auth section on the navigational menu on the left.  Here, we can simply filter our list of available APIs by inputting ‘Admin SDK’ in the browse box and then enable it by switching its status to On.


From here we need to create a client ID, along with the p12 key and credentials we need in order to delegate access.  As mentioned earlier, I’ve decided to go the ‘Service Account’ route, as I would like an account that I can delegate domain-wide admin access to in order to change passwords, create users, etc., and do this without authorization or permission from the users themselves.  To do this, click the ‘Create new Client ID’ button inside the Credentials section of APIs & auth.  When presented with the ‘Application Type’ dialog, select Service account and click ‘Create Client ID’.


Once the process has completed, pay attention to the .p12 key that is automatically downloaded.  We will need this file later when connecting our .net application, so go store it somewhere safe.  Also note the private key’s password in the dialog, as we will need that information as well.


At this point you should see your newly created service account, its Client ID, Email Address, and Public Key Fingerprints.  The only piece of information we need off of this screen is the Client ID – usually a big long string.  Go ahead and copy that to your clipboard as we will need it for the next step.

To the Admin Console

From here we need to go to our Google Apps admin console and grant this newly created service account access to specific APIs.  Once logged into the admin console, launch the Security app (sometimes located under the ‘More Controls’ link near the bottom).


Inside of Security we need to go into the Advanced Settings (located by clicking Show more) and then the “Manage API client access” section.


Now we need to grant our service account access to specific APIs within Google by specifying the individual URLs of each API.  First, paste the Client ID that we created and copied into the Client Name box.  Secondly, copy in all of the API URLs (comma separated) that you want to grant access to.  In my case I’m only looking for User and Group Management, so I entered those scope URLs into the API Scope input box.  If you need help figuring out the URL for the specific API you are looking for, you can find them listed in the Authorize Requests section of the developer guide for each Google API.  Once you are ready to go as shown below, click Authorize.
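For illustration, assuming the user and group management scopes I was after, the comma-separated list pasted into the API Scope box would look something like this (double-check the exact scope strings against the developer guide for your API):

```
https://www.googleapis.com/auth/admin.directory.user, https://www.googleapis.com/auth/admin.directory.group
```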


And now the code

Thus far we’ve done all the heavy lifting as it pertains to opening up access for .net to utilize the Admin SDK APIs – time to get into a little code!  Again, I don’t consider myself a hardcore developer, as you can probably tell from my code.  There may be better ways to do this, but this is the way I found that worked – and again, there’s not a lot of information out there on how to do this.

First up, there are some project references that you need.  Honestly, you can get the .net client libraries directly from Google, but the easiest way to bring packages in is by using NuGet, as it will pull dependencies down for you.  Go ahead and import the Google APIs Client Library, Google APIs Auth Client Library, Google APIs Core Client Library, and Google.Apis.Admin.Directory.directory_v1 Client Library.  That should be all you need to do the password resets.
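From the NuGet Package Manager Console in Visual Studio, that would look something like the following (package IDs as I found them in the NuGet gallery – verify them before installing, as they can change between releases):

```
Install-Package Google.Apis
Install-Package Google.Apis.Auth
Install-Package Google.Apis.Core
Install-Package Google.Apis.Admin.Directory.directory_v1
```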

So the complete script is listed at the bottom, but for learning’s sake, I’ll break down what I’ve done in the sections to follow.

String serviceAccountEmail = "";  // the service account's Email Address, not its Client ID
var certificate = new X509Certificate2(@"c:\p12key\NewProject-3851a658ac16.p12", "notasecret", X509KeyStorageFlags.Exportable);

Here I’m simply declaring some variables: first, serviceAccountEmail – this is the email address (not the ID) of the Service Account we have set up; secondly, our certificate, which is generated by pointing the constructor at the p12 key we generated (remember?) and the key password that was displayed (remember?).

ServiceAccountCredential credential = new ServiceAccountCredential(
   new ServiceAccountCredential.Initializer(serviceAccountEmail)
   {
      User = "",  // the admin user our service account will impersonate
      Scopes = new[] { DirectoryService.Scope.AdminDirectoryUser }
   }.FromCertificate(certificate));

This essentially builds the credential object we need in order to instantiate the service we want to use.  Take special note that I have to pass a User parameter – this is the user we want our service account to impersonate (they will need to have the correct roles/permissions in Google to perform any of the tasks we attempt).  Also note the Scopes array – this specifies exactly which API scopes we want to authenticate to – these will normally match the end of the API URL, just without the periods.  That said, we have auto-complete in Visual Studio, right – use it!

var dirservice = new DirectoryService(new BaseClientService.Initializer()
{
   HttpClientInitializer = credential,
   ApplicationName = "MyNewProject",
});
Each and every API you call from Google inside of .net will need to be stored in a service variable.  This is where we instantiate a new DirectoryService (to access users).  If you were using the Tasks API this would be a new TasksService.  No matter what, we always use a new BaseClientService.Initializer in the constructor.  Note we also pass in our created credential object, as well as the name of the project we created in the Google Developers Console.

User user = dirservice.Users.Get("").Execute();  // pass the target user's email address
Console.WriteLine(" email: " + user.PrimaryEmail);
Console.WriteLine(" last login: " + user.LastLoginTime);
user.Password = "MyNewPassword";
dirservice.Users.Update(user, "").Execute();  // again, the target user's email address

And now the magic happens.  Honestly, this will differ depending on which API you are using, but again we have built-in help and auto-complete in Visual Studio, so you should be able to figure out how to do anything you need.  Above I simply get a user, display a few of their properties, change their password, and then update the user.  Easy enough!

So here’s the code in its entirety.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Security.Cryptography.X509Certificates;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;
using Google.Apis.Admin.Directory.directory_v1;
using Google.Apis.Admin.Directory.directory_v1.Data;
using Google.Apis.Admin.Directory;

namespace ConsoleApplication3
{
   class Program
   {
      static void Main(string[] args)
      {
         Console.WriteLine("Connect to Google API");

         // the service account's email address and the p12 key downloaded from the Developers Console
         String serviceAccountEmail = "";
         var certificate = new X509Certificate2(@"c:\p12key\NewProject-3851a658ac16.p12", "notasecret", X509KeyStorageFlags.Exportable);

         // build the credential, impersonating an admin user with the user management scope
         ServiceAccountCredential credential = new ServiceAccountCredential(
            new ServiceAccountCredential.Initializer(serviceAccountEmail)
            {
               User = "",
               Scopes = new[] { DirectoryService.Scope.AdminDirectoryUser }
            }.FromCertificate(certificate));

         // instantiate the Directory service with our credential and project name
         var dirservice = new DirectoryService(new BaseClientService.Initializer()
         {
            HttpClientInitializer = credential,
            ApplicationName = "MyNewProject",
         });

         // get the user, display a few properties, change the password, and push the update
         User user = dirservice.Users.Get("").Execute();
         Console.WriteLine(" email: " + user.PrimaryEmail);
         Console.WriteLine(" last login: " + user.LastLoginTime);
         user.Password = "MyNewPassword";
         dirservice.Users.Update(user, "").Execute();

         Console.WriteLine("Press any key to continue...");
         Console.ReadKey();
      }
   }
}

I know this post may be a little off topic with the rest of this blog but the fact of the matter is I couldn’t find a lot of information out there on how to accomplish changing Google Apps user passwords from .net.  And when I can’t find information I post it!  Hopefully someone will stumble across this and find it useful!  Thanks for reading!

Veeam Cloud Connect – a customer perspective!

During the partner keynote at VeeamON it was announced that some of its partners will be ready to go in terms of supporting the new Veeam Cloud Connect hosting functionality that is set to be released inside of Veeam Backup and Replication v8.  At its most basic, Veeam Cloud Connect and the Veeam Service Provider certification allow partners to become service providers, letting their customers get their backups offsite and providing a full backup hosting environment.


So, in terms of a customer, what does this mean?  Many small to medium sized businesses use Veeam to take care of their day-to-day backups and, as an added bonus, replicate and test their backups to ensure consistency.  That said, the sheer cost of acquiring space, compute, and storage at an offsite location tends to be a hurdle that is hard to overcome.  But it’s not just that – even if budget is not a problem, the complexity of providing and setting up connectivity, networking, and routing can be a challenge for those without a large IT staff.  Backing up to a VSP-certified Veeam Cloud Connect partner eliminates those challenges, allowing customers to simply rent space from their preferred partner.

From within Veeam Backup and Replication, backing up to a VSP through Cloud Connect is essentially no different than backing up locally.  When we set up our backup repositories we currently select either a Windows or Linux machine with some available storage, and then it is available to be used within our backup jobs.  Veeam Cloud Connect is much the same – a Veeam-powered service provider will simply show up as a backup repository within your Veeam Backup and Replication console and can be used within your jobs just like any other repository.  Easy!

But it’s more than that!

Sure, Veeam Cloud Connect will allow us to easily get our backups offsite without incurring most of the major capital costs that come along with that, but it offers much more – more that centers around trust.  Most everyone is aware of giant cloud providers such as Amazon, Microsoft, and VMware – and the services they provide are amazing – but do we really trust them with our data?  Granting partners the ability to become a Veeam hosting provider brings the trust issue closer to home for the customer!  Partners are in the trenches every day, building relationships with customers and ensuring that these are life-long commitments.  In my opinion, people tend to trust someone they have worked with often, someone they have met many times over the years, and someone who is local – essentially their preferred IT partners and VARs.  By allowing their partners to become hosting providers, Veeam has essentially leveraged the trust that partners have worked to obtain with their customers and, in turn, allowed the partners to provide a well-rounded, complete solution to those customers.  It’s really a win-win all around.

I think about markets like education, healthcare, government – these verticals house very important and sensitive data, data that needs to be protected both onsite and off.  With that said, strict compliance guidelines usually dictate exactly how and where this data may sit.  Honestly, if that place is outside of the US the major players just aren’t there.  I can see how Veeam Cloud Connect can solve this issue.  By utilizing a preferred partner, educational institutions and districts could essentially leverage a partner who is “close to home” to provide them with that offsite repository for their data.  Someone they already know and trust.

Furthermore, why settle for just one?  There are many school districts in Ontario who may or may not leverage Veeam as their backup vendor of choice – but for those that do, I can see tremendous opportunity for partners to design and implement data centers centered directly around the compliance and regulations that the districts face.  Again, a win-win – the partner gets many like customers and the customers get an easy and secure solution – without the need to purchase additional hardware or licensing.

In essence, Veeam Cloud Connect is more than just a backup repository for a customer.  It’s an opportunity.  An opportunity for partners to leverage the trust they have built, an opportunity for customers, especially those in similar verticals, to unite and look for a “groupon”-type deal on offsite backups.  And an opportunity for everyone to ensure that in the event of a disaster, they are not left without the critical data that their business needs to survive!

Veeam Endpoint Backup – Bringing Veeaminess to your physical endpoints

Today (Wednesday) at VeeamON brought with it some of the biggest news to come out of the conference.  Veeam has officially announced the arrival of Veeam Endpoint Backup, a solution enabling IT shops to back up their physical Windows desktops and laptops, in a similar style to how we currently back up our virtual infrastructure with Veeam Backup and Replication.  Oh, one more important key to this announcement: Veeam Endpoint Backup is absolutely free – I thought that warranted a mention in the first paragraph!

But you can’t hot-add my laptop?

Ok, let’s take a look at what we know thus far about how this works.  Basically, we are looking at a package that is installed directly on our endpoint.  From there, we can choose to back up our entire system, individual volumes, or specify individual files to be part of our backup job.  Easy enough thus far, right?  As for targets, or where we are going to back up to, we have a few options – we can back up to a drive attached to the source machine, such as a USB or SATA disk; we can point our job to a NAS device or to a file server via a CIFS share; or, perhaps most appealing to me, we can back up to a Veeam Backup and Replication repository.  From there it’s just a matter of configuring your retention settings and you are set to go with physical protection provided by Veeam.

The restore process is much the same as the backup, meaning we can perform file or volume level restores, as well as complete bare-metal restores.  Veeam has also provided the option to create a recovery type USB key that can be used to boot the endpoint in the event that you aren’t able to get to Windows in order to do the restore.

So what’s an endpoint?

Veeam has stated that this is not an enterprise product and its main purpose is to provide protection for your Windows desktops and laptops.  That said, the list of OS support includes Windows 7 and up, as well as Windows Server 2008 and up – so essentially we could have a Veeam solution for our physical infrastructure.  One can see how this would be very appealing to IT shops that have only a handful of physical servers left and might currently be using another solution alongside Veeam to back these up.

Oh and it all can be yours for only FREE

Veeam has a long history of releasing free products and tools to the virtualization community.  We’ve seen it happen with products like Backup and Replication and Veeam One.  We’ve also seen separate standalone tools come out for free such as the Veeam Explorers and the highly popular FastSCP.  Following that same model comes Veeam Endpoint Backup.  Offered free to everyone and supported with a “best-effort” type model.  Veeam states they simply want to get this product into the hands of IT Professionals in order to get feedback and look at future expansion.

What the future holds

Veeam touting that they want feedback for future expansion is the key here – I would love to see some functionality like this built into Veeam Backup and Replication – allowing us to remotely install some sort of Veeam agent and set up backup jobs directly from within the centralized console.  I’m not saying this is going to happen, but it does seem like a logical step for Veeam to take with the product – and maybe that’s the plan, seeing as this is a separate product targeted at client machines, leaving the door open to provide physical server protection from elsewhere, say Veeam Backup and Replication.  This would allow us to use enterprise-type features such as application-aware processing, as well as things like SureBackup and virtual labs, on our physical infrastructure.  Or even open up the door to having a physical server replicated to a virtual machine.  Of course, this is all just me sitting on the Cosmopolitan balcony speculating while recovering from the VeeamON party last night – and could turn out to simply be the Advil talking 🙂   Even if they do hold steady with just endpoint protection, I’m excited to see where Veeam will take it.  Veeam is a company that is constantly releasing very innovative features into their products, so you never know what you might find inside a 3.x or 4.x version of Veeam Endpoint Backup.

But, back to reality – Veeam Endpoint Backup is here now, it’s cool, it’s free, and it’s going into a public beta come November of this year.  How do I sign up?  Follow the white (green) rabbit to this page and simply provide your email, and you can be the first to know when Veeam Endpoint Backup Free hits the internets!  For now we wait, keep calm, and VeeamON!

Veeam Backup and Replication v8 – What we know so far…

With Veeam hosting their first conference and the release of Veeam Backup and Replication v8 set for sometime in Q4 of this year, one can only put two and two together to come to the conclusion that they have held on to some major announcements for VeeamON.  I could be wrong, but the quietness and the little smiles coming from the Veeam employees around the Cosmopolitan this week lead me to believe there is something BIG in store for us shortly.   With that said, Veeam has been slowly releasing some of the new features that will definitely be included in v8 of their flagship Backup and Replication software, and they should certainly not be ignored.  Let’s have a quick look at what we already know about…

Backup/Restore from NetApp Storage snapshots.

This one has been released for a while.  The same type of technology that was introduced for various flavors of HP arrays can now be applied to NetApp appliances running OnTap 8.1 or later.  This basically allows us to perform both our backups and restores from inside of Veeam while utilizing the SnapMirror and SnapVault technologies that NetApp provides.

Support for EMC Data Domain Boost

Integration between EMC’s Data Domain and Veeam has been underway for a while, and will come to fruition in v8.  By allowing Veeam to leverage the Boost APIs, accessing the source-side deduplication that Boost provides, we can now ensure that we are not copying duplicate blocks across to the Data Domain appliance, eliminating the need to lay those blocks down in the first place.  In the end we are left with some pretty good statistics around speed and efficiency: 50% faster full backup performance and 2x faster backup transformation.  This is pretty awesome!   If you’ve ever watched a full backup transformation you will know why I think so!

Integration with Exagrid

This feature almost applies more to the Exagrid appliance than it does to Veeam, as the technology lives on the Exagrid appliance.  Essentially, the core Veeam Data Mover functionality is placed directly on the Exagrid appliance, allowing us to free up the infrastructure resources that Veeam would normally have used.  From the Veeam end of things, we simply select a deduplication appliance as our backup repository and point it at our newly configured Exagrid appliance.  Exagrid, with the integrated Veeam Data Mover, goes ahead and does all the processing for us.  I like this feature and I would love to see other arrays, especially deduplication-focused backup appliances, adopt it!  I’m sure we haven’t seen the last of this.

Veeam Explorer for SQL Server

Ah, databases, the thorn in every administrator’s side!  Nonetheless, these databases are important and we need to be able to provide some pretty aggressive RPOs and RTOs around them.  The new Veeam Explorer for SQL Server will most definitely help in that realm.  Essentially, it takes the same item-level restore technology that was brought to us in the Exchange Explorer and applies it to our databases, allowing us to restore individual databases back to the original, or to a new, server!  But wait!  There’s more!  I said RPO, right?  Well, along with the nice restore functionality, Veeam has added some pretty slick backup enhancements around SQL as well.  Think transaction log backups!  Basically, we now have a way to perform just transaction log backups without the need to take a full backup every single time.  So essentially, we could take a daily image-level backup of our SQL Server, and then set it to take agentless transaction log backups every 15 minutes throughout the day, thus creating a nice little 15-minute RPO on the specified database!

Veeam Explorer for Active Directory

This one has been in beta for a while now, so most of us have already had the chance to check it out.  Again, we get item-level recovery of all our Active Directory objects, as well as the individual attributes and properties associated with them.  Aside from the item-level restore, the Veeam Explorer for Active Directory contains a couple of nifty notable features.  One, it will restore passwords, both user and computer!  So if you’ve accidentally deleted a user and decide to restore that user, you no longer have to set a temporary password and have them go through the hardship (I know, end users, right? 😉) of changing it again.  Simply restore the user with the original password and you’re done.  The second notable feature is what Veeam calls 1-click compare.  This allows us to quickly take an object within our Active Directory backup and compare its properties and attributes to what is running in production.  Being able to see the differences and changes is crucial in troubleshooting what may have happened or what the next steps will be.

Replication gets a face lift

Perhaps some of the biggest news thus far is how Veeam has accelerated and changed the way replication functions.  In v8, we can expect to see a new option to use our backup files as our replication source.  What this means is that we can simply point our replication job at our backup files and replicate directly from the backup, eliminating the need to touch our production storage twice when we want to both replicate and back up.  This is functionality that competitors have had for a while, and it’s nice to see Veeam, the company that is usually one step ahead of everyone, adopt it as well.

But in true Veeam fashion it doesn’t stop there.  We can now use Veeam’s built-in WAN Acceleration for our replication jobs as well, providing twice the performance of v7.  We also have the ability to ‘warm up’ the cache using backup files that might be located at the target, which in turn saves us bandwidth and greatly decreases the replication window.

Another great feature included inside of v8 is the ability to resume a failed replica job.  In the past, if a job failed, we would need to re-run it, and it would basically start from scratch, copying the complete VM again.  In v8 we have the ability to resume from the last failed state – meaning if we had copied over 50GB of a 60GB VM before it failed, upon resume we would only need to copy the remaining 10GB.  Again, lowering our replication window.

Failover enhancements

When the rubber hits the road and it’s time to failover to a backup site you don’t want to have any uncertainty as to whether things are going to work.  Veeam recognized this and has implemented failover plans into the application.  Essentially we can set up one (or many) failover plans inside of Veeam to ensure that our environments are properly failed over when disaster hits.  These plans can be split up and delegated via roles and permissions allowing administrators to split the load and assign different areas of the failover plan to different people.  This is a must have in terms of DR.

Another feature added into v8 is the planned failover.  The planned failover essentially leverages replication to move VMs from one site to another.  Think of this in terms of disaster avoidance, or even datacenter migrations.


We can now have peace of mind knowing that our Veeam backups are securely encrypted using AES 256-bit.  The new Veeam encryption can be applied to our data at the source, in flight, or at rest on our target storage.  Also note that if you are using the built-in Veeam WAN Acceleration, compression and acceleration continue to function as they normally would – something you typically lose when sending encrypted backups through a third-party accelerator.  This doesn’t seem like a huge feature, but it is definitely a must-have for those with strict security and compliance guidelines.

VeeamZip enhancements and the new QuickBackup

VeeamZip is a cool piece of functionality which lets us take a quick full backup of a VM.  I’ve used this numerous times when I need to archive or test certain functionality and want to have an “out” in case things go bad.  Apparently many other Veeam customers have followed suit, as Veeam has seen customers using VeeamZip in lieu of taking VMware snapshots.  In v8 we now have the ability to apply a retention policy to our VeeamZip backups, allowing us to take that quick VeeamZip and have it automatically removed after x number of days.

As mentioned above, VeeamZip is a FULL backup every time – if we have very large VMs this could take quite a long time to complete.  To counter this, Veeam has introduced QuickBackup.  When we perform a QuickBackup on a VM, Veeam basically polls our backup jobs to determine if we have any other backups of this VM.  If we do, Veeam then uses the latest restore point of that backup as the base, and only backs up the delta as the QuickBackup.  This seems like great functionality for when we just want one quick backup of a VM without affecting our retention and restore points inside of production.

Let your end-users restore

As the administrator looking after my Veeam infrastructure, I’m normally the “go to” guy when it comes to performing any type of restore.  I’ve always understood this, especially if it’s going to be a full VM restore, but for things like restoring someone’s file from last week it just seems trivial to have me do it.  Well, in v8 I can take that responsibility off my plate and provide my end users with the Veeam Self-Service File Restore portal.   Essentially, if you are a local administrator of the VM that you are trying to perform a file-level restore on, you have the access to do so by simply launching the web portal.  Furthermore, if you launch the portal from within the VM you are looking to restore to, it is intelligent enough to automatically import the restore points and present them to you.  Basically, you can perform file-level restores to the VM, from within the VM, without any sort of interaction with the “go to” guy.  This is a win for everyone.

Backup I/O Control

This is a very cool feature that we can utilize to throttle our backup jobs’ resource usage in order to not affect production.  Basically, Backup I/O Control will monitor our datastores before and during our backup jobs.  If a certain amount of latency is reached, Backup I/O Control will halt or throttle Veeam operations to ensure that our production storage is not affected.  When the latency clears, backups begin flowing again.  This fits very well into the availability message that Veeam is pushing.

Snapshot Hunter

We have all most likely experienced the issue of VMs requiring consolidation, which is sometimes triggered by failed Veeam backup jobs.  The problem resides in VMware reporting back that snapshots have been removed when sometimes they haven’t, and what we are left with is something of a “phantom” snapshot – a snapshot that exists on the datastore and is part of the VM, but is not reported inside the VM’s snapshot list.  Another issue arises when we attempt to consolidate the snapshot and locks exist on the datastore files.  There is a lengthy list of things we need to do to resolve this, and it’s not much fun at all!  Well, now we have something called Snapshot Hunter that will help us with this.  Snapshot Hunter can be triggered at the end of a backup/replication job, and it basically does the VMware snapshot consolidation for us, searching for those phantom snapshots and performing consolidations as per VMware best practices.  If the files appear to be locked, or Snapshot Hunter is unsuccessful in consolidating them, no worries – a background process will be spawned and triggered to try again every few hours, ensuring we get that “always-on” experience.

So, with nearly 2000 words, I think I have captured everything we know thus far about Veeam Backup and Replication v8.  Not that these small paragraphs do the features justice – I’m sure I could, and probably will, do some deep-dive posts around each and every one of them.  And, as mentioned at the beginning of this post, I think we are going to have another 2000 words’ worth of features to talk about.  With the Tuesday keynote of VeeamON just an hour away, I guess we will find out!

Veeam reiterates just how important partners are during keynote

Today is Partner Day at VeeamON, and it kicked off with the partner keynote this morning at the Cosmopolitan in Las Vegas.  To call this a keynote would be somewhat misleading, as it was more of a talk-show atmosphere, complete with a host and couches on stage.  The only thing missing was a Top 10 list and a band.  Honestly, it was a refreshing experience to sit and watch this unfold – keynotes can become a little monotonous, and the laid-back conversational style was definitely entertaining to watch.


The first guest was none other than Veeam founder and CEO Ratmir Timashev.  Ratmir sat down in the hot seat, if you will, and took us through a journey of not only how Veeam was founded, but through his professional career as well.  It’s always interesting to hear firsthand how businesses are founded, funded and started, and Ratmir has a great story around building his previous company, Aelita Software, and how it was acquired by Quest – which eventually led him to founding a small company called Veeam Software in 2006.

From there Ratmir went on to praise the channel partners, VARs, and folks out there in the trenches selling and evangelizing Veeam, stating there are really three things needed to stay competitive in the market today: the best product, the best sales and marketing teams, and, most importantly, the best partners.  His praise for the Veeam partner ecosystem was backed up by some data: 40% of customers surveyed said that a major factor in their buying decision around Veeam was a partner recommendation.  In my opinion this is a powerful statement, and a very important one that centers around trust.  It’s the trust that evolves in a customer-partner relationship which really solidifies how effective a partner is.  Thankfully, Veeam has over 25,000 partners across the world who are ready and willing to take on that trusted-advisor role.

After Ratmir we went into a more traditional keynote, with Dave Russell from Gartner taking the stage to talk a little about the trends inside the storage and backup world.  Dave had a tough act to follow and truly did give a great talk, outlining where he and Gartner see the future of IT and backup – more specifically, where the customer/enterprise buying cycle is going within the next 4 years – and validating those claims by reiterating predictions and statements they have made in the past.  Definitely some data that people could take back with them to further validate or prove a solution.

The Tonight Show style took over again with Chris Moore, Veeam’s North American Channel Chief, now taking a seat on the couch!  Chris summarized how Veeam currently supports, and plans to support, their partners, outlining the many services and opportunities that Veeam provides to the channel.  He did this by bringing up a couple of partners, as well as a couple of executives from Veeam, to talk about what they are doing in terms of partner enablement.  Through the use of stories and sheer numbers, I think his point was well taken.

The complete keynote reiterated one thing!  Veeam is 100% dedicated to channel alignment and to go-to-market strategies through their partners – and this isn’t going to change any time soon.   Veeam is not shy about giving their partners “props” and is well aware that their successes – past, present and future – are completely reliant on their partners, the people on the front line promoting, selling, and evangelizing Veeam.   Sure, some of the “partner speak” went right over my head.  I’m not a partner, but whatever was said must have been good, as the place erupted in applause more than a few times.

All too often we see keynotes with speakers standing up dictating a PowerPoint – this was not that!  My props go back to Veeam for the laid-back, conversational, humorous style of keynote delivered this morning.  Throwing some personal touches in there really made the difference in my opinion and made this keynote a huge success!  #VeeamON

What happens in Vegas – is livestreamed – #VeeamON live streaming

The official kickoff of VeeamON is literally just minutes away, and in the case of VeeamON, the old saying of “what happens in Vegas stays in Vegas” most certainly doesn’t apply!

If for some reason you couldn’t attend Veeam’s inaugural event, don’t worry – they have you covered.  Throughout the next couple of days Veeam will be live streaming from the event.  Be sure to tune in here and watch, as there is some exciting news coming up!  I’ll do my best to summarize the event on this blog as well!  #VeeamON


VMware launches new delta version of VCP5-DCV exam

Today I received a message from VMware Education Services introducing a new way for current VCP holders to refresh or re-certify before their VCP expires.  Currently, as it stands, anyone holding a VCP certification earned prior to March 10, 2013 has only until March 10, 2015 to re-certify using one of the following methods.

  • Take the most current VCP exam in any of the available tracks (Datacenter Virtualization, Cloud and Desktop – not sure if Network Virtualization qualifies for this or not).  No matter which track you held your VCP in, all will be refreshed with another two years.
  • Take an advanced level exam, meaning the VCAP DCA or VCAP DCD.  Not only will you advance to the next level, you will refresh your VCP expiration as well.

Prior to today, these were your options.  Now, however, all you VCP holders have a third option, so long as you currently hold the VCP5-DCV status.

What is a delta exam?

This is something new to VMware certifications.  Basically, this exam is based only on the differences between the vSphere 5.0/5.1 and vSphere 5.5 exams.  Also, instead of the normal 135 questions, the delta exam will only have 65.  The biggest difference is how the exam is delivered – you won’t need to drive to a testing center for this one, as it is being offered online through Pearson Vue – I’m assuming in a similar fashion to the VCA exams.  Another noticeable difference is price: this one comes in at $120 USD instead of the normal $220 USD.

Is it worth it?

This is something I can’t answer for you – you will have to go through the scenarios in your head.  Currently I have an expiry date of January 2016 for my VCP5, and honestly I’d rather sit a new version of the VCAP than do the VCP again.  That said, can I expect a VCAP6-DCA to be available by January 2016?  I have no idea!  Do I want to risk losing my VCP because no new VCAP exam comes out, or possibly failing the VCAP when it does?  It’s all a giant kerfuffle in my head right now!  One note: the email I received said it was only available to those who need to renew their VCP before March 10, 2015.  As noted above, mine was extended to January 2016 due to the completion of my VCAP in January of this year.  That said, I went through the process of being authorized for this delta exam and had no issues getting into the portion of the Pearson Vue site which allows me to schedule it.  So, try for yourself I guess!

Time’s a wastin!

Oh yah, better hurry and make your mind up.  This delta exam will only be available until November 30th, 2014!  So you have just under a couple of months to figure out what you are going to do!  Honestly, this whole re-certification process just confuses me and puts me in a bad mood Smile  Nonetheless, I thought I’d share the news!  Oh, and I tried to use the VMUG Advantage VCP discount code – it didn’t work!