#VFD4 – A Canadian in Texas?

I know, I didn't leave much to the imagination with the blog title, and as you may have guessed I'm going to be attending Virtualization Field Day 4 in Austin, Texas this January!

I was ecstatic when I received the invitation and it didn't take much convincing to get me to go!  I've been a huge fan and supporter of the Tech Field Day format over the years, and not too many of them go by where I don't catch a few sessions on the livestream.  The fact that Austin is on average 30 degrees Celsius warmer than here in January sure does help too!

The Event

Aside from the heat I'm definitely looking forward to being a part of VFD4.  This will be the fourth installment of Virtualization Field Day and it takes place January 14th through the 16th in Austin, Texas.  The Tech Field Day events bring vendors and bloggers/thought leaders together in a presentation/discussion style room to talk about anything and everything related to given products or solutions.  I'll point you to techfieldday.com for a much better explanation of how the events are laid out.

The Delegates

This will be my first time as a delegate and I feel very humbled to have been selected.  Honestly, I get to sit alongside some of the brightest minds that I know.  Thus far Amit Panchal (@AmitPanchal76), Amy Manley (@WyrdGirl), James Green (@JDGreen), Julian Wood (@Julian_Wood), Justin Warren (@JPWarren), and Marco Broeken (@MBroeken) have all been confirmed as delegates, with more to be announced as time ticks on.  Some of these people I've met before, some I know strictly from Twitter, and others I haven't met at all, so I'm excited to catch up with old friends as well as meet some new ones.

The Sponsors

So far six sponsors have signed up for #VFD4 – Platform9, Scale Computing, SimpliVity, SolarWinds, StorMagic and VMTurbo.  Just as with the delegates, some of these companies I know a lot about, some I know a little, and others I certainly need to read up on.  Having seen many, and I mean many, vendor presentations in my lifetime, I have tremendous respect for those that sponsor and present at Tech Field Day.  The sessions tend to be very technical, very interactive, and very informative – three traits that I believe make a great presentation.  I'm really looking forward to seeing my fellow Toronto VMUG Co-Leader and friend Eric Wright (@discoposse) sitting on the other side of the table :)

Be sure to follow along via Twitter by watching the #VFD4 hashtag leading up to and during the event.  Also, a livestream will be set up so you too can watch as it all goes down.

I'm so grateful for this opportunity – thank you to my peers, the readers of this blog, Stephen Foskett and all the organizers of the great Tech Field Day events, and the virtualization community in general.  See you in Texas!

Veeam on Windows 2012 R2 – Don’t forget about large size file records

Unfortunately I had to learn this one the hard way – but in hopes that you don't have to, here's the gist of my situation.  I've had Veeam running and backing up to a Windows Server 2012 R2 box as a target for a while now!  I'm getting some awesome dedup ratios utilizing both Veeam and Windows' built-in deduplication.  That said, just last week I began to see this error occurring on one of my backup jobs.

“The requested operation could not be completed due to a file system limitation.  Failed to write data to the file [path_to_backup_file]”

After a bit of Google-Fu one can conclude from here and here that my problems were mainly due to the way my volumes were formatted – more specifically, the support for large file records.  I, like most people, went ahead and simply used the GUI to format and label all of my volumes.  The problem is that the GUI uses the default settings for the format operation, which in turn supports only small size file records.

Therefore, over time, after some data has been laid down on the disk and dedup has been doing its thing for a while, you might start to see the same error I was.  The solution – well, unfortunately it is to reformat that volume with large size file records.  The command to do so is pretty simple and is listed below.

format <DRIVE>: /FS:NTFS /L

The key here is /L, which enables support for large size file records.  Also, this process can take quite some time – from the time I ran the command to the time I got to this point in the blog post, it was still sitting at 2%.
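
As a side note, if you want to check whether an existing volume already uses large file records before reformatting, fsutil can tell you.  The line below is a minimal sketch of how I'd check a volume mounted as E: (substitute your own repository drive letter) – in the output, the "Bytes Per FileRecord Segment" value reads 1024 for the default small records and 4096 once large file records are enabled.

fsutil fsinfo ntfsinfo E: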

In my case, I removed all the backups from Veeam that lived on this disk.  I'm comfortable doing that because I also have replication running, so I wasn't worried about losing any data.  If you are though, you could always simply disable dedup and copy your backup files to another location, run the format command, and then copy them back.  Honestly, I knew in my case it would be easier and quicker to simply reseed those VMs.
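
If you do go the copy-off route, here's a rough sketch of what that could look like from an elevated PowerShell prompt – the drive letters and paths are purely placeholders for illustration, and the Disable-DedupVolume cmdlet assumes the data deduplication feature is installed on the box.

Disable-DedupVolume -Volume E:                # stop dedup from optimizing new data on the repository volume
robocopy E:\Backups D:\Staging\Backups /E     # copy the backup files off to a staging location
format E: /FS:NTFS /L /Q                      # reformat the repository volume with large file record support
robocopy D:\Staging\Backups E:\Backups /E     # copy the backup files back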

Also, that's not to say that Veeam won't work without large size file records.  I have 8 other volumes on this Veeam instance, all formatted with the same default settings, and I haven't seen any issues whatsoever with them – just this one volume is throwing the error!  For now, I plan on just leaving these other volumes the way they are.  Am I just delaying the inevitable?  Only time will tell!

Thanks for reading!

Get the cobwebs out of your Data Center with CloudPhysics

Just the other day I was thinking to myself, you know what this world needs – more Halloween-themed infographics relating to IT.  Thankfully CloudPhysics, with their analytic powers, pulled that thought out of my head and did just that!  With a vulnerability dubbed Heartbleed, how can you resist?

On a serious note, some of the data that CloudPhysics has should really scare you.  22% of vSphere 5.5 servers remain vulnerable to Heartbleed!  41% of clusters do not have admission control enabled!  These are definitely some spooky stats and shouldn't be ignored!

[Infographic: CloudPhysics Halloween 2014]

But what's Halloween without some tricks and treats, right?

CloudPhysics has you covered there as well!  Be sure to grab their Halloween Cookbook – a collection of tips and tricks on how you can remediate issues within your data center and stay out of the spooky stats that they are collecting.  For an even bigger treat, be sure to sign up to allow CloudPhysics to help find the data center goblins for you – oh, for free!  Better yet, sign yourself up for the community edition of CloudPhysics – it's completely free and can give you some great insight into what's going on inside your virtual data center!  Be sure to head to their official blog post to get all the information you need to fill up your bag!

Google Admin SDK APIs and .net integration – Changing a user's password

I know, weird right?  Coding, Google APIs, WTF!?!?  This is a virtualization blog.  Truth is I was a developer before ever moving into the infrastructure space – not much of one, but I was one :)  Honestly, this is probably the reason why products like vRealize Orchestrator (it still feels weird calling vCO that) that mix development and infrastructure together appeal to me so much!  More truth – as much as I try I can't quite get away from development – it's a part of what I do (I guess).

Anyways, cut to the chase – I've recently been working on a project revolving around integrating Google Apps with a homegrown .net application I've written.  Now, there is such a thing as the Provisioning API, which is super easy to use and worked great – but it is deprecated and Google could simply stop providing it whenever they want.  Google suggests people move to the Admin SDK – which, of course, is much harder!  Either way, I needed to provide a way for users to reset other users' Google Apps passwords from within a .net application, without those users having to provide access or permission to their accounts.  My Google-Fu was strong on this one, and by the time I finally got it to work I had about 27 tabs open, so I thought it might be nice for the next person to be able to stumble upon one page containing all the info they need to make this happen – hence this post!

To the Developers Console

The first step to getting everything to mesh with the Admin SDK involves setting up a project in the Google Developers Console.  Simply log in, select 'New Project', and give your project a name and an ID.  Easy enough thus far.  Once the project has been created we need to enable the Google Admin SDK.  Select APIs from the APIs & auth section in the navigation menu on the left.  Here, we can simply filter our list of available APIs by entering 'Admin SDK' in the browse box and then enable it by switching its status to On.

[Screenshot: enabling the Admin SDK]

From here we need to create a client ID, which gives us the p12 key and credentials we need in order to delegate access.  I've decided to go the 'Service Account' route, as I would like an account that I can delegate domain-wide admin access to in order to change passwords, create users, etc., without authorization or permission from the users themselves.  To do this, click the 'Create new Client ID' button inside the Credentials section of APIs & auth.  When presented with the 'Application Type' dialog, select Service account and click 'Create Client ID'.

[Screenshot: creating the service account client ID]

Once the process has completed, pay attention to the .p12 key that is automatically downloaded.  We will need this file later when connecting our .net application, so go store it somewhere safe.  Also note the private key's password in the dialog, as we will need that information as well.

[Screenshot: private key download and password]

At this point you should see your newly created service account, its Client ID, Email Address, and Public Key Fingerprints.  The only piece of information that we need off of this screen is the Client ID – usually a big long string ending in googleusercontent.com.  Go ahead and copy that to your clipboard as we will need it for the next step.

To the Admin Console

From here we need to go to our Google Apps admin console (admin.google.com/your_domain_name) and grant this newly created service account access to specific APIs.  Once logged into the admin console, launch the Security app (sometimes located under the 'More Controls' link near the bottom).

[Screenshot: Security app in the admin console]

Inside of Security we need to go into the Advanced Settings (located by clicking Show more) and then the “Manage API client access” section.

[Screenshot: Advanced settings]

Now we need to grant our service account access to specific APIs within Google by specifying the individual scope URLs of those APIs.  First, paste the Client ID that we created and copied into the Client Name box.  Secondly, copy in all of the API scope URLs (comma-separated) that you want to grant access to.  In my case I'm only looking for User and Group Management, so I entered https://www.googleapis.com/auth/admin.directory.group, https://www.googleapis.com/auth/admin.directory.user into the API Scope input box.  If you need help figuring out the URL for the specific API you are looking for, you can find them listed in the Authorize Request section of the developer guide for each Google API.  Once you are ready to go as shown below, click Authorize.

[Screenshot: authorizing API access]

And now the code

Thus far we've done all the heavy lifting as it pertains to opening up access for .net to utilize our Admin SDK APIs – time to get into a little code!  Again, I don't consider myself a hardcore developer, as you can probably tell from my code.  There may be better ways to do this, but this is the way I found that worked – and again, there's not a lot of information out there on how to do this.

First up, there are some project references that you need.  Honestly, you can get the .net client libraries from Google, but the easiest way to bring packages in is by using NuGet as it will pull dependencies down for you.  Go ahead and import the Google APIs Client Library, Google APIs Auth Client Library, Google APIs Core Client Library, and Google.Apis.Admin.Directory.directory_v1 Client Library.  That should be all you need to do the password resets.
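
For reference, here is roughly what pulling those packages in looks like from the NuGet Package Manager Console – I believe these are the package IDs that correspond to the libraries above, but double-check the names in the NuGet gallery before running the commands.

Install-Package Google.Apis
Install-Package Google.Apis.Auth
Install-Package Google.Apis.Core
Install-Package Google.Apis.Admin.Directory.directory_v1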

So the complete code is listed at the bottom, but for learning's sake I'll break down what I've done in the sections that follow.

String serviceAccountEmail = "350639441533-ss6ssn8r13dg4ahkt20asdfasdf1k0084@developer.gserviceaccount.com";
var certificate = new X509Certificate2(@"c:\p12key\NewProject-3851a658ac16.p12", "notasecret", X509KeyStorageFlags.Exportable);

Here I'm simply declaring some variables: first the serviceAccountEmail – this is the email address (not the Client ID) of the service account we set up – and secondly our certificate, which is created by pointing the X509Certificate2 constructor at the p12 key we downloaded earlier along with the private key password that was displayed (told you we'd need those).

ServiceAccountCredential credential = new ServiceAccountCredential(
    new ServiceAccountCredential.Initializer(serviceAccountEmail)
    {
        User = "myadminaccount@mwpreston.net",
        Scopes = new[] { DirectoryService.Scope.AdminDirectoryUser }
    }.FromCertificate(certificate));

This essentially builds the credential object we need in order to instantiate the service we want to use.  Take special note that I have to pass a User parameter – this is the user that we want our service account to impersonate (they will need to have the correct roles/permissions in Google to perform any of the tasks we attempt).  Also note the Scopes array – this specifies which exact API scopes we want to authenticate to – these will normally match the end of the API scope URL, just without the dots.  That said, we have auto-complete in Visual Studio, right – use it :)

var dirservice = new DirectoryService(new BaseClientService.Initializer()
{
    HttpClientInitializer = credential,
    ApplicationName = "MyNewProject",
});

Each and every Google API you call from .net will need its own service object.  This is where we instantiate a new DirectoryService (to access users).  If you were using the Tasks service this would be a new TaskService.  No matter what, we always pass a new BaseClientService.Initializer into the constructor.  Note we also pass in the credential object we created, as well as the name of the project we created in the Google Developers Console.

User user = dirservice.Users.Get("testuser@mwpreston.net").Execute();
Console.WriteLine(" email: " + user.PrimaryEmail);
Console.WriteLine(" last login: " + user.LastLoginTime);
user.Password = "MyNewPassword";
dirservice.Users.Update(user, "testuser@mwpreston.net").Execute();

And now the magic happens.  Honestly this will be different depending on what API you are using but again we have built-in help and auto-complete in Visual Studio so you should be able to figure out how to do anything you need to.  Above I simply get a user, display a few of their parameters, change their password and then update the user.  Easy enough!

So here's the code in its entirety.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Security.Cryptography.X509Certificates;

using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;
using Google.Apis.Admin.Directory.directory_v1;
using Google.Apis.Admin.Directory.directory_v1.Data;
using Google.Apis.Admin.Directory;

namespace ConsoleApplication3
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Connect to Google API");
            Console.WriteLine("=====================");

            // Service account email and p12 key generated in the Google Developers Console
            String serviceAccountEmail = "350639441533-ss6ssn8r13dg4ahkt20ubasdf2424@developer.gserviceaccount.com";
            var certificate = new X509Certificate2(@"c:\p12key\NewProject-3851a658ac16.p12", "notasecret", X509KeyStorageFlags.Exportable);

            // Build the credential, impersonating an admin account and requesting the directory user scope
            ServiceAccountCredential credential = new ServiceAccountCredential(
                new ServiceAccountCredential.Initializer(serviceAccountEmail)
                {
                    User = "MyAdminAccount@mwpreston.net",
                    Scopes = new[] { DirectoryService.Scope.AdminDirectoryUser }
                }.FromCertificate(certificate));

            // Instantiate the Directory service using that credential
            var dirservice = new DirectoryService(new BaseClientService.Initializer()
            {
                HttpClientInitializer = credential,
                ApplicationName = "MyNewProject",
            });

            // Get the user, display a couple of properties, set a new password and push the update
            User user = dirservice.Users.Get("testuser@mwpreston.net").Execute();
            Console.WriteLine(" email: " + user.PrimaryEmail);
            Console.WriteLine(" last login: " + user.LastLoginTime);
            user.Password = "MyNewPassword";
            dirservice.Users.Update(user, "testuser@mwpreston.net").Execute();

            Console.WriteLine("Press any key to continue...");
            Console.ReadKey();
        }
    }
}

I know this post may be a little off topic for the rest of this blog, but the fact of the matter is I couldn't find a lot of information out there on how to accomplish changing Google Apps user passwords from .net.  And when I can't find information, I post it!  Hopefully someone will stumble across this and find it useful!  Thanks for reading!

Veeam Cloud Connect – a customer perspective!

During the partner keynote at VeeamON it was announced that some of Veeam's partners will be ready to go in terms of supporting the new Veeam Cloud Connect hosting functionality that is set to be released inside of Veeam Backup and Replication v8.  At its core, Veeam Cloud Connect and the Veeam Service Provider certification allow partners to become service providers, letting their customers get their backups offsite and providing a full backup hosting environment.

[Image: Veeam Cloud Connect]

So, what does this mean for a customer?  Many small to medium size businesses use Veeam to take care of their day to day backups and, as an added bonus, replicate and test their backups to ensure consistency.  That said, the sheer cost of acquiring space, compute, and storage at an offsite location tends to be a hurdle that is hard to overcome.  But it's not just that – even if budget is not a problem, the complexity of providing and setting up connectivity, networking, and routing can be a challenge for those without a large IT staff.  Backing up to a VSP-certified Veeam Cloud Connect partner eliminates those challenges, allowing customers to simply rent space from their preferred partner.

From within Veeam Backup and Replication, backing up to a VSP through Cloud Connect is essentially no different than backing up locally.  When we set up our backup repositories today we select either a Windows or Linux machine with some available storage, and it then becomes available to be used within our backup jobs.  Veeam Cloud Connect is much the same – a Veeam-powered service provider will simply show up as a backup repository within your Veeam Backup and Replication console and can be used within your jobs just like any other repository.  Easy!

But it’s more than that!

Sure, Veeam Cloud Connect will allow us to easily get our backups offsite without incurring most of the major capital costs that come along with that, but it offers much more – more that centers around trust.  Most everyone is aware of giant cloud providers such as Amazon, Microsoft, and VMware – and the services they provide are amazing, bar none – but do we really trust them with our data?  Granting partners the ability to become a Veeam hosting provider brings the trust issue closer to home for the customer!  Partners are in the trenches every day, building relationships with customers and ensuring that these are life-long commitments.  In my opinion, people tend to trust someone they have worked with often, someone they have met many times over the years, and someone who is local – essentially their preferred IT partners and VARs.  By allowing their partners to become hosting providers, Veeam has essentially leveraged the trust that partners have worked to earn with their customers, and in turn allowed the partners to provide a well-rounded, complete solution to those customers.  It's really a win-win all around.

I think about markets like education, healthcare, government – these verticals house very important and sensitive data, data that needs to be protected both onsite and off.  With that said, strict compliance guidelines usually dictate exactly how and where this data may sit.  Honestly, if that place is outside of the US the major players just aren’t there.  I can see how Veeam Cloud Connect can solve this issue.  By utilizing a preferred partner, educational institutions and districts could essentially leverage a partner who is “close to home” to provide them with that offsite repository for their data.  Someone they already know and trust.

Furthermore, why settle for just one?  There are many school districts in Ontario who may or may not leverage Veeam as their backup vendor of choice – but for those that do, I can see tremendous opportunity for partners to design and implement data centers built directly around the compliance and regulations that the districts face.  Again, a win-win – the partner gets many similar customers and the customers get an easy and secure solution, without the need to purchase additional hardware or licensing.

In essence, Veeam Cloud Connect is more than just a backup repository for a customer.  It's an opportunity.  An opportunity for partners to leverage the trust they have built, an opportunity for customers, especially those in similar verticals, to unite and look for a 'Groupon'-type deal for offsite backups.  And an opportunity for everyone to ensure that in the event of a disaster, they are not left without the critical data that their business needs to survive!

Veeam Endpoint Backup – Bringing Veeaminess to your physical endpoints

Today (Wednesday) at VeeamON brought with it some of the biggest news to come out of the conference.  Veeam has officially announced the arrival of Veeam Endpoint Backup, a solution enabling IT shops to back up their physical Windows desktops and laptops in a similar style to how we currently back up our virtual infrastructure with Veeam Backup and Replication.  Oh, one more important key to this announcement is that Veeam Endpoint Backup is absolutely free – I thought that warranted a mention in the first paragraph!

But you can’t hot-add my laptop?

Ok, let's take a look at what we know thus far about how this works.  Basically we are looking at a package that is installed directly on our endpoint.  From there, we can choose to back up our entire system, individual volumes, or specify individual files to be part of our backup job.  Easy enough thus far, right?  As for targets, or where we are going to back up to, we have a few options – we can back up to a drive attached to the source machine, such as a USB or SATA disk, we can point our job at a NAS device or at a file server via a CIFS share, or, perhaps most appealing to me, we can back up to a Veeam Backup and Replication repository.  From there it's just a matter of setting up your retention settings and you are set to go with physical protection provided by Veeam.

The restore process is much the same as the backup, meaning we can perform file or volume level restores, as well as complete bare-metal restores.  Veeam has also provided the option to create a recovery type USB key that can be used to boot the endpoint in the event that you aren’t able to get to Windows in order to do the restore.

So what's an endpoint?

Veeam has stated that this is not an enterprise product and that its main purpose is to provide protection for your Windows desktops and laptops.  That said, the list of supported OSes includes Windows 7 and up, as well as Windows Server 2008 and up – so essentially we could have a Veeam solution for our physical infrastructure.  One can see how this would be very appealing to IT shops that have only a handful of physical servers left and might currently be using another solution alongside Veeam to back these up.

Oh, and it can all be yours for FREE

Veeam has a long history of releasing free products and tools to the virtualization community.  We've seen it happen with products like Backup and Replication and Veeam ONE.  We've also seen separate standalone tools come out for free, such as the Veeam Explorers and the highly popular FastSCP.  Following that same model comes Veeam Endpoint Backup – offered free to everyone and supported with a "best-effort" type model.  Veeam states they simply want to get this product into the hands of IT professionals in order to get feedback and look at future expansion.

What the future holds

Veeam touting that they want feedback for future expansion is the key here – I would love to see some functionality like this built into Veeam Backup and Replication, allowing us to remotely install some sort of Veeam agent and set up backup jobs directly from within the centralized console.  I'm not saying this is going to happen, but it does seem like a logical step for Veeam to take with the product – and maybe that's the plan, seeing as this is a separate product targeted at client machines, leaving the door open to provide physical server protection from elsewhere, say Veeam Backup and Replication.  This would allow us to use our enterprise-type features such as application-aware processing, as well as things like SureBackup and virtual labs, on our physical infrastructure.  Or even open up doors for having a physical server replicated to a virtual machine.  Of course this is all just me sitting on the Cosmopolitan balcony speculating while recovering from the VeeamON party last night – and could turn out to simply be the Advil talking :)   Even if they do hold steady with just endpoint protection, I'm excited to see where Veeam will take it.  Veeam is a company that is constantly releasing very innovative features into their products, so you never know what you might find inside a 3.x or 4.x version of Veeam Endpoint Backup.

But, back to reality – Veeam Endpoint Backup is here now, it's cool, it's free, and it's going into a public beta come November of this year.  How do you sign up?  Follow the white (green) rabbit to this page, simply provide your email, and you can be the first to know when Veeam Endpoint Backup Free hits the internets!  For now we wait, keep calm, and VeeamON!