Month: November 2014

#VFD4 – A Canadian in Texas?

I know, I didn’t leave much to the imagination with the blog title, and as you may have guessed I’m going to be attending Virtualization Field Day 4 in Austin, Texas this January! I was ecstatic when I received the invitation and it didn’t take much convincing to get me to go! I’ve been a huge fan and supporter of the Tech Field Day format over the years, and not too many of them go by where I don’t catch a few sessions on the livestream. The fact that Austin is on average 30 degrees Celsius warmer than here in January sure does help too!

The event

Aside from the heat I’m definitely looking forward to being a part of VFD4. This will be the fourth installment of Virtualization Field Day and it takes place January 14th through the 16th in Austin, Texas. The Tech Field Day events bring vendors and bloggers/thought leaders together in a presentation/discussion style room to talk about anything and everything related to given products or solutions. I’ll point you to techfieldday.com for a much better explanation of how the events are laid out.

The Delegates

This will be my first time as a delegate and I’m feeling very humbled to have been selected. Honestly, I get to sit alongside some of the brightest minds that I know. Thus far Amit Panchal (@AmitPanchal76), Amy Manley (@WyrdGirl), James Green...

Read More

Veeam on Windows 2012 R2 – Don’t forget about large size file records

Unfortunately I had to learn this one the hard way, but in hopes that you don’t have to, here’s the gist of my situation. I’ve had Veeam running and backing up to a Windows Server 2012 R2 box as a target for a while now, and I’m getting some awesome dedup ratios utilizing both Veeam and Windows’ built-in deduplication. That said, just last week I began to see this error occurring on one of my backup jobs: “The requested operation could not be completed due to a file system limitation. Failed to write data to the file [path_to_backup_file]” After a bit of Google Fu one can conclude from here and here that my problems were mainly due to the way my volumes were formatted, more specifically the lack of support for large size file records. I, like most people, went ahead and simply used the GUI to format and label all of my volumes. The problem is that the GUI uses the default settings for the format operation, which in turn support only small size file records. Therefore, after some time, after some data is laid down to the disk and dedup has been doing its thing for a while, you might start to see the same error I did. The solution, unfortunately, is to reformat that volume with large size file records. The command to...

Read More
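
Since the excerpt above cuts off before the command itself, here is a rough sketch of the usual way to check and fix this on Windows Server 2012 R2; it is not necessarily the exact command from the full post. The drive letter E: is just a placeholder, and reformatting wipes the volume, so any existing backup files need to be moved off first.

    # Check whether the volume already uses large file record segments
    # (fsutil reports bytes per FileRecord segment: 1024 by default, 4096 with large FRS)
    fsutil fsinfo ntfsinfo E:

    # Reformat with large size file records from an elevated prompt
    format E: /FS:NTFS /L /Q

    # Or the PowerShell equivalent
    Format-Volume -DriveLetter E -FileSystem NTFS -UseLargeFRS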