Syncing NSX Managers across vCenters

TL;DR

This script syncs the distributed firewall configuration from one NSX Manager to another, which is useful when you have a DR site that you need to keep in sync with production. Currently the script only syncs ipsets, security groups, services, service groups, and the Layer3 portion of the firewall. It would be easy enough to add support for tags, MAC pools, and the Layer2 portion of the firewall, but that wasn't a requirement for this project.

Download the script here.

Note: This script was written for NSX version 6.1.3. While I haven't seen anything in the release notes for newer versions that would indicate a compatibility issue, I haven't tested them either. I will update this post with any new issues I find after I upgrade NSX.

Background

NSX is a network virtualization platform produced by VMware. Among its many features is a distributed firewall that can be used to segment traffic based on a variety of factors. My company is using NSX for a new project where we need to micro-segment traffic in a VDI deployment to separate 3rd-party users from internal users. One major issue we ran into was the lack of a native way to synchronize the configuration on the NSX Manager down to our DR site. Currently VMware offers no way to export the configuration of your NSX deployment and import it into another NSX Manager/vCenter. What VMware does give you is a set of REST APIs, and since I am very comfortable using Powershell that was the route I decided to take.

The Process

Going through the NSX API guide, I had hoped to simply pull the global firewall configuration out of the source NSX Manager and upload it onto the destination. After banging my head against the wall with some not-very-descriptive error responses and a couple of phone calls with VMware, I found out that each object in the NSX configuration has its own unique ObjectId (ipset-27, servicegroup-32, etc.), which the manager uses to reference objects internally. In order to sync the firewall I also needed to sync each of the objects referenced in the firewall config and ensure that I reference the correct ObjectId on the destination.

One thing I quickly found is that the NSX API guide has a fair number of typos, inconsistencies, and errors. My two best pieces of advice (besides paying for SDK support with VMware) are to search for similar scripts online that you can use as a reference, and to try your URIs both with and without a trailing slash. For some reason the API is very inconsistent in its use of trailing slashes, so if a URI gives an error, try removing or adding one. I would also recommend the Advanced REST Client for Chrome, which was indispensable while troubleshooting.

I began the process of writing a script that queries the source NSX Manager for each of the object types and then syncs those objects to the destination. After all of the objects are synced I can copy the firewall config. I originally couldn't find a way to get the raw XML out of the native Powershell XML object in order to send it as the body of my POST and PUT calls, so I was taking the XML returned by the GET call as a string and doing some regex to pull out the pieces I wanted. However, this led to a mess of comparisons and hash tables to track and match everything. I went back to the drawing board and found that I had missed the .OuterXml property, which returns the raw XML from the PSObject. This solved almost all of my issues and quickly got me on the right track.

The Solution

In its final form the script moves through each of the object types one at a time. The script queries each NSX Manager and then uses the output of the source as the base of the new destination configuration. The script then loops through each object in the source and looks for a match in the destination based on the 'Name' attribute. If it doesn't find a match it creates a new object. If it does find a match it replaces the ObjectId from the source with the ObjectId of the destination. All extraneous objects on the destination are deleted. This method works quite well and turned out to be modular, which is nice because it means there are only a few functions that I use over and over again for each of the object types. I also think it will scale nicely as we begin to use more of the features of NSX over time.

Note: This script is read-only on the source. Changes are only ever made to the destination NSX Manager; it's basically a master-slave setup.

FUNCTIONS

I’ll talk briefly about each of the functions in the script.

First off we need to tell Powershell to ignore cert errors since NSX Managers use self-signed certs.
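The version I used is the common certificate-policy override; here is a sketch of that approach (assuming Windows PowerShell, where ICertificatePolicy is still honored):

```powershell
# Tell .NET to accept any certificate for this session. NSX Managers ship
# with self-signed certs, so normal validation would fail every call.
Add-Type @"
    using System.Net;
    using System.Security.Cryptography.X509Certificates;
    public class TrustAllCertsPolicy : ICertificatePolicy {
        public bool CheckValidationResult(
            ServicePoint srvPoint, X509Certificate certificate,
            WebRequest request, int certificateProblem) {
            return true;
        }
    }
"@
[System.Net.ServicePointManager]::CertificatePolicy = New-Object TrustAllCertsPolicy
```

Note that CertificatePolicy is a deprecated .NET mechanism; on newer PowerShell versions you would pass -SkipCertificateCheck to Invoke-WebRequest instead.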

This is a simple wrapper for GET calls to query the NSX managers. I return these to a variable cast as XML.
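A minimal sketch of such a wrapper (the function and parameter names here are my own placeholders, not necessarily what the script uses; NSX authenticates with HTTP Basic auth):

```powershell
function Get-NsxObject {
    param(
        [string]$Server,   # NSX Manager FQDN or IP
        [string]$Uri,      # e.g. "/api/2.0/services/ipset/scope/globalroot-0"
        [System.Management.Automation.PSCredential]$Credential
    )
    # Build the Basic auth header from the supplied credential
    $pair = "$($Credential.UserName):$($Credential.GetNetworkCredential().Password)"
    $auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes($pair))
    $resp = Invoke-WebRequest -Uri "https://$Server$Uri" -Method Get `
                -Headers @{ Authorization = "Basic $auth" }
    # Cast the response body to XML so we can walk it as a PSObject
    return [xml]$resp.Content
}
```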

I use the Compare-Object cmdlet to diff the two XML objects based on the ‘Name’ attribute. Then I loop through those differences and create or delete objects on the destination as needed. When new objects are created they are created as empty objects with only the name configured. This will generate a new ObjectId that we can use later to get everything synced up.
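The create/delete pass might look something like this sketch, using ipsets as the example object type (New-NsxObject and Remove-NsxObject are hypothetical helpers standing in for the script's POST and DELETE wrappers):

```powershell
# $srcXml and $dstXml are [xml] objects returned by the GET wrapper; the
# element names under 'list' vary per object type.
$diff = Compare-Object -ReferenceObject $srcXml.list.ipset `
                       -DifferenceObject $dstXml.list.ipset -Property name

foreach ($d in $diff) {
    if ($d.SideIndicator -eq '<=') {
        # Only on the source: create an empty placeholder on the destination
        # so it gets assigned a new ObjectId we can match on later
        New-NsxObject -Name $d.name       # hypothetical helper
    } else {
        # Only on the destination: extraneous object, delete it
        Remove-NsxObject -Name $d.name    # hypothetical helper
    }
}
```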

We have to query both NSX Managers again to get the new configurations now that all of the adds and deletes have been done. Now we can replace all of the ObjectIds on the source objects.
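The ObjectId swap can be done directly on the XML; a sketch, again using ipsets and assuming $srcXml and $dstXml are the freshly queried [xml] objects:

```powershell
# For every source object, find the destination object with the same name
# and stamp the destination's objectId into the source XML before pushing.
foreach ($srcObj in $srcXml.list.ipset) {
    $match = $dstXml.list.ipset | Where-Object { $_.name -eq $srcObj.name }
    if ($match) { $srcObj.objectId = $match.objectId }
}
# .OuterXml gives us the raw XML string to use as the PUT body
$body = $srcXml.OuterXml
```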

Now that we have all of the objects we want to sync with ObjectIds known to the destination NSX Manager we can push everything. If you are watching the console as this runs it will spit out some useful diagnostic info in the event the server returns an error code. It will also write all the changes to the log file along with the URI and return code.
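The error handling amounts to catching the web exception and reading the response stream, since NSX puts the useful detail in the response body rather than the status line; a sketch (variable names are placeholders):

```powershell
try {
    $resp = Invoke-WebRequest -Uri $uri -Method Put -Body $body `
                -ContentType 'application/xml' -Headers $authHeader
    # Log the URI and return code for the change we just made
    Add-Content $logFile "$uri : $($resp.StatusCode)"
}
catch {
    # The server's explanation lives in the response body of the exception
    $reader = New-Object IO.StreamReader($_.Exception.Response.GetResponseStream())
    Write-Host "Error on $uri :" $reader.ReadToEnd()
}
```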

Service group membership requires its own special function because it has to be done in a second pass with some logic that doesn’t fit into the other functions. It is essentially a microcosm of the rest of the script.

Finally we get back to the firewall rules. Originally I was going to sync each individual firewall section, but unfortunately that had a tendency to write the rules out of order, and NSX has no method to reorder the rules of the distributed firewall. Instead, this method gets everything synced up in terms of ObjectIds and then pushes the entire global firewall configuration onto the destination. A few things to note here. The API guide states that a new rule can have an Id of 0; I haven't found this to be the case and suggest removing the Id attribute altogether when creating new rules. Second, I am only worried about the Layer3 rules, so the Layer2 section is handled just enough to sync the ObjectIds so that the destination will accept the configuration.
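Dropping the Id attribute on new rules can be done on the XML DOM before the push; a sketch, assuming $fwXml is the firewall config as an [xml] object and $newRuleIds holds the ids of rules that have no match on the destination:

```powershell
# Remove the 'id' attribute from rules the destination has never seen,
# since (contrary to the API guide) pushing a new rule with id="0" fails.
foreach ($rule in $fwXml.SelectNodes('//rule[@id]')) {
    if ($newRuleIds -contains $rule.id) {
        $rule.RemoveAttribute('id')
    }
}
```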

And that's it. The script takes quite a while to run, around 10 minutes in my experience. Most of that is due to the large number of services; almost 490 come standard. When it's done, the log file should give you a nice snapshot of all of the changes made. If you want to see what is happening on the NSX Managers themselves, connect using SSH and run

show manager log follow

Unfortunately, it is both quite verbose and ambiguous in its error messages, not a great combination.

I will say this has been a great experience for me and has definitely improved my Powershell and REST skills quite a bit, especially in the XML-manipulation department. I'd like to give props to my co-worker Martin Pugh, who was an indispensable source of Powershell expertise, as well as the folks over at VMware who were quite helpful in decoding some of the stranger errors I encountered.

Cheers.

Creating folders in Office 365 mailboxes using EWS

I ran into an issue this week where a client needed to programmatically create folders in shared mailboxes.  The client is a contracting firm and needs a way to collect all of the messages related to a project into a single place so that they can be easily referenced and later archived with all the other data related to the project.  Up until this point we had been creating a shared mailbox for each project with an associated transport rule that would BCC all mail that contained the project number (in a variety of formats) in the subject line.

This worked great up until the firm started growing and we began to bump into Office 365's limit of 100 transport rules. After some discussion we decided to move to a smaller number of shared mailboxes, one assigned to each group within the company, with the individual projects as folders within those mailboxes. This way each transport rule can contain all of the strings for every project in a mailbox, and we can use inbox rules to sort the mail once it is delivered. This solution was great until I started thinking about having to manage all of those folders and rules by hand; no way was I going to do that. Powershell to the rescue!

Well, I quickly found out that even though Office 365 has a cmdlet to create a mailbox folder (New-MailboxFolder), it does not and will not work on any mailbox besides the one you are logged in as, even if you have FullAccess permissions. After some digging I found a way to do just that with EWS. After a lot of googling, a lot of furious typing, and 3 Jaco albums I finally got everything working. I did get lucky in that, for whatever reason, Office 365 does allow you to use the New-InboxRule cmdlet on mailboxes other than your own.
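The core of the EWS approach looks something like this sketch (the mailbox addresses and folder name are placeholders, and it assumes the account has FullAccess to the shared mailbox):

```powershell
# Load the EWS Managed API 2.2 assembly from its default install path
Add-Type -Path 'C:\Program Files\Microsoft\Exchange\Web Services\2.2\Microsoft.Exchange.WebServices.dll'

# Connect to Office 365 as the admin/service account
$svc = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService([Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2013_SP1)
$svc.Credentials = New-Object Microsoft.Exchange.WebServices.Data.WebCredentials('admin@contoso.com', 'password')
$svc.Url = New-Object Uri('https://outlook.office365.com/EWS/Exchange.asmx')

# Target the Inbox of the *shared* mailbox, not our own
$mbx  = New-Object Microsoft.Exchange.WebServices.Data.Mailbox('shared@contoso.com')
$root = New-Object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox, $mbx)

# Create the project folder under that Inbox
$folder = New-Object Microsoft.Exchange.WebServices.Data.Folder($svc)
$folder.DisplayName = 'Project-12345'
$folder.Save($root)
```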

The following script was meant to be used by an authorized user at my client, so it is pretty basic. The script asks you a series of questions and then generates a new shared mailbox, assigns FullAccess permissions to the correct users, creates transport rules, creates folders in the new mailbox, and creates inbox rules. When it's all done it sends you an email telling you whether it succeeded. There are also some strict assumptions about the format of the project name; the client uses a standard format so I haven't had to make this more portable, but you should be able to customize it pretty easily using regular expressions and string manipulation.

There are a few prerequisites to running these scripts: Powershell v3 or better and the EWS Managed API 2.2. The script assumes the EWS DLLs are installed to C:\Program Files\Microsoft\Exchange\Web Services\2.2, which is the default.

I also wrote a second script that can be used to create new folders in existing mailboxes.

OK, let’s try this again

Every two years or so I tell myself I am going to start a blog. Well, this time I mean it (I hope). I'll be writing mostly about things I want to keep track of, with a focus on virtualization, Powershell, and Exchange. I'm also using this as a way to improve my Linux skills, so I will probably be dropping some rudimentary Linux posts here and there. I'll try to be good and post regularly (weekly).

Building a WordPress site in AWS

Here are the steps I used to get this site up and running.

Besides having a blog, I wanted to use this site as a chance to get familiar with a lot of technologies that I haven't been exposed to in my current position. To that end I am hosting the site on an Ubuntu instance in the Amazon Web Services free tier. For the majority of the setup I used some great guides over at Digital Ocean; I will just reference those and then talk about the specific issues I ran into, since I don't see any need to redo all the great work they have already done.

Setting up the host and getting connected

  • I signed up for the AWS free-tier and launched a new instance of Ubuntu 14.04.
  • During the process you will have to download a new key pair (.pem). I usually save this to some cloud storage like DropBox so I can always have access if I need it.
  • Open up PuTTYgen (C:\Program Files (x86)\PuTTY\puttygen.exe) and click File > Load Private Key and then File > Save Private Key.
  • Save the .ppk file to the same folder.
  • Check back on your EC2 dashboard and note the Public IP and Security Group name.
  • While you are in your dashboard click Security Groups on the left hand menu.
  • Click on the Security Group you noted previously.
  • Click the inbound tab and then click Edit.
  • Add rules for HTTP and HTTPS and leave the source as Anywhere.
  • Open up PuTTY (C:\Program Files (x86)\PuTTY\putty.exe) and enter the Public IP you noted previously in the Host Name field.
  • In the left-hand menu click on Data and enter ubuntu for the Auto-login username.
  • In the left-hand menu expand SSH and click on Auth.
  • Load the .ppk you just created under Authentication Parameters.
  • In the left-hand menu click on Session and enter a name under Saved Sessions and click save.
  • Click Open and you should be connected to your Ubuntu instance.

Updating the host and installing

As I said above, I mostly just stuck to the Digital Ocean guides since they were written for the same distro I am using (Ubuntu 14.04) and they covered all the basics I needed at this point. The ones I used were How To Install Linux, Apache, MySQL, PHP (LAMP) Stack on Ubuntu, How To Install WordPress on Ubuntu 14.04, and How To Configure Secure Updates and Installations in WordPress on Ubuntu, in that order. Following along with all the steps worked out pretty much perfectly for me. The first two guides are actually enough to get you up and running, but I found out pretty quickly that you need to set up secure updates in order to install new themes and plugins. I probably could have gotten away with FTP, but I am trying to keep this site as secure as possible since this is all a learning exercise for me.

I did run into a few issues that I will go into further. First of all, the guides ask you to run apt-get a lot, so I went through and compiled all of the package-install requests into a one-liner.

sudo apt-get install apache2 mysql-server libapache2-mod-auth-mysql php5-mysql php5 libapache2-mod-php5 php5-mcrypt php5-gd libssh2-php php5-dev libssh2-1-dev

Also, if you are tempted to change the usernames along the way, be my guest, but make sure to keep track of all the changes you make. These guides are cumulative, and it came back to bite me a few times, especially when I was setting up the new users in the secure-updates guide.

Once I had the website up and running and had chosen my theme, I installed the Akismet plugin to control spam and the SyntaxHighlighter Evolved plugin to create code blocks, since I plan on posting a fair number of Powershell scripts to this page. I also recommend going through the site settings and configuring things to your liking.

The next problem I ran into had to do with permalinks: whenever I tried to use a permalink I was getting a 404 error. After some troubleshooting it turned out I had to enable mod_rewrite in Apache. This site was where I ended up, and after enabling the rewrite module and restarting Apache I was all set.

sudo a2enmod rewrite
sudo service apache2 restart

Finally, I ran into an issue where I couldn't upload pictures to the site. It turns out the Digital Ocean secure-updates guide has you give ownership of everything to the wp-user account, which breaks content uploading, so you have to go back and give www-data ownership of the /var/www/html/wp-content/uploads/ directory.

sudo chown -R www-data:www-data /var/www/html/wp-content/uploads/

Still on my to-do list for this site are configuring SSL (and forcing all admin logons to use it) and setting up a script to back up the site to my DropBox.

I hope this was a help to you; as I get more comfortable using Linux I know that all-in-one guides such as these can be a big help. As always, if you run into any issues Google is your friend. It was certainly mine.