This past week I’ve been working on a Python script to gather the used, free, and total disk space from a bunch of Windows servers. I’ve had to do this manually many times over the years for various planning tasks. This most recent time, a client of ours had eaten up their SAN storage in less than a year, so I wanted to see which servers were wasting a lot of it. To figure this out, I planned to look at servers with large volumes that don’t hold much data.

I started writing a script that uses WMI to connect to the servers and collect the information. Then I thought it would be cool to have it saved to a Google spreadsheet. After figuring out how to do that, I wanted a way to run this on a regular basis, which requires storing a Windows login and a Google login. Obviously, you don’t want those stored in plain text or in any manner that lets someone get your passwords. I started Googling and found a post on PyCrypto. I’m sure there are better ways to do this, but here is what I came up with.

When you run the script with the configuration option, it asks for your login information for both the servers and Google. It then puts this into XML using ElementTree. Next, I serialize the XML with ElementTree’s tostring function and encrypt the resulting string with PyCrypto. Lastly, I use pickle to dump the encrypted data and the IV (initialization vector) used to encrypt it to a file.

Here’s what the code looks like.


Once the data is saved, you then have to be able to get it back out of the file. Here is the code to do that.


Let me know what you think or if there is a better way. I’m sure there is.

A client of mine is looking to give their office and plant in Shanghai access to their ERP application. They are going to do this via Citrix, so I wanted to see what the latency was like. To do this, I just set up a batch file that pings both locations and outputs the results to a text file, and I scheduled it to run every 4 hours. Here’s the batch file.

After it ran for about a week, I wrote a small Python script to grab all the ping times out of the text files and give me the maximum, minimum, and average response times. You are prompted for the location of the text files and the pattern the filenames begin with, so you can get the results for each site.
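The core of that parsing can be done with one regular expression over the log text. Here's a small sketch (ping_stats is my own name; the real script also walks the directory and prompts for the filename pattern):

```python
import re


def ping_stats(text):
    # pull every "time=123ms" (or "time<1ms") value out of Windows ping output
    times = [int(t) for t in re.findall(r'time[=<](\d+)ms', text)]
    return max(times), min(times), sum(times) / float(len(times))
```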

Here’s the output for one location:

I know this is nothing special, but I figured I’d throw it out there in case any other newbies or sys admins need to get this information quickly without software.

28. May 2013 · 3 comments · Categories: Code

I wrote this script a little while ago, but I wanted to re-write it so I could share it. Originally, I had our API key, email addresses, and SMTP server address hard-coded into the script. I obviously didn’t want to share those, and I didn’t want anyone to have to open the script and find and edit text. This led me to figure out how to save settings to XML, something I hadn’t done yet as a beginner.

To give a little background on why I wrote this script, let me start by saying how lazy and forgetful I can be. We are a partner of Datto, whose products we use for our backup solutions. Without getting into too much detail about the backups, it is a managed service we provide, which means we have to monitor the backups for our clients and resolve any backup issues. To check the backups of all our clients, we simply log in to Datto’s partner portal and drill down into each appliance to check the statuses. There are two problems that I already highlighted. One, I can be pretty lazy, so logging in and drilling down into each appliance is a pain in the butt for me. Two, I can be forgetful, so depending on myself to remember to log in and check all these servers when I happen to walk into firefighting first thing in the morning is not the most reliable way to make sure backups get checked. This is where the script comes in, thanks to Datto’s XML API.

Using this script, I can pull the backup statuses for all the servers and have them formatted in a nice email that I know I will read. I have this script running first thing in the morning and at lunchtime, as our backups are hourly and I want to make sure no servers have gone without backups for more than a few hours.

The way the script works is you run it with a -config option to generate the XML file it will use to store its configuration. It will ask you for your API key, email subject, from address, to address, and SMTP server address. After the file is generated, you simply run the script without any options. It will grab the settings from that file, pull the info from Datto, generate the email, and send it to you in a tabular format similar to the following:

Server Name    Status   Last Snapshot
CLIENTA-SRV1   Success  2013-05-28 11:03:19

Server Name    Status   Last Snapshot
CLIENTB-SRV1   Success  2013-05-28 11:10:05
CLIENTB-SRV2   Success  2013-05-28 11:08:37
CLIENTB-SRV3   Success  2013-05-28 11:08:39
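A table like that can be built with plain string formatting before handing the text to smtplib. A small sketch (format_table is my own name, and the email body in the real script may well be HTML instead):

```python
def format_table(rows):
    # rows: list of (server, status, last_snapshot) tuples for one appliance
    lines = ['{:<16} {:<8} {}'.format('Server Name', 'Status', 'Last Snapshot')]
    for server, status, snapshot in rows:
        lines.append('{:<16} {:<8} {}'.format(server, status, snapshot))
    return '\n'.join(lines)
```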

Schedule this with cron on Linux or Task Scheduler on Windows, and you can save yourself the time of logging into Datto’s website and drilling down into each appliance.

Eventually, I plan on adding to this script to update a custom field in our RMM platform. The field will be something like last backup or maybe two fields, last successful backup and last backup status. Then if the time since the last successful backup gets too far out, I can have our RMM generate a support ticket. Then I won’t even have to look at the email. Lazy or efficient?

There are a couple other things I may change or just try with future scripts. After writing this, I read about the Python module optparse, which seems like a much better way to handle command-line options for your script than the way I’ve been doing them. Also, I’m thinking about changing the configuration settings from XML to the pickle module, which seems much simpler. I would have played with those changes before posting this, but I’m getting ready to head off to training on Business Continuity in Philadelphia and won’t have time.

Oh yeah, I wasted a lot of time trying to figure out how to make this script run whether you are on Python 2.7 or Python 3. After putzing around, I found the only thing I needed to do was add the following lines. It worked like a champ.
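The exact lines aren't reproduced here, but a common pair of compatibility lines for a script like this (a guess on my part, not a quote from the original) is:

```python
from __future__ import print_function  # makes print() behave the same on 2.7

try:
    input = raw_input  # Python 2: raw_input is the safe way to prompt
except NameError:
    pass  # Python 3: input() already behaves this way
```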

As always, if you have any recommendations, let me know. Here’s the code.

Download script


18. May 2013 · 4 comments · Categories: Code

Below is a script that I wrote to address a problem we were having at our clients. Most of our clients have their email hosted on Appriver, on Exchange 2010. All of a sudden, users started reporting that Outlook was not connecting to the servers, or if it did connect, it wasn’t long before it disconnected. We contacted Appriver to see if they were having issues. It turns out Microsoft put out a patch that was causing XP machines to have issues connecting to the Exchange 2010 farm. The workaround was to put an entry into the hosts file for the front-end server.

So now that I had the fix, did I want to connect to every XP machine and edit the hosts file? Hell no. To quickly fix this, I wrote a quick-and-dirty version of this script with the host entry statically in the script. It was four lines: open the hosts file, write the line, close the file. I then used our RMM, Labtech, to create a script that would run this on XP machines. We ran it, and within seconds of it running on the computers, email was working again.

Obviously, it would be bad if this script ran repeatedly on the same computer, as it would add duplicate host entries. Now that the fire was out, I wanted to write a version I could reuse in the future by just passing an IP address and hostname to the script. Unlike my quick fix, the reusable script would need some checks built in: it would have to make sure the host wasn’t already in the hosts file, and that the IP address and hostname are valid.

Here’s what I have so far. I’d like to add the ability to delete and update entries as well. To run this on Linux, you must use sudo, su, etc.; on Windows, you’ll want to run it as administrator. Luckily, we’re able to do that via our RMM platform.
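The two checks described above can be sketched like this (add_host_entry is my own name, and the real script also validates the hostname and takes its arguments from the command line):

```python
import socket


def add_host_entry(hosts_path, ip, hostname):
    # validate the IPv4 address; inet_aton raises socket.error if malformed
    socket.inet_aton(ip)

    # skip the write if the hostname is already mapped in the file
    with open(hosts_path) as f:
        for line in f:
            parts = line.split()
            if parts and not parts[0].startswith('#') and hostname in parts[1:]:
                return False

    with open(hosts_path, 'a') as f:
        f.write('{}\t{}\n'.format(ip, hostname))
    return True
```

Returning False on a duplicate makes the script safe to run repeatedly from an RMM.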

Let me know your thoughts. I’m sure there are many ways to improve this, and I’m sure there are other ways to do it.


For those of you not following this mess of me learning to program in Python, this is the third option so far for getting Dell warranty expirations via web scraping. The first version I posted was the one I did without any direction from anyone who knows what they are doing; I used a string function. You can read that post here. After I posted that version on Google+ and Reddit, I got recommendations to do this with regex, Scrapy, and BeautifulSoup. My last post covered getting the expiration date via regex. This post gets it with BeautifulSoup, which, I must say, was much better once I figured out how to do what I wanted.

Here’s a quick rundown of how I’m doing this. Again, I’m sure some of this could be done much better.

The modules I use are sys for getting the command-line arguments, requests to pull the data from Dell’s site, and lastly BeautifulSoup to parse the HTML. The function is only a few lines. First, I pull the HTML from Dell and parse it with BeautifulSoup. Next, I find all the TopTwoWarrantyListItems and assign them to the variable lis. Lastly, I compare those list items to pull out the max value, which is assigned and returned as the warranty expiration date.

Let me know what you think, good and bad. Every time I post one of these, I get some new advice that helps me learn.



I got a decent amount of feedback and advice on my post the other day about getting a Dell warranty expiration with web scraping. It was recommended that I change my scraping to use regex, Beautiful Soup, or Scrapy. I figured I’d do all three and make a post on each one.

As you know if you’ve read any of this blog so far, I’m just learning Python, so I have a ton to learn. What better way than to try the different options presented to me?

The first option I decided to try, since I had already done a little of it for other scripts I haven’t blogged about yet, was scraping via regex. This was quite challenging for a noob like me. I couldn’t quite get the expression down to grab all the dates needed. The original expression I was using would grab the last date but skip right over the first one. I have no clue why.

One thing I learned from doing this scrape with regex is that my original script was wrong. It was grabbing a date, but not necessarily the correct date. Dell’s website can show multiple expiration dates: if you renew, it shows the renewed warranty as well as the old one, and if you had a default warranty and upgraded to a better one, it shows both. By using regex, I was able to grab all the dates and compare them to find the correct expiration.

Another thing I learned about this task in particular is that the slowness is not so much my code as Dell’s crappy website. As a network/systems guy, I have to go on Dell’s site a lot, and it is horribly painful to use because of the speed.

OK, so here’s a quick rundown of the code, followed by the actual code.

First, it grabs the page at the URL as a string. Then it performs a regular expression search for the dates, creating a list of tuples with the date in each tuple. A while loop runs through the tuples, pulls the date pieces out as integers, and puts them into a new list of tuples so the dates can be compared. With the dates in that form, I just use the max function to find the correct date. I’m not sure this is the greatest way to do it, but it seems to work on the service tags I’ve tested. Lastly, I convert the winning date back into a string and return it.
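Those steps can be sketched as follows. This is not the original script: the exact regex is my guess, I use a list comprehension instead of a while loop, and I parse a passed-in string rather than fetching the URL:

```python
import re


def warranty_end(html):
    # find every date like 5/28/2015; groups are (month, day, year)
    matches = re.findall(r'(\d{1,2})/(\d{1,2})/(\d{4})', html)
    # reorder to (year, month, day) integers so tuple comparison works
    dates = [(int(y), int(m), int(d)) for m, d, y in matches]
    year, month, day = max(dates)
    # hand the latest date back as a string
    return '{}/{}/{}'.format(month, day, year)
```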

As I said with my original post, I’m sure this could be improved a million ways. I’m just learning, so any pointers would be appreciated.



As I mentioned in my blog on my first learning resources, right after finishing the Google training, the first script I wrote was one to get a Dell hardware warranty expiration given the service tag. It was my first attempt at web scraping, something you learn in the Google Python videos. I am not sure if this is the best or most efficient way to do this, but it was a way for me to test out what I had just learned. If you have some advice, don’t hesitate to share it in the comments.

I’m actually going to do more with this as I learn more. The goal is to access our database of hardware, grab the service tags, get the warranty expiration from Dell, and lastly update a warranty field in the database. From that field, we’ll generate alerts to notify customers of their pending expiration date and hopefully get some warranty sales. This script will be a part of that bigger picture later.

For now, all you do is run the script with a SERVICETAG argument to get the date back as a string.

This script grabs the HTML and then searches for TopTwoWarrantyListItem. From that starting point, it looks for four greater-than brackets (>) to find the beginning position of the date, then takes the very next less-than bracket (<) as the ending point.
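That search can be sketched in a few lines of string functions (warranty_end is my own name, and the sample markup in the test is simplified; the real Dell page has more tags around the date):

```python
def warranty_end(html):
    # jump to the warranty list item in the page source
    pos = html.find('TopTwoWarrantyListItem')
    # the date text sits after the fourth '>' from that point
    for _ in range(4):
        pos = html.find('>', pos) + 1
    # ...and runs up to the very next '<'
    return html[pos:html.find('<', pos)]
```

This is exactly why the approach is fragile: if Dell adds or removes a tag, the bracket count is off.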

When I started typing up this blog, I ran the script and it was giving me errors. It worked when I wrote it, but I’ve since rebuilt my laptop with Linux Mint. My laptop was previously running CentOS, which I believe runs Python 2.6 by default, so I’m assuming something changed from there. Considering the current training I’m doing is in Python 3, I made the script check the Python version and error out if it’s not version 3.

Without further ado, here’s the Dell warranty script code.



05. April 2013 · Write a comment · Categories: Code

The first thing I wrote in Python was prompted by my son’s obsessive use of the Xbox and his filling the rest of his time watching YouTube videos of the games he plays.

I have no problem with his gaming as long as it doesn’t interfere with other interests. For some background, my son has cerebral palsy. He’s very bright and doesn’t have any major problems other than his motor skills. He walks, but not very well. He has always been far ahead intelligence-wise and had probably read more books by age 10 than I have read in my entire life. His reading is probably at college level, and he’s 12.

Now every year it seems to be the same cycle. We give him some slack on the gaming; he ends up doing nothing but gaming, caring about nothing else, and acting quite nasty to his mother. Typically we just take it away. This time I just wanted to limit the time. I found the family time settings on the Xbox (pretty obvious I am not a gamer) and gave him 14 hours a week. That took care of the Xbox. The next thing was YouTube, because if he wasn’t playing Xbox, he was watching others play Xbox on YouTube. Here was my solution.

I have a Fortigate firewall at home. I created a rule that blocks YouTube from his IP address. All I had to do then was write a little Python script to SSH into the firewall and enable or disable the rule as needed. I didn’t want it to be something only I could do; I wanted something my wife could run on her computer to enable his access as well. When run, the script disables the rule, allowing YouTube for 60 minutes. After 60 minutes, it SSHes back into the firewall and re-enables the rule. The firewall has scheduling functionality for rules, but I didn’t want this tied to a strict schedule. When he asked for time, I wanted to be able to give him an hour and have it shut off automatically.

Here’s the Python script to do this. Keep in mind, this is the first thing I wrote. It’s probably not the best way to do it, and the style is probably horrible. That’s kind of what I’m trying to focus on now.
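Not the original code, but a sketch of the same flow using paramiko for the SSH part. The FortiOS CLI commands and the policy ID are assumptions; your firewall's rule ID and firmware syntax may differ:

```python
import time


def policy_commands(policy_id, status):
    # FortiOS CLI to flip a firewall policy's status ('enable' or 'disable')
    return '\n'.join([
        'config firewall policy',
        'edit {}'.format(policy_id),
        'set status {}'.format(status),
        'next',
        'end',
    ])


def run_on_firewall(host, user, password, commands):
    import paramiko  # imported here so the command helper works without it
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, password=password)
    client.exec_command(commands)
    client.close()


def allow_youtube(host, user, password, policy_id, minutes=60):
    # disable the blocking rule, wait out the allowed hour, re-enable it
    run_on_firewall(host, user, password, policy_commands(policy_id, 'disable'))
    time.sleep(minutes * 60)
    run_on_firewall(host, user, password, policy_commands(policy_id, 'enable'))
```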

This worked like a charm. For those of you who think I’m a jerk for doing this: I never actually implemented it. After setting the family time settings on the Xbox, my son started getting interested in other things again, in particular history. He’s even using YouTube to learn about other things, which I’ve always told him to do. It’s how I learned most of what I know about Python so far.