Category: PowerShell

Exporting E-Mail to Outlook data files programmatically


This post discusses a simple script to export one or more Exchange Online mailboxes to an Outlook data file programmatically.

Why am I doing this?

I wrote the script to export mailbox data from an Office 365 tenant as part of an Office 365 tenant-to-tenant migration. I know there are commercial tools out there that do this, but budget constraints meant I had to come up with something else.

How does it work?

First of all, your Outlook profile must contain all the mailboxes you want to export, i.e. you have been granted ‘Full Access’ to the mailboxes, and they must be in online mode, not cached; if they’re cached, the copy process might not capture all the mailbox data. Mailboxes which are cached are reported in the logs as per the image below.
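For reference, granting ‘Full Access’ over the mailboxes can be done from an Exchange Online remote PowerShell session; a minimal sketch, where the admin account name and mailbox addresses are placeholders:

```powershell
# Assumes a remote PowerShell session to Exchange Online is already established.
# "MigrationAdmin" and the mailbox addresses are placeholders for illustration.
$mailboxes = "user1@contoso.com", "user2@contoso.com"
foreach ($mailbox in $mailboxes) {
    Add-MailboxPermission -Identity $mailbox -User "MigrationAdmin" `
        -AccessRights FullAccess -AutoMapping $true
}
```

With -AutoMapping enabled, the mailboxes appear in the admin’s Outlook profile automatically.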


So, the script. The script uses the Outlook Object Model to connect to Outlook and enumerate the MAPI stores. Each store is assigned an associated Outlook data file, which is attached and named ‘PST: mailbox store name’. Next, the script enumerates the mailbox store’s folder structure and copies the data from each folder to the Outlook data file store. Once complete, it disconnects the Outlook data file store and moves on to the next mailbox store in the profile.
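The steps above can be sketched roughly as follows; this is a simplified illustration of the approach, not the full script, with error handling and logging omitted:

```powershell
# Simplified sketch using the Outlook Object Model; requires Outlook installed
# and a profile containing the mailboxes to export.
$outlook   = New-Object -ComObject Outlook.Application
$namespace = $outlook.GetNamespace("MAPI")
$exportDir = Join-Path $PSScriptRoot "exports"
if (-not (Test-Path $exportDir)) {
    New-Item -ItemType Directory -Path $exportDir | Out-Null
}

foreach ($store in @($namespace.Stores)) {
    # Attach an Outlook data file for this mailbox store
    $pstPath = Join-Path $exportDir ($store.DisplayName + ".pst")
    $namespace.AddStoreEx($pstPath, 3)                     # 3 = olStoreUnicode
    $pstStore = $namespace.Stores | Select-Object -Last 1  # newly attached store
    $pstStore.GetRootFolder().Name = "PST: " + $store.DisplayName

    # Copy each top-level folder into the data file store
    foreach ($folder in @($store.GetRootFolder().Folders)) {
        $folder.CopyTo($pstStore.GetRootFolder()) | Out-Null
    }

    # Disconnect the data file store before moving on
    $namespace.RemoveStore($pstStore.GetRootFolder())
}
```

Naming the data file store’s root folder is what allows it to be found and disconnected reliably later, as per the references below.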

The Outlook data file exports are stored in a directory named ‘exports’ within the directory the script is run from, and the log file is stored in the script root directory.



The script has scope for lots of improvements, such as better error checking, retries if the copy process fails on a particular folder, and maybe even consuming user credentials so that mailboxes from both tenants can be opened before running the copy process.

The script can be downloaded from GitHub here:

References:
– used to name the Outlook data file store in order to disconnect it reliably.
– used for disabling Outlook cached mode.


Spinning up and spinning down Azure VMs


This post discusses a relatively simple piece of code used to start or stop one or more Azure virtual machines within a defined time window.

Why am I doing this?

So, I came up with this code to solve an issue for one of our customers who wanted to de-allocate their VMs during non-business hours using PowerShell; the alternative would be to log in to the Azure portal and select start…


…or shut down every day; not going to happen.


How does it work?

This post references an earlier article I wrote on authenticating against Azure when scripting – see here.

The script reads in data from a CSV file called AzureIaaSVMs.csv, which it looks for within the directory it is being run from.



Once the data has been imported, the script works out what day it is, checks whether each particular Azure VM should be running or shut down, and does the necessary.
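The core logic looks roughly like this; note that the CSV column names (Name, ServiceName, StartHour, StopHour, RunDays) are assumptions for illustration, and this sketch uses the classic (service management) Azure cmdlets:

```powershell
# Sketch of the scheduling logic; CSV columns are illustrative assumptions.
$vms = Import-Csv (Join-Path $PSScriptRoot "AzureIaaSVMs.csv")
$now = Get-Date

foreach ($vm in $vms) {
    # Should this VM be running right now?
    $inWindow = (($vm.RunDays -split ";") -contains $now.DayOfWeek.ToString()) -and
                ($now.Hour -ge [int]$vm.StartHour) -and
                ($now.Hour -lt [int]$vm.StopHour)

    $state = (Get-AzureVM -ServiceName $vm.ServiceName -Name $vm.Name).PowerState

    if ($inWindow -and $state -ne "Started") {
        Start-AzureVM -ServiceName $vm.ServiceName -Name $vm.Name
    }
    elseif (-not $inWindow -and $state -eq "Started") {
        # Stop-AzureVM de-allocates the VM; -Force avoids the prompt when
        # stopping the last VM in a cloud service
        Stop-AzureVM -ServiceName $vm.ServiceName -Name $vm.Name -Force
    }
}
```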


I have this script scheduled to run every hour using Task Scheduler. For example, the command would be c:\…\powershell.exe -file c:\…\deallocateAzureVMs.ps1
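The scheduled task could also be registered from an elevated prompt along these lines; the task name and script path here are placeholders:

```powershell
# Placeholder task name and path; run from an elevated prompt.
schtasks /create /tn "DeallocateAzureVMs" /sc hourly /tr "powershell.exe -file C:\Scripts\deallocateAzureVMs.ps1"
```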


So this script is relatively simple at the minute but does the job. The script could be extended to…

…report failures using email notification

…include multiple subscriptions

…be more granular regarding start and stop times, with improved time logic

The current version of the script can be downloaded from GitHub here –

Copying VHDs between Windows Azure Storage Accounts

Script overview

I thought I’d write an interactive script which allows you to specify the source and destination storage account name, container and blob.

It requires that you’re already connected to Windows Azure – see this article for more info – I should really integrate connecting to Azure into this script.


The functions are there to check whether the storage accounts, containers and blobs exist.


Script body

The main body of the script uses a ‘valid’ flag variable to determine whether it can move on to the next step.
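The copy itself boils down to something like the following, assuming the variables have already been collected and validated by the script’s prompts (these variable names are illustrative):

```powershell
# Classic (service management) storage cmdlets; $srcAccount, $srcContainer,
# $srcBlob, $dstAccount and $dstContainer are assumed to be validated inputs.
$srcKey = (Get-AzureStorageKey -StorageAccountName $srcAccount).Primary
$dstKey = (Get-AzureStorageKey -StorageAccountName $dstAccount).Primary
$srcCtx = New-AzureStorageContext -StorageAccountName $srcAccount -StorageAccountKey $srcKey
$dstCtx = New-AzureStorageContext -StorageAccountName $dstAccount -StorageAccountKey $dstKey

# Start the server-side copy between the two accounts
$copy = Start-AzureStorageBlobCopy -SrcContainer $srcContainer -SrcBlob $srcBlob `
    -Context $srcCtx -DestContainer $dstContainer -DestContext $dstCtx

# The copy is asynchronous; poll until it completes
$copy | Get-AzureStorageBlobCopyState -WaitForComplete
```

Because Start-AzureStorageBlobCopy is a server-side operation, the VHD never transits your local machine.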


Example of script output:


The script is available here.

Connecting to Windows Azure with PowerShell

Installing the PowerShell module

First of all you need the Windows Azure PowerShell module which can be downloaded from here.


The module is installed via the Microsoft Web Platform Installer; simply follow the installer. If PowerShell is open when you install the Windows Azure module, restart PowerShell, i.e. close it and reopen it.

Next check the module is available using Get-Module -ListAvailable


Azure will be listed at the bottom if the module is available.


Import the module using Import-Module Azure


Get-Command -Module Azure will give you a list of the available commands.

Connecting to Azure

[Updated 16/03/2016]:

If you’re looking to use an interactive PowerShell session with Azure, then the Add-AzureAccount cmdlet is suitable; this will give you a 12-hour session token, after which you need to re-authenticate.


Enter your email address associated with the Azure subscription and follow the prompts. Once you have authenticated return to PowerShell and type Get-AzureSubscription to see your subscriptions.

If you have multiple subscriptions then you’ll need to determine which one is default and which is current.


Get-AzureSubscription -Default returns the default subscription.


Get-AzureSubscription -Current returns the currently selected subscription.


If you have multiple subscriptions with the same name then the -ExtendedDetails parameter is useful to determine what is what.


Now comes the question…how do I authenticate against Azure when scripting? Well, you need to use the publish settings file, but if you’ve already added the subscription using the Add-AzureAccount cmdlet you’ll need to remove it first.


Use Remove-AzureAccount, and then use Get-AzurePublishSettingsFile to get the certificate for the subscription to enable non-interactive authentication.
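Put together, the switch to certificate-based authentication looks something like this; the account name, subscription name and file path are placeholders:

```powershell
# Remove the interactive (token-based) account first
Remove-AzureAccount -Name "user@contoso.com"

# Opens a browser to download the .publishsettings file
Get-AzurePublishSettingsFile

# Import it; the management certificate is stored locally
Import-AzurePublishSettingsFile -PublishSettingsFile "C:\Temp\MySub.publishsettings"

# Pick the subscription scripts should run against
Select-AzureSubscription -SubscriptionName "My Subscription" -Default

# The file contains the certificate, so delete it once imported
Remove-Item "C:\Temp\MySub.publishsettings"
```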


[Updated 16/03/2016]

Using PublishSettingsFile

First of all run Get-AzurePublishSettingsFile


This will open a web browser and you’ll be prompted to enter the credentials associated with your Azure subscription; once authenticated, a .publishsettings file will be downloaded.


Next import the publish settings file using Import-AzurePublishSettingsFile -PublishSettingsFile FileName…


When you run Get-AzureSubscription you’ll notice your subscription now contains a certificate; you should delete the downloaded .publishsettings file.


Run some commands against your subscription…Get-AzureVM…Get-AzureStorageAccount…


Creating websites in IIS 7 / 7.5 using PowerShell

Script available in GitHub here –

The main reason I created this script was to speed up the time it takes to create a website: from creating the folder structure and anonymous user account, to assigning NTFS permissions and finally creating the IIS configuration.

The script end to end will:

  • Check whether IIS is installed
  • Check whether the web administration module is available
  • Prompt for a website / domain name, IP address, anonymous user account name and password, and web root i.e. the drive letter where the website folder structure should be created
  • Set the anonymous authentication mechanism of IIS to use the application pool identity
  • Create an anonymous user account for the site and application pool (there is the option to specify a pre-existing user account)
  • Create a folder structure
    • [drive letter]:\[domains]\
    • [drive letter]:\[domains]\[website name]
    • [drive letter]:\[domains]\[website name]\[wwwroot]
    • [drive letter]:\[domains]\[website name]\[logs]
  • Assign NTFS permissions to the folder structure created above
    • Set List contents on [drive letter]:\[domains]\[website name] for the anonymous user account
    • Set Read and Execute on [drive letter]:\[domains]\[website name]\wwwroot for the anonymous user account
  • Create an application pool within IIS
    • Configure the application pool process model identity
  • Create a website within IIS
    • Configure the website to use the application pool created above
    • Configure the website bindings (IP, port and host header(s))
    • Configure the website logging location
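The IIS portion of the steps above can be condensed into a sketch like the following, using the WebAdministration module; the site name, paths, account name and password here are illustrative only, and the user-creation and NTFS permission steps are omitted:

```powershell
# Condensed sketch of the IIS steps; names, paths and credentials are placeholders.
Import-Module WebAdministration

$site = "example.com"
$root = "D:\domains\$site\wwwroot"
$logs = "D:\domains\$site\logs"
New-Item -ItemType Directory -Path $root, $logs -Force | Out-Null

# Application pool running as a specific user account
New-WebAppPool -Name $site
Set-ItemProperty "IIS:\AppPools\$site" -Name processModel -Value @{
    identityType = 3                 # 3 = SpecificUser
    userName     = "IUSR_$site"      # placeholder account
    password     = "P@ssw0rd!"       # placeholder only
}

# Website bound to the application pool, with its own log location
New-Website -Name $site -PhysicalPath $root -ApplicationPool $site `
    -IPAddress "10.0.0.10" -Port 80 -HostHeader $site
Set-ItemProperty "IIS:\Sites\$site" -Name logFile.directory -Value $logs
```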

All the above steps are validated in some form by using

  • Regex
  • Web Administration snapin / module functionality
  • PowerShell cmdlets
  • Custom PowerShell functions

The following improvements are required: (In my opinion)

  • Resetting the root drive permissions (one time run) to remove all NTFS permissions except for Administrators and SYSTEM. Standalone PowerShell script here
  • Configure the W3C logging fields; I generally select date, time, client IP, server IP, URI stem, URI query, protocol status, bytes sent, bytes received, user agent, cookie and referrer.

An alternative way to set the logging would be to execute this command from a command prompt:

appcmd.exe set config -section:system.applicationHost/log /centralW3CLogFile.logExtFileFlags:"Date, Time, ClientIP, ServerIP, UriStem, UriQuery, HttpStatus, BytesSent, BytesRecv, UserAgent, Cookie, Referer" /commit:apphost