Category Archives: IIS

Publishing and Synchronizing Web Farms using Windows Azure Virtual Machines

Deploying new web applications is pretty painless with Windows Azure Web Sites and “fairly” painless using Windows Azure PaaS-style cloud services. However, existing web apps being migrated to the cloud can require significant rewriting or re-architecture for either solution. That is where Windows Azure Infrastructure as a Service comes in. Running virtual machines gives you the economies of scale of a cloud-based solution and full access to cloud services such as storage and Service Bus, while not requiring you to re-architect your application to take advantage of these services.

Usually when you think of cloud computing with Infrastructure as a Service you think of a lot of manual work and management pain. While it is certainly a bit more work than a pure PaaS operation it is possible to lower that management burden using automation tools and techniques.

In this post I will walk through how to use Windows Azure Virtual Machines to create a web farm that you can directly publish to using Visual Studio Web Deploy. In addition to simple publishing I will also show how you can automatically synchronize web content across multiple virtual machines in your service to make web farm content synchronization simple and painless.

Step #1 – Image Preparation

Create a new virtual machine using either Windows Server 2008 R2 or Server 2012. On this machine install the Application Server and Web Server roles and enable ASP.NET.

TIP: Don’t forget to install the .NET Framework 4.0 if you are using Server 2008 R2.
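If you prefer to script the role installation instead of clicking through Server Manager, a minimal sketch using the ServerManager cmdlets is below. Exact feature names differ slightly between Server 2008 R2 and Server 2012 (for example, ASP.NET 4.5 on Server 2012 is Web-Asp-Net45), so verify them with Get-WindowsFeature first.

# Run in an elevated PowerShell session on the VM
Import-Module ServerManager

# Install the Web Server (IIS) and Application Server roles with ASP.NET support
Add-WindowsFeature Web-Server, Application-Server, Web-Asp-Net -IncludeAllSubFeature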

For this solution you will also need the Windows Azure PowerShell Cmdlets on the web server. See this article for configuring your publish settings with the PS cmdlets.
I will use the cmdlets to discover the VM names in my web farm without having to manually keep track of them. This helps if you need the ability to grow and shrink your web farm at will without updating your synchronization scripts.
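As a quick sanity check after importing your publish settings, you can confirm the cmdlets can enumerate the VMs in a cloud service. A minimal sketch follows; the file path and service name are placeholders.

# Download the publish settings file for your subscription (opens a browser), then import it
Get-AzurePublishSettingsFile
Import-AzurePublishSettingsFile 'C:\MySubscription.publishsettings'

# List the VMs in a cloud service (replace with your cloud service name)
Get-AzureVM -ServiceName 'MyWebFarm123' | Select-Object Name, InstanceStatus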

The tool I will use for content synchronization is Web Deploy 3.0. Download Web Deploy 3.0, but do not install it yet.

Web Deploy works by starting a remote agent that listens for commands from either Visual Studio or the MSDeploy.exe client. By default it will listen on port 80. This default port configuration will not work in a load-balanced environment.

To install on an alternate external port such as 8080:
C:\WebDeployInstall>msiexec /I webdeploy_amd64_en-us.msi /passive ADDLOCAL=ALL LISTENURL=http://+:8080/

Once installed you will need to configure a firewall rule to allow traffic in on port 8080 for publishing and synchronization.
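For example, from an elevated command prompt on the VM, a rule along these lines opens the port (the rule name is arbitrary):

netsh advfirewall firewall add rule name="Web Deploy 8080" dir=in action=allow protocol=TCP localport=8080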

Now that the image is configured, run Sysprep on the VM to remove any unique characteristics such as the machine name. Ensure you have Enter System Out-of-Box Experience (OOBE), Generalize, and Shutdown all selected.
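If you prefer to run Sysprep from the command line instead of the dialog, the equivalent is:

C:\Windows\System32\Sysprep\sysprep.exe /oobe /generalize /shutdown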

Once the VM status is shown as shut down in the Windows Azure Management Portal, highlight the VM and click Capture. This will be the customized image you can use to quickly provision new VMs for your web farm using the management portal or PowerShell.

Ensure you check I have sysprepped this VM, name the image WebAppImg, and click the check mark button to capture the image.
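You can also capture the image from PowerShell once the VM is stopped. A sketch is below; note that the parameter names of Save-AzureVMImage have changed across versions of the module (older builds use -ImageName rather than -NewImageName), so check Get-Help Save-AzureVMImage for the version you have installed. The service and VM names are placeholders.

# Capture the stopped, sysprepped VM as a reusable image
Save-AzureVMImage -ServiceName 'MyWebFarm123' -Name 'webappvm1' -NewImageName 'WebAppImg'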

Step #2 – Virtual Machine Deployment

Once the image has been created you can use the portal or the Windows Azure PowerShell cmdlets to provision the web farm.

Here is a PowerShell example of using the new image as the basis for a three VM web farm.

A few things to note: I have created a load-balanced endpoint for port 80, but for port 8080 I am only creating an endpoint on a single server.
This server will be the target for publishing from Visual Studio and will then be used as the source server when publishing to the other nodes in the web farm.

$imgname = 'WebAppImg'
$cloudsvc = 'MyWebFarm123'
$pass = 'your password'
 
$iisvm1 = New-AzureVMConfig -Name 'iis1' -InstanceSize Small -ImageName $imgname |
	Add-AzureEndpoint -Name web -LocalPort 80 -PublicPort 80 -Protocol tcp -LBSetName web -ProbePath '/' -ProbeProtocol http -ProbePort 80 |
	Add-AzureEndpoint -Name webdeploy -LocalPort 8080 -PublicPort 8080 -Protocol tcp | 
	Add-AzureProvisioningConfig -Windows -Password $pass
 
$iisvm2 = New-AzureVMConfig -Name 'iis2' -InstanceSize Small -ImageName $imgname |
	Add-AzureEndpoint -Name web -LocalPort 80 -PublicPort 80 -Protocol tcp -LBSetName web -ProbePath '/' -ProbeProtocol http -ProbePort 80 |
	Add-AzureProvisioningConfig -Windows -Password $pass
 
$iisvm3 = New-AzureVMConfig -Name 'iis3' -InstanceSize Small -ImageName $imgname |
	Add-AzureEndpoint -Name web -LocalPort 80 -PublicPort 80 -Protocol tcp -LBSetName web -ProbePath '/' -ProbeProtocol http -ProbePort 80 |
	Add-AzureProvisioningConfig -Windows -Password $pass	
 
New-AzureVM -ServiceName $cloudsvc -VMs $iisvm1,$iisvm2,$iisvm3 -Location 'West US'

Once the VMs are provisioned RDP into iis1 by clicking connect in the management portal. This is where you will configure a PowerShell script that will run MSDeploy to synchronize content across the other servers.

Inside the iis1 virtual machine, create a new text file named sync.ps1 in a directory off of your root such as C:\SynchScript and paste in the following (ensuring that you update $serviceName with your cloud service name).

Import-Module 'C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1'
 
$publishingServer = (gc env:computername).toLower()
 
$serviceName = 'REPLACE WITH YOUR CLOUD SERVICE' 
 
Get-AzureVM -ServiceName $serviceName | foreach { 
    if ($_.Name.toLower() -ne $publishingServer) {
       $target = $_.Name + ":8080"
       $source = $publishingServer + ":8080"
 
       $exe = "C:Program FilesIISMicrosoft Web Deploy V3msdeploy.exe"
       [Array]$params = "-verb:sync", "-source:contentPath=C:Inetpubwwwroot,computerName=$source", "-dest:contentPath=C:Inetpubwwwroot,computerName=$target";
 
        & $exe $params;
    }   
}

This script enumerates all of the virtual machines in your cloud service and runs a Web Deploy sync job against each of them. If you have other servers in your cloud service, such as a database server, you could exclude them by filtering on the VM name (see the example below). Note: Web Deploy supports MANY more operations than just synchronizing directories; see the Web Deploy documentation for more information.
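For example, if your web servers share a naming convention such as iis*, a hypothetical filter added to the loop above could look like this:

Get-AzureVM -ServiceName $serviceName | where { $_.Name.toLower() -like 'iis*' } | foreach {
    if ($_.Name.toLower() -ne $publishingServer) {
       # ... run the same msdeploy sync against $_.Name as before
    }
}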

To enable content synchronization you will need to create a new scheduled task by going into Control Panel -> Administrative Tools -> Task Scheduler -> Create Task.

Accept the defaults for everything except when it gets to the action screen.

Program/Script: powershell.exe
Parameters: -File C:\SynchScript\sync.ps1

Open the properties of the new task and modify the schedule to synchronize content fairly often, so that content isn’t out of sync for long after a publish.

Ensure you select Run whether user is logged on or not. You will need to provide an account for the task to run as. I’m choosing the administrator account because I am lazy; however, you could create duplicate accounts on each of the VMs to use for synchronization.
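If you would rather create the task from the command line than click through the wizard, something along these lines should work (the account, password, and five-minute interval are placeholders; the script path assumes the location used earlier):

schtasks /Create /TN "SyncWebContent" /SC MINUTE /MO 5 /TR "powershell.exe -File C:\SynchScript\sync.ps1" /RU administrator /RP YourPassword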

Step #3 – Publishing with Visual Studio

Finally, to test the configuration create a new MVC app and tweak the code slightly to show the computer name.

Now right click on the project and select publish. In the drop down select new profile.

On the settings page, set the Service URL to your cloud app URL with :8080 appended.
Set the Site/App name to Default Web Site.
Set the Destination URL to your cloud app URL (without :8080).

Finish the wizard and let Visual Studio publish.

When the web app first launches you may or may not see the new content; it may still show the default IIS 8 page. As soon as the scheduled task runs, the content should sync across all of the servers.

Once it has synchronized, press Ctrl+F5 a few times and you should see the content with the individual machine names, verifying that load balancing is working.


In this post you have seen how you can configure a custom OS image that can be used to provision virtual machines for a web farm. You have then seen how you can use Web Deploy along with PowerShell to synchronize content published from Visual Studio across all of the servers in your farm.

Automation is great :)

Deconstructing the Hybrid Twitter Demo at BUILD

Many of you may have watched the Windows Server 8 session at BUILD and were awed by how cool the Twitter demo delivered by Stefan Schackow was (well, maybe not awed, but at least impressed!). I will admit that it was a fun demo to build because we were using such varied technologies as IIS 8 WebSockets, AppFabric Topics and Subscriptions, and of course Windows Azure Worker Roles.

The demo was built by Paul Stubbs and me.

Here is a screenshot of the application as it ran live:

[screenshot of the running application]

Here is a slide depicting the architecture:

[architecture diagram slide]

Essentially, the demo shows how you could take data from the cloud and pass it to an application behind your firewall for additional processing (Hybrid Cloud Scenario). The method of transferring data we chose was to use Windows Azure AppFabric Topics (a really powerful queue) as the data transport.

One of the beauties of using Windows Azure AppFabric Topics is that multiple clients can receive messages from the topic, each with a different view of the data, using filtered subscriptions.

In our demo we could have client A view our Twitter feed with certain tags enabled while client B had a completely different set enabled.

So on to the source code!

Within the Windows Azure Worker Role I am using a modified version of Guy Burstein’s example to search Twitter based on a set of hashtags.

Sending Twitter Search Results to a Topic

[sourcecode language="CSharp"]

TwitterSubscriptionClient tsc = new TwitterSubscriptionClient();
tsc.CreateTopic();

TwitterWrapper wrapper = new TwitterWrapper();
while (true)
{
    if (IsFeedEnabled())
    {
        try
        {
            // Call out to Twitter with the set of hash tags we are interested in
            // getting search results from.
            SearchResults results = wrapper.GetSearchResults(GetHashTags(), SearchResultType.recent, TweetCount, strLiveLatestID);
            if (results != null && results.Results.Count > 0)
            {
                // Save the last ID so we can use it as a continuation token
                // for the next query to Twitter
                strLiveLatestID = results.Results[0].Id;

                // Send the search results over the Service Bus Topic (250 ms delay)
                TwitterSubscriptionClient.SendTweets(results, 250);
            }
        }
        catch (Exception exc)
        {
            System.Diagnostics.Trace.TraceError(DateTime.Now.ToString() + " " + exc.ToString());
        }
    }
    else
    {
        System.Diagnostics.Trace.TraceInformation("Waiting – Twitter Search is Not Enabled.");
    }
    Thread.Sleep(GetInterval());
    Trace.WriteLine("Working", "Information");
}

That covers the meat of the service, but let's go a bit deeper into how the Topic is created.

The constructor of the TwitterSubscriptionClient is where the initialization of all of the Service Bus classes takes place.

Initialize and Authenticate the Service Bus Classes

[sourcecode language="CSharp"]

static string serviceNamespace = String.Empty;
static string topicName = String.Empty;
static string issuerName = String.Empty;
static string issuerKey = String.Empty;
static TokenProvider tokenProvider = null;
static Uri serviceUri = null;
NamespaceManager nsMgr = null;

public TwitterSubscriptionClient()
{
    serviceNamespace = RoleEnvironment.GetConfigurationSettingValue("serviceNamespace");
    topicName = RoleEnvironment.GetConfigurationSettingValue("topicName");
    issuerName = RoleEnvironment.GetConfigurationSettingValue("issuerName");
    issuerKey = RoleEnvironment.GetConfigurationSettingValue("issuerKey");
    serviceUri = ServiceBusEnvironment.CreateServiceUri("sb", serviceNamespace, string.Empty);
    tokenProvider = TokenProvider.CreateSharedSecretTokenProvider(issuerName, issuerKey);
    NamespaceManagerSettings nms = new NamespaceManagerSettings();
    nms.TokenProvider = tokenProvider;
    nsMgr = new NamespaceManager(serviceUri, nms);
}

The code below simply tests whether the topic has already been created and, if it has not, creates it.

As with all things in the cloud, the code below uses some basic retry logic on each operation against a Windows Azure service.

Creating the Topic

[sourcecode language="CSharp"]
public void CreateTopic()
{
    int retrySeconds = 3;
    int maxRetries = 5;
    int retryCounter = 0;

    RetryN(() =>
    {
        if (nsMgr.TopicExists(topicName) == false)
        {
            TopicDescription twitterTopic = nsMgr.CreateTopic(topicName);
            System.Diagnostics.Trace.TraceInformation("Created and Configured Topic");
        }
    }, null, maxRetries, TimeSpan.FromSeconds(retrySeconds), ref retryCounter);

    if (retryCounter > 0)
        System.Diagnostics.Trace.TraceWarning(String.Format("Retried {0} Times Creating Topic.", retryCounter));
}

The method below takes the search results from Twitter and sends them individually into the topic.
Note I'm adding an additional property, "LowerText", onto the BrokeredMessage so that on the receiving end I can easily filter for various tags in my subscription.
As the text is all lower case, I don't care whether the tag is #Microsoft or #microsoft.

Sending Tweets through the Topic

[sourcecode language="CSharp"]
public static void SendTweets(SearchResults srs, int MSDelay)
{
    TopicClient client = null;
    MessagingFactory messagingFactory = null;
    int retrySeconds = 3;
    int maxRetries = 5;

    try
    {
        MessagingFactorySettings settings = new MessagingFactorySettings();
        settings.TokenProvider = tokenProvider;
        messagingFactory = MessagingFactory.Create(serviceUri, settings);
        client = messagingFactory.CreateTopicClient(topicName);
        foreach (SearchResult sr in srs.Results)
        {
            int retryCounter = 0;
            if (MSDelay > 0)
                System.Threading.Thread.Sleep(MSDelay);
            try
            {
                if (TwitterSubscriptionClient.PassesFilter(sr.Text))
                {
                    RetryN(() =>
                    {
                        BrokeredMessage message = new BrokeredMessage(sr);
                        message.TimeToLive = TimeSpan.FromMinutes(15);
                        message.Properties.Add("Text", sr.Text);
                        message.Properties.Add("LowerText", sr.Text.ToLower());
                        client.Send(message);
                    }, null, maxRetries, TimeSpan.FromSeconds(retrySeconds), ref retryCounter);

                    if (retryCounter > 0)
                        System.Diagnostics.Trace.TraceInformation(String.Format("Retried {0} Times Sending Tweet.", retryCounter));
                }
            }
            catch (Exception exc)
            {
                System.Diagnostics.Trace.TraceError(DateTime.Now.ToString() + " " + exc.Message);
                System.Threading.Thread.Sleep(5000);
            }
        }
    }
    finally
    {
        if (client != null)
            client.Close();
        if (messagingFactory != null)
            messagingFactory.Close();
    }
}

So now we have built a feed that searches tweets for specific hash tags (see the project for the Twitter integration code) and sends the results out on the Service Bus through a topic. How does our on-premises application consume the data?

The code below is client-side JavaScript that opens a WebSocket to our IIS 8 server. The socket.onmessage handler waits for data to be sent back from the server, which it then parses into a JSON object. If the JSON object has a .Key property I update the Stream Insight UI; if not, I update the Twitter feed.
There is also a send() method. The only time I send data to the server is when the user has clicked on one of the hash tags in the UI. This updates a structure that holds the current hash tag filter, which I send back to the server via the socket.

Initialize the Web Socket

[sourcecode language="JavaScript"]
var socket;
var socketReady = false;
var hashTagsFilter = new Object();

function initializeWebSocket() {
    var host = "ws://<%: Request.Url.Host %>:<%: Request.Url.Port %><%: Response.ApplyAppPathModifier("~/TopicStartHandler.ashx") %>";
    try {
        socket = new WebSocket(host);
        socket.onopen = function(msg){
            var s = ' Socket Open';
            $("#serverStatus").html(s);
            socketReady = true;
            InitFilter();
            send();
        };
        socket.onmessage = function(msg){
            var s = msg.data;
            var response = window.JSON.parse(s);

            // Stream insight response
            if(response.Key != null)
            {
                AddStreamInsightResponse(response);
            }
            // Tweet from the live feed
            else
            {
                AddTweet(response);
            }
        };
        socket.onclose = function(msg){
            try
            {
                var s = ' Socket Closed';
                $("#serverStatus").html(s);
                socketReady = false;
            } catch(e) {}
        };
    }
    catch(ex){
        console.log(ex.name + "\n" + ex.message);
    }
}

function send() {
    if(socketReady == false)
    {
        alert("Socket isn't ready.");
        return;
    }
    var subscriptions = window.JSON.stringify(hashTagsFilter);
    socket.send(subscriptions);
}

function InitFilter()
{
    hashTagsFilter.microsoft = true;
    hashTagsFilter.windows = true;
    hashTagsFilter.windows8 = true;
    hashTagsFilter.azure = true;
    hashTagsFilter.wp7 = true;
    hashTagsFilter.build = true;
    hashTagsFilter.technology = true;
    hashTagsFilter.ie = true;
    hashTagsFilter.xbox = true;
    hashTagsFilter.webdesign = true;
    hashTagsFilter.mobile = true;
}

initializeWebSocket();

When a client logs in to our application, we first verify that it is indeed a WebSocket request and, assuming it is, create a unique ClientHandler object to facilitate state and communication with the client.

Client Setup

[sourcecode language="CSharp"]
public void ProcessRequest (HttpContext context) {
    if (context.IsWebSocketRequest)
    {
        ClientHandler h = new ClientHandler();
        context.AcceptWebSocketRequest(t => h.ProcessTopicMessages(t));
    }
}

The implementation of ClientHandler is fairly straightforward. ConfigureServiceBus() sets up the subscriptions to the topic (one for the live feed and one for Stream Insight). ProcessTopicMessages is an async method that waits for input from the user. The only input from the client is an update to the hash tag filter.

If we receive data DeserializeFilter() is called which uses the JSON serializer to deserialize the passed in data to a C# class (TwitterFilter).

Receiving Data from a Web Sockets Client

[sourcecode language="CSharp"]
public ClientHandler()
{
    ConfigureServiceBus();
}

public async Task ProcessTopicMessages(AspNetWebSocketContext context)
{
    ClientList.ActiveClients[this.GetHashCode()] = this;
    socket = context.WebSocket as AspNetWebSocket;
    try
    {
        while (true)
        {
            WebSocketReceiveResult input = await socket.ReceiveAsync(buffer, CancellationToken.None);
            if (socket.State != WebSocketState.Open)
                break;

            userFilter = DeserializeFilter(buffer, input);
            ProcessMessagesFromTopic();
        }
    }
    finally
    {
        Cleanup();
    }
}

private TwitterFilter DeserializeFilter(ArraySegment<byte> buffer, WebSocketReceiveResult input)
{
    String jsonString = Encoding.UTF8.GetString(buffer.Array, 0, input.Count);
    TwitterFilter newFilter = jsonSerializer.Deserialize<TwitterFilter>(jsonString);
    FilterChanged = true;
    return newFilter;
}

// Details of ConfigureServiceBus
private void ConfigureServiceBus()
{
    FilteredSubscripionLiveName = this.GetHashCode().ToString() + "live";
    FilteredSubscripionStreamInsightName = this.GetHashCode().ToString() + "si";
    serviceUri = ServiceBusEnvironment.CreateServiceUri("sb", serviceNamespace, string.Empty);
    tokenProvider = TokenProvider.CreateSharedSecretTokenProvider(issuerName, issuerKey);
    NamespaceManagerSettings nms = new NamespaceManagerSettings();
    nms.TokenProvider = tokenProvider;
    ns = new NamespaceManager(serviceUri, nms);
    MessagingFactorySettings mfs = new MessagingFactorySettings();
    mfs.TokenProvider = tokenProvider;
    int retryCount = 3;
    do
    {
        try
        {
            messagingFactory = MessagingFactory.Create(serviceUri, mfs);

            // Subscription for live feed
            ns.CreateSubscription(topicName, FilteredSubscripionLiveName, new FalseFilter());
            subClientLF = messagingFactory.CreateSubscriptionClient(topicName, FilteredSubscripionLiveName, ReceiveMode.ReceiveAndDelete);
            ClearSubscriptionRules(subClientLF);

            // Subscription for Stream Insight
            ns.CreateSubscription(topicName, FilteredSubscripionStreamInsightName, new FalseFilter());
            subClientSI = messagingFactory.CreateSubscriptionClient(topicName, FilteredSubscripionStreamInsightName, ReceiveMode.ReceiveAndDelete);
            ClearSubscriptionRules(subClientSI);
            break;
        }
        catch (Exception) { }
    } while (retryCount-- > 0);

    // Start the stream insight engine
    StartStreamInsight(FilteredSubscripionStreamInsightName);
}

GetSubscriptionMessages() is called by ProcessMessagesFromTopic. The method sits in a loop testing whether the user's filter has changed; if it has, it dynamically updates the SQL filter expression for both subscriptions (removing the existing rules in the process). If not, it queries the topic for the next available tweet and passes it back to the user over the WebSocket using SendMessage(). GetFilterString is shown below as well; it dynamically creates the SQL filter expression for both the Stream Insight client and the live feed client.

Receive Tweets from the Service Bus Topic and Send them over the Web Socket.

[sourcecode language="CSharp"]

// Pulls messages from a filtered subscription
public void GetSubscriptionMessages()
{
    while (true)
    {
        if(FilterChanged == true)
        {
            FilterChanged = false;
            String SQLFilterExpression = SBHelpers.GetFilterString(userFilter);
            ClearSubscriptionRules(subClientLF);
            ClearSubscriptionRules(subClientSI);
            if (SQLFilterExpression != String.Empty)
            {
                RuleDescription subRule = new RuleDescription("subscriptionRule", new SqlFilter(SQLFilterExpression));
                subClientLF.AddRule(subRule);
                subClientSI.AddRule(subRule);
            }
        }
        String lfResponse = GetIncomingTweet();
        if(lfResponse != String.Empty)
            SendMessage(lfResponse);
    }
}

// Clears existing rules from subscription
private void ClearSubscriptionRules(SubscriptionClient subClient)
{
    foreach (var r in ns.GetRules(topicName, subClient.Name))
    {
        subClient.RemoveRule(r.Name);
    }
}

public static String GetFilterString(TwitterFilter filter)
{
    String subRules = String.Empty;
    if (filter.microsoft)
        subRules = " LowerText like '%#microsoft%' OR ";
    if (filter.windows)
        subRules += " LowerText Like '%#windows%' OR ";
    if (filter.windows8)
        subRules += " LowerText Like '%#win8%' OR ";
    if (filter.azure)
        subRules += " LowerText Like '%#azure%' OR ";
    if (filter.wp7)
        subRules += " LowerText Like '%#wp7%' OR ";
    if (filter.build)
        subRules += " LowerText Like '%#bldwin%' OR ";
    if (filter.technology)
        subRules += " LowerText Like '%#technology%' OR ";
    if (filter.ie)
        subRules += " LowerText Like '%#ie%' OR ";
    if (filter.xbox)
        subRules += " LowerText Like '%#xbox%' OR ";
    if (filter.mobile)
        subRules += " LowerText Like '%#mobile%' OR ";
    if (filter.webdesign)
        subRules += " LowerText Like '%#webdesign%' OR ";
    if (filter.build)
        subRules += " LowerText Like '%#build%' OR ";
    if (subRules.EndsWith(" OR "))
        subRules = subRules.Substring(0, subRules.Length - 4);
    return subRules;
}

GetIncomingTweet waits for a short period of time looking for tweets. Once it receives one, it serializes it into a JSON object to be processed by the browser client.

Pull a Tweet off the Topic and pass the JSON-serialized version back.

[sourcecode language="CSharp"]
// Read from our subscription to the TwitterTopic to look for the next tweet
public string GetIncomingTweet()
{
    String jsonTweet = String.Empty;
    try
    {
        BrokeredMessage message = subClientLF.Receive(TimeSpan.FromSeconds(3));
        if (message != null)
        {
            var obj = message.GetBody<SearchResult>();
            jsonTweet = jsonSerializer.Serialize(obj);
        }
    }
    catch (Exception)
    {
        // Client exited while we were still processing messages
        return String.Empty;
    }
    return jsonTweet;
}

Stream Insight is configured in a method called StartStreamInsight. The project includes a Stream Insight adapter that reads events from the Service Bus Topic, which allows SI to calculate data on the events we pass in. The query it executes is the LINQ query below, which tells it to calculate how many times each hash tag has been tweeted in the last 60 seconds and to recalculate that value every second. Very powerful!

Stream Insight LINQ Query

[sourcecode language="CSharp"]
// Create the query template from the event stream
var result = from e in inputStream.Split(e => e.Text, "#")
             group e by e.ToLower() into g
             from win in g.HoppingWindow(TimeSpan.FromSeconds(60), TimeSpan.FromSeconds(1))
             select new SearchResultsData
             {
                 Count = win.Count(),
                 Key = g.Key
             };

Finally, the method to send the data back to the client over the web socket.

Using socket.SendAsync to send data over the web socket.

[sourcecode language="CSharp"]
public async Task SendMessage(string message)
{
    if (socket.State == WebSocketState.Open)
    {
        ArraySegment<byte> outputBuffer = new ArraySegment<byte>(Encoding.UTF8.GetBytes(message));
        await socket.SendAsync(outputBuffer, WebSocketMessageType.Text, true, CancellationToken.None);
    }
}

If you would like to try this demo out for yourself, I've uploaded the TwitterFeed project (Dev 10) and the Web Sockets demo (Dev 11). Since this project uses WebSockets it requires IIS 8; it was tested with the Windows Server 8 Developer Preview along with the Visual Studio Developer Preview, both of which can be downloaded from MSDN.

You will also need to install Stream Insight from http://www.microsoft.com/download/en/details.aspx?id=26720; download and install 1033\x64\StreamInsight.msi. Use all of the defaults and "StreamInsightInstance" as the instance name.

Managing Log Files with Windows Azure PowerShell Cmdlets 2.0

This is part 5 of a series.

In this example I’m going to show you how you can manage log data from your Windows Azure Application using new functionality in the Windows Azure PowerShell Cmdlets 2.0.

Just as in the other articles you will need to add the PowerShell Snapin (or module):

  Add-PsSnapin WAPPSCmdlets

As in the earlier parts of this series, I have this handy initialization block and helper function, GetDiagRoles:

Initialization and Helper Function

 
 $storageAccount = "YourStorageAccountName"
 $storageKey = "YourStorageAccountKey"
 $deploymentSlot = "Production"
 $serviceName = "YourHostedService"
 $subscriptionId = "YourSubscriptionId"
 # Thumbprint for your cert goes below
 $cert = Get-Item cert:\CurrentUser\My\D7BECD4D63EBAF86023BB4F1A5FBF5C2C924902A


 function GetDiagRoles {
   Get-HostedService -ServiceName $serviceName -SubscriptionId $subscriptionId -Certificate $cert | `
   Get-Deployment -Slot $deploymentSlot | `
   Get-DiagnosticAwareRoles -StorageAccountName $storageAccount -StorageAccountKey $storageKey
 }

The tricky part of managing log files remotely in Windows Azure is determining where on the file system the log files actually are.

I’ve written some samples that show some techniques you can use from PowerShell for determining file locations.

When determining the local path for a named local resource I’m using guidance from MSDN: http://msdn.microsoft.com/en-us/library/ee758708.aspx.

For the rest of the log paths I am essentially just relying on observations of how the paths are constructed. There is no guarantee this format will not change, but for the time being it doesn’t look too likely.

Returns the Local Path for IIS Logs

function GetIISLogsPath()
{
 $input | foreach {
  $path = "C:\Resources\Directory\" + $_.DeploymentId + "." + $_.RoleName + ".DiagnosticStore\LogFiles"
  return $path
 }
}

Returns the Local Path for IIS Failed Request Logs

function GetIISFailedRequestPath()
{
 $input | foreach {
  $path = "C:\Resources\Directory\" + $_.DeploymentId + "." + $_.RoleName + ".DiagnosticStore\FailedReqLogFiles"
  return $path
 }
}

Returns the Local Path for Crash Dumps

function GetCrashDumpsPath()
{
 $input | foreach {
  $path = "C:\Resources\Directory\" + $_.DeploymentId + "." + $_.RoleName + ".DiagnosticStore\Crashdumps"
  return $path
 }
}

Returns the Local Path for a Named Local Resource

function GetLocalResourcePath($localResource)
{
 $input | foreach {
  $path = "C:\Resources\Directory\" + $_.DeploymentId + "." + $_.RoleName + "." + $localResource
  return $path
 }
}

Now that we have some helper functions to figure out where our logs are in our Windows Azure deployment, we need a method of modifying the logging data sources in Windows Azure Diagnostics.
The function below does just that. It takes the local path of the files you want to transfer, the container in Windows Azure Storage to transfer the files to, a quota (which can be set to 0), and how often, in minutes, to transfer that directory to storage.

Configures a local directory in a Windows Azure role to be transferred into Windows Azure Storage.

function SetDirectory($path, $container, $quota, $transferAllInMinutes)
{
  $input | foreach { 
    $role = $_ 
    $role | Get-DiagnosticAwareRoleInstances  |
      foreach {
          $roleinstance = $_
          $diagConfig = $role | Get-DiagnosticConfiguration -InstanceId $roleinstance -BufferName Directories
          $existingDirectory = $false
          $newDirectoryConfig = $null
          for($i=0;$i -lt $diagConfig.DataSources.Count; $i++)
          {
             $tmpDiag = $diagConfig.DataSources[$i]
             if($tmpDiag.Path -eq $path)
             {
                $existingDirectory = $true
                $newDirectoryConfig = $diagConfig.DataSources[$i]
                break
             }
          }
 
          # doesn't already exist in the collection so create a new one
          if($existingDirectory -eq $false) 
          {
            $newDirectoryConfig = New-Object -TypeName Microsoft.WindowsAzure.Diagnostics.DirectoryConfiguration
          }
          $newDirectoryConfig.Container = $container
          $newDirectoryConfig.Path = $path
          $newDirectoryConfig.DirectoryQuotaInMB = $quota
          if($existingDirectory -eq $false)
          {
            $diagConfig.DataSources.Add($newDirectoryConfig)
          }
 
          $role | Set-FileBasedLog -DirectoriesConfiguration $diagConfig.DataSources -BufferQuotaInMB $quota -TransferPeriod $transferAllInMinutes
        }
    }
}

Now for the step of actually configuring your role using all of this code.

The snippet below configures IIS logs, IIS failed request logs and a local resource named “customLogging” to be transferred to storage every 15 minutes.

Configuring Logs to be Transferred

GetDiagRoles | foreach {
  $CurrentRole = $_

  if($CurrentRole.RoleName -eq "MyWebRole")
  {
   # Retrieve the local paths in the role
   $iislogs = $CurrentRole | GetIISLogsPath
   $iisFailedLogs = $CurrentRole | GetIISFailedRequestPath
   $customLogs = $CurrentRole | GetLocalResourcePath "customLogging"

   # Configure diagnostics to transfer log data to storage
   $CurrentRole | SetDirectory $iisFailedLogs "wad-iis-failedreqlogfiles" 0 15
   $CurrentRole | SetDirectory $iislogs "wad-iis-logfiles" 0 15
   $CurrentRole | SetDirectory $customLogs "wad-customlogging" 0 15
  }
}

The snippet below downloads the files from each container to the local file system.

Downloading the Data from Storage

Save-Container -ContainerName "wad-iis-logfiles" -LocalPath "c:\diagdata" `
			-StorageAccountKey $storagekey -StorageAccountName $storageAccount

Save-Container -ContainerName "wad-iis-failedreqlogfiles" -LocalPath "c:\diagdata" `
			-StorageAccountKey $storagekey -StorageAccountName $storageAccount

Save-Container -ContainerName "wad-customlogging" -LocalPath "c:\diagdata" `
			-StorageAccountKey $storagekey -StorageAccountName $storageAccount

Finally, to clean up storage from each container we provide you with the Clear-Container cmdlet.

Note: Currently, we do not provide -From or -To parameters for Clear-Container. To ensure that you do not delete data that is still being logged, make sure your scheduled transfers do not overlap with the time period of a deletion. Yes, we are looking into improving this scenario.

Cleaning up a Container

Clear-Container -ContainerName "wad-iis-logfiles" `
			-StorageAccountKey $storagekey -StorageAccountName $storageAccount

Clear-Container -ContainerName "wad-iis-failedreqlogfiles" `
			-StorageAccountKey $storagekey -StorageAccountName $storageAccount

Clear-Container -ContainerName "wad-customlogging" `
			-StorageAccountKey $storagekey -StorageAccountName $storageAccount

Annoying error using System.DirectoryServices and IIS App Pools

If you get the following error when manipulating IIS with System.DirectoryServices

System.Runtime.InteropServices.COMException was unhandled
Message=Unknown error (0x80005000)
Source=System.DirectoryServices
ErrorCode=-2147463168
StackTrace:
at System.DirectoryServices.DirectoryEntry.Bind(Boolean throwIfFail)
at System.DirectoryServices.DirectoryEntry.Bind()
at System.DirectoryServices.DirectoryEntry.get_IsContainer()
at System.DirectoryServices.DirectoryEntries.CheckIsContainer()
at System.DirectoryServices.DirectoryEntries.Find(String name, String schemaClassName)

Don’t forget to install the following role features -> IIS 6 Management Compatibility -> IIS 6 Metabase Compatibility.
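On Server 2008 R2 you can add those features from PowerShell as well; a minimal sketch (confirm the feature names for your OS with Get-WindowsFeature):

Import-Module ServerManager
Add-WindowsFeature Web-Metabase, Web-Lgcy-Mgmt-Console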

I completely forgot about it and wasted a good hour on it.