Category Archives: Visual Studio 2010

Deployment to Windows Azure Fails with Profiling Enabled

With the new 1.4 tools, if you try to deploy a project that has the profiler enabled and a VM Role as one of your Azure roles, you will receive an error on deployment:

11:04:55 AM – HTTP Status Code: 400
Error Message: One or more configuration settings are specified for this deployment configuration but are not defined in the service definition file: MortgageRatesPDFService:Profiling.ProfilingConnectionString, MortgageRatesPDFService:CloudToolsDiagnosticAgentVersion.
Operation Id: f7423416-9f32-4a9f-9c3f-9766d859d2bf

The workaround is to add the missing settings to your VM Role's .cscfg and .csdef files (you don't need an actual connection string):

.CSCFG
[sourcecode language="xml"]
<Setting name="Profiling.ProfilingConnectionString" value="SomeBogusValue"/>
<Setting name="CloudToolsDiagnosticAgentVersion" value="1.4"/>
[/sourcecode]

.CSDEF
[sourcecode language="xml"]
<ConfigurationSettings>
  <Setting name="CloudToolsDiagnosticAgentVersion"/>
  <Setting name="Profiling.ProfilingConnectionString"/>
</ConfigurationSettings>
[/sourcecode]

This should allow the deployment to proceed. Happy profiling!

Profiling Windows Azure Applications using the August 2011 Release of the Visual Studio Tools (1.4)

For a recap of what is new in this release see my earlier post: What’s new in Windows Azure VS.NET Tools 1.4

Note: The tools can be downloaded here using the Web Platform Installer.

The profiler in Visual Studio is very powerful and extremely useful for tracking down inefficiencies and memory leaks in your application. With the 1.4 release of the Visual Studio Tools for Windows Azure, profiling support has been added that allows you to easily profile your applications and track down these types of hard-to-find problems while they are running live in the cloud.

For this article I’ll use a simple but common performance problem: string concatenation.

I have three functions.

GenerateSomeData() just creates a new Guid and returns it as a string.

[sourcecode language="csharp"]
private string GenerateSomeData()
{
    Guid g = Guid.NewGuid();
    return g.ToString();
}
[/sourcecode]

DoVeryInefficientStringCopying() calls GenerateSomeData() 1000 times and concatenates the results into a string. Repeated string concatenation is expensive because strings in C# are immutable (you can't change a string once it is created), so each concatenation actually creates a brand new string containing the previous and new values combined.

[sourcecode language="csharp"]
private string DoVeryInefficientStringCopying()
{
    String tmpString = String.Empty;
    for (int i = 0; i < 1000; i++)
        tmpString += GenerateSomeData();
    return tmpString;
}
[/sourcecode]

DoMoreEfficientStringCopying() accomplishes the same goal, except it uses StringBuilder to append each piece instead of the string class's += operator. The difference is that StringBuilder grows an internal buffer instead of constantly creating new strings and copying memory.

[sourcecode language="csharp"]
private string DoMoreEfficientStringCopying()
{
    System.Text.StringBuilder tmpSB = new System.Text.StringBuilder();
    for (int i = 0; i < 1000; i++)
        tmpSB.Append(GenerateSomeData());
    return tmpSB.ToString();
}
[/sourcecode]

I'm adding this code to a worker role that I will deploy to Azure with profiling enabled.

Here is the worker role code:

[sourcecode language="csharp"]
public override void Run()
{
    // This is a sample worker implementation. Replace with your logic.
    Trace.WriteLine("WorkerRole1 entry point called", "Information");

    while (true)
    {
        // Use the profiler to see which method is more efficient.
        DoVeryInefficientStringCopying();
        DoMoreEfficientStringCopying();

        Thread.Sleep(10000);
        Trace.WriteLine("Working", "Information");
    }
}
[/sourcecode]

For profiling this simple problem I'm going to choose CPU Sampling:

[Screenshot: selecting CPU Sampling in the profiling settings]

Once the application is deployed and the scenario you are profiling has been reproduced, you can analyze the profiling data.

The first step is to configure your symbol paths. Within Visual Studio, click Tools -> Options -> Debugging -> Symbols. I have a debug build and I'm not calling any additional libraries, so I don't need to set anything here; but if you have components with .pdb files outside of this project, you could reference them here, as well as the Microsoft symbol servers.

[Screenshot: symbol settings under Tools -> Options -> Debugging]

To analyze the profiling report, expand the worker role in Server Explorer and select "View Profiling Report":

[Screenshot: the "View Profiling Report" option]

The initial screen shows my overall CPU usage, which isn't much considering my code is sleeping every 10 seconds. It also highlights a "hot path", which is the most expensive code path captured during the profiling session. Unsurprisingly, it is in the DoVeryInefficientStringCopying() method.

[Screenshot: profiling report summary with the hot path highlighted]

Clicking on the WorkerRole.Run() method drills in to where I can actually compare and contrast the two methods:

[Screenshot: function details view for WorkerRole.Run()]

 

It's easy to see that 94.6% of the time was spent in the DoVeryInefficientStringCopying() method compared to the more efficient version.

If I then click on the DoVeryInefficientStringCopying() method in the top window, I can drill in further, which gets us to the += operation causing all of the trouble.

[Screenshot: drilling into DoVeryInefficientStringCopying() and the += operation]

You can also switch to other views, such as a table layout of all the different functions, to compare and contrast how much time was spent in each function.

[Screenshot: functions table view]

For more information on analyzing the report data using Visual Studio 2010 Profiler see the following link: http://msdn.microsoft.com/en-us/library/ms182389.aspx.

Visual Studio Tools for Windows Azure 1.4 (August 2011)

Greetings! There is a new release of the Visual Studio Tools for Windows Azure (1.4). The tools can be downloaded here using the Web Platform Installer.

One of the first things you will notice after creating a new cloud project with the new tools is that there are now multiple configuration files: one for running in the cloud and another for running in the local developer fabric.

[Screenshot: Solution Explorer showing the cloud and local configuration files]

 

The usefulness of this becomes more apparent when you view the greatly enhanced packaging and publishing dialogs. There is now a dropdown that allows you to select which configuration to use.

[Screenshot: packaging dialog with the configuration dropdown]

[Screenshot: publishing dialog with the configuration dropdown]

You are also not limited to using just these two configuration files. Right-clicking on your cloud project gives you a new context menu option, “Manage Configurations”.

[Screenshot: the “Manage Configurations” context menu option]

From there you can copy and rename an existing configuration to add multiple alternate configuration files.

[Screenshot: creating a copy of a service configuration]

Another significant improvement in this release is the inclusion of an ASP.NET MVC 3 web role. The new template includes the new universal ASP.NET providers that support SQL Azure, and it will also make sure that the MVC assemblies are deployed with your application when you publish to Windows Azure.

[Screenshot: the ASP.NET MVC 3 web role template]
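For reference, the membership portion of that universal provider registration ends up looking roughly like the following in web.config. This is a sketch rather than the exact markup the template emits, and "DefaultConnection" is a placeholder connection string name:

[sourcecode language="xml"]
<!-- Rough sketch of the universal membership provider registration (System.Web.Providers).
     "DefaultConnection" is a placeholder connection string name pointing at SQL Azure. -->
<membership defaultProvider="DefaultMembershipProvider">
  <providers>
    <clear />
    <add name="DefaultMembershipProvider"
         type="System.Web.Providers.DefaultMembershipProvider, System.Web.Providers"
         connectionStringName="DefaultConnection"
         applicationName="/" />
  </providers>
</membership>
[/sourcecode]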

 

Finally, another major addition to the publishing dialog is the “Enable Profiling” checkbox. Clicking the settings link to the right reveals what profiler options are available.

[Screenshot: the “Enable Profiling” checkbox and profiler settings in the publish dialog]

I’ll cover using the profiler in the next post!

To read more about the 1.4 release including new support for package validation take a look at the MSDN What’s New Article.

COM+ and Serviced Components within Windows Azure

This post is the first in a series that discuss running COM+/Serviced Components within Windows Azure.

The primary reason for even bringing up this topic is that COM+/MTS has been around for a VERY long time, and we are well aware that companies have a lot of existing code investments that they do not want to just “throw away”. Having said that, it is important to point out that Windows Azure is very different from the typical environment that most COM+ applications were originally designed for.

Data Access

The number one thing to consider when moving COM+ to the cloud is your data. Where will it live? Moving your data to the cloud with SQL Azure will likely be your first choice. Consider, though, that SQL Azure does not currently support distributed transactions (For Reference). Many people used COM+ specifically for handling distributed transactions, so this will require some careful thought as to whether you should change your transaction model from [AutoComplete] or SetComplete/SetAbort to using SqlTransactions, plus some manual transaction handling for other resources (a rough sketch of that change is below). Other options are hybrid models where the data access is handled on premises and accessed through Windows Azure Connect or the Windows Azure AppFabric Service Bus.
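To make that concrete, here is a minimal sketch (not code from an actual migration) of a method that manages its own SqlTransaction against SQL Azure instead of relying on a declarative [AutoComplete] COM+ transaction. The class, table, and column names are placeholders:

[sourcecode language="csharp"]
using System.Data.SqlClient;

public class OrderRepository
{
    // Placeholder connection string pointing at a SQL Azure database.
    private readonly string connectionString;

    public OrderRepository(string connectionString)
    {
        this.connectionString = connectionString;
    }

    public void MarkOrderShipped(int orderId)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();

            using (SqlTransaction tx = conn.BeginTransaction())
            {
                try
                {
                    using (SqlCommand cmd = new SqlCommand(
                        "UPDATE Orders SET Status = 'Shipped' WHERE OrderId = @id", conn, tx))
                    {
                        cmd.Parameters.AddWithValue("@id", orderId);
                        cmd.ExecuteNonQuery();
                    }

                    tx.Commit();   // takes the place of ContextUtil.SetComplete()
                }
                catch
                {
                    tx.Rollback(); // takes the place of ContextUtil.SetAbort()
                    throw;
                }
            }
        }
    }
}
[/sourcecode]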

Authentication

In Windows Azure there is no Active Directory by default. So if you are currently performing role-based authorization in COM+, you will want to look into using Windows Azure Connect to domain-join the roles that are hosting your COM+ package.

Deployment

Deploying COM+ objects to the cloud can be tricky. At a minimum you will be deploying an .msi file if you are deploying a native COM+ object. If you are deploying managed code you will likely be scripting out regsvcs.exe against your assembly (one way to do that is sketched below). If you are deploying remotely to a worker role and calling from a web tier you will be deploying the application proxy AND the COM+ object, not to mention there will be some firewall configuration needed. I’ve written a PowerShell script that makes this easier that I will post shortly. I’m 100% positive it will not cover all scenarios but it might be a good start.
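As an illustration only (the PowerShell script mentioned above is the fuller treatment), an elevated startup task in the service definition is one way to run regsvcs.exe when the role starts. The file names here are placeholders:

[sourcecode language="xml"]
<!-- ServiceDefinition.csdef: run an elevated startup task that registers the serviced component.
     RegisterComPlus.cmd and MyServicedComponents.dll are placeholder names. -->
<Startup>
  <Task commandLine="RegisterComPlus.cmd" executionContext="elevated" taskType="simple" />
</Startup>
[/sourcecode]

RegisterComPlus.cmd would then call something along the lines of %WINDIR%\Microsoft.NET\Framework64\v4.0.30319\regsvcs.exe MyServicedComponents.dll and exit with 0 so the role can continue starting.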

Deploying a ServicedComponent in Windows Azure

Topology

In an enterprise environment it is pretty easy to code against a remote object because you know ahead of time what the server name or IP address will be. In the cloud this is not the case: instances come and go, so changing your client code to dynamically discover where your objects live will be important.
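A minimal sketch of what that discovery could look like using the service runtime API, assuming a worker role named "ComPlusHostRole" that exposes an internal endpoint named "DcomEndpoint" (both placeholder names):

[sourcecode language="csharp"]
using System.Net;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class ComPlusHostLocator
{
    // Returns the endpoint of the first running instance of the role hosting the
    // COM+ package, instead of relying on a hard-coded server name or IP address.
    public static IPEndPoint FindComPlusHost()
    {
        Role hostRole = RoleEnvironment.Roles["ComPlusHostRole"];

        foreach (RoleInstance instance in hostRole.Instances)
        {
            return instance.InstanceEndpoints["DcomEndpoint"].IPEndpoint;
        }

        return null; // no instances of the host role are currently running
    }
}
[/sourcecode]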

Queued Components

MSMQ is not currently supported in Windows Azure due to the lack of durable disks, so queued components are another scenario that is currently not suitable to run in the cloud. Alternatives for queued components are Windows Azure Queues and AppFabric Service Bus queues; depending on the level of functionality you need, one of them should suit your purpose.
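For the Windows Azure Queues option, the pattern is to enqueue a message describing the work and let a worker role pick it up later, which is similar in spirit to a queued component call. A minimal sketch using the StorageClient library, with a placeholder queue name and message format:

[sourcecode language="csharp"]
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public static class OrderQueue
{
    // Enqueue a unit of work for a worker role to process asynchronously.
    public static void EnqueueOrder(string storageConnectionString, int orderId)
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
        CloudQueueClient queueClient = account.CreateCloudQueueClient();

        CloudQueue queue = queueClient.GetQueueReference("orders");
        queue.CreateIfNotExist();

        queue.AddMessage(new CloudQueueMessage("ProcessOrder:" + orderId));
    }
}
[/sourcecode]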

Irritating error trying to use CreateChildControls in a user control.

I was trying to programmatically add an ObjectDataSource in my CreateChildControls method:

[sourcecode language="csharp"]
ObjectDataSource ods = new ObjectDataSource();
ods.SelectMethod = "GetDataTable";
ods.TypeName = "SPWebParts.LeadViewer.LeadViewerUserControl";
ods.ID = "leadsDS";
Controls.Add(ods);
[/sourcecode]

This SHOULD work! However, every time I would get the error:

“The control collection cannot be modified during DataBind, Init, Load, PreRender or Unload phases.”

No idea why. However, the quick solution is just to instantiate the ObjectDataSource declaratively instead of in CreateChildControls.

[sourcecode language="xml"]
<asp:ObjectDataSource ID="leadsDS" runat="server" SelectMethod="GetDataTable"
    TypeName="SPWebParts.LeadViewer.LeadViewerUserControl">
</asp:ObjectDataSource>
[/sourcecode]

Problem solved!

Visual Studio 2010 – ContentPlaceHolder error with MasterPages

A few days into using the RTM release of Visual Studio 2010 I ran into a quirky bug with my aspx pages when flipping to design view.

Not on all pages, but on some I would get: "The page has one or more <asp:Content> controls that do not correspond with <asp:ContentPlaceHolder> controls in the Master Page."

I spent a few minutes verifying that my ContentPlaceHolder controls did indeed match up. Not wanting to spend more time on it, I ignored it because I don't usually use design view for much anyway.

However, later I opened an aspx page and design view worked fine. Hmm. So I then spent a few minutes figuring out the Visual Studio voodoo that made it work this time and fail before.

The difference was that I had the actual MasterPage open in design view. Don't ask me why!

So, as a “workaround”: if you are experiencing this problem and it is not because your ContentPlaceHolder controls really are mismatched, open the MasterPage in design view first, THEN open your aspx page and flip to design view.

Obvious plug for business

P.S. If you know anyone who is looking for Homes in Frisco Texas or Homes in McKinney Texas, let us know.
We also have a Houston site if you are moving to Cypress Texas or anywhere around Houston and need a Houston REALTOR.