Thank you Outlook for this informative message…

image

‘nuf said.

Defrag in 7

[Continuing on my rant of new little features I’m discovering in Windows 7…]

I’m quite sure I wasn’t the only one who was disappointed the first time I ran defrag in Vista to see they had taken my precious UI away from me…

Sure, I agree that the user doesn’t really need to know what is happening under the covers, just that the file system is being taken care of. I also agree that users should be able to easily schedule defrag as part of a maintenance routine. However, in simplifying the UI they took away the user’s ability to determine the status of the process. This, in my eyes, was the major issue, and it’s why I started using Auslogics Disk Defrag, which does a (speedy and) great job.

From the new look of defrag in 7, it seems they heard the various cries of outrage… They’ve given us a new UI that allows us to configure a schedule AND watch the status of a currently running defrag. They’ve also provided an Analyse button that determines the fragmentation of the drive.

image 

A step back in the right direction here…

Some SDS basics

I just thought I’d share some basic notes on SQL Data Services that I took while reading through a few pages on MSDN and doing a few of the labs. Head to SQL Data Services (SDS) on MSDN for all the documentation, or hit the SDS portal for the SDK and heaps more.

  • Authority – the unique name of the service. This is how the service is referenced, e.g. an authority named ducas-authority would be represented by the DNS name ducas-authority.data.database.windows.net. Authority names can only contain lowercase letters (a-z), numbers (0-9) and hyphens.

  • Container – a store for data/entities. These are “buckets” of objects that can be used without worrying about any schema. In this release cross-container queries are not supported.

  • Entity – an object with user-defined properties and values that is stored inside a container. Entities can be blob or non-blob, both of which have metadata properties. A non-blob entity will have Id, Version and Kind properties, and a blob entity will also have the Content property. Non-blob entities have flexible properties that are of the scalar types string, binary, boolean, decimal and datetime.

  • URI Space – when using REST for SDS, you deal with Service, Authority, Container and Entity URIs. Querying the URIs will return POX that describes the results.
    • The Service URI that allows you to create and query your Authorities is https://data.database.windows.net/v1.
    • The Authority URI depends on your authority name and can be used to create or query for containers – https://<authority-name>.data.database.windows.net/v1.
    • The Container URI allows you to create and query entities in the container – https://<authority-name>.data.database.windows.net/v1/<container-name>.
    • The Entity URI allows you to retrieve, update and delete an entity – https://<authority-name>.data.database.windows.net/v1/<container-name>/<entity-id>.

  • Queries – iterate over a scope and return a POX representation of the results. The general syntax for querying entities is “from e in entities [where condition] [orderby property1] select e”. Flexible properties are stored in a property collection and must be queried using a string indexer (e.g. e[“Age”] == 32), whereas metadata properties are on the object (e.g. e.Id == “someId”).
    • Using SOAP, the authority scope must be defined. Use the proxy’s Get method to get an entity or its Query method to query the scope.
    • Using REST, the URI must be well formed for the scope of the query. The actual query is appended to the URI, e.g. https://myAuth.data.database.windows.net/v1/c1?q=’from e in entities select e’.
    • The default page size of a result set is 500 entities.
    • Take can be used to specify the number of results to return (as long as it’s less than 500) – (from e in entities orderby e[“Author”], e.Id select e).Take(10)
    • OfKind can be used as a shortcut for specifying the Kind of an entity – from o in entities where o.Kind == “Order” is the same as from o in entities.OfKind(“Order”)
    • Joins can be performed on entities in the same container by specifying multiple from clauses – from c in entities.OfKind(“Customer”) where c[“Name”] == “Customer 1” from o in entities.OfKind(“Order”) where o[“CustomerId”] == c.Id select o
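
The URI patterns above are regular enough to capture in a tiny helper. This is just my own illustration – the SdsUris class and its method names are hypothetical, not part of the SDS SDK; only the URI shapes come from the documentation:

```csharp
using System;

// Hypothetical helper that builds the SDS URI space described above.
// The class and method names are mine; the URI shapes are from the docs.
static class SdsUris
{
    // The Service URI is fixed; everything else hangs off the authority name.
    public const string Service = "https://data.database.windows.net/v1";

    public static string Authority(string authorityName)
    {
        return string.Format("https://{0}.data.database.windows.net/v1", authorityName);
    }

    public static string Container(string authorityName, string containerName)
    {
        return Authority(authorityName) + "/" + containerName;
    }

    public static string Entity(string authorityName, string containerName, string entityId)
    {
        return Container(authorityName, containerName) + "/" + entityId;
    }
}
```

So SdsUris.Entity("ducas-authority", "c1", "e1") gives https://ducas-authority.data.database.windows.net/v1/c1/e1, which matches the Entity URI pattern above.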

Disk Cleanup in 7

One of the most useful tools for cleaning some fairly chunky temporary files in Vista is the built-in Disk Cleanup tool. This has made it to 7 with a couple of changes.

First, you no longer get the choice between cleaning up your own files or all users’ files when the tool starts. Instead, after you choose the drive, there is a button for cleaning up system files, much like the button for viewing all processes in Task Manager.

image

When you hit the button, it restarts the tool as an admin.

Now, if only I could figure out what this set of files is then I could make an educated decision to delete them…

image

Oh well… delete! 🙂

Hello Windows 7…

The best thing about Personal Development (PD) time is the ability to play with new things and call it "work". One of the things I thought I’d do while "working" today is install Windows 7 on my Dell m1330.

So far it’s pretty much smooth sailing… The install was very straightforward. Most of my drivers were installed successfully by Windows Setup or Windows Update. I’ve got Aero working and have enabled the Superbar. My WEI is a bit low at 3.0, but this is most probably because the drivers for my SSD aren’t quite up to scratch.

image

Before I formatted my disk, I used the Windows Vista Backup and Recovery tool to back up my entire computer to an external hard drive. The good thing about this is that it uses Virtual PC’s Virtual Hard Disk (VHD) format, which you can then mount as a physical drive in Windows 7. I’m still waiting on the ability to boot from it though…

image

One interesting feature I stumbled upon when installing some extra drivers is the "Troubleshoot Compatibility" option in the right-click menu for applications in Explorer. Here’s a scenario… Say you have an installer that works fine on Windows Vista or XP and you want to use it on 7. You double-click it and receive the new and improved UAC prompt.

image

You hit Yes, but somewhere down the line something goes wrong…

image

Now you can right-click the install and hit Troubleshoot Compatibility.

image

You will be asked what’s actually wrong.

image

If you claim it used to work, then you’ll be asked which operating system it worked on.

image

You’ll confirm what you just did.

image

Windows will try to resolve the problem and run the installer again.

image

Whether the installer succeeds or fails, you can tell Windows to save the settings you just used to run the installer, try again using different settings, or report the problem.

image

This obviously doesn’t just apply to installers. It can be used on all applications.

I thought this was a pretty darn good improvement on the previously hidden screen in the application properties that achieved much the same effect.

Discovering Search Terms

More trawling through old code I had written brought this one to the surface. One of the requirements of the system I’m working on was to intercept a 404 (Page Not Found) response and determine whether the referrer was a search engine (e.g. Google), so we could redirect to a search page with the search term. Intercepting the 404 was quite easily done with an HTTP module…

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.RegularExpressions;
using System.Web;

namespace DemoApplication
{
    public class SearchEngineRedirectModule : IHttpModule
    {
        HttpApplication _context;

        public void Dispose()
        {
            if (_context != null)
                _context.EndRequest -= new EventHandler(_context_EndRequest);
        }

        public void Init(HttpApplication context)
        {
            _context = context;
            _context.EndRequest += new EventHandler(_context_EndRequest);
        }

        void _context_EndRequest(object sender, EventArgs e)
        {
            string searchTerm = null;
            // Only redirect when the response is a 404 and the referrer yields a search term.
            if (HttpContext.Current.Response.StatusCode == 404
                && (searchTerm = DiscoverSearchTerm(HttpContext.Current.Request.UrlReferrer)) != null)
            {
                HttpContext.Current.Response.Redirect(
                    "~/Search.aspx?q=" + HttpUtility.UrlEncode(searchTerm));
            }
        }

        public string DiscoverSearchTerm(Uri url)
        {
            …
        }
    }
}

Implementing DiscoverSearchTerm isn’t that difficult either. We just have to analyse search engine statistics to see which ones are most popular and analyse the URL produced when performing a search. Luckily for us, most are quite similar in that they use a very simple format that has the search term as a parameter in the query string. The search engines I analysed included live, msn, yahoo, aol, google and ask. The search term parameter of these engines was either named “p”, “q” or “query”.

Now, all we have to do is filter for all the requests that came from a search engine, find the search term parameter and return its value…

public string DiscoverSearchTerm(Uri url)
{
    string searchTerm = null;
    // Match the hosts of the engines we analysed. The dots are escaped so they
    // match literal dots rather than any character.
    var engine = new Regex(@"(search\.(live|msn|yahoo|aol)\.com)|(google\.(com|ca|de|co\.(nz|uk)))|(ask\.com)");
    if (url != null && engine.IsMatch(url.Host))
    {
        var queryString = url.Query;
        // Remove the question mark from the front and add an ampersand to the end for pattern matching.
        if (queryString.StartsWith("?")) queryString = queryString.Substring(1);
        if (!queryString.EndsWith("&")) queryString += "&";
        var queryValues = new Dictionary<string, string>();
        var r = new Regex(
            @"(?<name>[^=&]+)=(?<value>[^&]+)&",
            RegexOptions.IgnoreCase | RegexOptions.Compiled
            );
        string[] queryParams = { "q", "p", "query" };
        foreach (var match in r.Matches(queryString))
        {
            var param = ((Match)match).Result("${name}");
            if (queryParams.Contains(param))
                // Use the indexer rather than Add so a repeated parameter doesn't throw.
                queryValues[param] = ((Match)match).Result("${value}");
        }
        if (queryValues.Count > 0)
            searchTerm = queryValues.Values.First();
    }
    return searchTerm;
}

The above code uses two regular expressions, one to filter for a search engine and the other to separate the query string. Once it’s decided that the URL is a search engine’s, it creates a collection of query string parameters that could be search parameters and returns the first one.

Unfortunately, there wasn’t enough time in the iteration for me to properly match each search engine with its correct query parameter, but as the search parameter most commonly appears early in the query string, it’s fairly safe to assume that the first match is correct.
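
As a quick sanity check, here’s what the method returns for a couple of referrers (the URLs are just illustrative examples I made up):

```csharp
var module = new SearchEngineRedirectModule();

// A Google referrer: the host matches the engine regex and "q" holds the term.
var term = module.DiscoverSearchTerm(new Uri("http://www.google.com/search?q=widgets"));
// term == "widgets"

// A non-search referrer: the host doesn't match, so we get null and no redirect.
var none = module.DiscoverSearchTerm(new Uri("http://example.com/page?q=widgets"));
// none == null
```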

Randomly Sorting a List using Extension Methods

I was trawling through some old code I had written while doing some “refactoring” and came across this little nugget. I wanted to sort a list of objects that I was retrieving from a database using LINQ to SQL into a random order. Seeing as extension methods are all the rage, I decided to use them…

using System;
using System.Collections.Generic;
using System.Linq;

public static class ListExtensions
{
    public static IEnumerable<T> Randomise<T>(this IEnumerable<T> list)
    {
        var rand = new Random();
        // Sorting by a random key shuffles the sequence.
        return list.OrderBy(l => rand.Next());
    }
}

How does it work…? It adds the Randomise() extension method to any IEnumerable<T> (e.g. List<T>) and uses the OrderBy function to change the sort order based on a randomly generated number.

var randomCategories = context.Categories.Randomise();

The above code will execute the Randomise function to reorder the list of Category objects retrieved from the context randomly and assign the result to randomCategories.

Setting the Test Run Config in Team Build

At my current client, we’re in a situation where a couple of us have Visual Studio 2008 Team System and the rest have Professional Edition. This means we’ve been having a hard time getting Code Coverage into our team build, because everyone has been changing the active test configuration to suit their environment.

After trawling through a few blog posts and support forums, I finally discovered a gold nugget. In a couple of steps, I defined the test configuration and got it running as part of the team build.

First, I added the following test arguments to my project build file (TFSBuild.proj):

<MetaDataFile Include="$(SolutionRoot)/HelloWorld.vsmdi">
  <TestList>HelloWorldUnitTests</TestList>
  <RunConfigFile>$(SolutionRoot)/TeamBuildTestRun.testrunconfig</RunConfigFile>
</MetaDataFile>

This defines the test metadata file (HelloWorld.vsmdi), the list of tests to execute (HelloWorldUnitTests) and the configuration file to use (TeamBuildTestRun.testrunconfig). However, only the metadata file and test list will be passed to the MSTest command. To get it all working, we have to edit the targets file on the build server at C:\Program Files\MSBuild\Microsoft\VisualStudio\Team\Microsoft.TeamFoundation.Build.targets. There is a target called CoreTestConfiguration that calls the TestToolsTask MSBuild task with these parameters. The first three calls are for non-desktop (i.e. server) builds, e.g.

<TestToolsTask
      Condition=" '$(IsDesktopBuild)'!='true'
                  and '$(V8TestToolsTask)'!='true'
                  and '%(LocalMetaDataFile.Identity)' != '' "
      BuildFlavor="$(Configuration)"
      Platform="$(Platform)"
      PublishServer="$(TeamFoundationServerUrl)"
      PublishBuild="$(BuildNumber)"
      SearchPathRoot="$(OutDir)"
      PathToResultsFilesRoot="$(TestResultsRoot)"
      MetaDataFile="%(LocalMetaDataFile.Identity)"
      RunConfigFile="%(RunConfigFile)"
      TestLists="%(LocalMetaDataFile.TestList)"
      TeamProject="$(TeamProject)"
      TestNames="$(TestNames)"
      ContinueOnError="true" />

When the build is run on the server, the values of the MetaDataFile property are copied to the LocalMetaDataFile variable. This means the RunConfigFile property needs to be changed to %(LocalMetaDataFile.RunConfigFile), e.g.

<TestToolsTask
      Condition=" '$(IsDesktopBuild)'!='true'
                  and '$(V8TestToolsTask)'!='true'
                  and '%(LocalMetaDataFile.Identity)' != '' "
      BuildFlavor="$(Configuration)"
      Platform="$(Platform)"
      PublishServer="$(TeamFoundationServerUrl)"
      PublishBuild="$(BuildNumber)"
      SearchPathRoot="$(OutDir)"
      PathToResultsFilesRoot="$(TestResultsRoot)"
      MetaDataFile="%(LocalMetaDataFile.Identity)"
      RunConfigFile="%(LocalMetaDataFile.RunConfigFile)"
      TestLists="%(LocalMetaDataFile.TestList)"
      TeamProject="$(TeamProject)"
      TestNames="$(TestNames)"
      ContinueOnError="true" />

There are three more calls to TestToolsTask that should be modified. These calls are for desktop builds, so the LocalMetaDataFile has not been created. This means we use %(MetaDataFile.RunConfigFile) instead, e.g.

  <TestToolsTask
        Condition=" '$(IsDesktopBuild)'=='true'
                    and '$(V8TestToolsTask)'!='true'
                    and '%(MetaDataFile.Identity)' != '' "
        SearchPathRoot="$(OutDir)"
        PathToResultsFilesRoot="$(TestResultsRoot)"
        MetaDataFile="%(MetaDataFile.Identity)"
        RunConfigFile="%(MetaDataFile.RunConfigFile)"
        TestLists="%(MetaDataFile.TestList)"
        TestNames="$(TestNames)"
        ContinueOnError="true" />

And that’s it!

Moving Blogs

For those on Twitter, you may have noticed that my latest posts look a bit different to my old ones. I’ve finally made the decision to move my blog across to WordPress (even though I’ve had an account for 6 months).

I’ve updated my feedburner feed to point to the right place, but now I have to get the rest of it going. Hopefully once it’s all up and running I’ll have my posts and comments all up on my new blog. In the meantime, please bear with me while I make this transition.

Thanks!

#teau08

NOTE: This is possibly the longest thing I’ve ever written. Please stay tuned, because the people I mention really deserve recognition.

Last week I attended Tech.Ed Australia 2008. It was the third Tech.Ed I’ve attended, but for some reason it has stood out far above the rest. I had a great week from beginning to end.

The experience started Tuesday morning for me, when I turned up to the Embedded Developers Pre-Day. This involved sitting in an instructor-led lab for a day and delving into the embedded experience. Before Tuesday my embedded experience basically involved a bit of Windows Mobile and XP Embedded development. The day introduced some great concepts to me, including componentising an application, working with the File-Based Write Filter and debugging with CeDebugX ("!diagnose all" will save my life…).

Wednesday saw the kick-off of Tech.Ed and some worthwhile sessions. The most memorable was Corneliu Tusnea’s Debugging the World session. I’ve seen it before, but it was still the highlight of the day for me. The way he works with WinDbg truly amazes me. Some people think it’s a bit technical, but I thought it’s exactly what a 400 session should be! It was also great watching Scott Hanselman (who I was honoured to meet *briefly*) go up against the Readify greats Corneliu and Mitch in the Ask the Experts session that night.

Thursday was a bit more lively and turned into a very long day. It started off well with Jonas’ Silverlight presentation. It got a bit worse at morning tea, when my nerves got the best of me while I gave my first (and hopefully not my last) presentation at Tech.Ed, on Introducing Windows Mobile Development. A bit of fun, and the demo gods smiled on me for once…

Scott Hanselman’s MVC presentation was an absolute hoot! The demo gods were definitely not so kind to him, but he worked through it and did an excellent job. I thought this session was a bit too technical for a 200 level, but I wouldn’t expect any less than 10 minutes in the call stack from Scott.

Paul Stovell delivered a really good presentation on reactive programming that introduced advanced concepts of databinding in all its forms. It was backed up by Corneliu’s next session on Secure Development Patterns, which will hopefully be coming to an RDN near you…

Once again, Scott had me riveted in his afternoon session on REST and ADO.Net Entity Data Services. I now want to be a RESTifarian. Then, I finished off with Harry Pierson’s dynamic languages presentation, which was really interesting. Of course, that was followed by the closing party…

Friday was a great end to the conference. I really enjoyed Joel Pobar’s F# presentation, and I can’t wait to introduce it as an alien artifact. The Mobility Smackdown was once again a really pumped session with the aim of promoting Windows Mobile. I managed to score an Xbox 360, so I really, really enjoyed that too… Finally, the Locknote. Two words: passionate and Wow! Craig Bailey really summed it up well here. It’s the best locknote (or keynote) session I’ve ever seen and I just hope both ends of the conference are more like that next year.

Final Thoughts…

I had a great week. Learned loads of cool stuff. Really excited about cloud computing and the future of the web, but also about what I can do to be a better developer. Microsoft really put on a great show and I can’t wait until next year. Also, thank you to all the people that were involved in Tech.Ed that I may have missed.

The only plus in it being over is that my liver finally gets a rest… =P