Extract InfoPath Forms Data To Excel Using The .NET Client Object Model

Well, this sounded like a good way to go. The client needed multiple InfoPath forms written to a single Excel file on a variable schedule. The Excel file has a very specific required format, so CSV wouldn’t cut it. Timer jobs could not be used as the environment had to work like Office 365 (sandboxed code only).

The alternatives considered were:

1. Promote fields to list properties and export to Excel from the list. Discarded due to constraints with promoting repeating table fields.

2. Implement InfoPath rules to write the form information out to a SharePoint list on submit, then automate the list extract to Excel. Discarded as the form had a repeating table, would require code in the InfoPath form and would still require additional code to automate the extraction.

3. Use SharePoint workflow to extract the forms to Excel. Not really possible without coding, and there were potential Excel file concurrency issues.

4. Use InfoPath rules or workflow to add the records to Excel file on submit. Discarded due to concurrency risks on the Excel file.

 

So what we ended up settling on was using the new Client Object Model, run from a remote workstation in the client’s network, extracting the forms into an Excel file using OpenXML and updating the status of each form after it was extracted.

Sadly, while this ended up being a good choice, there is very little documentation (or blogging) on accessing InfoPath forms using the client object model. Fortunately, that didn’t stop me, and while I had to figure out some of it on my own, it turned out to be a relatively good way to meet the client requirement.

So, how does one go about this?

I decided to implement the extractor as a console application so it could be easily run from the Windows scheduled tasks sub-system. This allows the extract to be scheduled as often as the client likes, and because it simply extracts InfoPath forms in a particular status and then updates that status to indicate they have been extracted, it can be run as many times as one would like with no negative impact.
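For example, the console app can be registered with the Windows task scheduler along these lines (the task name, path and interval here are made up for illustration):

```shell
REM Run the extractor every hour; forms already extracted are skipped by status,
REM so overlapping or repeated runs are harmless.
schtasks /Create /TN "ExtractInfoPathOrders" /TR "C:\Extractor\OrderExtractor.exe" /SC HOURLY
```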

Pre-Requisites

In order to work with OpenXML, you should install the OpenXML SDK 2.0 on your development machine:

OpenXML SDK 2.0

You will also need the Client Object Model DLLs. These can be found in the SharePoint 14 hive, or use the Client Object Model redistributable:

Client Object Model Redistributable

 

Implementation

I implemented the extractor in its own class (in case we want to re-use it from an alternate client type), and invoke the class from Main in the console exe as follows:

            OrderExtractor extractor = new OrderExtractor();
            extractor.ExtractOrders();

In the class, we need to reference the relevant namespaces from the client object model, SharePoint and OpenXML:

using System;
using System.IO;
using System.Linq;
using System.Net;
using System.Xml;
using System.Xml.Serialization;
using Microsoft.SharePoint.Client;
using Microsoft.SharePoint.Client.Utilities;
using SP = Microsoft.SharePoint.Client;
using DocumentFormat.OpenXml;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Spreadsheet;
using OX = DocumentFormat.OpenXml;

I used a couple of class level variables so I wouldn’t have to pass them around all the time:

        protected ClientContext _ctx;
        protected UInt32 _CurrentRow = 2;

The extractor needs to extract InfoPath forms that represent orders which contain the main order details and a repeating table for the products ordered.

In the main method, I open the client context to the site, open the form library to extract from, and select the orders to extract based on a status (using a CAML query to minimise the size of the result set). If there are no orders to extract we are done; if there are, we copy the Excel template to a new file, loop through each order and extract it to Excel, then upload the Excel file to a document library on the site for easy access.

I used the InfoPath form’s schema (XSD) to generate a class that I can deserialise the InfoPath form into for easy access. There are a number of references on the internet about how to do this so I have not included it here.
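As a rough sketch (the element names, namespace URI and fields below are hypothetical, not taken from the actual form), the generated class looks something like this. The key point is that the XML root name and namespace must match the form's main container:

```csharp
// Generated along the lines of: xsd.exe OrderForm.xsd /classes
// (extract the XSD from the .xsn template first).
// The XmlRoot name/namespace must match the form's main container element.
[System.Xml.Serialization.XmlRoot("Order",
    Namespace = "http://schemas.microsoft.com/office/infopath/2003/myXSD/2011-01-01T00:00:00",
    IsNullable = false)]
public partial class Order
{
    public string OrderID;

    // Repeating table rows deserialise into an array
    [System.Xml.Serialization.XmlElement("OrderItems")]
    public OrderItems[] OrderItems;
}

public partial class OrderItems
{
    public string ItemCode;
}
```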

Please note a couple of interesting things that have to happen after that:

1. Reset the worksheet dimension to cover the new rows added.

2. Save the updated workbook.

        // Extract Orders To Excel
        public void ExtractOrders()
        {
            try
            {

                // Get client context and setup credentials and authentication method
                _ctx = new ClientContext(Properties.Settings.Default.ExtractWebURL);
                SetupContext();

                // Open the form library to extract from
                SP.List formsLib = _ctx.Web.Lists.GetByTitle(Properties.Settings.Default.ExtractFormLibraryName);

                // Select all "Submitted" orders
                CamlQuery query = new CamlQuery();
                query.ViewXml = @"<View><Query><Where><Contains><FieldRef Name='OrderStatus' /><Value Type='Text'>Submitted</Value></Contains></Where></Query><RowLimit>2147483647</RowLimit></View>";
                SP.ListItemCollection items = formsLib.GetItems(query);
                _ctx.Load(items);
                _ctx.ExecuteQuery();

                // Only create the file if there are orders to extract
                if (items.Count > 0)
                {

                    // Create Excel file Orders-ddmmyy-hhmmss.xlsx from template
                    string outFilePath = CreateExtractFile();

                    // Open the extract file using OpenXML
                    using (SpreadsheetDocument extract = SpreadsheetDocument.Open(outFilePath, true))
                    {
                        // Track the row in the spreadsheet to use next, start at 2 as template has header row in row 1.
                        SheetData currentSheetData = extract.WorkbookPart.WorksheetParts.First().Worksheet.GetFirstChild<SheetData>();
                        _CurrentRow = 2;

                        // Loop through each order - extract fields to Excel
                        foreach (SP.ListItem item in items)
                        {
                            ExtractOrderDetails(item, currentSheetData);
                        }

                        // Update the worksheet dimension
                        string lastDimension = "Q" + Convert.ToString((_CurrentRow - 1));
                        extract.WorkbookPart.WorksheetParts.First().Worksheet.SheetDimension.Reference = "A1:" + lastDimension;

                        // Save Excel extract file with extract records
                        // (the using block disposes the document on exit)
                        extract.WorkbookPart.Workbook.Save();
                    }

                    // Upload excel file to destination library
                    // Build the server-relative URL with '/' (Path.Combine would insert '\' on Windows)
                    string relURL = Properties.Settings.Default.ExtractsLibraryRelativeURL.TrimEnd('/') + "/" + Path.GetFileName(outFilePath);
                    using (FileStream upStream = new FileStream(outFilePath, FileMode.Open))
                    {
                        SP.File.SaveBinaryDirect(_ctx, relURL, upStream, true);
                    }
                }
                // Dispose of client context
                _ctx.Dispose(); 
            }
            catch (Exception ex)
            {
                TextFileLogger.LogError("Exception extracting Orders", "Client.Intranet.OrderExtractor.ExtractOrders", ex);
            }
        }

There is a bit of interesting stuff going on in SetupContext. This method sets the credentials to use for the connection and also provides a configurable mechanism to handle SharePoint 2010 sites using claims authentication, via a callback method:

        private void SetupContext()
        {
            try
            {
                if (Properties.Settings.Default.UseClaimsAuthentication == "True")
                {
                    //Configure the handler that will add the header.
                    _ctx.ExecutingWebRequest += new EventHandler<WebRequestEventArgs>(SetHeader_MixedAuthRequest);
                }
                
                //Set the Windows credentials.
                _ctx.AuthenticationMode = ClientAuthenticationMode.Default;
                System.Net.NetworkCredential credentials = new NetworkCredential(Properties.Settings.Default.CredentialsUser, Properties.Settings.Default.CredentialsPassword, Properties.Settings.Default.CredentialsDomain);
                _ctx.Credentials = credentials;
            }
            catch (Exception ex)
            {
                TextFileLogger.LogError("Exception setting up header and credentials", ex);
            }
        }


        void SetHeader_MixedAuthRequest(object sender, WebRequestEventArgs e)
        {
            try
            {
                //Add the header that tells SharePoint to use Windows authentication.
                e.WebRequestExecutor.RequestHeaders.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f");
            }
            catch (Exception ex)
            {
                TextFileLogger.LogError("Exception setting authentication header", ex);
            }
        }

The next step is to add rows to the Excel file for the order header and additional records for each product on the order:

        private void ExtractOrderDetails(SP.ListItem listItem, SheetData sData)
        {
            try
            {
                // Get the InfoPath XML file
                string fPath = (string)listItem["FileRef"];
                FileInformation fInfo = SP.File.OpenBinaryDirect(_ctx, fPath);

                // Load file into XML reader and deserialise into a class to work with
                XmlTextReader reader = new XmlTextReader(fInfo.Stream);
                XmlSerializer serializer = new XmlSerializer(typeof(Diageo.Intranet.HouseOrderExtractor.Order));
                Order order = (Order)serializer.Deserialize(reader);

                // Determine calculated fields
                // Requested Delivery Date - depends on home state
                DateTime reqDelivery;
                // …

                // Write out the header record to Excel at current row
                Row r = new Row();
                string rowID = Convert.ToString(_CurrentRow);
                r.RowIndex = _CurrentRow++;
                r.Spans = new ListValue<StringValue>() { InnerText = "1:17" };
                r.DyDescent = 0.25D;

                // Add fields to header row
                r.Append(CreateTextCell(("A" + rowID), "H")); // Order row type - H for order header
                // r.Append for each cell to add to the row

                // Example with deserialised class access to InfoPath form data
                r.Append(CreateTextCell(("F" + rowID), order.OrderID)); // Purchase order number

                // Add row to Excel
                sData.Append(r);

                // Write out a row for each order item
                foreach (OrderItems orderItem in order.OrderItems)
                {
                    CreateItemRow(orderItem, sData);
                }

                // Update order status to Processed
                // Note: This only works when the list field is a read-write promoted field
                listItem["OrderStatus"] = "Processed";
                listItem.Update();

                // Execute to update item statuses
                _ctx.ExecuteQuery();
            }
            catch (Exception ex)
            {
                TextFileLogger.LogError("Exception extracting order details to Excel file", ex);
            }
        }

I used SQLMetal to generate the Order class from the XSD in the InfoPath template. Note that the class name must exactly match the main container name in the InfoPath fields view or deserialisation will not work.

Also note how we updated the order status using a promoted property on the form list. Please be aware that the promoted property must be read-write for this to work (found that out the hard way). You can make a promoted property read-write only when you publish the InfoPath form (not through the advanced form options dialog), by checking the box “Allow users to edit data in this field by using a datasheet or properties page”, which makes the property writeable on the list column.

For each product in the order, we need to add a row to the Excel file with that product’s ordering details. The code below demonstrates this, as well as showing how I create text cells (I only used text cells, but you can use other cell types; just search the internet for examples, there are lots around).

        private void CreateItemRow(OrderItems orderItem, SheetData sData)
        {
            // Create next row
            Row r = new Row();
            string rowID = Convert.ToString(_CurrentRow);
            r.RowIndex = _CurrentRow++;
            r.Spans = new ListValue<StringValue>() { InnerText = "1:17" };
            r.DyDescent = 0.25D;

            // Add order item to row
            r.Append(CreateTextCell(("A" + rowID), "D")); // Order row type - D for order items
            // r.Append for each item in the row - example using deserialised InfoPath order object
            r.Append(CreateTextCell(("M" + rowID), orderItem.ItemCode)); // Material Number - product code - not used for H records

            // Add the row to the file
            sData.Append(r);
        }

        private Cell CreateTextCell(string cellRef, string cellValue)
        {
            Cell newCell = new Cell();
            newCell.CellReference = cellRef;
            newCell.DataType = CellValues.InlineString;
            newCell.InlineString = new InlineString { Text = new Text { Text = cellValue } };
            return newCell;
        }

 

We then simply set up the console .exe application to run as a Windows scheduled task, and the client has a nice Excel extract they can push into the order management system. The file can be retrieved from the file system on the machine where the extract runs, or retrieved manually or programmatically from the SharePoint document library.

It’s an interesting approach and something that will also work on an Office 365 environment when run remotely from a networked client machine in the client’s internal network.

Happy Client Object Model and OpenXML coding!

 

…Derek

InfoPath 2010–Looking Up On People Columns

This one kept me frustrated for a while…

If you are attempting to add a rule in InfoPath to lookup a field in a list based on a “user” column (such as CreatedBy), the comparison you need to make is “Contains” NOT “Is Equal To”.

So you select the field in the list, then add the filter where “CreatedBy” contains username():

(Screenshot: the rule filter configured so that “CreatedBy” contains username().)

Works like a treat once you figure this out; very frustrating otherwise.

Using TFS 2010 With BIDS 2008

Well, my good mates at Microsoft have managed to leave us developers in a bit of a lurch, AGAIN.

The situation this time is that Visual Studio 2010 does not yet support SQL Server 2008 R2 BI projects, so we are forced to use BI Development Studio (BIDS) 2008 with SQL 2008 R2.

What makes this painful is that VS2010 should support everything, but now we are forced to work in 2 different environments. Additionally, BIDS 2008 will not work with Team Foundation Server 2010 out of the box.

In an even funnier situation, we have to use VS2010 to version control the SQL 2008 R2 databases, as this cannot be done in BIDS.

I hate to say this, but Microsoft seems to have dropped the developer ball since Bill left.

Anyways, there is a solution, but not a simple one sadly. The post here describes how to get this working.

I already had BIDS2008 installed with VS2010 side-by-side. The steps I followed to get this working were:

  1. Install VS 2008 Team explorer (download here)
  2. Install VS 2008 SP1 (download here)
  3. Install VS2008 SP1 Forward compatibility Update For TFS2010 (download here)
  4. Update the registry as per this blog (Medo Blog): add an entry under HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0\TeamFoundation\Servers with the URL to your Project Collection in TFS so that BIDS can connect to TFS properly. The URL looks like http://TFSServer:8080/tfs/collection (assuming default install paths, virtual directories, etc.). Just add a new string value (REG_SZ) with name = tfsservername and value = the URL (as above).
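As a sketch, the same entry expressed as a .reg file would look like the following (the value name and URL are examples only; substitute your own server name and collection path):

```reg
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0\TeamFoundation\Servers]
"TFSServer"="http://TFSServer:8080/tfs/collection"
```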

Once that is done, open BIDS and connect to TFS; the server should already be there when you go to connect in Team Explorer 2008.

VS2010 connects natively to TFS (possibly dependent on VS version) so we can version control the databases, but it sure would be a lot nicer to have all this stuff work OOTB like it should. Re-align the SQL and VS releases if you need to, Microsoft; I’d be happier to wait for an aligned release than deal with this mess.

Happy BI Development!

Upgrading Virtual Box On Windows

 

I have been using Virtual Box as a virtualisation environment ever since I moved to Windows 7 and found that Microsoft seems to have forgotten about 64 bit support for its Windows Virtual PC environment (hopefully it is coming but I can’t find that commitment anywhere).

It is a great product and a new release (3.2.xx) has just come out so I thought I should get upgraded. This seems to be the first release since Oracle took over Sun and I was pretty keen to see if the Sun approach of offering a good free product was being maintained by the new owners.

It appears that I didn’t need to worry as it seems that the approach is still part of the Oracle strategy, although time will tell if we will end up with a hamstrung free product and have to pay for the good stuff.

So I downloaded version 3.2 from the Oracle site and was about to use the standard Microsoft approach to installing upgrades, where the installer takes care of removing the old product. Given I wasn’t 100% sure this would be the right approach, I thought I’d have a quick look at the installation instructions. I looked everywhere and couldn’t find a single reference to whether or not the product would auto-upgrade on install.

I did find a number of references to installing on Unix/Linux variants and they invariably outlined an uninstall then install approach. While that is a pretty common scenario in those operating systems, it is not the typical scenario in Windows.

Looking into it a bit further (I started the install to see if it would indicate what it would do) I discovered that it now installs to a new path (Oracle instead of Sun in the path). I reasonably (I think) concluded from this that I would end up with a duplicate install (somebody please ping me to let me know if this is incorrect).

I was concerned that an uninstall would remove virtual hard disks and VM configurations. Fortunately, the uninstall doesn’t touch that stuff and after uninstalling and installing the new version, everything was still there.

So if you need to upgrade Virtual Box, I recommend an uninstall then install approach.

Happy virtualisation!

Installing TFS2010

Installing TFS2010 on Windows Server 2008, with SQL Server 2008 and WSS 3.0. This post provides short but detailed instructions on how to install and configure TFS2010. The full installation guide can be downloaded here:

Download TFS2010 Installation Guide

The installation I am performing is a farm installation on a single server. I am installing with full production-style security, but on a single server. This installation style should allow us to migrate to a multi-server TFS farm more easily in the future. NOTE: There are some additional steps that need to be taken if installing on separate servers (firewall rules, some permissions are different, etc.). See the full installation guide for the differences.

 

Pre-Requisites

 

I did a full x86 install, but only because x86 was the default VM template we had🙂 It should be pretty much identical for x64 installations, keeping in mind that whatever bitness is used should be used for everything.

1. Windows Server 2008 SP2 x86 (Download Windows Server 2008 SP2 x86)

2. SQL Server 2008 SP1 x86

3. TFS 2010

4. TFS 2010 Power Tools (TFS 2010 Power Tools Download page)

 

Service Accounts

 

In order to run TFS as a farm (even on a single computer) a number of service accounts should be setup:

1. <TFS10Setup> – This is a local administrator on the TFS and SQL servers. Used to install TFS. Assign Log On Locally permission in Local security policy. Add to local Administrators group on all servers in the farm.

2. <TFS10Reports> – This is the account used to read TFS reports from the report server. Assign Log On Locally permission.

3. <TFS10Service> – This account is used to run the TFS services. Assign Log On As service permission.

4. <TFS10Build> – Account used to run the TFS build service. Assign Log On as Service permission.

5. <TFS10WssService> – Account used to run SharePoint services. Assign Log On As Service permission.

6. <TFS10SqlService> – Account used to run SQL services. Assign Log On as service permission. Also used for the SharePoint Central Administration application pool.

 

Installing Pre-Requisites

 

1. Windows Server 2008. I won’t go into the details here. Install, join the domain.

1. Add TFS10Setup account to local administrators group if not already there.

2. Install IIS 7.0, in addition to the default role services, also ensure the following are installed:

       – ASP.NET

       – Windows Authentication

       – IIS 6 Management Compatibility (and all sub items)

3. Install Windows Server 2008 SP2 (Download Windows Server 2008 SP2)

 

2. Install SQL Server 2008:

1. Login as a local administrator, using <TFS10Setup> account (this account will then be added as a SQL Administrator too).

2. Double click setup.exe

3. Select New SQL Server Stand-alone installation or add features to an existing installation.

4. Select OK, then enter product key, Next

5. Accept license terms and Next

6. Select Install to install setup support files

7. Select Next on setup Support Rules (ignore any firewall warnings).

8. Select the following components to install then Next:

    – Database Engine

    – Full Text Search

    – Analysis Services

    – Reporting Services

    – Client Tools Connectivity

    – Management Tools – Basic & Complete

9. Select Default instance, Next

10. Next on disk space requirements  (point this to where you want it installed if the default is not acceptable)

11. On server configuration use the TFS10SqlService account for all accounts and verify automatic startup for all editable services, then Next.

12. On Database Engine Configuration select Windows Authentication, click Add Current User, (optionally, go to the data directories tab to change data file locations), then Next. Also ensure Latin1_General_CI_AS is selected as the collation. You may need to use an alternate collation if you need to support wide characters or Hiragana/Katakana.

13. On Analysis Services Configuration, click Add Current User, (optionally, go to data directories tab to change data file locations),then Next

14. On Reporting Services Configuration, select Install the native mode default configuration (or Install but do not configure if it is not available). Do not use SharePoint integrated mode; it is not supported by TFS.

15. On Error and Usage Reporting screen, click Next

16. On Installation Rules, click Next

17. On Ready To Install, review and if OK, click Install.

18. After installation has completed, click Next and Close to end installation.

19. Configure Analysis Services to restart automatically on failure (via Administrative Tools, Services, Recovery tab, ensure Restart the Service is set for all failures)

 

3. Install SharePoint products – only needed if you want to use MOSS 2007 instead of WSS, or if you are installing on multiple servers. See the installation guide for further details. I used the WSS 3.0 install with TFS.

4. Install TFS2010

1. Login as a local administrator

2. Select the appropriate bitness of TFS for your operating system via the appropriate folder and run the associated setup.exe

3. On Welcome click Next, accept the license terms and click Next

4. On Team Foundation Server page click Install

5. If you are prompted to restart, restart

6. Select the Launch TFS Configuration Tool checkbox, then Finish

5. Configure TFS:

1. In TFS Configuration Tool click Advanced, then Start Wizard

2. On Welcome screen click Next

3. In SQL Server Instance, type the name of the SQL server or named instance that will host the configuration database. I suggest providing a label for the databases in Server Database Label to allow multiple TFS instances on the same DB server; I used TFS10 as the label. Click Test to test connectivity.

4.  Under Service account use the <TFS10Service> credentials, click Test

5. Under Authentication Method click NTLM and then Next

6. Under Web Site click Create a new site, named TFS2010-8080 on port 8080. Leave tfs as the Virtual Directory, Next.

7. Select Configure reporting for TFS (if you want reporting):

    1. Enter the SQL Server name and click Populate URLs, Next

    2. Enter SSAS server name and Test, then Next

    3. Enter <TFS10Reports> credentials and select Use a different account…Test, then Next

    4. Select Configure SharePoint…, use the <TFS10WssService> credentials, Next

    5. Select Create a new site collection, enter name (TFS10default) and Next

    6. Next to do the install

8. (Optional) Install TFS2010 Team Explorer (TFS2010 Team Explorer Download Page)

9. (Optional) Install TFS2010 Power Tools (TFS2010 Power Tools Download Page)

Time On My Hands…

Well sadly, the market downturn in IT has caught up with me. I was made redundant from Artis Group a few weeks ago and am now looking for work. If anybody is looking for a .NET or SharePoint architect/lead, give me a buzz (DerekJMillerAThotmail.com).

So now that I have some time on my hands, it looks like it’s time to get back to some interesting technical stuff (in between applying and interviewing for jobs).

 

I have a few options:

1. From some previous posts, you will have noticed I am a bit of a DSL fan, especially with the DSL capabilities available in VS2008. A long while ago on this blog I started to walk through a DSL project using a DSL designer to generate ORM data access classes (and associated database artefacts). I have seen quite a few ORM layers over the years and dislike the typical approach of generating the ORM layer from the database tables. I prefer to design the domain objects needed, then have the entity classes, data access and database stuff all generated. It’s a bit more object-centric than database-centric, which is what ORM should be about, and it allows one to define a proper object hierarchy with explicit shallow versus deep loading. (Again, I am not a fan of lazy loading, as it is not well enough understood and the misunderstandings result in serious performance issues.)

2. I also have a little background project on the go to implement an SMS broadcasting, social networking site. I am looking at doing an ASP.NET and a Silverlight (3 or 4, depending on timing) interface for it. This is a bit commercially sensitive, so it may not get much public coverage, except for interesting technical tidbits I find while doing the work.

3. And finally, I am super keen to do some more stuff with SharePoint 2010. This is the version us developers have been waiting for: full 64-bit development support, develop on Windows 7 (instead of Server), a lot more VS support for SharePoint development, and some more bits we can get at… nice!

So, where to start?

I’ll think about that and let you know tomorrow…