Author: Richard Seroter

  • Enabling Data-Driven Permissions in SharePoint Using Windows Workflow

    A group I’m working with was looking to use SharePoint to capture data entered by a number of international employees.  They asked if SharePoint could restrict access to a given list item based on the value in a particular column.  So, if the user created a line item designated for “Germany”, then automatically, the list item would only allow German users to read the line.  My answer was “that seems possible, but that’s not out of the box behavior.”  So, I went and built the necessary Windows Workflow, and thought I’d share it here.

    In my development environment, I needed Windows Groups to represent the individual countries.  So, I created users and groups for a mix of countries, with an example of one country (“Canada”) allowing multiple groups to have access to its items.

    Next, I created a new SharePoint list where I map the country to the list of Windows groups that I want to provide “Contributor” rights to.

    Next, I have the actual list of items, with a SharePoint “lookup” column pointing back to the “country mapping” list.

    If I look at any item’s permissions upon initial data entry, I can see that it inherits its permissions from its parent.

    So, what I want to do is break that inheritance, look up the correct group(s) associated with that line item, and apply those permissions.  Sounds like a job for Windows Workflow.

    After creating the new SharePoint Sequential Workflow, I strong named the assembly, and then built it (with nothing in it yet) and GAC-ed it so that I could extract the strong name key value.
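    To extract the public key token afterwards, pointing sn.exe at the built assembly does the trick:

    sn.exe -T DataDrivenPermissionWF.dll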

    Next, I had to fill out the feature.xml, workflow.xml and modify the PostBuildActions.bat file.

    My feature.xml file looks like this (the Id, names and assembly details are the values you’d change for your own feature) …

    <Feature Id="18EC8BDA-46B2-4379-9ED1-B0CF6DE46C61"
             Title="Data Driven Permission Change Feature"
             Description="This feature adds permissions"
             Version="12.0.0.0"
             Scope="Site"
             ReceiverAssembly="Microsoft.Office.Workflow.Feature, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
             ReceiverClass="Microsoft.Office.Workflow.Feature.WorkflowFeatureReceiver"
             xmlns="http://schemas.microsoft.com/sharepoint/">
      <ElementManifests>
        <ElementManifest Location="workflow.xml" />
      </ElementManifests>
      <Properties>
        <Property Key="GloballyAvailable" Value="true" />
        <Property Key="RegisterForms" Value="*.xsn" />
      </Properties>
    </Feature>

    So far so good.  Then my workflow.xml file looks like this …

    <Elements xmlns="http://schemas.microsoft.com/sharepoint/">
      <Workflow Name="Data Driven Permission Change Workflow"
                Description="This workflow sets permissions"
                Id="80837EFD-485E-4247-BDED-294C70F6C686"
                CodeBesideClass="DataDrivenPermissionWF.PermissionWorkflow"
                CodeBesideAssembly="DataDrivenPermissionWF, Version=1.0.0.0, Culture=neutral, PublicKeyToken=111111111111"
                StatusUrl="_layouts/WrkStat.aspx">
        <Categories/>
        <MetaData>
          <AssociateOnActivation>false</AssociateOnActivation>
        </MetaData>
      </Workflow>
    </Elements>

    After this, I had to change the PostBuildActions.bat file to actually point to my SharePoint site.  By default, it publishes to “http://localhost”.  Now I can actually build the workflow.  I’ve kept things pretty simple here.  After adding the two shapes, I set the token value and changed the names of the shapes.

    The “Activated” shape is responsible for setting member variables.

    private void SharePointWorkflowActivated_Invoked(object sender, ExternalDataEventArgs e)
    {
        //set member variable values from the inbound list context
        webId = workflowProperties.WebId;
        siteId = workflowProperties.SiteId;
        listId = workflowProperties.ListId;
        itemId = workflowProperties.ItemId;
    }

    Make sure that you’re not an idiot like me and spend 30 minutes trying to figure out why all these “workflow properties” were empty before realizing that you haven’t told the workflow to populate them.

    The meat of this workflow now all rests in the next “code” shape.  I could (and probably should) refactor this into more modular bits, but for now, it’s all in a single shape.

    I start off by grabbing fresh references to the SharePoint web, site, list and item by using the IDs captured earlier.  Yes, I know that the workflow properties collection has these as well, but I went this route.

    //all the IDs for the site, current list and item
    SPSite site = new SPSite(siteId);
    SPWeb web = site.OpenWeb(webId);
    SPList list = web.Lists[listId];
    SPListItem listItem = list.GetItemById(itemId);

    Next, I can explicitly break the item’s permission inheritance.

    //break from parent permissions
    listItem.BreakRoleInheritance(false);

    Next, to properly account for updates, I went and removed all existing permissions. I needed this in the case that you pick one country value, and decide to change it later. I wanted to make sure that no stale or invalid permissions remained.

    //delete any existing permissions in the
    //case that this is an update to an item
    SPRoleAssignmentCollection currentRoles = listItem.RoleAssignments;
    foreach (SPRoleAssignment role in currentRoles)
    {
        role.RoleDefinitionBindings.RemoveAll();
        role.Update();
    }

    I need the country value actually entered in the line item, so I grab that here.

    //get country value from list item
    string selectedCountry = listItem["Country"].ToString();
    SPFieldLookupValue countryLookupField = new SPFieldLookupValue(selectedCountry);

    I used the SPFieldLookupValue type to be able to easily extract the country value.  If read as a straight string, you get something like “1;#Canada”, which is a mix of the lookup item’s ID plus its value.

    Now that I know which country was entered, I can query my country list to figure out what group permissions I can add.   So, I built up a CAML query using the “country” value I just extracted.

    //build query string against second list
    string queryString = "<Where><Eq><FieldRef Name='Title' />" +
        "<Value Type='Text'>" + countryLookupField.LookupValue + "</Value></Eq></Where>";
    SPQuery countryQuery = new SPQuery();
    countryQuery.Query = queryString;

    //perform lookup on second list
    Guid lookupListGuid = new Guid("9DD18A79-9295-47BC-A4AA-363D53DA2336");
    SPList groupList = web.Lists[lookupListGuid];
    SPListItemCollection countryItemCollection = groupList.GetItems(countryQuery);

    We’re getting close.  Now that I have the country list item collection, I can yank out the country record, and read the associated Windows groups (split by a “;” delimiter).

    //get pointer to country list item
    SPListItem countryListItem = countryItemCollection[0];
    string countryPermissions = countryListItem["CountryPermissionGroups"].ToString();
    char[] permissionDelimiter = { ';' };

    //get array of permissions for this country
    string[] permissionArray = countryPermissions.Split(permissionDelimiter);

    Now that I have an array of permission groups, I have to explicitly add them as “Contributors” to the list item.

    //add each permission for the country to the list item
    foreach (string permissionGroup in permissionArray)
    {
        //create "contributor" role
        SPRoleDefinition roleDef = web.RoleDefinitions.GetByType(SPRoleType.Contributor);
        SPRoleAssignment roleAssignment = new SPRoleAssignment(
            permissionGroup, string.Empty, string.Empty, string.Empty);
        roleAssignment.RoleDefinitionBindings.Add(roleDef);

        //update list item with new assignment
        listItem.RoleAssignments.Add(roleAssignment);
    }

    After all that, there’s only one more line of code.  And, it’s the most important one.

    //final update
    listItem.Update();

    Whew. Ok, when you build the project, by default, the solution isn’t deployed to SharePoint. When you’re ready to deploy to SharePoint, go ahead and view the project properties, look at the build events, and change the last part of the post build command line from NODEPLOY to DEPLOY. If you build again, your Visual Studio.NET output window should show a successful deployment of the feature and workflow.

    Back in the SharePoint list where the data is entered, we can now add this new workflow to the list.  Whatever name you gave the workflow should show up in the choices for workflow templates.

    So, if I enter a new list item, the workflow immediately fires, and I can see that the Canadian entry now has two permission groups attached.

    Also notice (in yellow) the fact that this list item no longer inherits permissions from its parent folder or list.  If I change this list item to now be associated with the UK, and retrigger the workflow, then I only have a single “UK” group there.

    So there you go.  Making data-driven permissions possible on SharePoint list items.  This saves a lot of time over manually going into each item and setting its permissions.

    Thoughts?  Any improvements I should make?


  • Quick Look at UML in VSTS "Rosario"

    During the MVP Summit this past April, I saw a presentation of UML capabilities that are part of the Visual Studio Team System “Rosario” April 2008 Preview.  I immediately downloaded the monstrous virtual machine containing the bits … and finally took a quick look at things today.

    In my current job, I find myself creating a fair number of UML diagrams.   My company uses the very powerful Sparx Enterprise Architect (EA) for UML modeling, and despite the fact that some days I spend as much time in EA as I do in Microsoft Outlook, I still probably only touch 10% of the functionality of that application.  How does Visual Studio measure up?  I thought I’d take a quick look at the diagram types that I’ve created most recently in EA: use case, component, sequence and activity.

    When you look to create a new Visual Studio project, you now see “Modeling Projects” as an option.

    Funny, but all the modeling diagram types (logical, use case, component and sequence) can be added to existing VS.NET projects, EXCEPT “activity diagrams” which must be created as a standalone project.  Alrighty then.

    For the use case diagram, there’s a fair representation of the standard UML shapes.

    Can’t seem to create a system boundary though.  That seems odd.  The “use case details” is a nice touch.

    The sequence diagram also looks pretty decent.  What’s nice is that you can generate operations on classes, or the classes themselves directly from the diagram.

    How about component diagrams?  We actually use a few flavors of these to create system dependency diagrams as well as functional decomposition diagrams.  Not sure I could do that particularly easily with this template.

    Doesn’t look like I can change the stereotypes at all on either the components or links, so it’s tough to make a “high level” component design.  But wait!  Looks like I can do an “application design” or “system design” diagram.

    Here is a system design.

    I couldn’t figure out how to associate multiple systems, but that’s probably my stupidity at work.    Pretty nice diagram though, with the ability to add deployment details and constraints.

    Finally, you have the activity diagram.  This has many of the standard UML activity shapes, and looks pretty solid.

    The basic verdict?  Looks promising.  I have to do a bit too much clicking to make things happen (e.g. no “drag from shape corner to connect to another shape”), and it would be nice if it exported to the industry standard format, but overall, it’s a step in the right direction.  I’d also like to see a “lite” version that folks (e.g. business analysts) could use without having to install Visual Studio.

    This wouldn’t make me stop using or recommending Sparx EA, but, let’s keep an eye on this.


  • New BizTalk Performance, WCF Whitepapers

    I was looking for a particular download today on the Microsoft site, and came across a couple of new whitepapers.  Check out the Microsoft BizTalk Server Performance Optimization Guide, which packs 220+ pages of performance factors, analytic tools, planning/preparing/executing a performance assessment, identifying bottlenecks, how to test, and optimizing operating system / network / database level settings.

    Also check out the new whitepaper on BizTalk 2006 R2 integration with WCF.  This is a different paper than Aaron’s WCF adapter paper from last year.

    And not sure if you’ve seen this, but the BizTalk support engineers are blogging now and chatting about orchestration performance and other topics.  A recent post covers singletons, a topic of current interest to folks I know.


  • New WCF Management Pack for SOA Software

    I was on a conference call with those characters from SOA Software and they were demonstrating their BizTalk Management Pack.  They also spent a lot of time covering their in-development WCF binding.

    Moving forward, SOA Software is releasing Microsoft-friendly agents for …

    • IIS 6.0 (SOAP/HTTP)
    • WCF (any transport)
    • BizTalk (any transport)
    • BizTalk-WCF (any transport)

    All of these (except the BizTalk agent) support policy enforcement.  That is, the BizTalk agent only does message recording and monitoring whereas the other agents support the full suite of SOA Software policies (e.g. security, XSLT, etc).

    So what is the difference between the BizTalk agent, and the BizTalk-WCF agent?  The relationship breaks down as follows:

    The BizTalk-only agent is really a pipeline component which captures things from inside the BizTalk bus.  This means that it will work with ANY inbound or outbound adapter.  Nice.  The SOA Software WCF binding sits at the WCF adapter layer, and allows for full policy enforcement there.  However, this is ONLY for the BizTalk WCF adapters, not the other adapters.

    So if I had a WCF endpoint that I wanted SOA Software to manage, I could first attach the out-of-the-box SOA Software pipelines to the receive location.

    Next, in the WCF-CustomIsolated adapter configuration, I can specify the new soaBinding type.

    I don’t HAVE to do the pipeline AND the WCF binding if I have a WCF endpoint, but, if I want to capture the data from multiple perspectives, I can.  For that binding, there are a few properties that matter.  Most importantly, note that I do NOT have to specify which policy to apply.  The appropriate policy details are resolved at runtime, so making changes to the policy requires no changes to this configuration.

    From within the SOA Software management interface, I can review my BizTalk endpoints (interpreted as operations on a WSDL that represents the BizTalk “application”).

    Notice that this is a managed BizTalk receive location.   If I sent something through this managed receive location (with a policy set to record and monitor the traffic) I could see a real-time chart of activity, and, see the message payload.

    Notice that I see all the context values, AND, the payload in a CDATA block.  This supports BizTalk flat file scenarios.

    As for the WCF binding, you would install the SOA WCF binding on the client machine, and it becomes available to developers who want to call the SOA-managed WCF service.  The binding looks up the policy details at runtime, again shielding the developer from too much hard coding of information.

    So what’s cool here?  I like that the BizTalk agent works for ALL BizTalk adapters.  You can create a Service Level Agreement (SLA) policy where more than 10 faults on an Oracle adapter send port result in an email to a system owner.  Or, if traffic to a particular FILE receive location goes above a certain level (per day), an issue gets raised.  From the WCF side, it’s very nice that all WCF transports are supported for service management and that service policy information is dynamically identified at runtime versus embedded in configuration details.

    If you’re a BizTalk shop, and you have yet to go nuts with SOAP and services, you can still get some serious value from using the BizTalk agent from SOA Software.  If you’ve fully embraced services, and are already on the WCF bandwagon, the upcoming WCF binding from SOA Software provides a vital way to apply service lifecycle and management to your environment.


  • Building InfoPath Web Forms With Cascading Lists

    We’re replacing one of our critical systems, and one of the system analysts was looking for a way to capture key data entities in the existing system, and every system/form/report that used each entity.  Someone suggested SharePoint and I got myself roped into prototyping a solution.

    Because of the many-to-one relationship being captured (e.g. one entity may map to fields in multiple systems), a straight-out SharePoint list didn’t make sense.  I have yet to see a great way to do parent/child relationships in SharePoint lists.  So, I proposed an InfoPath form.

    I started by building up SharePoint lists of reference data.  For instance, I have one list with all the various impacted systems, another with the screens for a given system (using a lookup field to the first list), and another with tabs that are present on a given screen (with a lookup field to the second list).  In my InfoPath form, I’d like to pick a system, auto-populate a list of screens in that system, and if you pick a screen, show all the tabs.

    Using the InfoPath rich client, one can utilize the “filter” feature and create cascading drop downs by filtering the data source results based on a previously selected value.  However, for InfoPath Form Services enabled forms, the designer flags that filtering as unsupported in browser forms.

    Son of a!  The suggestions I found to get around this included either (a) write custom code to filter the result set, or (b) use a web service.  I know that InfoPath Form Services is a limited version of the rich client, but I hate that the response to every missing feature is “write a web service.”  However, that’s still a better option than putting code in the form because I don’t want to deal with “administrator approved” forms in my environment.

    So, I wrote a freakin’ web service.  I have operations that take in a value (e.g. system), and use the out-of-the-box SharePoint web services to return the results I want.  The code looks like this …
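    Here’s a minimal sketch of one of those operations (treat the proxy class, list name and field names as illustrative; the proxy is generated from the site’s _vti_bin/Lists.asmx):

    using System.Net;
    using System.Web.Services;
    using System.Xml;

    public class CascadeService : WebService
    {
        [WebMethod]
        public XmlNode GetScreensForSystem(string systemName)
        {
            //proxy class generated from http://<site>/_vti_bin/Lists.asmx
            ListsProxy.Lists listsService = new ListsProxy.Lists();
            listsService.Credentials = CredentialCache.DefaultCredentials;

            //CAML filter that only returns screens belonging to the chosen system
            XmlDocument camlDoc = new XmlDocument();
            XmlNode query = camlDoc.CreateElement("Query");
            query.InnerXml = "<Where><Eq><FieldRef Name='System' />" +
                "<Value Type='Text'>" + systemName + "</Value></Eq></Where>";

            //GetListItems hands back the matching rows as raw XML,
            //which InfoPath can consume through a data connection
            return listsService.GetListItems("System Screens", null, query,
                null, null, null, null);
        }
    }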

    Notice that I’m using the GetListItems method on the SharePoint WSDL.  I pass in a CAML statement to filter the results returned from my “system screens” SharePoint list.  Since I don’t like to complain about EVERYTHING, it is pretty cool that even though my operation returns a generic XML document, InfoPath was smart enough to figure out the return schema when I added a data connection to the service.

    What next?  Well, I have a drop down list bound to this web service data connection, but I chose to NOT retrieve the information when the form opened.  Its data is conditional on which system was selected, so calling this web service is dependent on choosing a system.  So, on my “systems” drop down list, I have a rule that fires if the user actually selected a system.  The rule action first sets the input parameter of the web service schema to the value in the “systems” drop down list.  Next, it performs the “Query Using A Data Connection” action to call the custom web service.

    So what do I have?  I’ve got a nice form that gets all its data from external SharePoint lists, and cascades its drop downs like a mad man.

    Of course after I deployed this, I was asked about reporting/filtering on this data.  The tricky thing is, the list of system mappings is obviously a repeating field.  So when publishing this form to SharePoint, and asked to promote columns, I have to choose whether to pick the first, last, count or merge of system fields.

    I chose merge, because I want the data surfaced on a column.  However, the column type that gets created in the SharePoint list is a “multiple lines of text”, which cannot be sorted or filtered.

    So how to see a filtered view of this data?  What if the business person wants to see all entities that touch system “X”?  I considered about 72 different options (views, custom columns updated by WF on the list, connected web parts, data sheet view, etc) before deciding to build a new InfoPath form and new web service that could give me the filtered results.  My web service takes in all possible filter criteria (system name, system screen, system tab) and, based on which values came into the operation, builds up the appropriate CAML statement.

    Then, in my new form, I have all the search criteria in drop down lists (reusing my custom web service from above to cascade them), and I put the query results in a repeating table.  One table column is a hyperlink that takes the user to the InfoPath form containing the chosen entity.  I had to figure out that the hyperlink control’s data source had to be specially formatted so that I could have a dynamic link:

    concat("http://sharepointsite/sites/IS/division/Program/IntakeDist/Safety%20Entity%20Definition%20List/", @ows_LinkFilename)

    This takes my static URL, and appends the InfoPath XML file name.  Now I have another form that can be opened up and used to query and investigate the data entities.
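    As for the “builds up the appropriate CAML statement” part of that search service, it comes down to something like this sketch (field names are assumptions; List<string> needs a using for System.Collections.Generic):

    private string BuildFilterQuery(string systemName, string screenName, string tabName)
    {
        //collect an <Eq> condition for each filter value actually supplied
        List<string> conditions = new List<string>();
        if (!String.IsNullOrEmpty(systemName))
        {
            conditions.Add("<Eq><FieldRef Name='System' /><Value Type='Text'>" +
                systemName + "</Value></Eq>");
        }
        if (!String.IsNullOrEmpty(screenName))
        {
            conditions.Add("<Eq><FieldRef Name='Screen' /><Value Type='Text'>" +
                screenName + "</Value></Eq>");
        }
        if (!String.IsNullOrEmpty(tabName))
        {
            conditions.Add("<Eq><FieldRef Name='Tab' /><Value Type='Text'>" +
                tabName + "</Value></Eq>");
        }

        if (conditions.Count == 0)
        {
            return String.Empty;
        }

        //CAML's <And> element only takes two children, so nest as needed
        string where = conditions[0];
        for (int i = 1; i < conditions.Count; i++)
        {
            where = "<And>" + where + conditions[i] + "</And>";
        }
        return "<Where>" + where + "</Where>";
    }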

    That was a fun exercise.  I’m sure there’s probably a better way to do some of the things I did, so if you have suggestions, let me know.  I do really like InfoPath Form Services, but once you really start trying to meet very specific requirements, you have to start getting creative to work around the limitations.


  • All Source Code Posted for BizTalk + WCF Articles

    I just finished zipping up all the source code for my recent set of articles over at TopXML.com.  Specifically, I just added the source code for the set of articles on publishing WCF services out of BizTalk (with security, transactions, attachments) and the source code for all the BizTalk Adapter Pack demonstrations that utilized the Oracle adapter.  I make no promises that the code is attractive, contains best practices, or avoids the use of obscenities in the comments.

     

    Series Summary

    • BizTalk and WCF: Part I, Operation Patterns (Get the source code!)
    • BizTalk and WCF: Part II, Security Patterns
    • BizTalk and WCF: Part III, Transaction Patterns
    • BizTalk and WCF: Part IV, Attachment Patterns
    • BizTalk and WCF: Part V, Publishing Operations Patterns (Get the source code!)
    • BizTalk and WCF: Part VI, Publishing Advanced Service Patterns
    • BizTalk and WCF: Part VII, About the BizTalk Adapter Pack (Get the source code!)
    • BizTalk and WCF: Part VIII, BizTalk Adapter Pack Service Model Patterns
    • BizTalk and WCF: Part IX, BizTalk Adapter Pack BizTalk Patterns

     


  • BizTalk "Message Aggregation For Email" Pattern

    One of our BizTalk developers had a requirement to collect related messages and send a single email summary with details about those messages. I built the example below to help her out.

    In this scenario, a series of independent, but related, messages are sent to BizTalk. Each of these messages will have a “batch ID” which connects them as well as a “batch count” value which identifies the total number of messages in the batch. As you might expect, I’ll need a convoy to collect these messages. The interesting part was how to build up the email data which summarized the results of the processing for a given batch.

    First, I have a schema that represents an individual message.

    As you can see, I have details about the batch, and then a record containing details about the document that was processed through BizTalk. These fields are mostly populated by the system called by BizTalk earlier in the process and those new values need to be reported back to the initiator of the submission.

    The next schema represents the email content being sent back to the initiator. There is a summarization of the batch of records they submitted, and then one section where successfully processed documents are recorded, and a section where failed documents are recorded.

    Next, we need a convoy orchestration that processes all messages for a given batch. The first receive shape initializes a correlation set on “batch ID” (from a separate property schema) and then initializes the loop variables. This loop will run until the number of messages received equals the “batch count” in the message.

    The meat of this orchestration is the part that builds up the email message. I have a helper class that accepts data from each batch message, and stores it in a member variable until I’m ready to return the completed, aggregate message. Now, I could have chosen to build some sort of custom type object, and when the loop was complete, turn that object into the XML representation of my email schema. But, I’d rather cut out that middle man. So, I passed my “BatchSummary” schema (above) through the .NET Framework xsd.exe tool to get a type object that directly mapped to the schema. That type is used as a member variable of my helper class.
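    If you haven’t run xsd.exe before, generating that type is a one-liner from a Visual Studio command prompt (the schema file name here is mine):

    xsd.exe BatchSummary.xsd /classes /language:CS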

    You’ll notice that I also created a few lists to hold the success and failure item types. The “BatchSummary” object takes in an array of success and failure items, but during the running of the convoy, I have no idea how many success or failure items I have, and thus couldn’t properly initialize an array of the necessary size. So, by creating a list, and simply adding to it along the way, I can postpone array creation until later.

    Within this class I have operations to add message data elements to the appropriate “success” or “failure” list object, and then finally, the convoy orchestration should call the operation below to get a “completed” batch summary object.
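    Boiled down, the helper looks something like this sketch (member and type names are illustrative; BatchSummary and its item types are whatever xsd.exe generated from your schema):

    using System;
    using System.Collections.Generic;

    //must be serializable so it can live as an orchestration variable
    [Serializable]
    public class BatchSummaryHelper
    {
        //xsd.exe-generated type that maps directly to the email schema
        private BatchSummary summary = new BatchSummary();

        //lists postpone array creation until the batch is complete
        private List<SuccessItem> successItems = new List<SuccessItem>();
        private List<FailureItem> failureItems = new List<FailureItem>();

        public void AddSuccessItem(SuccessItem item)
        {
            successItems.Add(item);
        }

        public void AddFailureItem(FailureItem item)
        {
            failureItems.Add(item);
        }

        //called once the convoy has seen every message in the batch
        public BatchSummary GetCompletedSummary()
        {
            summary.SuccessItems = successItems.ToArray();
            summary.FailureItems = failureItems.ToArray();
            return summary;
        }
    }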

    The next part is fun. I created an orchestration message for the batch email message, but, instead of choosing the XSD file for the “Message Type”, I chose the object generated by the xsd.exe tool.

    This .NET object has all the necessary metadata to automagically serialize into an XML message on the way out of the orchestration. At the end of the orchestration loop, I have a “message assignment” shape where I create the orchestration message by calling the appropriate operation on my helper class.
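    The assignment shape itself then ends up being a single expression along these lines (message and variable names are mine):

    //construct the outbound message straight from the helper's object
    BatchSummaryMsg = batchHelper.GetCompletedSummary();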

    Because I want to take this XML payload and turn it into an HTML email, I need to massage the data on the way out. For this scenario, I used the XslTransformComponent sample from the BizTalk SDK (C:\Program Files\Microsoft BizTalk Server 2006\SDK\Samples\Pipelines\XslTransformComponent). After building this pipeline component and GAC-ing it, I created a new send pipeline and dropped this component there. Finally, I wrote an XSLT stylesheet which took the XML and prettied it up. Now, when I drop three files (all with the same “batch ID”) into BizTalk, I get a tidy HTML summary email.
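    The stylesheet doesn’t need to be fancy; a minimal sketch (element names depend on your BatchSummary schema) could look like this:

    <?xml version="1.0" encoding="utf-8"?>
    <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/">
        <html>
          <body>
            <h2>Batch Summary</h2>
            <h3>Successful Documents</h3>
            <table border="1">
              <xsl:for-each select="//SuccessItems/*">
                <tr><td><xsl:value-of select="." /></td></tr>
              </xsl:for-each>
            </table>
            <h3>Failed Documents</h3>
            <table border="1">
              <xsl:for-each select="//FailureItems/*">
                <tr><td><xsl:value-of select="." /></td></tr>
              </xsl:for-each>
            </table>
          </body>
        </html>
      </xsl:template>
    </xsl:stylesheet>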

    Nice! So, with fairly few moving parts, I collected a bunch of related messages, built up a new composite message on the fly, and then sent a .NET object-type orchestration message out, which had an XSLT transform applied before being emailed to the target recipient.


  • New BizTalkHotRod Issue Out; BizTalk Bloggers to Check Out

    The latest issue of the BizTalk HotRod magazine is out.  Some of the topics you’ll find are:

    • Detailed look at using the ESB Guidance Exception Management framework
    • Exposing BizTalk BRE rules via WCF services
    • Look at parallel convoys
    • Peek at the BizTalk WCF adapters
    • Taking control of XSLT in BizTalk solutions
    • Hosting WF in BizTalk

    And much more.  As usual, a well done issue.

    Skimming through this issue got me thinking about where I get my BizTalk information, and reminded me to update my RSS reader so that I regularly read some of the “newer guys” who cover BizTalk topics.  Some of the original BizTalk giants (like Steven, Tomas, Charles, Scott, Jon, Lee, etc) have (naturally) shifted some of their attention to other technologies, so it’s important to keep an eye out for folks who are taking a fresh look at BizTalk things.

    Some of the blogs I read on occasion (but need to actually subscribe to) include:

    I need a show of hands of who still has Scott Woodgate on their BizTalk blogroll.  Seriously people, it’s time to let go. 


  • BizTalk Orchestration Throttling Pattern

    I’m currently architecting a project where one of the requirements is to limit the number of concurrent calls to a web service. I’d covered a similar topic in a previous post, and outlined two ways one could try and configure this behavior.

    First, you could limit the number of simultaneous connections for the SOAP adapter by setting the “maxconnection” value in the btsntsvc.exe.config file. The downside to this mechanism is that if you have many messages, and the service takes a while to process, you could see timeouts.
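    For reference, that setting lives in the system.net section of btsntsvc.exe.config and looks something like this (the cap of 5 is just an example value):

    <configuration>
      <system.net>
        <connectionManagement>
          <!-- limit concurrent outbound connections per endpoint -->
          <add address="*" maxconnection="5" />
        </connectionManagement>
      </system.net>
    </configuration>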

    The second choice is to turn on ordered delivery at the send port. This eliminates the timeout issue, but, really slows processing. In our case, the downstream web service (a FirstDoc web service that uploads documents to Documentum) is fairly CPU intensive, and may take upwards of 200 seconds to run (which is why I also asked the developer to consider an asynchronous callback pattern), so we need a few calls happening at once, but not so many that the box collapses.

    So, Scott Colestock recommended I take a look at the last issue of the BizTalk HotRod magazine and review the orchestration throttling pattern that he had explained there. Unlike the two options mentioned above, this DOES require development, but, it also provides fairly tight control over the number of concurrent orchestrations. Since Scott’s article didn’t have code attached, I figured I’d rebuild the project to help our developer out, and, learn something myself.

    The first step was to define my two BizTalk message schemas. My first is the schema that holds the data used by the FirstDoc web service. It contains a file path which the web service uses to stream the document from disk into Documentum. I didn’t want BizTalk to actually be routing 50MB documents, just the metadata. The second schema is a simple acknowledgement schema that will be used to complete an individual processing instance. Also, I need a property schema that holds a unique correlation ID, and the instance ID of the target convoy. Both properties are set to MessageContextPropertyBase since the values themselves don’t come from the message payload but rather, the context.

    The second step was to build a helper component which will dictate which throttled instance will be called. Basically, this pattern uses convoy orchestrations. The key, though, is that you have multiple convoys running, vs. a true singleton that processes ALL messages. The “correlation” used for each convoy is an instance ID that corresponds to a number in the numerical range of running orchestrations allowed. For instance, if I allowed 10 orchestrations to run at once, I’d have 10 convoy orchestrations, each one initializing (and following) an “InstanceID” between 1-10. Each of my calling orchestrations acquires a number between 1-10, and then targets that particular correlation. Make sense? So I may have 500 messages come in at once, and 500 base orchestrations spin up, but each one of those targets a specific throttled (convoy) orchestration.

    So my helper component is responsible for doling out an instance ID to the caller. The code looks like this:

    [Serializable]
    public static class RoundRobinHelper
    {
        //member variable holding current selection
        private static int roundRobinSelection = 0;
        private static object sync = new object();

        /// <summary>
        /// Thread-safe retrieval of which orchestration instance to target
        /// </summary>
        public static int GetNext()
        {
            const int maxInstances = 3;

            lock (sync)
            {
                //increment counter
                roundRobinSelection++;

                //if we've reached the limit, reset
                if (roundRobinSelection == maxInstances)
                {
                    roundRobinSelection = 0;
                }

                //return while still holding the lock so two
                //threads can't read the same value
                return roundRobinSelection;
            }
        }
    }

    Notice that because it’s a static class, and it has a member variable, we have to be very careful to build this in a thread safe manner. This will be called by many orchestrations running on many threads.

    Once this component was built and GAC-ed, I could build my orchestration. The first orchestration is the convoy. The first receive shape (which is direct bound to the MessageBox) initializes the “instance ID” correlation set. Then I have a loop which will run continuously. Inside that loop, I have a placeholder for the logic that actually calls Documentum, and waits for the response. Next I build the “acknowledgement message”, making sure to set the “Correlation ID” context property so that this acknowledgement reaches the orchestration that called it. I then send that message back out through a direct bound send port. Finally, I have a receive shape which follows the “Instance ID” correlation set (thus defining this orchestration as a convoy).
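    In the Message Assignment shape for that acknowledgement, the property stamping is just a matter of copying the caller’s value (the message names here are mine):

    //copy the caller's correlation ID onto the acknowledgement so the
    //direct bound send routes it back to the right base orchestration
    AckMsg(BizTalkPattern.BizTalkBits.CorrelationID) =
        InboundMsg(BizTalkPattern.BizTalkBits.CorrelationID);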

    Next, we have the orchestration that spins up for EVERY inbound message. First, it receives a message from a port bound to a file receive location. Next, within an Expression shape, I call out to my “round robin” helper component which gives me the next instance ID in the sequence.

    ThrottledInstanceId = BizTalkPattern.OrchHelper.RoundRobinHelper.GetNext();

    I then make sure to create a new message with both the “Correlation ID” and “Instance ID” context properties set.

    //create message copy
    Metadata_Output = Metadata_Input;

    //set convoy instance ID
    Metadata_Output(BizTalkPattern.BizTalkBits.InstanceId) = ThrottledInstanceId.ToString();

    //set unique ID using the orchestration instance identifier
    Metadata_Output(BizTalkPattern.BizTalkBits.CorrelationID) =
        BizTalkPattern.BizTalkBits.ProcessAllMetadataFiles(Microsoft.XLANGs.BaseTypes.InstanceId);

    Finally, I send this message out (via direct bound send port) and wait for the acknowledgement back.

    So what I have now is a very (basic) load balancing solution where many inbound messages flow through a narrowed pipe to the destination. The round robin helper component keeps things relatively evenly split between the convoy orchestrations, and I’m not stuck using a singleton that grinds all parallel processing to a halt.  Running a few messages through this solution yields the following trace …

    If I look in the BizTalk Administration Console, I now have three orchestrations running at all times, since I set up a maximum of three convoys.  Neat.  Thanks to Scott for identifying this pattern.

    Any other patterns for this sort of thing that people like?


  • Flowing Transactions To Oracle Using Adapter Pack

    So the documentation that comes with the BizTalk Adapter Pack makes scant reference to flowing transactions to the adapters.  That is, if I want to call the “Insert” operation on an “Orders” table, but only commit that if the “Insert” operation on the “Order Items” table succeeds, how do I wrap those operations in a single transaction?

    WCF has great transaction support, and the BizTalk Adapter Pack is built on WCF, but the product documentation for the Oracle adapter states:

    The Oracle Database adapter does not support performing transactions on the Oracle database using System.Transaction. The adapter supports transactions using OracleTransaction.

    Limitations of BizTalk Adapter 3.0 for Oracle Database

    Hmmm.  That’s pretty much the only time transactions are mentioned at all.  That makes it sound like I cannot wrap my service calls in a System.Transaction and have to use the OracleTransaction object from the ODP.NET bits.  What better way to confirm this than by actually testing it?

    I’m using the example from my TopXML.com articles.  So in that article, I mention inserting into two tables sequentially via proxy classes.  So, what happens if I take that same block of “insert” code and purposely create an error in the second set of data (e.g. use a non-existent “OrderID”)?  An exception occurred during the second operation, but the first insert command succeeded …

    Notice that my “Orders” table has a record in it, but the “OrderItems” table has no corresponding items for OrderID #34.  So, I’m stuck in an inconsistent state.  Not good.

    On a whim, I decided to wrap the entire block of “insert” code inside a System.Transactions.TransactionScope block to see what would happen.  On the first execution, I got an error saying “Unable to Load OraMTS“.  Interesting.  It looked like the System.Transactions transaction in my code gets converted to an Oracle transaction by the adapter, and the OraMTS component (from the Oracle client) wasn’t found.  So, I went back to my Oracle client installation and made sure to install the Oracle Services for Microsoft Transaction Server.

    Now, if I executed my code again, with the same error in the 2nd set of insert commands, the database remained in a consistent state, and the first insert did not commit.  So you CAN wrap these service invocations inside a System.Transactions scope (at least for the Oracle adapter) to daisy-chain atomic operations.
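    The working client code boils down to something like this (the proxy and variable names are placeholders for the ones in the TopXML example):

    using System.Transactions;

    //one ambient transaction spanning both inserts; the adapter
    //maps it onto an Oracle transaction via OraMTS
    using (TransactionScope scope = new TransactionScope())
    {
        ordersClient.Insert(orderRecords);
        orderItemsClient.Insert(orderItemRecords);

        //nothing commits unless we get here; an exception in either
        //insert rolls both operations back
        scope.Complete();
    }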

    Overall, the documentation for the BizTalk Adapter Pack is top notch, but the complete absence of transaction instructions seems curious.
