Author: Richard Seroter

  • Interview Series: FIVE Questions With … Ofer Ashkenazi

    To mark the just-released BizTalk Server 2009 product, I thought my ongoing series of interviews should engage one of Microsoft’s senior leadership figures on the BizTalk team.  I’m delighted that Ofer Ashkenazi, Senior Technical Product Manager with Enterprise Application Platform Marketing at Microsoft, and the guy in charge of product planning for future releases of BizTalk, decided to take me up on my offer.

Because I can, I've decided to up this particular interview to FIVE questions instead of the standard four.  This does not mean that I asked two stupid questions instead of one (although this month's question is arguably twice as stupid).  No, rather, I wanted the chance to pepper Ofer on a range of topics and didn't feel like trimming my question list.  Enjoy.

    Q: Congrats on new version of BizTalk Server.  At my company, we just deployed BTS 2006 R2 into production.  I’m sure many other BizTalk customers are fairly satisfied with their existing 2006 installation.  Give me two good reasons that I should consider upgrading from BizTalk 2006 (R2) to BizTalk 2009.

    A: Thank you Richard for the opportunity to answer your questions, which I’m sure are relevant for many existing BizTalk customers.

I'll be more generous with you :) and give you three reasons why you may want to upgrade to BizTalk Server 2009: to reduce costs, to improve productivity and to promote agile innovation. Let me elaborate on these reasons, which are even more important in the current economic climate:

1. Reduce Costs – through server virtualization and consolidation, and through integration with existing systems. BizTalk Server 2009 supports Windows Server 2008 with Hyper-V and SQL Server 2008. Customers can completely virtualize their development, test and even production environments. Using fewer physical servers to host BizTalk solutions can reduce the costs associated with purchasing and maintaining hardware. With BizTalk Enterprise Edition you can also dramatically save on software costs by running an unlimited number of virtual machines with BizTalk instances on a single licensed physical server. With new and enhanced adapters, BizTalk Server 2009 lets you re-use existing applications and minimize the costs involved in modernizing and leveraging existing legacy code. This BizTalk release provides new adapters for Oracle eBusiness Suite and for SQL Server, and includes enhancements especially in the Line of Business (LOB) adapters and in connectivity to IBM's mainframe and midrange systems.
2. Improve Productivity – for developers and IT professionals using Visual Studio 2008 and Visual Studio Team System 2008, which are now supported by BizTalk. For developers, being able to use Visual Studio 2008 means that they can be more productive while developing BizTalk solutions. They can leverage new map debugging and unit testing options, but even more importantly they can experience a truly connected application lifecycle. Collaborating with testers, project managers and IT pros through Visual Studio Team System 2008 and Team Foundation Server (TFS), and leveraging capabilities such as source control, bug tracking, automated testing, continuous integration and automated builds (with MSBuild), can make the process of developing BizTalk solutions much more efficient. Project managers can also gain better visibility into code completion and test coverage with MS Project integration and project reporting features. Enhancements in BizTalk B2B (specifically EDI and AS2) capabilities allow for faster customization for specific B2B solution requirements.
3. Promote Agile Innovation – specific improvements in service-oriented, RFID and BAM capabilities will help you drive innovation for the business. BizTalk Server 2009 includes UDDI Services v3, which can be used to add agility to your service-oriented solution with run-time resolution of service endpoint URIs and configuration. ESB Guidance v2, based on BizTalk Server 2009, will help make your solutions more loosely coupled and easier to modify and adjust over time to cope with changing business needs. BizTalk RFID in this release features support for Windows Mobile and Windows CE and for emerging RFID standards. Adding RFID mobility scenarios, for asset tracking or retail inventories for example, will make your business more competitive. Business Activity Monitoring (BAM) in BizTalk Server 2009 has been enhanced to support the latest format of Analysis Services UDM cubes and the latest Office BI tools. These enhancements will help decision makers in your organization gain better visibility into operational metrics and business KPIs in real time. User-friendly SharePoint solutions that visualize BAM data will help you monitor business execution and ensure its performance.

Q: Walk us through the process of identifying new product features.  Do such features come from (a) direct customer requests, (b) comparisons against competition and realizing that you need a particular feature to keep up with others, (c) product team suggestions of features they think are interesting, (d) somewhere else, or some combination of all of these?

A: It really is a combination of all of the above. We do emphasize customer feedback and embrace an approach that captures experience gained from engagements with our customers to make sure we address their needs. At the same time we take a wider and more forward-looking view to make sure we can meet the challenges that our customers will face in the near-term future (a few years ahead). As you personally know, we try to involve MVPs from the BizTalk customer and partner community to make sure our plans resonate with them. We have various other programs that let us get such feedback from customers as well as internal and external advisors at different stages of the planning process. Trying to weave together all of these inputs is a fine balancing act, which makes product planning both very interesting and challenging…

Q: Microsoft has the (sometimes deserved) reputation for sitting on the sidelines of a particular software solution until the buzz, resulting products and the overall market have hit a particular maturation point.  We saw aspects of this with BizTalk Server as the terms SOA, BPM and ESB were attached to it well after the establishment of those concepts in the industry.  That said, what are the technologies, trends or buzz-worthy ideas that you keep an eye on and that influence your thinking about future versions of BizTalk Server?

A: Unlike many of our competitors that try to align with the market hype by frequently acquiring technologies, and thus burdening their customers with the challenge of integrating technologies that were never even meant to work together, we tend to take a different approach. We make sure that our application platform is well integrated and includes the right foundation to ease and commoditize software development and reduce complexities. Obviously it takes more time to build such an integrated platform, based on rationalized capabilities delivered as services, rather than patch it together with foreign technologies. When you consider the fact that Microsoft has spearheaded service orientation with WS-* standards adoption as well as with very significant investments in WCF, you realize that such commitment has a large and long-lasting impact on the way you build and deliver software.
    With regard to BizTalk you can expect to see future versions that provide more ESB enhancements and better support for S+S solutions. We are going to showcase some of these capabilities even with BizTalk Server 2009 in the coming conferences.

Q: We often hear from enlightened Connected Systems folks that the WF/WCF/Dublin/Oslo collection of tools is complementary to BizTalk and not in direct competition.  Prove it to us!  Give me a practical example of where BizTalk would work alongside those previously mentioned technologies to form a useful software solution.

A: Indeed, BizTalk does already work alongside some of these technologies to deliver better value for customers. Take, for example, WCF, which was integrated with BizTalk in the 2006 R2 release: the WCF adapter, which contains 7 flavors of bindings, can be used to expose BizTalk solutions as WS-*-compliant web services and also to interface with LOB applications using adapters in the BizTalk Adapter Pack (which are based on the WCF LOB Adapter SDK).

With enhanced integration between WF and WCF in .NET 3.5 you can experience more synergies with BizTalk Server 2009. You should soon see a new demo from Microsoft that highlights such WF and BizTalk integration. This demo, which we will unveil within a few weeks at TechEd North America, features a human workflow solution hosted in SharePoint and implemented with WF (.NET 3.5) that invokes a system workflow solution implemented with BizTalk Server 2009 through the BizTalk WCF adapter.

When the "Dublin" and "Oslo" technologies are released, you can expect to see practical examples of BizTalk solutions that leverage them. We already see some partners, MVPs and Microsoft experts experimenting with harnessing Oslo modeling capabilities for BizTalk solutions (good examples are Yossi Dahan's Oslo-based solution for deploying BizTalk applications and Dana Kaufman's A BizTalk DSL using "Oslo"). Future releases of BizTalk will provide better out-of-the-box alignment with innovations in the Microsoft Application Platform technologies.

    Q [stupid question]: You wear red glasses which give you a distinctive look.  That’s an example of a good distinction.  There are naturally BAD distinctions someone could have as well (e.g. “That guy always smells like kielbasa.”, “That guy never stops humming ‘Rock Me Amadeus’ from Falco.”, or “That guy wears pants so tight that I can see his heartbeat.”).  Give me a distinction you would NOT want attached to yourself.

A: I'm sorry to disappoint you, Richard, but my red-rimmed glasses have broken – you will have to get accustomed to seeing me in a brand new frame of a different color… :)
    A distinction I would NOT want to attach myself to would be “that unapproachable guy from Redmond who is unresponsive to my email”. Even as my workload increases I want to make sure I can still interact in a very informal manner with anybody on both professional and non-professional topics…

    Thanks Ofer for a good chat.  The BizTalk team is fairly good about soliciting feedback and listening to what they receive in return, and hopefully they continue this trend as the product continues to adapt to the maturing of the application platform.


  • Quick Thoughts on Formal BizTalk 2009 Launch Today

    So, looks like today was the formal release of BizTalk Server 2009.  It’s been available for download on MSDN for about a month, but this is the “general availability” date.

The BizTalk page at Microsoft.com has been updated to reflect this.  Maybe I knew this and forgot, but I noticed on the Adapters page that the classic Siebel, Oracle, and SQL Server adapters no longer seem to be included.  I know those are part of the BizTalk Adapter Pack 2.0 (which still doesn't show up as an MSDN subscriber download for me yet), but I guess this means that folks on the old adapters really better start planning their migration.

    The Spotlight on Cost page has some interesting adoption numbers that have been floating around a while.  The ESB Guidance page has been updated to discuss ESB Guidance 2.0.  However, that package is not yet available for download on the CodePlex ESB Guidance page.  That’ll probably come within a few weeks.

The System Requirements page seems to be updated, but doesn't seem to be completely accurate.  The dependency matrix still shows HAT, and one section of the Software Prerequisites still says Visual Studio .NET 2005.

    There are a handful of BizTalk Server 2009 books either out or coming out, so this incremental release of the product should be adequately covered.

    To mark this new version, look out for a special Four Questions to kick off the month of May.

UPDATE: I forgot to include a link to the latest BizTalk Server 2009 code samples as well.


  • "SOA Patterns with BizTalk Server 2009" Released

    This morning my publisher turned a few knobs and pressed a complex series of buttons and officially released my first book, SOA Patterns with BizTalk Server 2009.  It’s available for purchase right now from Packt’s web site and should cascade down to other booksellers like Amazon.com within a week.

    You can find the complete table of contents here, as well as a free, partial sample chapter on the new WCF SQL Server adapter.  What looks like a full, PDF version of that sample chapter (as well as the book’s intro and acknowledgements) can be found here.

    I had three general goals when I started this process almost a year ago:

    • Cover topics and angles on BizTalk Server that had not been broadly discussed before
    • Write the book in a conversational tone that is more like my blog and less like an instruction manual
    • Build all examples using real-life scenarios and artifacts and avoid the ubiquitous “Project1”, “TestSchema2” stuff.

    In the end, I think I accomplished all three. 

    First, I included a few things that I’ve never seen done before, such as WCF duplex binding using the out-of-the-box BizTalk adapters, quasi complex event processing with BizTalk, detailed ESB Guidance 2.0 walkthroughs, and the general application of SOA principles to all aspects of BizTalk solutions.  Hopefully you’ll find dozens of items that are completely new to you.

    Secondly, I don’t truly enjoy technical books that just tell me to “click here, click there, copy this code” so that by the end of the chapter, I have no idea what I just accomplished.  Instead, I tried to follow my blog format where I address a topic, show the key steps and reveal the completed solution.  I provide all the code samples anyway, so if you need to dig into every single detail, you can find it.

    Finally, I decided up front to use actual business use cases for the samples in each chapter of the book.  It just doesn’t take THAT much effort, and I hope that it makes the concepts more real than if I had shown a bunch of schemas with elements called “Field1” and “Field42.”

    So there you go.  It was lots of fun to write this, and I hope a few of you pick up the book and enjoy reading it.   My tech reviewers did a great job keeping me honest, so you shouldn’t find too many glaring conceptual flaws or misplaced expletives.   If you do have any feedback on it, don’t hesitate to drop me a line. 

    UPDATE: The book is now available in the US on Amazon.com and is steadily creeping up in sales rank. Thanks everyone!


• What Technologies Make Up an SOA?

    My boss’s boss asked if our architecture team could put together a list of what technologies and concepts typically comprise a service oriented architecture, so I compiled a collection of items and organized them by category.  I then evaluated whether we had such a technology in house, and if so, did we actively use it.  While you don’t care about those latter criteria, I thought I’d share (and ask your opinion of) my technology list.

    Is this fairly complete?  Is anything missing or mischaracterized?

| Category | Technology | Description | Toolset Examples |
| --- | --- | --- | --- |
| Standards | XML | Markup language for defining the encoding of structured data sets | Application platforms, database platforms, COTS packages |
| | SOAP | Protocol specification for exchanging XML data over networks | Application platforms, COTS packages |
| | WSDL | XML language for describing web service contracts | Application platforms, COTS packages that expose WSDL, IDE tools such as XmlSpy for hand-crafting WSDL documents |
| | WS-* | Set of web service standards of varying maturity that address message routing, security, attachment encoding, transactions and more | COTS packages such as SAP, application platforms such as Microsoft WCF |
| | RESTful Services | Architectural style with a focus on resource orientation and interacting with the state of that resource through traditional HTTP verbs | .NET Framework 3.5+ |
| Design | Service Modeler | Graphical tools for modeling SOA solutions | WebSphere Business Modeler, Mashup Tools |
| Data | Enterprise Entity Definitions | Computable representations of shared entities that may span multiple source systems | XSD, DDL |
| | Reference Data Architecture | Three components: (a) Operational Data Stores for logical entity definitions that act as a "real-time data warehouse" consisting of both real-time and batch-updated data, (b) data marts for data subset analysis, and (c) a data warehouse for enterprise data storage and analysis | Oracle, Teradata, SQL Server |
| | Enterprise Information Integration (EII) | Virtual integration of disparate data sources | Composite |
| | Data Service Interface Generation | Application for generating interfaces (SOAP/batch) on existing data repositories | BizTalk Server 2009, .NET Framework, Composite |
| | Enterprise Service Bus | Reliable message routing, transformation, queuing and delivery across disparate technologies | BizTalk Server 2009, TIBCO, Sonic |
| | Application Adapters | Connectivity to cross-platform systems via configuration and/or abstract interfaces | BizTalk Server 2009, BizTalk Adapter Pack (Siebel, SAP, Oracle) |
| | ETL Application | Bulk data extraction, cleansing and migration between repositories; frequently needed for consolidating data in the ODS | Informatica, SSIS |
| Infrastructure | Application Platforms | Libraries for building SOA solutions | .NET Framework, Java EE |
| | XML Gateway (Accelerators) | Focuses on SOA security and performance issues; typically a hardware appliance | WebSphere DataPower, Layer 7 |
| | Complex Event Processing Engine | Correlates distinct application or system events with the goal of discovering critical business information in real time | Microsoft StreamInsight, TIBCO, Oracle CEP |
| | Single Sign On | Single set of authenticated credentials can be converted to system-specific credentials without user intervention | SiteMinder |
| | Data Quality Services | Used to cleanse data and validate data quality | DataFlux |
| | Load Balancing | Hardware or software solution which sits in front of physical web servers and is responsible for distributing load | F5, SOA Service Manager |
| | Web Hosting Platforms | Environment for making web services available to consumers | Microsoft IIS, Oracle WebLogic |
| Process Integration | BPM Server | Tooling that supports the modeling, simulation, execution, monitoring and optimization of business processes | BizTalk Server 2009, Lombardi |
| | Business Rules Engine | Software system for maintaining business rules outside of compiled code | Microsoft BRE, Oracle |
| | Orchestration Services | Arrangement of services into an executable business process, often designed through a graphical model | BizTalk Server 2009 |
| | Business Activity Monitoring | Aggregation of activities, events and data that provides insight into an organization and its business processes | BizTalk Server 2009 |
| Service Management | XML Service Management (XSM) | A platform-independent system for administering, mediating and monitoring services | SOA Service Manager |
| | Repository | Metadata repository for service definitions | SOA Service Manager |
| | UDDI-compliant Registry | Interface for runtime infrastructure to discover and bind to service endpoints | SOA Service Manager |
| | Runtime Policies | Attach attributes to a service (or operation) that dictate how the service request is secured and processed | SOA Service Manager |
| | SLA Enforcement | Define acceptable web service availability with customizable rules consisting of an alert code, a metric to monitor and an interval | SOA Service Manager |
| | Access Control Contracts | Limit usage of a service based on a distinct number of allowable attempts, a time-of-day window, or caller identity | SOA Service Manager |
| | Service Lifecycle Management | Stage, status, approvals, dependency tracking, audit trail | SOA Service Manager |
| Security | Authentication | Verify user identity through LDAP integration, SAML tokens, x.509 certificates | Active Directory, Sun ONE LDAP, SOA Service Manager |
| | Authorization | User/group/role permissions at the service or operation level | SOA Service Manager |
| | Identity Management | Central storage of user profiles | |

    Any feedback is appreciated!


  • BizTalk 2009 Hyper-V Guide Released

    My buddy Ewan’s newest paper on BizTalk and Hyper-V was released a few days ago.  I had the opportunity to provide some feedback on the early drafts, and while it’s nice that he mentions me in the acknowledgements, I’m mad at him for outing my not-so-secret employer 🙂  I am now actively searching for secrets of Ewan’s that I can publish to a broad audience.  The racier the better.

Anyway, it's a good paper.  Don't let the 135 pages fool you.  It's not like it takes that much text to explain HOW to deploy BizTalk on Hyper-V.  If it did, then Microsoft failed miserably at making something understandable.  Rather, all this content focuses on the best practices, upfront considerations, and the results of a comprehensive set of tests run by the Microsoft team.  They've added a set of checklists that let you jump straight to the parts that help you deploy and configure BizTalk in the virtual environment.

    So who is this for?  While the paper boldly claims that this is for “All IT Professionals who work with BizTalk Server”, I’d say that this paper is quite useful to those responsible for estimating costs of new environments (consultants, internal IT orgs), those installing such environments, and system owners responsible for keeping the environment healthy and appropriately sized.

Even if you aren't using Hyper-V (we aren't, for example), there is useful information here for figuring out the performance cost of virtualization on BizTalk environments, along with tips for improving that performance.


• BizTalk and WCF Find a New Home at Microsoft

Saw on eWeek this morning that there has been some movement of Microsoft product teams.  Buried near the end of this article, we find out that the Connected Systems Division (home of BizTalk, WCF, Oslo, etc.) has been combined with the Data and Storage Platform (SQL Server) team.  The entire new team is called the "Business Platform Division."

    Probably not a bad move given the inherent relationship between the two sets of technologies.  This probably helps BizTalk’s long term prospects because (a) Ted Kummert, who leads the new team, is a long time friend of BizTalk and (b) long term we may see a very tight integration (bundling?) between the products.


  • Interview Series: Four Questions With … Ewan Fairweather

    In this month’s interview with a CSD thought leader, we chat with Ewan Fairweather who works for Microsoft on the BizTalk Customer Advisory Team (previously known by their hip moniker “BizTalk Rangers”) and has authored or contributed to numerous BizTalk whitepapers including:

    Ewan is really THE guy when it comes to BizTalk performance considerations, and has delivered an EPIC set of answers to “Four Questions.”  Also note that because Ewan is a delightful Englishman, I demand that you read his answers using the thickest North England accent you can muster.

    Q: Whether someone has just purchased BizTalk Server, or is migrating an existing environment, appropriate solution sizing is critical.   What are the key questions you ask customers (or consider yourself) when determining how best to right size a BizTalk farm (OS, hardware, database, etc)?

A: I'll start with the numbers that I use and then go through the thought process I follow to scope the size of the environment I need when I run performance labs.  The number of messages that you can process on a BizTalk Group depends on a lot of factors (machine characteristics, adapters, etc.).  Therefore the best way to size a BizTalk solution remains to do a POC. However, if this is not possible, here are the numbers that I can and do use to make sure I am in the right ballpark.  These numbers are derived from our internal test results. The BizTalk machines were dual-proc, dual-core with 4 GB of memory, and SQL was dual-proc, quad-core with 8 GB of memory:

    Single Server Messaging Performance

1. A simple, small messaging scenario can achieve 715 messages a second (a WSHttp WCF one-way messaging scenario using 2KB messages and passthru pipelines)
2. Optimizing the transport takes it to ~850 messages a second; utilizing NetTcp provided us with an approximately 18% gain.

The scale factor that I use is approximately 1.5 per BizTalk Server, so in this scenario, going to 2 servers I would expect ~1000-1200 messages per second. At a certain point, adding additional BizTalk Servers is going to cause the MessageBox SQL Server to become a bottleneck. To alleviate this, multiple MessageBoxes can then be added. In practice, going past 4 or 5 MessageBoxes, the returns begin to diminish.
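To make that arithmetic concrete, here is a rough C# sketch (purely illustrative, not an official sizing tool; the only inputs are the ballpark figures quoted above, and real sizing should still come from a POC):

```csharp
using System;

// Illustrative only: a ballpark BizTalk group throughput estimate using the
// figures quoted above. Real sizing should always come from a proof of concept.
class ThroughputBallpark
{
    static void Main()
    {
        const double singleServerMsgsPerSec = 715.0; // 2KB WSHttp scenario, passthru pipelines
        const double scaleFactor = 1.5;              // rough per-server multiplier quoted above

        // One reading of "1.5 per BizTalk Server": a second server takes the baseline
        // to roughly 1.5x, which lands in the ~1000-1200 msgs/sec range mentioned above.
        double twoServerEstimate = singleServerMsgsPerSec * scaleFactor;

        Console.WriteLine("1 server : ~{0:N0} msgs/sec", singleServerMsgsPerSec);
        Console.WriteLine("2 servers: ~{0:N0} msgs/sec", twoServerEstimate);

        // Beyond a few servers, the MessageBox SQL Server becomes the bottleneck;
        // extra MessageBoxes help, but returns diminish past 4 or 5 of them.
    }
}
```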

    Stage of the Project

The first thing I need to determine is the stage of the project.  If I am coming in at the beginning of the project, I will always seek to understand the customer's business problem first, as it is only once I know this that I can ascertain their requirements. Once I understand a customer's requirements, the first question I ask myself is "is BizTalk the right solution for this customer's problem?"  BizTalk is a fantastic product but is not a universal panacea for all problems.  I strongly believe that positioning the right technology to solve the problem a customer has is my number one job, even if this means not using BizTalk.  A common question I am hearing (along with every other BizTalk person) is "when should I use BizTalk over Dublin and vice versa?"  Now, answering that would be a full article in itself, so rather than attempt it here I will refer to Charles Young's very good article on the topic here.  I also use the numbers mentioned above to determine whether their system requirements are realistic.

Assuming that BizTalk is the right solution and will fulfill their requirements, I then need to understand the key characteristics of the system. Specifically, I'm interested in the following:

    1.  Message flow through the system. 

Now, I know that there are often many tens of message flows through a system. For the majority of systems I've worked on, a much smaller subset of these tends to account for a large proportion of the load and hence has the most impact on the perceived performance of the system.  I've found that identifying these and focusing on them is key.  For example, I recently worked with a customer in Europe to test their BizTalk system, which was handling the back-end processing for their new Internet bank.  In this case, over 80% of all requests were for the "Summary of Accounts" view (providing a consolidated view of all bank accounts).  In this scenario users are unlikely to wait a long time when they first log into the online bank, so optimizing this and any directly related message flows should be the key priority.  Once I've identified these flows, I'm primarily interested in the features of BizTalk that are being used.  Is messaging used heavily? Orchestration? The Business Rules Engine? Is tracking required, either out of the box or BAM? I'm also trying to understand what external systems are involved and how many calls are made from orchestration.  Each of these features has some performance "cost" associated with it; my job as a Ranger is to work with the customer to ensure they get the functionality that they need at the minimum possible cost in terms of performance.  In most cases I will put together a Visio diagram of the main message flows and confirm that these are correct with all the developers of the system.

    2.  Size/Format/Distribution of the messages

Message size and format are very important to BizTalk performance.  Processing binary files is expensive in terms of performance because often the message body cannot be compressed in the same way that XML can.  It is important to determine how big the messages will be and what their distribution is.  This is particularly important for large batches of messages that in many systems occur once a week.  I've seen many customers present me with extremely complex spreadsheets illustrating each and every single message type that will be processed in the system.  I think it is important to abstract to the appropriate level of detail, otherwise dealing with and reproducing this in a lab becomes an impossible task. I typically ask customers to put together a table like the one below, with the constraint that it must be simple enough to fit on a single PowerPoint slide.

| Response Type | Size | % of Traffic |
| --- | --- | --- |
| Small | 8413 bytes | 20% |
| Medium | 16998 bytes | 60% |
| Large | 52128 bytes | 20% |
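If you need to turn a distribution like this into lab traffic, a small weighted picker does the job. The sketch below is illustrative only; the class and member names are hypothetical and not part of BizUnit or any Microsoft tooling, and only the sizes and percentages come from the example table:

```csharp
using System;

// Hypothetical helper: pick a test message size according to the
// Small/Medium/Large distribution shown in the table above.
class MessageMixGenerator
{
    private static readonly Random Rng = new Random();

    // Sizes and traffic shares taken from the example table.
    private static readonly int[] SizesInBytes = { 8413, 16998, 52128 };
    private static readonly double[] TrafficShare = { 0.20, 0.60, 0.20 };

    public static int NextMessageSize()
    {
        double roll = Rng.NextDouble();
        double cumulative = 0.0;
        for (int i = 0; i < SizesInBytes.Length; i++)
        {
            cumulative += TrafficShare[i];
            if (roll <= cumulative)
                return SizesInBytes[i];
        }
        return SizesInBytes[SizesInBytes.Length - 1];
    }

    static void Main()
    {
        // A load test driver could use this to decide which sample file to submit next.
        for (int i = 0; i < 10; i++)
            Console.WriteLine(NextMessageSize());
    }
}
```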

3.  Production Traffic – Can This Be Replayed?

Ideally, in any performance testing or scoping scenario, I want to be able to replay actual data that will be processed in production.  This is especially important for validating correct functionality when the Rules Engine is used.  Testing with a small subset of test data is better than nothing, but it is my experience that using production data will identify issues in system design before they get to production, which is good for everybody.

    4.  Performance Goals

Clear, quantifiable goals are a must for anyone serious about BizTalk! Without these there is no quantifiable way to judge the effectiveness of the system.  In short, performance goals are an essential part of project success criteria. Good goals should state any constraints and should cover throughput, latency and any other relevant factors, as well as how you will measure them. I've included an example below:

Orchestration scenario

• Throughput: 250,000 calls within 8 hours (~9 messages/sec sustainable)
• Latency: < 3 seconds for 99% of all response messages

    5.  Type of Environment

    How many applications are present on the system – is this BizTalk group going to be processing a single application or is it going to be a centralized environment?

    Q: You’ve done a great job capturing BizTalk performance tuning metrics and providing benchmarks for the results.   The sheer volume of options can seem intimidating, so give me only four modifications you would want all BizTalk owners to make in their environment.

A: I decided to cheat a bit on this answer and break my four modifications into areas: hardware, SQL configuration, BizTalk configuration and monitoring.

    1.  Invest in the right hardware – gigabit networking, fast storage (SAN or 15K local SQL disks), modern fast processing machines.

2.  Optimize the SQL Server configuration for BizTalk, including:

    • Data and log file separation
    • Create separate Temp DB data files (1 per processor) and use Trace flag 1118 as a startup parameter (this alone gave me 10% in a recent performance lab). 
    • Set autogrowth for the BizTalk databases

    3.  Tune BizTalk Host Settings.  Here are the main ones I look for:

    • CLR worker threads – to increase the number of processing threads
• For latency, reduce the MaxReceiveInterval setting.  Be careful with this one – make sure that the polling interval is not set to a value lower than the execution time for any of your BTS_Deque stored procedures (there is one of these SPROCs per BizTalk host).  If this happens, BizTalk will overwhelm SQL by creating more connections in order to poll in time.
    • Adjust the BizTalk config file max connection property if you are using HTTP

    4.  Invest in a monitoring solution and continuously configure this as you learn about the system.

    Q: Besides tuning hardware and deploying new infrastructure, there are ways that developers can directly impact (positively or negatively) the performance of their application through design choices they make.  What are some considerations that developers  should take into account with regards to performance when designing and building their applications?

A: The most important thing I think developers can do is help drive a performance culture on their project.  In my opinion, performance should be viewed as an essential quality attribute for any BizTalk project throughout its lifecycle. It is much more than a two-week lab performed at the end of a project, or even worse not at all.  I think it is important to point out that many BizTalk applications are mission critical and downtime cannot be tolerated; in many cases, downtime of the solution can affect the liquidity of the company.  In my experience, a proper performance engineering approach is not taken on many BizTalk projects.  Everyone is responsible for performance. Unfortunately, in many cases I have seen developers not realize until it is too late that the lack of consideration for performance from the beginning resulted in decisions early in the lifecycle which really limited the potential performance of the system.  I would advise any developer who is considering using BizTalk to consider performance right from the beginning and to test continuously (not just two weeks before go-live).  If something that affects performance creeps into the build and is not detected for more than two weeks, it is likely that removing it will require a lot of engineering.  In many cases I see that developers do not want to invest in these assets due to the perceived "cost" of them.  I would use the word investment instead: for me, automated performance tests and metrics are assets that will save a huge amount of time later on and help ensure the success of the project.

In terms of the application itself, ultimately bear in mind that everything you add to an application will slow down the pure BizTalk engine.  Consider performance in everything that you do: for example, within pipelines and orchestrations minimize the use of XmlDocument; instead use a streaming approach and XLANGMessage.  I would advise developers to invest in test assets which will enable them to continuously run performance tests and benchmarks.  I use BizUnit for my end-to-end functional testing.  I've found that using this in combination with the Visual Studio 2008 Load Test tools enables me to perform both automated functional testing and performance testing. Without a good baseline it is very difficult to determine whether a check-in has degraded or improved performance. BizTalk requires powerful hardware, therefore I would advise developers to invest in production-quality machines at the beginning of the lifecycle. This will enable them to continuously run performance comparisons throughout the project.
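As a minimal illustration of the kind of test asset being described, here is a C# sketch that times a single message end-to-end through FILE receive and send locations. The folder paths and file names are hypothetical; a real harness would use BizUnit test cases plus the Visual Studio 2008 load test tools, with proper timeouts and assertions:

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.Threading;

// Illustrative baseline check: drop a file into a FILE receive location and time
// how long it takes for BizTalk to emit the corresponding output file.
class LatencyBaseline
{
    static void Main()
    {
        string receiveFolder = @"C:\BizTalkDrops\In";    // hypothetical receive location folder
        string sendFolder    = @"C:\BizTalkDrops\Out";   // hypothetical send port folder
        string testMessage   = @"C:\TestData\Order.xml"; // hypothetical sample message

        // Clear out any output left over from a previous run.
        foreach (string stale in Directory.GetFiles(sendFolder))
            File.Delete(stale);

        Stopwatch timer = Stopwatch.StartNew();
        File.Copy(testMessage, Path.Combine(receiveFolder, Guid.NewGuid() + ".xml"));

        // Poll for the output file; a real harness would enforce a timeout.
        while (Directory.GetFiles(sendFolder).Length == 0)
            Thread.Sleep(100);

        timer.Stop();
        Console.WriteLine("End-to-end latency: {0} ms", timer.ElapsedMilliseconds);
    }
}
```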

The final thing that I'd like to see developers do is train their operations/infrastructure team in BizTalk and in how their application uses it.  Most infrastructure teams understand what SQL, Exchange and Active Directory do.  This helps them define their support processes and procedures.  In many cases infrastructure teams have no prior experience of BizTalk – so they treat and support it as a black box. By training them in the architecture of BizTalk Server, you will enable them to effectively tune and maintain the environment once it goes into production and also minimize the number of support incidents which need to be escalated to the development team.  I know that this is something that you yourself have done, Richard, within your organization.

    Q [stupid question]: Working at Microsoft, you have a level of credibility where folks believe the advice and information that you provide them.   This means that you could deliver them fake “facts” and they’d never know.  A few examples include:

    • “Little known fact, if you add the registry key “Woodgate4” to HKLM\SOFTWARE\Microsoft\BizTalk Server\3.0,  Microsoft actually provides you three additional orchestration shapes.”
    • “There are actually 13 editions of BizTalk Server 2009  including Home Premium and Web Edition.”

    Ok, you get the point.  Give me a “fake fact” about technology that you could sell with a straight face.

    A: If you hold down CTRL-SHIFT-P on startup, Windows Vista will load a custom version of Pac Man where your goal is to eat the evil Linux Ghosts.

    You won’t find all this information elsewhere, folks, so thanks to Ewan for taking the time to share such real world experiences.


  • System Integration with Nintex Workflow for SharePoint 2007 (Part III)

[Series Overview: Part I / Part II / Part III]

    In my first post of this series I looked at what Nintex workflow for SharePoint is.  The second post looked at its web service integration capabilities.  In this final post, we dig into the native BizTalk integration provided by the product.

Let's start out with the use case scenario.  Let's say that I've got a new consultant on board and want to publish this employee's information to the ESB and get back employee identifiers provisioned by downstream systems.  So our SharePoint 2007 custom data list stores attributes that we'll capture up front (e.g. vendor name, consultant name, start date) and has placeholders for values (e.g. employee ID, seating location, corporate laptop name) defined by our various onboarding applications.

    What we want is a workflow that can fire off once a new consultant is loaded into the SharePoint list.  This workflow shouldn’t be required to coordinate the various user provisioning systems, but rather, should communicate with our ESB (BizTalk Server 2006 R2) through a single interface.   In the previous post I showed how web services could be executed by a Nintex workflow.  While that is nice, I want a set of asynchronous interfaces where we can send a message to BizTalk and get something back whenever the user provisioning process is completed.

    My workflow starts out with the “Send/Receive BizTalk” activity that sends a message to BizTalk and is followed by a “Set Field Value” activity which flips the record’s status from “Pending” to “In Progress.”

    So what does this BizTalk activity look like?  First, we designate whether this is a “Send”, “Receive” or “Send/Receive” action.  The “Send/Receive” is used for synchronous transactions while the other two are the choices for asynchronous transmission.  Next we specify a “Message ID” which acts as the unique identifier for the message (e.g. correlation).   By default, this activity uses a GUID alongside the ID of the list row, but I changed mine to just be the organization ID (which I realize is not going to be unique to each transaction.  Sue me.).  Note that you can inject any value from the list or workflow variable into this “Message ID” identifier field.

    The next section of the configuration pane is the “BizTalk Web Service Endpoint Settings” which we don’t have yet.  That will come later, and is blank for now.  Following that section is the place where we define the data we wish to send to BizTalk.  There are two choices: (a) send the file being acted upon (in the case that this workflow runs from a document library), or (b) choose list properties that contain the relevant message payload information.


    Notice the “Export to XSD” link.  This link causes your previous list property selections to be loaded into an XSD file for BizTalk to consume.  So, this becomes your inbound message contract.  What about the response contract?   We configure this by adding another “Send/Receive BizTalk” activity to our workflow.  Because our provisioning action may take awhile, I used an asynchronous publication to BizTalk and now need a way to get the response back in.  The data received back from BizTalk must be stored in workflow variables (as opposed to taking the whole response document and putting it somewhere).  My workflow variables look like this:

    Now let’s configure the response.  This time, my “Action” is set to “Receive” from BizTalk and I used the same “MessageID” as the “Send” activity.

    This shape also has its “BizTalk Web Service Endpoint Settings” left blank, but unlike the previous activity where we’ll fill this in later, this activity’s value always remains blank.  This is because the Nintex folks provide a single HTTP channel back into their workflow engine from BizTalk.

    Finally, we choose which available workflow variables we wish to populate with response data from BizTalk Server.

    Just like before, we can export this information out to an XSD.  I’ve gone ahead and exported both the request message (from list values) and response message (put back into workflow variables) and added them to a new BizTalk project in Visual Studio.NET.

Both the request and response messages have a header added which includes routing information needed by Nintex to correlate inbound and outbound messages.  In my sample orchestration, which coordinates employee provisioning activities, I receive a message in via a one-way port, call a few operations to generate an employee ID, set the office location and establish the laptop machine name, and finally send the message back out via a one-way port.
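The orchestration itself is designed visually, but the individual provisioning steps would typically be delegated to .NET helper code called from Expression shapes. Purely as an illustration (none of these names come from the actual solution), such a helper might look like this:

```csharp
using System;

// Hypothetical helper class that an orchestration could call to perform the
// provisioning steps described above. All names and return values are made up.
[Serializable]
public class ProvisioningHelper
{
    public string GenerateEmployeeId(string consultantName)
    {
        // Stand-in for a call to the HR/identity system.
        int stableHash = consultantName.GetHashCode() & 0x7FFFFFFF;
        return "EMP-" + stableHash.ToString("D6");
    }

    public string AssignSeatingLocation(string department)
    {
        // Stand-in for a facilities lookup.
        return "Building 2, Desk 114";
    }

    public string AssignLaptopName(string employeeId)
    {
        // Stand-in for the asset management system.
        return "LT-" + employeeId;
    }
}
```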

    After building and deploying this orchestration, I need to define the means by which Nintex sends a message into BizTalk.  As I mentioned in the previous post, Nintex does not currently support WCF, so you have to use the BizTalk Web Services Publishing Wizard to produce an ASMX service endpoint.  Once the wizard is complete, you have a valid service endpoint and a receive location that can be bound to the orchestration’s receive port.    What we need to manually create is the send port which sends our response message back to the running workflow.  The HTTP endpoint set up by Nintex is found at:

    http://<sharepoint server>/_layouts/nintexworkflow/BizTalkHandler.ashx

    Once that send port is bound to our orchestration, we’re ready to roll.  Now we can return to our SharePoint workflow and update the existing “Send to BizTalk” activity with the valid service connection details.

    Finally, I included a bunch of “Set field value” activities in the workflow which take the values from the workflow variables (set by the BizTalk response) and put them into the list values for the item.

    All that’s left to do is publish, and then trigger the workflow on our existing list item.  Sure enough, after launching the workflow, my item has its status set to “In Progress” and a couple of browser refreshes later, it displays the values returned by my BizTalk orchestration.

    Summary

    All in all, this was pretty easy to do.  It’s convenient that I can send either list data or entire documents into BizTalk, and it helps greatly that the tool produces valid XSD files (except there seems to be a bug where datetime fields in SharePoint lists don’t properly map to their XSD counterparts).  I’d choose the BizTalk integration vs. traditional web service integration when I want asynchronous interactions with my service or ESB.

    It’s a good toolset overall.  Definitely take a look.


  • System Integration with Nintex Workflow for SharePoint 2007 (Part II)

[Series Overview: Part I / Part II / Part III]

    In my previous post I briefly described what the workflow tool from Nintex looks like and how to use it within the SharePoint environment.  Now, let’s see how to actually perform our first system integration scenario through the use of web services.

    One big caveat before I start: Nintex currently only has support for consuming ASP.NET Web Services, not Windows Communication Foundation.  I consider this a pretty big gap, but for now, let’s work with what’s available.

    In this demonstrated scenario, I have a SharePoint 2007 list which holds all the vendors a company uses to provide IT services.  Because core vendor data is stored in a different CRM system, we want to store a minimum of information in the SharePoint list and retrieve critical data from the CRM system via a web service.  So, once I add a record to the SharePoint list, I want to look up the details for that vendor in the CRM system and put the vendor “contact name” into the SharePoint list.

    I start out with a simple ASP.NET service which fronts the CRM system.  Because I’m writing a blog post and not a real system, I simply hard coded a response object.
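As a rough sketch of what that hard-coded service might look like, here is an ASMX stub. The class, method and most property names are my own guesses; only the VendorContact element (and the idea of a hard-coded response) come from this post:

```csharp
using System.Web.Services;

// Hypothetical ASMX service that "fronts" the CRM system but simply
// returns a hard-coded response object, as described above.
[WebService(Namespace = "http://example.org/crm")]
public class VendorService : WebService
{
    [WebMethod]
    public VendorDetails GetVendorDetails(string vendorName)
    {
        // Stand-in for a real CRM lookup.
        VendorDetails details = new VendorDetails();
        details.VendorName = vendorName;
        details.VendorContact = "Jane Doe";
        details.ContactPhone = "555-0100";
        return details;
    }
}

// Simple response type; ASMX serializes the public fields to XML.
public class VendorDetails
{
    public string VendorName;
    public string VendorContact;
    public string ContactPhone;
}
```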

In my SharePoint list, I added a new Nintex workflow and dragged "Call web service" and "Set field value" activities onto the flow.  On this first pass, I'm just going to dump the entire web service XML response value into a SharePoint list field.

    Note that the web service response has to go somewhere, so I first set up a workflow variable (VendorDetails) to hold the XML data.  I’ll use the other variable later to hold a specific value from the XML response.

    The “Call web service” activity has a pretty comprehensive set of configuration options.   First you specify the endpoint URL.  The available service operations are then auto-populated in a list.  For dictating the service payload, you have two choices.  First, for services with simple type parameters, you can enter in each value from either list values or workflow variables using the SOAP Builder.

    The other option, which is handy, is the SOAP Editor where you can shape the SOAP content yourself.  Note that you can still insert values from either lists or workflow variables into this interface.

    As for the service response, you can choose to apply an XSLT transform and then select a workflow variable to stash the value.   You also get the option to catch exceptions and store those messages.

    After using the “Set field value” workflow activity to take the service result and copy it into the SharePoint list field, we’re ready to publish the workflow.  I added a new row to the list and kicked off the workflow.

    As we’d hope for, the full XML payload from the service is thrown into the SharePoint list field.

    Note that XML payloads are wrapped in an “<xml>” node by the workflow activity.  It’s great that we can call a service, but just dumping the XML result is not particularly friendly to someone who is viewing this information.  What we want to do is parse the XML response and yank out JUST the “VendorContact” node.  So, I went back to our existing workflow and added a new “Query XML” activity to the process.  This activity lets me parse XML content stored either in a URL, workflow variable or list field.

    You can process the XML via either XSLT or XPath.  In my case, I want to do an XPath query to find the node I’m looking for.  I then took the result of that query and stored it in a workflow variable.
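Conceptually, that XPath step does something like the following C# sketch. The sample XML shape is a guess apart from the <xml> wrapper and the VendorContact node mentioned in this post, and namespaces are ignored for simplicity:

```csharp
using System;
using System.Xml;

// Mirrors what the "Query XML" activity is doing: run an XPath query against the
// stored service response and pull out just the VendorContact value.
class QueryXmlSample
{
    static void Main()
    {
        string response =
            "<xml>" +
            "  <VendorDetails>" +
            "    <VendorName>Fabrikam Consulting</VendorName>" +
            "    <VendorContact>Jane Doe</VendorContact>" +
            "  </VendorDetails>" +
            "</xml>";

        XmlDocument doc = new XmlDocument();
        doc.LoadXml(response);

        // The workflow would store this result in a workflow variable; here we just print it.
        XmlNode contact = doc.SelectSingleNode("//VendorContact");
        Console.WriteLine(contact != null ? contact.InnerText : "(not found)");
    }
}
```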

    Finally, I have a “Set field value” activity which takes this new workflow variable and copies its data to the SharePoint list.  After once again publishing the workflow and kicking it off, we can see that not only do we have the XML blob, but now we have another field that just stores the name of the vendor contact person.

    Summary

    The ability to punch out to external systems is a valuable aspect of a full featured business process.  The Nintex workflow product does a fine job making service invocation a fairly straightforward task.  Now, the lack of WCF integration is a concern, but hopefully one being actively addressed.  However, because the “Query XML” activity can accept a URL, it seems possible that I could mash up RESTful services via this toolset.  I’ll have to try that.

    The final post in this series will cover the native BizTalk Server integration in the product.  Stay tuned.


  • System Integration with Nintex Workflow for SharePoint 2007 (Part I)

[Series Overview: Part I / Part II / Part III]

    If your organization uses MOSS 2007, hopefully you’ve taken a look at what the folks at Nintex have to offer.  My company recently deployed their workflow solution, and I thought I’d take a look at how to execute system integration scenarios as part of a Nintex Workflow.

    In this first post, I’ll take a short look at the general product toolset.  The second post will show off web services integration, and the final post will highlight their native BizTalk Server integration.

    First off, what is Nintex Workflow?  It’s a solution hosted within the SharePoint environment that allows you to graphically construct robust workflow solutions that play off of SharePoint lists and libraries.  While you can build workflows in SharePoint using either WF or SharePoint Designer, the former is purely a developer task and the latter really exposes a linear, wizard driven design model.  Where Nintex fits in is right in the middle: you get the business-friendly user experience alongside a rich set of workflow activities (including any custom ones you build in WF).

    Design Experience

    Once the Nintex module is installed and enabled in your SharePoint farm, then any list or library has a set of new options in the “Settings” menu.

    If I choose to create a new workflow, then I am given the option to select a pre-defined template which has a default flow laid out for me.

    Once a template or blank workflow is chosen, I have a plethora of “Workflow Actions” available to sketch out my process.  For example, the Integration category has options such as “Call web service”, “Execute SQL”, “Query LDAP”, “Send/Receive BizTalk” and “Call Workflow.”

    There are nine categories of workflow activities in all, including:

    • Integration
    • Libraries and lists (e.g. “Check out item”, “Create list”, “Set field value”)
    • Logic and flow (e.g. “For each”, “Run parallel action”, “State machine”)
    • Operations (e.g. “Build dynamic string”, “Wait for an item to update”)
    •  Provisioning (e.g. “Add User to AD Group”, “Provision user on Exchange”)
    • Publishing (e.g. “Copy to SharePoint”)
    • SharePoint Profiles (e.g. “Query user profile”)
    • Sites and Workspaces (e.g. “Create a site”)
    • User Interactions (e.g. “Request approval”, “Send a notification”, “Task reminder”)

    Note that you can turn off any individual action you wish if the capability exposed is too risky for your particular organization.

    Using these activity shapes, I can draw out a simple process made up of decisions, notifications and approvals.

    Each activity can be configured (and re-labeled for later readability) per its function.  When an activity requires data to act upon (as most do), those values can often either be retrieved from (a) a hard coded value, (b) a workflow-specific variable, or (c) content from any list on the site.  For the “Request approval” activity, I have all sorts of options for choosing where to get the approver list from, which means of approval to require (all, single, first, vote) and where to store the assigned tasks.  What’s also cool is the “Lazy Approval” setting in Nintex which allows you to respond to a notification email with a single word or phrase to indicate your response.  In the image below, notice that I used a value from the list (in red text) as part of my task name.

    The configuration experience is pretty similar for each activity.  For the most part, I’ve found it to be fairly intuitive although I’ll admit to actually having to open the Help file a few times.

    Runtime Experience

    You can choose what the start up option should be for the workflow.

    Then, from the “Actions” menu, you can publish the workflow and make it available for use.  Pretty darn easy.

    Then, when you add/change data in the list or library, the workflow is either automatically or manually triggered.  Just like with any SharePoint workflow, you have a column added to the list which keeps you up to date on the status of the workflow (which you can drill into).  Whatever task list is associated with the workflow is also populated with tasks assigned to individual users or groups.

    Users can also add the “My Workflow Tasks” web part to a page which will show only the tasks for the active user.

    Users can also browse into the running workflow and graphically see what’s been completed so far, how long each step took, and what comes next.

    Analysis Experience

    Just like the previous image, we can drill into a completed workflow and analyze how it ran and the duration of a given step.

    As for reporting, the product comes with plenty of canned reports on a per-site or all-site basis that address topics such as: Approver Performance Statistics, Workflows in Progress, Workflow Performance and much more.

    You can display these reports either graphically or tabularly as a web part.  For the graphical reports, you can choose line, bar or pie chart.  The charts actually rely on Microsoft Silverlight (for the 2-D representation) and are pretty snazzy and configurable.

    Summary

    This was a very simple, but hopefully adequate, walkthrough to show you around the software.  This technology has lots to offer and integrates nicely with the Microsoft stack of products (including Live Communication Server).   Note that nothing I did here required a lick of programming or even a particularly technology-centric background.  And because the design surface is hosted within the SharePoint environment itself, you get a very rapid, accessible means for building and deploying functional workflows.

    If you have a SharePoint sandbox, consider downloading the free trial and playing around.

    In the next post, I’ll show you how to do some simple system integration via web service calls from a workflow.
