Friday, September 30, 2011

Get Localized Date Format

Pass date, language, and country parameters and return the date formatted for that locale.

public static string How2GetLocalizedDate(string INdate, string INlanguage, string INcountry)
{
    if (!string.IsNullOrEmpty(INdate) && !string.IsNullOrEmpty(INlanguage) && !string.IsNullOrEmpty(INcountry))
    {
        DateTime dt = DateTime.Parse(INdate);
        // Set the current thread's culture to the requested language-country pair (e.g. "en-US")
        Thread.CurrentThread.CurrentCulture = new CultureInfo(INlanguage + "-" + INcountry);
        // Format dt using the long date pattern of the culture set above
        return dt.ToLongDateString();
    }
    else
    {
        return "";
    }
}
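
A quick usage sketch (the expected output assumes the default fr-FR long date pattern):

// Should return "vendredi 30 septembre 2011" once the fr-FR culture is applied
string localized = How2GetLocalizedDate("2011-09-30", "fr", "FR");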

Monday, August 8, 2011

The dependent tool 'svcutil.exe' is not found

I compiled a new WCF application, and running it (WCF Test Client, Debug > Start New Instance) gave the error:

"The dependent tool 'svcutil.exe' is not found. "

To solve this problem, install the Windows SDK for .NET 3.5. Install that and the WCF Test Client should run without error.

If the Windows SDK for .NET 3.5 is already installed on the system, verify that 'svcutil.exe' exists at C:\Program Files\Microsoft SDKs\Windows\v7.0\Bin, and copy it to that path if it is missing. It should then run fine. I suppose that 'svcutil.exe' was erased from the machine while installing higher versions of the framework.

Wednesday, July 20, 2011

Double Integer

Convert an integer string into a decimal-formatted string (100 → 100.0) using the "N1" / "N2" numeric format strings.

public string retDoubleInteger(string Integer)
{
    return (Convert.ToDouble(Integer).ToString("N1"));
}
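
For example, "N1" keeps one decimal place and "N2" keeps two (illustrative values):

retDoubleInteger("100");    // "100.0" with the "N1" format
// Convert.ToDouble("100").ToString("N2") would give "100.00"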

-----------------------------------------------------------------------------------------------------------
Convert a date string to yyyy/MM/dd string format; takes care of null or empty dates

public string serviceFormatDate(string inputDate)
{
    if (!string.IsNullOrEmpty(inputDate))
    {
        System.DateTime date = System.DateTime.Parse(inputDate);
        inputDate = date.ToString("yyyy/MM/dd");
    }
    else
    {
        inputDate = " ";
    }
    return inputDate;
}

----------------------------------------------------------------------------------------------------------
String Replace: strip double quotes (all is "well" → all is well)

public static string CleanStringanything(string pur)
{
    return pur.Replace("\"", "");
}

----------------------------------------------------------------------------------------------------------
Flag status (true → 0, otherwise 1)

public string FlagStatus(string strFlagStatus)
{
    if (strFlagStatus == "true")
    {
        return "0";
    }
    else
    {
        return "1";
    }
}

-----------------------------------------------------------------------------------------------------------
Multiple IF conditions return

public static string StatID(string strStatID)
        {
            if (string.Equals(strStatID, "pur"))
            {
                return "A";
            }
            if (string.Equals(strStatID, "sha"))
            {
                return "B";
            }
            if (string.Equals(strStatID, "krt"))
            {
                return "C";
            }
            if (string.Equals(strStatID, "arp"))
            {
                return "D";
            }
            if (string.Equals(strStatID, "mon"))
            {
                return "E";
            }
            return "NOTVALID";
        }
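
The same lookup could also be written as a switch statement, which keeps the mapping in one place; a sketch equivalent to the method above:

public static string StatID(string strStatID)
{
    switch (strStatID)
    {
        case "pur": return "A";
        case "sha": return "B";
        case "krt": return "C";
        case "arp": return "D";
        case "mon": return "E";
        default: return "NOTVALID";
    }
}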
-----------------------------------------------------------------------------------------------------------
Return and split (5a → 5 and a)

public string ExtractNum(string numcode)
{
    string[] numbers = Regex.Split(numcode, @"\D+");
    string RetNum = "";
    foreach (string value in numbers)
    {
        if (!string.IsNullOrEmpty(value))
        {
            int i = int.Parse(value);
            RetNum = RetNum + i;
        }
    }
    return RetNum;
}

--    --    --
public string RetString(string numCD)
{
    return  Regex.Replace(numCD, @"\d", "");
}
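
As in the title, "5a" splits into its digits and its letters (illustrative calls):

ExtractNum("5a");   // "5"
RetString("5a");    // "a"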

------------------------------------------------------------------------------------------------------------
Date conversion to yyyy-MM-dd format

public string FDate(string inputDate)
{
    System.DateTime date = System.DateTime.Parse(inputDate);
    return date.ToString("yyyy-MM-dd");
}
-------------------------------------------------------------------------------------------------------------

Same digit extraction as above; this RetString also strips special characters before removing digits

public string ExtractNum(string Dizycode)
{
    string[] numbers = Regex.Split(Dizycode, @"\D+");
    string RetNum = "";
    foreach (string value in numbers)
    {
        if (!string.IsNullOrEmpty(value))
        {
            int i = int.Parse(value);
            RetNum = RetNum + i;
        }
    }
    return RetNum;
}

public string RetString(string DizyCD)
{
    Regex r = new Regex("(?:[^a-z0-9 ]|(?<=['\"])s)", RegexOptions.IgnoreCase | RegexOptions.CultureInvariant | RegexOptions.Compiled);
    DizyCD = r.Replace(DizyCD, String.Empty);
    return Regex.Replace(DizyCD, @"\d", "");
}
-------------------------------------------------------------------------------------------------------------
Clean a phone number: strip parentheses, dashes, dots, and spaces ((265) → 265)
public static string CleanPhoneNumber(string phone)
{
    return Regex.Replace(phone, @"[\(\)\-\.\s]", "");
}

------------------------------------------------------------------------------------------------------------
Strip off a leading zero (0241 → 241)
public string reformData(string Data)
{
    if (string.IsNullOrEmpty(Data)) return Data;
    else if (Data.StartsWith("0")) return Data.Substring(1);
    else return Data;
}
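
Illustrative call; note that only a single leading zero is removed, so "00241" would become "0241":

reformData("0241");   // "241"
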
-------------------------------------------------------------------------------------------------------------
Add a fixed alpha prefix (0245 → E0245)
public string retstrStat(string strStat)
{
    if (!System.String.IsNullOrEmpty(strStat))
    {
        return "E" + strStat;
    }
    else
    {
        return "";
    }
}

Tuesday, July 19, 2011

Selecting the flat file disassembling schema dynamically

COMING SOON

Configuring BizTalk Orchestrations to handle un-typed messages

Found a useful article and thought of sharing it.
A typical orchestration in BizTalk deals with several kinds of messages. A message in BizTalk is always strongly typed. It is usually associated to some schema defined in the project. A Receive/Send shape in an orchestration is tied to a message declared in the Orchestration View. In the case of an un-typed message, the message type is set to 'System.Xml.XmlDocument', instead of a schema type. The class "XmlDocument" is a super class for all XML messages, and hence it can hold any type of XML message and subsequently any type of orchestration message. To summarize, an un-typed message is one whose message type is set to "XmlDocument" in an orchestration
Consider that we are integrating several small messages into a StoreFront application. This application deals with several kinds of messages like "Car", "Music" and "Book". These messages need to be integrated into a "StoreFront" message. So, a designed orchestration would receive several different types of messages. The orchestration in the article is shown below:

In the above orchestration, the "Send" and "Receive" shapes can handle any type of message. It is because the "Receive" shape is associated with a message which is of type "XmlDocument".

The Decision shape separates out the different kinds of messages.

Each branch of the Decide shape compares the incoming message type against one of the three schema types (Car, Music, Book):

// Message Type consists of TargetNamespace#SchemaRootElementName
InputMessage(BTS.MessageType) == "<TargetNamespace>#Car"
InputMessage(BTS.MessageType) == "<TargetNamespace>#Music"
InputMessage(BTS.MessageType) == "<TargetNamespace>#Book"
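
For illustration, if the Car schema's target namespace were http://StoreFront.Schemas (an assumed value; substitute your own schema's target namespace), one branch expression would read:

InputMessage(BTS.MessageType) == "http://StoreFront.Schemas#Car"
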
Notice the commonality in the above schemas. A quick explanation of one of the schemas is as follows:
The schema "Car.xsd" has the following properties:
RegID - The registration ID.



Make - The make of the car.
Model - The year of manufacture.
Operation - Can be either "BUY" or "SELL".
ExpectedPrice - The expected price.
The "InputMessage" message is specified in the Receive shape:

The "OutputMessage" message is specified in the Send shape:
The Message Assignment shape has the following lines of code:

Biztalk Port File Name Macros

If you ever wanted to know how to use any other file name macros other than %MessageID% here they are:

%datetime%

Coordinated Universal Time (UTC) date time in the format YYYY-MM-DDThhmmss (for example, 1997-07-12T103508).

%datetime_bts2000%

UTC date time in the format YYYYMMDDhhmmsss, where sss means seconds and milliseconds (for example, 199707121035234 means 1997/07/12, 10:35:23 and 400 milliseconds).

%datetime.tz%

Local date time plus time zone from GMT in the format YYYY-MM-DDThhmmssTZD, (for example, 1997-07-12T103508+800).

%DestinationParty%

Name of the destination party. The value comes from the message context property BTS.DestinationParty.

%DestinationPartyID%

Identifier of the destination party (GUID). The value comes from the message context property BTS.DestinationPartyID.

%DestinationPartyQualifier%

Qualifier of the destination party. The value comes from the message context property BTS.DestinationPartyQualifier.

%MessageID%

Globally unique identifier (GUID) of the message in BizTalk Server. The value comes directly from the message context property BTS.MessageID.

%SourceFileName%

Name of the file from where the File adapter read the message. The file name includes extension and excludes the file path, for example, foo.xml. When substituting this property, the File adapter extracts the file name from the absolute file path stored in the FILE.ReceivedFileName context property. If the context property does not have a value, for example, if message was received on an adapter other than File adapter, then the macro will not be substituted and will remain in the file name as is (for example, C:\Drop\%SourceFileName%).

%SourceParty%

Name of the source party from which the File adapter received the message.

%SourcePartyID%

Identifier of the source party (GUID). The value comes from the message context property BTS.SourcePartyID.

%SourcePartyQualifier%

Qualifier of the source party from which the File adapter received the message.

%time%

UTC time in the format hhmmss.

%time.tz%

Local time plus time zone from GMT in the format hhmmssTZD (for example, 124525+530).

Customize filename dynamically in Orchestration

In many cases it can be useful to know the exact name of your output file that will be sent from your Orchestration using the File Adapter. This can be difficult if you are using the %MessageId%.xml macro to write the file, since this is set after the message is sent from the Orchestration.

Delivery Notification can help you determine if your message was sent successfully but it can not give you the file name.

BizTalk 2004 has two ways to dynamically name your files from inside the Orchestration. The two ways to accomplish this are either to use a Dynamic Send Port or to use the %SourceFileName% macro on the Send Port.

Dynamic Send Port

Dynamic Send Ports are powerful and useful if you need to send your files to many different locations on the file system, sometimes to C:\data\ and other times to C:\root\. The downside is that you need to have all this information inside your message or hard-code it in the Orchestration, so it can be difficult to change.

Source File Name Macro

This is my preferred approach to Output File Naming. This does not require your data or the Orchestration to have any idea as to the directory you want to write your file to. This requires using the %SourceFileName% macro to set the output file name inside the Send Port.

Don’t want the same name as your input file you say? Now, here is the trick. Just change the File.ReceivedFileName property inside the Orchestration to be anything you want! This can be done by creating a new message and changing the context property. The code inside a Message Assignment shape would look like this:

// Create a new message

OutMessage = InMessage;

// Set the ReceivedFileName context property

OutMessage(FILE.ReceivedFileName) = "SetInOrch.xml";

It is not required to demote this value into your message. So, this method works with the Pass Through Send Pipeline because this context value is used by the File Adapter and not the pipeline.

CRITICAL: The %SourceFileName% macro does not need an additional extension (like .xml or .txt) after it, unlike the %MessageId% macro.

---------------------------------------------------------
msgPerson2 = msgPerson;

fileName = msgPerson2(FILE.ReceivedFileName);
fileName = System.IO.Path.GetFileName(fileName.Replace(".txt",".xml"));
msgPerson2(FILE.ReceivedFileName) = "output_" + fileName;
---------------------------------------------------------

Sunday, July 17, 2011

Promoting reoccurring elements

Found a useful article written by eliasen: Well, I suppose we have all been there – in order to get the business process running, a specific element from a schema needs to be promoted in order to route on it, correlate on it, and so on.

Unfortunately, elements that can occur more than once cannot be promoted. This, of course, makes perfect sense, since the property can only hold one value, and how would BizTalk know which one of the many occurring elements to take the value from at runtime? So we agree with the limitation, but hope for a nice solution. :-)

If you try to promote a reoccurring element, you get this error when adding it to the list of promoted properties

“This node can occur potentially multiple times in the instance document. Only nodes which are guaranteed to be unique can be promoted.”

Right. Now, some people have found the editor for the XPath describing the element that one wants to promote. If you have promoted some element, you can click on it like this

Then you can click on the dot at the right of the line, and get into the editor like this

Now, wouldn’t it be lovely, if you could just change this expression to include for instance an index on the reoccurring element? In my example from this screenshot, the “ReoccuringRecord” record can occur multiple times. So it would be nice, if I could just change the XPath to be like this:

/*[local-name()='ExampleRoot' and namespace-uri()='http://PromotingReoccuringElement.ExampleSchema']/*[local-name()='ReoccuringRecord' and namespace-uri()=''][1]/*[local-name()='ElementWhereNumber1IsPromoted' and namespace-uri()='']

By setting the "[1]" into the XPath, I state that I will be needing the first occurrence of the ReoccuringRecord and therefore, this XPath expression will always give me exactly one node. Unfortunately, the engine cannot see this, so the error will be the same, the only difference being that this error doesn't occur until compile time:

Node "ElementWhereNumber1IsPromoted" - The promoted property field or one of its parents has Max Occurs greater than 1. Only nodes that are guaranteed to be unique can be promoted as property fields.

Bummer!

So how do we get this working? If I really need to promote a value that occurs in an element that might occur multiple times, I see four options:

  1. Map to a schema on receive port
  2. Custom pipeline component
  3. Orchestration to do it and then publish to MessageBox
  4. Call pipeline from orchestration
I will go through these options in more detail here:

Option 1: Map to a schema on receive port.

When a map is executed on a receive port, some extra magic functionality is performed by BizTalk. After the map has been executed, the message is sent through some code that promotes properties that are specified inside the destination schema. If you execute a map inside an orchestration, this doesn’t happen.

So you can create a schema that has an extra field, in which you place the value that needs to be promoted. This element must not be able to occur multiple times. Promote this new field, and after the map on the receive port has been executed, you have your value promoted.

Option 2: Custom Pipeline Component.

It isn’t that difficult to create a custom pipeline component, that can promote a field for you. Your Execute method might look just like this:

public Microsoft.BizTalk.Message.Interop.IBaseMessage Execute(IPipelineContext pContext, Microsoft.BizTalk.Message.Interop.IBaseMessage pInMsg)
{
    pInMsg.Context.Promote("MyProp", "http://ReoccuringElement.PropertySchema", "MyValue");
    return pInMsg;
 }

Of course, you will probably want to load the body stream of the IBaseMessage somehow, in order to find the value inside the body to promote, and then replace "MyValue" with the value from within the XML.
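
As a rough sketch only (the XPath and property names below are assumptions carried over from the snippet above, not a definitive implementation), loading the body into an XmlDocument and promoting the value found there might look like this:

public Microsoft.BizTalk.Message.Interop.IBaseMessage Execute(IPipelineContext pContext, Microsoft.BizTalk.Message.Interop.IBaseMessage pInMsg)
{
    // Load the body stream so we can read the value to promote (sketch; no streaming optimizations)
    System.Xml.XmlDocument doc = new System.Xml.XmlDocument();
    doc.Load(pInMsg.BodyPart.GetOriginalDataStream());

    // Placeholder XPath: point it at the element whose value should be promoted
    System.Xml.XmlNode node = doc.SelectSingleNode("//*[local-name()='ElementWhereNumber1IsPromoted'][1]");
    string valueToPromote = (node != null) ? node.InnerText : string.Empty;

    // Promote the value against the property schema, as in the snippet above
    pInMsg.Context.Promote("MyProp", "http://ReoccuringElement.PropertySchema", valueToPromote);

    // Give downstream components a readable copy of the body stream
    System.IO.MemoryStream outStream = new System.IO.MemoryStream();
    doc.Save(outStream);
    outStream.Position = 0;
    pInMsg.BodyPart.Data = outStream;

    return pInMsg;
}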

Just use the pipeline component inside a custom receive pipeline, and you are all set.

Option 3: Orchestration to do it and then publish to MessageBox

Create an intermediate orchestration that gets the input message. Then, it should create a new message of the same type in a Message Assignment shape like this:
NewMessage = InputMessage;
NewMessage(*) = InputMessage(*);
NewMessage(MyNewProperty) = xpath(InputMessage, xpathexpression);

Then, use a direct bound port to publish the message to the MessageBox. In order for the new property to follow the message, you need to initialize a correlation set on the send shape that is based on this new property.

Let other orchestrations and send ports subscribe to this message and let them do their work.

Option 4: Call pipeline from orchestration

The last option is to call a receive pipeline from within your orchestration. This requires a new schema that has a field for the value to be promoted, just as in option 1. Inside your orchestration, map the input message to this new schema, and call a receive pipeline with this new message as a parameter. Remember to promote the field in this new schema. There is an article on MSDN about calling a pipeline from within an orchestration, which can be found at http://msdn2.microsoft.com/en-us/library/aa562035.aspx. A rough sketch of such a call is shown below.
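
A minimal sketch of the orchestration expression (inside an atomic scope, in a Message Assignment shape), assuming a receive pipeline called MyProject.PromoteReceivePipeline, messages msgMapped and msgPromoted of the new schema type, and a variable receivePipelineOutput of type Microsoft.XLANGs.Pipeline.ReceivePipelineOutputMessages; all names here are assumptions:

// Run the receive pipeline against the mapped message
receivePipelineOutput = Microsoft.XLANGs.Pipeline.XLANGPipelineManager.ExecuteReceivePipeline(
    typeof(MyProject.PromoteReceivePipeline), msgMapped);

// Fetch the single output message; its promoted properties are now populated
msgPromoted = null;
receivePipelineOutput.MoveNext();
receivePipelineOutput.GetCurrent(msgPromoted);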

Upsides and downsides

In order to choose which way to go in a specific solution, several things need to be considered.

Basically, I'd go for option 1 almost anytime. This is because it is best practice to map anything incoming into a canonical schema anyway. So instead of promoting values inside all your partners' schemas - schemas they might change - you should promote from within your own canonical schema.
Reasons not to choose option 1 include: the canonical schema also has a reoccurring element, so it hasn't provided extra functionality with regards to getting this specific value promoted. Or perhaps we aren't using canonical schemas, because there was no time for this when the project was started.
If we can't go for number 1, I'd go for number 3. Number 2 requires programming of a pipeline component, which can be a bottleneck unless done correctly. Also, the pipeline component is a whole new component to maintain, document and test. Number 4 requires a new schema and therefore also a map to be built. If I am ready to do this, I'd go for number 1 instead.
If I don't like number 3, for unknown reasons, I'd go for option 2 - the custom pipeline component. Although it is custom code, and must be done right, and tested and everything... I still feel that creating a new schema and map in order to call the pipeline in option 4 is overkill, since I'd go for option number 1 instead, which would also require the new schema and map.
I hope this explains some details about this issue, and that it helps someone in the future.

Saturday, July 9, 2011

Flat File Messages with Positional Records

Positional records within a flat file instance message contain individual fields (items of data) that are each of a predefined length. The fields are parsed according to these lengths.

The character used to fill the unused portion of each field is known as the pad character.

Field Padding: Pad characters are used in fields within both delimited and positional records when the data contained within the field is smaller than the number of characters or bytes reserved for the field. These characters occupy the portion of the field not required by the data, if any. Pad characters are specified on a field-by-field basis using the Pad Character and Pad Character Type properties of the corresponding Field Element and Field Attribute nodes. If no pad character is specified for a particular field, the default pad character, space (" "), is used for that field.

For inbound instance messages, regardless of whether a particular record is positional or delimited, the flat file disassembler discards leading or trailing instances of the specified or default pad character for a particular field as the instance message is translated into its equivalent XML form. Whether it is leading or trailing instances of the relevant pad character that are discarded depends on whether the Justification property of the corresponding Field Element and Field Attribute node is set to Right or Left, respectively.

For outbound instance messages, the flat file assembler will insert the appropriate number of the specified or default pad character into fields so that the length of the field is correct. The pad characters will be inserted before or after the data characters based on whether the Justification property of the corresponding Field Element and Field Attribute node is set to Right or Left, respectively.

When the field to be padded in an outbound instance message is contained within a positional record, the Positional Offset and Positional Length properties of the corresponding Field Element or Field Attribute node, combined with the number of data characters that the field must contain, determine whether any pad characters are required, and if so, how many. When the field to be padded in an outbound instance message is contained within a delimited record, pad characters are only inserted when the value of the Minimum Length with Pad Character property of the corresponding Field Element or Field Attribute node exceeds the number of data characters.

Wednesday, July 6, 2011

Extract Digits and Characters out of a string with regular expressions

Extract Digits out of string with Regular expression

public string ExtractNum(string xnumberx)
{
    string[] numbers = Regex.Split(xnumberx, @"\D+");
    string RetNum = "";
    foreach (string value in numbers)
    {
        if (!string.IsNullOrEmpty(value))
        {
            int i = int.Parse(value);
            RetNum = RetNum + i;
        }
    }
    return RetNum;
}

Extract Char out of string with Regular expression

public string RetString(string strspcex)
{
    return Regex.Replace(strspcex, @"\d", "");
}
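
Illustrative calls for the two helpers above:

ExtractNum("abc123def45");   // "12345"
RetString("abc123def45");    // "abcdef"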

Clean Telephone Number with Regular expression

using System;
using System.Collections.Generic;
using System.Text;
using System.Text.RegularExpressions;
using System.Windows.Forms;
using System.Xml;

namespace GOSI.Motion
{
    public class StringUtils
    {
       public static string CleanTelephoneNumber(string inputString)
           {
               return Regex.Replace(inputString, @"[\(\)\-\.\s]", "");
           }
    }
}

Thursday, June 23, 2011

Disassembler and assembler components: invoking them directly from the command line


Pipeline Tools


DSDump.exe
Enables you to dump the document schema structure, which is a lightweight in-memory representation of one or more XSD schemas, with or without flat file annotations. This tool can be helpful when you get parsing engine errors such as $Root$0$3$2 and need to decode them. Numbers after $ mean the 0-based index of records as they appear in the document schema.

FFAsm.exe
Runs the flat file assembler component, directly invoking it by emulating a send pipeline to enable you to see how it serializes or assembles a user's XML document(s) into a flat file document.

FFDasm.exe
Runs the flat file disassembler component, directly invoking it by emulating a receive pipeline to enable you to see how it parses or disassembles a user's flat file document into one or more XML documents.

Pipeline.exe
Runs a send or receive pipeline; accepts one or more input documents and their parts, XSD schemas and related information; and produces an output document after the pipeline runs.

Pipeline.exe does not access BizTalk Server databases, so pipelines containing BizTalk Framework assembler and disassembler components that access BizTalk Server 2004 databases during execution may not be supported.
You should only use pipeline.exe to verify custom pipelines without third-party custom components. If you use pipeline.exe to verify a custom pipeline with third-party custom components, pipeline.exe will produce the desired output. However, if you deploy the same custom pipeline with third-party custom components, use the pipeline in a receive or send port, and then use pipeline.exe to submit a message to the pipeline, the pipeline will fail and BizTalk Server will return an error

XMLAsm.exe
Runs the XML assembler component, directly invoking it by emulating a send pipeline to enable you to see how it serializes, assembles, or envelopes a user's XML document(s) into an output XML document.

XMLDasm.exe
Runs the XML disassembler component, directly invoking it by emulating a receive pipeline to enable you to see how it parses, disassembles, or un-envelopes a user's XML document into one or more XML documents.

USAGE: FFDasm.exe
C:\xx\xx\xxinterfacedefinitionlocation>C:\"Program Files"\"Microsoft BizTalk Server 2009"\SDK\Utilities\PipelineTools\FFDasm xxx.txt -bs xx.xsd

Friday, June 10, 2011

FlatFile xs:Date field with an empty or NULL value

I have seen this issue a couple of times and thought of recording this extract from a blog post.

Exception:

After generating a flat file schema with optional elements (minOccurs = 0) of types such as date, int, or decimal, which do not accept an empty value, we might end up with an error as follows.

error BEC2004: The 'JoinDate' element is invalid - The value '' is invalid according to its datatype 'http://www.w3.org/2001/XMLSchema:date' - The string '' is not a valid XsdDateTime value.

Cause:
This exception is perfectly valid from the compiler's perspective. Here comes the example.
Suppose my flat file schema is generated using the following input and is delimited by '|'.

1|Name|2010-02-03|Description

As the date field (the third field) is optional in our scenario, the following input is also valid.

1|Name||Description

But when BizTalk validates this instance against the schema, it just picks the value between the two '|' symbols for the date field, which is '' (empty). Thus it throws the error when it checks whether the value '' is valid for a date field. This will occur for any datatype which does not accept an empty value.

Solution:
The solution is to let BizTalk know up front that the date field is optional and that it has to ignore any blank values that occur.

This can be done by setting the "Suppress Empty Nodes" property to "No" at the schema level (by clicking on the word "Schema" above the root node of the flat file).

This will ensure that BizTalk knows that the field might have an empty value and that it should ignore it. This will make your validation succeed.
------------------------------------------------------------------------------
public string ParseDate(string parameter1)
{
    if (!System.String.IsNullOrEmpty(parameter1))
    {
        System.DateTime dt;
        if (DateTime.TryParse(parameter1, out dt))
        {
            return dt.ToString("s");
        }
        else
        {
            return "";
        }
    }
    else
    {
        return parameter1;
    }
}
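
Illustrative call; the "s" format string produces the sortable ISO pattern yyyy-MM-ddTHH:mm:ss:

ParseDate("2011-06-10");   // "2011-06-10T00:00:00"
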
----------------------------------------------------------------------------------------------
In the map's custom XSLT, the corresponding guard tests for an empty value before emitting the date element:

<xsl:if test="$paramDate != ''">

Thursday, June 9, 2011

Debatching a flat file in BizTalk

Below is an extract of information I found in a blog post about debatching flat files in BizTalk. Sometimes you receive a batch file containing multiple records and you have to split each record into a separate message and process them individually.

This kind of splitting of a batch file is also known as debatching.

Depending on your requirement, you can either debatch an incoming message inside the pipeline or inside the orchestration.

The technique used to debatch a message inside the pipeline depends on whether the input is an XML file or a flat file.

For an XML file, you will need to use an XML disassembler, an envelope schema and a document schema, which I will discuss some other time in a separate blog post / article.

Here I will show you how you can debatch a flat file.

Let us assume that we need to debatch a flat file that contains the following structure.

Field1,Field2,Field3,Field4,Field5,Field6
Field1,Field2,Field3,Field4,Field5,Field6
Field1,Field2,Field3,Field4,Field5,Field6
Field1,Field2,Field3,Field4,Field5,Field6
Field1,Field2,Field3,Field4,Field5,Field6
Field1,Field2,Field3,Field4,Field5,Field6

First you need to create a flat file schema that looks similar to the one shown in the figure below.
Make sure you have specified the correct child delimiters for the Root (0x0D 0x0A) node as well as the Record (,) node.

Root Node:
Child Delimiter Type = Hexadecimal
Child Delimiter = 0x0d 0x0a
Child Order = Infix or Postfix

Record Node:
Child Delimiter Type = Character
Child Delimiter = ,
Child Order = Infix

Now select the Record node and set its MaxOccurs property to 1. This is actually the key to debatching. The flat file disassembler will automatically separate each record into an individual message.

Important: If you have set the Child Order Property of the Root node to Infix, make sure you have also set the property 'Allow Message Breakup At Infix Root' to Yes

Next, as you would normally do, add a receive pipeline to the project and add a flat file disassembler and set the Document schema property to the schema we created.

That’s all there is to it.

Now suppose you have a flat file that also has a header record or a trailer (footer) record or both as shown below.

FirstName,LastName,RegNo,School,City
SubjectName1,Score1
SubjectName2,Score2
SubjectName3,Score3
SubjectName4,Score4
SubjectName5,Score5
SubjectName6,Score6
SubjectName7,Score7
SubjectName8,Score8
TotalScore

You would normally create a flat file schema as shown below to parse as one single message
But for the purpose of debatching the body records, you need to create 3 separate schemas as shown below. Make sure the MaxOccurs of the Body record is set to 1.
And in the flat file disassembler, you need to set the Header schema, Document schema and Trailer schema properties appropriately.

Note: You might need to set a unique Tag Identifier property for each of the schemas, and also prefix the same tag identifier on the records as shown below, so that the flat file parser can distinguish the header, body and footer records.

HFirstName,LastName,RegNo,School,City
BSubjectName1,Score1
BSubjectName2,Score2
BSubjectName3,Score3
BSubjectName4,Score4
BSubjectName5,Score5
BSubjectName6,Score6
BSubjectName7,Score7
BSubjectName8,Score8
FTotalScore

It is possible to preserve the header record into the message context, by setting the Preserve header property of the flat file disassembler to true, so that you can use it as a header record for each of the debatched messages.

To use the preserved header, you need to set the Header schema property of the flat file assembler to the same header schema you used in the disassembler. Here’s how the output will look.

HFirstName,LastName,RegNo,School,City
BSubjectName1,Score1

HFirstName,LastName,RegNo,School,City
BSubjectName2,Score2

HFirstName,LastName,RegNo,School,City
BSubjectName3,Score3

HFirstName,LastName,RegNo,School,City
BSubjectName4,Score4

HFirstName,LastName,RegNo,School,City
BSubjectName5,Score5

HFirstName,LastName,RegNo,School,City
BSubjectName6,Score6

HFirstName,LastName,RegNo,School,City
BSubjectName7,Score7

HFirstName,LastName,RegNo,School,City
BSubjectName8,Score8

Generate a new GUID (C# script)

public string GetnewGuid()
    {
        return System.Guid.NewGuid().ToString();
    }

BizTalk WCF-SQL Adapter to load a flat file into a SQL Database


While working with flat files I came across a blog post written by Thiago Almeida, found useful information, and thought to share it; below is an extract from that post.

I have a flat file that contains a list of products. I need to load the contents of this flat file into a SQL Server 2008 table using BizTalk Server 2009 and the WCF-SQL Adapter. The table might already have some of the products in the flat file, and in this case the product row should be updated.


The data in my sample flat file was extracted from the Adventure Works sample database in the SQL Server 2008 samples in Codeplex, which gave me 504 products to play with.

Flat File and Debatching
-------------------------------------
As you might already know, messages with multiple items in them (multiple Products in this case) coming into BizTalk can be disassembled and debatched on their way in by the disassembler pipeline components. In this case, since it is a flat file that we are receiving, we will use the flat file disassembler component that comes out of the box with BizTalk.

In this post I want to go over loading the file into SQL Server in two ways: one by splitting the Product items into individual messages and loading them individually with the WCF-SQL Adapter; and another by not splitting the Product items and sending one single message to the WCF-SQL Adapter with all the products.

For that I created two flat file schemas, and both look like the below:

On one of the schemas the Product node has its ‘Max Occurs’ set to ‘unbounded’. The other schema has the Product node’s ‘Max Occurs’ set to 1. This property is what tells the flat file disassembler pipeline component if it should debatch the Products or not.

I created two BizTalk pipelines to handle the two different schemas. I dragged the flat file disassembler pipeline component to the disassemble stage of each pipeline, and selected the appropriate schema for each.




SQL Server Table and Stored Procedure
---------------------------------------------------------
On the SQL Server database side, we have a table called Product (what a surprise!) with the following columns:


We are going to call a stored procedure for each line in the flat file to load each product. An easy way to either insert the product if it doesn’t exist in the table or update it if it exists is to use the MERGE statement that is new in SQL Server 2008. So all we have in our stored procedure is the following:

CREATE PROCEDURE [dbo].[ADD_PRODUCT]

@ProductShortDescription varchar(50), @ProductFullDescription varchar(max), @UOM nchar(10), @UnitPrice money

AS BEGIN
SET NOCOUNT ON;
--Use merge statement to either insert or update product based on product short description

MERGE INTO Product AS Target

USING (SELECT @ProductShortDescription, @ProductFullDescription, @UOM, @UnitPrice)
AS Source(ProductShortDescription, ProductFullDescription, UOM, UnitPrice)
ON (Target.ProductShortDescription = Source.ProductShortDescription)
WHEN matched THEN

UPDATE SET ProductFullDescription = Source.ProductFullDescription, UOM = Source.UOM, UnitPrice = Source.UnitPrice
WHEN not matched THEN
INSERT (ProductShortDescription, ProductFullDescription, UOM, UnitPrice)
VALUES (Source.ProductShortDescription, Source.ProductFullDescription, Source.UOM, source.UnitPrice);
END

WCF-SQL Adapter Schemas
--------------------------------------------
To add the SQL Server schemas used by the WCF-SQL Adapter from the BizTalk solution you can right click on the BizTalk project, select Add, and then 'Add Generated Items'. From there you can either choose the 'Add Adapter Metadata' or the 'Consume Adapter Service' options. They will both bring up the 'Consume Adapter Service' wizard where you can connect to the target SQL Server database and select what items and operations you want to consume. In our case we are only interested in the ADD_PRODUCT strongly typed stored procedure:


This will give you a schema like the following for the ADD_PRODUCT stored procedure

Note that the ADD_PRODUCT node is the root node, and therefore can only exist once in the XML instances for this schema. This is the schema we are going to map to for the debatched Product information we get from the flat file schema with a max occurs of 1.  It is a straight map then from the single product flat file schema to the single stored procedure schema

That takes care of mapping the products when the flat file is being debatched into single product messages. Now what do we do about mapping all the products in the flat file to only one XML that is sent to the WCF-SQL Adapter? Here's where the WCF-SQL Adapter's composite operations come in handy. The composite operations in the adapter have been described in Richard Seroter's book (free sample chapter on the WCF-SQL Adapter). I created a new schema with a root node 'Request' and a second root node 'RequestResponse'. The first root node name isn't really important, as long as the second root node name is the same as the first with a 'Response' suffix. I then added the single ADD_PRODUCT schema as an XSD Import to my composite schema. This allows me to create an unbounded record under the 'Request' node and change it to have a data structure type of ns0:ADD_PRODUCT, and an unbounded record under the 'RequestResponse' node and change it to have a data structure type of ns0:ADD_PRODUCTResponse.


This allows us to map from the non debatched flat file schema to the schema created above

Calling the WCF-SQL Adapter
--------------------------------------------------------
After deploying the solution I created two receive ports and two respective receive locations – one of them configured with the debatching pipeline and the single product map, and the other configured with the single file pipeline and the composite operation map.


I then created two one way send ports with the WCF-Custom Adapter and the sqlBinding, each with a filter for one of the receive ports. The send port that filters on the debatched single product insert receive port is configured as follows, with the TypedProcedure/dbo/ADD_PRODUCT action:

The send port that filters on the single file with multiple products and composite operation map is configured as follows, with the CompositeOperation action:

Both send ports had a binding type of sqlBinding of course, with the default values (make sure useAmbientTransaction is enabled so that the stored procedure calls are inside a transaction):

Transactions and Conclusion
---------------------------------------------
So when we debatch the Products flat file on the way in we end up with multiple concurrent calls to the stored procedure via the WCF-SQL Adapter, each in its own transaction

When we map the entire file to the composite schema we end up with one transaction that wraps around all the stored procedure calls:

If we monitor the Transactions/sec for the database we see barely any activity when we use the single file method:

If we use the debatched multiple-message method we see some spikes as the multiple transactions to the database are made

As expected the single file method performs much faster for loading the 504 rows into the table. By placing a datetime column on the products table I could see the difference from the first insert to the last is only 254 milliseconds. With the debatch method BizTalk goes through the debatched records at a much slower pace taking around 16 seconds to load them all, since it has to map each debatched message, route multiple messages to the send port, create multiple transactions against SQL Server, etc.



After looking into it a bit more I also noticed that for the debatched scenario the message delivery throttling and message publishing throttling were kicking in for the BizTalk host loading the messages into SQL Server. By simply changing the number of samples that the host should base its throttling decision on to something over the 504 records being inserted, the time for the debatched inserts went down to 4 seconds from the 16 seconds mentioned above:

The debatch method is still useful in many situations – if you need to perform extra steps for each message in the batch, or if your DBAs require one transaction for each stored procedure call, etc.

WCF SQL Adapter using OUTPUT parameters rather than a SELECT statement

Recently, while working on a task, I faced an issue and came across Thiago's article on the WCF-SQL adapter, which is a very good example. What happens is shown below (ref):

ALTER PROCEDURE [dbo].[ADD_PRODUCT]
@ProductShortDescription varchar(50),
@ProductFullDescription varchar(max),
@UOM nchar(10),
@UnitPrice money

AS
BEGIN
SET NOCOUNT ON;

--Use merge statement to either insert or update product
--based on product short description

MERGE INTO Product AS Target
USING
(SELECT @ProductShortDescription, @ProductFullDescription , @UOM, @UnitPrice)
AS
Source(ProductShortDescription, ProductFullDescription, UOM, UnitPrice)
ON
(Target.ProductShortDescription = Source.ProductShortDescription)
WHEN
matched THEN
UPDATE
SET ProductFullDescription = Source.ProductFullDescription , UOM = Source.UOM, UnitPrice = source.UnitPrice
WHEN
not matched THEN
INSERT
(ProductShortDescription, ProductFullDescription, UOM, UnitPrice)
VALUES
(Source.ProductShortDescription, Source.ProductFullDescription, Source.UOM, Source.UnitPrice);
SELECT 'HELLO' AS Hello --Added to send a value back in the response
END

I had to change the SQL schema too, as shown below.

Now when I fired up the same example I got a similar error to the one my colleague got, namely:

“The adapter failed to transmit message going to send port "InsertProductSingleFile_WCFSQL" with URL "mssql://.//BT09WebcastsInvoice?". It will be retransmitted after the retry interval specified for this Send Port. Details:"Microsoft.ServiceModel.Channels.Common.InvalidUriException: Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was reached. —> System.InvalidOperationException: Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was reached.”

I executed sp_who in SQL Server Management Studio to find out how many connections to the database had been opened by the composite WCF-SQL adapter. I found that the composite adapter had created 100 connections (the maximum that was set in the WCF-Custom binding).

Why is the WCF-SQL adapter creating so many connections when a record set is returned? I think the answer is that a separate connection is opened and held until all the results are returned, but unfortunately we run out of connections before the entire composite transaction has finished.

UPDATE 2011-04-08 Communication from Thiago
“…, it’s a known limitation:
http://msdn.microsoft.com/en-US/library/dd788151(v=BTS.10).aspx

I guess they would tell you to increase the MaxConnectionPoolSize value to a value larger than the number of operations you expect to bunch together, the max value being 2,147,483,647."

Next, I changed the stored procedure to return the result using an OUTPUT parameter instead of a SELECT statement, like so:

ALTER PROCEDURE [dbo].[ADD_PRODUCT]
@ProductShortDescription varchar(50),
@ProductFullDescription varchar(max),
@UOM nchar(10),
@UnitPrice money,
@Hello varchar(50) OUTPUT

AS
BEGIN
SET NOCOUNT ON;
--Use merge statement to either insert or update product
--based on product short description

MERGE INTO Product AS Target

USING
(SELECT @ProductShortDescription, @ProductFullDescription , @UOM, @UnitPrice)
AS
Source(ProductShortDescription, ProductFullDescription , UOM, UnitPrice)
ON
(Target.ProductShortDescription = Source.ProductShortDescription)
WHEN
matched THEN
UPDATE
SET ProductFullDescription = Source.ProductFullDescription , UOM = Source.UOM , UnitPrice = Source.UnitPrice
WHEN
not matched THEN
INSERT
(ProductShortDescription, ProductFullDescription, UOM , UnitPrice)
VALUES
(Source.ProductShortDescription, Source.ProductFullDescription, Source.UOM, Source.UnitPrice);
--SELECT 'HELLO' AS Hello --Added to send a value back in the response
SET @Hello = 'HELLO'
END

Running the sample again, the data is inserted into the database without any error because only one pooled connection is created.

In summary, if you use the WCF-SQL adapter with a composite operation and you want to return a result set, you must do this using OUTPUT parameters rather than a SELECT statement. If you don't, you risk running out of connections.