BizTalk 2006 R2/2009 to BizTalk 2013 Migration Approach

Reasons to Move On

There are a number of businesses still relying on BizTalk 2006 R2 as an integration platform. Medium to large corporations may have a significant number of integration partners, and hence BizTalk applications numbering in the hundreds. Considering the criticality of BizTalk Server to such a business and the effort involved in upgrading and retesting all BizTalk applications, it is not surprising that a business may decide not to go ahead with a BizTalk Server upgrade unless there is a compelling case for it. As BizTalk 2006 R2 is already past its mainstream support end date (7/12/2011) and the extended support end date (7/12/2016) is rapidly approaching, businesses are facing the inevitable upgrade.

BizTalk Server 2006 R2 Lifecycle

| Products Released | Lifecycle Start Date | Mainstream Support End Date | Extended Support End Date | Service Pack Support End Date |
| --- | --- | --- | --- | --- |
| BizTalk Server 2006 R2 Branch Edition | 6/23/2006 | 7/12/2011 | 7/12/2016 | 4/12/2011 |
| BizTalk Server 2006 R2 Developer Edition | 6/23/2006 | 7/12/2011 | 7/12/2016 | 4/12/2011 |
| BizTalk Server 2006 R2 Enterprise Edition | 6/23/2006 | 7/12/2011 | 7/12/2016 | 4/12/2011 |
| BizTalk Server 2006 R2 Service Pack 1 | 4/27/2010 | Review Note | Review Note | |

Upgrading to BizTalk Server 2010 is probably not a wise decision in 2014 as the mainstream support end date for that product is only a couple of years away:

BizTalk Server 2010 Lifecycle

| Products Released | Lifecycle Start Date | Mainstream Support End Date | Extended Support End Date | Service Pack Support End Date |
| --- | --- | --- | --- | --- |
| BizTalk Server Branch 2010 | 11/14/2010 | 1/12/2016 | 1/12/2021 | |
| BizTalk Server Developer 2010 | 11/14/2010 | 1/12/2016 | 1/12/2021 | |
| BizTalk Server Enterprise 2010 | 11/14/2010 | 1/12/2016 | 1/12/2021 | |
| BizTalk Server Standard 2010 | 11/14/2010 | 1/12/2016 | 1/12/2021 | |

The end of life for Windows Server 2003 R2 (7/13/2010 Mainstream and 7/13/2015 Extended) and Windows Server 2008 (1/13/2015 Mainstream and 1/13/2020 Extended) is an equally important consideration for migration.

Upgrade vs. Migration

Two approaches can be considered:

  • Upgrading of the existing BizTalk Server 2006 Infrastructure to BizTalk Server 2013
  • Building a new BizTalk Server 2013 Infrastructure to co-exist with the existing BizTalk Server 2006 Infrastructure to allow for gradual migration of BizTalk Applications

 

The latter approach (gradual migration of BizTalk Applications from the BizTalk Server 2006 platform to the BizTalk Server 2013 platform) can be shown to have a number of significant advantages, while the feasibility of a BizTalk 2006 to BizTalk 2013 upgrade is questionable.

Upgrading to BizTalk Server 2013 directly from BizTalk Server 2006 R2 is not supported. If such an upgrade were to be attempted, it would have to happen in two stages:

  • Upgrade to BizTalk Server 2010 (Intermediate Configuration) from BizTalk Server 2009/2006 R2 (Current Configuration)
  • Upgrade to BizTalk Server 2013 (Target Configuration) from BizTalk Server 2010/2009 (Intermediate Configuration)

     

BizTalk Server 2013 supports the following new versions of the software dependency stack, which is a major update from the current stack:

  • Windows Server 2008 R2 SP1, Windows Server 2012, Windows 7 SP1, Windows 8.
  • .NET Framework 4.5
  • Visual Studio 2012
  • SQL Server 2012 and SQL Server 2008 R2 SP1

     

The entire software stack needs to be deployed to leverage the new capabilities of BizTalk Server 2013.

The hardware architecture should also be reviewed to account for:

  • Virtualisation of BizTalk 2013 Servers
  • New Disaster Recovery capabilities

 

| Key System Component | Current Configuration | Intermediate Configuration | Target Configuration |
| --- | --- | --- | --- |
| BizTalk Server | BizTalk Server 2006 R2 Enterprise Edition | BizTalk Server 2010 Enterprise Edition | BizTalk Server 2013 Enterprise Edition |
| Operating System | Microsoft Windows Server 2003 64 Bit Enterprise Edition SP2 | Windows Server 2008 R2 64 Bit Enterprise Edition | Windows Server 2012 64 Bit Enterprise Edition |
| SQL Server | SQL Server 2005 Enterprise Edition (64-bit) | SQL Server 2008 R2 Enterprise Edition (64-bit) | SQL Server 2012 Enterprise Edition |
| SQL Server Notification Services | | SQL Server 2005 Notification Services with Service Pack 2 | SQL Server Notification Services cannot be used by BAM Alerts on SQL Server 2012; if SQL Server is upgraded to SQL Server 2012, SQL Server Database Mail is required |
| Microsoft .NET Framework | 2.0 | 4.0 and 3.5 SP1 | 4.5 |
| Visual Studio | Visual Studio 2005 | Visual Studio 2010 | Visual Studio 2012 |

Migration Methods

Database Migration

BizTalk Applications may rely on databases for the purposes of:

  • Configuration / Message Routing
  • Value mapping
  • Data archiving
  • Data-driven Business Logic implemented as Stored Procedures

     

Such databases need to be migrated to the new infrastructure.

The sections below describe the different methods available to migrate existing BizTalk Application Databases over to the new hardware infrastructure.

Database Export / Import

The Export/Import method involves Exporting the source Database to Flat Files and then importing the Files into the target Database.

Benefits of Export/Import

Benefits of performing an Export/Import include:

  • The ability to optimally layout the Database Data files
  • The ability to compress the Database on the fly
  • The ability to lay out data evenly within the database, thereby removing any existing Database storage issues including fragmentation, out-of-order pages, etc.
Risks of Export/Import

The following downsides may occur when utilising the Export/Import migration method:

  • This method requires the longest System outage
  • Highest level of technical complexity
Export/Import Process Flow

The process flow is as follows:

Pre-Migration Tasks
  • Install target Database in the Cluster
Migration Tasks
  • Stop the BizTalk Application
  • Export data from the source Database
  • Import data into the target Database
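As a sketch, the export and import steps can be scripted with the SQL Server bcp utility. The server, database, and table names below (SourceSql, TargetSql, BizTalkAppDb, dbo.ValueMap) are hypothetical placeholders:

```shell
rem Export a table from the source Database to a native-format flat file
bcp BizTalkAppDb.dbo.ValueMap out D:\Export\ValueMap.dat -n -S SourceSql -T

rem Import the flat file into the (pre-created) target Database
bcp BizTalkAppDb.dbo.ValueMap in D:\Export\ValueMap.dat -n -S TargetSql -T
```

Here -n uses SQL Server's native data format and -T uses a trusted (Windows) connection; running several such export/import pairs side by side for different tables is how a degree of parallelism can be introduced.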
Advanced Export/Import Techniques
Manually run Export/Import

We can use Advanced Techniques to speed up the Export/Import process. By taking manual control of the process, we can introduce a high level of parallelism in order to minimise the System outage time.

These Techniques include:

  • Using multiple servers to perform different tasks in parallel
  • Parallel Export/Import
  • Table and Package Splitting
Use log shipping to migrate Database to faster Infrastructure before Export

Often there can be a network bottleneck between the current Infrastructure and the new Infrastructure. This is normally due to the usage of older 1Gb/s Network cards which provide a maximum theoretical throughput of about 400GB an hour. The new Infrastructure will utilise far more efficient 10Gb/s Network cards.

To compensate for the slower 1Gb/s Network card, we can utilise MS-SQL log shipping to migrate the Database to the faster Infrastructure first. This has the effect of dramatically increasing throughput and reducing any System outage.

Database Backup/Restore

The Backup/Restore migration method involves taking a backup of the source Database to flat files and then performing a Database restore from the flat files to the target Database.

We can either use the standard MS-SQL backup software to perform backups to flat files or use more efficient third party backup tools such as EMC Avamar.

Benefits of Backup/Restore

Benefits of performing the Backup/Restore migration method include:

  • Will be the easiest method
Risks of Backup/Restore

The following downside may occur when utilising the Backup/Restore migration method:

  • No ability to optimally layout the Database Data files
  • Unable to compress the Database on the fly
  • Any existing Database storage issues will remain after the migration process. This includes Fragmentation, out of order pages etc.
Backup/Restore Process Flow

The backup/restore process flow is as follows:

Migration Tasks
  • Stop the BizTalk Application
  • Backup the source Database to flat files
  • Restore the target Database from flat files
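As a sketch, assuming a hypothetical BizTalkAppDb database and placeholder server names and paths, the standard MS-SQL backup and restore can be run from the command line with sqlcmd:

```shell
rem Back up the source Database to a flat file on a network share
sqlcmd -S SourceSql -E -Q "BACKUP DATABASE [BizTalkAppDb] TO DISK = N'\\share\BizTalkAppDb.bak' WITH INIT"

rem Restore the target Database, relocating the data and log files to the target drives
sqlcmd -S TargetSql -E -Q "RESTORE DATABASE [BizTalkAppDb] FROM DISK = N'\\share\BizTalkAppDb.bak' WITH MOVE N'BizTalkAppDb' TO N'E:\Data\BizTalkAppDb.mdf', MOVE N'BizTalkAppDb_log' TO N'F:\Log\BizTalkAppDb_log.ldf'"
```

Note that WITH MOVE only relocates the physical files; as stated above, this method cannot change the internal layout of the database itself.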

MS-SQL Log Shipping

With MS-SQL log shipping, we are able to copy the database to the target server and then keep it synchronised with the source database by applying the Transaction logs.

When we need to physically migrate the System to the new infrastructure, a final log backup of the source Database is taken and used to roll forward the target Database. This process results in a minimal outage to the target System during the migration process.

Benefits of MS-SQL Log Shipping

Benefits of using the MS-SQL log shipping approach include:

  • One of the quickest methods
  • Migrating the Database is performed with little or no interruption to the source System
Risks of MS-SQL Log Shipping

The following downsides may occur with the MS-SQL log shipping approach:

  • No ability to optimally layout the Database Data files
  • Inability to compress the Database on the fly
  • Any existing Database storage issues will remain after the migration process. This includes Fragmentation, out of order pages etc.
  • Will require a temporary change to the regular Backup process while the log shipping process is active
Update Backup Policies

When we activate MS-SQL log shipping, the current Backup Policies must be changed, as we can no longer take log backups using EMC Avamar. We still need to take Daily Full backups, but any Log Backups must now be performed via MS-SQL log shipping. This means we need to introduce a new backup policy to independently back up the log shipping directory in case a database Restore is required.

MS-SQL Log Shipping Process Flow

The process flow is as follows:

Pre-Migration Tasks
  • Install the target state Log-ship Database on the Pre-Prod server
  • Disable the current Backup Policies
  • Activate log shipping on the source Database
  • Enable Backup Policies to Back up the log shipping directory
  • Backup the source Database
  • Restore Backup to the Log-ship Database
  • Activate log shipping on the Log-ship Database
  • Install target Database in the Production Cluster
Migration Tasks
  • Stop the BizTalk application
  • Take the last log shipping Log Backup
  • Restore the last Log into the Log-ship Database and Recover the Database
  • Export from the Log-ship Database
  • Import into the target Database
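The cutover (taking the last Log Backup and recovering the Log-ship Database) can be sketched with sqlcmd; server and database names below are hypothetical:

```shell
rem Take the final (tail) log backup on the source; NORECOVERY leaves the source in the restoring state
sqlcmd -S SourceSql -E -Q "BACKUP LOG [BizTalkAppDb] TO DISK = N'\\share\logship\BizTalkAppDb_tail.trn' WITH NORECOVERY"

rem Apply the final log to the Log-ship Database and bring it online
sqlcmd -S PreProdSql -E -Q "RESTORE LOG [BizTalkAppDb] FROM DISK = N'\\share\logship\BizTalkAppDb_tail.trn' WITH RECOVERY"
```

The tail-log backup captures any transactions not yet shipped, which is what keeps the System outage to a minimum.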

EMC Disk Based Replication

The EMC Disk based Replication Method involves using SAN technology to migrate Database files. Currently, we are able to use EMC disk replication technology to copy LUNs.

Benefits of EMC Disk Based Replication

Benefits of performing an EMC Disk based Replication include:

  • One of the quickest methods of System migration
Risks of EMC Disk Based Replication

The following downsides may occur with the EMC Disk based Replication approach:

  • No ability to optimally layout the Database Data files
  • Unable to compress the Database on the fly
  • Any existing Database storage issues will remain after the migration process. This includes Fragmentation, out of order pages etc.
  • Requires co-ordination of teams outside of the Basis team
  • May require specialised EMC SAN technology skills.
EMC Disk Based Replication Process Flow

The process flow is as follows:

Migration Tasks
  • Stop the BizTalk application
  • Detach the source Database
  • Copy LUNs to the target server
  • Attach the target Database
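Once the copied LUNs have been presented to the target server by the SAN team, the detach and attach steps reduce to two commands; the names and file paths below are hypothetical:

```shell
rem Detach the source Database before the LUN copy
sqlcmd -S SourceSql -E -Q "EXEC sp_detach_db @dbname = N'BizTalkAppDb'"

rem Attach the copied database files on the target server
sqlcmd -S TargetSql -E -Q "CREATE DATABASE [BizTalkAppDb] ON (FILENAME = N'E:\Data\BizTalkAppDb.mdf'), (FILENAME = N'F:\Log\BizTalkAppDb_log.ldf') FOR ATTACH"
```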

BizTalk Application Migration

MSI Export / Import

It is not possible to import into BizTalk 2013 an MSI that has been exported from BizTalk 2006 R2.

The main reason for this is the platform alignment in BizTalk 2013 with the .NET 4.5 framework, while BizTalk 2006 R2 applications run on the .NET 2.0 platform.

Visual Studio Conversion Wizard

BizTalk 2006/Visual Studio 2005 to BizTalk 2013 Visual Studio 2012

The Visual Studio 2012 Conversion Wizard can be used to convert Visual Studio 2005 projects into Visual Studio 2012 projects. To use Visual Studio 2012 with BizTalk 2006 projects, you must install BizTalk 2013 on the computer that has Visual Studio 2012. The first time you open a BizTalk 2006 project, it is automatically upgraded to the BizTalk 2013/Visual Studio 2012 project system.

Trading Partner Configuration Migration

Microsoft BizTalk Server Party Migration Tool (Party Migration Tool)

BizTalk 2006/Visual Studio 2005 to BizTalk 2010/Visual Studio 2010

Before you start upgrading to BizTalk Server 2010, you must run the Microsoft BizTalk Server Party Migration Tool (Party Migration Tool) to migrate the party-specific data from BizTalk Server 2006 R2 or BizTalk Server 2009 to the new TPM model in BizTalk Server 2010.

If you have parties with Electronic Data Interchange (EDI) data, you will have to migrate the party data from the old Trading Partner Management (TPM) model to the new TPM model.

BizTalk Server 2006 R2 did not require Business Activity Monitoring (BAM) to be configured as part of AS2 configuration. However, for BizTalk Server 2010, you must have BAM configured before configuring AS2. So, while migrating from BizTalk Server 2006 R2 environment (that has AS2 configured) to BizTalk Server 2010, the tool first checks if BAM is configured. If not, the tool prompts you to configure BAM in BizTalk Server 2006 R2 and then run the tool. For more information, see http://go.microsoft.com/fwlink/?LinkId=183138.

In BizTalk 2013, the EDI/AS2 Runtime is divided into three functionalities:

  • BizTalk EDI Functionality
  • BizTalk AS2 Functionality
  • BizTalk EDI/AS2 Runtime Status Reporting

For the last two functionalities, “AS2” and “EDI/AS2 Runtime Status Reporting”, BAM Tools must be configured.

BizTalk 2010/Visual Studio 2010 to BizTalk 2013 Visual Studio 2012

Before you upgrade from BizTalk Server 2009 to BizTalk Server 2013, run the Microsoft BizTalk Server Party Migration Tool (Party Migration Tool) to migrate the party-specific data from BizTalk Server 2009 to the new TPM model in BizTalk Server. For more information about the enhanced Trading Partner Management (TPM) provided with BizTalk Server 2013, see Trading Partner Management Using BizTalk Server.

Run the Microsoft BizTalk Server Party Migration Tool (Party Migration Tool). If you have parties with Electronic Data Interchange (EDI) data, migrate the party data from the old Trading Partner Management (TPM) model to the new TPM model.

Certificates Migration

Web server certificates contain information about the server that allows the client to positively identify the server over a network before sharing sensitive information, in a process called authentication. Secure Sockets Layer (SSL) uses these certificates for authentication, and uses encryption for message integrity and confidentiality. SSL is a public key–based security protocol that is used by Internet services and clients to authenticate each other and to establish message integrity and confidentiality.

If you use SSL to protect confidential information exchanged between the Web server and the client, you must migrate or export the certificates and the associated private keys from the source server to the target server.

Certificate Export/Import

Windows Server 2003 to Windows Server 2012
Choosing an Export Format

If you are exporting certificates to be imported onto a computer running Windows, PKCS #7 format is the preferred export format, primarily because this format preserves the chain of certification authorities, or the certification path, of any certificate, and includes countersignatures associated with signatures.

If you are exporting certificates for import onto a computer running another operating system, it is possible that the PKCS #7 format is supported. If it is not supported, the DER Encoded Binary format or the Base64 Encoded formats are provided for interoperability.

Exporting a Server Certificate from Windows Server 2003
  • In the Run dialog box, type mmc, and then click OK. The Microsoft Management Console (MMC) appears.
  • If you do not have Certificate Manager installed in MMC, you need to install it.
  • For more information on how to add the Certificates snap-in to an MMC console, see the procedure “To add the Certificates Snap-in to MMC” in Install a Server Certificate in this appendix.
  • In the console tree, click the logical store where the certificate you want to export exists. Usually this is in the Certificates folder in the Personal directory under Certificates (Local Computer) on the Console Root.
  • Right-click the certificate you want to export, click All Tasks, and click Export to start the Certificate Export Wizard.
  • Click Next.
  • On the Export Private Key page, click Yes to export the private key.
  • In the Export File Format dialog box, click the format you want for the certificate. If the certificate has already been formatted, that format is selected as the default. Click Next.
  • Do not select Delete the private key if export is successful, because this will disable the SSL site that corresponds to that private key.
  • Continue to follow steps in the wizard, and enter a password for the certificate backup file when prompted. Using a strong password is highly recommended because it ensures that the private key is well protected.
  • Type the name of the file you want to export, or click Browse to search for the file. Click Next.
  • Click Finish to complete the Certificate Export Wizard.
Importing a Server Certificate to Windows Server 2012

In Windows Server 2012, you need to perform the following steps to import a PFX certificate into the Certificate store:

  • Start an MMC session: from a command prompt, type MMC.
  • On the File menu, click Add/Remove Snap-in.
  • Add the Certificates snap-in.
  • Select Computer Account.
  • Select Local Computer.
  • Expand the Personal certificate store.
  • Right-click Certificates, point to All Tasks, and click Import. The Certificate Import Wizard will begin.
  • Browse to the location of the PFX file.
  • Type in the password for the certificate, and mark the key as exportable, in case you need to re-export the key elsewhere in the future.
  • Place the certificate in the Personal store.
  • Click Finish to complete the wizard.
  • When the import is successfully completed, the certificate and intermediate certificate will be displayed in the Certificates folder.
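On Windows Server 2012 the import can alternatively be scripted with certutil; the file path and password below are placeholders:

```shell
rem Import the PFX (certificate plus private key) into the local machine Personal store
certutil -f -p MyPfxPassword -importPFX C:\Certs\webserver.pfx
```

Run this from an elevated command prompt; -f forces an overwrite if a certificate with the same thumbprint already exists.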

BRE Policies and Vocabularies Migration

Migration of BRE Policies and Vocabularies from BizTalk 2006 R2 to BizTalk 2013 via BRE Policies and Vocabularies Export/Import may work but is not supported by Microsoft. The business rules may have to be built from scratch on the BizTalk 2013 environment.

BRE Policies and Vocabularies Export/Import

Rules Engine Deployment Wizard can be used to export a policy or vocabulary to an XML file. The file can be imported into BRE on the target system, once again using the Rules Engine Deployment Wizard.

As the Policies and Vocabularies are exported as XML, which is a human-readable format, the contents of the file can be modified prior to import.

It’s important to import all dependencies first:

  • All vocabularies used by a policy or vocabulary must be imported first or you will get an error.
  • All assemblies used by a policy or vocabulary must be imported first or you will get an error.
Exporting BRE Policies and Vocabularies from BizTalk Server 2006
  • Click Start, point to All Programs, point to Microsoft BizTalk Server 2006, and then click Business Rules Engine Deployment Wizard.
  • On the welcome page, click Next.
  • On the Deployment Task page, select Export Policy/Vocabulary to file from database and then click Next.
  • On the Policy Store page, from the drop-down lists, select an available SQL Server computer and database, and then click Next.
  • On the Export Policy/Vocabulary page, do the following, and then click Next.
  • Select Policy or Vocabulary depending on what you want to export.
  • From the Policy/Vocabulary drop-down list, select the desired policy/vocabulary.
  • Click Browse to select the definition file.
  • Review the server, database, and policy or vocabulary information, and then click Next.
  • After the export is completed, click Next.
  • Review the completion status of the export, and then click Finish.
Importing BRE Policies and Vocabularies into BizTalk Server 2013
  • Click Start, point to All Programs, point to Microsoft BizTalk Server 2013, and then click Business Rules Engine Deployment Wizard.
  • On the welcome page, click Next.
  • On the Deployment Task page, select Import and Publish Policy/Vocabulary to database from file and then click Next.
  • On the Policy Store page, from the drop-down lists, select an available SQL Server computer and database, and then click Next.
  • On the Import Policy/Vocabulary page, do the following, and then click Next.
  • Select Policy or Vocabulary depending on what you want to import.
  • Click Browse to select the definition file.
  • Review the server, database, and policy or vocabulary information, and then click Next.
  • After the import is completed, click Next.
  • Review the completion status of the import, and then click Finish.

BAM Activities Migration

BAM Management Utility

BAM activities will be migrated by re-deploying the BAM Activities definition files using the bm.exe utility.

Exporting BAM Definitions from BizTalk Server 2006

If you are using BAM in BizTalk Server 2009/2006 R2, you must manually regenerate the LiveData Workbook. To regenerate the LiveData Workbook, follow these steps:

  • Retrieve the BAM Definition by running the following command:

        BM get-defxml MyDef.xml

  • Re-create the PivotTable reports by first starting Microsoft Office Excel and then selecting the BAM Add-ins. Import the file MyDef.xml created in Step 1 and re-create the PivotTable reports. Save the new BAM Workbook as MyNewBook.xls.
  • Rename the PivotTable reports by finding the PivotTable names in MyDef.xml under <Caption> in the path:

        <BAMDefinition>\<Extension>\<OWC>\<PivotTableView>\<PivotTable>\<PivotView>\<Label>.

    Use these names to rename your PivotTable reports in MyNewBook.xls.

  • Regenerate the LiveData Workbook by running the following command:

        BM regenerate-livedataworkbook MyNewBook.xls

    Regenerated LiveData Workbooks do not re-create the Excel artefacts (for example, charts) in the original LiveData Workbook. You must manually re-create the artefacts.

Deploying BAM Definitions to BizTalk Server 2013

Administrators use the deploy-all BAM Management utility command to deploy a BAM definition from the Excel workbook or the XML definitions file exported from the workbook. When you perform a complete installation of BizTalk Server, the Configuration Wizard automatically configures the BAM Configuration XML.

  • Open a command prompt as follows:

        Click Start, click Run, type cmd, and then click OK.

  • Navigate to the tracking folder

        C:\Program Files (x86)\Microsoft BizTalk Server 2013\Tracking

  • Execute

        bm.exe deploy-all -DefinitionFile:<BAM definition file>.

    To run the BAM management utility, you must be a member of the db_owner SQL Server Database role in the BAM Primary Import, BAM Star Schema, and BAM Archive databases. You must also have db_owner permissions on the BAM Notification databases.

On a system that supports User Account Control (UAC), you may need to run the tool with Administrative privileges: right-click the application, and then select Run as administrator.

The Pros and Cons of GACing BizTalk Resources – Part 2

 

As I argued in my previous blog (The Pros and Cons of GACing BizTalk Resources), BizTalk dlls are stored in the adpl_sat table of the BizTalkMgmtDb database and used when generating MSIs, which is a good enough reason not to GAC BizTalk assemblies directly. Instead, all updates should be done as recommended, by either importing MSIs or by using the Add > BizTalk Assemblies…/Resources… option.

As it turns out, BizTalk server also uses dlls stored in the adpl_sat table of the BizTalkMgmtDb to re-GAC dependent resources when a BizTalk Application Resource is updated using the Add > BizTalk Assemblies…/Resources… option, which is just another good reason to keep the GAC and the adpl_sat table in sync:

| DLL Location | Path | Details | Used by Runtime | Used for BizTalk Configuration and Message Routing Metadata | Used to re-GAC dependent resources when a BizTalk Application Resource is re-added | Used to Export MSI | Updated when GACed | Updated when Added as a BizTalk Application Resource | Updated when MSI is Imported | Updated when Application Installation Wizard is Run |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Configuration Database | adpl_sat table of the BizTalkMgmtDb database | | No | Yes | Yes | Yes | No | Yes | Yes | Yes |
| GAC | C:\Windows\Microsoft.NET\assembly\GAC_MSIL | Microsoft.XLANGs.Core.ServiceCreationException is thrown if not in GAC; the GACed version will be used by runtime | Yes | No | No | No | Yes | Optional - “Add to the global assembly cache on add resource (gacutil)” is selected | Optional - “Add to the global assembly cache on MSI file import (gacutil)” is selected | Optional - “Add to the global assembly cache on MSI file install (gacutil)” is selected |
| File System | %ProgramFiles%\Generated by BizTalk\<application name> | | No | No | No | No | No | No | No | Yes |

I first suspected that was the case when working on a project with fairly unrealistic timelines. The development team was under a lot of pressure to meet the deadline so all sorts of shortcuts were taken, including GACing of BizTalk assemblies directly, rather than pushing updates by either importing an MSI or refreshing specific resources by using the Add > BizTalk Assemblies…/Resources… option.

We did save some time and effort by doing that, but on a couple of occasions things just stopped working. After some investigation we’d find that the latest changes, previously deployed to UAT by simply GACing the assemblies, would be lost… At that time the suspicion was that some developers were not getting the latest code from TFS before compiling and deploying the freshly compiled assemblies to UAT, even though everyone denied doing so.

In fact, directly GACed dlls were getting overwritten by out-dated versions stored in the adpl_sat table of the BizTalkMgmtDb database.

Consider the following scenario:

  1. A BizTalk application (let’s call it MyTestApp) is deployed into UAT via importing an MSI extracted from a developer’s local BizTalk server.
  2. MyTestApp.Maps.dll and MyTestApp.Orchestrations.dll end up in the GAC and the adpl_sat table of the BizTalkMgmtDb database.
  3. The subsequent updates are pushed by manually overwriting the MyTestApp.Maps.dll and MyTestApp.Orchestrations.dll in the GAC.
  4. At some point a decision is made to update the MyTestApp.Maps.dll by using the Add > BizTalk Assemblies…/Resources… option.
  5. MyTestApp.Orchestrations.dll is dependent on MyTestApp.Maps.dll.
  6. The MyTestApp application is stopped to ensure all orchestrations are in the unenlisted state before proceeding through the Add > BizTalk Assemblies…/Resources… step.
  7. Both the MyTestApp.Maps.dll and the dependent MyTestApp.Orchestrations.dll are updated in the GAC even though only the MyTestApp.Maps.dll was re-added.
  8. BizTalk runtime is now using the out-dated version of MyTestApp.Orchestrations.dll retrieved from the adpl_sat table of the BizTalkMgmtDb database and re-GACed.

The Pros and Cons of GACing BizTalk Resources

Any BizTalk developer will be all too familiar with the pain of updating resources shared by multiple applications: what appears like a simple task of re-adding an application resource may necessitate removal of all dependencies, followed by an inevitable re-deployment and re-configuration which greatly increases chances of something going wrong…

Needless to say, all these complications seem like way too much hassle, especially in the development and testing stages. Shortcuts are frequently sought and readily found in overwriting the dlls containing BizTalk resources in GAC with the modified versions: simply stop BizTalk host instances (and, if necessary, the IIS), GAC the assemblies and restart the hosts. BizTalk server runtime is now using the updated resources and the development/testing can go on.

Some drawbacks of this method of pushing the updates are immediately obvious and can be recognised as soon as an attempt is made to configure an affected BizTalk application. Some issues will be picked up by the subsequent testing, and you will realise that there is really no option but to do the right thing and deploy the resources by using the Add BizTalk Assemblies/Resources option.

Other issues, such as having out-dated versions of dlls deployed, may not be identified until it’s too late, once an MSI is exported and is used for deployment into the UAT environment, if you are lucky, or into the production environment, if luck deserts you completely.

As explained in the table below, BizTalk uses a copy of the dll stored in the adpl_sat table of the BizTalkMgmtDb database when generating MSIs. That copy gets updated only when a resource is added to a BizTalk application using the Add BizTalk Assemblies/Resources option, which is what you may have to do prior to exporting an MSI in cases where some dlls may have been GACed in the development/testing stages:

| DLL Location | Path | Details | Used by Runtime | Used for BizTalk Configuration and Message Routing Metadata | Used to Export MSI | Updated when GACed | Updated when Added as a BizTalk Application Resource | Updated when MSI is Imported | Updated when Application Installation Wizard is Run |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Configuration Database | adpl_sat table of the BizTalkMgmtDb database | | No | Yes | Yes | No | Yes | Yes | Yes |
| GAC | C:\Windows\Microsoft.NET\assembly\GAC_MSIL | Microsoft.XLANGs.Core.ServiceCreationException is thrown if not in GAC; the GACed version will be used by runtime | Yes | No | No | Yes | Optional - “Add to the global assembly cache on add resource (gacutil)” is selected | Optional - “Add to the global assembly cache on MSI file import (gacutil)” is selected | Optional - “Add to the global assembly cache on MSI file install (gacutil)” is selected |
| File System | %ProgramFiles%\Generated by BizTalk\<application name> | | No | No | No | No | No | No | Yes |

Enabling BRE Support for Static .Net Methods

One step which is not automated, and is therefore commonly missed when installing and configuring BizTalk server, is enabling BRE support for static .Net methods.

The condition becomes hard to diagnose at a later stage for these reasons:

  1. Business Rules Composer will allow you to use static .Net methods (such as static mscorlib String methods)
  2. Testing a Policy in Business Rules Composer will not result in an error; the outcome of applying the policy simply becomes unpredictable
  3. BRE will not report an error if the support for static .Net methods is not enabled on the server, not even an error in the application event log!

To enable BRE support for static .Net methods, add a REG_DWORD value named “StaticSupport” with a value of 1 under the following registry path:

32-bit Windows:    HKEY_LOCAL_MACHINE\Software\Microsoft\BusinessRules\3.0\

64-bit Windows:    HKEY_LOCAL_MACHINE\Software\Wow6432Node\Microsoft\BusinessRules\3.0\
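The registry change above can be applied from an elevated command prompt, for example:

```shell
rem 32-bit Windows
reg add "HKLM\Software\Microsoft\BusinessRules\3.0" /v StaticSupport /t REG_DWORD /d 1 /f

rem 64-bit Windows
reg add "HKLM\Software\Wow6432Node\Microsoft\BusinessRules\3.0" /v StaticSupport /t REG_DWORD /d 1 /f
```

Restarting the BizTalk host instances afterwards is advisable so the Rules Engine picks up the change.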


Implementing Windows Azure SQL Azure/ADO.Net/Cloud Services/Media Services Solution – Part 3: Facilitating Development for Windows Azure in Visual Studio 2012

Introduction

In this tutorial series, we will develop a simple database-driven solution for Windows Azure platform.

For the sake of simplicity, interaction with the database will be via Cloud Services (WCF services), rather than a GUI.

The solution will also showcase Media Services offered by the Windows Azure platform.

Key Technologies Used

  • Windows Azure
  • Windows Azure Storage
  • SQL Azure
  • Cloud Services (WCF services)
  • ADO.NET
  • Media Services
  • Windows Azure Diagnostics Services

Tutorial Sections

Part 3: Facilitating Development for Windows Azure in Visual Studio 2012

Installing Windows Azure SDKs

Visual Studio Tools for Windows Azure and the Windows Azure SDK are not included in the VS 2012 (or VS 2010, or VS 2013 Preview) installation.

To be able to develop a solution for Windows Azure, download and install Windows Azure SDKs for .Net.

To enable Windows Azure Tools in Visual Studio 2012:

  • On the File menu, click New and then click Project.
  • Under Installed Templates, expand the node for either Visual Basic or C#, and then click Cloud.
  • In the middle pane, select Enable Windows Azure Tools, and then click OK.

    This will add a new .Net Framework 3.5/4/4.5 project template:

    Windows Azure Cloud Service project template covers a variety of scenarios and uses, such as developing WCF services or an ASP.NET MVC web application:

    We’ll use Windows Azure Cloud Service project template and a WCF Service Web Role to publish a number of WCF services as Windows Azure Cloud Services.

Connecting to a Windows Azure Database

Connecting to an SQL Azure database is not dissimilar to connecting to a SQL Server database. Developing SQL Azure databases can be done using the same familiar set of tools, as well as the Windows Azure Platform Management Portal for SQL Database:

  1. SQL Server Data Tools

    If you wish to use the SQL Server Object Explorer in VS, install Microsoft SQL Server Data Tools.

  2. SQL Server Management Studio 2012
  3. The Management Portal for SQL Database, which is part of the Windows Azure Platform Management Portal experience.

Any of these tools supports database development tasks, schema modification, Transact-SQL queries, and extract/deploy/upgrade operations with data-tier applications.

When choosing options 1 or 2, download and install Windows Azure SDKs for .Net first.


Implementing Windows Azure SQL Azure/ADO.Net/Cloud Services/Media Services Solution – Part 1: Getting Windows Azure Account

Introduction

In this tutorial series, we will develop a simple database-driven solution for Windows Azure platform.

For the sake of simplicity, interaction with the database will be via Cloud Services (WCF services), rather than a GUI.

The solution will also showcase Media Services offered by the Windows Azure platform.

Key Technologies Used

  • Windows Azure
  • Windows Azure Storage
  • SQL Azure
  • Cloud Services (WCF services)
  • ADO.NET
  • Media Services
  • Windows Azure Diagnostics Services

Tutorial Sections

Part 1: Getting a Windows Azure Account

The first step in developing for Windows Azure will, inevitably, be obtaining a Windows Azure account at http://www.windowsazure.com/en-us/. Multiple options are available, as discussed below, but all of them involve signing in to your Microsoft account with your Windows Live login.

  1. 3-Month Free Trial

    If you’ve never had a Windows Azure account associated with your Windows Live login, the easiest way to obtain a Windows Azure account would be to sign up for a free 3-Month Trial, which is a perfectly suitable option for development and testing purposes.

    Once the account expires, you will have an option to upgrade to a Pay-As-You-Go or one of the Monthly Plans (see Option 2).

    To save yourself a headache at a later stage and to avoid financial ruin, please consider your eligibility for Options 3 and 4 before activating a Free Trial account or subscribing to a Pay-As-You-Go or one of the Monthly Plans.

  2. Pay-As-You-Go or a Monthly Plan

    For development and testing purposes, I’d recommend the Pay-As-You-Go option.

    This option is also ideal for production deployments with low anticipated traffic, database transaction and compute volumes. Estimating such volumes is an art form in itself… You may choose to subscribe to a Pay-As-You-Go account just to analyze such volumes and the associated cost breakdown over the next couple of months before finally upgrading your subscription to one of the Monthly plans available. These plans offer considerable savings but do lock you in for a minimum of 6 months.

    Follow this link for more information on monthly subscription plans and pricing options.

  3. Member Offers

    MSDN members, and members of the Microsoft Partner Network or BizSpark, can get free access to Windows Azure as part of their membership benefits.

    Follow this link for more information.

  4. Windows Azure Benefit for MSDN Subscribers

    This subscription option is available to Visual Studio Professional, Test Professional, Premium or Ultimate with MSDN or MSDN Platforms subscribers.

    Follow this link for more information.

Once you’ve picked a subscription option, you’ll be forwarded to the Windows Azure portal for your account…

Please note, the portal works best in IE.

 


Implementing Windows Azure SQL Azure/ADO.Net/Cloud Services/Media Services Solution – Part 6: Exception Logging using Windows Azure Diagnostics

Introduction

In this tutorial series, we will develop a simple database-driven solution for Windows Azure platform.

For the sake of simplicity, interaction with the database will be via Cloud Services (WCF services), rather than a GUI.

The solution will also showcase Media Services offered by the Windows Azure platform.

Key Technologies Used

  • Windows Azure
  • Windows Azure Storage
  • SQL Azure
  • Cloud Services (WCF services)
  • ADO.NET
  • Media Services
  • Windows Azure Diagnostics Services

Tutorial Sections

Part 6: Exception Logging using Windows Azure Diagnostics

It is important to remember that diagnostic data is not permanently stored unless you transfer the data to the Windows Azure storage emulator or to Windows Azure storage. You will not be able to view the exception trace written by the System.Diagnostics.Trace.WriteLine(Exception) unless Windows Azure Diagnostics are enabled for your solution and configured to use a storage account.

For more information on diagnostics in Windows Azure, please refer to the following online publications:

Enabling Diagnostics in Windows Azure

Store and View Diagnostic Data in Windows Azure Storage

Viewing Windows Azure Diagnostics data may also present a challenge as this is not facilitated by the Windows Azure portal. Please refer to this document for a review of a number of available Windows Azure Storage Explorers.

Fortunately, Visual Studio 2012 Server Explorer is perfectly capable of viewing the contents of all Windows Azure Storage types:

Linking a Diagnostics Storage Account to WaTutorial1Service

The storage account created when deploying the WaTutorial1Service application as a Windows Azure Cloud Service in Part 5: Deploying Cloud Services can be used for Data Management and Business Analytics purposes, including Diagnostics and Error logging.

However, to demonstrate the flexibility of the Windows Azure platform, we’ll create a separate storage account as a linked resource and configure the solution to use that storage account for error logging.

To link the storage account to the cloud service, select LINKED RESOURCES > LINK A RESOURCE > Create a new resource option > Storage Account:

The waTutorial1diagnostics linked storage account is now created and linked to the WaTutorial1Service cloud service:

Configuring Windows Azure Diagnostics Connection Settings

WaTutorial1Service solution utilizes Windows Azure Diagnostics Services to log all exceptions occurring in the WCF services. Specifically, System.Diagnostics.Trace.WriteLine(Exception) method is used for exception logging:

namespace WaTutorial1ServiceWebRole
{
    public class WaTutorial1Service : IWaTutorial1Service
    {
        public bool AddMedia(string title, string url)
        {
            try
            {
                using (var waTutorial1Entities = new waTutorial1Entities())
                {
                    waTutorial1Entities.AddMedia(title, url);
                }

                return true;
            }
            catch(Exception exc)
            {
                System.Diagnostics.Trace.WriteLine(exc); 

                return false;
            }
        }
        // GetMedia omitted in this excerpt; see the full listing in Part 5
    }
}

The exception trace data written to Windows Azure logs with the System.Diagnostics.Trace.WriteLine(Exception) statement will end up in the WADLogsTable table under the nominated storage account. For more information, see the Diagnostics Data Sources section of Enabling Diagnostics in Windows Azure online reference.
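As a sketch of how the WADLogsTable records could also be read programmatically (assuming the Windows Azure Storage client library 2.x; the connection string shown is a placeholder):

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

class WadLogsReader
{
    static void Main()
    {
        // Placeholder diagnostics storage connection string -- substitute your own account key.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=waTutorial1diagnostics;AccountKey=XXXX");

        CloudTableClient tableClient = account.CreateCloudTableClient();
        CloudTable wadLogsTable = tableClient.GetTableReference("WADLogsTable");

        // Each entity carries the trace output in its Message property.
        foreach (DynamicTableEntity entity in
                 wadLogsTable.ExecuteQuery(new TableQuery<DynamicTableEntity>()))
        {
            Console.WriteLine(entity.Properties["Message"].StringValue);
        }
    }
}
```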

To enable Windows Azure Diagnostics for your application, view the properties of the WaTutorial1ServiceWebRole:

Check the Enable Diagnostics option:

Click the button to specify the storage account to be used, then select the subscription the storage account was created under and, finally, the storage account name:

The connection string will be retrieved from your Windows Azure account, and the Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString entry of the ConfigurationSettings section in both the ServiceConfiguration.Cloud.cscfg and ServiceConfiguration.Local.cscfg files will be updated:

<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="WaTutorial1Service" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="3" osVersion="*" schemaVersion="2012-10.1.8">
  <Role name="WaTutorial1ServiceWebRole">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;AccountName=waTutorial1diagnostics;AccountKey=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX+XXXXXXXXXXXXXXXXXXX+XXXXXXXXXXXXX" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>

Please note that the AccountName is the name of the storage account, not of the cloud service.

Windows Azure logs trace messages sent from your code to the trace listener. A trace listener must be added to the web.config or app.config file:

<system.diagnostics>
  <trace>
    <listeners>
      <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.8.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
           name="AzureDiagnostics">
        <filter type="" />
      </add>
    </listeners>
  </trace>
</system.diagnostics>

Log data will be transferred at the scheduledTransferPeriod transfer interval to storage table WADLogsTable.

To control the transfer interval and a number of other settings, you may use the Windows Azure Diagnostics Configuration file diagnostics.wadcfg or add the following code to the OnStart method of your role.

namespace WaTutorial1ServiceWebRole
{
    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // To enable the AzureLocalStorageTraceListener, uncomment the relevant section in web.config
            DiagnosticMonitorConfiguration diagnosticConfig = DiagnosticMonitor.GetDefaultInitialConfiguration();

            diagnosticConfig.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
            diagnosticConfig.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;

            diagnosticConfig.WindowsEventLog.DataSources.Add("Application!*");
            diagnosticConfig.WindowsEventLog.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
            diagnosticConfig.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

            diagnosticConfig.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
            diagnosticConfig.Directories.DataSources.Add(AzureLocalStorageTraceListener.GetLogDirectory());

            DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", diagnosticConfig);

            // For information on handling configuration changes
            // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.

            return base.OnStart();
        }
    }
}
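Equivalently, for those preferring the declarative route, a diagnostics.wadcfg roughly matching the OnStart configuration above might look like the following (a sketch; the quota values are illustrative assumptions):

```xml
<DiagnosticMonitorConfiguration
    xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration"
    configurationChangePollInterval="PT1M"
    overallQuotaInMB="4096">
  <!-- Transfer basic (Trace) logs every minute, no level filtering -->
  <Logs bufferQuotaInMB="0"
        scheduledTransferLogLevelFilter="Verbose"
        scheduledTransferPeriod="PT1M" />
  <!-- Transfer the Application event log every minute -->
  <WindowsEventLog bufferQuotaInMB="0"
                   scheduledTransferLogLevelFilter="Verbose"
                   scheduledTransferPeriod="PT1M">
    <DataSource name="Application!*" />
  </WindowsEventLog>
</DiagnosticMonitorConfiguration>
```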

Reviewing Diagnostics Settings for WaTutorial1Service Cloud Services in Windows Azure Management Portal

To review the settings, select WaTutorial1Service from the list of all items and select the CONFIGURE option:

Viewing Contents of waTutorial1diagnostics Windows Azure Storage in Visual Studio 2012

Select Add New Storage Account option in the Server Explorer to view WaTutorial1Service Windows Azure Diagnostics in Visual Studio 2012:

Import the Windows Azure account credentials and select waTutorial1diagnostics (West US) as the account name

To examine waTutorial1service application tracing logs, view entries in WADLogsTable:


Implementing Windows Azure SQL Azure/ADO.Net/Cloud Services/Media Services Solution – Part 5: Deploying Cloud Services

Introduction

In this tutorial series, we will develop a simple database-driven solution for Windows Azure platform.

For the sake of simplicity, interaction with the database will be via Cloud Services (WCF services), rather than a GUI.

The solution will also showcase Media Services offered by the Windows Azure platform.

Key Technologies Used

  • Windows Azure
  • Windows Azure Storage
  • SQL Azure
  • Cloud Services (WCF services)
  • ADO.NET
  • Media Services
  • Windows Azure Diagnostics Services

Tutorial Sections

Part 5: Deploying Cloud Services

In this tutorial I’ll demonstrate how to publish a number of WCF services as Windows Azure Cloud Services using Windows Azure Cloud Service project template and a WCF Service Web Role.

A simple GET request to the WCF service issued from a web browser will result in data being saved to or retrieved from the Media table of the waTutorial1 database deployed in the Part 4: Deploying an SQL Azure Database part of this tutorial series.

ADO.Net Entity Framework will be used as an underlying data access layer. Programming will be done against a conceptual application model auto-generated for waTutorial1 database.

Once the web services have been developed and tested locally, we’ll proceed with creating a Cloud Service on Windows Azure and publishing the project.

Developing Windows Azure Cloud Service in Visual Studio 2012

The Windows Azure Cloud Service project template should become available once development for Windows Azure in Visual Studio 2012 has been enabled in your development environment (Part 3: Facilitating Development for Windows Azure in Visual Studio 2012):

We’ll use this template to publish a number of WCF services as Windows Azure Cloud Services.

Select WCF Service Web Role as it is best suited for developing WCF services:

Rename the role to WaTutorial1ServiceWebRole to save yourself a lot of hassle at a later stage.

A Visual Studio 2012 solution will be created with all the necessary artefacts present:

All that is left to do is to give those artefacts meaningful names. Open IService.cs and rename the interface to IWaTutorial1Service:

Proceed with renaming the IService.cs file to IWaTutorial1Service.cs in the project solution to match the interface name.

Deal with the interface implementation in the same way:

Once renaming is complete, the items in the solution explorer should look like so:

Establishing a Data Connection to the Windows Azure SQL Database Server in Visual Studio 2012

A data connection to the waTutorial1 database needs to be created in order to create/update the WaTutorial1Model ADO.NET model that will be added to the project in the next step.

Select Data Connections > New Connection in the Server Explorer to create a data connection to waTutorial1 database in Visual Studio 2012.

Fill in the connection details such as Server name, User name, login credentials and the database name (see Part 4: Deploying an SQL Azure Database for details on how to obtain the necessary connection details):

Test and save the connection. A new database connection will be added to Data Connections in Server Explorer:


Adding WaTutorial1Model ADO.NET Model

The WaTutorial1Service solution will utilize ADO.NET Entity Framework to connect to waTutorial1 database. By programming against a conceptual application model instead of programming directly against a relational storage schema, we’ll decrease the amount of code and maintenance required for data-oriented applications.

To add an ADO.NET model, add a new item to the WaTutorial1ServiceWebRole project and select ADO.NET Entity Data Model from the list of Data templates.

The model will be generated from waTutorial1 database:

Proceed to select the data connection to waTutorial1 database, making sure to select the Yes, include the sensitive data in the connection string option:

Include all available database artefacts such as the Media table and the two stored procedures:

You don’t actually need to include the Media table for the purposes of this tutorial as all data manipulations will be done via the stored procedures.

Notice that the object name for table Media is Medium, which is the result of selecting the default Pluralize or singularize generated object names option.

Also, the function imports should be available for the AddMedia and GetMedia stored procedures as Import stored procedures and functions into the entity model option was also selected by default:

Obtaining Connection Strings for ADO.Net

It is possible to retrieve connection strings to an SQL Azure database formatted specifically for ADO.Net, ODBC, PHP or JDBC.

Select waTutorial1 database from the list of databases. The connection string for ADO.NET can be found in the database section of the management portal:

To obtain the connection string for ADO.NET, select View SQL Database connection strings for ADO .Net, ODBC, PHP, and JDBC option in the administrative portal:

Updating WaTutorial1Model ADO.NET Model

To ensure that the data access layer of WaTutorial1Service application functions normally and that all connection settings are configured appropriately, update WaTutorial1Model ADO.NET model using the data connection to waTutorial1 database:

Right click on the diagram and select Update Model from Database… option:

Select the connection to waTutorial1 database, making sure to select Yes, include the sensitive data in the connection string option. This will ensure that the password is persisted in the connection string.

Complete the Update Wizard to update the ADO.NET model.

Note that updating the ADO.NET model in the way described above DOES NOT update the connection string used by ADO.NET, which is persisted in the waTutorial1Entities entry of the connectionStrings section of Web.config. However, the database connection setting will be created if it does not exist. Therefore, edit Web.config to remove the setting prior to running the Update Model from Database… wizard:

Finally, rebuild the solution.

Configuring ADO.NET to Connect to waTutorial1 SQL Server Azure Database Manually

You may choose to skip a few steps and simply edit the Web.config file manually, for example when you want to quickly switch to another database.

To configure the ADO.NET connection string, edit the waTutorial1Entities entry of the connectionStrings section of Web.config:

<connectionStrings>

    <add name="waTutorial1Entities" connectionString="metadata=res://*/WaTutorial1Model.csdl|res://*/WaTutorial1Model.ssdl|res://*/WaTutorial1Model.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=xxxxxxxxxx.database.windows.net,1433;initial catalog=waTutorial1;persist security info=True;user id=sqlserveradmin;password=xxxxxxxx;MultipleActiveResultSets=True;App=EntityFramework&quot;" providerName="System.Data.EntityClient" />

  </connectionStrings>

Replace the value of the entry with the ADO.NET connection string obtained from the administrative portal:

Specifying IWaTutorial1Service Interface

Now that the data access layer has been generated, we can proceed to develop the web services by specifying the service contract first:

The URL template and the response format can be specified using the WebGet and OperationContract attributes:

namespace WaTutorial1ServiceWebRole
{
	[ServiceContract]
	public interface IWaTutorial1Service
	{
			[WebGet(
				UriTemplate = "AddMedia?title={title}&URL={url}",
				ResponseFormat = WebMessageFormat.Json)]
			[OperationContract]
			bool AddMedia(string title, string url);
		   
			[WebGet(
				UriTemplate = "GetMedia?id={id}",
				ResponseFormat = WebMessageFormat.Json)]
			[OperationContract]
			string GetMedia(int id);
	}
}

Implementing IWaTutorial1Service Interface

The implementation of the IWaTutorial1Service interface is extremely simple:

namespace WaTutorial1ServiceWebRole
{
    public class WaTutorial1Service : IWaTutorial1Service
    {
        public bool AddMedia(string title, string url)
        {

            try
            {
                using (var waTutorial1Entities = new waTutorial1Entities())
                {
                    waTutorial1Entities.AddMedia(title, url);
                }

                return true;
            }
            catch(Exception exc)
            {
                System.Diagnostics.Trace.WriteLine(exc); 

                return false;
            }
        }

        public string GetMedia(int id)
        {
            try
            {
                using (var waTutorial1Entities = new waTutorial1Entities())
                {
                    return waTutorial1Entities.GetMedia(id).First();
                }
            }
            catch(Exception exc)
            {
                System.Diagnostics.Trace.WriteLine(exc); 

                return false.ToString();
            }
        }
    }
}

See Part 6: Exception Logging using Windows Azure Diagnostics for instructions on how to enable Windows Azure Diagnostics for your application, and for an explanation of where the exception trace written by System.Diagnostics.Trace.WriteLine(exc) ends up.

Finally, the system.serviceModel section of the Web.config file has to be edited to enable the relevant service endpoints for the IWaTutorial1Service contract:

<system.serviceModel>
    <behaviors>
      <serviceBehaviors>
        <behavior name="WaTutorial1ServiceBehavior">
          <!-- To avoid disclosing metadata information, set the value below to false before deployment -->
          <serviceMetadata httpGetEnabled="true" />
          <!-- To receive exception details in faults for debugging purposes, set the value below to true.  Set to false before deployment to avoid disclosing exception information -->
          <serviceDebug includeExceptionDetailInFaults="false" />
        </behavior>
      </serviceBehaviors>
      <endpointBehaviors>
        <behavior name="WebBehavior">
          <webHttp />
        </behavior>
      </endpointBehaviors>
    </behaviors>
    <services>
      <service behaviorConfiguration="WaTutorial1ServiceBehavior" name="WaTutorial1ServiceWebRole.WaTutorial1Service">
        <endpoint address="ws" binding="wsHttpBinding" contract="WaTutorial1ServiceWebRole.IWaTutorial1Service" />
        <endpoint address="" behaviorConfiguration="WebBehavior" binding="webHttpBinding" contract="WaTutorial1ServiceWebRole.IWaTutorial1Service"></endpoint>
        <endpoint address="mex" binding="mexHttpBinding" contract="IMetadataExchange" />
      </service>
    </services>
    <serviceHostingEnvironment multipleSiteBindingsEnabled="true" />
  </system.serviceModel>

Testing WaTutorial1Service Application Locally

The solution can be tested locally, in the Windows Azure compute emulator, using the connection to waTutorial1 SQL Azure database to facilitate debugging.

Select the Media table in the Server Explorer and choose the Show Table Data option:

Testing AddMedia Method Locally

Run the solution.

Invoke the WCF method via the following GET request:

http://127.0.0.1:81/WaTutorial1Service.svc/AddMedia?title=Video1&url=url1

The service will return a JSON encoded string representing a Boolean value: TRUE on success or FALSE on failure:

Check the contents of Media table in waTutorial1 SQL Azure database to validate URL registration process:

Testing GetMedia WCF Method Locally

Run the solution.

Invoke the WCF method via the following GET request:

http://127.0.0.1:81/WaTutorial1Service.svc/GetMedia?id=2

The service will return “False” in case of failure or a JSON encoded string representing a URL of the published media content:

Creating a Cloud Service

To set up a Cloud Service in your Windows Azure account, select CLOUD SERVICES > CREATE A CLOUD SERVICE > QUICK CREATE option:

QUICK CREATE option is sufficient for the purpose of this tutorial… Simply choose a name for the cloud service, the geographical region where you wish to host your cloud services and click the Create Cloud Service button.

When choosing the geographical region to host your services in, you may take into account:

  1. In which geographical region the waTutorial1 database is hosted, as the cloud service will be used to save and retrieve data from that database
  2. Where most of the service requests will be coming from

The CUSTOM CREATE option allows deploying a cloud service package, which we will develop in the next step:

A new cloud service will be created and hosted in West US:

Deploying WaTutorial1Service

To deploy the WaTutorial1Service application, select WaTutorial1Service project in the Solution Explorer and choose Publish… command from the menu:

Select Sign in to download credentials option at the start of the publishing wizard and follow the instructions:

Proceed to import the downloaded subscription settings:

An associated storage service for Data Management and Business Analytics purposes can be created in the same step:

Review the summary page of the publishing wizard and proceed to publish the services:

Finally, publish the application. This may take a few minutes to complete:

Examine the items under your Windows Azure account:

Testing WaTutorial1Service Online Application

Select the WaTutorial1Service cloud service from the all items list and view the dashboard to obtain the site URL:

Testing the published AddMedia Method

Invoke the WCF method via the following GET request:

http://watutorial1service.cloudapp.net/WaTutorial1Service.svc/AddMedia?title=Video2&url=url2

The service will return a JSON encoded string representing a Boolean value: true on success or false on failure.

Check the contents of Media table in waTutorial1 SQL Azure database to validate URL registration process:

Testing the published GetMedia Method

Invoke the WCF method via the following GET request:

http://watutorial1service.cloudapp.net/WaTutorial1Service.svc/GetMedia?id=3

The service will return “false” in case of failure or a JSON encoded string representing a URL of the published media content.


Implementing Windows Azure SQL Azure/ADO.Net/Cloud Services/Media Services Solution – Part 2: Publishing Video Streaming Content

Introduction

In this tutorial series, we will develop a simple database-driven solution for Windows Azure platform.

For the sake of simplicity, interaction with the database will be via Cloud Services (WCF services), rather than a GUI.

The solution will also showcase Media Services offered by the Windows Azure platform.

Key Technologies Used

  • Windows Azure
  • Windows Azure Storage
  • SQL Azure
  • Cloud Services (WCF services)
  • ADO.NET
  • Media Services
  • Windows Azure Diagnostics Services

Tutorial Sections

Part 2: Publishing Video Streaming Content

Windows Azure Media Services offers encoding, format conversion, content protection, and both on-demand and live streaming capabilities, which allow you to quickly build a media distribution solution that can stream audio and video to a variety of devices, from Xbox and Windows PCs to MacOS, iOS and Android.

You’d normally encode media content locally, and then upload the encoded files to Media Services for transcoding into multiple formats best suited for your purposes.

Once uploaded, the content can be managed with the Windows Azure Media Services content view, which supports the following functionality:

  • View content information such as published state, published URL, size, and date/time of last update
  • Upload new content
  • Encode content
  • Play content video
  • Publish/Unpublish content
  • Delete content

In this part of the tutorial series, I’ll demonstrate how to

  • Enable media services
  • Upload media content
  • Encode media content for streaming, targeting Android devices
  • Publish encoded media content
  • Test streaming of the media content in a common media player

Enabling Media Services

To enable media services in your account, select MEDIA SERVICES > CREATE A MEDIA SERVICE ACCOUNT option

Both the media service and the associated storage will be created:

Uploading Media Content

To upload a media file, select the media service from the list of all items and select Upload Content option:

Encoding Video Content for Streaming

Select one of the uploaded files in the Windows Azure Media Services content view and choose the ENCODE option.

To specifically target streaming to Android devices, select the H264 Adaptive Bitrate MP4 Set SD 16×9 option from the list of Presets.

This action will create an encoding job for the media content. The encoding process should commence within half an hour, depending on how long the jobs queue is and on the server workload, and, depending on the size of the source file, may take up to an hour to complete.

Notice that you now have two files: the original file you uploaded, which can be encoded into a number of other formats at a later stage as required, and a file encoded in the H264 Adaptive Bitrate MP4 Set SD 16×9 format. Notice also the considerable difference in file size:


Publishing Encoded Video Content for Streaming

The encoded video content has to be published in order to obtain the URL to the video streaming resource.

To publish the content, select the file encoded in H264 Adaptive Bitrate MP4 Set SD 16×9 format and select the PUBLISH option:

To obtain the URL of the encoded video resource, hover the mouse over the URL column and click the button to copy the URL to the clipboard.

The URL to the encoded resource will appear in the following format:

https://XXXXXXstorage.blob.core.windows.net/asset-3c1e1491-974a-45ad-8161-40e9a754d6a7/video1_1300.mp4?st=2013-04-19T00%3A25%3A47Z&se=2015-04-19T00%3A25%3A47Z&sr=c&si=865ae334-fa8f-4894-818d-587e85b53d28&sig=htDSL9NXpha8i6xBB63WFra9hhnlQ1L76aNcVgQVgiA%3D
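The query string appended to the URL is an Azure Storage Shared Access Signature (SAS). Briefly, its parameters are:

```text
st  - signed start time: the URL is valid from 2013-04-19T00:25:47Z
se  - signed expiry time: the URL stops working at 2015-04-19T00:25:47Z
sr  - signed resource: c indicates access is granted at the container level
si  - signed identifier: references a stored access policy on the container
sig - HMAC-SHA256 signature authorising the request
```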

Test Streaming of the Media Content in a Common Media Player

You can use any media player supporting video streaming, such as VLC or Windows Media Player, to test streaming of the encoded and published video content by supplying the URL to the encoded resource:


Implementing Windows Azure SQL Azure/ADO.Net/Cloud Services/Media Services Solution – Part 4: Deploying an SQL Azure Database

Introduction

In this tutorial series, we will develop a simple database-driven solution for Windows Azure platform.

For the sake of simplicity, interaction with the database will be via Cloud Services (WCF services), rather than a GUI.

The solution will also showcase Media Services offered by the Windows Azure platform.

Key Technologies Used

  • Windows Azure
  • Windows Azure Storage
  • SQL Azure
  • Cloud Services (WCF services)
  • ADO.NET
  • Media Services
  • Windows Azure Diagnostics Services

Tutorial Sections

Part 4: Deploying an SQL Azure Database

Creating SQL Database Server

To create a new SQL Database server, select SQL DATABASES > SERVERS > CREATE A SQL DATABASE SERVER option:

A new database server with an arbitrary name will be created under the specified subscription:

Creating waTutorial1 SQL Server Database

Once the SQL database server has been created, proceed to create the database used in the solution.

Select the SQL DATABASES > DATABASES > CREATE A SQL DATABASE option:

The database will be created within seconds:

Obtaining the SQL Azure Server Connection Details

Confusingly, while most of the database server details (such as the server name, the administrator login and the password management tools) can be found on the Database Server Dashboard (SQL DATABASES > SERVERS > your server > DASHBOARD), the server name/port combination is found in the database section of the management portal, rather than in the server section.

Select waTutorial1 database from the list of databases (you should only have one at this point).

The server name/port can be found under the Connect to your database section:
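With the server name in hand, the connection details can be assembled into a connection string. As an aside, the sketch below (Python, with placeholder names — your generated server name will differ) shows the typical shape of an ADO.NET-style SQL Azure connection string; note the `user@server` login form and the fixed TCP port 1433:

```python
def build_sql_azure_connection_string(server, database, user, password):
    """Assemble a typical ADO.NET-style SQL Azure connection string.

    `server` is the short, generated server name from the portal
    (e.g. 'xxxxxxxxxx'); SQL Azure listens on TCP port 1433 and
    requires an encrypted connection.
    """
    return (
        f"Server=tcp:{server}.database.windows.net,1433;"
        f"Database={database};"
        f"User ID={user}@{server};"
        f"Password={password};"
        "Encrypt=True;"
    )

# Placeholder values for illustration only.
print(build_sql_azure_connection_string(
    "xxxxxxxxxx", "waTutorial1", "admin_login", "secret"))
```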

Before you attempt to connect to the server, though, you need to add your current IP address to the list of allowed IP addresses.

Select the Set up Windows Azure firewall rules for this IP address option:

Connecting to the Azure SQL Server from SQL Server Management Studio 2012

You can easily use SQL Server Management Studio 2012 to connect to the SQL Azure database.

Enter the server name and the authentication details in the SQL Server Management Studio 2012 Connect to Server dialog box:

The Object Explorer for the SQL Azure server will look just as if you had connected to any SQL Server database, and you will have the complete set of tools at your disposal:

Connecting to the Azure SQL Database from Visual Studio 2012

If you wish to use the SQL Server Object Explorer in Visual Studio 2012, install Microsoft SQL Server Data Tools first.

In the Server Explorer, select the Add Connection option and enter the server name and the login details, leaving Microsoft SQL Server (SQL Client) as the Data source:

Please note that if no database is selected, the connection will be made to the master database by default:

To connect to waTutorial1 database, make sure to select it in the Connect to a database section of the Add Connection dialog. The connection to waTutorial1 will appear in the Server Explorer:

Connecting to the Azure SQL Server Management Portal

You will find the link to the Management Portal on the Database Server Dashboard (SQL DATABASES > SERVERS > your server > DASHBOARD):

You should be able to perform such tasks as designing tables, executing queries and developing stored procedures with the online Management Portal:

Creating Tables and Stored Procedures in waTutorial1 SQL Server Azure Database

We will create one simple table, Media, and a couple of stored procedures (one to insert a record into the table and one to retrieve the URL of a resource by the record ID) using the online management portal.

To create a new table, select the Design tab on the left and Tables option at the top:

Proceed to design the table, saving the changes once done:

The process of creating a stored procedure is similar: select the Design tab on the left and Stored Procedures option at the top and proceed to develop the stored procedure:

Notice that the online SQL Azure Management Portal offers a simplified syntax compared to SSMS:

-- =============================================
-- Author:        <Author,,Name>
-- Create date: <Create Date,,>
-- Description:    <Description,,>
-- =============================================
CREATE PROCEDURE AddMedia
    @Title nvarchar(250),
    @URL nvarchar(500)
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    INSERT INTO [waTutorial1].[dbo].[Media]
           ([Title]
           ,[URL])
     VALUES
           (@Title
           ,@URL)
END
GO 
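In the solution itself the procedure will be invoked from ADO.NET via the WCF services; purely as an illustration of the call shape, a thin wrapper in the style of Python's pyodbc could look like the following (the helper name is made up, and an open DB-API cursor connected to waTutorial1 is assumed):

```python
def add_media(cursor, title, url):
    """Insert a Media row via the AddMedia stored procedure.

    `cursor` is assumed to be an open DB-API cursor (e.g. pyodbc);
    `?` placeholders keep the call parameterized rather than
    concatenating values into the SQL text.
    """
    cursor.execute("EXEC AddMedia @Title = ?, @URL = ?", (title, url))
    cursor.connection.commit()
```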

To test the stored procedure, select the Data option, enter the parameters and activate the Run button at the top:

You should have one row in the Media table.

Create the GetMedia stored procedure:

The SSMS syntax:

-- =============================================
-- Author:        <Author,,Name>
-- Create date: <Create Date,,>
-- Description:    <Description,,>
-- =============================================
CREATE PROCEDURE GetMedia
    @ID int
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    -- Select the URL for the given record ID
    SELECT 
        URL 
    FROM 
        [waTutorial1].[dbo].[Media] 
    WHERE 
        [waTutorial1].[dbo].[Media].ID = @ID
END
GO
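As with AddMedia, the retrieval side would be called from ADO.NET in the actual solution; a matching illustrative sketch in Python (hypothetical helper name, open DB-API cursor assumed) could be:

```python
def get_media_url(cursor, media_id):
    """Look up a media URL via the GetMedia stored procedure.

    `cursor` is assumed to be an open DB-API cursor (e.g. pyodbc)
    connected to waTutorial1; returns the URL string, or None if
    no row matches the given ID.
    """
    cursor.execute("EXEC GetMedia @ID = ?", (media_id,))
    row = cursor.fetchone()
    return row[0] if row else None
```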


To test the stored procedure, select the Data option, enter the parameters and activate the Run button at the top:
