Datapower RBM Active Directory Members

When you want to enable Active Directory login with DataPower and your members are part of a group, you need to add the following to your LDAP search credentials:

Name:                ldapsearch

LDAP Base DN:        ou=groups,dc=ibm,dc=com

LDAP Filter Prefix:  (&(member=

LDAP Filter Suffix:  )(|(cn=administrators)(cn=architects)(cn=operations)))
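
With these settings, RBM builds the final LDAP group-membership filter by concatenating the filter prefix, the authenticated user's DN, and the filter suffix. A minimal sketch of that composition (the user DN shown is a hypothetical example, not from the configuration above):

```python
# Sketch: how the RBM filter prefix/suffix wrap the user's DN.
prefix = "(&(member="
suffix = ")(|(cn=administrators)(cn=architects)(cn=operations)))"

# Hypothetical user DN for illustration only.
user_dn = "cn=jsmith,ou=users,dc=ibm,dc=com"

ldap_filter = prefix + user_dn + suffix
print(ldap_filter)
# (&(member=cn=jsmith,ou=users,dc=ibm,dc=com)(|(cn=administrators)(cn=architects)(cn=operations)))
```

A user is granted access only if they are a `member` of one of the three listed groups.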

Call Web Service in Datapower Multiprotocol Gateway

Often you would like to call a Web Service that receives input from MQ, File, JMS, or HTTP, and then put the result of the Web Service call onto a different output protocol (MQ, File, JMS, or HTTP).

This can be done using a DataPower Multi-Protocol Gateway.

Step 1: Set up your Front Side and Back Side Handlers on your Multi-Protocol Gateway.

Step 2: Modify your current policy by adding a Transform action after your Match action that will execute the Web Service SOAP request.

Below is a sample of the Web Service stylesheet:

<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:dp="http://www.datapower.com/extensions"
    extension-element-prefixes="dp">

  <xsl:output method="xml"/>

  <!-- Build the outbound SOAP request by wrapping the incoming Body. -->
  <xsl:variable name="call">
    <soapenv:Envelope>
      <xsl:copy-of select="/soap:Envelope/soap:Body"/>
    </soapenv:Envelope>
  </xsl:variable>

  <xsl:template match="/*">
    <!-- The first argument to dp:soap-call is the endpoint URL. -->
    <xsl:variable name="result" select='dp:soap-call("", $call)'/>
    <xsl:copy-of select="$call"/>
    <xsl:text>About to call.</xsl:text>
    <xsl:copy-of select="$result"/>
    <xsl:text>Done with call</xsl:text>
  </xsl:template>

</xsl:stylesheet>

Note: Add your own namespaces to the SOAP Envelope and change the endpoint IP, port, and URI in the dp:soap-call.

Datapower Disable RBM to Log In to Appliance

If you have set up RBM and you can't log in to your DataPower appliance:

Open an SSH session and log in to your appliance (if SSH is enabled on the appliance):

Putty -> <DataPower IP> -> Port 22 -> enter Super Admin Username and Password

Execute the following commands to reset RBM:


xi50(config)>rbm

xi50(config rbm)>reset

xi50(config rbm)>exit

Datapower Active Directory RBM Authentication

There are good resources on setting up Active Directory login for a DataPower appliance. Below is my version, which works:

Step 1: Navigate to Administration -> RBM Settings

Step 2: Make sure your Main page looks like the following:

Step 3: Navigate to the Authentication Tab

Specify your Active Directory server host, port number (default 389), and LDAP version, and enable Search LDAP for DN.

Your Bind DN will look something like:

CN=,OU=Application Specific Resources,OU=Enterprise Configuration & Resources,DC=,DC=,DC=

Note: Set Local Login as the fallback method, so that when LDAP fails you still have access to the appliance.

Your LDAP search parameters will look as follows:

Your LDAP Base DN will look something like:

OU=User Accounts,DC=,DC=,DC=

Step 4: Navigate to the Credentials Tab and select Mapping Credentials Method -> xmlfile

Disable: Search LDAP for Group Name

Create a new RBM Policy URL by clicking on the +. Navigate through the wizard until you get to Access Profile Mapping.

Your Credential Name will look something like: OU=User,OU=Business Unit,OU=User Account,DC=,DC=,DC=

WebSphere Appliance Management Center V4.0

IBM just announced a new product called WebSphere Appliance Management Center V4.0. Below are some of the features of WAMC 4.0:

  • Support for managed groups of different appliance models and firmware levels for greater flexibility
  • Supports new managed domain tasks, and configuration and firmware deployments providing more fine-grained control of the environment
  • Management of deployment policies for WebSphere DataPower appliances, both individually or in managed sets, providing full lifecycle management across deployment environments
  • Easy to use multiplatform installer
  • Support for multiple generations of WebSphere DataPower appliance platforms and firmware versions
  • Enhanced monitoring capability with default settings for critical WebSphere DataPower appliance key performance indicators
  • Easy-to-use and navigate user interface, including support for different user roles
  • Seamless integration into the IBM Tivoli Monitoring infrastructure
  • Integration into the IBM Service Management solution

Datapower XB60 Large File Size Handling

DataPower has become a proxy not only to receive large files from customers but also to send large files to the enterprise back end. These files are processed, transformed, validated, authenticated, authorized, and routed.

If we are handling large files, we need to make sure the auxiliary storage on the appliance is configured correctly, and understand how to manage files on the appliance.

When using the XB60 to receive files (payment files, invoice files, transaction files), it is better to create a Multi-Protocol Gateway to handle the transfer of the file and use the B2B Gateway to process, transform, validate, authenticate, authorize, and route the file content.

The reason is that the DataPower XB60 appliance has internal hard drives with 140 GB of usable space: 70 GB of AES-encrypted space for B2B payloads, should you choose to store them on the device, and 70 GB for the metadata store. Below are some options to consider when handling large files through a DataPower appliance:

Option 1: Files larger than 100 MB cannot be sent through the B2B Gateway, as it has no functionality to stream a file; file streaming is only available in the Multi-Protocol Gateway. So the solution is to enable streaming on a Multi-Protocol Gateway and, in the return action, point the file to the B2B Gateway for processing. The files can be stored off-device on either an NFS mount point or an iSCSI drive subsystem.

Option 2: The best practice for large files is to pass them to a downstream system for processing and use the XB60 as just a gateway. This system can be a dedicated FTP server or WebSphere MQ FTE that securely and efficiently handles the transfer of the file into the intranet to be processed by the DataPower XB60. The DataPower XB60 can also call a WebSphere MQ FTE agent to transfer a file from a source to a destination using an XML transfer file.

Option 3: Chunk files into smaller sizes and send them to DataPower for processing. This can be done using MQ FTE (see the example link below), or by splitting a file into same-length messages and PUTting them onto a WebSphere MQ queue.
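
The chunking idea in Option 3 can be sketched in a few lines. The chunk size here is an arbitrary example; in practice MQ FTE manages the splitting and reassembly for you:

```python
def chunk_payload(data: bytes, chunk_size: int) -> list:
    """Split a payload into fixed-length messages, suitable for
    PUTting onto a WebSphere MQ queue one chunk at a time."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

payload = b"x" * 2500
chunks = chunk_payload(payload, 1024)   # 1 KB chunks (example size)
print(len(chunks))                      # 3 chunks: 1024 + 1024 + 452 bytes
assert b"".join(chunks) == payload      # reassembly is lossless
```

The consumer simply concatenates the messages in order to reconstruct the original file.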

Option 4: Given the size of the documents and the average TPS per appliance, add more appliances to handle the file size and TPS requirements. Before choosing which option is best for your organization, first go through the document and metadata storage exercises below, replacing the values with your organization's current or estimated values:

Document storage on the DataPower XB60 exercise: For illustrative purposes, assume 10,000 EDI messages per day with an average size of 10 KB; you are therefore expecting 100 MB of content per day. This capacity requires 360 MB of document storage per day, which equates to approximately 84 days of capacity. The 360 MB corresponds to:

  • 200 MB of stored data for these messages (two copies of each message: one "off the wire" copy and one for the processed request)
  • 40 MB (4 KB per message) for protocol responses
  • 120 MB for the raw MDN (4 KB per message), HTTP request containing MDN (4 KB per message), and HTTP response containing MDN (4 KB per message)

Metadata storage on the DataPower XB60 exercise: If the messages per day are split evenly between inbound and outbound, and every outbound message requests an asynchronous MDN, the metadata adds up to 15,000 transactions per day, which equates to 60 MB (15,000 * 4 KB) of metadata storage per day. At the default 1 GB persistence store size, the store will fill in approximately 16 days. If the persistence store is increased to 64 GB, over 1,000 days of message metadata can be persisted. Using the default persistence store size in this example, setting the "Archive Document Age" property to 7 days will remove document metadata from the persistence store before it fills. See the following article on capacity planning for the DataPower XB60 appliance:
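
The arithmetic in both exercises can be reproduced in a few lines, using the example figures given above:

```python
# Document storage: 10,000 messages/day at 10 KB average.
stored_copies_mb = 200        # two copies of each message
protocol_responses_mb = 40    # 4 KB per message
mdn_traffic_mb = 120          # raw MDN + HTTP request + HTTP response, 4 KB each
doc_storage_per_day_mb = stored_copies_mb + protocol_responses_mb + mdn_traffic_mb
print(doc_storage_per_day_mb)  # 360 MB/day, as stated above

# Metadata storage: 15,000 transactions/day at 4 KB each.
metadata_per_day_mb = 15_000 * 4 / 1000
days_to_fill = 1000 / metadata_per_day_mb  # default 1 GB persistence store
print(int(days_to_fill))                   # ~16 days before the store fills
```

Substitute your own message volumes and sizes to estimate when your persistence store would fill.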

Datapower Deployment Scenarios

This article explains the DataPower deployment scenarios and their specific functions inside an enterprise environment.

1. Lab environment – an isolated environment that allows the features of any major new firmware release to be tested without any impact on ongoing development streams. This assists change management and lets you evaluate new features before implementation.

2. Development environment – it is a very common practice to isolate DataPower service development to a dedicated environment. This will usually be a single appliance that gives developers a sandbox for development as well as project-specific configurations.

3. Testing environment – a test environment isolated from the development environment mentioned above, used to test all developed services. The appliance provides easy service migration between appliances and domains.

4. Staging environment – this environment allows testing pre-releases or rolling new releases into production. It is also used for performance testing to determine the sizing and scaling of production appliances.

5. Production environment – appliances in the production environment can be deployed as a cluster in an active/passive or active/active configuration. Appliances can balance traffic to target servers using the Application Optimization feature.

6. DR environment – many organizations require full data center failover to a second fully equipped site. The DR environment provides failover appliances.

File Transfer Scenarios with WebSphere Datapower

Below are some scenarios for using the DataPower S/FTP front side handler/poller to solve enterprise file transfer requirements.

Scenario 1: Enterprise customers want to bridge legacy S/FTP-based messaging flows with newer HTTP Web Services.

The following tasks are performed by Datapower:

1. A file is retrieved by polling a remote FTP server.

2. The file content is routed to an enterprise HTTP server for processing.

3. The response from the HTTP server application is captured and a response file is placed on the remote FTP server, using a predetermined naming pattern.
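
The "predetermined naming pattern" in step 3 could be as simple as the following sketch. The `.rsp` suffix is a hypothetical convention for illustration, not a DataPower default:

```python
def response_filename(request_filename: str) -> str:
    """Derive the response file name placed back on the remote FTP
    server from the request file name (hypothetical convention:
    replace the extension with .rsp)."""
    stem, _, _ext = request_filename.rpartition(".")
    return (stem or request_filename) + ".rsp"

print(response_filename("invoice_20130401.xml"))  # invoice_20130401.rsp
```

The polling client can then match each response file back to the request that produced it.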

Scenario 2: Datapower extends the ESB to include an FTP-based client requiring additional security

The following tasks are performed by Datapower:

1. DataPower retrieves a file by polling a remote SFTP server.

2. It passes the binary file through to an enterprise WebSphere MQ queue.

3. It captures the response from the WebSphere MQ system and places the response file on the remote SFTP server, using a predetermined naming pattern.

Scenario 3: Datapower protects existing enterprise FTP servers.

The following tasks are performed by Datapower:

1. DataPower accepts FTP connections from remote FTP clients, allowing clients to connect to an internal FTP server transparently and securely.

2. Incoming FTP clients are authenticated using a username and password.

3. DataPower supports the streaming of large files through the appliance.

Scenario 4: Datapower provides additional security for SFTP-based message exchange patterns

The following tasks are performed by Datapower:

1. Datapower acts as an SFTP server to remote client connections.

2. Datapower presents a virtual file system that exists only on the device, proxying the actual back-end servers.

3. Datapower dynamically routes files to the desired location on the back-end SFTP server.

4. Datapower performs public key authentication on the inbound SFTP clients.
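
The virtual file system routing in steps 2 and 3 amounts to mapping client-visible directories onto back-end destinations. A minimal sketch of that idea, in which all host names and paths are hypothetical:

```python
# Hypothetical mapping of virtual directories presented to SFTP
# clients onto real back-end SFTP server locations.
VIRTUAL_ROUTES = {
    "/inbound/payments": "sftp://backend-a.example.com/var/payments",
    "/inbound/invoices": "sftp://backend-b.example.com/var/invoices",
}

def route(virtual_path: str) -> str:
    """Resolve a client-visible path to its back-end destination."""
    for prefix, backend in VIRTUAL_ROUTES.items():
        if virtual_path.startswith(prefix + "/"):
            return backend + virtual_path[len(prefix):]
    raise KeyError("no route for " + virtual_path)

print(route("/inbound/payments/batch1.txt"))
# sftp://backend-a.example.com/var/payments/batch1.txt
```

The client never sees the back-end topology; it only interacts with the virtual directory tree on the appliance.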

DataPower and the PCI DSS (Data Security Standard)

DataPower is an ideal solution for many of these requirements:

• Build and Maintain a Secure Network
– Requirement 1: Install and maintain a firewall configuration to protect cardholder data
– Requirement 2: Do not use vendor-supplied defaults for system passwords and other security parameters
• Protect Cardholder Data
– Requirement 3: Protect stored cardholder data
– Requirement 4: Encrypt transmission of cardholder data across open, public networks
• Maintain a Vulnerability Management Program
– Requirement 5: Use and regularly update anti-virus software
– Requirement 6: Develop and maintain secure systems and applications
• Implement Strong Access Control Measures
– Requirement 7: Restrict access to cardholder data by business need-to-know
– Requirement 8: Assign a unique ID to each person with computer access
– Requirement 9: Restrict physical access to cardholder data
• Regularly Monitor and Test Networks
– Requirement 10: Track and monitor all access to network resources and cardholder data
– Requirement 11: Regularly test security systems and processes
• Maintain an Information Security Policy
– Requirement 12: Maintain a policy that addresses information security

Red – Complete Solution with Datapower
Blue – Partial Solution with Datapower