Tuesday, 1 November 2016

Message Broker/IIB: A thought on Logging Transactions

Hi Guys,

Here I am going to discuss two approaches for logging transactions, and at the end we'll do a comparative study of both. I assume that you are working in an environment where logging each message, along with its payload, at every stage of processing is very important.

Approach 1: Make use of the monitoring events

You can configure your message flows to emit monitoring events at various stages of processing. You can either make use of the built-in record-and-replay option provided by WMB/IIB, or write custom applications/message flows that process the monitoring events emitted by the main (service) flows and persist them in a database, a file, or any medium of your choice.
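Remember that monitoring has to be activated per flow after deployment. A minimal sketch from the command line, assuming an illustrative node named IBNODE, an integration server named default, and a flow named MainFlow:

```shell
# Activate monitoring for one deployed flow (all names are illustrative)
mqsichangeflowmonitoring IBNODE -e default -f MainFlow -c active

# Confirm the monitoring status of the flow
mqsireportflowmonitoring IBNODE -e default -f MainFlow
```

The monitoring profile itself (which events are emitted, and at which nodes) is configured separately on the flow.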

Approach 2: Subflows to log message

You can write a subflow which constructs the log message and passes it to an application/message flow that processes the log message and persists it. The subflow can be plugged into any stage of processing, depending on your requirement.
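The heart of such a subflow could be a Compute node along these lines; a rough sketch only, with an illustrative message structure, where an MQOutput node wired after it hands the log message to the logger flow:

```esql
-- Hypothetical Compute node inside the logging subflow
CREATE COMPUTE MODULE LogSubflow_Compute
  CREATE FUNCTION Main() RETURNS BOOLEAN
  BEGIN
    -- Wrap the incoming payload with basic audit details
    SET OutputRoot.JSON.Data.Log.FlowName = MessageFlowLabel;
    SET OutputRoot.JSON.Data.Log.Stage    = 'AfterInput';
    SET OutputRoot.JSON.Data.Log.LoggedAt = CURRENT_TIMESTAMP;
    SET OutputRoot.JSON.Data.Log.Payload  = InputRoot.JSON.Data;
    -- Send the log message out; the main flow continues unchanged
    PROPAGATE TO TERMINAL 'out';
    RETURN FALSE;
  END;
END MODULE;
```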


Monitoring Events vs. Subflow

Decoupling
- Monitoring events: Provide total decoupling from the main transaction-processing message flows. The main flows won't be impacted if there is a failure to emit a monitoring event.
- Subflow: Provides only partial decoupling. Being plugged directly into the message flow, any failure in the subflow could result in the failure of the main transaction.

Throughput
- Monitoring events: Have an impact on throughput. Under load, I observed roughly 20% less throughput compared to Approach 2.
- Subflow: Better throughput when compared to Approach 1.

Monitoring
- Monitoring events: Need manual action to switch monitoring on after every deployment. If your administrator fails to do that, no messages will be logged even though the main message flows keep processing transactions as usual.
- Subflow: No monitoring to switch on at all.

Grouping Business Transactions
- Monitoring events: Provide an option to group business transactions.
- Subflow: Additional code has to be written to achieve similar grouping.



That’s it for now. Feel free to write your views on the above post.

Monday, 24 October 2016

Integrating IIB 10.0.0.6 with Mongo DB




Pre-requisites:

1. Create a free account in Mongo Labs (mLab), a cloud-based MongoDB service. It is free of cost and you get 500 MB of space.
2. Create a Collection (the equivalent of a Table in an RDBMS).

Broker Configs:

1. Check your environment variable MQSI_WORKPATH and change directory to the path it points to.

2. Change directory to node_modules.

3. Run the command below to install the MongoDB connector. This might take up to a minute to complete:

npm install loopback-connector-mongodb --save
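Taken together, steps 1–3 look like this in a shell where MQSI_WORKPATH is set (Linux syntax shown; on Windows use %MQSI_WORKPATH% instead):

```shell
# Move into the node_modules directory under the IIB work path
cd "$MQSI_WORKPATH/node_modules"

# Install the MongoDB connector for LoopBack (may take up to a minute)
npm install loopback-connector-mongodb --save
```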

4. Now a config file named datasources.json has to be created. Switch directory to MQSI_WORKPATH\connectors\loopback, create a file named datasources.json, and copy the contents below into it:

{
  "MONGO": {
    "host": "ds031257.mlab.com",
    "port": "31257",
    "database": "esbdev",
    "name": "MONGO",
    "connector": "mongodb",
    "debug": "true"
  }
}

5. Configure the MongoDB username and password at the broker level using the command below:

mqsisetdbparms <Broker name> -n loopback::<securityid> -u <username> -p <password>
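A concrete invocation might look like this (node name, security identifier, and credentials are all illustrative; the same security identifier is what you later reference from the LoopbackRequest node's Security identity property):

```shell
# Store the mLab credentials against the loopback security identifier
mqsisetdbparms IBNODE -n loopback::mongo -u esbuser -p S3cret

# List what is stored for that resource (passwords are not displayed)
mqsireportdbparms IBNODE -n loopback::mongo
```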

6. Restart your broker.

7. Create a simple message flow with HTTPInput and HTTPReply nodes (just my preference) and connect a LoopbackRequest node between them. Please select the input message parsing domain as "JSON".

      

8. Configure the LoopbackRequest node as below: set the Data source name to MONGO (the name used in datasources.json) and the LoopBack object to your Collection name. Leave the other options as default.


9. You are good to go. Deploy the flow; if it gets deployed successfully, you are half done.

10. Post a JSON message and verify the Collection to find the data you sent stored as key-value pairs (no need to define any schema, as MongoDB is NoSQL). Sample data is given below.
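As an illustration (the endpoint path and payload are made up; use whatever URL your HTTPInput node listens on):

```shell
# Post a sample JSON document to the flow's HTTP endpoint
curl -X POST http://localhost:7800/mongo/test \
     -H "Content-Type: application/json" \
     -d '{"name": "John", "city": "Chennai"}'
```

The same key-value pairs should then appear as a document in your Collection, along with a MongoDB-generated _id field.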

 

That's it :-)