Hi guys,
In this post I am going to discuss two approaches for logging transactions in WMB/IIB, and at the end we'll do a comparative study of the two. I assume you are working in an environment where logging each message, along with its payload, at every stage of processing is very important.
Approach 1:
Make use of monitoring events
You can configure your message flows to emit monitoring events at various stages of processing. You can either use the built-in record-and-replay option provided by WMB/IIB, or write custom applications/message flows that process the monitoring events emitted by the main (service) flows and persist them in a database, a file, or any medium of your choice.
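Note that monitoring events are switched off by default and must be enabled per flow after deployment. A minimal sketch of how this is typically done from the command line with the `mqsichangeflowmonitoring` and `mqsireportflowmonitoring` commands, assuming an integration node named MYNODE, an execution group named default, and a flow named MainServiceFlow (all placeholder names):

```shell
# Enable monitoring events for one flow (names are placeholders)
mqsichangeflowmonitoring MYNODE -e default -f MainServiceFlow -c active

# Or enable monitoring for all flows in the execution group
mqsichangeflowmonitoring MYNODE -e default -j -c active

# Verify the current monitoring state
mqsireportflowmonitoring MYNODE -e default -f MainServiceFlow
```

Scripting this as part of your deployment procedure helps avoid the "forgot to switch monitoring back on" problem discussed in the comparison below.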
Approach 2:
Subflows to log messages
You can write a subflow that constructs the log message and passes it to an application/message flow that processes the log message and persists it. The subflow can be plugged into any stage of processing, depending on your requirement.
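A minimal sketch of the Compute node inside such a logging subflow, assuming its output terminal is wired (for example via an MQ Output node) to the queue that the logging flow reads from; the module name, XML structure, and stage label are all illustrative:

```esql
-- Illustrative Compute node for a logging subflow.
-- It builds a self-contained log entry; the original message
-- continues unchanged through the main flow.
CREATE COMPUTE MODULE BuildLogMessage
	CREATE FUNCTION Main() RETURNS BOOLEAN
	BEGIN
		-- Basic context: when, which flow, and at which stage
		SET OutputRoot.XMLNSC.LogEntry.Timestamp = CURRENT_TIMESTAMP;
		SET OutputRoot.XMLNSC.LogEntry.FlowName  = MessageFlowLabel;
		SET OutputRoot.XMLNSC.LogEntry.Stage     = 'AfterInput'; -- set per plug-in point

		-- Capture the full payload as a character string
		DECLARE payloadBlob BLOB
			ASBITSTREAM(InputRoot.XMLNSC
			            CCSID InputRoot.Properties.CodedCharSetId
			            ENCODING InputRoot.Properties.Encoding);
		SET OutputRoot.XMLNSC.LogEntry.Payload =
			CAST(payloadBlob AS CHARACTER CCSID InputRoot.Properties.CodedCharSetId);
		RETURN TRUE;
	END;
END MODULE;
```

Serializing the payload with ASBITSTREAM keeps the log entry independent of the original parser, so the logging flow can persist it without re-parsing the business message.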
| | Monitoring Events | Subflow |
|---|---|---|
| Decoupling | Provides total decoupling from the main transaction-processing message flows. The main flows are not impacted if emitting a monitoring event fails. | Provides only partial decoupling. Because the subflow is plugged directly into the message flow, a failure in it could fail the main transaction. |
| Throughput | Has an impact on throughput. Under load, I observed roughly 20% less throughput compared to Approach 2. | Better throughput when compared to Approach 1. |
| Monitoring | Needs manual action to switch the monitoring events back on after every deployment. If your administrator fails to do that, no messages would be logged even though the main message flows keep processing transactions as usual. | No monitoring at all. |
| Grouping business transactions | Monitoring events provide a built-in option to group business transactions. | To achieve similar grouping, additional code has to be written. |
That’s it for now. Feel free to share your views on this post.