
Thursday, March 13, 2008

Script to read any Windows Event/Application Log

A while back there was a discussion on the TME10 list about reading a custom application event log that a developer was using. The out-of-the-box problem is that the ITM agents don't allow you to specify which log to look at; they only look at the basic Windows logs. The ultimate solution is to write a script to grab the data.

So here is a simple Perl script that will read all the records from all of the Windows event/application logs. With a little tweaking, it can be used to feed a Universal Agent that reports to ITM based on event number or log file name.

# Walk the Windows event/application logs on each listed computer via WMI.
use strict;
use warnings;
use Win32::OLE ('in');

# WMI flags for a fast, forward-only, semi-synchronous query
use constant wbemFlagReturnImmediately => 0x10;
use constant wbemFlagForwardOnly       => 0x20;

my @computers = ("localhost");

foreach my $computer (@computers) {
    print "\n";
    print "==========================================\n";
    print "Computer: $computer\n";
    print "==========================================\n";

    # Connect to the WMI service on the target computer
    my $objWMIService = Win32::OLE->GetObject("winmgmts:\\\\$computer\\root\\CIMV2")
        or die "WMI connection failed.\n";

    # Win32_NTLogEventLog associates each event record with the log file it lives in
    my $colItems = $objWMIService->ExecQuery(
        "SELECT * FROM Win32_NTLogEventLog", "WQL",
        wbemFlagReturnImmediately | wbemFlagForwardOnly);

    foreach my $objItem (in $colItems) {
        print "Log: $objItem->{Log}\n";
        print "Record: $objItem->{Record}\n";
        print "\n";
    }
}
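
If you want the actual event details rather than just the log/record associations, the same WMI connection can be queried against the Win32_NTLogEvent class instead. The snippet below is only a sketch of the "little tweaking" mentioned above: it reuses the $objWMIService handle from inside the loop, and the 'Application' log name is just an example placeholder for your custom log.

    # Sketch: pull individual records from one named log so the output can be
    # keyed on event number (EventCode) or log file name (Logfile).
    my $colEvents = $objWMIService->ExecQuery(
        "SELECT Logfile, EventCode, SourceName, TimeGenerated, Message "
        . "FROM Win32_NTLogEvent WHERE Logfile = 'Application'",
        "WQL", wbemFlagReturnImmediately | wbemFlagForwardOnly);

    foreach my $objEvent (in $colEvents) {
        print "$objEvent->{Logfile} | $objEvent->{EventCode} | $objEvent->{SourceName}\n";
    }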

Determining which executable has a port open on Windows

A while back I wrote an article about using 'netstat -o' to find out which PID has a particular port open on Windows (you can use 'lsof' on Linux/Unix). Well, it turns out that in Windows an additional flag will give you even more information.

Specifically, adding the '-b' flag will tell you which executable has which port open. Here's an example of the command and a snippet of its output:

C:\> netstat -bona

Active Connections

  Proto  Local Address    Foreign Address   State        PID
  TCP    0.0.0.0:135      0.0.0.0:0         LISTENING    932
   RpcSs
   [svchost.exe]
  TCP    0.0.0.0:554      0.0.0.0:0         LISTENING    5652
   [wmpnetwk.exe]
  TCP    0.0.0.0:912      0.0.0.0:0         LISTENING    3204
   [vmware-authd.exe]
  TCP    0.0.0.0:990      0.0.0.0:0         LISTENING    1616
   WcesComm
   [svchost.exe]
  TCP    0.0.0.0:3389     0.0.0.0:0         LISTENING    1628
   Dnscache

NOTE: If you try to run this on Vista as anyone other than Administrator, you'll get an error stating "The requested operation requires elevation." To get around this:

Right-click Start -> All Programs -> Accessories -> Command Prompt and select "Run As Administrator"

Then you can run the command from that new command prompt.
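
If you need this port-to-executable mapping from a script (to feed monitoring, for instance), here is a minimal Perl sketch. It assumes the 'netstat -bona' output layout shown above, only handles TCP LISTENING lines, and must itself be run from an elevated prompt.

# Sketch: map listening TCP ports to the executable reported by 'netstat -b'.
# Assumes the layout above: a connection line followed by optional
# service-name and [executable] lines.
use strict;
use warnings;

my ($local, $pid);
foreach my $line (`netstat -bona`) {
    if ($line =~ /^\s*TCP\s+(\S+)\s+\S+\s+LISTENING\s+(\d+)/) {
        ($local, $pid) = ($1, $2);            # remember the connection line
    }
    elsif (defined $pid && $line =~ /^\s*\[(.+)\]\s*$/) {
        print "$local (PID $pid) -> $1\n";    # executable name in brackets
        undef $pid;
    }
}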

SCE JavaScript Action

The State Based Correlation Engine (SCE) is a powerful tool for filtering and applying simple correlation rules to TEC events before the TEC rules are applied, but a major deficiency is the lack of a general-purpose action for manipulating events. To manipulate events in ways other than the simple methods provided by the supplied actions, one had to develop a new Java class.

Presented here is a general-purpose SCE action that embeds the Rhino JavaScript engine, enabling JavaScript programs to manipulate events.

The SCE JavaScript custom action can be downloaded here.

Key features include:


  • Add, delete, and modify event attributes (slots)
  • Change event class (type)
  • Generate new events
  • Discard events
  • JavaScript regular expressions
  • Automatic conversion of Prolog list types to JavaScript arrays and back
  • Forward events to other SCE rules
  • Access SCE variables
  • Get and set SCE rule properties
  • Rhino Live Connect to access native Java objects such as JDBC
  • Optional mapping of event attributes to JavaScript properties of an Event object


The README file contains examples of event flood detection and handling, as well as JDBC (MySQL) event enrichment using Rhino's Live Connect feature.

Three simple rules for Universal agent naming

Most of you know that Universal Agent application naming is very restrictive. Sometimes we spend hours troubleshooting Universal Agent issues only to figure out later that the issue was caused by a bad application name. This article lists three simple rules for naming your Universal Agent applications.

Name your application using 3 letters

You can use more than three letters for the application name, but only the first three letters matter, and they should be unique.

Avoid names starting with 'K'

In ITM 6.1, application names starting with 'K' are reserved. Avoid them.

Avoid special characters

The Universal Agent application naming convention allows a few special characters, such as the underscore, but not all of them. To keep your naming convention simple, avoid special characters altogether and use only letters and numbers. The sketch below pulls these rules together.
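
Here is a small Perl sketch (a hypothetical helper, not part of any ITM tooling) that flags a proposed application name that breaks any of the three rules:

# Hypothetical helper: warn about UA application names that break the rules above.
use strict;
use warnings;

sub check_ua_name {
    my ($name) = @_;
    my @problems;
    push @problems, "longer than 3 characters -- only the first three are significant"
        if length($name) > 3;
    push @problems, "starts with 'K', which is reserved in ITM 6.1"
        if $name =~ /^K/i;
    push @problems, "contains characters other than letters and digits"
        if $name =~ /[^A-Za-z0-9]/;
    return @problems;
}

for my $candidate (@ARGV) {
    my @problems = check_ua_name($candidate);
    if (@problems) {
        print "$candidate: " . join("; ", @problems) . "\n";
    } else {
        print "$candidate: looks OK\n";
    }
}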

Hope this saves some time during your next UA development.

Tivoli Provisioning Manager Basic Terminology

As TPM gains a foothold in the Tivoli product space as the successor to ITCM 4.2, it is important to learn some of the basic terminology to understand the upcoming lingo. This article gives you a basic idea of some of the TPM terms. Credit for this article is shared with Martin Carnegie.

TPM Product Names

Since there are quite a few products in the TPM product family, you might want to read some of our earlier blog posts about the TPM products. Here are a couple of links.

1. http://www.gulfsoft.com/blog_new/index.php?name=News&file=article&sid=351
2. http://www.gulfsoft.com/blog_new/index.php?name=News&file=article&sid=382

TPM Server

TPM Server is roughly the equivalent of TMR Server in the ITCM 4.2 world. The server initiates all the TPM operations such as scans, software distribution, etc.

Tivoli Common Agent (TCA)

The TCA is similar to the endpoint in ITCM. However, in TPM you can perform operations such as basic scans, software installs, etc., without the TCA code being present.

Content Delivery Service (CDS)

The Content Delivery Service can roughly be considered an "MDist2"-like service in TPM that facilitates data transfer across the TPM environment.

CDS Management Server

The Content Delivery Service Management server is the server component of the Content Delivery Service. It can be thought of as the manager of the depot servers (see below).

Depot Servers

These are equivalent to the gateways in ITCM, from which endpoints pull data. Note, however, that in TPM the data transfer can skip the depot servers depending on the SAP mechanism used.

Service Access Point (SAP)

A Service Access Point specifies the underlying protocol and authentication information to use when performing TPM operations. For example, TPM can communicate with an endpoint using an SSH service, a Windows SMB service, or a TCA SOA-SAP.

Workflows

Workflows are sets of instructions that perform a specific task; for example, a workflow can execute a task, copy a file to a remote machine, and so on. In TPM, most operations such as software distribution, TCA installation, and inventory scans are implemented as sets of workflows. Custom workflows can be developed using the Jython programming language.

These are some of the basic TPM terms that come to mind. Hope you find it useful.

Converting TDW timestamps to DB2 Timestamps

One of the key aspects of the history tables is timestamps. But as you might have noticed, performing date operations with TDW timestamps is not straightforward. Part of the reason is that TDW uses "Candle timestamps" rather than RDBMS timestamps. This article discusses a way to convert the "Candle timestamps" into DB2 timestamps.

The problem

First, in TDW the "Candle timestamps" are stored in CHAR format rather than in a "timestamp" or "datetime" format. If you would like to use the powerful date functions of the RDBMS, you will not be able to do so unless you convert the string into the RDBMS timestamp format.

So, are you ready to convert the string into a timestamp using the DB2 TIMESTAMP() function? Well, it is not so simple, my friend! The "Candle timestamps" are stored in a "CYYMMDDhhmmssSSS" format, where C is the century indicator (0 for the 20th century, 1 for the 21st, and so on). For example, 1080314153000000 represents 2008-03-14 15:30:00. This non-standard format makes it difficult to convert the string into a timestamp directly.
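
To make the layout concrete, here is a small standalone Perl sketch (an illustration only, assuming the 16-character format described above) that unpacks a Candle timestamp outside the database:

# Illustration: unpack a 16-character Candle timestamp (CYYMMDDhhmmssSSS)
# into a readable ISO-style timestamp string.
use strict;
use warnings;

sub candle2iso {
    my ($ts) = @_;
    my ($c, $yy, $mm, $dd, $hh, $mi, $ss, $ms) =
        $ts =~ /^(\d)(\d{2})(\d{2})(\d{2})(\d{2})(\d{2})(\d{2})(\d{3})$/
        or die "not a Candle timestamp: $ts\n";
    my $year = 1900 + 100 * $c + $yy;   # C=0 -> 19xx, C=1 -> 20xx
    return sprintf("%04d-%02d-%02d %02d:%02d:%02d.%03d",
                   $year, $mm, $dd, $hh, $mi, $ss, $ms);
}

print candle2iso("1080314153000000"), "\n";   # prints 2008-03-14 15:30:00.000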

The solution

Okay, enough whining. Let us see how the string can be converted to a DB2 timestamp. You could use the DB2 SUBSTR() function to extract the desired fields inline, but that leads to very complicated queries, especially if you need to convert timestamps in multiple places in the same query.

For a related blog article, see here

The idea here is to do something similar but define it in a DB2 User Defined Function (UDF). After defining it, we can use the UDF just like any other function.

Creating a UDF

I am going to rush through the UDF creation steps like the fast-talking guy in the wedding ceremony in the Bud Light Super Bowl commercial. To create a UDF, go to "Control Center" -> Tools -> "Development Center" -> Project -> New Project -> Add Connection -> User Defined Functions -> New SQL User Defined Function.

Now enter the following UDF definition in the Editor view and click Build. Make sure to replace ITMUSER with your TDW user ID.


-- Convert a 16-character Candle timestamp (CYYMMDDhhmmssSSS) to a DB2 TIMESTAMP.
-- Note: the '20' prefix assumes 21st-century (C = 1) timestamps; the millisecond
-- portion of the Candle timestamp is ignored.
CREATE FUNCTION ITMUSER.STR2TIMESTAMP (
    STRDATE CHAR(16) )
  RETURNS TIMESTAMP
  LANGUAGE SQL
  DETERMINISTIC
  CONTAINS SQL
  NO EXTERNAL ACTION
BEGIN ATOMIC
  DECLARE RESULT TIMESTAMP;
  SET RESULT = TIMESTAMP_FORMAT(
      '20' || SUBSTR(STRDATE,2,2) || '-' ||
      SUBSTR(STRDATE,4,2) || '-' ||
      SUBSTR(STRDATE,6,2) || ' ' ||
      SUBSTR(STRDATE,8,2) || ':' ||
      SUBSTR(STRDATE,10,2) || ':' ||
      SUBSTR(STRDATE,12,2), 'YYYY-MM-DD HH24:MI:SS');
  RETURN RESULT;
END



Using the UDF in SQL queries

Once the UDF is defined, converting the "Candle timestamps" to DB2 timestamps is very easy. The following shows an example.


SELECT "Server_Name", ITMUSER.STR2TIMESTAMP(A."Timestamp"), "%_Disk_Used" From ""NT_Logical_Disk" A


Hope you find this useful.