Wednesday, December 30, 2009

Tivoli Policy Driven Software Distribution (TPDSD) Quick Overview

General Comments
We have been hearing about this product since Pulse 2008. The idea was to target a product at the desktop segment rather than one that covers both the server and desktop markets, since these are two very different worlds when it comes to management. IBM was hearing from customers that they wanted a more hands-off approach to dealing with desktops, in other words “set and forget” (my term, I just made it up, so don’t blame IBM for this possibly bad phrasing). This is also what other vendors have been doing in the desktop market.

I have been involved with the beta of this product since it was available and will cover a few components of the product.

During the beta, this product was known as DTM. At release, it was renamed to Tivoli Policy Driven Software Distribution (TPDSD), which at least lets you know that it is policy driven, but seems to state that it only does software distribution, which it does, but it also does more. It looks like this could be renamed in the next release to Tivoli Endpoint Manager (TEM). While that may seem a better name to some, it is actually quite confusing to others who have been involved with Framework, as this was the name of the component responsible for managing information about the endpoints. For the purposes of this blog, I will use the current name of TPDSD. Who knows, maybe it will change again before TEM comes out ;)

Operating System Support
With the first release of TPDSD, support is limited to Windows only, on both the server and target sides. The server can run on Windows 2003 and 2008, 32 or 64 bit. The agents are supported on Windows XP, Vista, 2003 (32/64) and 2008 (32/64).

The next release is supposed to add support for more platforms on both the server and agent sides. There is even talk of MacOS support.

Agent Communications
In TPMfSW, you could run tasks against the targets using either RXA or the TCA. TPDSD only uses the agents for these tasks; the only time RXA is used is for the remote agent installation.

The great news is that the agent is now natively compiled, so no more JRE! The new agent is approximately 30 MB, and that includes the core agent and all other required support files.

Like TPMfSW, the agent also has various polling intervals. There are intervals for:
1. Checking local cached policies
2. Checking for new policies
3. Sending reports

All of these variables can be set at installation time or by an agent configuration policy.

What is a Policy?
The policies within TPDSD are used to do anything to an agent. If you want to install a piece of software, that is a policy. Scan the system? Another policy. Configure the agent? You guessed it, a policy. When a policy is created, it is assigned to a target or a group of targets and stays with that target until it is withdrawn or the machine is rebuilt.

Policies can be set up to either require or prohibit software from being installed. In order to require software to be installed, you first need to define the software to TPDSD and then define how the software is to be recognized on the target. For software recognition, there are quite a few different ways to define how to detect software on a target:
1. If it is an MSI, you can define the GUID as the key. When importing an MSI, this is done automatically
2. If using a SPB, the state code is used when importing. This would take the name and version of the SPB and verify that it is in a successful install state on the target (IC--- for those familiar with TCM)
3. File existence – check for a specific file in a directory
4. Registry existence – check for a specific registry key

There are 35 different checks that can be done and multiple checks can be combined to determine if a product is installed or not. This can make the detection very flexible.
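Conceptually, each detection rule is just a predicate evaluated on the target, and combining checks typically means requiring several predicates to hold at once. As a rough illustration only (this is not TPDSD code, and the file and registry paths are invented), a combined file-plus-registry check on Windows could look like this:

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical detection check: the product counts as "installed" only if
# both its main executable and its registry key are present.
my $file_ok = -e 'C:/Program Files/SomeProduct/someproduct.exe';

# "reg query" is a standard Windows command; exit status 0 means the key exists.
my $reg_ok = system('reg query "HKLM\\SOFTWARE\\SomeVendor\\SomeProduct" >NUL 2>&1') == 0;

print(($file_ok && $reg_ok) ? "installed\n" : "not installed\n");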

To prohibit software, you still need to create the software definition, which also defines how to remove the software. The definition needs to exist so that the policy can be created, and the uninstall instructions need to exist to actually perform the removal.

Self Service Catalog
The self service catalog is a web interface that can be used by the client system to request the installation of software. The catalog contains a searchable section and a “Popular Software” section. The “Popular Software” could be something like a core set of products or free software that is allowed to be installed. This interface also allows the end user to remove software that is currently installed.

When the user requests software from the catalog, the request is submitted and a new policy is created for that software on the specific target. This means that the agent will keep checking that the software is installed even after the initial request is completed.

The “set and forget” idea is the policy part of the product. You set a policy for a target, or group of targets, to have or not have software, and the product takes care of maintaining the state of the target.
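Put another way, the agent behaves like a small reconciliation loop: at every policy check interval it re-evaluates the desired state and corrects any drift. Purely as a conceptual sketch (not TPDSD internals; the detection and install/uninstall stubs below are placeholders):

#!/usr/bin/perl
use strict;
use warnings;

# Stubs standing in for the real detection and install/uninstall machinery.
sub detect    { return -e $_[0]->{marker_file} }
sub install   { print "installing $_[0]->{name}\n" }
sub uninstall { print "removing $_[0]->{name}\n" }

# Each policy says a piece of software is either required or prohibited.
my @policies = (
    { action => 'require',  name => 'CoreApp',   marker_file => 'C:/Program Files/CoreApp/app.exe' },
    { action => 'prohibit', name => 'BannedApp', marker_file => 'C:/Program Files/BannedApp/app.exe' },
);

for my $p (@policies) {
    my $installed = detect($p);
    install($p)   if $p->{action} eq 'require'  && !$installed;
    uninstall($p) if $p->{action} eq 'prohibit' && $installed;
}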

Integration with TPMfOSD
TPMfOSD is not included with TPDSD, but the two can be easily integrated using a couple of global variables. It is very easy to include the TPDSD agent as a software module so that it is automatically installed and connected to the TPDSD server. Once installed, the agent can be set to automatically install applications or perform other configuration based on policies.

Conclusion
This product is looking really good and I cannot wait for the next release. Right now, due to the limited OS support and a few other things, it is not really ready for prime time, and I do not believe that IBM is even stating that it is. It is really close, though, and the concepts will carry forward from this version to the next. The next version will include much more OS support, and I have heard of many new (and cool, as in good) features that address what people have been saying is missing. I do believe this is a much better solution, designed from the ground up for the distributed environment, unlike TPMfSW, which, while good, was really designed more for a datacenter.

Shameless Plug
Look for Gulf Breeze Software Partners at Pulse 2010. I will also be doing a presentation called “End to End computer management with Tivoli Policy Driven Software Distribution”. This session will demo using TPDSD along with TPMfOSD to go from a bare-metal install all the way to installing applications via policy.

Hope to see you there!

Thursday, December 10, 2009

How to find the GUID of an object in the TADDM GUI

The TADDM GUI only shows you a subset of the information that's stored in the database. The rest of the information can be found by accessing the database directly or using the command line ($COLLATION_HOME/dist/sdk/bin/api.sh). If you use the command line, the easiest way to get information about an object is if you have its GUID, with:

api.sh find [--depth num] --guid THE_GUID

but you need to get the GUID first. It's actually easy - just drag and drop an icon from the GUI into Notepad, and what you'll see will be similar to:

<DragInformation><Node><Name>SVC-2145-GYUR0XSVC01-IBM</Name><ClassName>com.collation.platform.model.topology.storage.StorageSubSystem</ClassName><SubType>com.collation.platform.model.topology.storage.StorageSubSystem</SubType><CollationType>topology.storage.StorageSubSystem</CollationType><ID>459DBBF7B68832D98A1C414EA1E5E2EF</ID><IconName>storagesub</IconName></Node></DragInformation>

The value between the <ID> tags (459DBBF7B68832D98A1C414EA1E5E2EF in this example) is the GUID. Easy as that.
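Putting the two together, you can feed that GUID right back into the command line syntax shown above (adding whatever authentication options your environment requires):

$COLLATION_HOME/dist/sdk/bin/api.sh find --depth 1 --guid 459DBBF7B68832D98A1C414EA1E5E2EF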

Wednesday, December 9, 2009

Framework Monitoring in ITM

If you are looking to monitor Tivoli Framework from ITM, the best way is to develop a Universal Agent or Agent Builder agent that pulls Framework metrics using a custom script. Needless to say, Framework provides a vast array of commands that can easily be scripted to get you the metrics you need.
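For the script-based approach, the UA's script data provider is driven by an MDL file that maps your script's output to attributes. Very roughly, and with made-up application, script and attribute names, such a definition might look like the following (check the UA documentation for the exact syntax):

//APPL TMFSTAT
//NAME GATEWAYS S 300
//SOURCE SCRIPT /usr/local/scripts/check_gateways.sh
//ATTRIBUTES ';'
  Gateway_Name D 32
  Status D 16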



If you are looking to monitor basic server components such as the TMR server, managed nodes, the endpoint manager (epmgr) and gateways, and you are at Framework 4.3.1 or later, you are in luck. You don't even need to develop an MDL. Framework 4.3.1 provides a new component called tmfmon that provides the necessary MDLs and commands, which can be readily imported into the Universal Agent.



Take a look at the $BINDIR/../generic/tmfmon/README file for details on how to implement this solution.

Wednesday, December 2, 2009

Some quick notes on CCMDB 7.2

These aren't clear or verbose at this point, but I wanted to get the info out to anyone who wants it:

TADDM 7.2 and CCMDB 7.2 are out now, and they seem to work just great.

The TADDM 7.1.1.5 -> 7.2 upgrade works amazingly well, though it takes several hours if you have lots of data.

The Java 1.6_17 JRE (the latest as of a couple of weeks ago) doesn't work with the TADDM GUI. You need to use an older version (1.6_07 works like a champ).

New in TADDM 7.2 is an "Explore" feature in the topology view. This lets you add additional relationships to a view: you can right-click a machine, select "Explore", choose which relationships you want displayed, and then select which of the objects associated via those relationships should be shown. Kinda nifty.

CCMDB 7.2 has a new CI topology view. This is a nice feature so you don't have to launch over to TADDM just to see topology info.

In CCMDB 7.2, the "Admin" workstation is no longer limited to just Windows - Linux is supported (so you can run the CCMDB Launchpad from a Linux machine).

Tuesday, November 17, 2009

SQLite Database

If you have been using CSV/flat files as persistent storage for your scripts, you should really check out SQLite. It gives you the power of an RDBMS without the complexity that comes with it. Any SQLite database you create is nothing but a file, yet it provides locking, transaction support, joins, etc. With the ".dump" command, it can generate the SQL commands needed to reproduce the whole schema.

Did I mention that this database format is supported by ActivePerl by default? You can use the standard Perl DBI module to manage this database.

Also, there is a CLI tool called sqlite3 (~500K) that lets you run all the database manipulation and SQL commands. And if it is good enough for Google Android and Apple Safari, chances are it is robust enough for my needs.
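For example, once the sample.db file created by the Perl script below exists, you can poke at it straight from the shell (the table and file names match that script):

sqlite3 sample.db "SELECT * FROM MYCERT;"
sqlite3 sample.db ".dump"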

Here is a simple Perl script to access this database from Perl.

#!/usr/bin/perl

use strict;
use warnings;
use DBI;

# Connect to (or create) the SQLite database file sample.db.
# SQLite needs no user or password, so those arguments are empty strings.
my $dbh = DBI->connect('dbi:SQLite:dbname=sample.db', '', '', { RaiseError => 1 });

# Create a table and insert a couple of rows.
my $sql = qq{ CREATE TABLE MYCERT ( num int not null, name varchar(20) ) };
$dbh->do($sql);
$sql = qq{ INSERT INTO MYCERT VALUES(1, 'ITM') };
$dbh->do($sql);
$sql = qq{ INSERT INTO MYCERT VALUES(2, 'Omnibus') };
$dbh->do($sql);

$dbh->disconnect();
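Reading the data back from Perl is just as simple; here is a minimal sketch against the same table:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:SQLite:dbname=sample.db', '', '', { RaiseError => 1 });

# selectall_arrayref returns a reference to an array of row arrayrefs
my $rows = $dbh->selectall_arrayref('SELECT num, name FROM MYCERT ORDER BY num');
printf "%d  %s\n", @$_ for @$rows;

$dbh->disconnect();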

Thursday, November 5, 2009

To AB, or not to AB

If you are looking to develop a custom monitoring solution in ITM, ITM gives you two options: the Eclipse GUI based Agent Builder tool or the Universal Agent (UA). Which one should you choose? While the Agent Builder is shiny and easy to use, a UA solution has its own advantages. Read on for some of the pros and cons of each approach.

An Agent Builder based solution makes sense in the following scenarios.

1) If you want to deploy something real quick and easy, then Agent Builder is a good candidate for your needs. Once you are familiar with the Agent Builder interface, you can create a custom monitoring agent literally in minutes. Moreover, there are not many typos or mistakes you can make with Agent Builder's GUI based approach.

2) If you generally prefer GUI method over CLI methods, you will like Agent Builder more.

3) If you want to pull from data sources such as JDBC, WMI, the NT Event Log, the Service Control Manager, etc., then you should be able to build an Agent Builder agent with a few clicks. The UA will require a lot of work, as you may have to write your own code to pull data from these data sources.

4) If you want to integrate the custom monitoring deployment with your current agent deployment methods, then obviously Agent Builder is the way to go. Deploying an Agent Builder agent is very much the same as deploying any other agent.


A Universal Agent based solution makes sense in the following scenarios.

1) If you want to minimize the number of agents you have to manage, then you are better off with the UA. For example, if your requirement is to deploy 'n' custom monitoring solutions, Agent Builder would typically require 'n' agents, whereas with the UA, one agent should be able to perform all 'n' monitoring activities.

2) Let me prefix this statement with a caveat: check with your IBM representative for all licensing related information. Since one UA can handle multiple monitoring tasks, the licensing cost of a UA based solution is typically lower than that of an Agent Builder solution.

3) If you have been using the UA for a long time, you can deploy a UA solution as quickly as an Agent Builder solution. Moreover, the UA works pretty reliably.

4) If your monitoring requirements need advanced summarization capabilities, then the UA provides more advanced features than Agent Builder. Again, some of these tasks can be done by modifying the itm_agent_toolkit.xml file, but the Agent Builder capabilities in this regard are not fully known yet.

Hope this information is helpful in your next custom monitor deployment.

Wednesday, October 14, 2009

Including Javascript functions in your BIRT reports

BIRT provides very tight integration with Java/JavaScript for customizing your reports. Most of the time, you embed your JavaScript within your reports, and you have to modify each of the reports if something needs to be changed.

However, there is a better way especially for some frequently used functions. You can put them in a .js file and re-use them across your reports. Here is how to do it.

1. Create a set of JavaScript functions (such as for logging, modifying your queries, etc.) and put them in a file, e.g. GbsFunctions.js. (A sample of what such a file might contain is sketched after these steps.)

2. Save the file somewhere under your resource directory, which can be set within Eclipse using Window->Preferences->Report Design->Resource->Resource folder (e.g. resourcedir/GBS/scripts/GbsFunctions.js).

3. Now add the following XML to the XML source of your reports. Make sure that what you add doesn't result in malformed XML (e.g. add it just before the <data-sources> tag).
<list-property name="includeScripts">
<property>GBS/scripts/GbsFunctions.js</property>
</list-property>

4. Now, you can access the functions listed in GbsFunctions.js within BIRT.
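For illustration, a GbsFunctions.js file could hold something as simple as the following (hypothetical function names; adapt to your own reports):

// GbsFunctions.js - shared helpers for GBS BIRT reports (illustrative only)

// Simple logger: writes a timestamped message to the engine's standard output.
// BIRT's script engine is Rhino, so Java classes are reachable via Packages.
function gbsLog(msg) {
    Packages.java.lang.System.out.println(new Date() + " [GBS] " + msg);
}

// Trivial query helper: append a WHERE clause to a data set's query text.
function gbsAddFilter(query, filter) {
    return query + " WHERE " + filter;
}

You could then call gbsLog() from, say, a data set's beforeOpen script.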

Hope you find this useful.