Friday, January 10, 2025

IBM Identity Manager: Adding a new attribute

Background

The documentation on this topic is lacking, and the example LDIF file it provides is in a format that will cause idsldapmodify to fail with a painfully obtuse error message. So I'm documenting this here to help the next person avoid some headache.

Details

Here's the documentation on how to add an attribute. In there, it provides this LDIF file as an example:

dn: cn=schema changetype: modify add: attributetypes attributetypes: ( myAttribute-oid NAME ( 'myAttribute' ) DESC 'An attribute I defined for my LDAP application' EQUALITY 2.5.13.2 SYNTAX 1.3.6.1.4.1.1466.115.121.1.15 {200} USAGE userApplications ) - add: ibmattributetypes ibmattributetypes: ( myAttribute-oidDBNAME ( 'myAttrTable' 'myAttrColumn' ) ACCESS-CLASS normal LENGTH 200 )

However, if you try to import that LDIF file with the idsldapmodify command given, you'll get the following error:

ldapmodify: invalid format (line 5 of entry: cn=schema)

After a bit of pain, I figured out that the whitespace in the lines is the problem. You need to either concatenate the continued lines together OR have some whitespace at the front of the continued lines. Additionally, there should be a space between "oid" and "DBNAME". So if you change the file to the following, you will no longer get the error:

dn: cn=schema
changetype: modify
add: attributetypes
attributetypes: ( myAttribute-oid NAME ( 'myAttribute' ) DESC 'An attribute I defined for my LDAP application'    EQUALITY 2.5.13.2 SYNTAX 1.3.6.1.4.1.1466.115.121.1.15    {200} USAGE userApplications )
-
add: ibmattributetypes
ibmattributetypes: ( myAttribute-oid DBNAME ( 'myAttrTable' 'myAttrColumn' ) ACCESS-CLASS normal LENGTH 200 )
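
For reference, the import itself looks something like this; the hostname, bind DN, password, and LDIF file name are just placeholders for your environment, and the exact flags may vary slightly by idsldapmodify version:

idsldapmodify -h ldaphost -p 389 -D cn=root -w passw0rd -i addMyAttribute.ldif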

What about updating the objectclass?

That's what I was wondering! Simply adding an attribute doesn't really help you much until you add it as a MUST or a MAY attribute to an objectclass. IBM has an almost complete guide to doing that here, but it is incomplete. For one thing, the format of the data is wrong again: the long attribute values should all be on a single line, like this:

dn: cn=schema
changetype: modify
replace: objectclasses
objectclasses: ( 2.16.840.1.113730.3.2.2 NAME 'inetOrgPerson' DESC 'Defines entries representing people in an organizations enterprise network.' SUP 'organizationalPerson' Structural MAY ( audio $ businessCategory $ carLicense $ departmentNumber $ employeeNumber $ employeeType $ givenName $ homePhone $ homePostalAddress $ initials $ jpegPhoto $ labeledURI $ mail $ manager $ mobile $ pager $ photo $ preferredLanguage $ roomNumber $ secretary $ uid $ userCertificate $ userSMIMECertificate $ x500UniqueIdentifier $ mytest ) )

Additionally, where did that value for "objectclasses" come from?? Well, that came from this command:

ldapsearch -b cn=schema -s base objectclass=* objectclasses | grep \'inetOrgPerson\'

Running that, you will get a result similar to:

objectclasses=( 2.16.840.1.113730.3.2.2 NAME 'inetOrgPerson' DESC 'Defines entries representing people in an organizations enterprise network.' SUP organizationalPerson STRUCTURAL MAY ( audio $ businessCategory $ carLicense $ departmentNumber $ displayName $ employeeNumber $ employeeType $ givenName $ homePhone $ homePostalAddress $ initials $ jpegPhoto $ labeledURI $ mail $ manager $ mobile $ o $ pager $ photo $ preferredLanguage $ roomNumber $ secretary $ uid $ userCertificate $ userPKCS12 $ userSMIMECertificate $ x500UniqueIdentifier ) )

Modify that to include your new attribute, then put that into the LDIF file. Then run idsldapmodify to import that LDIF file into your server. After that, you'll have your new attribute added to the inetOrgPerson objectclass.
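
To double-check that the change took, rerun the schema search from above and grep for the new attribute instead (here I'm assuming you named it mytest, as in the example LDIF):

ldapsearch -b cn=schema -s base objectclass=* objectclasses | grep mytest

If your new attribute shows up in the inetOrgPerson definition, the objectclass update worked.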

Friday, January 3, 2025

IDI v10 uses PKCS12 keystores

IDI Version 10 ships with PKCS12 keystores rather than JKS keystores, but the file names still end with .jks, which can be quite confusing, especially if you're still running on an OS that has an older version of java as the system java. If you have your shell configured with the defaults and you just run something like:

keytool -list -keystore testserver.jks -v

You'll get the following error:

keytool error: java.io.IOException: Invalid keystore format
java.io.IOException: Invalid keystore format
...

The simple fix for this is to provide the full path to the IDI-provided keytool command, such as:

/opt/IBM/TDI/cev10/jvm/jre/bin/keytool -list -keystore testserver.jks -v

This will show you details about all of the certificates in the testserver.jks keystore.

FYI: the default password for testserver.jks is "server", and for serverapi/testadmin.jks, it is "administrator".
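
If you want to script this check, keytool will also accept the keystore password on the command line via -storepass, and a newer system keytool that supports PKCS12 can be told the store type explicitly. Both of these are just sketches using the defaults above:

/opt/IBM/TDI/cev10/jvm/jre/bin/keytool -list -v -keystore testserver.jks -storepass server
keytool -list -v -storetype PKCS12 -keystore testserver.jks -storepass server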

IDI Development: Promoting Code From TEST to PROD

Background

The documentation on this topic is slim on the gory details, so I figured I would write a little bit about it. 

Situation

If you have different TEST/DEV and PROD environments (as you should), you will be creating solutions in TEST/DEV, and then you need to somehow get those solutions to PROD. There are two different scenarios to consider at this point.

1. Solutions with no external properties files

This is the easy scenario, and the way to make it easiest is to add your PROD IDI Server to the list of Servers in your TEST/DEV CE (in the bottom left hand pane of your CE):




Adding a remote server is easy if you have all of the defaults in place and all servers are at the same version of IDI. If you've updated certificates and/or have different versions of IDI, it's harder, so for this post I'm going to assume you're on the easy path. To add a remote server, first add a server to create a new server document. To keep things simple, name it the same as the remote hostname. Then right-click that server and select "Open Configuration". In the configuration, point to the remote IDI server like so:

My remote server is named jazzsm9, and the IDI server is running on the default port of 1099. Just the first (Server API Address) field needs to be filled in. It will actually fill in the last field (ActiveMQ Management port) automatically for you based on the port you specify in the first field. Once you add it, the server should show up with a green circle with an inscribed white triangle next to it, like this:


Now you can simply Export your solution (Project) directly from the CE to the remote server. 

To do this:

1. Right-click your project and select "Export..."
2. Select "IBM Security Verify Directory Integrator-> Runtime Configuration (e.g. rs.xml)"
3. On the dialog that follows, make sure to select "Server", select the appropriate server, and specify a name for the solution. The name you pick will be the name of the XML file created in that remote server's $SOLUTION_DIR/configs directory. 


2. Solutions WITH external properties files

This process is more involved because you not only have to copy the solution over to the remote server, you ALSO MUST COPY OR CREATE THE EXTERNAL PROPERTIES FILES ON THE REMOTE SERVER. This is one of the important parts that the existing documentation leaves out. The project/solution contains REFERENCES to external properties files, but does not actually contain the properties files themselves nor the properties in those files. This means that you will have to either copy the properties files over to the other server in the correct location or create them in the correct location. This isn't a huge step, but it's one you need to keep in mind when promoting code from TEST to PROD, especially the first time you do it for a particular project/solution.
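
For example, if your project references a properties file named myproject.properties in the solution directory, something like the following gets it onto the PROD server. The file name, user, host, and path here are purely illustrative; use your actual PROD solution directory:

scp myproject.properties produser@prodserver:/opt/IBM/TDI/prodsoldir/myproject.properties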

Thursday, January 2, 2025

IDI Operations: Always Start the IDI Server and CE with the "-s" Flag

Keeping it simple, you should always start the ibmdisrv or ibmditk processes with the "-s" flag to specify the solution directory to use. If you don't do this, then you're just using whatever happens to be set in your environment, which can lead to quite a bit of confusion. As an example, you could have your CE (the Eclipse-based IDI GUI) running with a DIFFERENT solution directory than your default ibmdisrv process. This can lead to a case where properties that resolve perfectly in the GUI don't resolve when you go to run an AL. You can avoid this by always using full paths to properties files, but that makes your solutions much less portable.
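
As a sketch, using the install path that shows up later in this post and a placeholder solution directory:

/opt/IBM/TDI/ceV10/ibmdisrv -s /opt/IBM/TDI/mysolution -d
/opt/IBM/TDI/ceV10/ibmditk -s /opt/IBM/TDI/mysolution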

IDI Operations: Finding the Solution Directory of a Running ibmdisrv Process

Background

I am working with a client that has had ITIM and TDI/IDI installed and running for over a decade, with different groups responsible for the administration of each. They now have around ten ibmdisrv processes running, with some of them configured to run the same Listener ALs (so the one that starts first wins). None of the servers are started with the "-s" flag, so it's a little difficult to figure out their solution directories. 

Each ibmdisrv and ibmditk process needs to have a Solution Directory (aka SOLDIR, SolDir, or current working directory) defined. All default logging is done in the solution directory, and all relative paths used are relative to the solution directory. Basically, the solution directory is very important, and this post will show you how you can find it for a running process.

The solution shown here is for Linux, because that is the most common OS I've seen for running IBM Directory Integrator. The second most popular platform is Windows. If you're running Windows, you can get similar data using the Process Explorer GUI tool.

Solution

The first thing to look at is the process listing for the IDI server java process. The easiest way to find all of these processes running on your server is with this command:

# ps -ef | grep IDILoader

root       5693   5683  0 Jan01 pts/0    00:07:57 /opt/IBM/TDI/ceV10/jvm/jre/bin/java -cp /opt/IBM/TDI/ceV10/IDILoader.jar -Dlog4j2.configurationFile=file:etc/log4j2.xml --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED com.ibm.di.loader.ServerLauncher -d

We see from that output that the PID of this java process is 5693 and the Parent PID (PPID) is 5683. The most straightforward way I've found to find the SolDir for this server is using this command:

# lsof -Pp 5693 | grep cwd

The output should be similar to:

java    5693 root  cwd    DIR              253,0      4096  68731158 /opt/IBM/TDI/prod_V7.1.1

Easy as that, we see that the SolDir for this server is /opt/IBM/TDI/prod_V7.1.1.

Additional Details

We can get more details about the PPID with this command:

# ps -fwwp 5683

UID         PID   PPID  C STIME TTY          TIME CMD
root       5683   5462  0 Jan01 pts/0    00:00:00 /bin/sh /opt/IBM/TDI/ceV10/ibmdisrv -d

What we see here is that the /opt/IBM/TDI/ceV10/ibmdisrv script was called with only the "-d" flag to start this process. When that script runs, it calls the bin/setupCmdLine.sh script, which eventually calls the script bin/defaultSolDir.sh to set the solution directory of the process. (A great post about this and more can be found at this link.) While this is truly great information to know, one drawback is that the solution directory can be overridden with the TDI_SOLDIR environment variable. That's why I like the lsof approach better on Linux. The behavior of lsof on AIX is different and doesn't point to the solution directory like it does on Linux, so tracing the startup scripts and environment may be the better approach there. In either case, it's good to know both.
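
On Linux, you can also check whether TDI_SOLDIR is set in the environment of a running server by reading its /proc entry (a quick sketch using the PID from the example above):

# tr '\0' '\n' < /proc/5693/environ | grep TDI_SOLDIR

If that prints nothing, the variable isn't set for that process.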

Tuesday, December 24, 2024

Logging in IBM Directory Integrator

Introduction

I've been creating a bunch of AssemblyLines in IDI/TDI/ISVDI recently, and wanted to share a trick to help you set up logging to a file whose path includes BOTH the name of the project and the name of the AssemblyLine.

Quick and Dirty

1. In the Log Settings for the AL, make sure to add a log. I prefer FileRollerAppender, but you can choose whatever you want.

2. Click on File Path and set it to Advanced (JavaScript)

3. Set the value to the following:

var logName = "TOBEREPLACEDATRUNTIME";
try {
    logName = task.getParent().getName() + "/" + task.getShortName();
} catch(e) {}
return "/opt/IBM/TDI/logs/" + logName + ".log";

4. Click OK and you're ready to go. The logfile for the AL will be:

/opt/IBM/TDI/logs/project_name/AL_name.log

Longer and More Details

The Quick and Dirty section above is for advanced/experienced users who have been working with the product for a while and have been beating their heads against a wall trying to find this solution (maybe a handful of people out there). This section is for everyone else.

Background

IDI has tons and tons and tons of logging options. So many, in fact, that most customers will just stick with the default, which logs everything to ibmdi.log. That can make life difficult because all ALs are logged in the exact same file, with their messages intertwined. There can absolutely be benefits to that (if you're using something like Splunk for centralized log storage, for example). To me, it's much easier to have a different log file for each AssemblyLine. And because you can easily end up with hundreds of ALs in your implementation, it makes sense to me to have one directory per Project, just to make it easier to work through the directory structure.

The specific situation I'm in is that my client has had ITIM and TDI running for over a decade, with different people managing the implementation over the years. This means that they have multiple TDI servers (at various versions) running on the same host. If I were to set this up from scratch starting right now, an easier way to accomplish something VERY similar would be to add these two lines to solution.properties for the DI server:

SystemLog.defaultCreateLog=true
SystemLog.defaultMaxGenerations=10

This will cause each AL to log messages in files named: 

$SOLDIR/system_logs/$project/$ALName/<logfile>.<timestamp>.log  (10 max)

But that's if I was starting from scratch, which I'm not. So back to my original solution.

Walkthrough

Here's an animated GIF walking you through the process of adding a log to an AL, then configuring that log to have a name that includes the project and AL name.


Explanation

There are a couple of interesting parts that I think deserve a little more detail.

try/catch

Without the try/catch block in the code, you'll see an error when going to edit the logger, like this:


This is true for any form element in IDI that you implement as JavaScript. If you reference the task or work or conn objects, you'll get a big red error similar to the one above. That's why I first declare the variable logName with a string value. Then inside the try block, I set it to the value I want to see at runtime. The catch block does nothing, but syntactically it's required.

getParent()

So where did I find the getParent() method of the task object? It's in the API javadocs, which you can find on your own system in YOUR_TDI_INSTALLDIR/api or you can find them online here. The task object is described in the AssemblyLine class (com.ibm.di.server.AssemblyLine specifically; there is another one named AssemblyLine, but it is com.ibm.di.api.jmx.mbeans.AssemblyLine, which is not the same). 

Why did you change the Pattern for the logger?

I removed "[%c]" from the pattern because this information takes up a TON of visual space and is 100% unnecessary when using a different log file for each AL. 

What to do next?

So here's a bonus strategy that can save you a lot of time as you create new ALs. Set this for all of your existing ALs (or at least those for which it makes sense). Then, when you copy any AL, the log will already be dynamically set for you in your new AL. In my experience, this is extremely useful because I end up creating quite a few "utility" ALs for performing one-off operations. I always start by copying an existing AL, then giving it a new name and modifying it as appropriate for the task at hand. I generally keep these utility ALs in a single project named something like "CLIENT_NAME_Utilities".

Comments

In every one of my ALs, I enable the Prolog - Before Init hook specifically as the place where I provide some documentation for the AL for the next sucker developer that comes along and has to maintain it.

Parting thoughts

If you got something out of this, please share the link to get it more exposure.


Friday, August 30, 2024

Working with dates in IBM Security Directory Integrator

Working with dates in TDI can be tricky, since you'll normally be dealing with not only ISIM's internal format, but also various formats from different databases and other sources. The easiest way I've found is to use the java.time.* classes that were introduced in Java 8. Specifically, the LocalDateTime class is great because it has methods like minusHours(), minusDays(), etc. to make it easy to get a time from "X time ago". 

Here are some examples I've come up with to use with the different formats I've run into on my latest contract:


var ldtObj = java.time.LocalDateTime.now();
var dtfObj = java.time.format.DateTimeFormatter.ofPattern("yyyyMMdd");
var today = ldtObj.format(dtfObj);  
// today is "20240830" for August 30, 2024

var dtfObjHHmm = java.time.format.DateTimeFormatter.ofPattern("yyyyMMddHHmm");
var todayHHmm = ldtObj.format(dtfObjHHmm);  
// todayHHmm is  "202408301531" for 15:31 on August 30, 2024

var dtfObjH = java.time.format.DateTimeFormatter.ofPattern("yyyy-MM-dd:HH");
var todayH = ldtObj.format(dtfObjH);  
// todayH is "2024-08-30:15" for 15:00 on August 30, 2024

var dtfObjChars = java.time.format.DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH'hrs'mm'mins'");
var todayChars = ldtObj.format(dtfObjChars);  
// todayChars is "2024-08-30T15hrs31mins" for 15:31 on August 30, 2024
// just put single quotes in the Pattern around any literal characters you want in the resulting string.

var utcZone = java.time.ZoneId.of("UTC");
var ldtObjUtc = java.time.LocalDateTime.now(utcZone);
var dtfObjZ = java.time.format.DateTimeFormatter.ofPattern("yyyyMMddHHmm'Z'");
var todayZ = ldtObjUtc.format(dtfObjZ);  
// todayZ is  "202408301531Z" for 15:31 on August 30, 2024 Zulu (UTC) time.

// get the time "3 hours ago"
var ldtObjThreeHoursAgo = ldtObj.minusHours(3);
var dtfObjAgo = java.time.format.DateTimeFormatter.ofPattern("yyyyMMddHHmm");
var threeHoursAgo = ldtObjThreeHoursAgo.format(dtfObjAgo);
// threeHoursAgo is "202408301231" for three hours before 15:31 on August 30, 2024 (so 12:31)