Tuesday, October 16, 2018

You can now use Vega to create custom graphs in Kibana

Prior to Kibana 6.2, you had to write a custom plugin to add custom visualization types. Now, however, support for Vega is included. Vega is a declarative JSON (HJSON, actually, in Kibana) visualization grammar that you can think of as a wrapper around the D3 visualization toolkit, letting it display inside Kibana. Here's a video with the highlights:

https://www.youtube.com/watch?v=lQGCipY3th8
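
To give you a feel for the language, here's a minimal sketch of a Vega-Lite spec of the kind you can paste into a Kibana Vega visualization. It uses hard-coded values rather than an Elasticsearch query, just to show the shape of a spec; Kibana will also accept it in HJSON form:

{
  "$schema": "https://vega.github.io/schema/vega-lite/v2.json",
  "data": {
    "values": [
      {"x": 1, "y": 10},
      {"x": 2, "y": 25},
      {"x": 3, "y": 15}
    ]
  },
  "mark": "line",
  "encoding": {
    "x": {"field": "x", "type": "quantitative"},
    "y": {"field": "y", "type": "quantitative"}
  }
}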

Monday, October 15, 2018

IBM Announces Multicloud Manager

https://www.ibm.com/cloud/multicloud-manager

It allows you to manage containers across all the biggest cloud providers.

You can now see your LinkedIn saved articles on the desktop!

I just ran across this today, and am very happy that I can finally view the items that I save on my phone on the web version. The link to see your saved articles is:

https://www.linkedin.com/feed/saved-articles/

To save an article, click the little bookmark icon that appears under each article. You'll get a popup that tells you it's saved and gives you a link to view all of your saved articles.

Tuesday, October 9, 2018

ITMSuper is in a new location

IBM moved the ITMSuper tool to:

https://www-01.ibm.com/marketing/iwm/iwm/web/pickUrxNew.do?source=tivopal

If you're an ITM 6.x user and you haven't used this tool, you really should download it to help with the management of your environment.

Wednesday, October 3, 2018

Learning neat new things at the Splunk Conference


Splunk is introducing tons of great new features at their conference this year. Many customers complain about the cost of Splunk, but you can lower that cost by leveraging the data to cover more use cases. If you feel like you're not getting the most out of Splunk, give us a call to get some help.

Monday, September 17, 2018

Maximo 7.6.1 includes entitlement for Cognos Analytics 11

Cognos Analytics 11 is the all-new Cognos product built for self-service business analytics, competing with the likes of Tableau and Microsoft Power BI. Pam Denny at IBM created a 13-part video series about this new entitlement here:

https://www.youtube.com/playlist?list=PLOBy7UFdPupclhOayxt8jrebiSZqZiv3R

Thursday, September 6, 2018

Some of our current projects

We work with a pretty wide array of products, so I wanted to highlight some of the projects we're working on right now:

ServiceNow Architecture and Implementation

We're working with a communications company to implement their procurement, installation and change processes within ServiceNow, with asset feeds from multiple external systems.

ServiceNow Incident Response integration with QRadar

We're helping our client customize both products and the integration between them to best leverage their existing investment and people.

IBM Control Desk for Field Service Management

We're helping a different communications customer with their field service management through workflows and custom user interfaces defined in IBM Control Desk.

Netcool Operations Insight Implementation

We're actually working on several of these at the moment. Most of the work goes into identifying the different event sources, determining what (if any) automated actions need to be performed, and deciding who needs to be notified.

BigFix Steady State

A medical client of ours has been leveraging our BigFix Managed Services for several years to ensure that all IT equipment is both known and is running software at the appropriate patch level.

ICD and BigFix Implementation with Airgap

We're working with a defense contractor to ensure that their Asset Management and Change Management processes continue to work smoothly leveraging ICD and BigFix.

Friday, August 31, 2018

The latest version of OpenStack (Rocky) can leverage bare metal servers directly

The source link from ZDNet:

https://www.zdnet.com/article/new-openstack-cloud-release-embraces-bare-metal/

Now you can provision bare metal servers through OpenStack. The linked article describes some of the use cases, and provides additional links to more information.

Wednesday, August 29, 2018

ServiceNow - requiring input from a user completing a task from workflow

Background

A normal part of any workflow is requiring additional information from someone involved in it. A lot of information can be captured automatically, but there's often some information that must be entered manually, simply because not everything can be determined by an algorithm. This may be because the information is maintained in a separate, walled-off system, or because the sensors required to gather it aren't yet deployed, etc.

In ServiceNow IT Operations Management, you have this ability out of the box when dealing with Service Catalog Items. A Service Catalog Item is also known as a Requested Item, or RITM. Specifically, you can define Variables that are associated with an RITM, and those Variables are then available for use within any Catalog Tasks that you create within the workflow for that RITM. Without a little customization, though, this feature is ONLY available within workflows that target the sc_req_item table. So if you require some generic user input as part of a change task, for example, you need to perform the customizations detailed here.

Here is a link to a great article on this very topic. That post is a little old (from 2010), so the information is a bit dated and very terse. I'm writing this article to update that information and clarify a few pieces.

I suggest you read through these instructions once or twice before you start trying to follow them. Then re-read them as you're implementing them. The pieces will come together for you at some point, just probably not immediately unless you've worked on this area of ServiceNow a bit.

What's already in place?

Basically, almost all of the components needed to provide this capability already exist in the system, so there's only one place that you'll have to add some code. Everything else is just customization.

By default there are two tables that already exist in ServiceNow for this very purpose:

Question [question]
Question Answer [question_answer]

(This name format is a common one you'll see throughout ServiceNow: "Label [actual_table_name]".)

The Question table holds all of the questions/variables defined in the system. Each entry specifies the text of the question and the type of field required (simple text field, choice list, reference, etc.).

The Question Answer table exists to hold one entry for each question and its answer associated with an "entity". For the purpose of this article, the "entity" we'll be adding question/answer pairs to is a change_task item.

Note: Catalog Items already use both the question and question_answer tables to store the Variables (options) that can be specified for each item. 
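
If you'd like to poke around in these two tables before customizing anything, here's a quick background-script sketch. It's a minimal example, and the field names (question_text, type) are ones I've seen on the question table, so verify them in your instance:

// List every question defined in the instance, along with its type
var q = new GlideRecord('question');
q.query();
while (q.next()) {
    gs.print(q.getValue('question_text') + ' [type: ' + q.getValue('type') + ']');
}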

Areas that need to be customized

You will need to make customizations in the following areas:

1. Workflow->Administration->Activity Definitions

In here, you need to edit the definition for the Create Task activity. The customizations we're making here will allow us to add Questions to a Create Task workflow activity when we add it to the workflow canvas.


Click on Create Task to edit its definition:


As with most forms in ServiceNow, there are numerous fields and sections within this form. 

Define new variable

The first thing we need to do is define a new variable in the bottom section named "Activity Variables":


Add a variable of type "List" with a label of "Questions" and a Column name of "task_questions" (the name will have "u_" prepended to it once you click Submit), and make it a Reference to the Question table:

Edit the script

The next thing we need to do is add to the script on the Script tab near the top of the page:



In here, you're going to add two pieces of code. In the onExecute function, you're going to add this code, which references the variable you defined in the last step:

if(activity.vars.u_task_questions)
   this._setVars(taskID);

Add it before the call to this.autoClose(taskID), as shown here:


The purpose of this portion of code is to call our function (defined in the next paragraph) when the "Create Task" activity actually creates a task as part of a workflow.

The next piece of code you'll add is the definition of the _setVars() function that's called in the code you just inserted. This is the code you'll add:

_setVars: function(taskID){
   // Create one question_answer record for each selected question,
   // tying it to the task that was just created
   var questions = activity.vars.u_task_questions.split(",");
   for(var i = 0; i < questions.length; i++){
      var qa = new GlideRecord("question_answer");
      qa.initialize();
      qa.question = questions[i];
      qa.table_name = activity.vars.task_table;
      qa.table_sys_id = taskID;
      qa.insert();
   }
},

Add it after the _generate function definition, as shown here:


The purpose of this code is to add one entry to the question_answer table for each of the questions that are defined for this particular workflow activity task. We'll see this in action later.

Edit the Create Task form

Now that you've got a variable to hold the questions for the task and you've got the code in place to add each of the questions to each new task that will be created by this workflow activity, you need to edit the form to actually let someone choose the questions that will be presented in the task. For this, you need to click on the Edit Variables Layout link in the Related Links section of the Create Task Workflow Activity Definition (hint: this section is directly under the "Update" button under the body of the script):

In here, drag and drop your new "Questions" field wherever you'd like to see it on the form. I've placed mine in the second section of the form:


That's the end of the customizations you need to make to the Workflow Activity Definition. If you want to see the fruits of your labor, you can open the Workflow Editor and create a new workflow, and drop the Create Task activity onto it. Here's what it looks like by default:


And here it is with the new "Questions" field:


We'll come back to the Workflow Editor later. 

2. System UI->Formatters

You now need to create a UI Formatter to display the list of questions and answers. This is covered in the ServiceNow documentation here:


Basically, you just need to create a new UI Formatter that specifies the name you want (I chose "FTQuestionFormatter"), a "Formatter" value of com_glideapp_questionset_default_question_editor, and the type of task that you're going to be creating. In my case, I'm working with a Change Task (the change_task table):


You don't need to change anything about the UI Macro for this formatter - it's written to do exactly what we need.

3. Change Task default view

This is described in the product documentation link above, but I'm including it here for completeness. Since I decided to make this feature available to Change Tasks, that's the form we're going to work with. The most straightforward way to edit this form is to go to System Definition->Tables and select the Change Task [change_task] table.


Then scroll down to the middle of the page to select the Design Form Related Link to open the Form Designer, and there you can drag your UI Formatter from the Formatters section (on the left under "Fields") onto the form where you want the questions displayed. I put mine at the bottom of the second section:


This formatter will ONLY show something if the change task you're viewing actually has questions defined. That means that ONLY change tasks created from the workflow that you create in the next step will have anything shown. So at this point, you won't see anything different on any existing change tasks.

4. Service Catalog->Catalog Variables->All Variables

Here is where you need to define any "Questions" (aka Variables) that you want to see on the tasks you'll create later. The main tip here is that you don't need to specify a value for the Catalog Item field, as what we're doing has nothing to do with catalog items. In fact, catalog items already have this capability built-in, with some additional capabilities. What we're configuring is a bit more generic, but we're using the built-in forms to accomplish our goal.

5. Workflow->Workflow Editor

At this point, you can edit a workflow that targets a table that extends the task table and drag the "Create Task" Core Activity onto the canvas. When you do, you'll see your Questions field:


Make sure to select the appropriate type of task - Change Task. This does NOT work for tasks directly in the Task table, and that's important to remember: you will not see a successful result if you select "Task".

You can click the padlock icon to open it, then click the magnifying glass to search for your questions:


You'll see ALL of the variables/questions in the questions table. Pick the ones you want displayed to the person assigned the task.

The Result

Now when this task is assigned to a user, that user will see the task in "My Work" and will see that they have the option to provide values for all of the questions selected in the workflow design:


Conclusion

You now have the ability to prompt users for additional input when they're completing a task as part of a workflow. You still need to write scripts to access that data, perform validation, make decisions, etc., and I'll leave that for another day.
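
As a starting point for that scripting, here's a hedged sketch of reading the answers back off a change task, for example from a business rule on the change_task table (where current is the task record). The value field name is my assumption, so verify it against the question_answer table in your instance:

// Fetch the question/answer pairs attached to this change task
var qa = new GlideRecord('question_answer');
qa.addQuery('table_name', 'change_task');
qa.addQuery('table_sys_id', current.sys_id);
qa.query();
while (qa.next()) {
    gs.print(qa.question.getDisplayValue() + ' = ' + qa.getValue('value'));
}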

Tuesday, August 14, 2018

The Lenovo P1 is a thin-and-light laptop with a Xeon and 64GB RAM that will be available by September

https://www.theverge.com/circuitbreaker/2018/8/13/17682992/lenovo-thinkpad-thin-laptop-work-travel

Additionally, they've got a P72 that will support up to 128GB RAM coming out at the same time.

More memory, storage and power is a great thing.

Wednesday, July 11, 2018

Processing JSON in automation scripts in IBM Control Desk 7.6

Background

You may need to deal with JSON-formatted data in an automation script, and it can be a little tricky. I've written this post to provide the few little pointers to make it easier for you.

JavaScript

You can write automation scripts in Rhino JavaScript or Jython. While Jython is the most common language used for automation scripts, it turns out that JSON processing is MUCH easier in JavaScript. Specifically, in a JavaScript automation script you have access to the standard JSON object, which gives you everything you need. Here's an example:

var jsonObject = JSON.parse(jsonString);

And that's it. You can now work with jsonObject as a JSON object as described in this reference material from w3schools:


As far as I know, this will work in both WebSphere and WebLogic application servers. One possible caveat is that the JavaScript engine changed between JDK 7 and JDK 8 (from Rhino to Nashorn). Here's more information on that:

Jython in WebSphere

For Maximo/ICD automation scripts, Jython is by far the most popular language. It's also more thoroughly documented and, IMO, easier to work with in this context. However, JSON parsing has a couple of caveats. Specifically, the Jython interpreter in ICD 7.6 is version 2.5.2, which doesn't have a built-in JSON parser (the json module arrived in Python 2.6). However, we're still in luck, because WebSphere actually includes a JAR file that provides JSON processing. The specific class that you need to import is com.ibm.json.java.JSONObject:

from com.ibm.json.java import JSONObject
...
my_json = JSONObject.parse(my_filebody)

And from there, you can deal with my_json appropriately according to the JavaDoc here:
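
For illustration, pulling a value back out looks roughly like this (my understanding from the JSON4J docs is that JSONObject extends java.util.HashMap, so Map-style get() works; the "hostname" key is a made-up example):

# "hostname" is a hypothetical key in the parsed document;
# get() works because JSONObject extends java.util.HashMap
hostname = my_json.get("hostname")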


Jython in WebLogic

Admittedly, I haven't tested this one. I've tested the above two, and from my research, I believe this will work. Specifically, these two links give the necessary information:



If you find that it doesn't work, please ping me and I'll help you get it to work then update this entry as necessary.

With that in mind, you just need to import the appropriate classes in your automation script:

from javax.json import Json
from javax.json import JsonObject

And there you go.
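
Assuming those links hold up, the parsing itself would look something like the following. These are standard JSR 353 (javax.json) calls, but again, I haven't tested this on WebLogic, and the "hostname" key is made up:

from java.io import StringReader
from javax.json import Json

# Json.createReader() wraps any java.io.Reader
reader = Json.createReader(StringReader(my_filebody))
my_json = reader.readObject()             # a javax.json.JsonObject
hostname = my_json.getString("hostname")  # hypothetical key
reader.close()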

Tuesday, July 10, 2018

It only takes an hour to get a test BigFix environment installed and working

The only caveat (which they've maybe fixed now) is that the SQL Server that's bundled with the BigFix Eval is borked, so you first need to install an eval version of MSSQL Server 2014, which is available from Microsoft.

But the whole process is really easy:

1. Create/clone a Windows 2012 or 2016 server (you can download an eval copy of Windows Server 2016 if needed)
2. Google MSSQL Server 2014 evaluation download and download it
3. Install MSSQL with all the defaults
4. Follow IBM's instructions for installation.
5. Once it's up and running (takes about 10 minutes), continue through the install instructions to add all of the available Sites. The site named IBM BigFix Inventory v9 is actually the one that will get you the BigFix Inventory install files.
6. Optionally create/clone a Windows or Linux VM to be an additional client in your environment

That's it, and even if you need to install Windows Server from scratch, it only takes at most 1.5 hours.

There are other parts you can also install now, such as BigFix Inventory or the WebUI (both are available via fixlets in one of the available Sites).

Monday, July 9, 2018

How to change the BigFix WebUI database userid and password

I recently installed the BigFix WebUI with the wrong password and needed to fix it. I found the encrypted information in the db_config.json file in the folder:

C:\Program Files (x86)\BigFix Enterprise\BES WebUI\WebUI

However, this is what the contents of that file are:

{"user":"96\u002fzY1rPfE40v69uFttQAg==","password":"MwKBDmT00BEwEZm1ctZahg==","hostname":"WIN-5M6866TPST1.mynet.foo","port":1433}

And while those look like Base64 encoded values, there's also some encryption going on (try putting either of those strings through an online Base64 encoder/decoder and you'll see).

So the first thing I tried was to just put the information in the file in cleartext and restart the WebUI service, so the file looked like:

{"user":"sa","password":"passw0rd","hostname":"WIN-5M6866TPST1.mynet.foo","port":1433}

Amazingly, that worked, and here's the logfile entry that shows it:

Wed, 04 Jul 2018 13:14:24 GMT bf:dbcredentials-error Failed to decrypt database credentials, attempting to use inputted credentials as plaintext

However, the file kept the cleartext data (I had hoped that it would re-encrypt the values on startup, but it did not).

Then I found the solution in the place I should have looked to begin with - in the BigFix console! There's a task defined in the BES Support site specifically for this purpose. The task is named "Deploy/Update WebUI Database Configuration". Run the action associated with that task and it will create a new db_config.json file with the properly encrypted data and you're good to go.

Friday, July 6, 2018

For business use, don't buy a laptop with higher than 1080p resolution

The high resolution screens available today are amazing for graphics and gaming, but absolutely horrible if you need to use any traditional/legacy applications. The main application that gives me trouble is Quickbooks Desktop Pro. We have version 2016, and I don't imagine they're going to fix it anytime soon since they seem to (rightly) want everyone to move to their online version. We've been using Quickbooks for over 15 years, so we're using some features that simply aren't available in the online version. I'm sure we'll move to the online version at some point, but it won't be any time soon.

I'm certain there are other desktop applications that similarly have a problem with high resolution monitors - specifically, the text and windows are too small to see, and scaling doesn't work correctly at all. It's just ugly.

The higher end business laptops (Lenovo Thinkpad T, P or X series; Dell XPS; etc.) generally offer a 1920x1080 pixel option as a base, then higher resolutions and touchscreens cost more. In my experience, you'll be the happiest with the lower cost 1920x1080 option. Whether you get a touch-enabled screen or not is up to you, but definitely skip the high resolution screen.

Wednesday, June 27, 2018

Just Announced: IBM Cloud App Management

Here's the announcement, with architecture details:

https://developer.ibm.com/apm/2018/06/26/introducing-ibms-new-service-management-cloud-native-offering-ibm-cloud-app-management/

Some of the highlights are that it runs on IBM Cloud Private (so it runs in containers orchestrated by Kubernetes) and supports both ITM v6 and APM v8 agents.

Monday, June 25, 2018

Reading and writing files in a Maximo automation script

Background

All of the product documentation tells you to use the product-provided logging for debugging automation scripts (see here, for example: https://www.ibm.com/support/knowledgecenter/SSZRHJ/com.ibm.mbs.doc/autoscript/c_ctr_auto_script_debug.html ). For quick debugging, however, I find that cumbersome, so I decided to figure out how to access files directly from within an automation script. This post goes over exactly what's required to do that. Maximo supports Jython and Rhino JavaScript for automation scripting, and I'll cover both of those here.

Jython

This one was straightforward, since the Python documentation can be followed exactly. Jython is simply an implementation of Python written completely in Java. All you need to open a file is:

my_file = open('c:/tmp/outfile.txt','a')

where 'a' specifies that we're appending to the file (and creating it if it doesn't exist). You then do need to flush and close the file, and this is my function to do that:

def logit(mytext):
  my_file = open('c:/tmp/jout.txt','a')
  my_file.write(mytext + '\n')
  my_file.flush()
  my_file.close()

So to log a string, just run:

logit("this is my string")

Reading from a file is just as easy:

my_read = open('c:/tmp/computers.json')
my_json = my_read.readline()
my_read.close()

In my case, the file contains one long line of JSON data, so readline() works great to store all of the text of the file into the string named my_json.

Rhino-JavaScript

This one is quite a bit more painful than Jython, which is really just one more reason that all of your automation scripts should be written in Jython. Specifically, the Rhino implementation in Maximo doesn't seem to completely adhere to the documentation you'll find online. For example, there is no "ReadFile()" method available in Maximo. There are other limitations as well, and the only way I found to get past them was to use Java classes. I thought that would make things easy, but then you have to deal with the fact that Java objects (specifically Array objects) are absolutely not the same as JavaScript objects.

So, writing a file isn't too difficult once you realize that you need to use Java. Here's how you open and write a file:

var outFile = new java.io.FileWriter("c:/tmp/autoscriptout.txt");
outFile.write("my string");
outFile.close();

The hard part is actually reading data from the file. Using the same JSON file as above with one long line of JSON, the following is required to read that data into a JavaScript string that can then be parsed:

// Read the whole file into a Java char array, then convert it to a string
var
  thefile = new java.io.File("c:/tmp/computers.json"),
  filelength = thefile.length(),
  thefilereader = new java.io.FileReader(thefile),
  jsonData = java.lang.reflect.Array.newInstance(java.lang.Character.TYPE, filelength),
  res = thefilereader.read(jsonData, 0, filelength),
  jsonString = new java.lang.String(jsonData);
thefilereader.close();

And now all of the JSON data is in the string named jsonString.

Enjoy.

Friday, June 15, 2018

ICD 7.6 Fresh install and config with LDAP authentication configured will fail

We found a problem when installing IBM Control Desk 7.6 on WebSphere and MSSQL where it fails every time if you enable LDAP/AD authentication during the configuration phase of the install. Specifically, you'll see this error in the ConfigTool window:

Apply Deployment Operations-CTGIN5013E: The reconfiguration action deployDatabaseConfiguration failed. Refer to messages in the console for more information.

And if you look in the CTGConfigurationTrace<datetime>.log file, you'll see this error:

SEVERE: NOTE ^^T^Incorrect syntax near the keyword 'null'.
com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near the keyword 'null'.
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:216)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1515)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:404)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:350)
at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:5696)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1715)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:180)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:155)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeQuery(SQLServerPreparedStatement.java:285)
at com.ibm.tivoli.ccmdb.install.common.util.CmnEncryptPropertiesUtil.init(CmnEncryptPropertiesUtil.java:187)
at com.ibm.tivoli.ccmdb.install.common.util.CmnEncryptPropertiesUtil.<init>(CmnEncryptPropertiesUtil.java:101)
at com.ibm.tivoli.ccmdb.install.common.util.CmnEncryptPropertiesUtil.getInstance(CmnEncryptPropertiesUtil.java:141)
at com.ibm.tivoli.ccmdb.install.common.config.database.ACfgDatabase.createCronTask(ACfgDatabase.java:1391)
at com.ibm.tivoli.ccmdb.install.common.config.database.CfgEnableVMMSyncTaskAction.performAction(CfgEnableVMMSyncTaskAction.java:140)
at com.ibm.tivoli.ccmdb.install.common.config.database.ACfgDatabase.runConfigurationStep(ACfgDatabase.java:1108)
at com.ibm.tivoli.madt.reconfig.database.DeployDBConfiguration.performAction(DeployDBConfiguration.java:493)
at com.ibm.tivoli.madt.configui.config.ConfigureSQLServer.performConfiguration(ConfigureSQLServer.java:75)
at com.ibm.tivoli.madt.configui.common.config.ConfigurationUtilities.runDatabaseConfiguration(ConfigurationUtilities.java:540)
at com.ibm.tivoli.madt.configui.bsi.panels.deployment.TpaeDeploymentPanel$RunOperations.run(TpaeDeploymentPanel.java:1550)
at org.eclipse.jface.operation.ModalContext$ModalContextThread.run(ModalContext.java:121)


This happens before the EAR files are built.

The way around this is to select "Use Maximo internal authentication" on the "Configure Application Security" screen of the ConfigTool. Once everything is installed, configured and running, you can then go in and enable J2EE application security for authentication.


Wednesday, June 13, 2018

JD-Gui is an invaluable tool for troubleshooting Java applications

If you deal with Java applications, you should get familiar with JD-Gui if you aren't already. JD-Gui (Java Decompiler - Graphical User Interface) does exactly what its name states, and it seems downright magical because it shows you source code from compiled Java applications, which can give you amazingly useful insight into how an application is working. Here's a screenshot of it in action, where I'm using it to look at a JAR file that's included with the IBM Control Desk ConfigTool:


All of that information came from just dropping the JAR file onto JD-Gui.

The problem I'm encountering is a SQL error complaining about a syntax error near the keyword "null". By looking at the trace file produced and the source code, I've been able to reproduce the exact error message, and I'm 100% confident I know exactly where in the code the error is generated. So instead of randomly trying different possible solutions, I can focus on the very small number of areas that could be causing this particular problem.

I've been using this tool for years, so I'm not sure what took me so long to write about it.

Monday, June 11, 2018

I definitely recommend installing Linux on Windows

If you've got Windows 10, hopefully you've been able to install a recent update that allows you to enable the Windows Subsystem for Linux feature and install a supported Linux distribution from the Microsoft Store. I finally bit the bullet and installed Ubuntu today, and it makes the life of a system administrator MUCH easier. I already had Cygwin installed, but this is a slightly smoother integration, with many of the tools you need already installed (or available with the normal 'apt' or 'apt-get').

The Ubuntu distribution available in the MS Store even comes with vi with color highlighting for known file types (like html or js), and it's got telnet, ssh, sftp, etc. to make your life easy.

It's been available for a while, and I was hesitant to install it, but now I'm very happy I did.

Friday, June 1, 2018

Amazon Chime is a cheaper and more powerful alternative to WebEx

If you haven't looked at Amazon's different AWS offerings in a while, you definitely should take a look sometime. For example, I stumbled across their Chime web conference service:

https://aws.amazon.com/chime/

and I can report that it's just as reliable and easy as WebEx, but with more capabilities and at a fraction of the cost. Specifically, it's only a maximum of $15 per month per host, with 100 attendees allowed, plus you get a dial-in number (an 800 number is available, but there are additional per-minute charges associated with it).

We had an older WebEx account that was $50 per host per month, so I was very happy to run across this service and get at least a 70% savings. I say at least 70% because some of our host accounts were used at most 2 days per month, which, with Chime, will now cost a maximum of $6 per month.

High Availability for DB2 on AWS

If you're running an IBM product that requires a DB2 backend, you should really consider running DB2 on AWS. Here's a great article that provides you with the CloudFormation template to set it all up for you very quickly:

https://aws.amazon.com/blogs/database/creating-highly-available-ibm-db2-databases-in-aws/

If you're concerned about running your infrastructure in the cloud, please contact us so we can give you the information you need about the tight security and incredible flexibility that AWS provides.

Update 8/8/2023: One important note: this still uses TSA (Tivoli System Automation) for failover. What this means is that if your primary DB2 instance has a problem, the entire VM hosting that instance is rebooted. So if you have multiple databases on that instance, they're all failing over to the backup. If you have any other processes running on that VM, they're all going away. You can definitely architect your application to work fine with this, but it is definitely something you have to keep in mind before simply implementing DB2 HADR.

Monday, February 26, 2018

Netcool and other IBM ITSM products upgrades due to Java6 EOS

Problem

IBM has announced End of Support dates for quite a few products in 2018.  In many cases, this stems from the impending end of support for Java 1.6.  You can search for IBM products and the EOS date here: https://www-01.ibm.com/software/support/lifecycle/

Solution

Gulfsoft Consulting can help you move to a supported release in a short time period, or we can get you upgraded to a product with more features (like moving to NOI from Omnibus). We have helped hundreds of clients over the years upgrade and migrate in situations exactly like this.  The typical time needed is a few weeks, not months.  For more information contact:

frank.tate@gulfsoft.com 304 376 6183
mark.hudson@gulfsoft.com 816 517 7179

Details

Some of the products whose support ends in 2018 are:

Product                              Version              EOS Date
IBM Tivoli Monitoring                6.2.2                4/30/2018
IBM Control Desk                     7.5.x                9/30/2018
Tivoli Workload Scheduler            8.6.x                4/30/2018
Netcool Operations Insight           1.2, 1.3.x           12/31/2018
Network Mgmt                         9.2.x                12/31/2018
OMNIbus                              7.4.x                12/31/2018
Impact                               6.1.x                12/31/2018
IBM Tivoli Network Manager           3.9.x, 4.1.x         12/31/2018
Netcool Performance Manager          1.3.x                12/31/2018
Netcool Performance Flow Analyzer    4.1.x                12/31/2018
Network Configuration Manager        6.3.x, 6.4.0, 6.4.1  12/31/2018

IBM recommends upgrading to later versions of the products as soon as possible in order to maintain full support. After April 2018, support for Java™ 6 will be limited to usage and known problems with possible updates for critical security fixes through the end of 2018. After April 2018, WebSphere Application Server (WAS) 7 support will be limited to non-Java defects. Support for other components will continue as usual.

More information about Gulfsoft can be found here: https://www.gulfsoft.com/about

Friday, February 23, 2018

We've got a few open time slots for one-on-one meetings at #Think2018

The #Pink18 ITSM conference was a great success for us, and we're now looking forward to #Think2018. It's going to be a jam packed week, but we've still got a few time slots open to schedule sit down discussions with existing and potential clients and partners. Contact Frank (frank.tate@gulfsoft.com) or Mark (mark.hudson@gulfsoft.com) today to set up a 30-60 minute meeting!

Monday, February 19, 2018

#Pink18 is off to a Great Start

The #Pink18 ITSM conference kicked off last night with a reception, and it looks to be another great conference this year. Pink Elephant always has great thought leaders presenting at the sessions, and this year will continue that tradition.

If you're at the conference, please stop by our booth, #601, in the exhibitors showcase.

Friday, January 19, 2018

IBM Maximo named a Leader in Gartner Magic Quadrant for Enterprise Asset Management!

https://www.ibm.com/developerworks/community/blogs/a9ba1efe-b731-4317-9724-a181d6155e3a/entry/IBM_Maximo_named_a_Leader_in_Gartner_Magic_Quadrant_for_Enterprise_Asset_Management?lang=en

Something very important to note is that the IBM Control Desk product is built entirely on Maximo. So almost all of the features and capabilities that put Maximo into Gartner's Magic Quadrant for Enterprise Asset Management are also included in IBM Control Desk.

Tuesday, January 16, 2018

Gulf Breeze Software Partners is now Gulfsoft Consulting

Gulf Breeze Software Partners was started 15 years ago as a consulting firm that also had small aspirations to write software at some point. Along the way, we realized that we really prefer implementing and customizing software over writing new applications from scratch. And while we've advertised up to this point that we're specialists in the implementation and customization of the IBM suite of products, we're now marketing the fact that we have experience in and offer services on a much larger array of products from multiple vendors. To effectively market our capabilities to new customers, we decided to change our name from Gulf Breeze Software Partners to Gulfsoft Consulting. We've still got the same amazing people and the same drive to make customers successful, and now we've got a name that more accurately describes what we do. Here’s a link to some of the technologies we work with every day.


We look forward to continuing our relationships with existing customers and making new ones.

Monday, January 8, 2018

We will be at multiple conferences this year: Pink Elephant's Pink18, IBM's Think 2018, and ServiceNow's Knowledge18

We are growing in size and capabilities! That's why you'll see us at three great conferences this year:

Pink18 Feb 18-21, 2018 at the JW Marriott Orlando, Grande Lakes

Adopt, Adapt & Apply!
DevOps, Agile, Lean IT, ITIL® & More
The conference theme will be covered in 120+ sessions and 12 tracks to show how you can master the dynamics of today's business environments by adopting, adapting and applying tried-and-true best practices. Subjects include: ITSM, ITIL, Lean IT, Agile, Scrum, DevOps, COBIT®, Organizational Change Management, Business Relationship Management, and more!

Think 2018 Mar 19—22, 2018 Las Vegas, NV

Think Campus
100s of core sessions, Think Tanks, and opportunities to get out of traditional sessions. You’ll experience face to face conversations with experts and product developers, network with clients that face similar challenges as you, and meet with 100s of IBM strategic business partners.

Think Academy
1000s of labs, certifications, client and technical deep dives that provide the info you need to take back to your organization and make it your own.

Knowledge18 May 7-10, 2018 Las Vegas, NV

Spark Your Transformation
Create smart, delightful experiences for your employees and customers

Thursday, November 2, 2017

We're a sponsor at Pink18 in Orlando!

We'll be a sponsor of the Pink18 conference in Orlando, Florida Feb. 18-21, 2018. Come by booth #601 to see what we're offering or just to say hi.

Tuesday, October 24, 2017

How Netcool Operations Insight delivers cognitive automation by Kristian Stewart

https://www.ibm.com/blogs/cloud-computing/2017/08/netcool-operations-insight-cognitive-automation/

One important topic that Kristian omitted from his excellent article is the optional Agile Service Manager (ASM) component of NOI. ASM provides a context-aware topology view of your applications and infrastructure, which gives you a clear view of the impacts caused by events. Take a look at our other articles and YouTube videos for more information on ASM.

Friday, October 13, 2017

What to use instead of ITMSuper

ITMSuper is a JavaScript based tool that can be used for maintaining the health of your ITM 6.x environment. It was written by IBM and made available as a separate download, but was never completely supported. It's even less supported today, as it only supports Internet Explorer 8. Here is a great blog post from Shaun R at IBM pointing to different tools that you should use instead, all written by IBM's own John Alvord:

https://www.ibm.com/developerworks/community/blogs/0587adbc-8477-431f-8c68-9226adea11ed/entry/Helping_us_help_you_ITM_Bitesize_Edition_ITMSuper?lang=en

Wednesday, September 20, 2017

IBM Control Desk 7.6.0.3 is available

Introduction

IBM has released the ICD 7.6.0.3 FixPack:

https://www-945.ibm.com/support/fixcentral/swg/selectFixes?parent=ibm~Tivoli&product=ibm/Tivoli/IBM+SmartCloud+Control+Desk&release=7.6.0.2&platform=All&function=all&source=fc

Installation issues

On Linux, the service_portal.bin installer crashes unless you use the "-i silent" install option. See this link for more information:

New/Updated Functionality

Here are IBM's links for the new capabilities:


I've found a couple of additional things that have been fixed that I didn't find in the documentation. Specifically:

Service Portal

- Default values for specifications now completely work. This means that you can specify a default value for a specification for an offering, and you will see that default value filled in when you go to request that offering.

- Specifications tied to Table Domains now work again. This function existed in 7.6.0.1, but was broken in 7.6.0.2, and is now back. Specifically, if you define a specification to be tied to a table domain, when you click in the field for that specification in the offering, you'll get a popup with all of the possible values. 

- Users can add an attachment before hitting the Submit button. One thing this allows you to do is require an attachment through the use of an "Add to Cart" or "On Submit" validation script.

Update 9/25/17: Control Desk Platform

The Person Groups "Group Availability" Gantt chart works now. It was broken in 7.6.0.1 and 7.6.0.2, but it does work again in 7.6.0.3.

Monday, September 11, 2017

Force change of global system property in Maximo

UPDATE 6/3/2020

Another way to resolve this issue is to point to a local maximo.properties file as described here:


And put the mxe.report.birt.viewerurl property in that file.
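
That is, a single line like this one (using the corrected URL format from the Problem section below):

mxe.report.birt.viewerurl=http://myhostname.domain.name/maximo/report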

Introduction

I recently encountered an issue in one of my ICD 7.6 installations where a global system property had an incorrect value set that I needed to change without rebuilding my MAXIMO.EAR file. This post is a description of the problem and my eventual "fix". It's just a test environment, and this is NOT a resolution that I would recommend for a production system. But I wanted to document the details to possibly help others in similar situations.

Problem

I installed ICD 7.6 and chose to use the maxdemo DB2 database script during configuration. This apparently set the mxe.report.birt.viewerurl global property to
http://myhostname.domain.name/maximo/reports/ , and that is an invalid value. This system property should either be unset or set to
http://myhostname.domain.name/maximo/report   (with no trailing "s"). The problem that this causes is that any attempt to click on the "Run reports" action gives an HTTP 404 error. 

It took a while to run this down, but finally looking in System Configuration->Platform Configuration->System Properties showed me the setting for this system property:


Notice that I'm unable to modify the value AND "File Override?" is checked. So this means the value is set somewhere in the filesystem. Unfortunately, I couldn't find the value anywhere in any file on the system, so the only normal way around this is to modify maximo.properties on the Admin workstation, rebuild MAXIMO.EAR, then redeploy the EAR file. But I didn't want to do that for various reasons. Also, since "Global Only?" is set to true, I couldn't create an instance-specific property with the same name and different value. 

My "solution"

I tried several different tactics, but the one that finally worked for me was to directly update the database to set "User Defined?" true for this property so I could then delete it and create an instance-specific property with the same name. The SQL command I used to make this change was:

update maxprop set userdefined=1 where propname = 'mxe.report.birt.viewerurl'

After running the above SQL command from a DB2 command prompt, I could then create an instance-specific property with the same name but with the correct value. Once I did that, I was able to successfully run all* BIRT reports.
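
If you want to confirm that the flag actually flipped before deleting and recreating the property, a quick sanity check using the same table and columns as above is:

select propname, userdefined from maxprop where propname = 'mxe.report.birt.viewerurl'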

All* reports?

Actually, the demo database script has at least one problem. Specifically, the CI named RBA_SERVER has a problem that causes the "CI List" report to fail. To get around this issue, you need to first find and delete the WORKORDER that references the RBA_SERVER CI, and then you can delete the RBA_SERVER CI. Once you delete that CI, you'll be able to successfully run the "CI List" report.

Tuesday, September 5, 2017

Disabling IE Enhanced Security Mode on Windows 2012 Server

Here's a handy PowerShell script I found to disable IE Enhanced Security Configuration on Windows 2012 Server. This needs to be run as Administrator:


function Disable-IEESC
{
# Registry keys that control IE Enhanced Security Configuration for
# Administrators and for regular Users, respectively
$AdminKey = "HKLM:\SOFTWARE\Microsoft\Active Setup\Installed Components\{A509B1A7-37EF-4b3f-8CFC-4F3A74704073}"
$UserKey = "HKLM:\SOFTWARE\Microsoft\Active Setup\Installed Components\{A509B1A8-37EF-4b3f-8CFC-4F3A74704073}"
Set-ItemProperty -Path $AdminKey -Name "IsInstalled" -Value 0
Set-ItemProperty -Path $UserKey -Name "IsInstalled" -Value 0
# Restart Explorer so the change takes effect immediately
Stop-Process -Name Explorer
Write-Host "IE Enhanced Security Configuration (ESC) has been disabled." -ForegroundColor Green
}
Disable-IEESC

Friday, August 18, 2017

A new IBM Redbook on writing applications with Node.JS, Express and AngularJS

IBM just published another great Redbook, this time on application development with Node.JS, Express and AngularJS:

http://www.redbooks.ibm.com/redbooks/pdfs/sg248406.pdf

It describes the process on BlueMix, but it is applicable to a local application also.

What I like about it is the intricate detail it goes into for each and every step of the process and line of code in the application. It includes a ton of details about exactly what is going on with each step. If you're just learning these technologies or want a primer, this is an extremely informative resource.

Monday, July 31, 2017

Debugging Remote Control in IBM Control Desk

Introduction

One of the many great features in IBM Control Desk is the ability to have a service desk agent remotely take control of a user's machine for troubleshooting (or repair) purposes. This function leverages the IBM BigFix for Remote Control agent on the target machine and a JNLP file on the server that launches a JAR file on the agent's machine.

Architecture

The architecture is fairly simple. The JAR file running on the agent's machine communicates DIRECTLY with the BigFix Remote Control agent on the user's machine, which listens by default on port 888. This means that any firewalls between the agent's machine and the user's machine must allow a connection to port 888 on the user's machine.

Installing the Agent on the User's Machine

If you manually install the agent, it prompts you for the server name and port, but if you don't have BigFix in your environment, those values are ignored and can be anything you want. It also asks you for the port that the agent should listen on. This is 888 by default, but can be changed to anything you'd like.

Launching the Controller Interface in debug mode on the Agent's Machine

This can be done in several ways from the ICD GUI, but going that route doesn't actually allow you to put the Controller interface into debug mode. To do that, you need to copy the TRCConsole.jar file from any of your ICD application servers or from the Administrative Workstation. Search for the file by name and you'll find it. Copy this file to the agent's machine.

On the agent machine, you need to launch the JAR file with the --debug flag:

TRCConsole.jar --debug
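
(If .jar files aren't associated with Java on the machine, the equivalent invocation below should work; it assumes java is on the PATH:)

java -jar TRCConsole.jar --debug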

This will create a file named trctrace_<date_timestamp>.log in your %HOMEPATH% folder. This file will contain detailed tracing information that can be used for debugging.

Thursday, July 20, 2017

DevOps and Microservices Architecture done right - IBM Netcool Agile Service Manager

Introduction

Our last article described just how easy it is to upgrade any or all of the components of Agile Service Manager. This article is meant to describe some of the design, patterns and processes that had to go into the application itself to allow a two-command in-place upgrade.

Microservices Architecture

Yes, this is a trendy buzzword these days, and that's only part of the reason I'm using it here. In general, a "microservice" can describe almost anything you access over the web - a website, a document, etc. Absolutely anything. But one very important concept about microservices when designing an application is separating functions that then expose all of their capability through some interface. In ASM, there are several "services", each of which is implemented in a Docker container. Each one of these "services" actually provides a number of related "microservices", which are then exposed via URLs accessible through the host system. All of these services communicate with one another through the exposed microservices. 

ASM's strict separation of functionality allows a lot of flexibility in application development. For example, one service is the File Observer, which reads a file of a specific format containing topology information. Its main purpose is to convert that file into data that is then sent to the Topology service, which processes the data for its purposes and ultimately sends it to the Cassandra (database) service, which persistently stores the data on a filesystem that's available on the host and accessible via the Search/ElasticSearch service. Notice that this application pattern is very similar to existing application patterns, but in this case each service is provided by a separate container.

Containers vs VMs

Docker containers are MUCH smaller than full virtual machines. Additionally, Docker has defined and implemented numerous use cases that make containers easier to create, deploy, configure and orchestrate than VMs. So, while ASM could have been delivered as multiple VMs (via OVA files and some type of hypervisor-specific orchestration), the use of Docker containers makes deployment and management much simpler. A similar result could have been achieved with VMware vSphere, but IBM's Docker solution for this application seems sleeker to me.

Containers vs J2EE Applications in an App Server

In many ways, Docker can be seen as similar to a J2EE Application Server like WebSphere - it provides a common architecture with functions, capabilities and services that are shared among the applications/containers running within it. However, Docker containers can run applications written in any language you want - from Java to R to Haskell. Anything you can run on the host OS can be run inside a container. Containers can also be given strict resource limits for CPU usage, memory, file access, etc. To me, containers seem to be much more like atomic units than J2EE applications. 

As an example that I believe many people can relate to, an Application Server can be thought of as your browser, with each tab being an "application". It doesn't happen often, but one tab can crash your entire browser. Docker has been written specifically to avoid this with containers, which is a great thing.

IBM's Design Choices

IBM appears to have chosen this particular pattern in order to make the application as manageable as possible from both perspectives - development AND administration. When upgrading the components of the application, each service/container is basically free to do whatever it wants as long as it continues to adhere to its published REST interface (since REST is the only interface IBM has created for the services).

What does this have to do with DevOps?

DevOps requires frequent building and deploying, ideally in a manner that does not cause any regression test failures. The structure of this application wholeheartedly adheres to this requirement, and it is brilliant. 

I'm certain you can think of thousands of ways that this doesn't apply to some application that you deal with, but you should ignore those thoughts when thinking of the future. I truly believe that most, if not all, enterprise-scale applications will be rewritten using either this pattern or one that's extremely similar to it. And everyone in all areas of IT needs to be ready for the new opportunities and challenges that will come with it.

Wednesday, July 19, 2017

IBM Agile Service Manager application maintenance is very easy

The Agile Service Manager team has done an amazing job on installation and upgrade. If you've never managed an enterprise-scale application, the information in this post probably won't impress you much. But really, if you've never managed an enterprise-scale application, you probably quit reading our articles a long, long time ago.

So now that I've got a technical audience, here's the amazing thing:

I just received some updated ASM components from IBM. To install them took TWO COMMANDS:

yum install *.rpm

docker-compose up -d

THAT'S IT, and the new components are up and running, with the new functionality. I didn't even have to manually stop or start any processes. It was literally THOSE TWO COMMANDS. This, to me, is absolutely stunning, and hopefully a sign of more good things to come.

Docker Agent for IBM Workload Scheduler

IBM Workload Scheduler has Docker agents!



Wednesday, July 12, 2017

Using IBM Agile Service Manager and BigFix to obtain and display application communication topology data

Background

We've been working with a client who owns BigFix and Netcool Operations Insight, and who recently purchased the optional Agile Service Manager component of NOI. Up until now, we've been helping this customer obtain communication data (network/port/process connection information) in their environment through BigFix. A valid question you may have is: Doesn't TADDM do that and more? And the answer is yes it does, but the customer has some fairly severe obstacles that prohibit a successful deployment of TADDM.

Why are we doing this?

Any Operations group needs as much contextual information as possible to allow them to do their job effectively. Some of the information that Operations teams need is:

- Which systems are communicating with (dependent upon) Server X?

- What processes and applications are running on Server X?

- What is the impact to other systems if we reboot Server X?

etc. There are many, many more questions that come up, and often the best way to answer them is with a topology view of the environment. TADDM gives you this topology information, but again, this client is not able to install TADDM, so they wanted another way to get similar data.

How are we doing it?

The first challenge was getting the communication information via BigFix. With just a little searching, we realized that this was actually very easy. The 'netstat' command in both Windows and Linux will actually show you information about which ports are owned by/in use by which processes, and then it's just a matter of getting more details about each PID. Linux has the 'ps' command, and Windows PowerShell does too, though the output is different, of course. We also found that PowerShell has a few functions that will directly convert command output into XML. This is important because BigFix includes an XML inspector that lets you report on data that's in an XML file. On Linux, a little Perl scripting was used to accomplish the same goal.
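
Here's a hedged PowerShell sketch of the Windows side of that idea. It's not our exact script, just a minimal illustration: Get-NetTCPConnection (Windows 8/2012 and later) supplies the connections, Get-Process supplies the process details, and ConvertTo-Xml produces a document that BigFix's XML inspectors can read. The output path is a placeholder:

# Gather established TCP connections along with the owning process name
$conns = Get-NetTCPConnection -State Established | ForEach-Object {
    $proc = Get-Process -Id $_.OwningProcess -ErrorAction SilentlyContinue
    [PSCustomObject]@{
        LocalAddress  = $_.LocalAddress
        LocalPort     = $_.LocalPort
        RemoteAddress = $_.RemoteAddress
        RemotePort    = $_.RemotePort
        ProcessName   = $proc.Name
    }
}

# Write the results as XML for the BigFix XML inspectors to report on
($conns | ConvertTo-Xml -As String) | Out-File C:\temp\connections.xml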

So with the IP/port/process information in hand, we then needed to display that data in the ASM Topology Viewer. To do that, we used the included File Observer. Specifically, we wrote a script to create the appropriate nodes and edges so that this information can be displayed by ASM.

What's it look like?


Here you can see that a java process on client.gulfsoft.com has opened TCP port 40474 to communicate with a DB2 process listening on port 50000 on db2srv.gulfsoft.com.

Conclusion

Topology data is absolutely crucial to an Operations team for numerous reasons. In this case, we were able to provide this visualization to our client in a very short amount of time (a week or so) while leveraging software they already owned. They now have better insight into their environment and are better prepared to address events in it.

Thursday, July 6, 2017

A Windows command similar to awk

I'm always amazed at the capabilities available with built-in Windows command line tools. My latest find is the FOR /F command, documented here:

https://ss64.com/nt/for_cmd.html

My main use for the awk command on *NIX systems is to pull some piece of text out of a line of structured output. I know awk is MUCH more powerful and even has its own robust language, but that's what I've always used it for. And that's exactly what FOR /F gives you on Windows. The syntax is completely different, but the capability is there and it's quite powerful.
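
For example, here's a quick sketch that grabs the second comma-separated field from a string at an interactive prompt (inside a .bat file, double the percent signs, i.e. %%a):

C:\> for /f "tokens=2 delims=," %a in ("one,two,three") do @echo %a
two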

Friday, June 30, 2017

Now you can get started with Artificial Intelligence on a Raspberry Pi

Microsoft made its AI work on a $10 Raspberry Pi https://www.engadget.com/amp/2017/06/30/microsoft-made-its-ai-work-on-a-10-raspberry-pi/

Thursday, June 29, 2017

More IBM Netcool Agile Service Manager Videos

I think some wires got crossed in YouTube recently as IBM Service Management moved over to the IBM Cloud channel, and it appears that their most recent videos are hidden from any searches. However, thanks to Matt Duggan from IBM who shared the direct links on LinkedIn, I've added them all to my own IBM Agile Service Manager playlist, which can be found here:

https://www.youtube.com/playlist?list=PLxv2WlaeOSG9z_L4LCjHzz-qnZ-vDqnjn

Have fun