Wednesday, November 23, 2011
GBSCMD Performance Improvement Tip
Thursday, November 3, 2011
Working with the Deployment Engine
The OMNIbus documentation states that if you first use the DE as a regular user and then as root, all installs from that point forward will use only the DE instance that was installed by root (the global instance). It also states that if you uninstall the global/common instance, the DE is uninstalled everywhere.
The situation that brought me to this point is that I installed TBSM 6.1 on Linux as the user 'netcool' (so it uses a user-specific DE). I then tried to install CCMDB 7.2.1 on the same machine as the user 'root'. This failed early in the process, but not before a new global/common DE was installed. I gave up my CCMDB install dreams and proceeded to install an OMNIbus probe as the user 'netcool'. This gave me an error stating that I was using the global DE against an installation that had been performed with my user-specific DE, and that I should abort the installation. After reading the OMNIbus documentation above, I didn't want to uninstall the global DE, for fear that it would wipe out everything and I wouldn't be able to upgrade any products. However, since I had a copy of my VM, I gave it a shot. What I did was:
as root:
cd /usr/ibm/common/acsi/bin
export SI_JAVA_HOME=/usr/ibm/common/acsi/jre
./si_inst -r -f
This scared me a bit because it did COMPLETELY remove the /usr/ibm/common/acsi directory and killed all of the acsi processes ('ps -ef | grep acsi' showed nothing at this point). But my ~netcool/.acsi_* directories were still there (I don't know why, but I have two of these directories - ~/.acsi_netcool and ~/.acsi_myserverhostname). At this point, I re-ran the probe installation as user netcool (nco_install_integration), and I got no error messages, and the install information was added correctly to my local DE instance.
And the lesson I learned is that once you install any DE-based product on a machine as a non-root user, all of your subsequent DE-based installs need to be done as non-root users (it doesn't need to be the same user for different products, but you don't want to install anything DE-based as root).
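If you're not sure which DE instances already exist on a machine before you start an install, a quick check (using nothing more than the directory locations and process name mentioned above) is:
as any user:
ls -d /usr/ibm/common/acsi
ls -d ~/.acsi_*
ps -ef | grep acsi
If /usr/ibm/common/acsi exists, there is a global/common DE on the box; any ~/.acsi_* directories are user-specific instances; and the ps output tells you whether any DE processes are currently running.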
Monday, October 10, 2011
Tips on the ITM Agent for Maximo (product code MI)
https://www-304.ibm.com/support/docview.wss?q1=itm&rs=3214&uid=swg24025477&context=SSLKT6&cs=utf-8&lang=en&loc=en_US
The instructions for installing the agent assume that you have Maximo, your ITM TEMS, and your ITM TEPS all on the same Windows machine, which I imagine would not be the case for most customers. You *can* install the TEMS and TEPS support on a non-Windows machine using the following commands:
installIraAgentTEMS.sh /opt/IBM/ITM
installIraAgentTEPS.sh /opt/IBM/ITM
Where the first parameter is your ITM install directory. You can then install just the agent on a Windows machine with:
installIraAgent.bat C:\IBM\ITM
The agent itself is ONLY installable on Windows. However, this can be ANY Windows machine you want - it only needs to be able to access your Maximo server via URL. NOTE: The agent does get some information from the Log file dir that you specify; if you install the agent on a machine that is not your Maximo server, this data will not be available. (I'm not certain exactly what information it gets from the logs.)
A BIG caveat of the agent is that you CAN NOT use it if you have configured Application Server Security for authentication and authorization with Maximo. (I didn't test out the scenario of configuring Application Server Security only for authentication due to time constraints). So you can only use the agent to monitor a Maximo installation that is configured to use Maximo security.
The next tip has to do with configuration. When you configure the agent, you're required to provide a few pieces of information:
Instance Name: Do NOT use "maximo" as the value! I found this out the hard way - it simply doesn't work if you do this. I used "MXServer", but it looks like you can use anything OTHER than "maximo".
Log file dir: This is the location of your application server log files. For example:
/opt/IBM/WebSphere/AppServer/profiles/ctgAppSrv01/logs/MXServer
Port: This is the port you use to access Maximo. The default is 7001, which is the default HTTP port for WebLogic. If you're using WebSphere, you should change this to 9080 for HTTP access (or 9443 for HTTPS). (A quick way to verify the port is shown after this list.)
Java Home Directory: This can be set to any Java 1.5 (or above) install location on the system. I set mine to:
E:\IBM\SMP\sdk
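One more note on the Port value above: if you're not sure which port your Maximo instance answers on, a quick sanity check from the machine where you plan to install the agent is to request the Maximo URL. The hostname below is just a placeholder, and I'm assuming the default /maximo context root:
curl -I http://mymaximohost:9080/maximo
(or just open that URL in a browser). If you get a response, that's the port to use.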
Another tip is that you do NOT need to configure Maximo Performance Monitor for the agent to work.
The last tip is on usage, once you get the agent up and running. Let it run for several minutes before assuming it's not working correctly; it just takes a few minutes to capture some of its data. Once it's up and running correctly, the table in the Performance Object Status workspace should look similar to this:
SysInfo SysInfo ACTIVE NO ERROR
DataBaseInfo DataBaseInfo ACTIVE NO ERROR
InstalledApps InstalledApps ACTIVE NO ERROR
License License ACTIVE NO ERROR
InstalledProducts InstalledProducts ACTIVE NO ERROR
DBConnections DBConnections ACTIVE NO ERROR
MemGaugeForAllSrvrs MemGaugeForAllSrvrs ACTIVE NO ERROR
RuntimeMXBean RuntimeMXBean ACTIVE NO ERROR
Memory Memory ACTIVE NO ERROR
MemoryGauge MemoryGauge ACTIVE NO ERROR
MBOCountGauge MBOCountGauge ACTIVE NO ERROR
UPSGauge UPSGauge ACTIVE NO ERROR
CronTasks CronTasks ACTIVE NO ERROR
EscalationErrorLog EscalationErrorLog ACTIVE NO INSTANCES RETURNED
Sunday, October 2, 2011
PLAYterm: a New Way To Improve Command Line Skills
I think this is a great resource for Windows people learning UNIX/Linux, and also for Linux people who just want to learn about some new commands.
Thursday, September 29, 2011
How to create a lock on a DB2 table
Monday, September 26, 2011
Tivoli Common Reporting Security - Removing users from administrator roles
Tuesday, September 6, 2011
Fixing perl's CPAN on CentOS
Undefined subroutine &Compress::Zlib::gzopen called at /usr/lib/perl5/5.8.8/CPAN.pm line 5721
When working with Tivoli software, it is helpful to use virtual machines. You can save images for different app/version combinations and retrieve them when you need to develop or test something. CentOS is a free rebuild of Red Hat Enterprise Linux, so it works really well when you have to install multiple VMs but don't want the hassle of tracking RHEL licenses. Sometimes you may have to modify various file contents to look more like RHEL itself, but in general CentOS does the trick.
Recently, I was attempting to write a perl script to parse XML files. I chose to download the XML::LibXML module because it is very flexible and pretty fast. I started CPAN with:
user@system> cpan
cpan> install XML::LibXML
but then I got the 'Undefined subroutine...' error above. I tried running CPAN with the alternate command:
user@system> perl -MCPAN -e shell
I also tried to install other packages (such as DBD::DB2), but they generated the same error. I have been using the same CentOS 5.5 image for a couple of years, so it made sense to update the perl packages. Same error.
After some Google research, it appears that similar errors (reported against different packages and line numbers) can have the same underlying cause.
It took a while to piece together this solution in steps, so hopefully this can save someone else a little time.
1. Install yum-utils:
user@system> yum install yum-utils
(This package contains the yum-complete-transaction executable, which does what its name says.)
2. Get libxml2:
user@system> yum install libxml2-devel
user@system> yum-complete-transaction
(notice it's yum-, NOT the normal yum with a space)
3. Update software packages:
But use yum from the command line to do it instead of the built-in CentOS 'Software Updater'. Run this:
user@system> yum update
It will outline all the available updates and ask if you want to execute them. Go ahead and say yes. There may be libraries in some of those packages that will be required to build perl modules. (In a single run, I had 242 installs and 242 removes, and it completed all of them. In my previous attempts to do the same thing from the Software Updater, the Package Manager would hang every time.)
user@system> yum-complete-transaction
(this will just make sure they're all done)
4. Run CPAN:
user@system> perl -MCPAN -e shell
Within CPAN, run these commands in this order:
cpan> force install Scalar::Util
cpan> force install IO::Compress::Base
cpan> force install Compress::Raw::Zlib
cpan> force install IO::Compress::Gzip
cpan> force install Compress::Zlib
After running all of these, CPAN should run just fine. Go ahead and install any perl packages you want.
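As a quick sanity check, you can confirm that Compress::Zlib now loads and then retry the module that failed in the first place (this is just the verification I ran; adjust the module names for whatever you were installing):
user@system> perl -MCompress::Zlib -e 'print "Compress::Zlib $Compress::Zlib::VERSION\n"'
user@system> cpan
cpan> install XML::LibXML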
Saturday, August 27, 2011
Verify the CloudBurst 2.1 Tivoli software stack
The IBM CloudBurst 2.1 appliance performs many complex configuration and integration tasks; much of this complexity is hidden by the interface to the appliance, giving the user a limited view of the entire configuration and its integration points. But a user may need to verify or re-verify the software stack when the environment changes (for example, when restoring backup images in a disaster recovery scenario), when modifying hardware configurations (such as adding new blades), or when changing software configurations (such as adding new networks with VLAN tagging). In this article, the author provides a quick guide to verifying the IBM CloudBurst 2.1 Tivoli software stack.
Friday, July 15, 2011
Emailing Reports in TCR
- First make sure that you have configured TCR for emailing, using tcr_cogconfig.sh or the "Cognos Configuration" application on Windows.
- To email a report, click the "Run with Options" icon for the report. It is a green arrow icon on the same row as the report name.
- Click the "To specify a time to run the report, or additional formats, languages or delivery options, use advanced options" link that appears to the right.
- On the advanced options page, select "Run in the background" and "Now" under Time and Mode.
- Choose the appropriate format, such as PDF.
- Under Delivery, uncheck the save option.
- Under Delivery, check "Send the report by email" and click "Edit options" next to it.
- On the "Set Email Options" page, set the email recipients (separated by commas). Edit the subject and body if necessary.
- Ensure that "Attach the report" is checked. Alternatively, you can send the recipients a link to the report in TCR. Click OK.
- Ensure that "Prompt for values" is checked, then click Run.
- You will be prompted for any report parameter values; enter them and click Finish.
- Finally, click OK to confirm, and the report will be generated and emailed to the recipients.
Hope this helps.
Wednesday, June 29, 2011
Importing Custom Images in TCR Cognos Reports
- First assemble the custom images that you need to include. These images must be in JPG or GIF format.
- Copy these images to the following directory location in TCR. Alternatively, you can create a subdirectory under that directory and put your images in the subdirectory.
<installdir>/../tipv2/profiles/TIPProfile/installedApps/TIPCell/IBM Cognos 8.ear/p2pd.war/tivoli
- Important: Also copy the images to the <installdir>/../tipv2Components/TCRComponent/cognos/webcontent/tivoli directory (or the matching subdirectory under it). A copy-command example follows this list.
- Now you can drag and drop image objects into your report designs in Report Studio. After dropping an image object, right-click it and select "Edit Image URL".
- Specify the image URL as "../tivoli/mylogo.jpg" if your images are located in the tivoli folder as in the example above. If your images are in a subdirectory of the tivoli folder in the two copy steps above, modify the URL to include the subdirectory name.
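For reference, here is what those two copy steps look like from the command line on a UNIX TCR server. The file name /tmp/mylogo.jpg is just an example, and <installdir> is the same placeholder used above:
cp /tmp/mylogo.jpg "<installdir>/../tipv2/profiles/TIPProfile/installedApps/TIPCell/IBM Cognos 8.ear/p2pd.war/tivoli/"
cp /tmp/mylogo.jpg "<installdir>/../tipv2Components/TCRComponent/cognos/webcontent/tivoli/"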
That's it, Happy Reporting!
Tuesday, June 21, 2011
Monday, May 23, 2011
ITM Dynamic Thresholds - NYTUG presentation 5/24/2011
Tuesday, May 3, 2011
The 10 commandments of good source control management
http://www.troyhunt.com/2011/05/10-commandments-of-good-source-control.html
(There are just a few R-rated words, but nothing egregious.)
Friday, April 29, 2011
Wednesday, April 27, 2011
Friday, April 22, 2011
Interesting information on Tivoli's Cloud initiatives
If you're new to Tivoli's Cloud movement, I think the best way to get use out of this paper is to just read about the products that are involved in the total solution. If your company is moving toward the cloud, knowledge of those components will definitely help you.
Tuesday, March 29, 2011
Passing TCR UserID in BIRT Reports
- Select a blank area in the report. This should display the report properties in the Property Editor.
- Now click the "Script" tab for the report, displayed at the bottom of the main work area (where the Preview/Layout tabs are).
- In the script drop-down, select "Before Factory" and paste the JavaScript code below.
// Key under which TCR puts the logged-in user's info into the BIRT app context
TCR_IUSER = "com.ibm.tivoli.reporting.api.reportEngine.IUserInfo";
userInfo = reportContext.getAppContext().get(TCR_IUSER);
userName = "unknown"; // fallback if the report is run outside of TCR
if (userInfo != null) {
    userName = userInfo.getUserPrincipal();
}
- Now you can use the userName JavaScript variable in your reports to identify/display the TCR user.
- For example, to display the user name, insert a "Dynamic Text" item anywhere in your report and enter the following expression: "User name = " + userName
Thursday, March 17, 2011
IBM Service Management YouTube channel
http://www.youtube.com/user/ismconnect?feature=mhum#p/c/5C4BC71AD2C77801
A great tutorial on ITCAMfT integration with TBSM
Wednesday, March 9, 2011
BitLocker on Windows 7
What is BitLocker?
Windows Vista and Windows 7 include BitLocker, which allows you to encrypt the drive.
Deployment Problem:
According to the Info Center documentation, OSD is BitLocker ready. Well, not really. The idea is that OSD can create a partition that allows BitLocker to be activated. The problem is that when OSD creates the partition, it assigns a drive letter to it, and BitLocker requires that this partition not have a drive letter.
Solution:
As of Windows 7 (and Vista SP1(?), but who cares), Microsoft includes a tool called bdehdcfg.exe that can take any partition, shrink it by a certain amount, and prepare it for BitLocker. BitLocker requires a minimum of 100MB, or 300MB if you also want the recovery console (for Vista this is 1.5GB). To do this, just use a software module that is deployed with the image to execute the bdehdcfg command.
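As a rough sketch, the command the software module runs would look something like the line below. The target and size values are assumptions for a 300MB recovery-console-capable partition; double-check the available options against the bdehdcfg help on your build before relying on them:
bdehdcfg -target default -size 300 -quiet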
One thing to note with this solution: when the image is deployed, you will end up with a larger partition than expected. The reason is that when the bdehdcfg command is executed, the partition ends up being created at the end of the drive, and when OSD completes, it takes its cache partition (about 500MB) and adds it to the last partition on the drive. So if you have bdehdcfg create a 300MB partition, you will end up with an approximately 800MB partition. Currently the only way around this is to have bdehdcfg execute after the OSD deployment is completed.
BitLocker sounds simple enough to implement, but there are some things to think about that will impact the business:
- The PIN is used to provide an additional level of security to the BitLocker process. The PIN is set for the computer, not for the user(s) of the computer, so if there are multiple users of the system, they all share the same PIN.
- The PIN can only be set by someone with Administrative access. (I have not personally confirmed this, but I was informed of it by an engineering group, so if this is incorrect, please let me know and I will remove it.)
- There is no native method to enforce a password expiry of the PIN
- BitLocker can be disabled/paused by anyone with administrative access, thus leaving the system unprotected.
- You will need processes in place for when users forget their PIN (you know it will happen) and need to be given the recovery password. This is possibly the hardest part, depending on the users and the number of users.
On the plus side:
- It is free, so you are able to implement encryption without additional software expense
- When protected, the encryption seems to be as good as any
- Encrypting a drive is relatively quick compared to other vendors
- Recovering a drive is simple as you just need the recovery password from Active Directory
- Did I mention it was free?
Hope this helps you out :)
If you have any other topics you would like covered, send me a note at martin dot carnegie at gulfsoft dot com.
Deploying Windows 7 with TPMfOSD
Recently I have been involved in using TPMfOSD to capture and deploy Windows 7 images. There is quite a bit of information available on the web and in IBM’s Info Center, but at times we found that certain areas are not covered completely enough.
I have been working through the Devworks site with various people and thought I would also give back some information. Since this was too big for Devworks, I thought a blog would be best.
At a high level, here is what I did:
1. Importing Windows 7 DVD for Unattended Install
2. Preparing the OS Configuration for Unattended Install
3. Deploying the Unattended Install
4. Customizing Master Image
5. Executing sysprep
6. Capture Clone Image
7. Modifying the OS Configuration for Clone Install
8. Deploying the Cloned OS
For my environment, I am using VMware Workstation to create my profile. There are many advantages of using VMware rather than physical hardware such as:
1. The image does not contain any drivers for the physical hardware. Windows 7 can be installed on VMware with almost no extra drivers (depending on the vm hardware defined)
2. Simple and quick to restore an image with the snapshots rather than using OSD to capture the “Golden Master”
3. Multiple snapshots can be created to backup and restore during various stages
4. The restore of an image can be done to any system that has VMware installed, as long as the hardware is setup the same. So the VM image can be built on Lenovo/HP/Dell/etc hardware
When using VMware, I also add the setting bios.bootdelay=15000 to the .VMX file to allow time to press the F12 key or ESC for the boot menu.
Before starting on this, one big note is around the built-in Administrator name that is used. When installing Windows 7, you are prompted to create an id that will be an administrator on the system. When this user is created, it will be added to the Administrators group and the built-in Administrator will be disabled. In order to get the built-in Administrator enabled, you need to set the Administrator name in the OS profile to “Administrator” (it has to be this no matter what you want the id to actually be). For this example, I will be changing the built-in Administrator to “myadmin” and show how to make this work.
1. Importing Windows 7 DVD for Unattended Install
This was fairly simple. Just use the New Profile > Unattended Setup and walk through the wizard.
Info Center documentation:
2. Preparing the OS Configuration for Unattended Install
Once the import is complete, open the OS configuration, go to the Windows tab and set the "Administrator Name:" field to Administrator. Also verify that the time zone is set. If you are using volume licensing, then select the “Volume licensing” option. If not, then set the serial number.
3. Deploying the Unattended Install
After the unattended install system profile is created, it can be deployed to a target system in order to create the clone profile. The methods to deploy an unattended or cloned profile are exactly the same. The big difference is the time for installation: the unattended install takes significantly longer to complete than a cloned image.
Info Center documentation:
4. Customizing Master Image
There are many options to configure in the image such as included software, user ids, local policies, etc. Also remember that software modules can be used to customize an image after deployment, so make sure what is included will not require you to make more updates to the image than necessary.
Some of the deciding factors for what to do in the image vs in a software module:
- Will the software take too long to deploy in a software module? For example:
- MS Office: this product takes far longer to install at deployment time than it does to include in the image.
- Adobe Flash: this product is quick to install but is updated quite regularly, so it is probably better to have it in a software module.
- Antivirus applications: since these are core to protecting the corporate environment, they should be in the image, because a failure installing the software module would leave a system unprotected.
The Windows 7 image is quite large even without any software installed, so whatever can be cleaned up to minimize the image is a good idea. Typically I would include removing any patch backups in that cleanup, as this can shrink an image by 1GB or more.
As stated, I have changed my Administrator (SID 500) account to myadmin. This is a typical configuration that most sites will do. There are a couple “quirks” that happen when you do this:
- After the change, the user directory on the system will be C:\Users\Administrator. When you deploy the image, the directory will be changed to C:\Users\myadmin. You cannot change the directory name on the original image (you can Google it).
- As stated earlier, when setting the OS Configuration in step 7, you have to set the Administrator Name to “Administrator”. If you do not, the system will be deployed with the “myadmin” account, but it will not be the SID 500 account; it will just be an id in the Administrators group, and the SID 500 account will be called Administrator and will be disabled. When set correctly, “myadmin” will be the SID 500 account and another account called “Administrator” will be added to the Administrators and Users groups. For my deployment, I included a software module that would remove that extra account from both groups and disable it (a sketch of such a module is shown below).
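A minimal sketch of what that software module can run is below. This assumes an English-locale system where the leftover account is literally named "Administrator"; verify the account and group names in your image first:
net localgroup Administrators Administrator /delete
net localgroup Users Administrator /delete
net user Administrator /active:no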
Another issue that I ran into was that I had deleted the C:\install directory, which is created by the unattended install. When deploying the image to a target, the C:\install directory would be re-created, but software modules executed later in the build process would not run. This is being addressed in a future fix (not in FP04). To work around this issue, just leave the C:\install directory in the image.
5. Executing sysprep
Once the unattended install is complete, the system can be configured with any corporate software and settings. After all configurations are completed, the next step is to use the Microsoft tool called Sysprep. This tool removes system-specific configuration so that an image can be cloned to different systems.
http://technet.microsoft.com/en-us/library/cc783215%28WS.10%29.aspx
Unlike Windows XP, sysprep is already included in Windows 7 and is located in C:\Windows\System32\Sysprep. The options I select are OOBE, Generalize, and Shutdown. I prefer Shutdown because I do not want to miss the reboot and have the mini-setup run again.
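If you would rather script it than click through the Sysprep GUI, the command-line equivalent of those options should be the line below (run from an elevated prompt; confirm the switches with sysprep /? on your build):
C:\Windows\System32\Sysprep\sysprep.exe /oobe /generalize /shutdown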
Info Center documentation:
Notes:
A system that is joined to a domain cannot be used for creating a cloned profile. If the system has been joined to a domain, then it has to be moved to workgroup mode.
- Some extra recommended tasks are (a rough command-line sketch of these follows the notes below):
- Empty recycle bin
- Execute chkdsk to ensure there are no disk errors
- Clean out temporary files
- Remove any persistent drive mappings
- Clear the Application, Security and System event logs
- Sysprep still has the limit of being executed 3 times in Windows 7.
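Here is a rough command-line sketch of the cleanup tasks listed above, assuming an English-locale Windows 7 system and an elevated prompt; whether you script these or do them by hand is up to you:
chkdsk C:
del /f /s /q %TEMP%\*
rd /s /q C:\$Recycle.Bin
net use * /delete /y
wevtutil cl Application
wevtutil cl Security
wevtutil cl System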
6. Capture Clone Image
Capturing the Windows 7 OS is no different than the methods used for any other operating system. The process is quite a bit longer than Windows XP and requires more reboots, but overall the whole process is the same.
Info Center documentation: http://publib.boulder.ibm.com/infocenter/tivihelp/v3r1/index.jsp?topic=/com.ibm.tivoli.tpm.osd.doc/deploy/tosd_clone_win.html
7. Modifying the OS Configuration for Clone Install
Once an image is imported, the OS configuration will need to be set. The OS Configuration is where you use OSD to set the parameters that will be used in the unattend.xml file. The UI will allow for the configuration of many of the common settings, but if there are more that are required, use the “Edit custom unattend.xml” on the General tab. When setting the OS configuration, the most important item to set is the “Administrator Name” to “Administrator”. This is done by opening the properties for the OS configuration and going to the Windows tab. Also on this tab in the “System Customization”, check the setting “Always authorize installation of unsigned drivers”.
8. Deploying the Cloned OS
Deploying the Windows 7 OS is no different than the methods used for any other operating system. The process is quite a bit longer than Windows XP and requires more reboots, but overall the whole process is the same. One thing that did happen in Windows 7 and not XP is that OSD actually logs into the OS. This causes some issues with scripts that may be in the run/runonce/startup.
Info Center documentation:
Other Notes:
TPMfOSD started supporting Windows 7 in 7.1.1.1, but that version and 7.1.1.2 use WinPE2. There are some pretty significant improvements in 7.1.1.3, or better yet 7.1.1.4, as they use WinPE3 for deployments. If you have not started, or are just starting, then move to one of these versions. There are other reasons for moving to these newer versions, but this is one of the most visible from a deployment perspective.
Conclusion
As noted, this is a fairly high-level look at using OSD for Windows 7 deployments, but it should start you on the right path.
Remember, we at Gulf Breeze Software Partners are ready to help you with your implementations of TPMfOSD or any IBM Tivoli product.
If you have any other topics you would like covered, send me a note at martin dot carnegie at gulfsoft dot com.
Thursday, March 3, 2011
Monday, January 31, 2011
GbsTask - A Task Management Utility for ITM
Features:
- A simple, database-driven tool to create/update/delete/execute tasks.
- Tasks can be executed on individual OS agents or on ITM MSLs.
- Tasks can be executed in a multi-threaded manner across agents of different platforms.
- Supports SQLServer or DB2 databases to store task information.
- Authorization information is kept in a separate file and can be specified with the -a switch, so you don't need to specify the password in your scripts.
- The maximum number of threads is limited by the maximum number of "tacmd"s that can be run in parallel; running more than this limit could cause stability issues. As of ITM 6.2.2 FP2, that maximum is 10.
- Currently the tasks can be executed only against Windows, Linux and Unix OS agents.
Requirements:
- JRE 1.5 or later. (The code will NOT work with JRE 1.4.)
- JDBC driver for your database.
- Tacmd CLI. (The CLI is installed with an OS agent installation or ITM TEMS installation).
- A SQL Server or DB2 database where you can create a table to contain task information.
Example usage:
C:\temp>java -jar GbsTask.jar -a db2.auth -c -l mylib -t pingtask -o Linux -f C:\temp\test.sh
C:\temp>java -jar GbsTask.jar -a db2.auth -c -l mylib -t pingtask -o Windows -f C:\temp\test.bat