Showing posts with label Database.

Hot reset sequence without restarting Maximo

One error we often have to deal with is an incorrect sequence when adding new data to Maximo. It can come up in many situations, such as:

  • When loading data using MXLoader, or inserting data directly via SQL
  • Sequence corruption in Production due to an unknown cause, probably errors from a cancelled/terminated job
  • Restoring the database from a copy, or after an upgrade.

When this happens, the user sees a duplicate key error such as “BMXAA4211E - Database error number 2601 has occurred…
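The root cause is always the same: the sequence's next value has fallen behind the ids already in the table. A minimal sketch of the fix logic (the table, column, and sequence names here are purely illustrative):

```python
# Sketch: computing a safe restart value for a Maximo sequence whose
# next value has fallen behind the data already in the table.
def next_safe_seed(existing_ids, sequence_next):
    """Return the value the sequence should restart with so the next
    insert cannot collide with an existing unique id."""
    max_used = max(existing_ids, default=0)
    if sequence_next > max_used:
        return sequence_next   # sequence is already ahead; nothing to do
    return max_used + 1        # jump past the highest id in use

# e.g. rows were loaded via MXLoader/SQL without touching the sequence:
ids_in_table = [1001, 1002, 1050]   # say, WORKORDERUID values
print(next_safe_seed(ids_in_table, sequence_next=1003))
# The fix is then a statement along the lines of:
#   ALTER SEQUENCE WORKORDERSEQ RESTART WITH <that value>
```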


My failed attempt to get Maximo to work with Azure SQL database

Recently, I started playing with Azure by attempting to migrate a Maximo instance from my local VM to the Azure platform. Although the attempt was a success, I didn’t initially realize that SQL Server and Azure SQL are different databases (or more precisely, two different versions of the same product). There were a few issues during the process, but I figured out how to work around them and got Maximo running on an Azure VM and Azure SQL. After sharing the result on LinkedIn, there were some comments saying that Maximo couldn’t be installed on Azure SQL and that IBM doesn’t support it, so I spent a bit more time digging and thought I should share the details and some of my opinions on this matter.

First, let’s be clear: Azure is a big cloud platform which offers many different services. I’m not a cloud expert, but from what I understand, we are talking about two main services here:


How to modify (almost) any Maximo data with no database access


Being Maximo consultants, we often find ourselves in a situation where the client gives us MAXADMIN access to the system but access to the database is an absolute no-no. This is usually the case with companies that have a clear separation between the App Admin and DB Admin roles. It is also one of the key restrictions with Maximo as a Service.

If you have been doing a bit of admin and config activities, you will surely understand the limitation of having no database access. It’s like having to work with tied hands. Luckily, we can use MXLoader to query/update almost any data table in Maximo. Below is an example of how to do it.

Let’s say we’re working with Maximo SaaS, and IBM only gives us front-end admin access with no access to the back-end.
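Under the hood, MXLoader simply posts SYNC/QUERY messages against an object structure’s web service. A rough sketch of the kind of message involved (MXPERSON is just an example here, and the real envelope and namespace details depend on your Maximo version):

```python
# Sketch of the kind of message MXLoader sends under the hood: a SYNC
# request against an object structure (MXPERSON is illustrative only).
def build_sync_payload(object_structure, mbo, fields):
    rows = "".join(f"<{k}>{v}</{k}>" for k, v in fields.items())
    return (
        f'<SYNC{object_structure} xmlns="http://www.ibm.com/maximo">'
        f"<{object_structure}Set><{mbo}>{rows}</{mbo}>"
        f"</{object_structure}Set></SYNC{object_structure}>"
    )

payload = build_sync_payload("MXPERSON", "PERSON",
                             {"PERSONID": "SMITH", "STATUS": "ACTIVE"})
print(payload)
```

This is why the spreadsheet columns in MXLoader map one-to-one to attributes of the object structure: each row becomes one MBO element in the message.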

Solving problems with the UpdateDB process when installing or upgrading Maximo


When upgrading Maximo or installing new add-ons or fix packs, new source files are copied to the SMP folder, including Java classes and DBC (database configuration) script files. After that, the installer runs the UpdateDB process to update the database, runs the BuildMaximoEar process to build the EAR file, and then deploys the EAR file to WebSphere.

The DBC script files contain incremental changes to the Maximo database which apply changes to the GUI, update data, and modify DB configuration objects. Most of the problems you get when installing fix packs or upgrading Maximo come from the UpdateDB process, which executes these DBC files in a set order.
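Conceptually, UpdateDB is a small loop: find the DBC scripts for each product, sort them numerically, and run the ones newer than the last script recorded in the database. A simplified sketch (the file-name pattern and bookkeeping are illustrative, not the exact internals):

```python
# Sketch of what UpdateDB conceptually does: run a product's DBC scripts
# in numeric order, starting after the last one already applied.
import re

def scripts_to_run(dbc_files, last_applied):
    """dbc_files: file names like 'V7600_03.dbc';
    last_applied: highest script number already run (recorded in the DB)."""
    numbered = []
    for name in dbc_files:
        m = re.match(r"V(\d+)_(\d+)\.dbc$", name)
        if m:
            numbered.append((int(m.group(2)), name))
    # a plain lexicographic sort would put V7600_10 before V7600_03,
    # so sort by the parsed script number instead
    return [name for n, name in sorted(numbered) if n > last_applied]

print(scripts_to_run(["V7600_02.dbc", "V7600_10.dbc", "V7600_03.dbc"],
                     last_applied=2))
```

This ordering is why a single failed DBC script stalls the whole process: everything after it in the sequence is blocked until it succeeds.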

Bulk upload images via Integration Framework



In the previous post, I provided an example of how we can customize an Object Structure to enable import/export of binary data via MIF. In Maximo 7.6, the automation scripting framework has been greatly extended to support integration. With this update, we can enable import/export of binary data by adding a simple script, without having to write and deploy custom Java code. Below is an example of how we can configure Maximo 7.6 to bulk upload images to the Item Master application:

Step 1: Add an Object Structure integration script
  • Open the System Configuration > Platform Configuration > Automation Scripts application
  • On the Select Action menu, choose Create > Script for Integration
  • On the Create Script for Integration pop-up, enter the following details:
    • Select “Object Structure”
    • Choose “MXITEM” for Object Structure
    • Select “Inbound Processing”
    • Language: Python
    • Paste the following piece of code to the Source Code text area:


    • Click on Create, then save the script
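Once the inbound script is in place, the client side of a bulk upload is just a matter of base64-encoding each image file into the message posted to the MXITEM object structure. A rough sketch of building such a message (the IMAGEDATA element name is an assumption for illustration; use whatever alias your inbound script actually reads):

```python
# Sketch: client-side message for bulk-uploading one item image.
# The IMAGEDATA element is a hypothetical alias handled by the
# inbound integration script on the server.
import base64

def build_item_image_message(itemnum, image_bytes):
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return (
        '<SyncMXITEM xmlns="http://www.ibm.com/maximo"><MXITEMSet><ITEM>'
        f"<ITEMNUM>{itemnum}</ITEMNUM><IMAGEDATA>{encoded}</IMAGEDATA>"
        "</ITEM></MXITEMSet></SyncMXITEM>"
    )

# in a real bulk load, loop over the image files in a folder
msg = build_item_image_message("PUMP-100", b"\x89PNG...")
print(msg[:60])
```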

Creating a high-performance service using MaximoCache

Sometimes in our application, we need to build custom services that run when Maximo starts. We can extend the psdi.server.AppService class and register it with MXServer by inserting a new entry into the MAXSERVICE table. If the service executes slow-running queries, it is a good idea to cache the data in memory to improve performance. We can implement the MaximoCache interface for this purpose. By doing this, we can initialize the service when MXServer starts and pre-load all the data required by the service into JVM memory. When the service is called, it only uses cached data to provide an instant response, which gives a much better user experience.

Below are the steps to create a sample service that loads all Location descriptions into memory. The service will provide a function to check whether an input string matches a location’s description. We will call this check when the user enters an Item’s description, and it will throw an error whenever the input matches the description of any existing Location. This is not a very realistic use case, but for the sake of simplicity, I hope it gives you an idea of how it can be implemented.
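The real implementation is Java (extending psdi.server.AppService and implementing MaximoCache), but the caching pattern itself is language-agnostic. As a minimal Python sketch of the idea, with made-up names:

```python
# Sketch of the MaximoCache pattern: run the slow query once at startup,
# then answer every subsequent check from memory.
class LocationDescriptionCache:
    def __init__(self, load_descriptions):
        # one slow SELECT at init time instead of one per validation call
        self._descriptions = {d.strip().upper() for d in load_descriptions()}

    def matches_existing_location(self, text):
        # pure in-memory lookup; no DB call on the validation path
        return text.strip().upper() in self._descriptions

# pretend this lambda is the slow query against the LOCATIONS table
cache = LocationDescriptionCache(lambda: ["Main Warehouse", "Boiler Room"])
print(cache.matches_existing_location("main warehouse"))
print(cache.matches_existing_location("Pump Station"))
```

The trade-off is the usual one for caches: the data must be reloaded (or invalidated) when locations change, which is what the MaximoCache reload mechanism is for.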


MboSet performance, Memory Cache, and DB Call


I recently had to look at ways to improve the performance of a custom-built operation in Maximo. Essentially, it is one of the many validation operations that take place after a user uploads a PO with a few hundred thousand lines. The team here had already built a high-performance engine to handle the process, but even with it, this particular operation still took around 15-17 milliseconds per line, which is too slow by their standards. To process 200,000 PO lines, it would take nearly an hour for this operation alone. With a few dozen of these operations to execute, plus other standard Maximo operations like status change or save, the whole process takes many hours.
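The "nearly an hour" figure follows directly from the per-line cost:

```python
# Sanity check of the arithmetic: ~16 ms per line over 200,000 lines
lines = 200_000
ms_per_line = 16                      # midpoint of the observed 15-17 ms
minutes = lines * ms_per_line / 1000 / 60
print(round(minutes, 1))              # roughly 53 minutes for one operation
```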

With an operation measured in milliseconds, many assumptions and standard recommendations for improving performance may not hold. In some instances, following the standard recommendations actually makes things slower.

Import/Export Maximo ImageLib Data via Integration Framework

In Maximo, we can upload images as Doclinks attachments, which are stored as files on the server, or as avatar images, which are stored as binary data inside the IMAGELIB table. The avatar image is quite useful to give the user a quick view of what an inventory item or an asset/location looks like.

While Maximo allows us to upload Doclinks attachments via MIF, uploading images to the IMAGELIB table via MIF is not supported out-of-the-box. Therefore, images can only be uploaded manually, one by one, via Maximo’s GUI. For bulk loading, if we have access to the DB server, we can write a stored procedure to read the files and import the binary data directly into the DB. There are two scenarios I have run into where this approach doesn’t work:
  • When we built mobile apps and wanted to upload data to IMAGELIB. In that case, my teammate extended a REST handler class to achieve this requirement.
  • When we needed to bulk upload images, but the client did not allow us access to the database and database server.
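For completeness, the direct-DB route, when it is allowed, boils down to reading each file and inserting its bytes as a BLOB. A minimal sketch, with sqlite3 standing in for the real database and the IMAGELIB columns simplified to the ones that matter here:

```python
# Sketch of the direct-DB bulk load: insert raw image bytes as BLOBs.
# sqlite3 is only a stand-in; the real IMAGELIB table has more columns.
import sqlite3

def bulk_load_images(conn, images):
    """images: iterable of (refobject, recordkey, raw_bytes) tuples."""
    conn.executemany(
        "INSERT INTO imagelib (refobject, recordkey, image) VALUES (?, ?, ?)",
        images,
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE imagelib (refobject TEXT, recordkey TEXT, image BLOB)")
# in practice, read the bytes from files on disk with open(path, 'rb')
bulk_load_images(conn, [("ITEM", "PUMP-100", b"\x89PNG...")])
print(conn.execute("SELECT count(*) FROM imagelib").fetchone()[0])
```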

Maximo with Oracle’s InMemory (Part 2) – Huge Performance Gain

Last week, I played around with Oracle’s new toy: the InMemory feature available in Enterprise Edition. Although it made Maximo run 1.25x faster, it didn’t meet my expectation of a 2x to 5x gain. This bothered me for the whole week, and I kept thinking about it.

If you’ve read my previous blog post, the one thing I pointed out that could explain the lack of improvement is that I ran the test on a tiny demo database. It has only a few hundred assets and less than a thousand work orders, so no heavy process or poorly written query could slow the database down by even a second. This week, I set out to do a more elaborate test with a setup that looks more like a real production environment.

Test Oracle InMemory Database with Maximo

For the last few years, SAP has been pushing hard on its HANA in-memory data platform, and everybody talks about it. For me it makes sense, because SAP’s ERP is such a huge system, usually used by very large enterprises, and is both data intensive and mission critical.

Maximo, on the other hand, is usually much less data intensive, and most clients I worked with in Vietnam have small systems with databases of less than 10-20GB. Thus, I believed an in-memory database was not a big deal for Maximo users. But I recently moved to Australia and got a chance to work with a much bigger client. Their Maximo runs on a cluster of more than two dozen JVMs, yet somehow it is still a bit slow considering the number of active users they have. I suspect (since I don’t have visibility into their DB server) that the bottleneck in this case is the database. Besides the standard suggestions of looking at disk storage/SAN, network, memory allocation, etc., I also mentioned they could consider implementing InMemory. Then I realized I had never seen it implemented with Maximo; it would be a huge embarrassment if they looked into it and found out that it doesn’t work.

This week I have some free time, so I decided to play around with an InMemory database for Maximo to (1) confirm whether it is possible, and (2) see if it gives any real performance gain.