A few weeks back, it occurred to me that while the common debugging tool for mass log munging around here is Splunk, I had something new in my bag of tricks…SAP Lumira. Why? If you’ve ever spent time analyzing SAP BusinessObjects server trace logs, you know this is not necessarily a “fun” activity. Coy Yonce has given an awesome BetterBOBJ webinar on analyzing log files with tools like Cygwin.
Two significant issues exist with this process, but I was able to overcome both:
- The delimitation in the log files is a loose interpretation of the word delimitation.
- SAP Lumira doesn’t handle special return characters very well.
Thankfully, this was not a showstopper with some simple pre-cleansing before pulling the data into SAP Lumira. On the first pass, you’ll observe that the log files contain extra header information that breaks the delimited file as far as Lumira is concerned. So to begin, using Notepad++, delete everything from line 1 up to and including the keyword “COLUMNS:” on line 6. Everything else on line 6 is good header data that SAP Lumira will need.
Unfortunately, there is more poorly placed header junk in the trace log file that we have to remove as well. After that first deletion, lines 2 and 3 hold the remaining junk; delete them completely as shown here.
Beyond this, I saved my file and went on my merry way. BUT WAIT. There were horrid line-break characters in the file that made SAP Lumira choke on the import. So, back to Notepad++ for one more edit to close out this issue. Selecting the line break at the end of one row together with the first pipe of the following row let me do a find/replace, substituting a single pipe, and fix the entire file in one step. Note the highlighted regions at the end of line 2 and the beginning of line 3. Bad wrapping is dead…
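If you would rather not do the three Notepad++ edits by hand, the same cleanup can be sketched in a few lines of Python. This is a minimal sketch under my own assumptions about the layout described above: the keyword “COLUMNS:” marks the start of the real header, exactly two junk lines follow it, and a wrapped row shows up as a line break immediately followed by a pipe. The function name is hypothetical.

```python
import re

def clean_trace_log(text):
    """Apply the same three edits described above to one trace log.

    Assumptions (from the post, not verified against every log):
    the header starts at the keyword "COLUMNS:", two junk lines
    follow it, and wrapped rows begin with a pipe on a new line.
    """
    # Step 1: drop everything up to and including "COLUMNS:".
    _, _, text = text.partition("COLUMNS:")
    lines = text.splitlines()
    # Step 2: keep the real header (the first remaining line) and
    # delete the next two junk lines.
    text = "\n".join(lines[:1] + lines[3:])
    # Step 3: re-join rows that wrapped mid-record -- a line break
    # followed by a pipe becomes just the pipe, mirroring the
    # one-step find/replace in Notepad++.
    return re.sub(r"\r?\n\|", "|", text)
```

The regex in step 3 assumes a wrapped continuation always starts with a pipe; if your logs wrap differently, adjust the pattern to match what you see at the break.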
With the file cleansed and saved (really, this only took about 30 seconds), use SAP Lumira to import the log file renamed to .txt…because SAP Lumira is very picky about file extensions. In the following image, you can see that with headings on and the dataset named, all columns line up perfectly and can be added to the data set.
You’ll also observe that as SAP BusinessObjects spits out trace logs, it rolls a new file very frequently. Could this get cumbersome in your analysis? Sure. You will have to edit out that header data in each file, but if every file is open in Notepad++, you can replace the bad line returns across all of them in one step. After that, you have to rename each file to .csv or .txt. Tedious, but not a showstopper.
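Since the servers roll new files so often, scripting the whole folder at once saves the per-file tedium. Here is one possible sketch that applies the same edits to every log in a directory and writes each result with a .txt extension so Lumira will accept it. The .glf extension and the helper name are my assumptions; adjust them to match your landscape.

```python
import re
from pathlib import Path

def clean(text):
    # Same edits as the manual Notepad++ steps: drop everything up
    # to "COLUMNS:", drop the two junk lines after the header, and
    # re-join rows that wrapped onto a new line starting with a pipe.
    _, _, text = text.partition("COLUMNS:")
    lines = text.splitlines()
    text = "\n".join(lines[:1] + lines[3:])
    return re.sub(r"\r?\n\|", "|", text)

def clean_folder(log_dir):
    # The .glf extension is an assumption about how your trace logs
    # are named; change the glob pattern to suit.
    for glf in Path(log_dir).glob("*.glf"):
        cleaned = clean(glf.read_text(errors="replace"))
        # Write a sibling file with a .txt extension, since Lumira
        # is picky about flat-file extensions.
        glf.with_suffix(".txt").write_text(cleaned)
```

Run `clean_folder("/path/to/logs")` once and every rolled log is ready for import in a single pass.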
From there, I had a pretty massive dataset with error data in there somewhere to help me diagnose what happened in the failure in my landscape, built by simply merging data sets via a union for each trace log file I needed to analyze.
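The union step happens inside Lumira, but conceptually it is just stacking the cleaned files while keeping a single header row. A small sketch of that idea, using only the standard library (the function name and file handling are my own illustration, not Lumira’s mechanics):

```python
from pathlib import Path

def union_logs(paths):
    # Stack several cleaned, pipe-delimited log files into one
    # dataset, keeping only the first file's header row -- the same
    # effect as merging data sets via a union in Lumira.
    rows, header = [], None
    for path in paths:
        lines = Path(path).read_text().splitlines()
        if not lines:
            continue
        if header is None:
            header = lines[0]
            rows.append(header)
        # Skip each subsequent file's header; append only data rows.
        rows.extend(lines[1:])
    return "\n".join(rows)
```

This also makes the cost visible: the union only works cleanly because every file was given the same header during pre-cleansing.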
If I had my wishes granted, SAP would clean up the trace logs to make them easier to consume. I’d also make SAP Lumira less picky about special characters at the end of a line, as well as about file names for flat files. As an editorial note, this was more fun than anything and an interesting way to visualize systems data to troubleshoot performance. It’s worth mentioning that if you are an admin, there is a handy GLF viewer from SAP. Read about it over on the SCN in this handy blog.
Analysts receive flat files in many shapes and sizes. This use case does a pretty good job of demonstrating that with SAP Lumira, it is simple to get data tidied up and into a form that makes it understandable.