When we released Sherlock 2.3, there were a few interesting updates that came along with it. One of them was the ability to tell which of your users are slackers and which ones are crippling your system – no really. We released a new component with Sherlock 2.3 called Sherlock KPI. It indexes your populated audit database and pulls in the number of times a report has been viewed and refreshed. It also pulls in the users doing the refreshing and viewing. We updated the Sherlock Universe to include relevant objects for that data as well. We even introduced a special object into the Sherlock Universe that determines the effective laziness of a user and categorizes them (e.g., slacker, mildly lazy, active, busy, and killing my servers). Okay, so that last part is a lie, but not a bad idea.
Here’s a list of the actual new objects introduced.
As you can see, in addition to pulling in the number of views and refreshes, we are also pulling in the number of schedules, successful schedules, and failed schedules that are associated with a particular report. With these new objects at hand, you can start discovering new insights into how frequently your deployment and the content within it are being used.
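To make the shape of that data concrete, here is a minimal sketch in Python of the kind of per-report aggregation these objects represent. The event rows and action names below are made up for illustration; Sherlock KPI actually reads this from your populated audit database, not from code like this.

```python
from collections import Counter

# Hypothetical audit events: (report, action) pairs standing in for
# rows in a populated audit database.
events = [
    ("Sales Summary", "view"), ("Sales Summary", "refresh"),
    ("Sales Summary", "view"), ("Inventory", "schedule_success"),
    ("Inventory", "schedule_failure"), ("Inventory", "view"),
]

# Tally each action per report, mirroring objects such as
# "Number of Views", "Number of Refreshes", and the schedule counts.
counts = Counter(events)

print(counts[("Sales Summary", "view")])          # 2
print(counts[("Inventory", "schedule_failure")])  # 1
```

In practice you would drag the equivalent Universe objects into a Web Intelligence report rather than write any code, but the aggregation behind them is this simple.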
Want some concrete examples?
The chart below shows the number of views and refreshes that have occurred for each month starting with June 2012.
You can derive a few insights from this view:
- the level of activity for this deployment was on an upward trend until October 2012
- the number of refreshes is generally larger than the number of views – with an exception in September 2012
- there are greater jumps in the number of views month over month than refreshes
These insights are important because they not only give you an idea of how much of each activity is occurring on the system, but also point to areas for further inquiry. What happened in October 2012 to cause such a sharp drop in activity for this deployment? Which reports were released in September 2012 that caused views to outpace refreshes that month? Which users are involved in the views versus the refreshes?
As you can see from the table below, the User Name object allows us to answer that question. This view shows the top 10 users with the highest total views across the entire SAP BusinessObjects deployment over the past six months. (Note: the user names have been hidden to protect the customer's identity.) The first user in the list has close to 350,000 views and 990,000 refreshes.
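The ranking behind a view like this is just a sort on aggregated totals. As a hypothetical sketch (the user names and counts below are placeholders, not the customer's data):

```python
# Hypothetical per-user view totals, already aggregated from audit data.
views_by_user = {
    "User A": 349_812,
    "User B": 120_455,
    "User C": 98_003,
    "User D": 45_210,
    "User E": 1_530,
}

# Rank users by total views, highest first, and keep the top N --
# the same shape of result the User Name object gives you in a report.
top_n = sorted(views_by_user.items(), key=lambda kv: kv[1], reverse=True)[:3]
print(top_n)  # [('User A', 349812), ('User B', 120455), ('User C', 98003)]
```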
You can also see the bottom 10 users.
If you were to combine the top 10 and bottom 10 user lists while also breaking them out by application or group, you could quickly determine who your most and least frequent consumers of information are across your organizational domains. You could then use this information to determine who from Sales in the East Region is frequently taking advantage of information from your BI deployment and who is not. You could then correlate this with sales numbers to determine whether there is a link between more frequent information usage and higher sales.
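That per-group breakout could be sketched as follows. The user, group, and view counts here are invented for illustration; in a real deployment these would come from the User Name and group objects in the Sherlock Universe.

```python
from collections import defaultdict

# Hypothetical (user, group, total views) rows from the audit data.
rows = [
    ("alice", "Sales East", 5000),
    ("bob",   "Sales East", 12),
    ("carol", "Sales West", 3200),
    ("dave",  "Sales East", 800),
]

# Bucket users by group, then find each group's most and least
# frequent consumers of information.
by_group = defaultdict(list)
for user, group, views in rows:
    by_group[group].append((views, user))

for group, members in by_group.items():
    members.sort(reverse=True)  # highest view count first
    most, least = members[0][1], members[-1][1]
    print(f"{group}: most frequent = {most}, least frequent = {least}")
```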
From a BI administrator's perspective, inboxes in an SAP BusinessObjects deployment are always large consumers of space and a black void in terms of information consumption. Sherlock has always provided the capability to see how many of the reports in an inbox are read versus unread, but with the new KPI data, you can also see which of your users check their inboxes more frequently than others. For example, the table below shows a list of inboxes along with the number of report views that have occurred against the content they contain.
As mentioned above, you can also see metrics about schedules within your deployment. The chart below shows the number of failed and successful schedules that occurred within this deployment beginning in June 2012.
Based upon this chart, you can see that the total number of schedules showed the same upward trend as the number of views and refreshes from an earlier chart in this post. You can also see that the total number of schedule failures is on a slight upward trend as well. Of course, this follows the same path as the views and refreshes, in that the total amount of activity sharply drops in October 2012.
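One useful derived metric here is the failure rate per month, which separates a genuine reliability problem from failures that merely track overall volume. A small sketch over invented monthly totals (not this deployment's actual numbers):

```python
# Hypothetical monthly schedule counts: month -> (successes, failures).
schedules = {
    "2012-06": (400, 20),
    "2012-07": (450, 25),
    "2012-08": (500, 32),
    "2012-09": (520, 40),
    "2012-10": (210, 18),  # total activity drops, but...
}

# Failure rate = failures / total schedules for the month.
# Even though October's totals fall sharply, its failure rate can
# still continue the upward trend seen in earlier months.
for month, (ok, failed) in sorted(schedules.items()):
    rate = failed / (ok + failed)
    print(f"{month}: {rate:.1%} of schedules failed")
```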
I hope this post gives you a good understanding of our new Sherlock KPI component and gets you started thinking about the various insights it can provide. How do you think this new component would be useful for your deployment? What do you think we should add to make it even more useful?
Thanks for reading.