Law Thirty-Six

You gotta go with what works

Reporting choices and date handling in ServiceDesk Plus


When you’re trying to get data out of ManageEngine ServiceDesk Plus, you basically have a choice between:

  • Built-in reports, which will generally provide something close to what you need without ever being precisely what you need
  • Your own custom report, created by choosing the fields, filters, grouping and sorting you want from a series of drop-down lists. These can be a great way of getting started, but I often find them quite limiting in terms of which tables you can draw information from.
  • Query reports, where you write your own SQL statement to query the underlying database and run it from within the SDP UI itself
  • Querying the database directly from your database’s own querying tool, which has the most flexibility but which also requires you to do the most work.

Except for the most straightforward queries, which can be generated using the custom report tools, I will typically start in Microsoft SQL Server Management Studio and write the query from scratch. If I’m likely to re-use the query or I need to pretty up the results, I may see if it can be re-worked to run as a Query Report inside SDP.

Query Reports have a number of limitations:

  • They must be a single SELECT statement. As far as I can tell (please write in if I’m wrong), there is no way to return multiple tables of data in a single Query Report. Nor is there a way of running a series of commands that eventually return a single table of data. Nor can you do something like executing a stored procedure that returns a single table of data.
  • You cannot use line breaks to help format the SELECT statement as you write it. I burned more time than I would like to admit discovering that one, and it makes writing Query Reports much more difficult than it really needs to be, especially when you remember that there’s no colour coding in the SDP query editor, unlike external tools.
  • You have to give every column in the output a name, which means paying attention to any calculated columns and making sure you provide a column alias.
  • If your query fails to execute for whatever reason, you will get a reasonably unhelpful display of the Java exception that was thrown by SDP rather than the slightly more meaningful SQL error that underlies it.

Using Query Reports, you get the advantage of some help with date handling. If you’re using date criteria in your query, particularly for something that will be used as a scheduled query, you can use <from_thisweek>, <to_thisweek>, <from_today>, <to_today> and various others as listed on the helpcard. I’m cautious of these because I’ve yet to see any documentation on precisely what they mean. What do we class as “today” in a system that’s being used in more than one timezone? What is “this week” or “last week” – does it start on a Monday? A Sunday? And at what time – UTC or local time?

Much more useful are the functions LONGTODATE and DATETOLONG. All dates in SDP are stored as bigint values representing the number of milliseconds since 1 January 1970 00:00:00 UTC. This is great for internal use, but not at all useful for reporting purposes. If you use a Custom Report, SDP will automatically format these columns as dates for you. If you’re creating a Query Report, you need to do the conversion yourself. LONGTODATE will convert an SDP bigint field into a date, and the SDP reporting engine will then format it for you. DATETOLONG can be used when you’re crafting the WHERE clauses of your SELECT statement, as in:

...WHERE wo.CREATEDTIME > DATETOLONG('2013-11-01 00:00:00')

When I call DATETOLONG, I will tend to specify the date format as above, including time, to avoid any risks of d/m or m/d confusion. All dates in SDP are stored in UTC – remember this when writing your queries.
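For a sanity check on what those bigint values look like, the same conversion can be sketched outside the database. This Python snippet is purely illustrative (SDP itself gives you DATETOLONG for this): it parses an unambiguous date string, treats it as UTC, and produces the millisecond value.

```python
from datetime import datetime, timezone

def date_to_long(date_string):
    # Parse 'YYYY-MM-DD HH:MM:SS' and treat it as UTC, matching how
    # SDP stores all dates; return milliseconds since 1 Jan 1970 UTC.
    dt = datetime.strptime(date_string, "%Y-%m-%d %H:%M:%S")
    return int(dt.replace(tzinfo=timezone.utc).timestamp()) * 1000

print(date_to_long("2013-11-01 00:00:00"))  # 1383264000000
```

So the WHERE clause above is asking for CREATEDTIME values greater than 1383264000000.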

If you step away from Query Reports in SDP to writing your own SQL statements in your database querying tool of choice, you lose access to the DATETOLONG and LONGTODATE functions. There are some functions in the database (at least, there are in the MSSQL database) that you can try to call.

First, unix_timestamp converts from a date string (eg ‘2013-11-23 17:30:00’) to a bigint.

CREATE FUNCTION [dbo].[unix_timestamp] (@dateString varchar(25))
RETURNS bigint
AS
BEGIN
  RETURN datediff(s, CAST('1970-01-01' AS DATETIME),
           CAST(@dateString AS DATETIME)) -
           (SELECT dd FROM sdp_DateDiff)
END

The first thing to notice about this is that it’s out by a factor of 1000. Dates are stored as milliseconds since 1 January 1970, not seconds, and so anything that calls this function needs to multiply the output by 1000. The second thing to look at is the reference to sdp_DateDiff. This turns out to be a view within the database:

CREATE VIEW [dbo].[sdp_DateDiff]
AS
  SELECT dateDiff(s, getutcdate(), getdate()) AS dd

This approach is necessary because you cannot call getutcdate() from within a function on Microsoft SQL Server. What it does is a sort of adjustment from UTC to the local time for wherever your SQL Server is. My brain starts to melt when I try to work out the timezone maths; I suspect this approach isn’t perfect, but it’s close enough. I’d also note that in some multi-time zone companies, the time zone of the SQL Server won’t be the most appropriate location to refer to.
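To see why that adjustment can only ever be approximate, here’s a small Python sketch (standard-library zoneinfo; ‘Europe/London’ is just an example zone, nothing SDP-specific) showing that the UTC offset the view captures is itself a moving target across the year:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def dd(local_dt, zone):
    # Seconds local time is ahead of UTC at that moment -- the same
    # quantity sdp_DateDiff samples once, at query time.
    return int(local_dt.replace(tzinfo=ZoneInfo(zone)).utcoffset().total_seconds())

print(dd(datetime(2013, 1, 15, 12, 0), "Europe/London"))  # 0 (GMT)
print(dd(datetime(2013, 7, 15, 12, 0), "Europe/London"))  # 3600 (BST)
```

A timestamp written in January but queried in July (or vice versa) therefore gets shifted by the wrong offset.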

The reverse function is from_unixtime:

CREATE FUNCTION [dbo].[from_unixtime] (@dateValue bigint)
RETURNS datetime
AS
BEGIN
  RETURN dateadd(s, (SELECT dd FROM sdp_DateDiff) +
         (@dateValue), '1970-01-01 00:00:00')
END

Again, the function is out by a factor of 1000, and again there is some attempt to address the difference between UTC and the local time on the SQL Server.

Personally, I don’t call either of these functions. Instead, I use the following pair of functions for date-to-bigint conversion (and back). They remove the issue of being out by a factor of 1000 and also work purely in UTC – partly because I live in the UK and UTC is local time for me at least half the year, and partly because I distribute reports to people in at least three different time zones, and it’s easier for them to convert from UTC in their heads than from other time zones.

-- NB: the names LongToDate and DateToLong are illustrative; use whatever suits
CREATE FUNCTION [dbo].[LongToDate] (@Input bigint)
RETURNS datetime AS
BEGIN
  RETURN dateadd(s, (@Input/1000), '1970-01-01 00:00:00')
END

CREATE FUNCTION [dbo].[DateToLong] (@Input datetime)
RETURNS bigint AS
BEGIN
  RETURN CAST(datediff(s, '1970-01-01 00:00:00',
         @Input) AS bigint) * 1000
END
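The same pure-UTC arithmetic is easy to verify outside SQL Server. This Python sketch (function names are mine) round-trips a value through the millisecond conversion, including the multiply/divide by 1000:

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def ms_to_datetime(ms):
    # bigint milliseconds -> UTC datetime, dividing out the factor of 1000
    return EPOCH + timedelta(seconds=ms // 1000)

def datetime_to_ms(dt):
    # UTC datetime -> bigint milliseconds
    return int((dt - EPOCH).total_seconds()) * 1000

ts = datetime_to_ms(datetime(2013, 11, 23, 17, 30, 0, tzinfo=timezone.utc))
print(ts)  # 1385227800000
print(ms_to_datetime(ts))  # 2013-11-23 17:30:00+00:00
```

Because everything is pinned to UTC, the round trip is exact (to the second) regardless of where the server, or the reader, happens to be.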

I don’t profess to be a T-SQL expert, so feel free to suggest improvements in the comments below.

Now, once you start dealing with dates, it’s only a short hop from there to wanting to calculate the elapsed time since a given date or between two dates… but we’ll leave that for another post.
