There is a danger to writing about this particular session from the RIMS 2014 conference held last week. It turns out one of the presenters is a regular reader of this blog. Normally I can opine on my impressions from a seminar with no one being the wiser about whether I really understood the topic or not. This time I can’t do that. If I miss the mark here someone will know I am a blithering idiot.
That is not to be confused with those of you who just strongly suspect that I am a blithering idiot.
I will hand it to the conference organizers and the presenters. It takes a certain intestinal fortitude to schedule a presentation on analytics for the last time slot on the last day of the conference. To compound the issue, it appeared that RIMS had run out of money by the time this program rolled around. The presenters were given the smallest dais I have ever seen. The back curtain was very small, the draped table looked like it was set for two 10-year-olds, and the microphone batteries were on their last legs. Nevertheless, the four adults spilling off the tiny stage did an excellent job of making this predictive modeling discussion informative and entertaining. When Moderator Jeff Branca from Marsh explained they were actually the openers for Closing Keynote Ben Stein, it all made sense.
Even Paul, the actuary sitting next to me, seemed to enjoy it. At least that is what I inferred when he dropped his pencil.
In the session, entitled “Workers’ Compensation Predictive Modeling: The Crystal Ball Becomes Clearer”, attendees gained insight into how both carriers and employers can benefit from analyzing and benchmarking claim data. It does seem that history repeats itself, and using historical data early in the life of a claim can certainly help employers gain an idea of where it may ultimately go. The best description of this process was given by Melissa Bowman-Miller, Vice President of Risk Management at Staffmark (and a Bob's Cluttered Desk aficionado). She spoke of the importance of using good predictive data to find what she called “your land mine claims” before they actually blew up in your face.
It’s a great analogy. You know what a land mine is: an explosive device hidden just below the surface, waiting for some unsuspecting soul to step on it and trigger its destructive fury. In the workers’ comp world, that land mine is a dangerous element hidden from view, waiting for an adjuster oblivious to the dangers that lie ahead in a particular claim.
Claim Analytics can help find that land mine, but won’t disarm it. That is the job of the claims professional.
You see, the problem inherent in our industry is twofold when it comes to “big data”. First, everyone wants comprehensive and accurate information, and it is difficult to come by. Second, many people, once given good data, have no idea how to use it effectively. It is like a dog chasing a car – if it actually caught the car it would have absolutely no idea what to do with the damn thing. This session, leveraging the land mine claim concept, did a good job of showing how even simple elements of data can be put to very good use. One of the most interesting takeaways for me was the acknowledgment that something as simple as the commuting distance of the injured worker to their job can help predict the likelihood of a successful return to work.
David Duden of Deloitte Consulting even acknowledged that with the availability of broad public data on individuals today, predicting the path of a claim based on geographic and demographic information could be accomplished with very little to begin with. He told the audience they had at one point generated useful predictive data on a claimant using only their home address. From that single point of information they were able to research and draw out other elements with which to score the “land mine potential” of that particular worker’s claim.
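For readers curious what a “land mine potential” score might look like under the hood, here is a minimal sketch. To be clear: the feature names, weights, and thresholds below are entirely hypothetical illustrations of the general idea (a weighted model squashed into a 0-to-1 risk score), not anything Deloitte presented.

```python
import math

# Hypothetical feature weights -- purely illustrative, not from the session.
WEIGHTS = {
    "commute_miles": 0.04,    # longer commutes were cited as predictive of return-to-work difficulty
    "days_to_report": 0.10,   # late-reported claims are a commonly cited risk factor (assumption)
    "prior_claims": 0.35,     # claim history as a predictor (assumption)
}
BIAS = -3.0  # baseline so that an "average" claim scores low

def land_mine_score(claim):
    """Return a 0-1 'land mine potential' via a logistic function."""
    z = BIAS + sum(WEIGHTS[k] * claim.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

claim = {"commute_miles": 45, "days_to_report": 14, "prior_claims": 2}
print(round(land_mine_score(claim), 3))  # → 0.711
```

A real carrier model would be trained on thousands of closed claims rather than hand-set weights, but the shape is the same: a handful of early, cheap-to-collect data points rolled up into a single flag an adjuster can act on.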
I have long recognized that humans tend to follow basic and predictable patterns. Many years ago, in “another life” as they say, I was in restaurant and hospitality management. We recognized then that by tracking and recording our hourly guest counts, we could fairly accurately predict our business traffic for any given day by combining that historical data with current sales trends. It helped us with our scheduling and product procurement, and proved a fairly reliable indicator of “what was to come”.
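The restaurant version of this is about as simple as forecasting gets: take the historical guest counts for the same hour on matching days, and scale by how current sales are trending. A toy sketch of that approach (the numbers are invented, and this is my reconstruction of the idea, not the actual system we used):

```python
from statistics import mean

def forecast_hour(history, trend_factor):
    """Forecast guests for an hour: average of past matching hours,
    scaled by a recent sales-trend factor."""
    return mean(history) * trend_factor

past_friday_6pm = [42, 38, 45, 40]  # guests at 6 pm the last four Fridays (invented data)
recent_trend = 1.10                 # sales running 10% above the prior period

print(round(forecast_hour(past_friday_6pm, recent_trend)))  # → 45
```

Crude as it is, that combination of "what usually happens" with "what is happening lately" is the same pattern predictive claim models exploit, just with far more variables.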
To me, the lesson offered in this session was to “think outside the box” when it comes to viewing and analyzing “big data”. Within those fields lie nuggets of gold that can help predict the path of certain claims. Skeptics of such concepts will see potential evil in this idea, as if somehow predictive data will be used to manipulate and deny the injured the benefits to which they are entitled. I could not disagree more with that view. Quite the opposite is true.
Predictive modeling is nothing but the development of a roadmap, one that will identify turns in the road and the location of potential land mines along the path. It is not about finding useful data to make a claim go wrong; it is about identifying potential hazards early and helping a claim go right.
Finally, I must give the fourth presenter credit not just for his contributions, but for getting me to this session in the first place. I learned of it from Sean Martin of Travelers Insurance while participating in the Saturday night “Safety Audit” of Denver micro-breweries organized by Mark Walls and Ray Sibley. He described what they would be covering, and, not having a clue who I was, invited me to attend. Talk about a land mine from not having enough data: he invited a blithering idiot blogger because he did not have appropriate historical data from which to make a wise and appropriate decision.
In this case I am glad he lacked that pertinent analytical data. I would have missed this session entirely. After all, it was the last time slot on the last day, and the topic was analytics. Who in their right mind would go to that?