Tuesday, September 8, 2009

Knowledge Services Metrics - Two Foolproof Questions

As I described in a post several days ago, knowledge workers face few challenges as daunting as coming up with workable measurements and metrics for knowledge services.

I think it's very much a value issue, having to do with how we communicate and differentiate the strategic value of knowledge, especially if the road to that knowledge is through a specialized library or similar knowledge services business unit. There seems to be something about the relationship between organizational management and knowledge workers that puts distance between them, that seems to prevent them from communicating with each other in ways that get to the point.

We know why we measure. First of all, it's part of being a manager. If we're not willing to step back and ask how well our unit is doing, we have no business being in a management position.

But it's more than that. We also have an obligation to review what we're doing so we can figure out how to do things differently as we move forward. Simply put, the status quo won't cut it in today's workplace, and the sooner we grab on to that little piece of the management process, the better off we're going to be.

Don't let me get preachy, though. Measuring what we do also enables us to conceptualize new tasks relating to how we provide knowledge services for the organization, to capture the impact of knowledge services in the workplace, and, as significant as anything else we do with management and metrics, to monitor our work and keep things "on track."

As knowledge services directors, though, we don't always give measurement and metrics the attention this important discipline deserves, and sometimes we get caught short. Here's an example:

Bill Slidell is a thoughtful, user-focused knowledge services director. He manages a corporate information center with four information professionals and five support staff. He is often not in his office because he is known in the company as the go-to person when anyone, in any department, has a question about where to find anything. So Bill is out and about a lot, meeting with colleagues as they seek to work through various issues.

Because he has a loquacious and very open personality, Bill is often called into meetings simply to be the “point person” for this or that discussion, whether the topic has to do with KM/knowledge services or any other information, knowledge, or strategic learning-related subject. He’s a good listener, and his suggestions for the next step are very sound and usually lead to good results.

Bill’s functional unit prides itself on its KD/KS success. Today Bill was informed that he is to supply metrics for the unit’s performance, and that past performance measures are not to be revisited. He is to come up with something new.

What two questions must Bill ask before he can provide the metrics he’s been asked for?

To find out, join the next Click U KM/Knowledge Services Certificate Program Course. In Critical Success Factors: Measuring Knowledge Services, we tackle measurement and metrics for knowledge services head-on. In five one-hour meetings, we’ll work together on how to evaluate (and convey the value of) information management, knowledge management, and strategic learning in the organizations where we work. The course is part of Click U’s Premium Programs series, sponsored by SLA (but membership in SLA is not required to take the course). The course begins on September 14. Go here to listen to Dale Stanley and me talk about the course, and go here for more information and to register.

2 comments:

  1. Hello Guy,

    I certainly agree that far more attention needs to be focused on performance metrics, particularly in large organizations -- for the mutual benefit of the knowledge worker, business unit, and organization. That's why we incorporated rich, color-coded metrics in our Kyield architecture.

    However, we didn't arrive at that point easily -- in fact, it took over a decade of applied R&D: waiting for technical standards to catch up, for awareness to propagate through our little village, and for the learning curve for me, the architect, to pull it all together in a holistic system design.

    Speaking primarily of the digital workplace here -- although the metrics can be extended to real-world applications -- one must first have the ability to collect rich data. It turns out that in a holistic semantic design, rich data can be collected while overcoming the related problem of information overload, aligning interests between the knowledge worker and the organization, and improving innovation. Of course, any modern holistic system must be adaptable and interoperable if adoption is to be achieved.... and as inexpensive as possible....

    One of many benefits of collecting rich data in a semantic enterprise is the ability to dramatically improve crisis prevention. Indeed, the past several man-made crises would have been avoided if only our government had adopted advanced knowledge systems when we first approached them -- at about the time Australia did -- a decade ago.....

    Easier said than done for sure, but we did figure it out, finally. Cheers, Mark Montgomery, Founder/CEO - Kyield.

  2. Good comment, Mark, and thanks very much. Really glad to hear about the research in color-coded metrics, and we'll have to give some attention to your Kyield product.

    Also intrigued by your reference to the ability to dramatically improve crisis prevention. I'll be interested in how colleagues speak and share thoughts about this one.

    Thanks very much.

    All the best,

    Guy
