The Collaboration Box Score

A few weeks ago I asked what the NBA could teach us about measuring collaboration.  As a follow-up, I thought it might be neat to think about the elements that would make up someone’s collaboration box score.  The box score is telling because it’s an aggregate of performance, so it accounts for tradeoffs made by players during the game (e.g. shoot the ball for a point, or pass it for an assist) and demonstrates how they use the limited time they have on the court.  Brainstorming with others in the office, I came up with this initial list of box score categories, but I’d love to hear what other Wikinomics readers think:

Signal-to-noise ratio: I think the most visible and important metric – analogous to points on a basketball stat sheet – should be one focused on the value and quality of the content you broadcast.  Using Twitter as an example, measure re-tweets versus tweets.  If your content is getting re-tweeted, it’s safe to assume that it’s valuable (signal) and not noise. For blogs, the metric might be comments per post, indicating a compelling or timely argument worth discussing. Using online sentiment analysis tools, companies could add a layer of complexity to this stat by also measuring positive versus negative comments.  As a basic example of signal-to-noise, my ratio based on re-tweets/tweets is: 10/91 = 0.11.  For comments/blog posts it would be: 289/79 = 3.7.  Of course, in a multi-channel world the metrics get muddled.  If the re-tweet is the new blog comment, how do you calculate the metric?
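The arithmetic above is simple enough to sketch in a few lines. This is a hypothetical illustration, not a real tool: the `signal_to_noise` function and the counts (taken from the figures in this post) are just there to show the calculation.

```python
def signal_to_noise(signal: int, noise: int) -> float:
    """Ratio of valuable output to total output, e.g. re-tweets/tweets
    or comments/posts. Returns 0.0 for no activity at all."""
    if noise == 0:
        return float("inf") if signal else 0.0
    return signal / noise

# The figures from the post:
twitter_ratio = signal_to_noise(10, 91)   # re-tweets / tweets
blog_ratio = signal_to_noise(289, 79)     # comments / blog posts

print(f"Twitter: {twitter_ratio:.2f}")    # 0.11
print(f"Blog:    {blog_ratio:.1f}")       # 3.7
```

The multi-channel problem from the paragraph above shows up immediately: if a re-tweet and a blog comment both count as "signal," you would have to decide whether to sum them into one ratio or report per-channel stats, the way a box score separates field goals from free throws.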

In/out ratio: How good a curator of information are you?  We interviewed the company Cataphora a few weeks ago (recently profiled in BusinessWeek) – their software uses social network analysis to identify good content by how much it is shared and passed around, and tracks document originators and curators to assess individual productivity.

Document “originator” stats: Building on the previous point, how much good content do you generate, where “good” is defined by the number of times your content is reused (this is similar to signal-to-noise, but attempts to quantify the strength of the signals). At a more granular level, the originator stat might also highlight certain areas of expertise – e.g. all good ideas related to robots originate from Alan.  A twist on this metric could be “conversation initiator stats,” which would track who kicks off popular conversations or ones that lead to valuable ideas that are implemented.
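An originator stat like this boils down to counting reuse events per author, and per (author, topic) pair to surface expertise. Here's a minimal sketch; the event log, names, and topics are all invented for illustration.

```python
from collections import Counter

# One record per time a document was reused, tagged with its originator
# and topic. In practice this log would come from document-tracking
# software; these rows are made up.
reuse_events = [
    ("Alan", "robots"), ("Alan", "robots"), ("Beth", "marketing"),
    ("Alan", "robots"), ("Beth", "robots"),
]

# Overall originator stat: total reuses credited to each person.
reuse_by_author = Counter(author for author, _ in reuse_events)

# Expertise view: reuses broken down by (author, topic).
expertise = Counter(reuse_events)

top_author, top_count = reuse_by_author.most_common(1)[0]
print(top_author, top_count)  # Alan 3 - the "robot ideas come from Alan" case
```

The same counter over conversation threads instead of documents would give the “conversation initiator” variant mentioned above.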

Responsiveness: This would be a fairly basic stat that looks at how quickly you respond to other people’s requests.  As part of this, you would also have to account for the number of requests an individual gets (which might actually be another stat around collaborative demand or reputation).
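Computing the basic stat would mean averaging the gap between a request arriving and the reply going out, while also reporting request volume as the collaborative-demand signal. A rough sketch, with invented timestamps:

```python
from datetime import datetime

# (received, replied) timestamp pairs for one person; data is made up.
requests = [
    (datetime(2010, 3, 1, 9, 0), datetime(2010, 3, 1, 9, 45)),   # 45 min
    (datetime(2010, 3, 1, 13, 0), datetime(2010, 3, 2, 8, 30)),  # 19.5 hrs
]

def responsiveness(pairs):
    """Mean hours from request to reply, plus the request count
    (a proxy for collaborative demand/reputation)."""
    hours = [(sent - got).total_seconds() / 3600 for got, sent in pairs]
    return sum(hours) / len(hours), len(pairs)

avg_hours, demand = responsiveness(requests)
```

Reporting the two numbers together matters: a 10-hour average looks very different for someone fielding five requests a week than for someone fielding fifty.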

Feedback assessment: To temper responsiveness stats, you’d want to have something like the “assist” stat in basketball, where you only get credited for responses that lead to a positive outcome.  As an example, you could base it on how much change occurs to documents as they flow through you.  This would have to use software that analyzes document content and tracks versioned documents (independent of filenames, which often change as they pass through different users – e.g. docs that come through me usually leave as filename_nh).
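One crude way to quantify "how much change occurs to documents as they flow through you" is a text-similarity measure between the before and after versions. This is only an illustrative sketch using Python's standard-library `difflib`; real version-tracking software would compare content far more robustly.

```python
import difflib

def change_credit(before: str, after: str) -> float:
    """Fraction of the document changed between two versions:
    0.0 means identical, values near 1.0 mean heavily rewritten."""
    return 1.0 - difflib.SequenceMatcher(None, before, after).ratio()

# Invented example of a draft before and after it passes through a reviewer.
draft = "Robots will change collaboration."
edited = "Robots will fundamentally change how teams collaborate."
credit = change_credit(draft, edited)
```

On its own this rewards churn as much as improvement, which is why the paragraph above ties it to positive outcomes: the change score would only count when the receiving party accepts the edits.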

Sociometric factors: These are the intangible aspects of collaboration that don’t necessarily leave a digital trail that is easily measured.  I liken this to measuring an individual’s plus/minus in basketball – it’s based on how other people’s stats change when they are around you.  Since there’s no “court time” in enterprise collaboration, you could measure face time through badges, digital connections, or even video.  Using this type of reality mining, a company could also analyze things like the tone of conversations as well as emotional response in order to gauge the impact you have on the morale of those around you (without necessarily measuring content, which leads to privacy issues).  This stat might also highlight diminishing collaborative returns – if too many minuses start showing up, maybe you’re collaborating too much (or are not very good at collaborating).  A company could also develop a “starting line-up” for projects based on sociometric factors that show positive results from certain combinations of employees.
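The plus/minus analogy has a direct numeric form: compare colleagues' output when a given person is "on the court" with them versus when they are not. The sketch below assumes a made-up output measure (documents per teammate per week) and invented numbers, purely to show the shape of the calculation.

```python
def plus_minus(with_person, without_person):
    """Average teammate output with the person present, minus the
    average without them. Positive values suggest they lift the team."""
    return (sum(with_person) / len(with_person)
            - sum(without_person) / len(without_person))

# Hypothetical: documents per teammate per week, with and without Alan.
pm = plus_minus(with_person=[12, 9, 11], without_person=[8, 7, 9])
```

A negative value here is the "too many minuses" case above, and ranking candidate teams by their combined plus/minus is essentially the "starting line-up" idea.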

Article written by

Naumi Haque has more than a decade of experience in the research and advisory industry. Naumi has been at the forefront of customer experience management, recently arguing that enterprises need an integrated customer experience strategy to meet customer expectations. He has conducted research and provided thought leadership on a wide variety of topics related to emerging technology and business innovation, including: social media strategy, customer experience, next generation marketing, enterprise collaboration, open innovation, digital identity, new sources of enterprise data, and disruptive web-enabled business models. He received his MBA and his Honors in Business Administration from the University of Western Ontario’s Richard Ivey School of Business.