Community Metrics in Context
In late 2011 I started working for DISA on the Forge.Mil project. Forge.Mil is a great concept born out of the Department of Defense's need for change in software development and application life-cycle management. Some of the concepts that attracted me to Forge (that is what I call it) were that it integrates knowledge management, information management, software development and object-relational mapping to authoritative data. As a system integrator, I always faced a challenge when it came to documentation and information management beyond the documents required for information assurance. Forge enables teams to share data in the context of IT in ways similar to other knowledge systems, but it is designed specifically for software development and software projects.
This is a unique and challenging situation because most of the time in social communities people have a desire to communicate and collaborate. In this environment, with hundreds of different projects and thousands of users, motivators for the individual developer vary. In other words, some kids may not want to play in the sandbox, while others excel at building castles together. As Morgan Freeman says while discussing the universe, “Answers are terminus, it is the questions that are where it’s at.” With that said, I had, and still have, a lot of questions about community management concerning IT-related work.
How do you know what you need to look at to determine what is happening in your community?
I have spoken to a lot of people about this in the past few weeks, and I have scoured the internet looking for questions and answers to consolidate into a nice list. One of the first things to consider, though, is the purpose of your community. Who are my community members? What does the population look like? What do they do on a daily basis? What makes them different from each other? What makes them the same? Questions like these really help flesh out what you need to measure.
I put together a document on metrics that was really a mashup of data from a lot of community managers, blogs, discussion posts and some publications. I am posting most of the content here so that if you happen to stumble across this, you can take what you need and put together your own thing. I have links to most of the authoritative data sources, but if you come across something you wrote and I didn’t source properly, (PLEASE) let me know and I will include the proper reference. I am not looking to take credit for the hard work and thought of others.
Think - Wait - Think - Do.
“We need to first define the problem.
Albert Einstein once said:
“If I had an hour to save the world
I would spend 59 minutes defining the problem
and one minute finding solutions”
And I find in most organizations
people are running around spending sixty minutes
finding solutions to problems that don’t matter.”
~ Stephen Shapiro
The information included in this post is a primer to get you thinking about what you need to do in your organization. These are not the answers; these are the basis for the questions. Most technical people think of answers as quickly as the problems present themselves. As a “leader from where you are,” it is your job to help keep the solutions at bay until you determine that enough questions have been asked. Additionally, the process should never end. We should always ask questions, tune, adjust, qualify, quantify and wonder. If you want to talk about the technical side of this work, contact me separately.
Context First
Just remember to always keep the qualitative metrics in mind first as you consider the quantitative metrics. One of the first posts I came across that I found very interesting, located here, discusses a table with some basic attributes to consider. I added some attributes in the paper I put together, but I could very well have stuck with these.
“It’s either quantifiable or it’s not measurable”
“It’s either quantifiable or it’s subjective”
“We have to quantify it in order to measure it” – Dr. Mel Schnapper, PhD
Figures often beguile me, particularly when I have the arranging of them myself; in which case the remark attributed to Disraeli would often apply with justice and force:
“There are three kinds of lies: lies, damned lies and statistics.” – Autobiography of Mark Twain
Program Managers must understand what the total cost of implementation is. How do we capture a baseline? What is the total cost of ownership? What is the total cost in investment? What are the estimated savings? How do we know that we are achieving our goals? How do we know that we have growth? How do we know that this project is sustainable? How do we know where to course correct? What are the measures of performance? What are the measures of effectiveness? How do we define “effective”? How are we measuring success?
Community Goal: Drive Reuse
A generic goal of “reuse” is too vague on its own. There are a number of community components that can be measured for their reusability.
• How do we know if people are informed about new features?
• How do we offer training on these features?
• How do we know if people are using these new features? If not using – Why?
• Are they using other web sites? Are they linking to them?
• Are they using the tool as a SharePoint substitute or as a code collaboration tool?
• What are linkages / commonalities between projects?
1: Pre-Built Component Reuse
• Goal: Reuse Pre-Built Components
• Question: How much interest is there in each component that is meant for reuse?
• Metrics: Number of downloads; number of discussion posts around the component
2: Code Snippet Reuse
• Goal: Reuse Code Snippets
• Question: How many different places do code snippets intended for reuse appear within the community?
• Metrics: number of projects where this code snippet is being used, and/or where it has been “forked”
3: Knowledge Reuse (non-source code)
• Goal: Community Members are sharing knowledge within the community.
• Question: How often are users looking for information among the entire community?
• Metric: number of attempts to search for information community-wide on a particular topic.
4: Expertise Reuse/Reputation Management
• Goal: Users are able to find community experts that can help or guide them.
• Question: How would a user find an expert on a specific topic?
• Metric: determine which are the most popular community-wide search topics -> determine which community members post the most information about the most searched topics.
5: Trust in Reuse
• Goal: Users within the community trust that the community is a place to find valuable, trustworthy answers
• Question: Do users believe that they can rely on the information they are getting outside of their silo?
• Metric: while this is probably best learned through surveys, the number of hits/interest around specific topics that are demonstrating a high level of activity can also provide evidence of success in reuse trustworthiness.
(https://ctf.open.collab.net/sf/wiki/do/viewPage/projects.community-mgmt/wiki/DriveReuse)
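The expertise-reuse metric above (find the most-searched topics community-wide, then find the members who post the most about those topics) can be sketched as a small script. This is only an illustration: the `search_log` and `posts` data, and the function name `experts_by_topic`, are invented placeholders, not a real Forge.Mil data feed.

```python
# Rough sketch of the expertise-reuse metric: find the most-searched
# topics, then the member who posts most about each of those topics.
# All data below is an invented placeholder, not a real Forge.Mil feed.
from collections import Counter

search_log = ["git", "git", "maven", "git", "maven", "sso"]
posts = [("alice", "git"), ("bob", "git"), ("alice", "git"), ("carol", "maven")]

def experts_by_topic(search_log, posts, top_n=2):
    """Map each of the top_n most-searched topics to its most frequent poster."""
    top_topics = [t for t, _ in Counter(search_log).most_common(top_n)]
    experts = {}
    for topic in top_topics:
        authors = Counter(a for a, t in posts if t == topic)
        experts[topic] = authors.most_common(1)[0][0] if authors else None
    return experts

print(experts_by_topic(search_log, posts))  # {'git': 'alice', 'maven': 'carol'}
```

In practice the search log would come from your community platform's search analytics and the post authorship from its forums, but the join logic is the same.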
Creating and Managing a Healthy Community
Growth
For growth, it is how many people are invested in what you’re doing that matters.
In terms of Growth, we look at:
• Twitter followers, fan-page members (LinkedIn, Milsuite, other), social media friends.
• Blog Subscribers.
• # of Active commenters.
• Member registrations.
• Unique visitors.
• Ratio of posts to comments, types of comments.
• # of Message posts, if a forum.
• # of Conversations over a month period.
Presence
How visible are you in your space and how does your visibility measure up against that of other defense communities?
In terms of presence, we look at:
• Buzz over a 30 day period.
• Types of comments/posts written about Forge.Mil – mentions (linked or unlinked).
• Who authored the mention – client, colleague, recognized social media contact, influencers, etc.
• Where was the mention located?
• How often does your community share your content?
Conversation
Presence is about who’s talking about Forge.Mil; Conversation looks more at whom Forge.Mil is talking to and the effectiveness of those conversations. Measuring the types of conversations Forge.Mil leadership and the community are having is an important metric because it shows leadership where your time is being spent and how people are engaging with you.
In terms of Conversation, we look at:
• Breakdown of the types of conversations being had: support-based, link sharing, friendly banter.
• Time spent on each conversation group. What’s more cost-effective: social media, phone/email, or Defense Connect Online chat?
• Whom you’re conversing with – customers, prospective customers, colleagues, outsiders.
• Conversation spread and growth?
• Actionable knowledge learned about core audience.
Sentiment
More important than simply knowing Forge.Mil is being talked about is knowing what people are saying about Forge.Mil and how that’s changing over time.
In terms of Sentiment, we look at:
• Emergence of Evangelists – onsite and off.
• Ratio of positive/neutral/negative mentions (i.e. satisfaction).
• Forge users recommending the community, passing it on to friends.
• Frequency of community members responding to/helping other community members, overall “vibe” of the room based on tracked interactions.
• Community members defending Forge.Mil on negative blog posts and feedback.
Conversions
In terms of Conversions, we look at:
• Visitor conversions (community member, follower, frequent blog commenter, etc.).
• Customer loyalty (Charter community) conversions – how many times do they refer others?
http://outspokenmedia.com/social-media/how-to-measure-community/ (great content)
Metrics as an indicator
A community must support business goals and the current (and prospective) community members themselves.
- If the business goals are not defined, the community risks being feature-driven and may suffer from shiny-object syndrome.
- If the community members are not involved in the success definition process, the community risks being irrelevant to its members. (Community Charter)
- In either case – undefined business goals, or members left out of defining the community (it’s for them, after all) – the community’s risk of failure grows substantially.
There are generally two types of metrics: qualitative and quantitative.
Qualitative
Qualitative information is contextual information derived from the relationships of data points. Qualitative metrics take variables, attributes, values and relationships into consideration in order to determine the likelihood of a condition; they rest on a deep understanding and the correlation of data. Connecting the dots can be challenging, since the points of data capture for qualitative metrics are often two or more degrees of separation from the data.
Community example:
There are various interactions in the community that can be measured by frequency of visits or downloads, but we may not know why they happen. This is an important factor, as understanding the drivers gives us the ability to reinforce successful behaviors.
What does it mean? Interpretation of the data once captured.
Attraction -The ability to attract an initial audience.
Attention – The ability to ‘reel them in’ and have them go deeper into community content.
Adoption – The ability to ‘convert’ them into community users and have them contribute to discussions.
Each of these has a quantitative component, although it is difficult to measure external connectivity (interactions that occur outside the environment as a predecessor or successor to interaction in the community).
Quantitative
Quantitative metrics are gathered directly through the observation and measurement of data. There is a high degree of transparency and a direct correlation between action and outcome with quantitative metrics.
In the table below, data points are classified as capital Q for quantitative or lowercase q for qualitative.
When placing the aforementioned metrics into the 3 framework categories above there is a clearer understanding of what we are actually measuring:
| Attraction (Q,q) | Attention (Q) | Adoption (Q,q) |
| --- | --- | --- |
| Visits | Page views per visit | User Creation |
| Unique Visitors | Bounce Rate | % Returning Visits |
| % New Visits | Length of Time on Site | Interactivity Rate (q) |
| % Users by service | First time contributions | Account age |
| % Users by organization | Most active members | Content ratings |
| % Users by agency | Mentions by influencers | New posts per month |
| % Users by unit | Overall project activity | Reputation changes |
| Page views overall | Ratio: Views / Post | Topic activity by project |
| Awareness (q) | Ratio: Posts / Thread | Individual project activity |
| Inbound Links | Ratio: Searches / Post | % Content with “tags” |
| Innovation (e.g. # new product ideas sourced from community) (q) | | |
| Member Satisfaction (q) | | |
Other potential factors (some duplication)
Visitors Metrics: Such as Unique Visitors, Bounce Rate, Pages per Visit, Pageviews, Time on Site, Keywords, and Referring Sites
Members Metrics: Such as New Registrations, # of Active Members, Completed Profiles, Pages per Visit, Pageviews, and Time on Site
Contributors Metrics: Such as # of Edits, # of Comments, and # of New User Generated Content
Evangelists Metrics: Such as # of External Invitations, # of ShareThis external shares, # of Mentions on social media sites (e.g. Twitter)
Leaders Metrics: Such as # of Active Admins, and # of Active Moderators
http://blog.wiser.org/metrics-for-the-busy-community-manager/
Getting a Baseline
Say we have 1,000 members in the Forge.Mil community, and some basic analysis shows that each week there are approximately 50 new threads posted in our project discussions and about 500 replies.
These three pieces of data are a good baseline for starting to look at ratios. A bit of quick division and your ratios come out like this:
Ratio of posts per member, per week: 0.05 (50/1000) (baseline)
Ratio of replies per member, per week: 0.5 (500/1000) (baseline)
Scenario 1 – Members Up But Engagement May Not Be Keeping Pace
Fast forward one month and let’s say our community has grown to 1,500 members. This is a huge increase, but are the ratios keeping pace with the growth? Let’s say, for example, that there are now 100 new threads posted (a lot of the new members introduced themselves) and about 650 replies each week. Your new ratios look something like this:
Ratio of posts per member, per week: 0.0667 (100/1500) (increased)
Ratio of replies per member, per week: 0.4333 (650/1500) (decreased)
The ratio for posts per member (per week) has gone up thanks to all those new members introducing themselves to the forum. However, the ratio for replies per member has decreased and hasn’t kept pace with the growth. Using these findings as a starter, a quick look around the community may reveal that all those new members are introducing themselves but are not engaging or replying as much.
If you cannot measure it, you cannot improve it.
– Lord Kelvin, 1883
Scenario 2 – Members Down But Is The Quality Better?
In another example, let’s say the community experiences a slight loss in membership, dropping to 900 members, but the number of posts and replies stays the same. The new ratios are:
Ratio of posts per member, per week: 0.0556 (50/900) (increased)
Ratio of replies per member, per week: 0.5556 (500/900) (increased)
The ratios above show that the community is actually looking very healthy, so the decrease in membership likely represents inactive members dropping off.
Ratios are a simple way to start putting context to data. In the first example, the community numbers increased, which looks great, but the number of replies didn’t keep pace with the growth – prompting some action or follow-up with the newer members. In example two, the overall numbers dropped, which looks negative on the surface, but the ratios show us that the community is actually healthier now.
(http://thecommunitymanager.com/communities-and-the-ratios-that-bring-insight)
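The ratio arithmetic from the baseline and the two scenarios above is simple enough to script. A minimal sketch, using the illustrative member and activity counts from the scenarios (not real Forge.Mil data); the function names are my own:

```python
# Compute per-member engagement ratios and compare them against a baseline.
# The member/post/reply figures are the illustrative numbers from the
# scenarios above, not real Forge.Mil data.

def engagement_ratios(members, posts, replies):
    """Return weekly posts-per-member and replies-per-member ratios."""
    return posts / members, replies / members

def compare(label, current, baseline):
    direction = ("increased" if current > baseline
                 else "decreased" if current < baseline else "unchanged")
    print(f"{label}: {current:.4f} ({direction} from baseline {baseline:.4f})")

base_posts, base_replies = engagement_ratios(1000, 50, 500)    # 0.05, 0.5

# Scenario 1: membership up, but replies not keeping pace
s1_posts, s1_replies = engagement_ratios(1500, 100, 650)
compare("Scenario 1 posts/member", s1_posts, base_posts)       # increased
compare("Scenario 1 replies/member", s1_replies, base_replies) # decreased

# Scenario 2: membership down, same activity
s2_posts, s2_replies = engagement_ratios(900, 50, 500)
compare("Scenario 2 posts/member", s2_posts, base_posts)       # increased
compare("Scenario 2 replies/member", s2_replies, base_replies) # increased
```

The point is not the code itself but the habit: recompute the same ratios every period against the same baseline, so a raw membership number never stands alone.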
Metrics Review: review your performance at least every month, and compare month-over-month (MoM%) and year-over-year (YoY%) growth rates in a spreadsheet.
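If a spreadsheet isn't handy, the MoM%/YoY% arithmetic is a one-liner. A hedged sketch, assuming you keep a simple monthly series of member counts (the numbers below are invented for illustration):

```python
# Month-over-month and year-over-year growth rates from a monthly series.
# The member counts below are invented for illustration.

def growth_pct(current, previous):
    """Percentage growth from the previous period to the current one."""
    return (current - previous) / previous * 100

monthly_members = [1000, 1050, 1120, 1500]  # hypothetical month-end counts

# MoM%: each month against the month before it
mom = [growth_pct(cur, prev)
       for prev, cur in zip(monthly_members, monthly_members[1:])]
print([f"{g:.1f}%" for g in mom])  # ['5.0%', '6.7%', '33.9%']

# YoY%: this month against the same month last year
# (requires at least 13 months of history)
def yoy(series, month_index):
    return growth_pct(series[month_index], series[month_index - 12])
```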
Other notes:
Here is some other relevant information, taken from the CollabNet community wiki: https://ctf.open.collab.net/sf/wiki/do/viewPage/projects.community-mgmt/wiki/CommunityMetrics
Community Metrics should be
- Based on agreed upon, communicated community goals
- Centrally managed by the Community Manager and Internal Community PM
- Communicated ‘up’ to stakeholders in a way that provides information about community goal progress, community health, and ROI
- Communicated ‘down’ to the community in a way that promotes the community and identifies opportunities for community growth
Community Metrics should be planned to come to an agreement on
- The various aspects of the goal: for example, the goal of reuse can include code component reuse, code snippet reuse, and knowledge reuse. Each aspect of the goal is itself a sub-goal that will be further analyzed and measured differently
- The questions around the goal: for example, the sub-goal of ‘code component reuse’ raises the question “How many times have reusable code components been reused?”
- The metrics to be collected regarding the goal: for example, ‘code component reuse’ could be measured by collecting the # of times reusable code components have been downloaded
- The methods employed to provide the goal results: for example, how/when will the metrics data be captured?
- Whether the metrics data should be trended over time: for example, should the metrics data be compared against other data for relevance and further analysis?
Measuring Community Health
Measuring community health is not about looking at traffic numbers or page views. A community is considered healthy when:
- The community goals are being met (for example, support questions are being answered)
- Attitudes in conversations are light and friendly
- Conversation is two-way or more, but not just single posts
- Issues are being resolved and needs are being met
Community health can NOT be determined by simple numbers. It takes a community manager, or several, to read through conversations.
A community is considered unhealthy when:
- Community goals are not met
- Conversations are non-existent
- Flame wars and arguments are taking over the community
- Community members are unhappy with the interaction, or lack thereof
Type of messaging that reveals the health of a community:
- Messages of appreciation
- Messages with solutions to problems posed by community members
- Requests for information are answered
- Conversations are friendly and sometimes lengthy
- Complaints are directly addressed by the community manager or other company staff
- Community leaders emerge from the conversations
Good luck!
Keep reading and writing! Keep asking questions and publish what you find; we can all use the help. Remember: people first, then context and scope (qualitative), and the quantitative should help with the qualitative. If there is an interest, I can post some information about tools, but most of the focus is on people, process, and methods. Cheers!