How We Address Developer Documentation Comments

One of the things I've wanted to do on this blog is give people a clearer view into how we actually produce developer documentation for SharePoint: the processes we use, the decisions we make, and the factors that influence those decisions. (The Office client developer docs team has written several great posts on similar topics over on the Office Client Developer Content blog.) With that in mind, when someone asked me the other day what we actually do with the comments people enter on our content on MSDN, I figured it was worth answering here.

 

The short answer is: we collect, classify, and prioritize them for action, pretty much like a software product team would triage bugs entered against their product.

 

And here's the long answer:

 

When we first receive a comment, we assign it a status, which is further broken down by a sub-status value. This lets us separate out and focus on the comments for which some action can be taken, and group them by what's being requested. The status and sub-status values we use include the following (there's a rough sketch of how this scheme might look in code right after the list):

· New

This is the status automatically assigned to comments that haven’t been classified yet.

· In progress

Comments that contain actionable content. The comment has been initially triaged and prioritized.

o Code sample

User wants a code sample to illustrate what the topic covers.

o Incorrect/Incomplete info

User points out that there is incorrect or incomplete information in the topic. (Not surprisingly, these are usually our first priority to fix.)

o More Info

User is asking for more information to be added to the topic.

o More Research Needed

We need to do more research on the comment before we can accurately categorize it.

o New Topic Requested

User is asking for a new topic, separate from the topic where they added the comment.

· No Action

Comments that require no action beyond the initial triage.

o Negative

Comments that are negative but not specific enough to be actionable.

o Noise

Unintelligible comments, such as random typing.

o Positive

Positive comment for which no action is necessary.

· Off-topic

Comments that don't apply to the topic itself, but might be actionable at another level.

o MSDN Issue

Comment refers to a wider MSDN issue.

o Noise

Comments that are intelligible, but have nothing to do with the topic ("I like cheese, do you like cheese?")

o Product Issue

Comment refers to a product issue, such as whether the user likes the feature the topic describes, rather than the documentation itself.

o SDK Issue

Comment refers to a larger issue with our content set that might also be present in this topic.

· Spam

Exactly what you'd expect: foreign financial scams, discounted prescriptions, offers to refinance our property at 1 Microsoft Way.

· Fixed

The actionable content of the comment has been addressed. For sub-status, we leave the value at what it was while the comment was In progress.
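
To make the scheme a little more concrete, here's a rough sketch of how you might model it in code. This is purely illustrative, not our actual tracking tooling; the TopicComment class and the mark_fixed helper are hypothetical, and the only thing taken from real life is the list of status and sub-status values above.

from dataclasses import dataclass
from datetime import date
from enum import Enum
from typing import Optional

class Status(Enum):
    NEW = "New"
    IN_PROGRESS = "In progress"
    NO_ACTION = "No Action"
    OFF_TOPIC = "Off-topic"
    SPAM = "Spam"
    FIXED = "Fixed"

class SubStatus(Enum):
    CODE_SAMPLE = "Code sample"
    INCORRECT_INCOMPLETE = "Incorrect/Incomplete info"
    MORE_INFO = "More Info"
    MORE_RESEARCH = "More Research Needed"
    NEW_TOPIC = "New Topic Requested"
    NEGATIVE = "Negative"
    NOISE = "Noise"
    POSITIVE = "Positive"
    MSDN_ISSUE = "MSDN Issue"
    PRODUCT_ISSUE = "Product Issue"
    SDK_ISSUE = "SDK Issue"

@dataclass
class TopicComment:
    topic_title: str
    text: str
    submitted: date
    status: Status = Status.NEW              # every comment starts out as New
    sub_status: Optional[SubStatus] = None   # assigned during triage

def mark_fixed(comment: TopicComment) -> None:
    # When the actionable content has been addressed, we flip the status to Fixed
    # but keep the sub-status the comment had while it was In progress, so we can
    # still report on what was originally requested.
    comment.status = Status.FIXED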

 

Once we’ve got them classified, we can properly triage those comments that require action, prioritizing them along with the other content feedback we get from newsgroups, blogs, MVPs, MSDN Community Content comments, internal and external partner groups, and various other channels.

 

Now that I've outlined what we do with the comments users enter, let's take a quick look at what those comments actually say. So what are people telling us about the WSS SDK? Looking at the comments entered for the WSS 3.0 SDK since it was published online, here's how they currently break down by status:

[Chart: WSS SDK all comments by status]

As you can see, 46% of the comments require no action beyond the initial triage. Of the actionable comments, about two-thirds are still being worked on in some form. Like I said, these comments are prioritized along with all the content suggestions we receive through other channels, so that our documentation team is always working on the highest-priority work items, regardless of the path each one took to get to us.

 

Looking a little closer at the comments still in progress, here's what you get when you break them down by sub-status:

 

[Chart: WSS SDK In Progress comments by sub-status]

 

Users are overwhelmingly asking for more information (52%) or code samples (26%); together those two categories account for 78% of the in-progress comments. This is in line with the feedback we've been getting through other channels, and it's something we've been actively working to address.
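
If you're wondering how a breakdown like that gets produced, there's nothing exotic to it. Continuing the hypothetical sketch from earlier (again, just an illustration, not our actual reporting code), it amounts to counting sub-statuses over the in-progress comments:

from collections import Counter
from typing import Dict, Iterable

def sub_status_breakdown(comments: Iterable[TopicComment]) -> Dict[SubStatus, int]:
    # Percentages are relative to the In progress comments only, which is how
    # the chart above is sliced.
    in_progress = [c for c in comments if c.status is Status.IN_PROGRESS]
    counts = Counter(c.sub_status for c in in_progress)
    return {sub: round(100 * n / len(in_progress)) for sub, n in counts.items()}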

 

When you include the comments we've fixed, here are the classifications for all actionable comments we've received to date:

 

[Chart: WSS SDK all comments by sub-status]

 

One other metric we look at from time to time is the trend of positive and negative comments over time. I'm really happy to see that, as we periodically republish the SDK with new, expanded, and revised material, positive comments have tended to increase at a faster rate than negative ones:

 

[Chart: WSS SDK No Action comments over time]
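
Under the covers this is just bucketing the Positive and Negative no-action comments by month and comparing the two series. In the same hypothetical sketch style, assuming each comment record carries its submission date:

from collections import defaultdict
from typing import Dict, Iterable

def feedback_trend(comments: Iterable[TopicComment]) -> Dict[str, Dict[str, int]]:
    # Month ("YYYY-MM") -> counts of Positive vs. Negative No Action comments.
    trend: Dict[str, Dict[str, int]] = defaultdict(lambda: {"Positive": 0, "Negative": 0})
    for c in comments:
        if c.status is Status.NO_ACTION and c.sub_status in (SubStatus.POSITIVE, SubStatus.NEGATIVE):
            trend[c.submitted.strftime("%Y-%m")][c.sub_status.value] += 1
    return dict(sorted(trend.items()))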

 

We do some additional data analysis on comments, but that’s the basics. So next time you’re looking at developer documentation on MSDN, take a moment to enter a comment to let us know how we’re doing. I guarantee we’re on the other end, listening.
