How to interpret OCS 2007 R2 Monitoring reports


After deploying the Monitoring Server role in your OCS organization, you start getting reports and data to analyze. In this blog post I will briefly explain how to interpret the data in the monitoring reports.

For details on how to deploy the Monitoring role, see my earlier post on the subject.

The MOS value

In the monitoring reports you see a MOS score for each instance that is measured. The MOS score is based on an Absolute Category Rating (ACR), a scale that runs from 1 to 5, where 5 represents an excellent experience. In a manual process, users are asked to rate the quality of their experience. When all users have rated their experience, the average of these values is calculated as a mean opinion score (MOS). Although MOS scores are not a perfect representation of the listening experience, they do make it possible to compare and contrast listening experiences.

If group A reports an MOS of 4.1 and group B reports an MOS of 2.2, it is safe to say that, on average, listeners in group A had a much better experience than listeners in group B. 
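The averaging behind a MOS value is simple arithmetic. The sketch below (with made-up rating lists, chosen only to reproduce the 4.1 and 2.2 from the example above) shows how a set of 1-to-5 ratings collapses into a group score:

```python
# Illustrative only: a MOS is the arithmetic mean of individual
# 1-5 quality ratings. These rating lists are hypothetical.
def mos(ratings):
    """Mean opinion score: average of individual 1-5 ratings."""
    return round(sum(ratings) / len(ratings), 1)

group_a = [4, 5, 4, 4, 3, 5, 4, 4, 4, 4]   # mostly "good" to "excellent"
group_b = [2, 2, 3, 2, 1, 3, 2, 2, 3, 2]   # mostly "poor" to "fair"

print(mos(group_a))  # 4.1
print(mos(group_b))  # 2.2
```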

The Monitoring server does not actually ask users to rate their listening experiences on a scale of 1 to 5; instead, it uses a series of algorithms to predict how users would rate the quality of each listening experience. Based on those algorithms, the Monitoring server reports several MOS scores.

The MOS scores

  • Listening MOS – A prediction of the wideband quality of an audio stream as played to a user. The score takes into account audio fidelity and distortion as well as speech and noise levels.
  • Sending MOS – A prediction of the wideband quality of an audio stream as sent from a user. The score takes into account audio fidelity and distortion as well as speech and noise levels.
  • Network MOS – Another prediction of the wideband quality of an audio stream played to a user. In this case, however, only network factors are considered, such as the audio codec used, packet loss, packet errors, and jitter (the variation in delay of packets arriving at a destination).
    • NOTE: Latency should not exceed 150 ms. In my experience, latency up to 300 ms still gives a satisfying experience, as long as jitter is under control.
  • Conversational MOS – A prediction of the narrowband conversational quality of the audio stream played to the user. This value indicates how a large group of people would rate the quality of the connection for holding a conversation.
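To make the latency and jitter numbers concrete, here is a minimal sketch (hypothetical packet timings, not the Monitoring server's actual algorithm) of how both can be derived from a series of one-way packet delays. Jitter is simplified here to the mean absolute difference between consecutive delays:

```python
# Hypothetical one-way packet delays in ms (not real QoE data).
# Note the 95 ms outlier, which drives up the jitter.
delays_ms = [48, 52, 47, 95, 50, 49, 53, 51]

# Average latency: the mean of the one-way delays.
latency = sum(delays_ms) / len(delays_ms)

# Jitter (simplified): mean absolute difference between consecutive delays.
diffs = [abs(b - a) for a, b in zip(delays_ms, delays_ms[1:])]
jitter = sum(diffs) / len(diffs)

print(f"latency {latency:.1f} ms, jitter {jitter:.1f} ms")
```

A stream like this would pass the 150 ms latency guideline, but the delay spike shows why jitter has to be checked separately from the average.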

NOTE: Narrowband refers to audio codecs that use an 8-kHz sample rate. Wideband refers to audio codecs that use a 16-kHz sample rate. Telephone-quality communication is normally categorized as narrowband.
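The sample rates in the note map directly to audible bandwidth via the Nyquist limit (the highest frequency a sampled signal can represent is half the sample rate), which is why wideband audio sounds noticeably fuller than telephone-quality narrowband. A quick check:

```python
# Nyquist limit: the highest representable frequency is half the sample rate.
def max_audio_bandwidth_khz(sample_rate_khz):
    return sample_rate_khz / 2

print(max_audio_bandwidth_khz(8))   # narrowband codec: 4.0 kHz of audio
print(max_audio_bandwidth_khz(16))  # wideband codec: 8.0 kHz of audio
```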

For the complete documentation on how to deploy and use the Monitoring Server, download the Microsoft® Office Communications Server 2007 Quality of Experience Monitoring Server Guide.

Technical References:

OCS Quality of Experience (QoE) – Quick Facts

Here is a blog that covers some facts about QoE and the OCS Monitoring role. Three points in particular stood out to me:

  • The OCS 2007 R2 Monitoring role service and database can be collocated on a computer running OCS Standard Edition (very small deployments only). If you do this, the full edition of SQL Server must be installed on the server (instead of the SQL Server Express Edition that is normally used).
  • SQL Server 2005 Reporting Services SP1 or SP2 must be installed against the back-end QoE/Monitoring SQL database in order to get reports.
  • For the OCS R2 Monitoring role, you need to install the optional Report Pack for Monitoring Server component.
    • When installing the Report Pack, you must point to a SQL Server with Reporting Services installed.
    • Reporting Services does not have to be installed on the SQL Server hosting the monitoring database; you can point to any SQL Server with Reporting Services installed in your domain.

Visit the full post for more information.