MeetingSphere Rating supports all customary rating methods. It consists of a family of interrelated workspaces, namely rating sheets, results tables and charts, by which groups can, directly in the meeting,
assess ideas, opinions, facts (or whatever) by multiple criteria
make consensus or dissent visible
The results of rating are available immediately, both as results tables and as charts. This lets you focus on what is most promising, urgent or important instead of wasting time on lesser topics.
After rating, you'll know where you agree. If you disagree on certain items, this may be nothing to worry about. If it matters, you may want to pause and spend a minute or two on why you are apart.
The Rating sheet
MeetingSphere's rating sheet lists any number of items for rating. Rated items could be products, quotations or the ideas of a prior brainstorming session. You paste items into the rating sheet or create them one by one.
The rating method is up to you, as MeetingSphere provides all customary rating methods.
With MeetingSphere One, you can rate any list of items on up to 3 criteria, for instance, 'effectiveness', 'feasibility' and 'cost'. MeetingSphere Pro supports any number of criteria.
Anonymity. Usually, you'll rate with full anonymity, simply to get the most honest assessment.
Rating 'by team'. Sometimes it makes sense to have participants pick a team before they submit their rating sheets, so you can analyze the results 'by team'. Personal anonymity remains assured.
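To make the multi-criteria idea concrete, here is a minimal sketch of how scores on several criteria can be averaged per item. All names, data and the 1-to-5 scale are invented for illustration; this is not MeetingSphere's internal computation or API.

```python
from statistics import mean

# Hypothetical anonymous rating sheets: each participant scores every
# item on three criteria (as named in the text above) on a 1-to-5 scale.
CRITERIA = ("effectiveness", "feasibility", "cost")

sheets = [  # one dict per (anonymous) participant
    {"Idea A": (5, 4, 3), "Idea B": (2, 5, 4)},
    {"Idea A": (4, 4, 2), "Idea B": (3, 4, 5)},
]

# Average each criterion across all submitted sheets, per item.
averages = {
    item: {
        crit: mean(sheet[item][i] for sheet in sheets)
        for i, crit in enumerate(CRITERIA)
    }
    for item in sheets[0]
}

for item, scores in averages.items():
    print(item, scores)
```

With more criteria (as in MeetingSphere Pro), only the `CRITERIA` tuple and the score tuples would grow; the aggregation stays the same.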
MeetingSphere delivers the results of your rating instantly. Whether you and your group prefer to use tables or charts, appraisal of the results usually centers on three questions:
What are the top-scoring items?
What items are irrelevant for the purpose of this meeting?
Where do we (dis)agree?
Analyzing the results of a single rating
For each rating sheet, there is a corresponding results table and chart.
(Single) Results tables
Results tables differ by rating method. For the most popular rating method, the numeric scale, results tables give not just the mean (average) rating for each item but also the details, i.e. how many participants selected each particular value.
You can highlight dissent by marking up high values of (normalized) standard deviation.
If you have let your participants pick 'teams' before submitting their rating sheets, you can display the table 'by team', compare teams, check how the views of a specific team compare with those of the whole group, and so on. It sometimes helps to know whether dissent runs through the whole group or between, say, departments, especially if you want to build consensus.
For analysis, you can either share your screen and walk the group through the results or let participants peruse and analyze the table independently.
(Single) Results charts
The results chart maps the results onto bars whose length corresponds to the mean (average) value assigned.
If you have rated 'by team', you can display teams separately, compare teams with one another, or compare them with the group as a whole. For this, the bars of the different teams are color-coded.
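The 'by team' comparison described above boils down to grouping anonymous scores by their team tag and averaging per group. A minimal sketch with invented teams and scores (not MeetingSphere's internal data model):

```python
from statistics import mean

# Hypothetical anonymous votes tagged only with a team label; the
# individual raters stay unidentified. Teams and scores are invented.
votes = [
    ("Sales", 5), ("Sales", 4), ("Sales", 5),
    ("Engineering", 2), ("Engineering", 3), ("Engineering", 2),
]

# Group the scores by team, then compare each team's mean with the
# mean of the whole group.
by_team = {}
for team, score in votes:
    by_team.setdefault(team, []).append(score)

group_mean = mean(score for _, score in votes)
team_means = {team: mean(scores) for team, scores in by_team.items()}

for team, m in team_means.items():
    print(f"{team}: {m:.2f} (group: {group_mean:.2f})")
```

Here the group mean of 3.50 hides the fact that the two teams sit far apart, which is exactly the kind of split the color-coded team bars make visible.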
Documentation - Export
Results tables and charts are included in the Word Report, which you can create at the push of a button.
You can customize this so that
only a specified top or bottom range of results is included
results are given by 'team' (when rating has occurred with 'team' tags)
Results tables can likewise be exported to Excel at the push of a button.