Evaluation report
Revision as of 14:41, 27 January 2020
General information
After the evaluator has uploaded the files, they can start the evaluation.
The evaluator can start the evaluation either with an automatic word count or by entering the word count manually when starting the process:
For more information on both methods, please check the relevant sections below.
Start evaluation (automatic word count)
If you select this option, the system will display randomly selected segments containing only corrected units for evaluation:
Then you may configure the evaluation process:
- Skip repetitions—the system will hide repeated segments (only one of them will be displayed in this case).
- Skip locked units—the program will hide "frozen" units. For example, the client may want certain critical parts to stay unchanged. Moreover, extra units slow down the editor’s work.
- Evaluation sample word count limit—the number of words in edited segments chosen for evaluation.
Having applied the settings you need, press "Start evaluation" to initiate the process.
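To illustrate, the sampling options above could be combined roughly like this. This is a simplified, hypothetical sketch (all field names and the function itself are assumptions, not the tool's actual implementation):

```python
import random

def build_sample(segments, skip_repetitions=False, skip_locked=False,
                 word_limit=500):
    """Randomly pick corrected segments for evaluation until the
    word limit is reached (a simplified, hypothetical sketch)."""
    # Only corrected units are eligible for evaluation.
    pool = [s for s in segments if s["corrected"]]
    if skip_locked:
        # Hide "frozen" units the client does not want changed.
        pool = [s for s in pool if not s["locked"]]
    if skip_repetitions:
        seen, unique = set(), []
        for s in pool:
            if s["source"] not in seen:   # keep only the first occurrence
                seen.add(s["source"])
                unique.append(s)
        pool = unique
    random.shuffle(pool)                  # random selection of segments
    sample, words = [], 0
    for s in pool:
        if words >= word_limit:           # stop at the word count limit
            break
        sample.append(s)
        words += len(s["source"].split())
    return sample
```

The order of the filters here (locked units first, then repetitions) is a design choice of the sketch, not something the documentation specifies.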
Start evaluation (manual word count)
If the word count given by the system does not correspond to the word count you want, you may manually enter the total word count before starting evaluation.
To do this, press "Start evaluation (manual word count)":
Enter the number of evaluated source words:
Then press "Start evaluation" and the system will display all segments of the document.
Thus, you will be able to select the segments for evaluation on your own.
Mistakes
You can press "Add mistake":
You may describe it:
You may also edit or delete a mistake/comment:
Or add another mistake by pressing "Add mistake":
- View in comparison — this link redirects you to the page with the Comparison report:
When the mistake classification is done, the project evaluator has to press "Complete evaluation":
The evaluator may describe the translation in general or give advice to the translator, then press the "Complete" button:
Note: If you press "Complete" and no mistakes are added to the report, the system will warn you:
Buttons and filters
On the left side of the screen, different buttons and filters are displayed:
- Complete evaluation - the button that finishes the evaluation process.
- Evaluation report - opens the evaluation report view.
- Delete evaluation report - deletes the evaluation report.
- Comparison report - opens the comparison report view.
- Project details - basic information about the project.
- Project files - the original translation and the amended translation.
Markup display
The Markup display option defines how tags are displayed:
- Full - tags retain their original length, so you can see the data within:
- Short - tags are compressed, and you see only their position in the text:
- None - tags are completely hidden, so they will not distract you:
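The three display modes can be pictured with a small sketch. The tag format below (`<g id="1">…</g>`) is an assumption for illustration; real CAT-tool tags vary, and this is not the tool's actual rendering code:

```python
import re

# Assumed inline-tag format for the illustration.
TAG = re.compile(r"</?\w+[^>]*>")

def render(text, markup_display="Full"):
    """Render a segment under the three Markup display modes (sketch)."""
    if markup_display == "Full":
        return text                       # tags shown at original length
    if markup_display == "Short":
        return TAG.sub("[tag]", text)     # only the tag position is shown
    return TAG.sub("", text)              # "None": tags are hidden entirely
```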
Units display
- All - all text segments are displayed:
- With mistakes - only text segments with mistakes are displayed:
- Last commented by evaluator - only text segments last commented by the evaluator are displayed:
- Last commented by translator - only text segments last commented by the translator are displayed:
- Last commented by arbiter - only text segments last commented by the arbiter are displayed:
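The filter choices above amount to selecting units by a predicate. A minimal sketch, assuming a hypothetical data model in which each unit records its mistakes and who commented on it last:

```python
def filter_units(units, units_display="All"):
    """Apply the Units display filter (illustrative sketch only;
    the field names are assumptions, not the tool's real data model)."""
    if units_display == "All":
        return units
    if units_display == "With mistakes":
        return [u for u in units if u["mistakes"]]
    # "Last commented by evaluator/translator/arbiter"
    role = units_display.split()[-1]
    return [u for u in units if u.get("last_commented_by") == role]
```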
Export to Excel
You may export the report to Excel by pressing the "Export to Excel" link in the upper right corner of the report:
The fixes will be arranged in columns for review.
Please note that the rows with mistakes are highlighted in red, with an indication of their types and severities:
Note: If you apply the Units display filter, only the filtered data will be exported.
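The export behavior described above (only the filtered data is exported, and mistake rows carry type/severity information) could be sketched as follows. The row layout and field names are assumptions for illustration, not the real exporter:

```python
def export_rows(units, units_display="All"):
    """Build export rows; only the currently filtered units are
    exported, mirroring the note above (hypothetical sketch)."""
    if units_display == "With mistakes":
        units = [u for u in units if u["mistakes"]]
    rows = [("Source", "Translation", "Mistakes")]
    for u in units:
        # Rows with mistakes carry their types and severities
        # (highlighted in red in the real Excel file).
        mistakes = "; ".join(f"{m['type']} ({m['severity']})"
                             for m in u["mistakes"])
        rows.append((u["source"], u["target"], mistakes))
    return rows
```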
Evaluation and comparison details
Here you may also find Evaluation and Comparison details, such as:
Evaluation sample details:
- Total units - the number of text segments in the sample.
- Total source words - the total number of words in the sample.
- Total mistakes - the general number of mistakes.
Evaluation details:
- Skip locked units - hidden, "frozen" units (for example, the client may want certain critical parts to stay unchanged; moreover, extra units slow down the editor’s work).
- Skip segments with match >= - the predefined fuzzy match percentage (the program hides segments with a match greater than or equal to the value you specified).
- Total units - the total number of text segments.
- Corrected units - the number of segments with amendments.
- Total source words - the total number of words in the source.
- Source words in corrected units - the number of source words in amended segments.
- Quality score - a composite index of translation quality that depends on the total number of words, specialization, severity of mistakes, etc.
- Quality level - the translator's rating based on the quality score.
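The exact quality score formula is not documented here. As an illustration only, a common approach in translation QA is weighted penalty points per evaluated word; the weights and level thresholds below are assumptions, not the tool's real values:

```python
# Assumed severity weights (illustrative, not the tool's real values).
SEVERITY_WEIGHTS = {"Minor": 1, "Major": 5, "Critical": 10}

def quality_score(mistake_severities, total_source_words):
    """Return a 0-100 score from mistake severities (sketch only)."""
    penalty = sum(SEVERITY_WEIGHTS[s] for s in mistake_severities)
    return max(0.0, 100.0 * (1 - penalty / total_source_words))

def quality_level(score):
    """Map a score to a coarse quality level (thresholds are assumed)."""
    if score >= 99:
        return "Excellent"
    if score >= 95:
        return "Good"
    if score >= 85:
        return "Acceptable"
    return "Poor"
```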
Comparison details:
- Total units - the total number of segments.
- Corrected units - the number of segments with amendments.
- Total source words - the total number of words in the source.
- Source words in corrected units - the number of source words in amended segments.
Reevaluation and arbitration requests
When the mistake classification is done, the project evaluator has to press "Complete evaluation" => "Complete",
and the system will send the quality assessment report to the translator.
When the translator receives this report and looks through the classification of each mistake, they may press "Complete project"
(if they agree with the evaluator; in this case, the project will be completed) or "Request reevaluation" (if they disagree):
The project will be sent to the evaluator, who will review the translator’s comments.
If they are convincing, the evaluator may change the mistake severity. The translator will then receive the reevaluated project.
The translator can send this project for reevaluation one more time.
If an agreement between the translator and the evaluator is not reached, the translator can send the project to the arbiter
by pressing "Request arbitration" (which appears instead of "Request reevaluation"):
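The review workflow described above can be pictured as a small state machine. The one-reevaluation-then-arbitration rule comes from the text; the state names and the class itself are a simplification for illustration:

```python
class EvaluationWorkflow:
    """Sketch of the reevaluation/arbitration flow (hypothetical)."""

    def __init__(self):
        self.state = "evaluated"          # report sent to the translator
        self.reevaluations = 0

    def translator_agrees(self):
        self.state = "completed"          # translator presses "Complete project"

    def translator_disagrees(self):
        if self.reevaluations < 2:
            # Translator can request reevaluation, and one more time after that.
            self.reevaluations += 1
            self.state = "reevaluation"   # evaluator reviews the comments
        else:
            # "Request arbitration" appears instead of "Request reevaluation".
            self.state = "arbitration"

    def evaluator_responds(self):
        self.state = "evaluated"          # report sent back to the translator
```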