User manuals

For a brief overview of the system, please check this video: Youtube.jpg

About TQAuditor

TQAuditor 3.04 is a translation quality evaluation and monitoring system that allows you to:

  • compare the unedited translation made by a translator with the edited version received from an editor;
  • generate a report on the editor’s corrections;
  • classify each correction by mistake type and severity, thus obtaining a translation quality evaluation score with a maximum of 100 points;
  • ensure anonymous communication between the translator and the editor regarding corrections and mistake classification;
  • automate the process of managing the evaluation project;
  • save all evaluated translations in the database and create company-wide translation quality reports:

for example, you can list the translators with the highest scores, see quality dynamics per individual translator by month, etc.
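
The exact scoring rules depend on your quality standard settings (see the System settings chapter). Purely as an illustration, here is a minimal Python sketch of a penalty-based score capped at 100 points; the severity weights and the per-1000-words normalisation are assumptions, not TQAuditor's actual formula:

# Illustration only: a simple penalty-based quality score with a 100-point maximum.
# The severity weights and the per-1000-words normalisation are assumed values,
# not TQAuditor's actual formula.
SEVERITY_PENALTY = {"minor": 1, "major": 5, "critical": 10}

def quality_score(mistake_severities, sample_word_count):
    """mistake_severities: list of severity names; sample_word_count: evaluated source words."""
    penalty = sum(SEVERITY_PENALTY[s] for s in mistake_severities)
    # Normalise to a 1000-word sample so differently sized samples are comparable.
    return max(0.0, 100.0 - penalty * 1000 / sample_word_count)

print(quality_score(["minor", "minor", "major"], sample_word_count=1000))  # 93.0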

Quick compare without signing up

You can compare two versions of translated files in the system even without registering an account.

1. Go to https://tqauditor.com and click "Compare files":

Tq.png

It will open the Quick compare page.

2. Choose the translated and reviewed files, and click the "Update selected files" button:

Compare.png

TQAuditor 3.03 accepts bilingual files in various formats (Helium, XLF, XLZ, SDLXLIFF, TTX, TMX, etc.). Click "Supported bilingual file types" to see all the file formats TQAuditor works with.
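
The full list of supported formats is maintained inside the system; the sketch below only illustrates a simple extension check against the formats named above (the extensions, including .he for Helium, are assumptions and the list is not exhaustive):

from pathlib import Path

# Illustrative extension check; the extensions below (including .he for Helium)
# are assumptions, and TQAuditor's real "Supported bilingual file types" list is longer.
BILINGUAL_EXTENSIONS = {".he", ".xlf", ".xlz", ".sdlxliff", ".ttx", ".tmx"}

def looks_like_bilingual_file(path: str) -> bool:
    return Path(path).suffix.lower() in BILINGUAL_EXTENSIONS

print(looks_like_bilingual_file("translation.sdlxliff"))  # True
print(looks_like_bilingual_file("report.docx"))           # False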

3. The Comparison report page will be displayed:

Big page.png

Redirect.jpg Back to the table of contents.

Quick compare details

Here you can:

  • Delete the comparison report if its information is no longer needed:

1 delete comparison.png

  • Upload other files for a new comparison report by pressing Upload files:

2 upload files.png

  • Export the report to Excel by clicking Export to Excel:

1000px

  • With the Markup display option, you may choose how tags are displayed.
  • Full - tags keep their original length, so you can see the data inside them:

1 full.png 1.png

  • Short - tags are compressed and you see only their position in the text:

2 short.png 2.png

  • None - tags are totally hidden, so they will not distract you:

3 none.png 3.png

  • With the Units display option, you may choose which text segments are displayed.
  • All units - shows all text segments:

1 all.png 1 all text .png

  • With corrections - shows only the segments that were amended:

2 with corrections.png 2. not all .png

Press the "Apply" button after changing the preferences:

140px
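
To make the three markup modes easier to picture, here is a small hypothetical sketch of how they might render one tagged segment; TQAuditor's real tag handling is internal, so the code only mimics the effect described above:

import re

# Hypothetical illustration of the Full / Short / None markup display modes.
# TQAuditor's actual rendering is internal; this only mimics the described effect.
segment = 'See the <g id="1">user manual</g> for details<x id="2"/>.'

def render(segment: str, mode: str) -> str:
    if mode == "full":
        return segment                            # tags keep their original length
    if mode == "short":
        return re.sub(r"<[^>]+>", "[]", segment)  # only the tag positions remain
    if mode == "none":
        return re.sub(r"<[^>]+>", "", segment)    # tags are hidden completely
    raise ValueError(mode)

for mode in ("full", "short", "none"):
    print(mode, "->", render(segment, mode))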

Redirect.jpg Back to the table of contents.

Register an account

You’ll need to register an account to benefit from the full functionality of the system, e.g. comparing many file pairs at once, classifying mistakes, getting a translation quality score, enabling discussion between the translator and the editor, and seeing detailed reports on everything that happens with translation quality in your company.

1. Go to https://tqauditor.com and press "Sign Up":

Sign up.png

2. Fill in all the boxes, read our Privacy policy and Terms of service, mark the relevant checkbox and click the "Submit" button:

Screenshot 2.png

Redirect.jpg Back to the table of contents.

Add users

You can add users manually, one by one, or import their list from an Excel file. Each of these options is described below.

  • Add users manually

1. To add a new user, go to Users → New user:

Add users.png

2. Fill in all the fields and press the "Create" button:

Eva apple.png

Note: Only users with the "Can log in" checkbox marked can register, work in the system and receive the corresponding notifications.

You may also edit user details by clicking the user's ID:

Screenshot 23.png

There are 4 types of users with different roles in the system:

  • Translator
  • Evaluator
  • Manager
  • Administrator

  • Import users from Excel

If you already have a list of users with their contact details, you can easily import it without entering the information manually.

To do so, go to Users → Import from Excel:

Import from excel.png

For more detailed instructions on users import, please see the Import users from Excel page.
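
If you prepare the workbook programmatically, a rough sketch of reading it before import might look like the following; the column layout (Name, E-mail, Role) and the file name are assumptions for illustration only - the layout TQAuditor actually expects is described on the Import users from Excel page:

from openpyxl import load_workbook

# Hypothetical sketch: read a user list from an Excel file before importing it.
# The column order (Name, E-mail, Role) and the file name are assumptions;
# see the Import users from Excel page for the layout TQAuditor actually expects.
def read_users(path: str):
    sheet = load_workbook(path, read_only=True).active
    rows = sheet.iter_rows(min_row=2, values_only=True)   # skip the header row
    return [{"name": r[0], "email": r[1], "role": r[2]} for r in rows if r and r[0]]

for user in read_users("users.xlsx"):
    print(user)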

When all users are added, you can start working with projects.

Redirect.jpg Back to the table of contents.

Create project

1. Go to Projects → New project:

Projects.png

2. Enter the required information and press the "Create" button:

1000px

3. To check the project details, select the ID number:

Prdoject ID.png

The manager can control the project: edit and download files, reassign participants (manager, translator, evaluator or arbiter), or delete the project:

Project details.png

Note: Only the evaluator uploads the files. The manager may only download them if needed.

Redirect.jpg Back to the table of contents.

Create comparison report

After receiving an evaluation request from the system, the project evaluator has to compare edited files with unedited ones.

1. The evaluator uploads files:

1 ev upload files.png

2. When done, click "Create comparison report":

Create comparison reevport.png

Then you need to start the evaluation. There are two ways to do so.

Start evaluation (automatic word count)

If you select this option, the system will display a randomly selected sample containing only corrected units for evaluation.

Click Start evaluation (automatic word count):

Start evaluation automatic.png

Then you may configure the evaluation process:

  • Skip locked units - hide "frozen" units (for example, when the client wants certain parts that are extremely important to him to stay unchanged; besides, extra units slow down the editor’s work).
  • Skip repetitions - the system will hide repeated segments (only one of them will be displayed in this case).
  • Skip segments with match - the system will hide segments with a fuzzy match percentage greater than or equal to the value you specify.
  • Evaluation sample word count limit - the number of words in the edited segments chosen for evaluation.

Click "Start evaluation":

Start evaluation automatic world count.png
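
As a rough sketch of the selection behaviour these options describe, the following Python snippet filters out locked, repeated and high-match segments and then randomly picks corrected units until the word count limit is reached; the segment structure is hypothetical and this is not TQAuditor's actual algorithm:

import random

# Rough sketch of the sampling described above, using a hypothetical segment
# structure; TQAuditor's actual selection logic may differ.
def pick_sample(segments, word_limit=1000, match_threshold=100,
                skip_locked=True, skip_repetitions=True):
    seen_sources = set()
    candidates = []
    for seg in segments:
        if seg["translated"] == seg["edited"]:
            continue                               # only corrected units qualify
        if skip_locked and seg["locked"]:
            continue                               # skip "frozen" units
        if seg["match"] >= match_threshold:
            continue                               # skip high fuzzy-match segments
        if skip_repetitions and seg["source"] in seen_sources:
            continue                               # show each repetition only once
        seen_sources.add(seg["source"])
        candidates.append(seg)

    random.shuffle(candidates)                     # random selection
    sample, words = [], 0
    for seg in candidates:
        if words >= word_limit:
            break
        sample.append(seg)
        words += len(seg["source"].split())        # count source words
    return sample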

Start evaluation (manual word count)

If the word count given by the system does not match the word count you want to evaluate, you may manually enter the total word count before starting the evaluation.

Click Start evaluation (manual word count):

Start manual evaluation.png

Enter the number of evaluated source words:

1. start evaluation.png

Then press "Start evaluation" and the system will display all corrected segments of the document.

Complete evaluation

When you have evaluated all segments press "Complete evaluation":

400px

Describe the translation in general or give advice to the translator and press the "Complete" button:

400px

After the comparison report is generated, the system sends a notification to the translator, who can then see all the corrections made in their deliveries.

But this is just the beginning - the project evaluator may now start the quality assessment process. For more info, please see the next chapter.

Redirect.jpg Back to the table of contents.


Quality evaluation

In this step, the project evaluator has to select the sample for the quality assessment and classify every correction by type and severity.

Select "Add mistake":

800px

Add information about the mistake and click "Submit":

800px

You can also edit or delete a mistake or comment:

800px

You may add another mistake by pressing "Add mistake":

800px

When the mistake classification is done, the project evaluator has to press "Complete evaluation" => "Complete", and the system will send the quality assessment report to the translator. See the translator’s steps in the next chapter.

Please note that you may export the evaluation report with the mistake classification.
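
The export itself is a one-click action in the interface. Purely as an illustration of what a flat export of classified mistakes could look like, here is a sketch using openpyxl; the column names are invented and may differ from TQAuditor's actual Excel export:

from openpyxl import Workbook

# Hypothetical sketch of a flat mistake-classification export; the columns in
# TQAuditor's real Excel export may be named and ordered differently.
def export_mistakes(mistakes, path="evaluation_report.xlsx"):
    workbook = Workbook()
    sheet = workbook.active
    sheet.append(["Segment", "Mistake type", "Severity", "Comment"])
    for m in mistakes:
        sheet.append([m["segment"], m["type"], m["severity"], m.get("comment", "")])
    workbook.save(path)

export_mistakes([
    {"segment": 12, "type": "Grammar", "severity": "Minor", "comment": "Verb agreement"},
])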

Redirect.jpg Back to the table of contents.

Discussion of mistakes

When the project evaluator finishes assessing the translation quality, the project translator gets an email notification.

Translator reviews the quality feedback

After receiving the email with the translation quality evaluation, the translator should take the following steps:

1. View the Comparison report. Look through all the corrections made by the evaluator.

2. Go to the Evaluation report.

3. Look through the classification of each mistake.

4. If you agree with the classification of all mistakes, press "Complete project". The project and its evaluation score are finalized at this stage:

1000px

If you do not agree with the classification of some mistakes, take the following steps:

5. Press "Add comment" in the box of the mistake you do not agree with, enter your comment and click "Submit":

1000px

6. When you have entered all the comments, send the project for reevaluation by pressing "Request reevaluation":

1000px

7. The project will be sent to the evaluator, who will review your comments. If they are convincing, the evaluator will change the mistake severity in your favour. You will then receive the reevaluated project and can send it for reevaluation one more time.

8. If you have not reached an agreement with the evaluator, you can send the project to the arbiter by pressing "Request arbitration" (it appears instead of "Request reevaluation"):

1000px

9. The arbiter will provide a final score that cannot be disputed.
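
The whole review loop can be summarised as a small decision flow. The sketch below models it in plain Python; the function and state names are invented for illustration, and the round limit of 3 is the default mentioned in the Arbiter and Evaluation settings sections:

# Illustrative decision flow for the discussion loop described above; the names
# are invented, and 3 is the documented default number of reevaluation rounds.
MAX_REEVALUATION_ATTEMPTS = 3

def next_step(translator_agrees: bool, attempts_used: int) -> str:
    if translator_agrees:
        return "complete_project"         # the score is finalised
    if attempts_used < MAX_REEVALUATION_ATTEMPTS:
        return "request_reevaluation"     # the evaluator reviews the comments again
    return "request_arbitration"          # the arbiter gives the final, binding score

print(next_step(True, attempts_used=0))   # complete_project
print(next_step(False, attempts_used=1))  # request_reevaluation
print(next_step(False, attempts_used=3))  # request_arbitration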

Evaluator reviews the translator's comments

At this stage, the evaluator needs to review all the translator’s comments and objections. The evaluator should do the following:

1. If the translator is right, change the mistake’s severity and enter a comment explaining why it has been changed.

If the translator is wrong, enter a comment explaining why the mistake’s severity has not been changed.

2. To finish, press "Complete evaluation"=>"Complete". The project will be sent to the translator for review.

Arbiter reviews the project

Unless the system has been set up otherwise, the translator can return the project to the evaluator up to 3 times.

If the translator and the evaluator have not managed to reach an agreement after 3 attempts, the translator sends the project to the arbiter.


The user assigned as the arbiter will be notified by the system.


The arbiter has to assign a final score on the disputed matters: look through all the rows where the translator and the evaluator disagree.

If the translator is right, change the mistake’s severity and enter a comment explaining why it has been changed.

If the translator is wrong, enter a comment explaining why the mistake’s severity has not been changed.

1000px

Finally, the arbiter should press "Complete project". The project will be finalized and all its participants will receive the respective message.

500px

Redirect.jpg Back to the table of contents.

Projects filters

For convenience's sake, you may apply different project filters:

250px

For more details, please see the Additional filters section.

You may also sort projects by a particular criterion: click the title of any column and the projects will be ordered accordingly (the arrow Line up arrow.jpg button appears):

1000px

Note: The column headers which enable this sorting feature are highlighted in blue.

For more details, please see the Projects list page.

Redirect.jpg Back to the table of contents.

Reports

You can generate various reports in the system. Click the Reports menu on the top panel:

Reports.png

Users with the Translator role can access only their individual reports on their translation quality:

Users with the Evaluator role can access their individual reports on their translation quality and their evaluation reports:

Users with the Manager and Administrator roles can access all the available reports:

  • Average score reports

Here you may see average score reports: per translator and per translator company, per evaluator and per evaluator company, per manager and per manager company, and per specialization.

  • Translator report

On this page, you will find information about every translator.

For more details, please see the Translator report page.

  • Evaluator report

On this page, you will find information about every evaluator.

For more details, please see the Evaluator report page.

  • Translator company report
  • Evaluator company report

Redirect.jpg Back to the table of contents.

System settings

You can change and set system values in the System menu:

200px

Each of these menu screens is described below.

Quality standards

By default, the system has pre-defined quality standards, i.e. mistake types, penalty scores, etc., but you can change them to define your own corporate quality standards. To do so, go to System=>Quality standard:

700px

For more detailed information, please see the Quality standard page.
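
Conceptually, a quality standard is a mapping of mistake types and severities to penalty points. The sketch below shows one possible in-code representation of such a corporate standard; the values are invented and are not the system defaults:

# Invented example of a corporate quality standard: penalty points per
# mistake type and severity. These values are NOT TQAuditor's defaults.
QUALITY_STANDARD = {
    ("Grammar",     "Minor"):    1,
    ("Grammar",     "Major"):    5,
    ("Terminology", "Minor"):    2,
    ("Terminology", "Critical"): 10,
}

def total_penalty(mistakes):
    return sum(QUALITY_STANDARD[(m["type"], m["severity"])] for m in mistakes)

print(total_penalty([{"type": "Grammar", "severity": "Major"},
                     {"type": "Terminology", "severity": "Minor"}]))  # 7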

Mistake severities

Mistake severity is the gravity of a mistake.

Go to System=>Mistake severities:

400px

This menu screen contains two submenus. Each of them is described below.

  • Mistake severities list - here you may view the list of default mistake severities proposed by the system:

Mistake severities list.jpg

Note: You can’t delete mistake severities connected with projects. Instead, select the unneeded mistake severity by pressing "Edit" and uncheck the "Enabled" box; it will no longer appear in the drop-down list.

  • New mistake severity - here you may add a new mistake severity:

300px

For more details, please see the Mistake severities list and New mistake severity pages.

Redirect.jpg Back to the table of contents

Mistake types

Mistake type is the kind of mistake. For example, Grammar, Punctuation, etc.

Go to System=>Mistake types:

400px

This menu screen contains two submenus. Each of them is described below.

  • Mistake types list - here you may view the list of default mistake types proposed by the system:

Mistake types list.jpg

Note: You can’t delete mistake types connected with projects. Instead, select the unneeded mistake type by pressing "Edit" and uncheck the "Enabled" box; it will no longer appear in the drop-down list.

  • New mistake type - here you may add a new mistake type:

300px

For more details, please see the Mistake types list and New mistake type pages.

Redirect.jpg Back to the table of contents

Specializations

Specialization is the particular field a translation is focused on (the subject matter of the translation).

Go to System=>Specializations:

400px

This menu screen contains two submenus. Each of them is described below.

  • Specializations list - here you may view the list of default specializations proposed by the system:

400px

Note: You can’t delete specializations connected with projects. Instead, select the unneeded specialization by pressing "Edit" and uncheck the "Enabled" box; it will no longer appear in the drop-down list.

  • New specialization - here you may add a new specialization:

300px

For more details, please see the Specializations list and New specialization pages.

Redirect.jpg Back to the table of contents

Edit quality levels

Go to System=>Edit quality levels:

Edit QL menu.png

Here you may see the list of default quality levels proposed by the system:

700px

Note: You can’t remove quality levels connected with projects.

But you may add a new one (click "Add below") or edit the current quality levels:

700px
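
A quality level is essentially a label attached to a range of evaluation scores. As a hypothetical illustration (the thresholds and labels below are invented, not the system defaults):

# Hypothetical quality-level thresholds; the labels and score ranges are invented
# and differ from the defaults proposed by the system.
QUALITY_LEVELS = [(99, "Excellent"), (95, "Good"), (85, "Acceptable"), (0, "Poor")]

def quality_level(score: float) -> str:
    for threshold, label in QUALITY_LEVELS:
        if score >= threshold:
            return label
    return QUALITY_LEVELS[-1][1]

print(quality_level(96.5))  # Good
print(quality_level(80.0))  # Poor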

Redirect.jpg Back to the table of contents

Evaluation settings

Go to System=>Evaluation settings:

Evaluation settings menu new.png

On the Evaluation settings page, you may define the maximum number of evaluation attempts and the default evaluation sample word count limit:

300px

  • Maximum evaluation attempts - here you may define how many times the translator may argue in the discussion with the evaluator. By default, the translator may leave comments 3 times: the evaluator replies to the first 2, and on the 3rd the arbiter replies and completes the discussion.

  • Default evaluation sample word count limit - here you may define the number of words for evaluation (the system offers 1000 words by default).
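
These two values drive the sampling and discussion behaviour described in the earlier chapters. A minimal sketch of how they might be represented and used (the key names are invented; only the defaults of 3 attempts and 1000 words come from this page):

# Invented representation of the two settings above; only the default values
# (3 attempts, 1000 words) come from the documentation.
EVALUATION_SETTINGS = {
    "max_evaluation_attempts": 3,              # discussion rounds before arbitration
    "default_sample_word_count_limit": 1000,   # words offered for evaluation by default
}

def needs_arbitration(attempts_used: int) -> bool:
    return attempts_used >= EVALUATION_SETTINGS["max_evaluation_attempts"]

print(needs_arbitration(2))  # False
print(needs_arbitration(3))  # True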

Redirect.jpg Back to the table of contents

Reminders

Go to System=>Reminders:

200px

The system can remind users about projects or close them automatically when needed.

700px

As you can see, the system is quite flexible in terms of settings, and you may configure it to work a bit differently than it does by default.

Redirect.jpg Back to the table of contents

Notifications

Go to System=>Notifications:

200px

Here you can configure whether the system should send notifications about comparison report creation to translators and about project completion to evaluators:

Email notification settings.jpg

Enable or disable the corresponding notification, and press "Update" to save changes.

Redirect.jpg Back to the table of contents

License

Go to System=>License:

200px

On this page, you may see your license details:

License page.jpg

By pressing "Manage license" you can manage your license.

For more details, please see the Licensing page.


That’s all you need for productive work in TQAuditor.

If any questions arise, please contact us.

Good luck!

Redirect.jpg Back to the table of contents