SNOMED-CT German Validation Form
title
Concept ID: 260152009
coid
INSTRUCTIONS: On the basis of the following descriptions in English, Spanish, and Swedish, please assign one label (Correct, Acceptable, or Wrong) to each German translation candidate. If all candidates are wrong, you can suggest your own translation at the bottom of the form.
help
line1
English description:
Cat dander
Spanish description:
caspa de gato
Swedish description:
kattmjäll
line2
Katzenschuppen
Correct
Acceptable
Wrong
candidate_1
Katze Schuppen
Correct
Acceptable
Wrong
candidate_2
Cat dander
Correct
Acceptable
Wrong
candidate_3
Katze rose
Correct
Acceptable
Wrong
candidate_4
Katzenfell
Correct
Acceptable
Wrong
candidate_5
line3
Insert your translation here:
own
Submit
submit
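
The form above collects one label per candidate plus an optional free-text suggestion. Below is a minimal sketch, in Python, of how one completed response to this form might be represented and checked; the class, field names, and assigned labels are illustrative assumptions, not taken from the form's actual backend or from any annotator's judgement.

```python
from dataclasses import dataclass
from typing import Optional

# Allowed labels, exactly as offered by the form's radio buttons.
LABELS = {"Correct", "Acceptable", "Wrong"}


@dataclass
class ValidationRecord:
    """One completed validation form for a single SNOMED-CT concept (sketch)."""
    concept_id: str                         # concept ID shown in the form header
    candidate_labels: dict                  # German candidate text -> assigned label
    own_translation: Optional[str] = None   # free-text suggestion if all candidates are wrong

    def validate(self) -> None:
        # Every candidate must carry exactly one of the three allowed labels.
        for candidate, label in self.candidate_labels.items():
            if label not in LABELS:
                raise ValueError(f"Invalid label {label!r} for candidate {candidate!r}")


# Example using the candidates shown on this form; the labels here are
# placeholders for illustration only, not actual annotation results.
record = ValidationRecord(
    concept_id="260152009",
    candidate_labels={
        "Katzenschuppen": "Correct",
        "Katze Schuppen": "Wrong",
        "Cat dander": "Wrong",
        "Katze rose": "Wrong",
        "Katzenfell": "Acceptable",
    },
)
record.validate()
```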