rater



rat·er

 (rā′tər)
n.
1. One that rates, especially one that establishes a rating.
2. One having an indicated rank or rating. Often used in combination: a third-rater; a first-rater.

rater

(ˈreɪtə)
n
1. a person or instrument that rates something
2. a person or thing with a designated or specific rating

rat•er

(ˈreɪ tər)
n.
a person or thing that is of a specific rating (usu. used in combination): The show's star is a first-rater.
[1605–15]
References in classic literature
As to the 52-foot linear raters, praised so much by the writer, I am warmed up by his approval of their performances; but, as far as any clear conception goes, the descriptive phrase, so precise to the comprehension of a yachtsman, evokes no definite image in my mind.
I am disposed to admire and respect the 52-foot linear raters on the word of a man who regrets in such a sympathetic and understanding spirit the threatened decay of yachting seamanship.
References in periodicals
A FACETS analysis of rater bias in measuring Japanese second language writing performance.
By convention, the results of the first rater are shown in the rows (x values) and the results of the second rater in the columns (y values); a short code sketch of this layout follows the excerpts below.
Dallabrida explained, "Use of the rater version of the C-SSRS digitally on a SitePad tablet is advantageous over completion on paper, which has been demonstrated to be prone to transcription errors, data discrepancies, and delayed analysis."
Secondly, the rater has to match the inferences about the candidate's past job behaviors, drawn from the words listed on the resume, to the types of behaviors needed on the job.
1) Field level effects, which include the larger institutional context (e.g., program, university, or corporation), pre-assessment teaching conditions, and status differences among raters; 2) Room level effects, which are the organization and purpose of the particular assessment; 3) Table level effects, which are the social ecology of each table of raters; and 4) Rater level effects, which are raters' cognitive and affective reactions to Field, Room, and Table effects.
Inter-rater reliability (Cronbach's alpha) was (Table-II) 0. A sketch of how this statistic is computed follows the excerpts below.
The raters were blinded to one another's results, and data were not made available to any rater until all data collection was completed.
If the frame rate is not controlled while watching the video frame by frame, a rater may incorrectly score a severe pause between jumps when there is no flaw present.
The difference in inter-rater reliability may be because the raters have different notions of how to assess clinical reasoning.
Braun developed an interactive online reinforcement program to maintain nurse rater proficiency between time gaps in checklist use and to help ensure consistency of results.
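The row-and-column convention in the agreement-table excerpt above (first rater in rows, second in columns) is straightforward to make concrete. Below is a minimal Python sketch, not drawn from any of the cited sources: the two score vectors are hypothetical, and Cohen's kappa is used only as a familiar chance-corrected agreement statistic computed from such a table.

```python
import numpy as np

# Hypothetical categorical scores from two raters on the same ten subjects.
rater1 = np.array([0, 1, 2, 1, 0, 2, 1, 1, 0, 2])
rater2 = np.array([0, 1, 2, 0, 0, 2, 1, 2, 0, 2])

categories = np.union1d(rater1, rater2)
k = len(categories)

# Contingency table with the first rater in the rows (x values)
# and the second rater in the columns (y values), per the convention above.
table = np.zeros((k, k))
for a, b in zip(rater1, rater2):
    table[np.searchsorted(categories, a), np.searchsorted(categories, b)] += 1

n = table.sum()
p_obs = np.trace(table) / n                   # observed agreement (diagonal)
p_exp = table.sum(1) @ table.sum(0) / n**2    # agreement expected by chance
kappa = (p_obs - p_exp) / (1 - p_exp)         # Cohen's kappa
print(f"observed = {p_obs:.2f}, expected = {p_exp:.2f}, kappa = {kappa:.2f}")
```

With these hypothetical scores the raters agree on 8 of 10 subjects, giving a kappa of about 0.71 once chance agreement is subtracted out.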
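One excerpt above reports inter-rater reliability as Cronbach's alpha. As a minimal sketch of how that statistic can be computed when each rater is treated as an "item" (the ratings matrix is hypothetical, not data from the cited study):

```python
import numpy as np

# Hypothetical ratings: rows are subjects, columns are raters.
scores = np.array([
    [4, 5, 4],
    [2, 2, 3],
    [5, 5, 5],
    [3, 4, 3],
    [1, 2, 2],
], dtype=float)

k = scores.shape[1]                          # number of raters
rater_vars = scores.var(axis=0, ddof=1)      # variance of each rater's scores
total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scores
alpha = (k / (k - 1)) * (1 - rater_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")     # ~0.96 for this toy matrix
```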