CAN REALISTIC JOB DESCRIPTION INFORMATION AND PRACTICE ENABLE NAIVE RATERS TO PROVIDE POSITION ANALYSIS QUESTIONNAIRE (PAQ) RATINGS COMPARABLE TO THOSE OF EXPERTS?
Doctor of Philosophy thesis
Jones, Main, Butler, and Johnson (1982) reported that job-naive raters provided with only narrative job descriptions can produce valid and reliable Position Analysis Questionnaire (PAQ) ratings. This implies that traditional time- and labor-intensive methods of collecting job analysis information (e.g., interviews, direct observation) are not necessary to complete the PAQ accurately. However, the PAQ ratings in the Jones et al. study were not validated against an external standard, making unambiguous interpretation of their results impossible. To determine the convergent validity of the Jones et al. approach, we provided job-naive raters with varying amounts of job descriptive information and, in some cases, prior practice rating the job with another job analysis instrument; their PAQ ratings were validated against those of job analysts who were also job content experts. Neither the reduced job descriptive information conditions nor prior practice enabled job-naive raters to achieve acceptable convergent validity with the experts or high interrater reliability.