PSYD33H3 Chapter Notes - Chapter 0: Inter-Rater Reliability, Convergent Validity, Theoretical Definition

Document Summary

Reading 1: Reliability and Validity of Measurement
https://opentextbc.ca/researchmethods/chapter/reliability-and-validity-of-measurement/

Interrater reliability: the extent to which different observers are consistent in their judgments. For example, you could record two university students as they interact with one another, then have two or more observers watch the videos and rate each student's level of social skills.

Content validity: the extent to which a measure covers the construct of interest, assessed by carefully checking the measurement method against the conceptual definition of the construct.

Discriminant validity: the extent to which scores on a measure are not correlated with measures of variables that are conceptually distinct.

Main takeaways: psychological researchers do not simply assume that their measures work. Instead, they conduct research to show that they work, and if they cannot show that they work, they stop using them. There are two distinct criteria by which researchers evaluate their measures: reliability and validity.
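As a rough illustration of the rater example above, here is a minimal sketch (hypothetical data and variable names, not from the reading) that quantifies inter-rater agreement as the Pearson correlation between two observers' ratings of the same students. Real studies often use statistics such as Cronbach's alpha or Cohen's kappa instead, but a simple correlation shows the idea: consistent judgments produce a value near +1.

```python
# Minimal sketch of assessing inter-rater reliability, assuming two observers
# each rated the same ten students' social skills on a 1-10 scale.
# The ratings below are hypothetical, made up for illustration only.
import numpy as np

# One entry per student, in the same order for both observers.
observer_a = np.array([7, 5, 8, 6, 9, 4, 7, 6, 8, 5])
observer_b = np.array([6, 5, 9, 6, 8, 4, 7, 7, 8, 6])

# Pearson correlation between the two observers' ratings:
# values near +1 suggest the observers are consistent in their judgments.
interrater_r = np.corrcoef(observer_a, observer_b)[0, 1]
print(f"Inter-rater correlation: {interrater_r:.2f}")
```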
