Annotation Work With CredCo

The Credibility Coalition is laying the groundwork for further explorations of article credibility on the web. We are seeking students who are passionate about tackling misinformation, assessing the credibility of news and information, and understanding the role artificial intelligence plays in this space.

We are developing future rounds of test data (you can find our initial study here and more info here), and we are seeking help with annotating articles, the first building block of designing artificial intelligence for information credibility. Student annotators will help develop our data by reviewing articles and marking them up with a simple set of indicators.

An interest in journalism is preferred, but applicants from any background, including engineering, computer science, politics, linguistics, and sociology, are welcome to apply.

Each annotator’s work takes about 10 hours. All work is done remotely.

Specifically, each annotator’s responsibilities include:

  • Reading 50 articles (subject matter TBD) that vary in their level of credibility.
  • Reviewing definitions of a small set of indicators. For instance, we might ask you to look at an article and indicate whether it has a clickbait headline. We’ll give you clear definitions of each indicator and how to rate it.
  • Marking up that set of indicators to measure each article’s credibility (a rough sketch of what a marked-up record might look like follows this list).
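
To make the markup step concrete, here is a minimal sketch in Python of what a single annotation record might look like. CredCo has not published its schema in this posting, so aside from the clickbait-headline example named above, every field name and indicator here (article_url, annotator_id, sources_cited, tone) is a hypothetical illustration of the general shape of the task, not the actual format annotators will use.

    # A minimal, hypothetical sketch of one annotator's markup for one
    # article. These field names and indicators do not come from CredCo's
    # actual schema; they only illustrate the general shape of the task.
    article_annotation = {
        "article_url": "https://example.com/some-article",  # hypothetical
        "annotator_id": "annotator_01",                      # hypothetical
        "indicators": {
            # Each indicator is rated against a definition we provide.
            "clickbait_headline": True,  # the example named above
            "sources_cited": 3,          # hypothetical count-style indicator
            "tone": "neutral",           # hypothetical scale-style indicator
        },
        "notes": "Headline overstates the findings in the article body.",
    }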

Annotators will gain experience:

  • Understanding the annotation process behind gathering data for artificial intelligence.
  • Analyzing biases in news content.
  • Learning the nuanced challenges of designing AI to assess information and news credibility.

We offer the following compensation:

  • A stipend of $150 for each annotator’s efforts.
  • Academic incentives (class credit, etc.).
  • If you choose, credit for your contribution if we publish the data.

To participate in an upcoming credibility study, please submit your resume to Nevin Thompson, Community Lead at nevin@hackshackers.com.