Effects of Foreign Language and Task Scenario on Relevance Assessment
RISE, Swedish ICT, SICS. ORCID iD: 0000-0003-4042-4919
Number of Authors: 2
2005 (English) In: Journal of Documentation, ISSN 0022-0418, E-ISSN 1758-7379, Vol. 61, no 5, p. 623-639
Article in journal (Refereed) Published
Abstract [en]

Purpose – This paper aims to investigate how readers assess the relevance of retrieved documents in a foreign language they know well, compared with their native language, and whether work-task scenario descriptions have an effect on the assessment process.

Design/methodology/approach – Queries, test collections, and relevance assessments were used from the 2002 Interactive CLEF. Swedish first-language speakers, fluent in English, were given simulated information-seeking scenarios and presented with retrieval results in both languages. Twenty-eight subjects in four groups were asked to rate the retrieved text documents by relevance. A two-level work-task scenario description framework was developed and applied to facilitate the study of context effects on the assessment process.

Findings – Relevance assessment takes longer in a foreign language than in the user's first language. The quality of assessments, by comparison with pre-assessed results, is inferior to those made in the users' first language. Work-task scenario descriptions had an effect on the assessment process, both by measured access time and by self-report by subjects. However, no effects on results were detectable by traditional relevance ranking. This may be an argument for extending the traditional IR experimental topical relevance measures to cater for context effects.

Originality/value – An extended two-level work-task scenario description framework was developed and applied. Contextual aspects had an effect on the relevance assessment process. English texts took longer to assess than Swedish and were assessed less well, especially for the most difficult queries. The IR research field needs to close this gap and to design information access systems with users' language competence in mind.

Place, publisher, year, edition, pages
2005. Vol. 61, no 5, p. 623-639
Keywords [en]
Information retrieval, Languages, Reading
National Category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:ri:diva-20958
OAI: oai:DiVA.org:ri-20958
DiVA, id: diva2:1040992
Available from: 2016-10-31 Created: 2016-10-31 Last updated: 2018-03-08. Bibliographically approved

Open Access in DiVA

No full text in DiVA

Search in DiVA

By author/editor
Karlgren, Jussi
By organisation
SICS
In the same journal
Journal of Documentation
Computer and Information Sciences
