Continued experiments on cross-language relevance assessment
RISE, Swedish ICT, SICS. ORCID iD: 0000-0003-4042-4919
RISE, Swedish ICT, SICS.
Number of Authors: 2
2003 (English). In: Comparative Evaluation of Multilingual Information Access Systems: 4th CLEF Workshop: Revised Selected Papers, 2003, 1. Conference paper (Refereed)
Abstract [en]

An experiment on how users assess document usefulness for an information access task in their native language (Swedish) versus a language in which they have near-native competence (English). Results show that relevance assessment in a foreign language takes more time and is more error-prone than assessment in the reader's first language.

Place, publisher, year, edition, pages
2003, 1.
National Category
Computer and Information Science
Identifiers
URN: urn:nbn:se:ri:diva-22420
DOI: 10.1007/b102261
OAI: oai:DiVA.org:ri-22420
DiVA: diva2:1041965
Conference
4th CLEF workshop, 21-22 August 2003, Trondheim, Norway
Note

DOI: 10.1007/b102261, Lecture Notes in Computer Science 3237

Available from: 2016-10-31. Created: 2016-10-31. Last updated: 2016-12-29. Bibliographically approved.

Open Access in DiVA

No full text

Other links

Publisher's full text

Search in DiVA

By author/editor
Karlgren, Jussi
By organisation
SICS
Computer and Information Science

Search outside of DiVA

Google | Google Scholar
