Siambert: Siamese Bert-based Code Search
KTH Royal Institute of Technology, Sweden.
KTH Royal Institute of Technology, Sweden.
RISE Research Institutes of Sweden, Digital Systems, Data Science. ORCID iD: 0000-0003-3272-4145
RISE Research Institutes of Sweden, Digital Systems, Data Science. ORCID iD: 0000-0002-9546-4937
2022 (English). In: 34th Workshop of the Swedish Artificial Intelligence Society, SAIS 2022. Institute of Electrical and Electronics Engineers Inc., 2022. Conference paper, Published paper (Refereed).
Abstract [en]

Code Search is a practical tool that helps developers navigate growing source code repositories by connecting natural language queries with code snippets. Platforms such as StackOverflow resolve coding questions and answers; however, they cannot perform a semantic search through code. Moreover, poorly documented code makes searching for code snippets in repositories even harder. To tackle this challenge, this paper presents Siambert, a BERT-based model that takes a question in natural language and returns relevant code snippets. The Siambert architecture consists of two stages: the first stage, inspired by Siamese Neural Networks, returns the top K code snippets relevant to the input question, and the second stage re-ranks the snippets returned by the first stage. The experiments show that Siambert outperforms non-BERT-based models, with improvements ranging from 12% to 39% on the Recall@1 metric, and improves inference time, making it 15x faster than standard BERT models.
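As an informal illustration of the two-stage design described in the abstract, the following Python snippet sketches a minimal retrieve-then-rerank code search pipeline, assuming a Hugging Face BERT-style checkpoint used as a siamese bi-encoder with mean pooling. The model name (microsoft/codebert-base), the helper functions, and the placeholder second stage are illustrative assumptions, not Siambert's actual implementation.

import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "microsoft/codebert-base"  # assumed checkpoint, not necessarily the one used in the paper
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)

def embed(texts):
    # Mean-pool the encoder's last hidden states into one fixed-size vector per input.
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state       # (batch, tokens, hidden)
    mask = batch["attention_mask"].unsqueeze(-1).float()  # (batch, tokens, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)   # (batch, hidden)

def retrieve_top_k(query, snippets, k=10):
    # Stage 1 (siamese bi-encoder): embed the query and the snippets independently,
    # then rank snippets by cosine similarity to the query embedding.
    q = torch.nn.functional.normalize(embed([query]), dim=-1)
    c = torch.nn.functional.normalize(embed(snippets), dim=-1)
    scores = (q @ c.T).squeeze(0)
    top = scores.topk(min(k, len(snippets)))
    return [(snippets[int(i)], float(s)) for s, i in zip(top.values, top.indices)]

def rerank(query, candidates):
    # Stage 2 placeholder: the paper re-ranks the top-K candidates with a second model
    # that scores each (query, snippet) pair jointly; this sketch keeps the stage-1 order.
    return candidates

if __name__ == "__main__":
    corpus = [
        "def add(a, b):\n    return a + b",
        "def load_json(path):\n    import json\n    with open(path) as f:\n        return json.load(f)",
    ]
    query = "how to read a json file"
    for snippet, score in rerank(query, retrieve_top_k(query, corpus, k=2)):
        print(f"{score:.3f}\t{snippet.splitlines()[0]}")

Because the first stage embeds queries and snippets separately, snippet embeddings can be precomputed offline, which is the usual reason a siamese retriever is far faster at query time than scoring every (query, snippet) pair with a full BERT model, and is consistent with the speed-up reported above.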

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers Inc., 2022.
Keywords [en]
Codes (symbols), Natural language processing systems, Code search, Natural language queries, Natural languages, Neural-networks, Performance, Semantic search, Source code repositories, Semantics
National Category
Economics and Business
Identifiers
URN: urn:nbn:se:ri:diva-60199
DOI: 10.1109/SAIS55783.2022.9833051
Scopus ID: 2-s2.0-85136132400
ISBN: 9781665471268 (print)
OAI: oai:DiVA.org:ri-60199
DiVA, id: diva2:1701922
Conference
34th Workshop of the Swedish Artificial Intelligence Society, SAIS 2022, 13 June 2022 through 14 June 2022
Available from: 2022-10-07 Created: 2022-10-07 Last updated: 2023-11-06 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Pashami, Sepideh; Al-Shishtawy, Ahmad; Payberah, Amir H.
