DeepFRAP: Fast fluorescence recovery after photobleaching data analysis using deep neural networks
RISE Research Institutes of Sweden.
RISE Research Institutes of Sweden, Bioeconomy and Health, Agriculture and Food. ORCID iD: 0000-0002-3356-0894
RISE Research Institutes of Sweden, Bioeconomy and Health, Agriculture and Food. Chalmers University of Technology, Sweden. ORCID iD: 0000-0001-9979-5488
RISE Research Institutes of Sweden, Bioeconomy and Health, Agriculture and Food. Chalmers University of Technology, Sweden. ORCID iD: 0000-0002-5956-9934
2021 (English). In: Journal of Microscopy, ISSN 0022-2720, E-ISSN 1365-2818, Vol. 282, no. 2, pp. 146-161. Article in journal (Refereed). Published.
Abstract [en]

Conventional analysis of fluorescence recovery after photobleaching (FRAP) data for diffusion coefficient estimation typically involves fitting an analytical or numerical FRAP model to the recovery curve data using non-linear least squares. Depending on the model, this can be time consuming, especially for batch analysis of large numbers of data sets and if multiple initial guesses for the parameter vector are used to ensure convergence. In this work, we develop a completely new approach, DeepFRAP, utilizing machine learning for parameter estimation in FRAP. From a numerical FRAP model developed in previous work, we generate a very large set of simulated recovery curve data with realistic noise levels. The data are used for training different deep neural network regression models for prediction of several parameters, most importantly the diffusion coefficient. The neural networks are extremely fast and can estimate the parameters orders of magnitude faster than least squares. The performance of the neural network estimation framework is compared to conventional least squares estimation on simulated data and found to be strikingly similar. In addition, a simple experimental validation is performed, demonstrating excellent agreement between the two methods. We make the data and code used publicly available to facilitate further development of machine learning-based estimation in FRAP.

Lay description: Fluorescence recovery after photobleaching (FRAP) is one of the most frequently used methods for microscopy-based diffusion measurements and is broadly used in materials science, pharmaceutics, food science and cell biology. In a FRAP experiment, a laser is used to photobleach fluorescent particles in a region. By analysing the recovery of the fluorescence intensity due to the diffusion of still fluorescent particles, the diffusion coefficient and other parameters can be estimated. Typically, a confocal laser scanning microscope (CLSM) is used to image the time evolution of the recovery, and a model is fit using least squares to obtain parameter estimates. In this work, we introduce a new, fast and accurate method for analysis of data from FRAP. The new method is based on using artificial neural networks to predict parameter values, such as the diffusion coefficient, effectively circumventing classical least squares fitting. This leads to a dramatic speed-up, especially noticeable when analysing large numbers of FRAP data sets, while still producing results in excellent agreement with least squares. Further, the neural network estimates can be used as very good initial guesses for least squares estimation, making the least squares optimization converge much faster than it otherwise would. This makes it possible to obtain, for example, diffusion coefficients as quickly as possible while spending minimal time on data analysis. In this way, the proposed method facilitates efficient use of the experimentalist's time, which is the main motivation for our approach. The concept is demonstrated on pure diffusion; however, it can easily be extended to the case of diffusion and binding. The concept is likely to be useful in all application areas of FRAP, including diffusion in cells, gels and solutions. © 2020 The Authors.
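The workflow described in the abstract (simulate recovery curves, train a neural network regressor to map curves to parameters, and optionally feed its estimate to least squares as an initial guess) can be illustrated with a minimal sketch. The snippet below is not the authors' DeepFRAP code: as stand-ins it assumes the classical Soumpasis closed-form recovery curve for a circular bleach spot instead of the paper's numerical FRAP model, a small scikit-learn MLPRegressor instead of their deep network architecture, and arbitrary units for the bleach radius, time points and noise level.

# Hypothetical sketch of the DeepFRAP idea: NN regression on simulated FRAP
# recovery curves, with the NN estimate reused as the least-squares start value.
import numpy as np
from scipy.special import ive          # exponentially scaled modified Bessel functions
from scipy.optimize import curve_fit
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

w = 1.0                                # assumed bleach spot radius (arbitrary units)
t = np.linspace(0.05, 20.0, 100)       # assumed post-bleach time points

def soumpasis(t, D):
    # Normalized Soumpasis recovery curve for pure diffusion:
    # F(t) = exp(-2*tau/t) * [I0(2*tau/t) + I1(2*tau/t)], tau = w^2 / (4*D).
    # ive(n, x) = iv(n, x) * exp(-x), so the exponential factor is built in.
    x = 2.0 * (w**2 / (4.0 * D)) / t
    return ive(0, x) + ive(1, x)

# Generate simulated training data with additive noise (stand-in for the paper's
# "very large set of simulated recovery curve data with realistic noise levels").
n_train = 5000
D_train = rng.uniform(0.05, 2.0, n_train)
curves = np.array([soumpasis(t, D) for D in D_train])
curves += rng.normal(0.0, 0.01, curves.shape)

# Train a small neural network regressor: recovery curve -> diffusion coefficient.
net = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=300, random_state=0)
net.fit(curves, D_train)

# Compare NN prediction and least squares on a fresh noisy curve; the NN estimate
# also serves as the initial guess for the least-squares fit.
D_true = 0.7
y = soumpasis(t, D_true) + rng.normal(0.0, 0.01, t.size)
D_nn = float(net.predict(y[None, :])[0])
D_ls, _ = curve_fit(soumpasis, t, y, p0=[D_nn])
print(f"true D = {D_true:.3f}, NN = {D_nn:.3f}, least squares = {D_ls[0]:.3f}")

Once trained, the network evaluates in microseconds per curve, which is where the speed-up over repeated non-linear fitting comes from; the final curve_fit call illustrates the hybrid use of the NN output as a warm start.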

Place, publisher, year, edition, pages
Blackwell Publishing Ltd, 2021. Vol. 282, no. 2, pp. 146-161
Keywords [en]
confocal laser scanning microscopy, deep learning, deep neural network, diffusion, fluorescence recovery after photobleaching, machine learning, regression
National subject category
Natural Sciences
Identifiers
URN: urn:nbn:se:ri:diva-52079
DOI: 10.1111/jmi.12989
Scopus ID: 2-s2.0-85099352468
OAI: oai:DiVA.org:ri-52079
DiVA, id: diva2:1522697
Note

Funding details: 2019-01295; Vetenskapsrådet, VR, 2016-03809; Stiftelsen för Strategisk Forskning, SSF. Funding text: The financial support of the Swedish Research Council for Sustainable Development (grant number 2019-01295), the Swedish Research Council (grant number 2016-03809), and the Swedish Foundation for Strategic Research (the project 'Material structures seen through microscopes and statistics') is acknowledged. The computations were in part performed on resources at Chalmers Centre for Computational Science and Engineering (C3SE) provided by the Swedish National Infrastructure for Computing (SNIC). A GPU used for part of this research was donated by the NVIDIA Corporation.

Available from: 2021-01-26. Created: 2021-01-26. Last updated: 2023-10-05. Bibliographically reviewed.

Open Access in DiVA

No full text available in DiVA

Other links

Publisher's full text | Scopus

Person

Krona, Annika; Loren, Niklas; Röding, Magnus
