Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.13091/1168
Full metadata record
DC Field | Value | Language
dc.contributor.author | Öztürk, Şaban | -
dc.contributor.author | Özkaya, Umut | -
dc.date.accessioned | 2021-12-13T10:34:47Z | -
dc.date.available | 2021-12-13T10:34:47Z | -
dc.date.issued | 2021 | -
dc.identifier.issn | 1532-0464 | -
dc.identifier.issn | 1532-0480 | -
dc.identifier.uri | https://doi.org/10.1016/j.jbi.2020.103638 | -
dc.identifier.uri | https://hdl.handle.net/20.500.13091/1168 | -
dc.description.abstract | Nowadays, given the number of patients per specialist physician, the scale of the need for automatic medical image analysis methods becomes clear. These systems, which offer substantial advantages over manual analysis in both cost and time, rely on artificial intelligence (AI). AI mechanisms that mimic a specialist's decision-making process improve their diagnostic performance day by day as technology advances. In this study, an AI method is proposed to effectively classify gastrointestinal (GI) tract image datasets containing a small number of labeled samples. The proposed method uses the convolutional neural network (CNN) architecture, widely regarded as today's most successful automatic classification method, as a backbone. In our approach, a shallowly trained CNN architecture must be supported by a strong classifier to classify imbalanced datasets robustly. To this end, the features from each pooling layer of the CNN architecture are transmitted to an LSTM layer, and classification is performed by combining the outputs of all LSTM layers. All experiments are carried out using AlexNet, GoogLeNet, and ResNet to fairly evaluate the contribution of the proposed residual LSTM structure. In addition, three experiments with 2000, 4000, and 6000 samples are carried out to determine the effect of sample size on the proposed method. The performance of the proposed method is higher than that of other state-of-the-art methods. | en_US
dc.language.iso | en | en_US
dc.publisher | ACADEMIC PRESS INC ELSEVIER SCIENCE | en_US
dc.relation.ispartof | JOURNAL OF BIOMEDICAL INFORMATICS | en_US
dc.rights | info:eu-repo/semantics/closedAccess | en_US
dc.subject | Colorectal cancer | en_US
dc.subject | Gastrointestinal tract | en_US
dc.subject | CNN | en_US
dc.subject | LSTM | en_US
dc.subject | Transfer learning | en_US
dc.title | Residual LSTM layered CNN for classification of gastrointestinal tract diseases | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1016/j.jbi.2020.103638 | -
dc.identifier.pmid | PubMed: 33271341 | en_US
dc.identifier.scopus | 2-s2.0-85097741848 | en_US
dc.department | Fakülteler, Mühendislik ve Doğa Bilimleri Fakültesi, Elektrik-Elektronik Mühendisliği Bölümü | en_US
dc.authorid | Ozturk, Saban/0000-0003-2371-8173 | -
dc.authorwosid | Ozturk, Saban/ABI-3936-2020 | -
dc.identifier.volume | 113 | en_US
dc.identifier.wos | WOS:000615920800007 | en_US
dc.relation.publicationcategory | Makale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı (Article - International Peer-Reviewed Journal - Institutional Faculty Member) | en_US
dc.authorscopusid | 57191953654 | -
dc.authorscopusid | 57191610477 | -
dc.identifier.scopusquality | Q1 | -
item.openairetype | Article | -
item.languageiso639-1 | en | -
item.cerifentitytype | Publications | -
item.grantfulltext | embargo_20300101 | -
item.fulltext | With Fulltext | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
crisitem.author.dept | 02.04. Department of Electrical and Electronics Engineering | -
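The abstract above outlines the proposed architecture: feature vectors taken from each CNN pooling stage are fed into an LSTM, and the per-stage LSTM outputs are combined for the final classification. The toy NumPy sketch below illustrates only that data flow; all dimensions, the random stand-in "pooled features", the eight-class output, and the summation-based combination are illustrative assumptions, not the authors' implementation or settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; gates are stacked as [input, forget, output, cell]."""
    H = h.size
    z = W @ x + U @ h + b
    i = sigmoid(z[:H])          # input gate
    f = sigmoid(z[H:2 * H])     # forget gate
    o = sigmoid(z[2 * H:3 * H]) # output gate
    g = np.tanh(z[3 * H:])      # candidate cell state
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

# Hypothetical sizes (e.g. 8 GI-tract classes); chosen for illustration only.
feat_dim, hidden, n_classes = 16, 8, 8

# Random stand-ins for features from three pooling stages of a CNN backbone,
# each assumed already projected to a common feat_dim.
pooled_feats = [rng.standard_normal(feat_dim) for _ in range(3)]

W = rng.standard_normal((4 * hidden, feat_dim)) * 0.1
U = rng.standard_normal((4 * hidden, hidden)) * 0.1
b = np.zeros(4 * hidden)

h = np.zeros(hidden)
c = np.zeros(hidden)
combined = np.zeros(hidden)
for x in pooled_feats:      # one LSTM step per pooling stage
    h, c = lstm_step(x, h, c, W, U, b)
    combined += h           # combine all LSTM outputs (assumed: summation)

# Linear classifier head with a softmax over the combined representation.
W_out = rng.standard_normal((n_classes, hidden)) * 0.1
logits = W_out @ combined
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print(probs.shape)
```

With real backbones (AlexNet, GoogLeNet, ResNet, as listed in the metadata) the pooled activations would replace the random vectors, and the LSTM and head would be trained end to end.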
Appears in Collections:Mühendislik ve Doğa Bilimleri Fakültesi Koleksiyonu / Faculty of Engineering and Natural Sciences Collection
PubMed İndeksli Yayınlar Koleksiyonu / PubMed Indexed Publications Collections
Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collections
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collections
Files in This Item:
File | Size | Format
1-s2.0-S1532046420302677-main.pdf (embargoed until 2030-01-01; request a copy) | 5.25 MB | Adobe PDF
Scopus™ Citations: 17 (checked on Mar 23, 2024)
Web of Science™ Citations: 45 (checked on Mar 23, 2024)
Page view(s): 80 (checked on Mar 25, 2024)
Download(s): 6 (checked on Mar 25, 2024)
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.