Please use this identifier to cite or link to this item:
https://hdl.handle.net/20.500.13091/3266
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Abdulghani, Sema | - |
dc.contributor.author | Fadhil, Ahmed Freidoon | - |
dc.contributor.author | Gültekin, Seyfettin Sinan | - |
dc.date.accessioned | 2023-01-08T19:04:21Z | - |
dc.date.available | 2023-01-08T19:04:21Z | - |
dc.date.issued | 2020 | - |
dc.identifier.issn | 2148-2683 | - |
dc.identifier.uri | https://doi.org/10.31590/ejosat.806679 | - |
dc.identifier.uri | https://search.trdizin.gov.tr/yayin/detay/1136061 | - |
dc.identifier.uri | https://hdl.handle.net/20.500.13091/3266 | - |
dc.description.abstract | Breast cancer is currently one of the leading causes of death among women worldwide. Developing a computer-aided diagnosis system for breast cancer detection has become an interesting problem for many researchers in recent years. Researchers have focused on deep learning techniques for classification problems, including Convolutional Neural Networks (CNNs), a class of deep feedforward networks that has attracted attention from the research community and achieved great success, especially in biomedical image processing. In this paper, deep feature extraction with a pre-trained CNN model is used to classify breast cancer histopathological images from the publicly available BreakHis dataset. The dataset includes two classes, benign and malignant, at four different magnification factors. A patch strategy is proposed, based on extracting image patches for training the CNN and combining these patches for classification. The AlexNet model is used with this patch strategy, and the pre-trained AlexNet is fine-tuned on the data. A Support Vector Machine (SVM) is then used to classify the extracted features. The evaluation results show that the pre-trained AlexNet with SVM classification and the patch strategy yields the best accuracy: between 92% and 96%, depending on the magnification factor, using five-fold cross-validation. | en_US |
dc.language.iso | en | en_US |
dc.relation.ispartof | Avrupa Bilim ve Teknoloji Dergisi | en_US |
dc.rights | info:eu-repo/semantics/openAccess | en_US |
dc.subject | Breast Cancer | en_US |
dc.subject | Convolutional Neural Network | en_US |
dc.subject | Alexnet | en_US |
dc.subject | Transfer Learning | en_US |
dc.subject | Support Vector Machine | en_US |
dc.subject | Meme Kanseri (Breast Cancer) | en_US |
dc.subject | Evrişimli Sinir Ağı (Convolutional Neural Network) | en_US |
dc.subject | Alexnet | en_US |
dc.subject | Transfer Öğrenimi (Transfer Learning) | en_US |
dc.subject | Destek Vektör Makinesi (Support Vector Machine) | en_US |
dc.title | Transfer Learning using Alexnet with Support Vector Machine for Breast Cancer Detection | en_US |
dc.type | Article | en_US |
dc.identifier.doi | 10.31590/ejosat.806679 | - |
dc.department | Fakülteler, Mühendislik ve Doğa Bilimleri Fakültesi, Elektrik-Elektronik Mühendisliği Bölümü | en_US |
dc.identifier.volume | 0 | en_US |
dc.identifier.issue | Ejosat Özel Sayı 2020 (ICCEES) | en_US |
dc.identifier.startpage | 423 | en_US |
dc.identifier.endpage | 430 | en_US |
dc.institutionauthor | Gültekin, Seyfettin Sinan | - |
dc.relation.publicationcategory | Makale - Ulusal Hakemli Dergi - Kurum Öğretim Elemanı | en_US |
dc.identifier.trdizinid | 1136061 | en_US |
item.grantfulltext | open | - |
item.openairetype | Article | - |
item.fulltext | With Fulltext | - |
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | - |
item.languageiso639-1 | en | - |
item.cerifentitytype | Publications | - |
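The abstract above describes a pipeline in which deep features from a pre-trained AlexNet are classified by an SVM and evaluated with five-fold cross-validation. A minimal sketch of the SVM and cross-validation stage follows; this is not the authors' code, and random vectors stand in for the 4096-dimensional AlexNet activations, since the BreakHis images and the trained network are not part of this record (the linear kernel and patch count are assumptions for illustration):

```python
# Sketch of the abstract's classification stage: deep features -> SVM,
# scored with five-fold cross-validation (as reported in the paper).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_patches, feat_dim = 200, 4096              # 4096 = width of AlexNet's penultimate layer
X = rng.normal(size=(n_patches, feat_dim))   # stand-in for extracted deep features
y = rng.integers(0, 2, size=n_patches)       # 0 = benign, 1 = malignant (synthetic labels)

svm = SVC(kernel="linear")                   # kernel choice is an assumption
scores = cross_val_score(svm, X, y, cv=5)    # five-fold cross-validation
print(scores.mean())                         # mean accuracy across the five folds
```

With real AlexNet features per image patch, the paper's patch strategy would additionally aggregate patch-level predictions into an image-level decision before scoring.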
Appears in Collections: | Mühendislik ve Doğa Bilimleri Fakültesi Koleksiyonu; TR Dizin İndeksli Yayınlar Koleksiyonu / TR Dizin Indexed Publications Collection |
Files in This Item:
File | Size | Format |
---|---|---|
10.31590-ejosat.806679-1331783.pdf | 686.71 kB | Adobe PDF |
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.