DSpace Repository

Breast lesion classification from bi-modal ultrasound images by convolutional neural network

Show simple item record

dc.contributor.advisor Ariful Haque, Dr. Mohammad
dc.contributor.author Shamim Hussain, Md.
dc.date.accessioned 2019-08-06T03:58:59Z
dc.date.available 2019-08-06T03:58:59Z
dc.date.issued 2019-01-26
dc.identifier.uri http://lib.buet.ac.bd:8080/xmlui/handle/123456789/5296
dc.description.abstract Ultrasound imaging provides a convenient and easily accessible means for breast cancer detection. Quasi-static elastography is a very useful imaging modality which can be combined with conventional B-mode imaging to implement a non-invasive lesion classification system. Computer Aided Diagnosis (CADx) can provide an objective opinion alongside the radiologist's diagnosis to increase the reliability of such a system. Traditionally, CADx systems have relied on statistical features derived from the morphology and/or texture of the lesions, which are fitted to a machine learning model to classify the lesions as either malignant or benign. The performance of this approach is highly dependent on the selection of an appropriate set of features, which is found to be a difficult task. The segmentation process required for feature extraction is time-consuming and introduces subjectivity into the classification process. Although a Computer Aided Diagnosis system based on object recognition by deep Convolutional Neural Networks (CNN) holds the possibility of real-time lesion classification directly from images, this approach faces the difficulty of gathering enough data to train such a network from scratch. In this work, we investigated the use of transfer learning to alleviate this difficulty. We show that a CNN trained on ImageNet can be used as a starting point to design a deep CNN that can be trained easily on a small dataset of lesions. Also, we integrate both ultrasound B-mode and elastography images in a single unified network for lesion classification that can be trained end-to-end. On a dataset of 217 clinically proven cases, our approach achieves >91% accuracy, >88% sensitivity and >92% specificity. Apart from achieving satisfactory classification performance on our dataset, the proposed method shows indications of improvement with increasing dataset size.
This approach, which is based on transfer learning, is applicable to a dataset of any reasonable size and also maintains the scalability and flexibility of deep learning. Furthermore, this method is completely objective, requires no segmentation of the lesion or ROI selection and is suitable for a real-time classification system. Additionally, we show that classification results can be further improved by multi-task learning of relevant tasks or inclusion of additional qualitative features of the lesions. en_US
dc.language.iso en en_US
dc.publisher Department of Electrical and Electronic Engineering (EEE), BUET en_US
dc.subject Image processing en_US
dc.title Breast lesion classification from bi-modal ultrasound images by convolutional neural network en_US
dc.type Thesis-MSc en_US
dc.contributor.id 0417062229 en_US
dc.identifier.accessionNumber 117019
dc.contributor.callno 623.67/SHA/2019 en_US

