Chest X-Ray Pneumonia Classification
- 5856 chest X-ray images are used
- Three research questions are addressed:
  - Do different network architectures achieve similar performance?
  - Does applying transfer learning influence performance?
  - Do deep networks trained without ImageNet pretraining perform similarly to their pretrained versions?
- Data preprocessing is applied to obtain better results in the subsequent modeling studies
- Different models are trained and evaluated with appropriate metrics, and the results are interpreted accordingly (see the transfer learning sketch after this list)
- Future work:
  - Use a source dataset from a domain similar to the target dataset during transfer learning
  - Increase the number of images through data augmentation
  - Train the deep networks partially
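A rough sketch of the pretrained vs. from-scratch comparison is given below. The specific architectures are not named above, so ResNet-18 and the torchvision weights API are illustrative assumptions; the two output classes correspond to pneumonia and normal.

```python
# Minimal sketch, assuming a ResNet-18 backbone from torchvision
# (the actual networks used in the project may differ).
import torch.nn as nn
from torchvision import models

def build_model(pretrained: bool, num_classes: int = 2) -> nn.Module:
    # Load ResNet-18 with or without ImageNet weights (torchvision >= 0.13 API).
    weights = models.ResNet18_Weights.IMAGENET1K_V1 if pretrained else None
    model = models.resnet18(weights=weights)
    # Replace the classification head for the pneumonia / normal task.
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model

# Two variants to compare: ImageNet-pretrained vs. trained from scratch.
pretrained_model = build_model(pretrained=True)
scratch_model = build_model(pretrained=False)
```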
Medical Image Segmentation: A Comparative Study of SAM and MedSAM Models
- 130 breast cancer images with ground truth are utilized
- SAM and MedSAM models are used to segment cancerous regions
- Predictions are made using SAM and MedSAM
- Images are segmented into cancerous and non-cancerous regions based on the predictions
- The segmented regions are visualized
- Different performance metrics appropriate for segmentation are defined and evaluated (see the metric sketch after this list)
- The performance of MedSAM and SAM is compared
- MedSAM is found to outperform SAM on the 130 breast cancer images
- Future work:
  - Evaluating the performance of the models on other datasets
  - Fine-tuning MedSAM with additional breast cancer datasets
  - Building other networks such as U-Net
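A minimal sketch of how the predicted masks could be scored against the ground truth follows. Dice and IoU are assumed here as the segmentation metrics, and the mask variables in the commented usage (sam_masks, medsam_masks, gt_masks) are hypothetical placeholders for the model outputs.

```python
# Minimal sketch, assuming Dice and IoU as the segmentation metrics.
import numpy as np

def dice_score(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-8) -> float:
    # pred and gt are binary masks of the same shape (1 = cancerous region).
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    return float(2 * intersection / (pred.sum() + gt.sum() + eps))

def iou_score(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-8) -> float:
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return float(intersection / (union + eps))

# Hypothetical usage: sam_masks, medsam_masks, gt_masks are lists of binary
# masks, one per image, produced by the SAM / MedSAM predictors.
# mean_dice_sam    = np.mean([dice_score(p, g) for p, g in zip(sam_masks, gt_masks)])
# mean_dice_medsam = np.mean([dice_score(p, g) for p, g in zip(medsam_masks, gt_masks)])
```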
Building a Deep Learning Model That Uses CT Images for Covid-19 Diagnosis
- Lung CT images taken at Tongji Hospital, Wuhan, China, between January 2020 and April 2020 are analyzed
- The whole image dataset consists of 349 CT images from 296 patients diagnosed with Covid-19 and 397 CT images from non-Covid patients
- The images are resized to 100x100 to speed up training and converted into torch tensors
- The whole dataset is split into train, validation, and test sets (see the preprocessing sketch after this list)
- A convolutional neural network with 3 convolutional layers, 3 pooling layers, and a final fully connected layer is used (a sketch of the network follows this list)
- Batch normalization is applied after the convolutional layers to speed up training
- The ReLU activation function is used after the hidden layers to mitigate the vanishing gradient problem
- The softmax activation function is used in the final layer to turn the output vector into probabilities that sum to 1
- The cross-entropy loss function is chosen since it is well suited to classification problems
- Adam and SGD optimizers are used
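The preprocessing and splitting steps could look roughly like this; the folder name, the grayscale conversion, and the 70/15/15 split ratios are assumptions, since only the 100x100 resize and the tensor conversion are stated above.

```python
# Minimal preprocessing/splitting sketch, assuming torchvision transforms
# and an ImageFolder-style directory layout.
import torch
from torch.utils.data import random_split
from torchvision import datasets, transforms

# Resize to 100x100 and convert to torch tensors (grayscale CT assumed).
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=1),
    transforms.Resize((100, 100)),
    transforms.ToTensor(),
])

# "ct_images/" is a hypothetical folder with one subfolder per class
# (Covid / non-Covid), as expected by ImageFolder.
dataset = datasets.ImageFolder("ct_images", transform=preprocess)

# Split into train / validation / test sets (70/15/15 assumed).
n = len(dataset)
n_train, n_val = int(0.7 * n), int(0.15 * n)
train_set, val_set, test_set = random_split(
    dataset, [n_train, n_val, n - n_train - n_val],
    generator=torch.Generator().manual_seed(42),
)
```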
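A minimal sketch of the described network follows: 3 convolutional layers with batch normalization and ReLU, 3 pooling layers, and a final fully connected layer, trained with cross-entropy loss and Adam or SGD. Channel widths, kernel sizes, and learning rates are illustrative assumptions; note that the softmax is folded into nn.CrossEntropyLoss during training, which expects raw logits.

```python
# Minimal sketch of the 3-conv / 3-pool / 1-FC architecture; channel
# widths and kernel sizes are assumptions, not values from the project.
import torch
import torch.nn as nn

class CovidCTNet(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),       # grayscale 100x100 input assumed
            nn.BatchNorm2d(16), nn.ReLU(), nn.MaxPool2d(2),   # 100x100 -> 50x50
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.BatchNorm2d(32), nn.ReLU(), nn.MaxPool2d(2),   # 50x50 -> 25x25
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.BatchNorm2d(64), nn.ReLU(), nn.MaxPool2d(2),   # 25x25 -> 12x12
        )
        # Final fully connected layer; raw logits are returned because
        # nn.CrossEntropyLoss applies the softmax internally.
        self.classifier = nn.Linear(64 * 12 * 12, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

model = CovidCTNet()
criterion = nn.CrossEntropyLoss()  # cross-entropy loss for the 2-class problem
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# Alternative optimizer compared in the project:
# optimizer = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)
```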