Search Results (4)
  • Open Access


    MSC-YOLO: Improved YOLOv7 Based on Multi-Scale Spatial Context for Small Object Detection in UAV-View

    Xiangyan Tang1,2, Chengchun Ruan1,2,*, Xiulai Li2,3, Binbin Li1,2, Cebin Fu1,2

    CMC-Computers, Materials & Continua, Vol.79, No.1, pp. 983-1003, 2024, DOI:10.32604/cmc.2024.047541

    Abstract Accurately identifying small objects in high-resolution aerial images presents a complex and crucial task in the field of small object detection on unmanned aerial vehicles (UAVs). This task is challenging due to variations in UAV flight altitude, differences in object scales, as well as factors like flight speed and motion blur. To enhance the detection efficacy of small targets in drone aerial imagery, we propose an enhanced You Only Look Once version 7 (YOLOv7) algorithm based on multi-scale spatial context. We build the MSC-YOLO model, which incorporates an additional prediction head, denoted as P2, to…

  • Open Access


    MSADCN: Multi-Scale Attentional Densely Connected Network for Automated Bone Age Assessment

    Yanjun Yu1, Lei Yu1,*, Huiqi Wang2, Haodong Zheng1, Yi Deng1

    CMC-Computers, Materials & Continua, Vol.78, No.2, pp. 2225-2243, 2024, DOI:10.32604/cmc.2024.047641

    Abstract Bone age assessment (BAA) helps doctors determine how a child’s bones grow and develop in clinical medicine. Traditional BAA methods rely on clinician expertise, leading to time-consuming predictions and inaccurate results. Most deep learning-based BAA methods feed the extracted critical points of images into the network by providing additional annotations. This operation is costly and subjective. To address these problems, we propose a multi-scale attentional densely connected network (MSADCN) in this paper. MSADCN constructs a multi-scale dense connectivity mechanism, which can avoid overfitting, obtain the local features effectively and prevent gradient vanishing even in limited…

  • Open Access


    Attention Guided Multi Scale Feature Fusion Network for Automatic Prostate Segmentation

    Yuchun Li1,4, Mengxing Huang1,*, Yu Zhang2, Zhiming Bai3

    CMC-Computers, Materials & Continua, Vol.78, No.2, pp. 1649-1668, 2024, DOI:10.32604/cmc.2023.046883

    Abstract The precise and automatic segmentation of prostate magnetic resonance imaging (MRI) images is vital for assisting doctors in diagnosing prostate diseases. In recent years, many advanced methods have been applied to prostate segmentation, but due to the variability caused by prostate diseases, automatic segmentation of the prostate presents significant challenges. In this paper, we propose an attention-guided multi-scale feature fusion network (AGMSF-Net) to segment prostate MRI images. We propose an attention mechanism for extracting multi-scale features, and introduce a 3D transformer module to enhance global feature representation by adding it during the transition phase from…

  • Open Access


    Multi-Scale Attention-Based Deep Neural Network for Brain Disease Diagnosis

    Yin Liang1,*, Gaoxu Xu1, Sadaqat ur Rehman2

    CMC-Computers, Materials & Continua, Vol.72, No.3, pp. 4645-4661, 2022, DOI:10.32604/cmc.2022.026999

    Abstract Whole brain functional connectivity (FC) patterns obtained from resting-state functional magnetic resonance imaging (rs-fMRI) have been widely used in the diagnosis of brain disorders such as autism spectrum disorder (ASD). Recently, an increasing number of studies have focused on employing deep learning techniques to analyze FC patterns for brain disease classification. However, the high dimensionality of the FC features and the interpretation of deep learning results are issues that need to be addressed in the FC-based brain disease classification. In this paper, we proposed a multi-scale attention-based deep neural network (MSA-DNN) model to classify FC…
