In recent years, with the development of generative adversarial networks (GANs), significant technical advances have been made in the field of face attribute editing. This paper proposes a new method for editing face attributes. Building on the strengths of StyleGAN in face generation, TransUNet and a High-Fidelity Encoder are added to the network to achieve accurate, controllable, and highly realistic editing. By leveraging TransUNet's strong capability for extracting image feature information, the structural and semantic information of face images can be captured precisely, enabling accurate attribute editing. In addition, we design a High-Fidelity Encoder that focuses on preserving the visual quality and naturalness of images during editing, minimizing visual artifacts and unnatural appearances and producing highly realistic results that are difficult to distinguish from real photographs. Experimental results show that our method offers significant advantages in attribute accuracy and visual quality.
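As background, StyleGAN-based attribute editors generally work by mapping a face image to a latent code, shifting that code along a learned attribute direction, and regenerating the image. The following minimal sketch illustrates only this latent-shift step; the names (`w`, `smile_direction`, `alpha`) are hypothetical illustrations and are not identifiers from the paper.

```python
import math
import random

def edit_attribute(w, direction, alpha):
    """Shift latent code w along a unit-normalized attribute direction.

    w         : latent code produced by the encoder (list of floats)
    direction : learned attribute direction in latent space
    alpha     : editing strength; its sign adds or removes the attribute
    """
    norm = math.sqrt(sum(d * d for d in direction))
    return [wi + alpha * (di / norm) for wi, di in zip(w, direction)]

# Toy usage with a random 512-dim latent and a random "smile" direction.
rng = random.Random(0)
w = [rng.gauss(0.0, 1.0) for _ in range(512)]
smile_direction = [rng.gauss(0.0, 1.0) for _ in range(512)]
w_edited = edit_attribute(w, smile_direction, alpha=3.0)  # strengthen the attribute
```

In a full pipeline, `w_edited` would be fed back through the StyleGAN generator to render the edited face.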
Synthetic Aperture Radar (SAR) has become one of the primary means of earth observation owing to its unique technical advantages, such as all-weather, all-day operation and long operating range. However, most deep-learning-based SAR detectors use older ResNet backbone networks, and their detection accuracy is low. This paper proposes a new network, Dynamic IoU R-CNN (DIoU R-CNN), which transfers MoBY, a self-supervised learning method based on the Swin Transformer, to the complex downstream task of SAR ship detection. DIoU R-CNN adds a dynamic IoU module and the advanced Balanced L1 loss function to Faster R-CNN, achieving relatively high-accuracy SAR ship detection on the SSDD dataset without a large increase in parameter count or training time. Moreover, in comparison experiments, the Swin Transformer trained with self-supervised learning outperforms its supervised counterpart.
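For reference, one common form of the Balanced L1 loss (introduced in Libra R-CNN) that the abstract mentions can be sketched as follows. The parameter values `alpha=0.5` and `gamma=1.5` follow the Libra R-CNN defaults and are assumptions here, not values taken from this paper.

```python
import math

def balanced_l1(x, alpha=0.5, gamma=1.5):
    """Balanced L1 loss for a single box-regression residual x."""
    x = abs(x)
    # b is chosen so the gradient is continuous at |x| = 1:
    # alpha * ln(b + 1) = gamma  =>  b = exp(gamma / alpha) - 1
    b = math.exp(gamma / alpha) - 1.0
    if x < 1.0:
        return alpha / b * (b * x + 1.0) * math.log(b * x + 1.0) - alpha * x
    # constant C keeps the loss value itself continuous at |x| = 1
    c = alpha / b * (b + 1.0) * math.log(b + 1.0) - alpha - gamma
    return gamma * x + c
```

Compared with Smooth L1, this loss promotes the gradient contribution of inliers (small residuals), which is why it is often paired with sampling-balanced detectors such as the one described above.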