      Swimtrans Net: a multimodal robotic system for swimming action recognition driven via Swin-Transformer

      research-article


          Abstract

          Introduction

          Using machine learning to precisely analyze and improve swimming technique holds significant research value and broad application prospects. Existing machine learning methods have improved action recognition accuracy to some extent, but they still face several challenges, including insufficient feature extraction from the data, limited model generalization, and poor real-time performance.

          Methods

          To address these issues, this paper proposes Swimtrans Net, a multimodal robotic system for swimming action recognition driven by a Swin-Transformer. Leveraging the Swin-Transformer's strong visual feature extraction, Swimtrans Net effectively extracts information from swimming images. To meet the requirements of multimodal tasks, we also integrate the CLIP model into the system: the Swin-Transformer serves as CLIP's image encoder, and fine-tuning CLIP enables the model to interpret swimming action data and learn the features and patterns associated with swimming. Finally, we introduce transfer learning for pre-training to reduce training time and computational cost, enabling real-time feedback to swimmers.
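
          The abstract gives no implementation details, but the Swin-as-CLIP-image-encoder idea described above can be illustrated with a minimal PyTorch sketch. The sketch below assumes a pretrained Swin-Tiny backbone from timm, a frozen Hugging Face CLIP text tower, and hypothetical stroke-class prompts; it is one plausible way to wire these pieces together for fine-tuning, not the authors' code.

```python
# Illustrative sketch only -- not the paper's implementation. Assumptions:
#   * timm's pretrained Swin-Tiny as the image encoder (transfer learning),
#   * Hugging Face's CLIP text tower, kept frozen, for the text side,
#   * hypothetical stroke-class prompts standing in for the swimming labels.
import torch
import torch.nn as nn
import torch.nn.functional as F
import timm
from transformers import CLIPModel, CLIPTokenizer


class SwinCLIPActionRecognizer(nn.Module):
    def __init__(self, embed_dim: int = 512):
        super().__init__()
        # Pretrained Swin backbone; num_classes=0 makes it return pooled features.
        self.image_encoder = timm.create_model(
            "swin_tiny_patch4_window7_224", pretrained=True, num_classes=0)
        # Linear projection into CLIP's joint image-text embedding space.
        self.proj = nn.Linear(self.image_encoder.num_features, embed_dim)
        # Learnable temperature, initialized to log(1/0.07) as in CLIP.
        self.logit_scale = nn.Parameter(torch.tensor(2.659))

    def forward(self, images: torch.Tensor, text_embeds: torch.Tensor) -> torch.Tensor:
        img = F.normalize(self.proj(self.image_encoder(images)), dim=-1)
        txt = F.normalize(text_embeds, dim=-1)
        return self.logit_scale.exp() * img @ txt.t()  # image-text similarity logits


# Hypothetical stroke classes; the paper's label set is not given in the abstract.
classes = ["freestyle", "breaststroke", "backstroke", "butterfly"]
tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-base-patch32")
clip = CLIPModel.from_pretrained("openai/clip-vit-base-patch32").eval()
with torch.no_grad():
    tokens = tokenizer([f"a photo of a swimmer doing {c}" for c in classes],
                       padding=True, return_tensors="pt")
    text_embeds = clip.get_text_features(**tokens)  # (num_classes, 512)

model = SwinCLIPActionRecognizer()
images = torch.randn(2, 3, 224, 224)                  # stand-in for swimming frames
logits = model(images, text_embeds)                   # (2, num_classes)
loss = F.cross_entropy(logits, torch.tensor([0, 3]))  # fine-tuning objective
loss.backward()
```

          In this sketch only the Swin backbone and the projection head receive gradients, which mirrors the abstract's transfer-learning emphasis; the published system may instead fine-tune the full CLIP model end to end.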

          Results and discussion

          Experimental results show that Swimtrans Net achieves a 2.94% improvement over current state-of-the-art methods in swimming motion analysis and prediction. This study introduces a machine learning method that can help coaches and swimmers better understand and refine swimming technique, ultimately improving swimming performance.


                Author and article information

                Contributors
                URI: http://loop.frontiersin.org/people/2759803/overview
                Journal
                Frontiers in Neurorobotics (Front. Neurorobot.)
                Frontiers Media S.A.
                ISSN: 1662-5218
                Published: 24 September 2024
                Volume 18, Article 1452019
                Affiliations
                1 Department of Physical Education, Sangmyung University, Seoul, Republic of Korea
                2 Nanjing University of Technology, Nanjing, Jiangsu, China
                Author notes

                Edited by: Yingbai Hu, The Chinese University of Hong Kong, China

                Reviewed by: Shunlei Li, Italian Institute of Technology (IIT), Italy

                Jiyong Tan, University of Electronic Science and Technology of China, China

                *Correspondence: Xiaoyu Yue, yuexiayu@njtech.edu.cn
                Article
                DOI: 10.3389/fnbot.2024.1452019
                PMCID: PMC11458561
                PMID: 39381775
                Copyright © 2024 Chen and Yue.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

                History
                Received: 20 June 2024
                Accepted: 09 August 2024
                Page count
                Figures: 5, Tables: 8, Equations: 16, References: 42, Pages: 16, Words: 10126
                Funding
                The author(s) declare that no financial support was received for the research, authorship, and/or publication of this article.
                Categories
                Neuroscience, Robotics
                Original Research

                Keywords: Swin-Transformer, CLIP, multimodal robotic, swimming action recognition, transfer learning
