Path planning is a crucial aspect of robotics, enabling robots to navigate challenging environments autonomously. In this paper, we introduce a novel approach to robot path planning that combines the Genetic Algorithm (GA) and the Probabilistic Roadmap (PRM) to enhance efficiency and reliability. Our method accounts for the challenges posed by moving obstacles, making it well suited to navigating complex environments. By merging GA's exploration abilities with PRM's global planning strengths, the proposed GA-PRM algorithm improves computational efficiency and finds optimal paths. To validate our approach, we conducted rigorous evaluations against well-known algorithms, including A*, RRT, the Genetic Algorithm, and PRM, in simulated environments. The GA-PRM algorithm outperformed the existing methods, achieving an average path length of 25.6235 units and an average computation time of 0.6881 seconds, demonstrating its speed and effectiveness. The generated paths were also notably smoother, with an average smoothness value of 0.3133. These findings highlight the potential of the GA-PRM algorithm in real-world applications, especially in critical sectors such as healthcare, where efficient path planning is essential. This research contributes to the field of path planning and offers valuable insights for the future design of autonomous robotic systems.
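As a rough, hedged illustration of the GA side of such a planner (not the authors' GA-PRM implementation), the Python sketch below evolves a short sequence of PRM-style random waypoints between an assumed start and goal around a single circular obstacle; the operators, population size, and penalty weight are illustrative assumptions.

```python
# Illustrative GA over PRM-style random waypoints; all constants are assumptions.
import random, math

START, GOAL = (0.0, 0.0), (10.0, 10.0)
OBSTACLE, RADIUS = (5.0, 5.0), 2.0
N_WAYPOINTS, POP, GENS = 4, 60, 150

def fitness(waypoints):
    pts = [START] + waypoints + [GOAL]
    length = sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
    # Penalize waypoints that fall inside the obstacle.
    penalty = sum(100.0 for p in waypoints if math.dist(p, OBSTACLE) < RADIUS)
    return length + penalty

def random_individual():
    return [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(N_WAYPOINTS)]

population = [random_individual() for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=fitness)
    survivors = population[:POP // 2]
    children = []
    while len(children) < POP - len(survivors):
        a, b = random.sample(survivors, 2)
        cut = random.randrange(1, N_WAYPOINTS)
        child = a[:cut] + b[cut:]                 # one-point crossover
        if random.random() < 0.3:                 # Gaussian mutation of one waypoint
            i = random.randrange(N_WAYPOINTS)
            child[i] = (child[i][0] + random.gauss(0, 0.5),
                        child[i][1] + random.gauss(0, 0.5))
        children.append(child)
    population = survivors + children

best = min(population, key=fitness)
print("best path cost:", round(fitness(best), 3))
```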
Path planning is an essential concern in robotic systems; it refers to the process of determining a safe and optimal path from a source state to a goal state within dynamic environments. In this article, we propose an improved path planning method that merges the Dijkstra algorithm for path planning with Potential Field (PF) collision avoidance. In real time, the method produces multiple paths and determines the most suitable one, i.e., a path that is both short and safe. The Dijkstra method is employed to produce multiple paths, whereas the Potential Field is used to assess the safety of each route and choose the best one. The proposed method also creates links between the routes, enabling the robot to switch between them if it encounters a dynamic obstacle on its current route. In terms of path length and safety, the simulation results show that the Dynamic Dijkstra-Potential Field (Dynamic D-PF) method outperforms the Dijkstra and Potential Field methods used separately, making it a promising solution for robotic automation in dynamic environments.
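To make the Dijkstra/Potential Field pairing concrete, here is a minimal Python sketch under assumed grid, obstacle, and gain values: Dijkstra finds a shortest grid path, and a repulsive potential-field score rates how closely a path hugs obstacles, which is the kind of safety measure a planner could use to choose among candidate routes. It is not the Dynamic D-PF implementation itself.

```python
# Grid Dijkstra plus a repulsive potential-field safety score; values are assumptions.
import heapq, math

def dijkstra(grid, start, goal):
    """Shortest 4-connected path on a 0/1 occupancy grid (1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, math.inf):
            continue
        r, c = u
        for v in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= v[0] < rows and 0 <= v[1] < cols and grid[v[0]][v[1]] == 0:
                nd = d + 1.0
                if nd < dist.get(v, math.inf):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(pq, (nd, v))
    path, node = [], goal
    while node in prev:
        path.append(node)
        node = prev[node]
    return [start] + path[::-1] if path or start == goal else []

def repulsive_cost(path, obstacles, influence=2.0):
    """Potential-field style safety score: higher when the path hugs obstacles."""
    cost = 0.0
    for (r, c) in path:
        for (orr, oc) in obstacles:
            d = math.hypot(r - orr, c - oc)
            if 0 < d < influence:
                cost += 0.5 * (1.0 / d - 1.0 / influence) ** 2
    return cost

grid = [[0] * 8 for _ in range(8)]
grid[3][4] = 1  # one static obstacle cell
path = dijkstra(grid, (0, 0), (7, 7))
print("steps:", len(path) - 1, "safety cost:", round(repulsive_cost(path, [(3, 4)]), 3))
```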
Transmitting binary data across a network should generally avoid sending raw binary data over the medium for several reasons: the medium may be textual and may not accept or correctly handle a raw bitstream, and some protocols may misinterpret the meaning of the bits, causing problems or even loss of data. To make the data more readable and to avoid misinterpretation by different systems and environments, this paper introduces encoding two of the most broadly used data interchange formats, XML and JSON, into Base64, an encoding scheme that converts binary data to an ASCII string format using a radix-64 representation. The results show that encoding data in Base64 before transmission presents many advantages, including readability and integrity, and enables binary data to be transmitted over textual mediums, 7-bit protocols such as SMTP, and different network hardware without risking misinterpretation.
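A minimal Python sketch of the encode/decode round trip described above, using the standard base64 module; the JSON and XML payloads are made-up examples, not the paper's test data.

```python
# Base64 round trip for JSON and XML payloads (illustrative payloads only).
import base64, json

json_payload = json.dumps({"sensor": "temp", "value": 23.5}).encode("utf-8")
xml_payload = b"<reading><sensor>temp</sensor><value>23.5</value></reading>"

# Encode to Base64 so the data survives textual/7-bit transports (e.g. SMTP).
json_b64 = base64.b64encode(json_payload).decode("ascii")
xml_b64 = base64.b64encode(xml_payload).decode("ascii")

# Decode on the receiving side and verify integrity.
assert base64.b64decode(json_b64) == json_payload
assert base64.b64decode(xml_b64) == xml_payload
print(json_b64)
print(xml_b64)
```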
With recent developments in technology and advances in artificial intelligence and machine learning techniques, it has become possible for robots to understand and respond to voice as part of Human-Robot Interaction (HRI). A voice-based interface robot can recognize speech information from humans, enabling it to interact more naturally with its human counterpart in different environments. In this work, a review of voice-based interfaces for HRI systems is presented. The review focuses on voice-based perception in HRI systems from three facets: feature extraction, dimensionality reduction, and semantic understanding. For feature extraction, numerous types of features are reviewed across various domains, such as the time, frequency, cepstral (i.e., obtained by applying the inverse Fourier transform to the logarithm of the signal spectrum), and deep domains. For dimensionality reduction, subspace learning can be used to eliminate the redundancies of high-dimensional features by further processing the extracted features to better reflect their semantic information. For semantic understanding, the aim is to infer objects or human behaviors from the extracted features. Numerous types of semantic understanding are reviewed, such as speech recognition, speaker recognition, speaker gender detection, speaker gender and age estimation, and speaker localization. Finally, some existing voice-based interface issues and recommendations for future work are outlined.
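As one concrete (assumed) instantiation of the first two facets reviewed above, the sketch below extracts cepstral features (MFCCs) with librosa and reduces their dimensionality with PCA from scikit-learn; the audio file name is a placeholder, and this particular feature/subspace choice is only an example.

```python
# MFCC feature extraction followed by PCA subspace reduction (illustrative choices).
import librosa
from sklearn.decomposition import PCA

y, sr = librosa.load("utterance.wav", sr=16000)      # hypothetical speech recording
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # cepstral features, shape (13, frames)

# Subspace learning: project frame-level features onto a lower-dimensional basis.
frames = mfcc.T                                      # (frames, 13)
reduced = PCA(n_components=5).fit_transform(frames)  # (frames, 5)
print(mfcc.shape, reduced.shape)
```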
This paper presents a numerical analysis of the effect of temperature variations on the strain response of polymer optical fiber (POF) Bragg gratings. Results show that the dependence of the Bragg wavelength (λ_B) on strain and temperature variations for POF Bragg gratings lies within the range of 0.462 – 0.470 fm·με⁻¹·°C⁻¹, compared with 0.14 – 0.15 fm·με⁻¹·°C⁻¹ for SOF Bragg gratings. Results also show that the strain response of the POF Bragg gratings changed on average by 1.034 ± 0.02 fm·με⁻¹, which is important for strain sensor applications, especially in environments where the temperature changes.
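For reference, the standard fiber Bragg grating relations behind these sensitivities are given below in their textbook form; they are not equations reproduced from the paper itself.

```latex
\[
  \lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda ,
  \qquad
  \frac{\Delta\lambda_B}{\lambda_B} = (1 - p_e)\,\varepsilon + (\alpha + \xi)\,\Delta T ,
\]
% where $n_{\mathrm{eff}}$ is the effective refractive index, $\Lambda$ the grating
% period, $p_e$ the effective photo-elastic coefficient, $\varepsilon$ the applied
% strain, $\alpha$ the thermal-expansion coefficient, and $\xi$ the thermo-optic
% coefficient of the fiber material.
```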
In recent years, wireless microrobots have received increasing attention due to their huge potential in the biomedical field, especially drug delivery. Microrobots have several benefits, including small size, low weight, sensitivity, and flexibility. These characteristics have led to microscale improvements in control systems and power delivery with the development of submillimeter-sized robots. Wireless control of individual mobile microrobots has been achieved using a variety of propulsion systems, and improving the actuation and navigation of microrobots will have a significant impact. On the other hand, actuation tools must be integrated with and compatible with the human body to drive these untethered microrobots along predefined paths inside biological environments. This study investigated key aspects of microrobots, including medical applications, actuation systems, control systems, and design schemes. The efficiency of a microrobot is affected by many factors, including its material, structure, and environment. Furthermore, integrating a hybrid actuation system and multimodal imaging can improve a microrobot's navigation performance, imaging algorithms, and working environment. In addition, taking into account the travel distance inside the human body, autonomous actuation technology could be used to deliver microrobots precisely and quickly to a specific position using a combination of rapid approaches.
In this paper, we develop an analytical energy-efficiency model for a dual switched-branch diversity receiver in wireless sensor networks under fading environments. To adapt the energy efficiency of a sensor node to channel variations, the optimal packet length at the data link layer is considered. Within this model, the energy efficiency can be effectively improved for a switch-and-stay combining (SSC) receiver with an optimal switching threshold. Moreover, to further improve energy efficiency, we apply Bose-Chaudhuri-Hocquenghem (BCH) error-control coding to the SSC-BPSK receiver node and compare it with a non-diversity NCFSK receiver node. The results show that BCH channel coding can significantly improve the energy efficiency for long link distances and high energy-consumption values over a Rayleigh fading channel.
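A rough numerical sketch of this kind of trade-off, using standard textbook expressions rather than the paper's exact model: the average BPSK bit-error rate over flat Rayleigh fading, the decoding probability of a t-error-correcting BCH(n, k) code, and a throughput-style efficiency (k/n) times the packet-success probability. The code parameters and SNR value are assumptions.

```python
# Throughput-style efficiency of BCH-coded BPSK over Rayleigh fading (textbook formulas).
import math

def ber_bpsk_rayleigh(snr_linear):
    """Average BPSK bit-error rate over flat Rayleigh fading."""
    return 0.5 * (1.0 - math.sqrt(snr_linear / (1.0 + snr_linear)))

def packet_success(n, t, p_bit):
    """Probability a BCH(n, k, t) codeword is decodable (at most t bit errors)."""
    return sum(math.comb(n, i) * p_bit**i * (1 - p_bit)**(n - i) for i in range(t + 1))

def efficiency(n, k, t, snr_db):
    p = ber_bpsk_rayleigh(10 ** (snr_db / 10.0))
    return (k / n) * packet_success(n, t, p)

# Example: BCH(127, 106) correcting t = 3 errors at 15 dB average SNR.
print(round(efficiency(127, 106, 3, 15.0), 4))
```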
Given the role that pipelines play in transporting crude oil, the basis of the global economy, across different environments, hundreds of studies have focused on providing the necessary protection for them. Various technologies have been employed in this pursuit, differing in cost, reliability, and efficiency, among other factors. Computer vision has emerged as a prominent technique in this field, albeit one requiring a robust image-processing algorithm for spill detection. This study employs image segmentation techniques to enable the computer to interpret visual information and images effectively. The research focuses on detecting spills from leaking oil pipes, using images captured by a drone equipped with a Raspberry Pi and Pi camera. These images, along with their global positioning system (GPS) location, are transmitted to the base station using the Message Queuing Telemetry Transport Internet of Things (MQTT IoT) protocol. At the base station, deep learning techniques, specifically Holistically-Nested Edge Detection (HED) and extreme inception (Xception) networks, are employed for image processing to identify contours. The proposed algorithm can detect multiple contours in the images. To pinpoint a black contour representative of an oil spill, the CIELAB (LAB) color space is used to effectively remove shadow effects. If a contour is detected, its area and perimeter are calculated to determine whether it exceeds a certain threshold. The effectiveness of the proposed system was tested on Iraqi oil pipeline systems, demonstrating its capability to detect spills of different sizes.
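A hedged sketch of the contour-analysis stage only (the HED/Xception edge maps are not reproduced here): convert a frame to CIELAB, mask dark low-lightness regions, and report contours whose area exceeds a threshold. The file name, threshold values, and the simple dark-region mask are assumptions.

```python
# Dark-region contour analysis in CIELAB with OpenCV (illustrative thresholds).
import cv2

image = cv2.imread("pipeline_frame.jpg")          # hypothetical drone frame
lab = cv2.cvtColor(image, cv2.COLOR_BGR2LAB)      # CIELAB helps suppress shadow effects
l_channel = lab[:, :, 0]

# Low-lightness regions as candidate oil spills.
_, mask = cv2.threshold(l_channel, 60, 255, cv2.THRESH_BINARY_INV)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

AREA_THRESHOLD = 500.0                            # assumed pixel-area threshold
for c in contours:
    area = cv2.contourArea(c)
    perimeter = cv2.arcLength(c, True)            # closed contour perimeter
    if area > AREA_THRESHOLD:
        print(f"possible spill: area={area:.0f}px, perimeter={perimeter:.0f}px")
```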
In coordinating a group of mobile robots in a real environment, formation is an important task. Multi-mobile-robot formations in global-knowledge environments are achieved using small robots with limited hardware capabilities. To perform formation, localization, orientation, path planning, and obstacle and collision avoidance must be accomplished. Finally, several static and dynamic strategies for polygon-shape formation are implemented. For these formations, minimizing the energy spent by the robots or the time needed to achieve the task has been investigated. These strategies are more efficient at completing the formation because they use a cluster-matching algorithm instead of a triangulation algorithm.
The Internet of Things (IoT) has become a major enabler for sustainable development and has begun to have an impact on society as a whole. Marshes are significant ecosystems that are essential to supporting biodiversity and ecological equilibrium. However, environmental changes and human activity are posing an increasing threat to these fragile ecosystems. An IoT-based marsh monitoring system was created and put into operation to gather real-time data on a variety of environmental factors, such as wind speed, CO2 and hydrogen levels, temperature, humidity, voltage, and current. The system makes use of a network of sensors spread throughout the marsh, which send data to a central node for processing before forwarding it to a cloud-hosted platform. An interactive online application is then used to visualize the data, giving stakeholders important information about the condition and health of the marsh. Because the suggested system makes it possible to monitor and manage marsh ecosystems effectively, it may promote sustainable development.
This paper focuses on designing distributed wireless sensor network gateways equipped with an Intrusion Detection System (IDS). The main contribution of this work is the attempt to insert IDS functionality into the gateway node (a UBICOM IP2022 network processor chip) itself. This was achieved by building a lightweight signature-based IDS derived from the well-known open-source SNORT IDS. Because gateway nodes have limited processing power and energy constraints, adding further tasks (the IDS program) may seriously affect their performance; the current design therefore treats these constraints as a priority and uses a special protocol to achieve this goal. To optimize the performance of the gateway nodes, some preprocessing tasks were offloaded from the gateway nodes to a proposed classification and processing server, and a new searching algorithm was suggested. Different measures were taken to validate the design procedure, and a detailed simulation model was built to examine the behavior of the system in different environments.
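As a toy illustration of lightweight signature-based inspection of the kind described above (the rule set, payload, and matching strategy are made up; the paper's own searching algorithm and the SNORT rule format are not reproduced):

```python
# Minimal payload signature matching (illustrative signatures only).
SIGNATURES = {b"/etc/passwd": "path traversal", b"' OR 1=1": "SQL injection"}

def inspect(payload: bytes):
    """Return the names of all signatures found in the packet payload."""
    return [name for pattern, name in SIGNATURES.items() if pattern in payload]

packet = b"GET /../../etc/passwd HTTP/1.1"
print(inspect(packet))      # ['path traversal']
```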
Searchable symmetric encryption (SSE) enables clients to outsource their encrypted documents to a remote server and allows them to search the outsourced data efficiently without violating the privacy of the documents and search queries. Dynamic SSE (DSSE) schemes additionally support update queries, where documents can be added or removed at the expense of leaking more information to the server. Two important privacy notions are addressed in DSSE schemes: forward and backward privacy. The first prevents associating newly added documents with previously issued search queries, while the second ensures that deleted documents cannot be linked with subsequent search queries. Backward privacy has three formal types of leakage, ordered from strong to weak security: Type-I, Type-II, and Type-III. In this paper, we propose a new DSSE scheme that achieves Type-II backward and forward privacy by generating fresh keys for each search query and preventing the server from learning the underlying operation (add or delete) included in an update query. Our scheme improves I/O performance and search cost. We implement our scheme and compare its efficiency against the most efficient backward-private DSSE schemes in the literature with the same leakage: MITRA and MITRA*. The results show that our scheme outperforms the previous schemes in terms of efficiency in dynamic environments. In our experiments, the server takes 699 ms to search and return 100,000 results.
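A toy Python sketch in the spirit of such forward-private indexes (loosely MITRA-like, not the proposed scheme): the client keeps per-keyword counters, derives one-time update tokens with a PRF, and masks each (document, operation) entry so the server cannot tell an add from a delete. Real schemes involve careful leakage analysis that this sketch does not attempt.

```python
# Toy forward-private DSSE-style index with masked add/delete entries.
import hmac, hashlib, os

KEY = os.urandom(32)               # client secret key
server_index = {}                  # server storage: token -> masked entry
counters = {}                      # client state: keyword -> number of updates

def prf(key, msg):
    return hmac.new(key, msg, hashlib.sha256).digest()

def update(keyword, doc_id, op):   # op in {"add", "del"}
    c = counters.get(keyword, 0)
    counters[keyword] = c + 1
    token = prf(KEY, f"{keyword}|{c}".encode())       # fresh address per update
    pad = prf(KEY, b"mask|" + token)                  # one-time pad hides (id, op)
    entry = f"{doc_id}|{op}".encode().ljust(32, b"\x00")
    server_index[token] = bytes(a ^ b for a, b in zip(entry, pad))

def search(keyword):
    results, removed = set(), set()
    for c in range(counters.get(keyword, 0)):         # client recomputes all tokens
        token = prf(KEY, f"{keyword}|{c}".encode())
        pad = prf(KEY, b"mask|" + token)
        entry = bytes(a ^ b for a, b in zip(server_index[token], pad))
        doc_id, op = entry.rstrip(b"\x00").decode().split("|")
        (removed if op == "del" else results).add(doc_id)
    return results - removed

update("invoice", "doc1", "add"); update("invoice", "doc2", "add"); update("invoice", "doc1", "del")
print(search("invoice"))           # {'doc2'}
```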
Although the basic RRT algorithm is considered a traditional search method, it has been widely used in the field of robot path planning (for manipulators and mobile robots), especially in the past decade. This algorithm has many features that give it superiority over other methods. On the other hand, the basic RRT suffers from a poor convergence rate (it takes a long time to find the goal point), especially in environments with cluttered obstacles or where targets are located in narrow passages. Many studies have addressed this problem in recent years. This paper introduces an improved method, called Hybrid RRT-A*, to overcome the shortcomings of the original RRT, specifically its slow convergence and high cost. The heuristic function of the A* algorithm is combined with RRT to reduce tree expansion and guide it towards the goal with fewer nodes and less time. Various experiments were conducted with different environment scenarios to compare the proposed method with the basic RRT and A* under the same conditions, and they show remarkable performance. In the worst of these scenarios, the proposed method finds a path in about 4.9 seconds, compared with 18.3 and 34 seconds for A* and RRT, respectively.
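An illustrative heuristic-guided RRT sketch, not the paper's exact Hybrid RRT-A*: samples are goal-biased, and the tree node chosen for extension minimizes an A*-style score g(node) plus the distance toward the sample. The map, obstacle, and parameters are assumptions.

```python
# Goal-biased RRT with A*-style node selection (all constants are assumptions).
import random, math

START, GOAL = (1.0, 1.0), (18.0, 18.0)
OBSTACLE, RADIUS = (10.0, 10.0), 3.0
STEP, GOAL_BIAS, MAX_ITERS = 1.0, 0.2, 5000

nodes = {START: None}              # node -> parent
g_cost = {START: 0.0}              # cost from the start to each node

def collision_free(p):
    return math.dist(p, OBSTACLE) > RADIUS and 0 <= p[0] <= 20 and 0 <= p[1] <= 20

for _ in range(MAX_ITERS):
    sample = GOAL if random.random() < GOAL_BIAS else (random.uniform(0, 20), random.uniform(0, 20))
    # A*-style selection: cost from the start plus distance toward the sample.
    near = min(nodes, key=lambda n: g_cost[n] + math.dist(n, sample))
    d = math.dist(near, sample)
    if d == 0:
        continue
    new = (near[0] + STEP * (sample[0] - near[0]) / d,
           near[1] + STEP * (sample[1] - near[1]) / d)
    if not collision_free(new):
        continue
    nodes[new], g_cost[new] = near, g_cost[near] + STEP
    if math.dist(new, GOAL) < STEP:          # close enough: reconstruct the path
        path, n = [GOAL], new
        while n is not None:
            path.append(n)
            n = nodes[n]
        print("path length:", round(g_cost[new] + math.dist(new, GOAL), 2), "nodes:", len(path))
        break
```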
Efficient energy harvesting from photovoltaic (PV) systems in changing environments remains a challenge, especially under partial shading conditions (PSC). This research presents a new Maximum Power Point Tracking (MPPT) method that uses fuzzy logic and neural networks to make PV systems more adaptive and accurate when exposed to PSC. Unlike other MPPT methods that struggle with the nonlinearity and transient dynamics of PSC, our approach uses a fuzzy logic controller (FLC) specifically designed to handle uncertainty and imprecision. At the same time, an artificial neural network (ANN) is trained to estimate the most likely location of the Global Maximum Power Point (GMPP) from historical patterns of irradiance and temperature changes. The fuzzy controller fine-tunes the ANN's prediction, ensuring robust and precise MPPT operation. Extensive MATLAB/Simulink simulations were run to validate the proposed method. The results show that combining fuzzy logic with neural networks substantially outperforms traditional MPPT algorithms in terms of speed, stability, and response to changing shading patterns. This technique proposes a dual-layered control mechanism in which the robustness of fuzzy logic and the predictive power of neural networks converge to form a resilient and efficient MPPT system, marking a significant advancement in PV technology.
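A minimal sketch of the idea only, with heavy assumptions: a stand-in for the trained ANN produces a coarse duty-cycle estimate from irradiance and temperature, and a small fuzzy controller with triangular memberships fine-tunes it from the measured dP/dV slope. Neither the membership functions nor the stub reflect the paper's trained models.

```python
# ANN-style prior estimate fine-tuned by a tiny fuzzy step controller (illustrative only).
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def ann_duty_estimate(irradiance, temperature):
    # Stand-in for the trained ANN: a crude linear guess of the GMPP duty cycle.
    return 0.5 + 0.0002 * (irradiance - 800) - 0.002 * (temperature - 25)

def fuzzy_step(dP, dV):
    slope = dP / dV if dV != 0 else 0.0
    # Memberships for "negative", "zero", "positive" slope of the P-V curve.
    neg, zero, pos = tri(slope, -10, -5, 0), tri(slope, -2, 0, 2), tri(slope, 0, 5, 10)
    total = neg + zero + pos
    # Rule outputs (step down, hold, step up) with weighted-average defuzzification.
    return 0.0 if total == 0 else (-0.01 * neg + 0.0 * zero + 0.01 * pos) / total

duty = ann_duty_estimate(650.0, 30.0)      # coarse GMPP guess under assumed PSC
duty += fuzzy_step(dP=1.8, dV=0.4)         # fuzzy fine-tuning around the guess
print(round(duty, 4))
```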
Today, robotics is a trending field, as it is used in many environments that are very important in human life. COVID-19 is now the deadliest disease in the world, and many studies are being conducted to find solutions and avoid contracting it. The proposed system detects a suspected infection according to specific indications, issues a warning, and forwards the case to a specialist doctor. This system is designed to work in service departments such as universities, institutes, and all state departments serving citizens. The system consists of two parts: the first is fixed and placed on the desk, and the other is mobile, mounted on a special robot that moves to perform the required task. The system was tested at the University of Basrah, within the College of Engineering, Department of Electrical Engineering, on teaching staff, students, and other staff during the period of final academic exams. The presence of such a device serves as a warning of a specific condition and is not a treatment for it, as treatment is prescribed by the specialist doctor. It was found that the average rate of infected cases is about 3% of the total number of students, teaching staff, and working staff. The results were documented in dedicated tables sent to the dean of the college, together with the attendance tables, to report the daily health status of the students.