The article discusses wear of feeding machine rollers caused by speed mismatch in the material tracking mode. Existing methods of dealing with wear address the effect of the problem rather than its cause. One way to reduce the intensity of roller barrel wear is to develop a method of controlling the speed of the feeding machine that reduces the mismatch between the speeds of the rollers and the rolled product without violating the known technological requirements for creating pulling and braking forces. An algorithm is disclosed for calculating a speed adjustment based on metal tension, which compensates for roller wear and reduces the friction force. Modeling of the system with the developed algorithm showed that the speed mismatch during material tracking is eliminated, which is expected to reduce the intensity of roller wear.
Keywords: speed correction system, feeding machine, roller wear, metal tension, control system, speed mismatch, friction force reduction
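The tension-based speed correction described in the abstract can be illustrated with a minimal sketch: a proportional controller adjusts the roller speed from the measured tension error, driving the roller-strip speed mismatch toward zero. The first-order tension model, gains, and set-point below are illustrative assumptions, not the article's actual algorithm.

```python
# Minimal sketch of tension-based roller speed correction (illustrative
# first-order tension model and gains; not the article's actual algorithm).

def simulate(t_ref=100.0, v_strip=2.0, kp=0.05, c=50.0, dt=0.01, steps=500):
    """Proportional correction: the roller speed is adjusted from the
    tension error, so tension settles at t_ref and the speed mismatch
    between roller and strip vanishes."""
    tension = 0.0
    v_roll = v_strip
    for _ in range(steps):
        v_roll = v_strip + kp * (t_ref - tension)   # speed correction
        tension += c * (v_roll - v_strip) * dt      # tension build-up model
    return tension, abs(v_roll - v_strip)

tension, mismatch = simulate()
print(round(tension, 3), round(mismatch, 6))
```

With the nominal speed reference equal to the strip speed, the tension converges exponentially to the set-point and the residual speed mismatch decays with it.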
Relevance of the research topic. Modern cyber attacks are becoming more complex and diverse, which makes classical anomaly detection methods, such as signature-based and heuristic analysis, insufficiently effective. In this regard, it is necessary to develop more advanced network threat detection systems based on machine learning and artificial intelligence technologies. Problem statement. Existing methods of detecting malicious traffic often suffer from high false-positive rates and insufficient accuracy against real network threats. This reduces the effectiveness of cybersecurity systems and makes it difficult to identify new attacks. The purpose of the study. The purpose of this work is to develop a malicious traffic detection system that increases the number of detected anomalies in network traffic through the introduction of machine learning and AI technologies. Research methods. To achieve this goal, a thorough analysis and preprocessing of data obtained from publicly available datasets such as CICIDS2017 and KDD Cup 1999 was carried out.
Keywords: anomaly detection, malicious traffic, cybersecurity, machine learning, artificial intelligence, signature methods
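As a point of contrast with the ML models the study trains on CICIDS2017 and KDD Cup 1999 features, a purely statistical baseline can be sketched in a few lines: flag flows whose byte count deviates from the mean by more than a z-score threshold. The synthetic flow sizes and the threshold are assumptions for illustration only.

```python
# Simple statistical anomaly baseline over flow byte counts: flag flows
# whose z-score exceeds a threshold. Illustrative only -- the study itself
# uses ML models trained on CICIDS2017 / KDD Cup 1999 features.
from statistics import mean, pstdev

def detect_anomalies(flow_bytes, z_threshold=2.5):
    mu, sigma = mean(flow_bytes), pstdev(flow_bytes)
    return [i for i, b in enumerate(flow_bytes)
            if sigma > 0 and abs(b - mu) / sigma > z_threshold]

# Synthetic traffic: mostly ~1 KB flows plus one large exfiltration-like flow.
flows = [1000, 1100, 950, 1020, 980, 1050, 990, 1010, 50000]
print(detect_anomalies(flows))  # index of the outlier flow
```

Such a baseline illustrates the weakness the abstract names: a single global threshold either misses subtle attacks or floods the operator with false positives, which is what motivates learned models.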
The article considers the main content of production diversification as an activity of business entities, manifested in the purchase of operating enterprises, the organization of new enterprises, the redistribution of investments in the interests of the organization, and the development of new production on available floor space. The most important organizational and economic goals of diversification management are presented through the innovative activity of the industrial enterprise.
Keywords: construction, lean manufacturing, process approach, value creation process, IDEF0 notation, kpi
The current problems of the construction industry are considered, and an algorithm is proposed for introducing modern flexible management methodologies to improve the efficiency of the management process in construction design organizations; a variant of an integrated efficiency assessment system based on KPIs is also developed.
Keywords: construction, design organizations, KPIs, flexible management, Agile, Lean manufacturing, stakeholders, efficiency
Modern digitalization processes involve the use of intelligent systems at key stages of information processing. Given that the data available for intelligent analysis in organizational systems are often fuzzy, there is a problem of comparing the corresponding units of information with each other. There are several known methods for such a comparison. In particular, for random fuzzy variables with known distribution laws, the degree of coincidence of these distribution laws can be used as a criterion for the correspondence of one random variable to another. However, this approach does not have the flexibility required to solve practical problems. The approach we propose allows comparing fuzzy data with fuzzy, fuzzy with crisp, and crisp with crisp data. The paper provides an example illustrating this approach. The material presented in the study was initially focused on managing organizational systems in education. However, its results can be extended to other organizational systems.
Keywords: fuzzy data, weakly structured problems, comparison criteria, hierarchy analysis method, systems analysis, fuzzy benchmarking
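One standard way to compare fuzzy values, which can serve as a concrete illustration of the fuzzy-with-fuzzy case, is the possibility measure sup_x min(muA(x), muB(x)); a crisp value is then just a degenerate membership function, covering the fuzzy-with-crisp and crisp-with-crisp cases. The triangular memberships and numbers below are illustrative assumptions, not necessarily the paper's own criterion.

```python
# Sketch: comparing two fuzzy values by the possibility measure
# sup_x min(muA(x), muB(x)). Triangular memberships and the example values
# are illustrative assumptions, not the paper's specific criterion.

def tri(a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    def mu(x):
        if x <= a or x >= c:
            return 1.0 if x == b else 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

def possibility(mu_a, mu_b, grid):
    """Degree to which the two fuzzy values can coincide."""
    return max(min(mu_a(x), mu_b(x)) for x in grid)

grid = [i / 100 for i in range(0, 1001)]      # domain 0..10, step 0.01
fuzzy_good = tri(6.0, 8.0, 10.0)              # "about 8"
fuzzy_fair = tri(3.0, 5.0, 7.0)               # "about 5"
print(round(possibility(fuzzy_good, fuzzy_fair, grid), 2))
```

The two memberships overlap only on [6, 7], so the possibility of coincidence is low but nonzero, which is exactly the graded answer a crisp comparison cannot give.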
This study investigates the integration of piezoelectric elements with marine buoys for the purpose of utilising wave energy in autonomous marine devices. The buoy system was subjected to controlled wave conditions during testing, resulting in a peak voltage of 5.6 V and a maximum power of 40 µW. The findings indicate the viability of the system in powering low-power marine equipment. The integration of piezoelectric elements into marine buoy systems offers a cost-effective hybrid solution, making it a promising power source for buoys and sensors in remote offshore environments.
Keywords: wave energy conversion, sea waves, piezoelectric elements, wave height, wavelength
The purpose of the research is to increase the level of specification of sentiment within the framework of sentiment analysis of Russian-language texts by developing a dataset with an extensive set of emotional categories. The paper discusses the main methods of sentiment analysis and the main emotional models. A software system for decentralized data tagging has been developed and described. The novelty of this work lies in the fact that, to determine the emotional coloring of Russian-language texts, an emotional model containing more than eight emotion classes, namely R. Plutchik's model, is used for the first time. As a result, a new dataset was developed for the study and analysis of emotions. This dataset consists of 24,435 unique records labeled into 32 emotion classes, making it one of the most diverse and detailed datasets in the field. Using the resulting dataset, a neural network was trained that determines the author's set of emotions when writing text. The resulting dataset provides an opportunity for further research in this area. One of the promising tasks is to enhance the efficiency of neural networks trained on this dataset.
Keywords: sentiment, analysis, model, Robert Plutchik, emotions, markup, text
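Since a text can carry several emotions at once, training a network that "determines the author's set of emotions" is a multi-label task: each record's emotion set is encoded as a binary target vector. The sketch below uses only Plutchik's eight basic emotions for brevity; the actual dataset distinguishes 32 classes, so the label list here is an illustrative assumption.

```python
# Multi-label encoding of a text's emotion set over Plutchik's 8 basic
# emotions (the dataset itself distinguishes 32 classes; this reduced
# label list is an illustrative assumption).
PLUTCHIK = ["joy", "trust", "fear", "surprise",
            "sadness", "disgust", "anger", "anticipation"]

def encode(emotions):
    """Binary target vector for multi-label training."""
    present = set(emotions)
    return [1 if e in present else 0 for e in PLUTCHIK]

print(encode({"joy", "anticipation"}))
```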
With the development of low-orbit satellite Internet systems (LSIS), ensuring effective operation under intentional jamming comes to the fore. One solution is to use systems combining OFDM methods with generators implementing frequency hopping (FH). Obviously, the more complex the algorithm for selecting operating frequencies, the more effective the FH system. The article proposes using the SPN cipher "Grasshopper" (Kuznyechik) as the generator for selecting operating frequencies. As a result, the FH system will be highly resistant to attempts by electronic warfare systems to calculate the operating frequency numbers. However, failures and malfunctions may occur during operation of such a generator. To mitigate their consequences, it is proposed to implement the SPN cipher using polynomial modular codes of residue classes (PMCC). One of the transformations in Grasshopper is a nonlinear transformation that performs the substitution operation. Creating a new mathematical model for performing this nonlinear transformation using PMCC will ensure operation of the SPN-cipher-based FH generator under failures and malfunctions.
Keywords: low-orbit satellite Internet systems, the Grasshopper SPN cipher, nonlinear transformations, modular codes of residue classes, mathematical model, fault tolerance, frequency hopping, polynomial modular code of residue classes
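The core idea of a cipher-driven hopping generator is that transmitter and receiver share a key and derive the same pseudorandom channel sequence from a hop counter, while a jammer without the key cannot predict it. Kuznyechik (Grasshopper) is not in the Python standard library, so the sketch below substitutes HMAC-SHA-256 as the keyed pseudorandom function purely to illustrate the counter-to-frequency-slot mapping; this substitution is an assumption, not the article's construction.

```python
# Frequency-hopping schedule driven by a keyed pseudorandom function.
# The article uses the Grasshopper (Kuznyechik) SPN cipher; it is not in
# the Python standard library, so HMAC-SHA-256 stands in here purely to
# illustrate the counter -> frequency-slot mapping.
import hmac
import hashlib

def hop_sequence(key, n_hops, n_channels=64):
    """Derive n_hops channel numbers from a shared key and hop counter."""
    seq = []
    for ctr in range(n_hops):
        mac = hmac.new(key, ctr.to_bytes(8, "big"), hashlib.sha256).digest()
        seq.append(int.from_bytes(mac[:4], "big") % n_channels)
    return seq

seq = hop_sequence(b"shared-secret-key", 8)
print(seq)  # deterministic for a given key, unpredictable without it
```

Both ends regenerate the identical sequence from the synchronized counter, so no schedule ever needs to be transmitted.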
The article presents a technique for automated control of the gloss of chocolate bars based on machine vision, integrated into the functional scheme of automation of the cooling and molding processes. The key factors affecting gloss are considered, existing control methods are analyzed, and the need for continuous objective quality assessment is substantiated. To optimize the process, a digital simulation has been created in the R-PRO environment, which makes it possible to simulate various technological modes. The developed image processing algorithms calculate quantitative gloss values and form feedback with the control system, adjusting key production parameters. The proposed approach improves the accuracy of control, reduces defects, and shortens equipment debugging time, creating conditions for the further development of full automation in the chocolate factory.
Keywords: chocolate, surface gloss, automation, machine vision, quality control, cooling and molding, digital simulation
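A quantitative gloss value of the kind the abstract mentions can be approximated very simply: on a grayscale image of the bar, gloss correlates with the fraction of pixels in the specular-highlight range. The threshold and the toy "image" below are illustrative assumptions; the article's algorithms are more elaborate.

```python
# Sketch of a gloss score from a grayscale image patch: the fraction of
# pixels above a specular-highlight threshold. The threshold and the toy
# patch are illustrative; the article's algorithms are more elaborate.

def gloss_score(image, threshold=200):
    """Fraction of pixels at or above the specular-highlight threshold."""
    pixels = [p for row in image for p in row]
    return sum(p >= threshold for p in pixels) / len(pixels)

# Toy 4x4 patch: a glossy bar shows a bright specular band.
patch = [[ 90,  95, 100,  92],
         [210, 230, 225, 215],
         [205, 240, 235, 220],
         [ 88,  93,  97,  90]]
print(gloss_score(patch))
```

Feeding such a score back to the controller (e.g. adjusting cooling-tunnel temperature when the score drops) is the closed loop the abstract describes.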
The article presents the results of a study devoted to the development of an identification subsystem for an industrial process operator in a mobile simulator used for training and monitoring professional skills. The functional requirements for the operator identification subsystem based on neural network technologies, the processes of user interaction with the personality recognition subsystem, and the loading of a reference image for subsequent identification of the operator during training and monitoring on the simulator are formalized using visual models in UML notation. A prototype of the subsystem has been developed based on the Kotlin programming language and the TensorFlow library. The developed image analysis subsystem detects and initializes a face in under 0.5 s, which makes it especially effective for real-time tasks where performance plays a key role. Local data processing on mobile devices ensures protection of user privacy by eliminating data transfer to remote servers, which minimizes the risks of information leaks. Optimization of power consumption ensures long-term operation on devices with limited battery capacity, which makes the system convenient and practical to use. The considered subsystem is planned to be adapted for monitoring the formation of skills for working on equipment during operator training on mobile simulators. The subsystem, based on VR/AR technologies, as well as a trained neural network, will analyze data on user reactions in real time.
Keywords: mobile simulators, neural networks, user identification, professional training, UML diagrams, TensorFlow
The present study aims to explore the methodologies employed in practice to ascertain the parameters of processes occurring in supercritical fluid media. A primary focus of this investigation lies in the solubility of key components of the system in supercritical fluid solvents, with a view to understanding the limitations of mathematical models in qualitatively predicting solubility outside the investigated ranges of values. This analysis seeks to elucidate the potential challenges and opportunities in conducting experimental studies in this domain. However, within the domain of supercritical fluid technologies, the optimization of processes and the prediction of their properties is attainable through the utilization of models and machine learning methodologies, leveraging both accumulated experimental and calculated data. The present study is dedicated to the examination of this approach, encompassing the consideration of system input parameters, solvent properties, solute properties, and the designated output parameter, solubility. The findings of the present study demonstrate the efficacy of this approach in predicting the solubility process through machine learning.
Keywords: supercritical fluids, solubility of substances, solubility factors, solubility prediction, machine learning, residue analysis, feature importance analysis
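A concrete instance of predicting solubility from system parameters is a Chrastil-type log-linear model, ln S = k ln(rho) + b, fitted by least squares to density-solubility data. The model choice and the synthetic numbers below are assumptions for illustration; the study itself evaluates machine learning models over richer feature sets.

```python
# Sketch: fitting a Chrastil-type log-linear solubility model
# ln S = k * ln(rho) + b by least squares, where rho is solvent density
# and S solubility. The model choice and data are illustrative
# assumptions, not the study's ML approach or measurements.
import math

def fit_line(xs, ys):
    """Ordinary least squares for a single predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    k = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return k, my - k * mx

densities = [300, 400, 500, 600, 700, 800]          # kg/m^3 (synthetic)
k_true, b_true = 5.0, -20.0
ln_s = [k_true * math.log(r) + b_true for r in densities]
k, b = fit_line([math.log(r) for r in densities], ln_s)
print(round(k, 3), round(b, 3))
```

The abstract's point is precisely that such closed-form correlations extrapolate poorly outside the fitted range, which is what motivates the data-driven models.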
The composition of heavy cement concretes modified with a complex additive based on industrial waste (an alumina-containing component, aluminum slag (ASH), and spent molding mixture (OFS)) is optimized using the PlanExp B-D13 software package in a three-factor planned experiment, with compressive strength on the 2nd and 28th days of hardening as the optimization criteria.
Keywords: heavy cement concretes, fast-hardening concretes, optimization, experiment planning, strength indicators, industrial waste, spent molding mixture, aluminum slag
The article is devoted to the development and implementation of a two-stage magnetometer calibration algorithm integrated into the navigation system of a small-class unmanned underwater vehicle. At the first stage, an ellipsoidal approximation method is used to compensate for soft iron and hard iron distortion, ensuring the correct geometric location of magnetometer measurements. The second stage involves a method for estimating the rotation between the coordinate systems of the magnetometer and accelerometer using quaternions as rotation parameters. Experimental verification of the algorithm demonstrated its effectiveness. Following completion of the two-stage calibration, calibration parameters were determined, and their use confirmed good consistency between magnetometer readings and actual magnetic field data, indicating the feasibility of using this technique for calibrating magnetometers. The proposed two-stage calibration algorithm does not require laboratory equipment and can be carried out under real-world operating conditions. This makes it possible to integrate it into the onboard software of unmanned underwater vehicles.
Keywords: calibration, magnetometer, accelerometer, MEMS sensor, AHRS, navigation system, unmanned underwater vehicle, ellipsoid approximation, quaternion, magnetic inclination
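The first calibration stage can be sketched in a simplified form: raw magnetometer samples taken while rotating the vehicle lie on an ellipsoid whose centre is the hard-iron offset. Fitting an axis-aligned ellipsoid a·x² + b·y² + c·z² + d·x + e·y + f·z = 1 by least squares recovers that offset; the article's full method also estimates cross-axis (soft-iron) terms, which this sketch omits.

```python
# First calibration stage, simplified: fit an axis-aligned ellipsoid
# a*x^2 + b*y^2 + c*z^2 + d*x + e*y + f*z = 1 to raw magnetometer samples
# and recover the hard-iron offset (the ellipsoid centre). The article's
# full method also estimates cross-axis (soft-iron) terms.
import numpy as np

def fit_hard_iron(samples):
    x, y, z = samples.T
    A = np.column_stack([x * x, y * y, z * z, x, y, z])
    coef, *_ = np.linalg.lstsq(A, np.ones(len(samples)), rcond=None)
    # Centre of the fitted ellipsoid: c_i = -coef[3+i] / (2 * coef[i]).
    return np.array([-coef[3 + i] / (2 * coef[i]) for i in range(3)])

# Synthetic samples on an ellipsoid centred at the true hard-iron bias.
rng = np.random.default_rng(0)
true_bias, radii = np.array([1.0, -2.0, 0.5]), np.array([2.0, 3.0, 4.0])
dirs = rng.normal(size=(500, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
samples = dirs * radii + true_bias
print(np.round(fit_hard_iron(samples), 3))
```

Subtracting the recovered centre from each raw sample re-centres the measurement cloud at the origin, which is the "correct geometric location" the abstract refers to.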
When evaluating student work, the analysis of written assignments, particularly the analysis of source code, becomes especially relevant. This article discusses an approach for evaluating the dynamics of feature changes in students' source code. Various source code metrics are analyzed and key metrics are identified, including quantitative metrics, program control flow complexity metrics, and the TIOBE quality indicator. A set of text data containing program source codes from a website dedicated to practical programming was used to determine threshold values for each metric and categorize them. The obtained results were used to conduct an analysis of students' source code using a developed service that allows for the evaluation of work based on key features, the observation of dynamics in code indicators, and the understanding of a student's position within the group based on the obtained values.
Keywords: machine learning, text data analysis, program code analysis, digital footprint, data visualization
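One control-flow complexity metric of the kind the article uses can be sketched directly: cyclomatic complexity estimated by counting branch points in the program's AST. The node set counted below is a simplified assumption; the exact metric definitions and TIOBE thresholds used in the service are not reproduced here.

```python
# Sketch of one control-flow complexity metric: cyclomatic complexity
# estimated by counting branch points in the AST. The counted node set is
# a simplified assumption, not the service's exact metric definition.
import ast

BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source):
    """1 + number of decision points in the parsed source."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES)
                   for node in ast.walk(tree))

code = """
def grade(score):
    if score >= 90 and score <= 100:
        return "A"
    elif score >= 75:
        return "B"
    return "C"
"""
print(cyclomatic_complexity(code))
```

Tracking such a value across a student's submissions over a term is what yields the "dynamics of code indicators" the service visualizes.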
The study presents an approach to modelling multivariate time series via parameterisation, using the yield curve as an example. The effectiveness of adding parameterisation coefficients to the predictors is evaluated, and new loss functions are proposed that focus on modelling the shape of the curve. Prediction models including LSTM, Prophet, and hybrid combinations were applied. A Python-based system was developed to automate data processing and evaluation. The method improves the accuracy and interpretability of forecasts, offering a promising tool for financial modelling.
Keywords: machine learning, financial engineering, stock market modeling, bond market
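The abstract does not name the parameterisation it uses, so as an illustration only, a common choice for yield curves is the Nelson-Siegel form, which compresses the whole curve into four coefficients that can then serve as model features or prediction targets.

```python
# A common yield-curve parameterisation (Nelson-Siegel). The abstract does
# not name its model, so this specific form is an assumption; it shows how
# a whole curve reduces to a few coefficients usable as model features.
import math

def nelson_siegel(t, beta0, beta1, beta2, tau):
    """Yield at maturity t (years) from four curve coefficients:
    beta0 = level, beta1 = slope, beta2 = curvature, tau = decay scale."""
    x = t / tau
    loading = (1 - math.exp(-x)) / x
    return beta0 + beta1 * loading + beta2 * (loading - math.exp(-x))

# Level ~4%, short end ~2%, a mid-curve hump (illustrative numbers).
curve = [round(nelson_siegel(t, 0.04, -0.02, 0.02, 1.5), 4)
         for t in (0.25, 1, 2, 5, 10, 30)]
print(curve)
```

Forecasting the four coefficients instead of each maturity point independently is what keeps predicted curves smooth and economically interpretable, which matches the abstract's emphasis on modelling the shape of the curve.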