The article studies the possibilities of automatic transcription and analysis of audio recordings of telephone conversations between sales department employees and clients. The relevance of the study stems from the growing volume of voice data and the need for its rapid processing in organizations whose activities are closely tied to selling products or services to clients. Automatic processing of audio recordings makes it possible to check the quality of call center employees' work and to identify violations of client conversation scripts. The proposed software solution is based on the Whisper model for speech recognition, the pyannote.audio library for speaker diarization, and the RapidFuzz library for fuzzy string matching during analysis. An experimental study conducted with the developed software solution confirmed that modern language models and algorithms achieve a high degree of automation in processing audio recordings and can serve as a preliminary control tool without the participation of a specialist. The results confirm the practical applicability of the authors' approach to quality control problems in sales departments and call centers.
Keywords: call center, audio file, speech recognition, transcription, speaker diarization, replica classification, audio recording processing, Whisper, pyannote.audio, RapidFuzz
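The script-compliance check described in the abstract can be sketched as a fuzzy search of required phrases in a transcript. The sketch below uses the standard-library `difflib.SequenceMatcher` as a stand-in for RapidFuzz's `fuzz.ratio`; the script phrases, transcript, and threshold are purely illustrative, not taken from the article.

```python
from difflib import SequenceMatcher

# Hypothetical script phrases a call-center employee is required to say.
SCRIPT_PHRASES = [
    "good afternoon, my name is",
    "how can i help you",
    "thank you for calling",
]

def similarity(a: str, b: str) -> float:
    """Fuzzy similarity in [0, 100], analogous to rapidfuzz.fuzz.ratio."""
    return 100.0 * SequenceMatcher(None, a, b).ratio()

def check_script(transcript: str, threshold: float = 70.0) -> dict:
    """For each required phrase, report whether the transcript contains a fuzzy match."""
    lines = [ln.strip().lower() for ln in transcript.splitlines() if ln.strip()]
    report = {}
    for phrase in SCRIPT_PHRASES:
        best = max((similarity(phrase, ln) for ln in lines), default=0.0)
        report[phrase] = best >= threshold
    return report

transcript = """Good afternoon, my name is Anna.
How can I help you today?
Goodbye."""
report = check_script(transcript)  # "thank you for calling" is flagged as missing
```

In a full pipeline, the transcript would come from Whisper output segmented by pyannote.audio, with the check run per speaker turn.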
The article addresses the integration and processing of heterogeneous data both within a single company and during interaction between various participants in business processes under conditions of digital transformation. Special attention is given to collaboration between equipment manufacturers and industrial enterprises, emphasizing the importance of aligning and transforming data when interacting with heterogeneous information systems. The article discusses the problem of integrating historical data, the challenges of transitioning to new infrastructure, and a solution based on principles similar to those of open standards such as OpenCL. Particular emphasis is placed on providing complete and consistent datasets, developing effective mechanisms for semantic integration, and using ontological approaches to overcome difficulties in comparing and interpreting diverse data formats. The article highlights the necessity of continuously updating metadata dictionaries and establishing connections between different data sources to ensure high-quality, reliable integration. The proposed methods aim to create sustainable mechanisms for exchanging information among multiple business entities in support of informed management decisions.
Keywords: digital transformation, heterogeneous systems, ERP/MES systems, ontology, semantic integration, metadata, data mapping
The article discusses the structure and operating principle of an improved centrifugal unit for mixing bulk materials, a special feature of which is the ability to control mixing modes. Owing to the unit's design, the selection of a rational position of the bump makes it possible to provide conditions for the impact interaction of particle flows under which a high-quality homogeneous mixture is formed from components whose particles differ in size, shape, and other parameters. The resulting mixture is characterized by the coefficient of heterogeneity, whose derivation is based on a probabilistic approach. A computational scheme of the rarefied flow formation process is given, and an expression is derived for calculating the coefficient of heterogeneity when mixing bulk media whose particles differ in size, shape, and other parameters. The research makes it possible not only to predict the quality of the resulting mixture but also to identify the factors with the greatest impact on achieving the required uniformity.
Keywords: aggregate, bulk media, mixing, coefficient of heterogeneity, concentration, design scheme, particle size
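The abstract does not reproduce the derived expression. For orientation, the coefficient of heterogeneity in the bulk-materials mixing literature is commonly taken as the coefficient of variation of the key component's concentration across samples; a standard form (not necessarily the expression derived in the article) is:

```latex
V_c = \frac{100}{\bar{c}} \sqrt{\frac{\sum_{i=1}^{n} \left( c_i - \bar{c} \right)^2}{n - 1}}\ \%
```

where $c_i$ is the key-component concentration in the $i$-th sample, $\bar{c}$ is the mean concentration, and $n$ is the number of samples; smaller $V_c$ corresponds to a more uniform mixture.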
The article discusses some methods for constructing long-span coverings from precast reinforced concrete elements and prefabricated steel structures. To systematize these design and technological solutions and to determine the effectiveness of their application in terms of manufacturability parameters, a comparative analysis was carried out. The construction technologies were compared according to the following parameters: specific and total labor intensity, the level of mechanization, the total number of elements, the average and maximum mass of one element, the total mass of the mounted elements, and the equilibrium coefficient. The analysis showed that for reinforced concrete structures, installation in blocks is most effective, involving preliminary enlargement at ground level followed by lifting and installation in the design position. Precast reinforced concrete shells have a higher level of mechanization and degree of equilibrium, which makes it possible to use crane equipment efficiently; however, due to their considerable weight, they require supporting structures and high-capacity cranes. Installing prefabricated steel structures as a single unit after preliminary enlargement at ground level is the least labor-intensive, but the need to install a large number of low-mass piece elements reduces manufacturability.
Keywords: installation of long-span structures, installation of triple-layer rotational shells of double curvature, installation of steel beam structures, installation of a spatial structural roof unit, installation of the entire roof structure as a single unit
The article discusses a software module developed by the authors for automatic generation of program code based on UML diagrams. The relevance of developing this module is due to the limitations of existing foreign code generation tools in terms of functionality, ease of use, and support for modern technologies, as well as their unavailability in the Russian Federation. The module analyzes JSON files obtained by exporting UML diagrams from the draw.io online service and converts them into code in a selected programming language (Python, C++, Java) or into DDL scripts for a DBMS (PostgreSQL, Oracle, MySQL). The Python language and the Jinja2 template engine were used as the main development tools. The operation of the software module is demonstrated using the example of a small project, "Library Management System". During the study, a series of tests was conducted on automatic code generation based on the architectures of software information systems developed by students of the Software Engineering bachelor's degree program in the discipline "Design and Architecture of Software Systems". The test results showed that the code generated by the developed module fully complies with the original UML diagrams, including the structure of classes, the relationships between them, and the configuration of the database and infrastructure (Docker Compose). The practical significance of the study lies in the fact that the proposed concept of generating program code from visual UML models built in the popular online editor draw.io significantly simplifies the development of software information systems and can also be used for educational purposes.
Keywords: code generation, automation, python, jinja2, uml diagram, json, template engine, parsing, class diagram, database, deployment diagram
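The template-based generation step can be sketched as follows. This is a simplified stand-in for the Jinja2-based approach described in the article, using the standard-library `string.Template`; the class specification dictionary and template are illustrative assumptions, not the module's actual data model.

```python
from string import Template

# A parsed UML class, as might be extracted from a draw.io JSON export
# (illustrative structure, not the article's actual schema).
CLASS_TEMPLATE = Template(
    "class $name:\n"
    "    def __init__(self$params):\n"
    "$assigns"
)

def render_class(spec: dict) -> str:
    """Render one UML class specification into Python source code."""
    attrs = spec.get("attributes", [])
    params = "".join(f", {a}" for a in attrs)
    assigns = "".join(f"        self.{a} = {a}\n" for a in attrs) or "        pass\n"
    return CLASS_TEMPLATE.substitute(name=spec["name"], params=params, assigns=assigns)

spec = {"name": "Book", "attributes": ["title", "author"]}
code = render_class(spec)
```

Jinja2 adds loops, conditionals, and template inheritance over this basic substitution idea, which is what makes it practical for generating multi-language output from one parsed model.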
This article presents a structured approach to deploying and integrating Grafana, Loki, and Alloy in Kubernetes environments. The work was performed using a cluster managed via Kubespray. The architecture is focused on ensuring external availability, high fault tolerance, and universality of use.
Keywords: monitoring, orchestration, containerization, Grafana, Loki, Kubernetes, Alloy
Currently, one of the main factors shaping architecture is the functional purpose of a building, since it determines the essence of the architectural object. The purpose of this work is to study the influence of building functions on the historical architecture of Europe and their impact on the development of modern architecture. The article sets out to classify the functional purposes of buildings, to conduct a retrospective analysis of the development and formation of architectural styles in Europe, and, drawing on world design experience and the conducted research, to identify the influence of a building's function on its planning and volumetric-spatial solutions in the course of architectural development. The research method is an analysis of the historical architecture of Europe from the inception of architecture to the present day, carried out on the basis of world design experience across different eras. The study identified four main trends in the development of the functions of modern architecture: integration with nature, creation of adaptive spaces, multifunctionality, and the development of new functions. It is concluded that the building function played the most important role throughout the entire period of the formation of architecture, which led to today's enormous variety of building types and made a significant contribution to the development of architecture of the XXI century.
Keywords: architecture, historical architecture, architectural style, functional purpose, European architecture, building type, retrospective analysis, function, influence, development
Choosing a programmable logic controller is one of the most important tasks when designing an automated system. The modern market offers many options with differing characteristics, whose priorities vary between production settings. The paper proposes a method for evaluating the overall effectiveness of programmable logic controllers. The selected characteristics are evaluated using linear scaling, and weight coefficients are introduced to account for the importance of each parameter for the controller in question relative to the others. The values of the weight coefficients may vary depending on the requirements of the technological process.
Keywords: programmable logic controller, efficiency evaluation method, weight coefficient, radar (petal) chart
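The scaling-and-weighting scheme described above can be sketched as a weighted sum of linearly normalized characteristics. The characteristic names, ranges, and weights below are illustrative assumptions, not values from the paper.

```python
def linear_scale(value: float, vmin: float, vmax: float) -> float:
    """Linearly map a characteristic onto [0, 1]; higher is assumed better."""
    return (value - vmin) / (vmax - vmin) if vmax > vmin else 0.0

def overall_effectiveness(controller: dict, weights: dict, ranges: dict) -> float:
    """Weighted sum of linearly scaled characteristics (weights sum to 1)."""
    return sum(
        weights[k] * linear_scale(controller[k], *ranges[k])
        for k in weights
    )

# Hypothetical PLC characteristics (min, max over the compared market options).
ranges = {"io_count": (8, 256), "scan_speed": (0.0, 1.0), "memory_kb": (16, 2048)}
weights = {"io_count": 0.3, "scan_speed": 0.5, "memory_kb": 0.2}

plc_a = {"io_count": 128, "scan_speed": 0.8, "memory_kb": 512}
score = overall_effectiveness(plc_a, weights, ranges)  # a value in [0, 1]
```

Scores computed this way for several controllers can then be compared directly, or the per-characteristic scaled values plotted on a radar (petal) chart as the keywords suggest.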
This paper considers the problem of task scheduling in manufacturing systems with multiple machines operating in parallel. Four approaches to solving this problem are proposed: pure Monte Carlo Tree Search (MCTS); a hybrid MCDDQ agent combining reinforcement learning based on a Double Deep Q-Network (DDQN) with MCTS; an improved MCDDQ-SA agent integrating the Simulated Annealing (SA) algorithm to improve solution quality; and a greedy algorithm. A model of the environment is developed that takes into account machine speeds and task durations. A comparative study of the effectiveness of the methods is conducted using the makespan (maximum completion time) and idle time metrics. The results demonstrate that MCDDQ-SA provides the best balance between scheduling quality and computational efficiency due to adaptive exploration of the solution space. Analytical tools for evaluating the dynamics of the algorithms are presented, which underscores their applicability to real manufacturing systems. The paper offers new perspectives on the application of hybrid methods to resource management problems.
Keywords: machine learning, Q-learning, deep neural networks, MCTS, DDQN, simulated annealing, scheduling, greedy algorithm
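The greedy baseline and the makespan metric mentioned above can be sketched for uniform machines (each with its own speed) as follows. This is a generic LPT-style list-scheduling heuristic, an assumption about what "greedy" means here; the task durations and speeds are illustrative.

```python
def greedy_schedule(durations, speeds):
    """Greedy list scheduling on uniform machines: longest tasks first,
    each assigned to the machine that would finish it earliest.
    Returns (makespan, assignment) with assignment[i] = machine of task i."""
    finish = [0.0] * len(speeds)          # current finish time per machine
    assignment = [None] * len(durations)
    order = sorted(range(len(durations)), key=lambda i: -durations[i])
    for i in order:
        # Processing time of task i on machine j is durations[i] / speeds[j].
        m = min(range(len(speeds)),
                key=lambda j: finish[j] + durations[i] / speeds[j])
        finish[m] += durations[i] / speeds[m]
        assignment[i] = m
    return max(finish), assignment

durations = [4, 3, 2, 6, 1]
speeds = [1.0, 2.0]  # the second machine runs twice as fast
makespan, assignment = greedy_schedule(durations, speeds)
```

The hybrid MCTS/DDQN/SA agents in the paper search over such assignments instead of committing greedily; the makespan of the greedy plan serves as the comparison baseline.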
This article studies the possibilities of machine learning for forecasting demand for goods. The study analyzes various models and their applicability to the task of predicting future sales. The greatest attention is paid to modern methods of time series analysis, in particular neural-network and statistical approaches. The results clearly demonstrate the advantages and disadvantages of the different models and the degree to which their parameters influence forecast accuracy within the demand forecasting task. The practical significance of the findings lies in the possibility of applying the results to the analysis of similar data sets. The relevance of the study stems from the need for accurate demand forecasting to optimize inventory and reduce costs. Modern machine learning methods make it possible to increase prediction accuracy, which is especially important in an unstable market with changing consumer demand.
Keywords: machine learning algorithms, demand estimation, forecasting accuracy, time series analysis, sales volume prediction, Python, autoregressive integrated moving average, random forest, gradient boosting, neural networks, long short-term memory
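The model-comparison methodology described above can be sketched at its simplest with two baseline forecasters evaluated walk-forward by mean absolute error (MAE). The toy sales series is illustrative, not the article's data; real comparisons would swap in ARIMA, gradient boosting, or LSTM models for the forecasters.

```python
def naive_forecast(history):
    """One-step-ahead forecast: repeat the last observation."""
    return history[-1]

def moving_average_forecast(history, window=3):
    """One-step-ahead forecast: mean of the last `window` observations."""
    return sum(history[-window:]) / min(window, len(history))

def mae(series, forecaster, start=3):
    """Walk-forward evaluation: forecast each point from its own history only."""
    errors = [abs(series[t] - forecaster(series[:t])) for t in range(start, len(series))]
    return sum(errors) / len(errors)

sales = [10, 12, 11, 13, 12, 14, 13, 15, 14, 16]  # toy demand series
naive_err = mae(sales, naive_forecast)
ma_err = mae(sales, lambda h: moving_average_forecast(h, 3))
# On this noisy-but-trending series, the moving average smooths out the noise.
```

The walk-forward split matters: evaluating each model only on observations after its training window is what makes the accuracy comparison between methods fair.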
This paper explores the content-based filtering approach in modern recommender systems, focusing on its key principles, implementation methods, and evaluation metrics. The study highlights the advantages of content-based systems in scenarios that require deep object analysis and user preference modeling, especially when there is a lack of data for collaborative filtering.
Keywords: content-based filtering, recommender systems, feature extraction, similarity metrics, personalization
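The core of content-based filtering described above reduces to comparing item feature vectors with a user preference profile via a similarity metric, most commonly cosine similarity. The items, feature dimensions, and profile below are hypothetical.

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(user_profile, items, k=2):
    """Rank items by similarity of their feature vectors to the user profile."""
    scored = sorted(items.items(), key=lambda kv: -cosine(user_profile, kv[1]))
    return [name for name, _ in scored[:k]]

# Hypothetical genre features: [action, drama, sci-fi]
items = {
    "film_a": [1.0, 0.0, 1.0],
    "film_b": [0.0, 1.0, 0.0],
    "film_c": [0.8, 0.1, 0.9],
}
user_profile = [0.9, 0.1, 0.8]  # e.g. the mean vector of items the user liked
top = recommend(user_profile, items, k=2)
```

Because recommendations depend only on item features and the individual user's history, this approach works without other users' ratings, which is exactly the cold-data scenario for collaborative filtering noted in the abstract.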
The article presents the results of comparing numerical models of wooden structures with laboratory and full-scale tests. Numerical models of the material were created in the Ansys Workbench software package from volumetric finite elements with varying sets of physico-mechanical parameters simulating the behavior of real wood. The simulation parameters were based on laboratory test results for a solid wood beam, and the simulation results were compared with full-scale test results for a composite wood slab. The structures were modeled using linear, bilinear, and multilinear material models.
Keywords: solid wood beam, composite wood slab, bilinear finite element model, multilinear finite element model, stress-strain state
The article is devoted to the study of the influence of the choice of calculation scheme on the accuracy of the engineering assessment of the behavior of monolithic reinforced concrete frame structures. Various types of models are considered (rod, plate, and volumetric), both in linear formulations and with allowance for physical nonlinearity. It is emphasized that the correctness of the adopted calculation scheme determines how adequately the spatial interaction of elements is accounted for, how reliably forces and stresses are assessed, and whether design solutions can be optimized, especially under seismic and wind loads. As part of the study, a single-span reinforced concrete frame was modeled with a load varying from 5 to 55 kN, and the calculated results were compared with experimental data. It is shown that models that take physical nonlinearity into account and use more detailed modeling (for example, volumetric finite elements) predict deflections and stresses in the structure most accurately. The results confirm the need for a careful approach to the choice of calculation scheme, especially in the design of high-rise buildings and of structures in seismically hazardous areas. Recommendations are given on the rational use of models of different levels of detail in engineering practice.
Keywords: linear calculation, nonlinear calculation, frames, reinforced concrete, deflections, modeling
Introduction: Mobile Gaming Addiction (MGA) has emerged as a significant public health concern, with the World Health Organization recognizing it as a gaming disorder. Russia, with its growing mobile gaming market, is no exception. Aims and Objectives: This study aims to explore the feasibility of using neural networks for early MGA detection and intervention, with a focus on the Russian context. The primary objective is to develop and evaluate a neural network-based model for identifying behavioral patterns associated with MGA. Methods: A proof of concept study was conducted, employing a simplified neural network architecture and a dataset of 101 observations. The model's performance was evaluated using standard metrics, including accuracy, precision, recall, F1-score, and AUC-ROC score. Results: The study demonstrated the potential of neural networks in detecting MGA, achieving an F1-score of 0.75. However, the relatively low AUC-ROC score (0.58) highlights the need for addressing dataset limitations. Conclusion: This study contributes to the growing body of literature on MGA, emphasizing the importance of considering regional nuances and addressing dataset limitations. The findings suggest promising avenues for future research, including dataset expansion, advanced neural architectures, and region-specific mobile applications.
Keywords: neural networks, neural network architectures, autoencoder, digital addiction, gaming addiction, digital technologies, machine learning, artificial intelligence, mobile game addiction, gaming disorder
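The evaluation metrics reported above (precision, recall, F1-score) are derived from confusion-matrix counts; the sketch below shows the computation with illustrative counts, not the study's actual data.

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 = harmonic mean of precision and recall from confusion counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return (2 * precision * recall / (precision + recall)) if precision + recall else 0.0

# Illustrative counts: true positives, false positives, false negatives.
tp, fp, fn = 30, 10, 10
f1 = f1_score(tp, fp, fn)  # 0.75 with these counts
```

Note that F1 ignores true negatives, while AUC-ROC measures ranking quality across all thresholds, which is why the study can report a reasonable F1 (0.75) alongside a weak AUC-ROC (0.58) on a small, imbalanced dataset.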
The article focuses on the application of machine learning methods for predicting failures in industrial equipment. A review of modern approaches such as Random Forest, SVM, and XGBoost is presented, with emphasis on their accuracy, robustness, and suitability for engineering tasks. Based on the analysis of real-world data (temperature, pressure, vibration, humidity), models were trained and compared, with XGBoost demonstrating the best performance. Key parameters influencing failures were identified, and a recommendation system was proposed, combining statistical analysis and predictive modeling. The developed solution enables timely detection of failure risks and optimization of maintenance processes.
Keywords: machine learning, predictive modeling, equipment management, failure prediction, data analysis
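Before training models such as those compared above, a simple deviation-based risk score over the same sensor channels is a useful baseline and sanity check. The baselines, tolerances, and readings below are illustrative assumptions, not the article's data or its trained models.

```python
def risk_score(reading: dict, baseline: dict, tolerance: dict) -> float:
    """Mean absolute deviation of sensor readings from their baselines,
    each scaled by an allowed tolerance; values above 1.0 signal elevated risk."""
    devs = [abs(reading[k] - baseline[k]) / tolerance[k] for k in baseline]
    return sum(devs) / len(devs)

# Hypothetical normal operating points and tolerances per sensor channel.
baseline = {"temperature": 70.0, "pressure": 3.0, "vibration": 0.2, "humidity": 40.0}
tolerance = {"temperature": 10.0, "pressure": 0.5, "vibration": 0.1, "humidity": 15.0}

normal = {"temperature": 72.0, "pressure": 3.1, "vibration": 0.22, "humidity": 42.0}
faulty = {"temperature": 95.0, "pressure": 4.2, "vibration": 0.55, "humidity": 60.0}

score_ok = risk_score(normal, baseline, tolerance)
score_bad = risk_score(faulty, baseline, tolerance)
```

Gradient-boosted models like XGBoost learn nonlinear interactions between these channels that a fixed-tolerance score cannot capture, which is the gap the article's comparison quantifies.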