UAS design using MATLAB and R
Introduction
The programming languages used in Unmanned Aerial Systems (UAS) play a cardinal role in shaping their functionality and performance. Operators use these languages for various purposes, such as data gathering, analysis, mapping, database retrieval, and project presentation. According to Ismail et al. (2018), UAS exhibit advanced behaviour with only a limited level of human intervention; the programming languages used thus act as the main medium shaping how these innovative systems respond and react to diverse situations (Ismail et al., 2018).
Today there exists a broad range of programming languages, such as MATLAB, R, Java, and Python, that can be integrated into drones, but UAS operators must make sure that a suitable programming language is chosen, one that simplifies how the system performs. The application of two popular languages, MATLAB and R, is critically explored below to understand how they contribute to UAS design.
Use of programming languages in UAS design
Unmanned Aerial Systems are autonomous vehicles that can be remotely controlled by an operator. As there is no pilot on board, the operator relies on communication and engagement with the system, which influences its operational and performance aspects. The programming language used acts as the medium connecting the operator with the UAS. According to Mottola et al. (2014), autonomous drones are extremely powerful devices whose capabilities are influenced by the language used: the language acts as the chief tool enabling the autonomous system to reason about the external space and proceed along the intended path in a seamless manner. A suitable programming language is key in drones because it allows programmers to deal with concurrency issues that arise between drone operations in the field and data processing. It also influences the sensing application, enabling the operator to have a better degree of control over how the vehicle functions with limited human intervention (Mottola et al., 2014). The role of programming languages in Unmanned Aerial Systems is therefore considered indispensable, because the language enables the system to perform level and stable flight, the most fundamental and important function of a drone.
Use of MATLAB in the design of Unmanned Aerial Systems
MATLAB, short for "matrix laboratory", is a multi-paradigm programming language and a numeric computing environment. It is considered very effective compared with other programming languages because, while other languages typically work on numbers one at a time, MATLAB operates on whole matrices and arrays (Language Fundamentals - MATLAB & Simulink, MathWorks India, 2020).
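As an illustrative sketch of this whole-matrix style, the snippet below is written in R, the other language examined in this paper, which shares the same vectorized idiom; it is not MATLAB syntax, and the waypoint data are invented:

```r
# Illustrative sketch in R (not MATLAB): matrix-oriented languages apply one
# operation to a whole array at once instead of looping over elements.
theta <- pi / 6
R_rot <- matrix(c(cos(theta), sin(theta),
                  -sin(theta), cos(theta)), nrow = 2)  # 2-D rotation matrix
waypoints <- rbind(x = c(0, 10, 20, 30),
                   y = c(0,  5,  5,  0))               # a small flight path
rotated <- R_rot %*% waypoints                         # all points at once
round(rotated, 2)
```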
A number of studies carried out in recent times shed light on how the programming language influences the functional and technical aspects of an Unmanned Aerial System. According to Santamaría, Alarcón, and their co-authors, the model-based design of a UAS acts as the very backbone of its development phase, and the programming language implemented in the design has a cardinal impact on the functioning of the core design elements. The authors used MATLAB/Simulink code-generation tools in their study. The programming language used in drones also affects the degree of control over the robotic components of the system, such as the arms (Santamaría et al., 2012).
According to Bagheri, MATLAB acts as an extremely powerful tool for engineers working on UAS because it enables them to simulate and analyse a diverse range of mathematical models and control systems. For instance, the MATLAB Aerospace Toolbox/Blockset helps in the simulation of aircraft control and dynamics. The toolbox fundamentally helps to visualize and assess the aerospace vehicle and thus assists the smooth functioning of UAS with limited human interaction (Bagheri, 2014, p. 58).
The functional aspects of an unmanned aerial system can be categorized as aircraft or platform; sense; perceive; plan and decide; control; and connect or communicate. As humans have restricted involvement in these activities, responsibility for each function lies with the system and its technical components; an operator merely provides direction to the autonomous machine. In such a scenario, the role of MATLAB is pivotal, as it shapes the management of sensor data. It assists in the design and development of autonomous algorithms and supports interfaces with sensors and other applications. The language supports persistent memory within the autonomous object, which shapes behaviour in each phase of the system, including initialization, validation, and reset.
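MATLAB expresses this kind of persistent memory through persistent variables inside functions. Purely as an illustrative sketch, the same idea can be mimicked in R with a closure that keeps state between calls; all names and sensor readings below are hypothetical:

```r
# A closure keeps state ("persistent memory") across calls: the first call
# initializes the estimate, later calls update it with a smoothing filter.
make_altitude_filter <- function(alpha = 0.2) {
  estimate <- NA_real_   # persistent state, survives between calls
  function(raw_reading) {
    if (is.na(estimate)) {
      estimate <<- raw_reading                                   # initialization
    } else {
      estimate <<- alpha * raw_reading + (1 - alpha) * estimate  # update
    }
    estimate
  }
}

filter_alt <- make_altitude_filter()
readings <- c(100, 102, 98, 101, 99)          # hypothetical sensor readings
smoothed <- vapply(readings, filter_alt, numeric(1))
print(smoothed)
```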
Role of Simulink in UAS
Simulink is a well-known MATLAB-based graphical programming environment used for modelling, simulating, and examining multi-domain dynamical systems. In the Unmanned Aerial System context, this block-diagram environment plays an important role because it contributes significantly to the design process. It primarily facilitates system-level design and supports testing of the device before it is actually built as a drone. Simulink is considered very useful by operators and engineers working on UAS design because it assists with automatic code generation, supports continuous testing, and facilitates verification of the embedded systems in the UAS. According to Casado and Bermúdez (2021), MATLAB-based Simulink supports drones by deploying flight-control algorithms. It is considered very effective because it integrates the "model-based design" method, regarded as a necessity in the design of cyber-physical equipment (Casado & Bermúdez, 2021). Simulink also influences the navigation system and the vision subsystem, which is responsible for processing the images captured by the camera, and it supports the flow of information that shapes the overall operations of the UAS (Casado & Bermúdez, 2021, p. 3). A sketch of the kind of control law such block diagrams model follows below.
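The sketch below is not Simulink itself; it is a minimal, hypothetical example in R of the sort of flight-control loop a Simulink diagram would model: a proportional-derivative altitude controller integrated with a simple Euler step. The gains and physical simplifications are invented for illustration.

```r
# PD altitude controller on a unit-mass vehicle, integrated with Euler steps.
simulate_altitude <- function(target = 10, dt = 0.05, steps = 400,
                              kp = 2.0, kd = 1.5) {
  z <- 0; v <- 0                      # altitude (m) and vertical speed (m/s)
  trace <- numeric(steps)
  for (k in seq_len(steps)) {
    thrust <- kp * (target - z) - kd * v + 9.81  # PD law + gravity feedforward
    a <- thrust - 9.81                # net acceleration (unit mass)
    v <- v + a * dt
    z <- z + v * dt
    trace[k] <- z
  }
  trace
}

z <- simulate_altitude()
tail(z, 3)   # should settle near the 10 m target
```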
Use of the R programming language in the design of Unmanned Aerial Systems
R is a popular programming language used for a diverse range of functions, such as statistical analysis, reporting, and graphical representation. The environment is considered suitable for many domains, including the field of Unmanned Aerial Systems (What is R?, 2020).
In the aerospace context, R is considered one of the best programming languages, influencing elements such as the ability to interface with the hardware and the ease of conducting tests. One of the main attributes of the language that contributes significantly to drone design is that all actions in R are performed on objects stored in the system's active memory, so no temporary files are used. Data collected and interpreted with the language can easily be accessed by an engineer from a remote server via the internet to work on the technicalities of the UAS (Paradis, 2005). The high-quality graphical output generated with the language simplifies the mapping process. In addition, the language enables the operator to carry out comprehensive spatial analysis using a broad range of statistical tests. Key functions that assist in the design of Unmanned Aerial Systems include data manipulation and linear and non-linear statistical modelling. The environment allows a UAS operator to create a function and pass control to it along with suitable arguments; once the intended actions have been performed, the function returns control to the operator, which assists in designing and streamlining the automatic components of a drone, as sketched below.
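A minimal sketch of that pattern, assuming hypothetical telemetry fields: the operator defines a function, passes control to it with suitable arguments, and receives control (and a result) back.

```r
# Summarise a small block of flight telemetry; field names are hypothetical.
summarise_telemetry <- function(telemetry, max_speed = 20) {
  list(
    mean_altitude = mean(telemetry$altitude),
    speed_range   = range(telemetry$speed),
    overspeed     = sum(telemetry$speed > max_speed)  # simple validity check
  )
}

telemetry <- data.frame(
  altitude = c(50.2, 51.0, 49.8, 50.5),
  speed    = c(12.1, 18.4, 21.3, 15.0)
)
summarise_telemetry(telemetry, max_speed = 20)
```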
Basic needs of a UAS application
An Unmanned Aerial System needs a number of elements without which it cannot function. Giordan, Adams, and their co-authors have identified several cardinal components of UAS. Various aspects must be taken into account so that the basic requirements of a UAS application are covered; key considerations relate to ease of use, cost, transportability, and the ability to function in diverse areas (Giordan et al., 2020). As a UAS is an aerial system that functions using technology, it is very important to lay emphasis on hardware as well as software requirements. The core technical components required for a UAS to function include the airframe, the navigation system, and the control systems. The programming language used in a UAS acts as the binding component that ensures each element of the drone operates and functions in a streamlined manner (Giordan et al., 2020). Security aspects must also be considered when working on a UAS application. According to Siddappaji and Akhilesh (2020), a solid cybersecurity framework can safeguard a UAS application, its technologies, and its processes from unauthorized access by online criminals and attackers (Siddappaji & Akhilesh, 2020).
Conclusion
Unmanned Aerial Systems rely on various technical elements and components to function effectively and successfully from a distance. The programming language used by operators and engineers influences various design aspects of these automated systems. MATLAB and R are two of the most effective programming languages influencing activities and functions in UAS or drone systems, such as data collection, data analysis, mapping, database retrieval, and project presentation. In fact, the language selected for a UAS application acts as the ultimate medium between the UAS operator and the autonomous machine: it helps establish a connection between them so that the autonomous vehicle can be handled remotely.
References
Bagheri, S. (2014). Modeling, simulation and control system design for civil unmanned aerial vehicle (UAV).
Casado, R., & Bermúdez, A. (2021). A Simulation Framework for Developing Autonomous Drone Navigation Systems. Electronics, 10(1), 7.
Giordan, D., Adams, M. S., Aicardi, I., Alicandro, M., Allasia, P., Baldo, M., ... & Troilo, F. (2020). The use of unmanned aerial vehicles (UAVs) for engineering geology applications. Bulletin of Engineering Geology and the Environment, 79(7), 3437-3481.
Language Fundamentals - MATLAB & Simulink. MathWorks India. (2020). https://in.mathworks.com/help/matlab/language-fundamentals.html.
Paradis, E. (2005). R for Beginners (pp. 37-71). Institut des Sciences de l'Evolution. Université Montpellier II.
Santamaría, D., Alarcón, F., Jiménez, A., Viguria, A., Béjar, M., & Ollero, A. (2012). Model-based design, development and validation for UAS critical software. Journal of Intelligent & Robotic Systems, 65(1), 103-114.
Siddappaji, B., & Akhilesh, K. B. (2020). Role of cyber security in drone technology. In Smart Technologies (pp. 169-178). Springer, Singapore.
What is R? R. (2020). https://www.r-project.org/about.html.
Significance of statistical programming languages
Typically, programming languages are widely used by software developers to build applications and software systems, whereas statistical programming languages such as R, SQL, SAS, Python, and MATLAB are widely used by data scientists and data analysts to develop algorithms to process data. Statistical programming languages and visual programming platforms such as RapidMiner play a pivotal role in enabling data scientists and analysts to process data, and they allow the processing of all forms of data: structured, unstructured, and semi-structured (Chen et al., 2005). Robust analytical skill demands that a data scientist have extensive knowledge of one or more statistical programming languages.
One of the primary advantages of statistical programming languages is that they enable data scientists to develop algorithms to process data. Most statistical programming languages are coded and require data scientists and analysts to know the syntax of the language. Others, such as RapidMiner, are visual platforms that allow analysts to develop workflows using drag-and-drop techniques.
Statistical programming languages also provide the means and techniques to process data; for instance, they provide algorithms for classification, clustering, regression analysis, and association rule mining. Advances in technology and the adoption of artificial intelligence and machine learning in data mining have led to the automation of analytical algorithms. As a result, analytical tasks such as predictive modelling can be executed by automated systems, reducing the time and energy data scientists must dedicate to them (Chen et al., 2005). Such platforms also help with modelling and visualizing the results generated, which means data scientists can easily and effectively present and describe the results to the target audience in graphs and charts. In the long run, statistical programming languages help data scientists turn massive amounts of data into meaningful, actionable information.
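As a brief sketch of two of the algorithm families named above, clustering and regression, the following uses only base R and its built-in iris data set:

```r
data(iris)

# Clustering: group the flower measurements into three clusters.
clusters <- kmeans(iris[, 1:4], centers = 3)
table(clusters$cluster, iris$Species)   # compare clusters with known species

# Regression: model petal length as a linear function of petal width.
fit <- lm(Petal.Length ~ Petal.Width, data = iris)
summary(fit)$coefficients
```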
The digitization of the current business environment has enabled people to collaborate in the execution of tasks. Similarly, data scientists and analysts from across the globe can collaborate on analytical tasks because statistical programming languages provide a standardized platform for processing data. Data scientists can also use the same or different languages to verify the accuracy of analytical results (Raychev et al., 2014). These capabilities have significantly improved the effectiveness and suitability of data analytics in the business world. Overall, both coded and visual statistical programming languages play a fundamental role in data analytics by providing the means to process and convert data into knowledge and useful information.
Advantages of R
R is a statistical programming language that is typically developed and edited using RStudio. It is one of the most widely applied statistical programming languages, and its extensive popularity can be attributed to its numerous advantages. One notable benefit of R is that it is open-source software: data scientists can use the platform freely and can review its source code to understand its functionality as well as determine ways to improve it. R is also a free platform, eliminating the need to pay a license or subscription fee when working with it or with any integral part of it (Matloff, 2011). Another significant advantage of R is that it is highly compatible with other technologies.
R is platform-independent, meaning it does not rely on particular underlying hardware and software resources when being installed (Matloff, 2011). Additionally, R can be used directly from the internet without installing it on a computer. R is also highly compatible with database management systems and other data-processing technologies. Its open-source nature enables developers and data scientists to point out errors and areas where the platform could be further improved; as a result, R is under constant improvement, with each version addressing the challenges that limited the functionality of previous versions. R is still a maturing technology, and continuous development promises to deliver a platform that embraces all emerging data-mining technologies.
Disadvantages of R
R is a coded programming language that requires data scientists and analysts to understand its syntax. R's syntax is complicated, making it difficult to learn, especially for inexperienced developers; as a result, many data scientists and analysts prefer simpler languages such as Python, whose syntax is closer to natural language. Another significant disadvantage is that R is relatively slow compared with other statistical programming languages (Matloff, 2011), so it is not suitable for analytical tasks that must process data and generate results in real time. R also requires more memory than other analytical tools, since it stores data in in-memory objects. Nor does R provide the enhanced security features that determine whether a statistical programming platform is suitable for use in web applications. Future advancements of R are anticipated to address these challenges.
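The speed point can be illustrated in miniature, as a rough sketch only (exact timings vary by machine): an interpreted R loop is far slower than the equivalent vectorized call, which is why idiomatic R avoids explicit loops.

```r
n <- 1e7
system.time({ s <- 0; for (i in 1:n) s <- s + i })   # explicit interpreted loop
system.time(sum(as.numeric(1:n)))                    # vectorized built-in
```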
References
Chen, Y., Dios, R., Mili, A., Wu, L., & Wang, K. (2005). An empirical study of programming language trends. IEEE software, 22(3), 72-79.
Matloff, N. (2011). The art of R programming: A tour of statistical software design. No Starch Press.
Raychev, V., Vechev, M., & Yahav, E. (2014, June). Code completion with statistical language models. In Proceedings of the 35th ACM SIGPLAN Conference on Programming Language Design and Implementation (pp. 419-428).
Quantum Image Processing
INTRODUCTION
Image processing has become a popular and critical technology and field of study in our everyday lives. The need to extract important data from visual information has arisen in many fields, such as biomedicine, the military, economics, industry, and entertainment [1]. Analysis and processing of images require representing our 3D world in 2D spaces and using complex algorithms to highlight and examine essential features [2]. With the rapid growth in the volume of visual information, these operations require ever more computing power. According to Moore's law, the computing performance of classical computers doubles every 18 months; however, experts claim that this law will not hold true for much longer [1]. Classical computers will therefore not be able to solve image processing problems on big data sets within reasonable time limits.
The failure of Moore's law can be addressed with quantum computation. Researchers have shown the existence of more efficient quantum algorithms and their ability to perform some calculations faster than classical computers [1]. Quantum computing can also dramatically improve areas of image processing [2]. Applying quantum computation to image processing tasks is referred to as Quantum Image Processing (QIMP). This paper reviews the basics of quantum computation and quantum image processing, surveys its uses with a focus on security technologies, and discusses the challenges and future of QIMP.
QUANTUM COMPUTATION
A new method of computation known as quantum computing could completely change the field of computer science. In 1982, the Nobel Prize-winning physicist Richard Feynman began exploring the possibility of using quantum systems for computing [1]. He was interested in modeling quantum systems on computers, and he realized that the number of particles has an exponential effect on the amount of classical memory needed to represent a quantum system: simulating 20 quantum particles requires storing about a million values, while simulating 40 particles requires about a trillion. Interesting simulations with 100 or 1,000 particles are impossible even with all the computers on Earth [2]. The concept of using quantum mechanical effects to perform calculations was thus developed when Feynman proposed building computers that use quantum particles as a computational resource, able to model general quantum systems at scale. The exponential storage capacity, together with striking phenomena such as quantum entanglement, has led researchers to take a closer look at the processing capability of quantum systems [4]. Over the past 20 years, quantum computing has exploded, with proofs that it can solve some problems exponentially faster than any known classical method [3]. If quantum computers can be built at sufficient scale, the best-known quantum algorithm, Peter Shor's integer factoring algorithm, will make it easy to break the most common encryption methods currently in use [1].
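The storage argument can be stated compactly: representing a system of $n$ quantum particles (qubits) on a classical machine requires on the order of $2^n$ values, so

$$2^{20} \approx 10^{6}, \qquad 2^{40} \approx 10^{12}, \qquad 2^{100} \approx 10^{30},$$

which is why even 100-particle simulations are out of reach classically.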
All modern mainstream computers fall under the category of classical computers, which operate on a "von Neumann architecture" based on an abstraction of discrete chunks of information [1]. Since a computer must ultimately be a physical device, scientists have recently moved away from this abstraction of computation and realized that the laws governing computation should be derived from physical law. Quantum mechanics, one of the most fundamental physical theories, was a good candidate for investigating the physical feasibility of computational operations [5]. The important finding of this line of study is that quantum mechanics permits machines substantially more powerful than the von Neumann abstraction.
Along with Shor's factoring algorithm, Lov Grover's search algorithm is a striking quantum technique that significantly reduces the work required to look for a particular item. For instance, searching through a million unsorted names for a given name takes an average of 500,000 comparisons on a classical computer, and the von Neumann model of computing offers no faster method [1]. Using Grover's approach, which exploits quantum mechanics' parallelism, the name may be found with only about 1,000 comparisons under the quantum model. Grover's approach outperforms the conventional one by an even larger margin for longer lists.
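Stated as query counts, and consistent with the figures above, unstructured search over $N$ items costs

$$T_{\text{classical}} = O(N) \approx \frac{N}{2} \text{ comparisons on average}, \qquad T_{\text{Grover}} = O(\sqrt{N}),$$

so for $N = 10^{6}$ the classical expectation is $5 \times 10^{5}$ comparisons against roughly $\sqrt{10^{6}} = 10^{3}$ for Grover's algorithm.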
The subject of quantum computing is huge and diverse today. Researchers are working on a variety of topics, from the creation of physical devices employing technologies such as trapped ions and quantum dots to challenging algorithmic problems and attempts to pinpoint the precise limits of quantum processing [5]. It has been established that quantum computers are inherently more powerful than classical ones, although it is still unclear how much more powerful they are; how to construct a large quantum computer remains a technological challenge [3].
So, quantum computation is still in its infancy. If the technical challenges are overcome, perhaps quantum computation will one day supersede all current computation techniques with a superior form of computation, just as decades of work have refined the classical computer from the bulky, slow vacuum-tube dinosaurs of the 1940s to the sleek, minimalist, fast transistorized computers that are now widely used. All of this is based on the peculiar laws and procedures of quantum physics, which are themselves anchored in the peculiarities of Nature. What computers will be derived from more complex physical theories like quantum field theory or superstring theory remains to be seen.
BACKGROUND
The field of quantum image processing aims to adapt traditional image processing techniques to the quantum computing environment. Its main focus is on using quantum computing technologies to record, modify, and recover quantum images in various formats and for various goals. It is believed that QIMP technologies will offer capabilities and performance as yet unmatched by their traditional equivalents because of some of the astonishing aspects of quantum processing, including entanglement and parallelism. These enhancements could take the form of increased computing speed, guaranteed security, reduced storage needs, and so on [3].
The first published work connecting quantum mechanics to image processing was Vlasov's work from 1997, which concentrated on using a quantum system to distinguish orthogonal images. Efforts were then made to look for certain patterns in binary images and to identify a target's posture using quantum algorithms. The 2003 publication of Venegas-Andraca and Bose's Qubit Lattice description for quantum images greatly contributed to the research that gave rise to what is now known as QIMP. The Real Ket, which Latorre developed as a follow-up representation, was designed to encode quantum images as a foundation for more QIMP applications [1][3].
The proposal of the Flexible Representation of Quantum Images (FRQI) by Le et al. genuinely sparked research in the context of current descriptions of QIMP. This might be explained by the flexible way it integrates a quantum image into a normalized state, which eases auxiliary transformations on the image's contents. Since the FRQI, a wide range of computational frameworks focusing on the spatial or chromatic content of images have been presented, along with numerous alternative quantum image representations (QIRs).
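For reference, the FRQI of a $2^n \times 2^n$ image, as commonly written in the literature, stores each pixel's colour in an angle $\theta_i$ and its position in a computational basis state $|i\rangle$:

$$|I(\theta)\rangle = \frac{1}{2^{n}} \sum_{i=0}^{2^{2n}-1} \left( \cos\theta_i\,|0\rangle + \sin\theta_i\,|1\rangle \right) \otimes |i\rangle, \qquad \theta_i \in \left[0, \tfrac{\pi}{2}\right],$$

which is the normalized state referred to above.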
The representative QIRs that can be traced back to the FRQI include the multi-channel representation of quantum images (MCQI) and the novel enhanced quantum representation (NEQR). The development of algorithms to alter the position and colour information encoded with the FRQI and its several variations has also received a lot of attention in QIMP [5]. For instance, FRQI-based fast geometric transformations were initially proposed, including swapping, flipping, rotations, and restricted geometric transformations that limit these operations to a specific region of an image [3]. Recent discussions have focused on quantum image scaling and NEQR-based quantum image translation, which map each picture element's position in an input image to a new position in an output image. Single-qubit gates such as the X, Z, and H gates were initially used to propose FRQI-based broad forms of colour transformations; later, the MCQI-based channel-of-interest operator, which shifts the grayscale value of a preselected colour channel, and the channel-swapping operator, which exchanges the grayscale values of two channels, were further studied [3].
Researchers often prefer to mimic digital image processing tasks based on the QIRs we already have in order to demonstrate the viability and competence of QIMP methods and applications. Researchers have so far contributed to quantum image feature extraction, segmentation, morphology, and comparison using the fundamental quantum gates and the operations mentioned above [5]. QIMP-based security technologies in particular have drawn a lot of interest from researchers.
SECURITY TECHNOLOGIES
The necessity for secure communication has developed along with mankind's need to transfer information, and with the development of digital technology the demand for secure communication has increased. QIMP extends digital image processing into the quantum computing domain in order to realize secure, effective, and cutting-edge technologies for cryptography and information hiding [3]. Indeed, quantum computation and QIMP offer the potential for secure communication in fields like encryption, steganography, and watermarking.
Encryption, as a direct application of the science of cryptography, is the practice of hiding information to render it unintelligible to those lacking specialized knowledge. This is frequently done for confidential communications in order to maintain confidentiality. Information hiding focuses on hiding the existence of messages, whereas cryptography is concerned with safeguarding their content. Since attackers cannot easily detect information hidden using techniques like steganography and watermarking, hiding appears safer [3]. A key limitation, though, is the strict bound on the quantity of information that can be concealed in the cover image without perceptible changes. Even though steganography and watermarking are similar, they differ in goals, applications, and requirements [3]:
In watermarking, the carrier image is the visible content, while the copyright or ownership information is concealed and subject to authentication. Steganography, in contrast, aims to transmit the secret message safely by disguising it as an insignificant component of the carrier image without raising any red flags with outside adversaries.
In watermarking, information is concealed in the form of a stochastic serial number or an image, such as a logo; watermarked images therefore typically carry only a small amount of copyright information. Steganography frequently needs a large carrying capacity in the carrier image because its goal is to conceal the presence of the hidden message.
In watermarking, the content can be subject to many sorts of attack, such as cropping, filtering, and channel noise, whereas steganographic images do not face such issues.
FUTURE DIRECTIONS AND CONCLUSIONS
Beyond the continuing work toward the physical implementation of quantum computer hardware, research is concentrating on what can be accomplished with quantum technologies once greater realization has been achieved [3]. One such area is the nexus of quantum computation and image processing, known as quantum image processing. Because it is a relatively new field, researchers confront both enormous potential and problems in creating more effective and usable services.
All experimental QIMP protocol implementations so far have been limited to traditional PCs and MATLAB simulations built on linear algebra, using complex vectors as quantum states and unitary matrices as unitary transforms [5]. These provide a fairly constrained realization of the potential of quantum computation. Therefore, as researchers intensify their efforts to advance and expand QIMP technology, it is crucial to understand the role of the quantum computing software needed to implement the various algorithms, so that the software can complement the hardware [3].
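As a concrete miniature of that simulation style, the sketch below (in R; any linear-algebra environment would serve equally well) applies a Hadamard gate, a unitary matrix, to the state $|0\rangle$, a complex vector:

```r
# Quantum state as a complex vector, gate as a unitary matrix.
H <- matrix(c(1, 1, 1, -1), nrow = 2) / sqrt(2)  # Hadamard gate (unitary)
psi0 <- c(1 + 0i, 0 + 0i)                        # state |0>
psi1 <- H %*% psi0                               # apply the gate
Mod(psi1)^2                                      # measurement probabilities: 0.5, 0.5
```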
REFERENCES
[1] Beach, G., Lomont, C., & Cohen, C. (2003, October). Quantum image processing (QuIP). In 32nd Applied Imagery Pattern Recognition Workshop, 2003. Proceedings (pp. 39-44). IEEE.
[2] Anand, A., Lyu, M., Baweja, P. S., & Patil, V. (2022). Quantum image processing. arXiv preprint arXiv:2203.01831.
[3] Yan, F., Iliyasu, A. M., & Le, P. Q. (2017). Quantum image processing: A review of advances in its security technologies. International Journal of Quantum Information, 15(03), 1730001.
[4] Cai, Y., Lu, X., & Jiang, N. (2018). A survey on quantum image processing. Chinese Journal of Electronics, 27(4), 718-727.
[5] Ruan, Y., Xue, X., & Shen, Y. (2021). Quantum image processing: Opportunities and challenges. Mathematical Problems in Engineering, 2021.
[6] Peli, T., & Malah, D. (1982). A study of edge detection algorithms. Computer Graphics and Image Processing, 20(1), 1-21.
Numerical Analysis
Computer Software and Modern Applications.
Introduction
In this article, we discuss how computer software has helped students and other users with numerical data analysis, together with practical issues such as the programming languages used. In addition, the interaction between numerical computation and symbolic computation is reviewed.
Models used in numerical analysis
Mathematical modeling and numerical analysis have become very important in modern life. Pragmatic numerical analysis software has been integrated into most software packages, such as spreadsheet programs, enabling people to perform mathematical modelling without prior knowledge of the processes involved. This demands numerical analysis software that analysts can rely on. The design of problem-solving environments (PSEs) enables us to model and solve many situations, and graphical user interfaces have made PSEs easy to use for modeling a given situation with good models of mathematical theories. Numerical analysis has lately found modern applications in computer software across many fields. For example, computer-aided design (CAD) and computer-aided manufacturing (CAM) in the engineering sector have led to improved PSEs being developed for both. Mathematical models in this field are based on Newton's basic laws of mechanics and mostly involve algebraic expressions and ordinary differential equations. Manipulating mixed systems of these models is very difficult but very important: modeling mechanical systems such as car simulators, plane simulators, and other moving machinery requires real-time solving of differential-algebraic systems.
Atmospheric modeling is important for understanding the effects of human activities on the atmosphere. A great number of variables, such as the velocity of the atmosphere at a given point and time, temperature, and pressure, need to be calculated. In addition, chemicals in the atmosphere, such as the pollutant carbon dioxide, and their reactions need to be studied. The velocity, pressure, and temperature fields are defined with partial differential equations, while the kinetics of the chemical reactions are defined with ordinary differential equations; these are very complex and need sophisticated software to handle. Businesses have also incorporated optimization methods into decision-making for efficient resource allocation. Locating manufacturing and storage facilities, inventory control problems, and proper scheduling are some of the problems that require numerical optimization (Brinkgreve, 1996); a small optimization sketch follows below.
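As a sketch of numerical optimization applied to an inventory control problem, the following uses base R's one-dimensional optimizer on the classic economic-order-quantity cost trade-off; all figures are hypothetical.

```r
demand  <- 12000   # units per year
order_k <- 150     # fixed cost per order
holding <- 2.4     # holding cost per unit per year

# Total annual cost: ordering cost + holding cost as a function of order size q.
total_cost <- function(q) demand / q * order_k + holding * q / 2

best <- optimize(total_cost, interval = c(1, demand))
best$minimum   # optimal order quantity (analytically sqrt(2*D*K/h) ~ 1225)
```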
Numerical software sources
Fortran has remained the most widely used programming language for numerical work, and it keeps being updated to meet required standards, with Fortran 95 as the latest version at the time of writing. Other useful languages include C, C++, and Java. Several numerical analysis software packages are used in data analysis with computer software, including the following (Conte & De Boor, 2017):
1. Analytica: a wide-ranging tool used for analyzing and generating numerical models. It is programmed visually and linked with influence diagrams.
2. FlexPro: a program used for analyzing and presenting measurement data. It has an excellent interface similar to Excel, with a built-in vector programming language.
3. GNU Octave: a high-level language used in numerical computation. It has a command-line interface used to solve linear and nonlinear problems numerically, with a language largely compatible with MATLAB. Several newer Linux programs, such as Cantor and KAlgebra, offer Octave a GUI front end.
4. Jacket: a MATLAB GPU tool that offloads MATLAB computations to the GPU for data visualization and acceleration.
5. pandas: a BSD-licensed Python tool providing data structures and data analysis.
6. Torch: provides support for tensor manipulation, statistical analysis, and presentation.
7. TK Solver: a commercial tool by Universal Technical Systems used for problem solving and mathematical modelling, based on a rule-based, declarative language.
8. fit: a statistical analysis and curve-fitting plug-in for Excel.
9. GNU MCSim: a package for numerical integration and simulation with fast Markov chain Monte Carlo and Monte Carlo capabilities.
10. Sysquake: an application based on a MATLAB-compatible language, providing a computing environment with interactive graphics for engineering, mathematics, and physics.
Software development tools
Efficient tools exist among the programming languages used to create computational solutions. A mathematical programming language should possess some basic qualities. First, it should have a syntax that enables accurate and fast translation from mathematical formulae into program statements. Second, the language should be rooted in primitives close to the basic concepts of mathematics. Lastly, the language should include tools for efficient and fast execution. Programming languages can be categorized into generations: first-generation languages, 1954-1958 (Fortran I, ALGOL 58, Flowmatic, and IPL V); second-generation languages, 1959-1961 (Fortran II, ALGOL 60, COBOL, and LISP); third-generation languages, 1962-1970 (PL/1 (Fortran + COBOL + ALGOL), ALGOL 68, PASCAL, SIMULA, and APL); and the generation gap, 1970-1980, which produced many different languages (Bartholomew-Biggs, 2000).
Software options for solving problems
There are three classes of software: (1) basic tools, such as language compilers and graphics packages; (2) tools that solve users' problems, such as structural engineering systems; and (3) widely applicable generic tools, such as mathematical systems and compiler generators.
A course of several actions can be taken to attain a numerical solution: (1) using an existing black-box package, such as PAFEC for finite-element work or GENSTAT for statistical analysis; (2) applying library routines such as IMSL, NETLIB, and NAG after splitting the problem at hand into well-defined components; or (3) writing a whole purpose-built program, which sometimes requires deep computing and analytical knowledge.
Numerical libraries
Design issues
Numerical libraries mainly perform standard numerical linear algebra operations, singular value decomposition, fast Fourier transforms, nonlinear optimization, linear programming, curve fitting, quadrature, and the evaluation of special functions.
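Most of these operations map directly onto base R functions; a few one-line examples as a sketch (the sample data are invented):

```r
set.seed(1)
A <- matrix(rnorm(9), 3, 3)
s <- svd(A)                         # singular value decomposition
x <- fft(sin(2 * pi * (0:63) / 8))  # fast Fourier transform of a sine
q <- integrate(dnorm, -Inf, Inf)    # quadrature: integrates to 1

# Nonlinear curve fitting to synthetic exponential data.
d <- data.frame(t = 0:9, y = 3 * exp(0.3 * (0:9)) + rnorm(10, sd = 0.1))
f <- nls(y ~ a * exp(b * t), data = d, start = list(a = 1, b = 0.1))
coef(f)
```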
NAG
In May 1970, the computing centres of six UK universities decided to create a numerical routine library. A year later, they released Mark 1 of the NAG library, containing 98 documented routines. By 1989, Mark 12 had 688 routines, and other versions of the library had been created in Algol 60, Algol 68, and Pascal. In addition, there are specific library versions for computers from Cray, CDC, Data General, Harris, Telefunken, Xerox, and Philips. NAG's philosophy of maximum efficiency means that the software tries to solve mathematical problems as well as possible within the algorithm's solution domain. It also strives to detect, signal, and reject error conditions and, where possible, to return to the user the best upper bound on the error relative to a user-supplied tolerance.
International Mathematical and Statistical Libraries (IMSL)
IMSL contains a big mathematical software library. Its main aim is commercial success through low cost and a resulting high volume. Over 350 subroutines are readily compatible with computers from Data General, Xerox, DEC, Hewlett-Packard, and Burroughs (Wang & Garbow, 1981).
Teapack
This is a Fortran-based subroutine library that is intended to be easier to use than the commercial IMSL and NAG libraries. Teapack was designed in the 1980s, with documentation for teaching an introduction to numerical analysis. The package contains routines for most of the basics: polynomial roots, interpolation, and ordinary differential equation calculations.
Machine-Dependent Libraries
Supercomputers also have their own software libraries, such as those for Floating Point Systems (FPS) machines. In July 1989, Brad Carlile reported to the NA Digest that FPS Computing had announced "at-cost" availability of FPSmath, a de facto standard library of software for scientific and engineering algorithms. This speeds application research and development by allowing institutions to have the same mathematical tools across their entire computing environment at a nominal cost, guaranteeing portability while taking advantage of supercomputer features and acceleration.
Sources of documentation
Users are provided with various categories of documentation, including condensed, encyclopedic, detective, and specialized information.
Comparison and testing of algorithms
The choice of algorithm is affected by the following factors: efficiency, storage costs, generality, and reliability in solving all problems.
General-purpose computer algebra systems.
A computer algebra system (CAS) is a software package used for the manipulation of mathematical formulae. Its main purpose is to automate tedious calculations and manipulate hard algebraic expressions. The ability of computer algebra systems to solve equations symbolically is the main difference between them and a traditional calculator. Computer algebra systems provide the user with a programming language for defining procedures and facilities for graphing equations. Some widely used systems include Mathematica, MathCAD, and Maple. They are used for simplifying rational functions, factoring polynomials, solving equations, and many other calculations. While the algorithmic processes of Newton and Leibniz's calculus are very hard and tedious to carry out by hand, computer algebra systems perform these tasks and far outdo humans at them. The following is a summary of how some of these computer algebra systems work.
Speakeasy: developed in the 1960s with its main focus on matrix manipulation; in the course of its evolution it acquired the most common paradigm tools, including dynamically typed structured data objects, garbage collection and dynamic allocation, operator overloading, and the linking of add-on modules contributed by groups of users.
SageMath: open-source software with a unified Python interface to both open-source and proprietary general-purpose CASs and other programs, such as GP, Magma, GAP, and Maple.
PARI: a computer algebra system in which matrices, algebraic numbers, and polynomials are computed.
Mathematica: has computer algebra capabilities and a programming language, and is used among other things for number-theory computations.
MathCAD: has a WYSIWYG interface that aids the publication-quality presentation of mathematical equations.
Trilinos: an open-source, object-oriented collection of libraries applied in engineering and science; it solves linear algebra problems with parallel algorithms (Wester, 1999).
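As a small aside on symbolic manipulation: even base R, a numerically oriented language, can differentiate expressions symbolically, which gives a miniature taste of what the systems above automate at scale. A minimal sketch:

```r
# Symbolic differentiation with base R's D().
e  <- expression(x^2 * sin(x))
de <- D(e[[1]], "x")       # symbolic derivative: 2 * x * sin(x) + x^2 * cos(x)
de
eval(de, list(x = pi / 4)) # evaluate the derivative numerically
```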
Computer-assisted data analysis
Computer-assisted qualitative data analysis software, such as MAXQDA, provides solutions to given data problems without directly handing interpretations to the user. Qualitative data software tools provide ways of structuring, sorting, and analyzing large bodies of data, which facilitates the management of evaluation and interpretation. Qualitative data analysis depends on methods of organizing, systemizing, and analyzing non-numeric material, such as that used in qualitative content analysis, mixed-methods analysis, group discussions, case and field studies, and grounded theory. Computer-assisted data analysis packages should facilitate and support methods of sorting, analyzing, and structuring data content regardless of which approach the researcher chooses. Data in the form of image files, video, audio material, and social media content can also be included in these packages; sophisticated computer-assisted data analysis software allows such content to be transcribed and imported into the program directly.
QDA software such as MAXQDA supports the whole process of analysis by providing overviews and visualizations of relationships. It also provides space for adding memos to the various analytical processes, which helps the user understand them better. The first version of MAXQDA was created in 1989, which makes it a pioneering program in the area. From collecting data to publishing the final report, regardless of the approach used, the program supports the user. Coding, the systematic assignment of portions of text to themes, together with the possibility of making notes and associations, forms the central elements of MAXQDA.
In MAXQDA, evaluation and interpretation of data are performed by sorting material into portions using a hierarchical coding system, defining variables, providing tabular overviews, and assigning colours to segments of text. The procedures can be easily tracked, and results are accessible within a few steps. Creating striking visualizations helps the user view the data from a completely different perspective and test theories. Through these visualizations, the results can be projected and exported to many programs for inclusion in the final publication (Chitu & Song, 2019).
Data analytics and processing platforms in Cyber-Physical Systems.
The speed of current developments in cyber-physical systems (CPS) and the IoT creates new challenges for business owners and data analysts, who must come up with new techniques for analyzing big data. Cyber-physical systems are integrations of computer systems with the physical world; they process and display signals for the problem at hand.
Data types
The most important aspect of understanding data is being able to distinguish the different types of data; this matters, for instance, when implementing a machine learning algorithm. Variables are either numerical or categorical. Categorical data is subdivided into two subcategories: nominal, where there is no meaningful order, and ordinal, where there is an obvious order. Numerical data consists of counts or measurements and is grouped into two further types: discrete (integers) and continuous. Several types of data analysis are performed with computer software, including qualitative analysis, hierarchical analysis, graph analysis, spatial analysis, and textual data analysis.
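A short sketch of these distinctions in R's terms (the sample values are invented):

```r
colour  <- factor(c("red", "green", "red"))                 # nominal: no order
rating  <- factor(c("low", "high", "medium"),
                  levels = c("low", "medium", "high"),
                  ordered = TRUE)                           # ordinal: ordered
counts  <- c(3L, 7L, 2L)                                    # discrete (integer)
weights <- c(1.42, 3.80, 2.15)                              # continuous (double)
rating[1] < rating[2]   # TRUE: comparisons make sense only for ordered factors
```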
Hybrid systems
A hybrid system is one in which the behaviour of the area of interest is determined by coupling processes of distinct characteristics, specifically discrete dynamics coupled with continuous ones. Such systems generate signals consisting of discrete-valued and continuous-valued components, which depend on independent variables such as time. Hybrid models are used in the control of automotive engines: control algorithms implemented through embedded controllers reduce fuel consumption and pollutant emissions while keeping car performance neutral.
Soft computing
This approach separates soft computing, which is grounded in computational intelligence, from hard computing, which is grounded in conventional precise computation. Hard computing is characterized by formality and precision and is directed toward analyzing and designing physical systems and processes; it handles crisp systems, probability theory, mathematical programming, binary logic, approximation theory, and differential equations. Soft computing is used for analyzing and designing intelligent systems, and it handles problems involving fuzzy logic, probabilistic reasoning, and neural networks.
Data structures.
For a computer program to manipulate an equation symbolically, the equation must first be stored in computer memory. At the centre of any computer algebra system is a data structure, or a combination of data structures, responsible for describing mathematical equations. Equations might reference other functions, be rational functions, or exist in several variables; hence there is no single data-structure representation suited to every equation. A representation can be complex in space and time, and hence inefficient, yet easy to program; and a representation efficient for one mathematical problem is not necessarily efficient for others, so there is no single answer for a given problem.
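A minimal illustration of the point, in R: an equation can be stored as an unevaluated parse tree (a call object) and inspected or evaluated later, which is exactly the data-structure problem a CAS must solve at scale. The symbols are hypothetical.

```r
eq <- quote(a * x^2 + b * x + c)   # stored equation, not yet evaluated
class(eq)                          # "call": a tree of operators and operands
as.list(eq)                        # inspect the structure
eval(eq, list(a = 1, b = -3, c = 2, x = 2))   # evaluate: 1*4 - 6 + 2 = 0
```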
Interface-oriented software
COMSOL: simulation software and solver for engineering and physics applications, especially coupled phenomena.
Baudline: a time-frequency browser used for scientific visualization and numerical signal analysis.
Dataplot: provided by NIST.
Euler Mathematical Toolbox: a powerful numerical laboratory with a programming language that can handle real, complex, and interval numbers, vectors, and matrices.
Hermes: a C++ library of advanced finite-element algorithms for solving partial differential equations and coupled multiphysics problems.
DADiSP: a DSP-focused program that combines MATLAB-style numerical capability with a spreadsheet interface.
FlexPro: a commercial program for automated and interactive analysis and presentation of measurement data. Other programs include IGOR Pro, the FEniCS Project, Fityk, and LabPlot.
Language-oriented software
ADMB: C++ software that uses automatic differentiation for nonlinear statistical modelling.
acslX: an application for modelling and evaluating the performance of continuous systems described by time-dependent nonlinear differential equations.
AMPL: a mathematical modelling language for describing and solving large-scale, high-complexity optimization problems.
Armadillo: a C++ linear algebra library with factorizations, decompositions, and statistical functions.
APMonitor: a mathematical modelling language used for describing and solving representations of physical systems in the form of differential and algebraic equations.
Clojure: provides the Neanderthal numeric libraries, ClojureCL, and ClojureCUDA to handle linear algebra and optimized matrix functions on CPU and GPU.
R: a system for statistical analysis and data manipulation that implements the S language.
SAS: statistics software that includes a matrix programming language.
VisSim: a visual block-diagram program for nonlinear dynamic simulation that supports fast ordinary differential equation solving and real-time simulation of large, complex models.
World Programming System (WPS): supports mixing SAS, R, and Python in a single program for statistical analysis and data evaluation. Other language-oriented software includes Julia, Madagascar, O-Matrix, Optim, GAUSS, Perl Data Language, and many others.
Conclusion
Due to the current trend of transformation in many aspects of life, computers have to be included in numerical data analysis systems to enable faster and more accurate data simulations. Industrialization, increased business capital, future demands, growing populations, and other affairs of life have led to demand for computer-aided data analysis in the numerical analysis field, as discussed in this article.
References
Bartholomew-Biggs, M. C. (2000). Software to support numerical analysis teaching. International Journal of Mathematical Education in Science and Technology, 31(6), 857-867.
Brinkgreve, R. B. J. (1996). Geomaterial models and numerical analysis of softening.
Chitu, C., & Song, H. (2019). Data analytics and processing platforms in CPS. In Big Data Analytics for Cyber-Physical Systems (pp. 1-24). Elsevier.
Conte, S. D., & De Boor, C. (2017). Elementary numerical analysis: An algorithmic approach. Society for Industrial and Applied Mathematics.
Golub, G., & Van Loan, C. (1996). Matrix computations (3rd ed.). Johns Hopkins University Press.
Wang, J. Y., & Garbow, B. S. (1981). Guidelines for using the AMDLIB, IMSL, and NAG mathematical software libraries at ANL (No. ANL-81-73). Argonne National Lab., IL (USA).
Wester, M. J. (1999). Computer algebra systems: A practical guide. John Wiley & Sons, Inc.
Data Analysis Tools
The modern-day data analyst uses computer software programs to perform data analysis tasks. Several data analysis tools serve different analysis purposes. A successful data analyst should have in-depth knowledge and understanding of the inner workings of these tools. Using this understanding, data analysts can select the appropriate tool for a given data analysis task.
This paper will discuss six types of data analysis tools: spreadsheets, databases, self-service data visualization, programming languages, big data tools, and cloud.
Spreadsheets are suitable for data collection. The two main spreadsheet options are Excel and Google Sheets. Spreadsheets are structured with fields in which users can enter data. Google Sheets in particular is great for data collection because it is centralized, and Google Forms can easily be shared online with the respective parties (Nguyen, 2018). Data collected through spreadsheets can then be uploaded to databases. Spreadsheets are less suitable for data cleaning, however.
Databases can be relational, columnar, document, or graph. Relational databases are the most commonly used type; examples include Microsoft Access, SQL Server, and Oracle. Databases are not suitable for analysis itself (Nguyen, 2018); they are, however, suitable for data collection. Relational databases are designed for data storage and transactions, and they play a role in data analysis by allowing querying and the generation of samples from stored data for analysis, as sketched below.
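A brief sketch of pulling a sample from a relational database for analysis, using R's DBI interface with an in-memory SQLite database as a stand-in for a production server; the table and column names are hypothetical.

```r
library(DBI)

con <- dbConnect(RSQLite::SQLite(), ":memory:")
dbWriteTable(con, "sales", data.frame(region = c("N", "S", "N"),
                                      amount = c(120, 85, 200)))
sample_df <- dbGetQuery(con,
  "SELECT region, SUM(amount) AS total FROM sales GROUP BY region")
dbDisconnect(con)
sample_df   # small extract, now ready for analysis in R
```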
Self-service data visualization tools bring the analyst experience to non-analysts. Such tools include Tableau, Power BI, Qlik Sense, and AWS QuickSight. These tools are simplified so that users can visualize data without undergoing technical training (Nguyen, 2018). Power BI is available as a free download, but sharing the results requires a paid service. Tableau and Qlik Sense are paid products, while AWS QuickSight offers a cheaper, cloud-based alternative.
Programming languages for data analysis notably include R and Python. The two languages are successors to older tools such as MATLAB, SPSS, and SAS, which were more expensive. They are appropriate both for simple data analysis tasks and for complex data science tasks (Nguyen, 2018). R and Python are open source, thus offering lower costs. The tools above are suitable for analysis tasks on moderately small data (ProjectPro, 2023). Next is the analysis of big data.
Big data implies volumes of data so vast that a single computer may not handle them, requiring multiple computers to share the processing simultaneously (Nguyen, 2018). This type of data uses distributed computing, where several computers operate as clusters. Examples of such computing frameworks are Hadoop, data lakes, and Spark (ProjectPro, 2023). Data formats that require huge memory, such as videos and images, necessitate big data solutions.
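As a hedged sketch of driving cluster-style analysis from R, the following assumes the sparklyr package and a local Spark installation are available; in production the master would point at a real cluster, and a tiny built-in data set stands in for big data.

```r
library(sparklyr)
library(dplyr)

sc <- spark_connect(master = "local")        # stand-in for a cluster master
cars_tbl <- copy_to(sc, mtcars, "cars")      # data shipped to Spark
cars_tbl %>%
  group_by(cyl) %>%
  summarise(avg_mpg = mean(mpg, na.rm = TRUE)) %>%
  collect()                                  # bring the small result back to R
spark_disconnect(sc)
```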
Cloud-based solutions are not software running locally on a computer; they are usually accessed through the internet, and multiple machines may be used to store data and share processing. Examples of cloud platforms are AWS, Microsoft Azure, and Google Cloud Platform. Cloud is cheap and straightforward for a business just setting up an IT infrastructure, and it is also suitable for big data workloads (Nguyen, 2018). Because cloud resources can be scaled up or down easily, or switched off, without much sunk hardware and software cost, cloud-based solutions are some of the best computing solutions. Cloud platforms are also great for big data because they can access multiple databases with different data sets.
A data science professional should choose the appropriate tool for the right task. Simple spreadsheets are great for data collection. Databases provide an excellent source of stored data. Self-service visualization tools enable non-technical users to visualize data. Programming languages suit both simple analysis and complex data science tasks. Big data tools are good for analyzing big data, for example data from the cloud, and the cloud itself is an excellent source of big data. Each tool is valuable at a different stage of the data analytics process.
References
Nguyen, J. (2018, November 29). Top 6 tool types for data analysis / data science - Save hours by using the right tool [Video]. YouTube. https://www.youtube.com/watch?v=23QtdnfhBRY
ProjectPro. (2023, July 15). R Hadoop – A perfect match for big data. ProjectPro. https://www.projectpro.io/article/r-hadoop-a-perfect-match-for-big-data/292#:~:text=Since%2C%20R%20is%20not%20very