Anomaly detection


Published: May 29, 2024

Although there are several security measures in use, professionals in the field of computer security have grouped them into different classes: strategies that help prevent attacks, strategies that help detect attacks, and strategies that help recover from attacks. Although the focus of this study is anomaly detection, it is worth noting that, by analogy, installing a monitoring sensor in one's home is a practical way to notice an intrusion.

According to Bruce et al. (2020), anomaly detection is the practice of comparing an observed occurrence against a definition of activity that is thought to be normal in order to discover substantial differences. In many instances, intrusion detection systems are used to identify attacks (Bruce et al., 2020); these systems operate by evaluating the traffic flowing toward a computer-based system. In anomaly detection, however, specialists compare observed behavior against a historical baseline.

Anomaly detection, according to Bruce et al. (2020), is a procedure predicated on the idea that intrusive or improper behavior deviates from how a typical system is used. Therefore, most anomaly detection systems are able to learn the activity profile of a typical system and then highlight any system events that statistically differ from the established profile. One advantage of identifying anomalies is that it facilitates the abstraction of data related to a system's typical behavior and aids in the detection of attacks, whether or not the system has previously encountered them.

By using metrics derived from system measures including CPU utilization, number and length of logins, memory consumption, and network activity, computer security specialists have created behavior models. A serious flaw in anomaly detection, however, is its susceptibility to a hacker who manages to enter the system during the detector's learning period (Bruce et al., 2020). A cunning hacker might be able to train the anomaly detector to interpret intrusive events as typical system behavior.

A number of approaches have been devised for anomaly detection. However, experts have concerns about these approaches, particularly regarding unauthorized access to source devices. For instance, the statistical approach to anomaly detection provides a system that allows the anomaly detector to measure the deviation of the present behavior profile from the original profile (Kibria et al., 2018). In this case, the system learns only from data collected during normal operation and tries to flag anomalous behavior by treating low-probability observations in the test data as suspicious.
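
The statistical approach described above can be sketched in a few lines: build a per-metric baseline profile from observations of normal operation, then flag new observations whose deviation from the profile exceeds a threshold. The metric names, sample values, and three-sigma threshold below are illustrative assumptions, not details from the cited studies.

```python
# Minimal sketch of statistical anomaly detection: a baseline profile
# (mean and standard deviation per metric) is built from normal
# training observations; a new observation is flagged if any metric's
# z-score exceeds a threshold.
from statistics import mean, stdev

def build_profile(training_samples):
    """Per-metric (mean, stdev) learned from normal observations."""
    profile = {}
    for m in training_samples[0]:
        values = [s[m] for s in training_samples]
        profile[m] = (mean(values), stdev(values))
    return profile

def is_anomalous(observation, profile, threshold=3.0):
    """Flag the observation if any metric deviates by > threshold sigmas."""
    for m, (mu, sigma) in profile.items():
        if sigma > 0 and abs(observation[m] - mu) / sigma > threshold:
            return True
    return False

# Baseline built from benign system measurements (synthetic values)
normal = [{"cpu": 20 + i % 5, "logins": 3 + i % 2} for i in range(50)]
profile = build_profile(normal)

print(is_anomalous({"cpu": 22, "logins": 3}, profile))   # typical behavior
print(is_anomalous({"cpu": 95, "logins": 40}, profile))  # strong deviation
```

This also makes the learning-period weakness concrete: if intrusive traffic is present in `normal`, the profile absorbs it and similar intrusions stop being flagged.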

The concern here, however, is that false negatives and false positives may be generated because of the inadequacy of the statistical measures employed. There are also doubts about whether enough normal training data was collected (Rahul & Banyal, 2020). In addition, unauthorized access to source devices covers a range of concerns, such as hackers using the compromised devices as a staging point for launching an attack or for stealing information (Kibria et al., 2018). In the context of network security and performance, a further concern is that the source of system logs and network flow data is the infrastructure itself.

Different processes have been implemented to govern access rights to network infrastructure devices. One process many organizations have employed is to establish a policy that classifies information, spelling out the importance of the information stored in the system (Rettig et al., 2019). Detecting anomalies in a network remains one of the major deficiencies facing computer and network security personnel. This research provides insight into computer and network security and, more specifically for the research thesis, draws on sources that provide ample information on securing wireless sensor networks (Kibria et al., 2018). The research, for which I have decided to use a comparative approach, will combine literature from a number of previous studies to provide a resource for those seeking solutions to computer and network security issues.

In their article, Kibria et al. (2018) discuss DoS assaults against wireless sensor networks (WSNs). If deployed in an insecure environment, the wireless network devices and sensors cannot protect the wireless medium from attacks and are susceptible to physical tampering. One of the generic security mechanisms the authors suggest is symmetric cryptography, which uses shorter encryption keys and is arguably better suited than public-key cryptography to sensor networks. The article describes each protocol layer's weaknesses and suggested protection measures. For instance, at the physical layer, jamming and node destruction or tampering are used to attack the network and sensor devices (Munir, 2021). Defense strategies include detect-and-sleep, routing around jammed areas, concealing or disguising nodes, tamper-proof packaging, authentication, and interaction protection. The goal is to use prior research to give a more thorough, complete list of protection mechanisms and remedies for security concerns. A study on attacks, security measures, and difficulties in wireless sensor networks is another crucial source.
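
As a rough illustration of the symmetric-key authentication and interaction-protection mechanisms mentioned above, the sketch below attaches an HMAC tag to a sensor reading using a pre-shared key. The key, node identifier, and message layout are invented for the example; real sensor networks would use purpose-built lightweight protocols.

```python
# Hedged sketch: message authentication with a pre-shared symmetric key.
# The receiver recomputes the HMAC over the payload and rejects any
# message whose tag does not match (i.e., a tampered or forged reading).
import hashlib
import hmac

PRESHARED_KEY = b"sensor-net-demo-key"  # distributed to nodes out of band

def tag_message(payload: bytes) -> bytes:
    """Sender: compute a short authentication tag for the payload."""
    return hmac.new(PRESHARED_KEY, payload, hashlib.sha256).digest()

def verify_message(payload: bytes, tag: bytes) -> bool:
    """Receiver: recompute the tag and compare in constant time."""
    return hmac.compare_digest(tag_message(payload), tag)

reading = b"node7:temp=21.4"
tag = tag_message(reading)
print(verify_message(reading, tag))             # authentic message
print(verify_message(b"node7:temp=99.9", tag))  # tampered payload
```

The constant-time comparison (`hmac.compare_digest`) matters because a naive byte-by-byte comparison can leak tag prefixes through timing.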

According to Xie et al. (2018), there are two primary types of data in data science: spatial data and temporal data. The first step in developing a spatial data set entails creating a required series that is not already recorded in spatial storage. Using spatial data, the analyst applies spatial information to generate the necessary data. Identifying such series is one of several difficulties in applying spatial data, particularly in research analysis. According to Mishra & Jena (2021), Clementine and Enterprise Miner are two commonly used tools for managing spatial data. These tools are primarily used for the analysis of various kinds of data, including genomic data and web data. Spatial data stores latitude and longitude information and includes coordinates pointing to a location in space.
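
A minimal sketch of what such spatial data looks like in practice: records carrying latitude and longitude, and a great-circle (haversine) distance computed between two such points. The coordinates and the choice of the haversine formula are illustrative assumptions, not from the cited sources.

```python
# Spatial data as (lat, lon) coordinates, with the haversine formula
# giving the great-circle distance between two points on the Earth.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km ~ mean Earth radius

# Two sample locations (New York City and Los Angeles)
print(round(haversine_km(40.7128, -74.0060, 34.0522, -118.2437)))
```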

Spatial data also has a number of features that help locate various geographic locations and images of those locations. Temporal data, on the other hand, describes a situation in real time. Temporal data is seen as transient because it does not remain valid for very long (Midani et al., 2019). Temporal data is typically employed for demographic research, traffic management, and weather analysis. The analytics performed during temporal analysis are used to pinpoint a problem's root cause, which aids in providing a remedy.

The answer depends on the pattern of the phenomena being investigated. Temporal data supports a wide range of tasks, such as data categorization and comparison, trend analysis, correlation analysis between data sets, time-series analysis, and many other things (Tschimben, 2022). The basic goal of temporal analysis is to pinpoint the temporal series, correlations, and sequences that exist within the data and to gather the information necessary to display the data's behavior over a certain time frame. Temporal data makes it possible to compute numerous key values at various points in a given period and to lay out the data's time sequence. The two categories are really different even though they seem similar. By definition, spatial mining derives data and correlations from local data contained within a database, whereas temporal mining extracts trustworthy data only from temporal data, aiding in pattern recognition.
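
Two of the temporal tasks listed above, trend analysis and correlation between series, can be sketched directly. The series below are synthetic examples (hourly request traffic and a CPU load that tracks it); the names and values are assumptions for illustration.

```python
# Sketch of basic temporal-data analysis: Pearson correlation between
# two time series and a least-squares trend slope over time.
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def trend_slope(ts, ys):
    """Least-squares slope of ys over time points ts (the linear trend)."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    return (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
            / sum((t - mt) ** 2 for t in ts))

hours = list(range(24))
traffic = [50 + 3 * h for h in hours]   # steadily rising request load
cpu = [10 + 0.5 * t for t in traffic]   # CPU load tracks traffic linearly

print(pearson(traffic, cpu))        # correlation between the two series
print(trend_slope(hours, traffic))  # trend of traffic over time
```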

In connection, the data science platform permits forecasting through the use of code and powerful computers. The model created makes it easier to develop trustworthy solutions to the problem that needs to be solved. Accuracy during modeling is mainly determined by the extent of the inputs and the precision of the data acquired. This system makes use of Hadoop (Bruce et al., 2020). Traditional databases and statistical tools are not capable of handling large volumes of structured and unstructured data, but platform tools can; data scientists primarily use the platforms for cleaning, data visualization with statistical analysis, and writing modeling code, among other tasks.

According to Bruce et al. (2020), business analysts also use the platform to understand their clients' businesses. The platform always supports replications based on stakeholder information. There are related data science tools: a data science tool functions as a means to organize, examine, and visualize data. The main difference between the two is that a data science tool can only be used one at a time, whereas a data science platform can include multiple programming tools. R is a useful example of a data science tool. R is open-source, cost-free software used for statistical computing and visualization (Jha & Sharma, 2021). According to Bruce et al. (2020), R has around 9,900 packages, including ggpubr, ggplot2, tidyr, and others, that allow data scientists to conduct analysis. R integrates quite easily with other languages, including Python and SQL.

 

Conclusion

It has been highlighted that the field of data science is expanding quickly in the modern technological environment. As a result of this growth, as mentioned, data scientists and businesses face numerous challenges, particularly in managing data. Since data comes from diverse sources, its management is a concern. It has also been emphasized that the data analyst must possess the abilities necessary to control and resolve any data-related issues that may arise. Additionally, it has been highlighted that data may be classified in two different ways: spatially and temporally.

Data analysis and visualization are required for the desired outcomes in each of these classifications. The life cycle of data science, which starts with data collection and ends with data visualization, has also been noted. The data science platform is referenced in relation to supporting various programming tools for analysis. In conclusion, various programming languages are used for analysis, including R, which has a number of internal packages that aid in data visualization for improved decision-making.

References

  • Bruce, P., Bruce, A., & Gedeck, P. (2020). Practical statistics for data scientists: 50+ essential concepts using R and Python. O'Reilly Media.
  • Jha, P., & Sharma, A. (2021, January). Framework to analyze malicious behaviour in cloud environment using machine learning techniques. In 2021 International Conference on Computer Communication and Informatics (ICCCI) (pp. 1-12). IEEE.
  • Kibria, M. G., Nguyen, K., Villardi, G. P., Zhao, O., Ishizu, K., & Kojima, F. (2018). Big data analytics, machine learning, and artificial intelligence in next-generation wireless networks. IEEE Access, 6, 32328-32338.
  • Midani, W., Fki, Z., & BenAyed, M. (2019, October). Online anomaly detection in ECG signal using hierarchical temporal memory. In 2019 Fifth International Conference on Advances in Biomedical Engineering (ICABME) (pp. 1-4). IEEE.
  • Mishra, B., & Jena, D. (2021). Mitigating cloud computing cybersecurity risks using machine learning techniques. In Advances in Machine Learning and Computational Intelligence: Proceedings of ICMLCI 2019 (pp. 525-531). Springer Singapore.
  • Munir, M. (2021). Thesis approved by the Department of Computer Science of the TU Kaiserslautern for the award of the Doctoral Degree doctor of engineering (Doctoral dissertation, Kyushu University, Japan).
  • Rahul, K., & Banyal, R. K. (2020). Data life cycle management in big data analytics. Procedia Computer Science, 173, 364-371.
  • Rettig, L., Khayati, M., Cudré-Mauroux, P., & Piórkowski, M. (2019). Online anomaly detection over big data streams. In Applied Data Science (pp. 289-312). Springer, Cham.
  • Tschimben, S. (2022). Anomaly detection in shared spectrum (Doctoral dissertation, University of Colorado at Boulder).
