
ON BIG DATA MANAGEMENT IN INTERNET OF THINGS


Chapter One: Introduction
Advancements in sensor technology, communication, and data analytics have created new opportunities. Nanotechnology allows manufacturers to create sensors that are both compact and sophisticated, making them suitable for almost any application.

Improved communication protocols among devices enable effective, real-time transmission of sensor data, and techniques for processing this data continue to mature.

In recent years, researchers have shown increased interest in the Internet of Things (IoT) as a result of these developments. The Internet of Things (IoT) allows users to link their “things” with anything, anywhere, and at any time, over any communication medium.

The term “things” refers to any type of connected device. By 2020, an estimated 50–100 billion devices were expected to be connected to the internet [2].

These devices will generate an enormous amount of diverse data. The term “Big Data” refers to large volumes of heterogeneous data that are generated rapidly. Big data is commonly characterised by the 3Vs (volume, variety, and velocity), or the 5Vs when value and veracity are added [3], [12].

Well-managed data can provide significant insights into the behaviour of people and “things”, with numerous applications.

IoT data is rapidly transforming our daily lives. People are increasingly accepting and trusting IoT data analytics in sensitive settings, including stock market trading [1].

Efficient management of large, fast-moving data streams is therefore crucial. Hadoop and other distributed processing systems can manage massive volumes of data, but they are not designed for streams.

Latency is a fundamental problem in distributed batch environments such as Hadoop. Their traditional store-process-and-forward strategy makes them unsuitable for the real-time processing required by present and emerging applications [4].

The high velocity and unstructured nature of IoT data make store-and-forward methods unable to meet latency requirements. Stream processing frameworks, such as Apache Storm and Apache Samza, are designed to address this issue.

Stream processing allows continuous processing of data sources without the requirement for initial storage. This reduces latency, particularly in stateless stream processing, where each event is processed without regard to the system’s current state.
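The idea of stateless stream processing can be illustrated with a minimal sketch in plain Python, independent of any particular framework such as Storm or Samza. The sensor readings, field names, and threshold below are hypothetical; the point is only that each event flows through the operator and is emitted immediately, with no prior storage and no state carried between events.

```python
def sensor_stream(readings):
    """Simulate an unbounded source: yield one reading at a time."""
    for reading in readings:
        yield reading

def stateless_filter(stream, threshold):
    """Stateless operator: each event is evaluated on its own,
    with no reference to earlier events or stored state."""
    for event in stream:
        if event["temp"] > threshold:
            # Emit the alert as soon as the event arrives.
            yield {**event, "alert": True}

def process(readings, threshold=30.0):
    """Wire source -> operator -> sink; events flow through
    continuously instead of being stored and processed later."""
    return list(stateless_filter(sensor_stream(readings), threshold))

# Hypothetical readings from two sensors.
readings = [
    {"sensor": "s1", "temp": 22.5},
    {"sensor": "s2", "temp": 35.1},
    {"sensor": "s1", "temp": 31.0},
]
alerts = process(readings)
```

Because the operator keeps no state, instances of it could be replicated freely across machines, which is one reason stateless pipelines scale well and keep per-event latency low.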

Stream processing frameworks, however, are built for general data processing rather than tailored to IoT data management, and IoT applications often have strict latency requirements.

IoT applications rely heavily on machine-to-machine (M2M) connectivity. Emerging IoT applications require a new approach to minimising latency and efficiently exploiting data from “things”.

1.1 Research Question

Is it possible to create a Big Data management system that reduces latency in intelligent responses to actionable events in IoT applications?

1.2 Objective of the Research
This paper proposes and demonstrates a scalable and robust Big Data management solution for IoT, extracting real-time value from data in an application area. Our goal is to reduce latency in streaming data from a network of connected devices, allowing for real-time collection of events and triggering actions.

1.3 Implications of Research

This paper proposes a latency-reducing approach to IoT data management that is independent of data source, type, or connection protocol. This approach will enhance the speed and responsiveness of real-time applications and expand the use of IoT in latency-critical sectors.

This strategy improves the reaction time of IoT applications, allowing faster responses that meet the needs of emerging applications. It will apply to both commercial and open-source IoT data management applications.

1.4 Scope of Work

This work covers the following:

I. To provide a novel approach to IoT data handling that reduces latency.

II. To implement this approach using software tools.

III. To apply the implementation to a demanding use case.

1.5 Organisation

The second chapter of this write-up contains a detailed review of the literature, covering key concepts, technologies, and relevant research in the domain.

The third chapter introduces the proposed model, explains its functionality, and highlights its merits in lowering latency. The fourth chapter outlines the model’s implementation, including results and evaluation. The fifth chapter concludes and discusses future research.
