Found 779 repositories (showing 30)
mrmierzejewski
A sleek, minimalist Conky configuration that displays real-time system information and weather data. Features live weather forecasts, system resource monitoring (CPU, memory, disk I/O), network statistics, and process tracking - all in a clean, transparent overlay that stays out of your way.
DhanushN2005
A real-time weather data streaming and analysis pipeline built with Python and Apache Kafka that fetches weather information from an online API, streams it to Kafka topics, and processes it for analytics or storage.
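The fetch-then-publish step of such a pipeline can be sketched as follows. This is a minimal illustration, not the project's actual code: the topic name `weather-readings`, the payload field names, and the broker address are all assumptions.

```python
import json

def to_record(api_payload: dict) -> dict:
    """Normalize a raw weather-API payload into a flat record for Kafka.
    Field names here are illustrative; real APIs differ."""
    return {
        "city": api_payload["name"],
        "temp_c": round(api_payload["main"]["temp"] - 273.15, 2),  # Kelvin -> Celsius
        "humidity": api_payload["main"]["humidity"],
        "ts": api_payload["dt"],
    }

if __name__ == "__main__":
    # Wiring sketch only -- requires a running broker and the kafka-python client.
    from kafka import KafkaProducer
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    payload = {"name": "Hanoi", "main": {"temp": 300.15, "humidity": 70}, "dt": 1700000000}
    producer.send("weather-readings", to_record(payload))
    producer.flush()
```

Keeping the normalization in a pure function like `to_record` makes the transform testable without a live broker.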
Power system components usually have standard static ratings that determine the load constraints. The term "rating" in the power line system refers to the maximum allowable conductor current that raises the line temperature without infringing ground clearance or causing loss of conductor tensile strength due to annealing. Traditionally, a load-ability concept established on the basis of the static thermal rating of overhead transmission lines refers to worst-case conditions (high ambient temperature, full sun, and low wind speed). This is one of the reasons why power systems do not use all of their potential transmission capacity. In fact, the real rating capacity of an overhead line increases when wind speed is high, due to the cooling effect of wind on the conductors. A dynamic line rating solution allows assessing real weather conditions and calculating the real capacity of transmission lines. Thus, when planning wind power integration, if dynamic line limits are considered instead of conservative static limits, the estimated capacity increases. Research has shown that actual line ratings are higher than the static rating most of the time. The potential of dynamic line rating to increase the reliability of power systems is therefore significant. Almost every country has begun increasing the integration of intermittent sources like wind power into its network; consequently, dynamic rating technology has become increasingly important and is incorporated in the smart grid vision. This report provides a comprehensive study of the literature on dynamic line rating. Next, the current constraints on the power transmission grid were identified, which are related to the history and geography of Vietnam. On this basis, several suggestions were put forward to improve the power transmission grid in Vietnam, namely developing renewable energy sources and introducing a dynamic line rating solution where conditions permit.
Different scenarios for the analysis of real-time weather data and ways of applying dynamic rating to a typical power line system, the 220 kV Vantri-Chem 17 km line, are taken into consideration. The study provides a reference for the future development of the smart-grid concept in Vietnam. Keywords: Dynamic Rating; Thermal; Load-ability; Ampacity; Clearance; Overhead Conductor; Sag; Tension; Thermal capacity.
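The heat-balance idea behind dynamic rating can be illustrated with a toy steady-state model: the allowable current satisfies I = sqrt((q_c + q_r - q_s) / R), where convective cooling q_c grows with wind speed. This is a sketch only; the coefficients below are invented for illustration and are not the IEEE Std 738 terms.

```python
import math

def ampacity(wind_ms: float, ambient_c: float,
             conductor_c: float = 75.0, r_ohm_per_m: float = 1e-4) -> float:
    """Toy steady-state heat balance: I = sqrt((qc + qr - qs) / R).
    Coefficients are illustrative, not taken from IEEE Std 738."""
    qc = 3.0 * math.sqrt(max(wind_ms, 0.1)) * (conductor_c - ambient_c)        # convective cooling, W/m
    qr = 1.0e-8 * ((conductor_c + 273.15) ** 4 - (ambient_c + 273.15) ** 4)    # radiative cooling, W/m
    qs = 15.0                                                                  # solar heat gain, W/m
    return math.sqrt(max(qc + qr - qs, 0.0) / r_ohm_per_m)

# A windy, cool day supports far more current than the static worst case
# (hot, sunny, near-calm) -- the core argument for dynamic line rating.
static_rating = ampacity(wind_ms=0.6, ambient_c=40.0)
dynamic_rating = ampacity(wind_ms=6.0, ambient_c=25.0)
```

Even this crude model reproduces the qualitative claim in the abstract: the dynamic rating exceeds the static worst-case rating whenever the wind cools the conductor.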
ranizouaoui
Distributed application that includes features like short-term forecasts, long-term weather patterns, and real-time alerts for severe weather conditions. The app uses Kafka for handling live data streams, and a Python pipeline processes the weather information and stores it in a MongoDB database.
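The consume-and-alert side of such a pipeline can be sketched like this. The alert thresholds, field names, topic, and connection strings are all assumptions for illustration, not details from the project.

```python
def make_alert(reading: dict, wind_limit: float = 20.0, temp_limit: float = 40.0):
    """Return an alert document for severe conditions, or None.
    Thresholds and field names are illustrative."""
    reasons = []
    if reading.get("wind_ms", 0) >= wind_limit:
        reasons.append("high wind")
    if reading.get("temp_c", 0) >= temp_limit:
        reasons.append("extreme heat")
    if not reasons:
        return None
    return {"city": reading["city"], "ts": reading["ts"], "reasons": reasons}

if __name__ == "__main__":
    # Wiring sketch only -- requires running Kafka and MongoDB instances.
    import json
    from kafka import KafkaConsumer
    from pymongo import MongoClient

    db = MongoClient("mongodb://localhost:27017")["weather"]
    consumer = KafkaConsumer("weather-readings", bootstrap_servers="localhost:9092",
                             value_deserializer=lambda v: json.loads(v.decode("utf-8")))
    for msg in consumer:
        db.readings.insert_one(msg.value)       # store every reading
        alert = make_alert(msg.value)
        if alert:
            db.alerts.insert_one(alert)         # store only severe-weather alerts
```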
bbarker505
A final production version of the DDRP platform that includes cohorts, parallel processing, and improved mapping routines. The objective of the Degree-Day, establishment Risk, and Pest event mapping system (DDRP) is to predict phenology and climate suitability of invasive, biocontrol, and IPM species for the conterminous United States. DDRP is written entirely in the R statistical programming language (R Development Core Team 2019), making it flexible and extensible, and has a simple command-line interface that has already been adapted for online use. The platform can use a variety of gridded weather and climate data types for any historical (post-hoc), real-time, or future (downscaled GCM) time period. Model products include gridded (raster) and graphical outputs of the number of completed generations, phenological/pest events, and climate suitability (i.e., establishment risk maps). The platform is described in a peer-reviewed paper in PLoS ONE (https://doi.org/10.1371/journal.pone.0244005).
andrewmogbolu2
Blockchain and AI are on just about every chief information officer's watchlist of game-changing technologies that stand to reshape industries. Both technologies come with immense benefits, but both also bring their own challenges for adoption. It is also fair to say that the hype surrounding these technologies individually may be unprecedented, so the thought of bringing these two ingredients together may be viewed by some as brewing a modern-day version of IT pixie dust. At the same time, there is a logical way to think about this mash-up that is both sensible and pragmatic. Today, AI is for all intents and purposes a centralized process. An end user must have extreme faith in the central authority to produce a trusted business outcome. By decentralizing the three key elements of AI — that is, data, models, and analytics — blockchain can deliver the trust and confidence often needed for end users to fully adopt and rely on AI-based business processes. Let's explore how blockchain is poised to enrich AI by bringing trust to data, models and analytics.
Your data is your data
Many of the world's most notable AI technology services are centralized — including Amazon, Apple, Facebook, Google, as well as Chinese companies Alibaba, Baidu and Tencent. Yet all have encountered challenges in establishing trust among their eager, but somewhat cautious users. How can a business provide assurance to its users that its AI has not overstepped its bounds? Imagine if these AI services could produce a "forensic report," verified by a third party, to prove to you, beyond a reasonable doubt, how and when businesses are using your data once it is ingested. Imagine further that your data could be used only if you gave permission to do so. A blockchain ledger can be used as a digital rights management system, allowing your data to be "licensed" to the AI provider under your terms, conditions and duration.
The ledger would act as an access management system storing the proofs and permissions by which a business can access and use the user's data.
Trusted AI models
Consider the example of using blockchain technology as a means of providing trusted data and provenance of training models for machine learning. In this case, we've created a fictitious system to answer the question of whether a fruit is an apple or an orange. This question-answering system that we build is called a model, and this model is created via a process called training. The goal of training is to create an accurate model that answers our questions correctly most of the time. Of course, to train a model, we need to collect data to train on — for this example, that could be the color of the fruit (as a wavelength of light) and the sugar content (as a percentage). With blockchain, you can track the provenance of the training data as well as see an audit trail of the evidence that led to the prediction of why a particular fruit is considered an apple versus an orange. A business can also prove that it is not "juicing up" its books by tagging fruit more often as apples, if that is the more expensive of the two fruits.
Explaining AI decisions
The European Union has adopted a law requiring that any decision made by a machine be readily explainable, on penalty of fines that could cost companies billions of dollars. The EU General Data Protection Regulation (GDPR), which came into force in 2018, includes a right to obtain an explanation of decisions made by algorithms and a right to opt out of some algorithmic decisions altogether. Massive amounts of data are being produced every second — more data than humans have the ability to assess and use as the basis for drawing conclusions. However, AI applications are capable of assessing large data sets and many variables, while learning about or connecting those variables relevant to their tasks and objectives.
For this very reason, AI continues to be adopted in various industries and applications, and we are relying more and more on its outcomes. It is essential, however, that any decisions made by AI are still verified for accuracy by humans. Blockchain can help clarify the provenance, transparency, understanding, and explanations of those outcomes and decisions. If decisions and associated data points are recorded via transactions on a blockchain, the inherent attributes of blockchain will make auditing them much simpler. Blockchain is a key technology that brings trust to transactions in a network; therefore, infusing blockchain into AI decision-making processes could be the element needed to achieve the transparency necessary to fully trust the decisions and outcomes derived from AI.
Blockchain and the Internet of Things
More than a billion intelligent, connected devices are already part of today's IoT. The expected proliferation of hundreds of billions more places us at the threshold of a transformation sweeping across the electronics industry and many other areas. With the advancement of IoT, industries are now enabled to capture data, gain insight from the data, and make decisions based on the data. Therefore, there is a lot of "trust" in the information obtained. But the real question is: do we really know where these data came from, and should we be making decisions and transacting based on data we cannot validate? For example, did weather data really originate from a sensor in the Atlantic Ocean, or did the shipping container really not exceed the agreed temperature limit? The IoT use cases are massive, but they all share the same issue with trust. IoT with blockchain can bring real trust to captured data. The underlying idea is to give devices, at the time of their creation, an identity that can be validated and verified throughout their lifecycle with blockchain.
There is great potential for IoT systems in blockchain technology capabilities that rely on device identity protocols and reputation systems. With a device identity protocol, each device can have its own blockchain public key and send encrypted challenge and response messages to other devices, thereby ensuring a device remains in control of its identity. In addition, a device with an identity can develop a reputation or history that is tracked by a blockchain. Smart contracts represent the business logic of a blockchain network. When a transaction is proposed, these smart contracts are autonomously executed within the guidelines set by the network. In IoT networks, smart contracts can play a pivotal role by providing automated coordination and authorization for transactions and interactions. The original idea behind IoT was to surface data and gain actionable insight at the right time. For example, smart homes are a thing of the present, and almost everything can be connected. In fact, with IoT, when something goes wrong, these IoT devices can even take action — for example, ordering a new part. We need a way to govern the actions taken by these devices, and smart contracts are a great way to do so. In an ongoing experiment I have followed in Brooklyn, New York, a community is using a blockchain to record the production of solar energy and enable the purchase of excess renewable energy credits. The device itself has an identity and builds a reputation through its history of records and exchanges. Through the blockchain, people can aggregate their purchasing power more easily, share the burden of maintenance, and trust that devices are recording actual solar production. As IoT continues to evolve and its adoption continues to grow, the ability to autonomously manage devices and actions taken by devices will be essential. Blockchain and smart contracts are well positioned to integrate those capabilities into IoT.
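The apple-versus-orange training example above can be sketched with a tiny nearest-centroid classifier. This is a stand-in for real model training, and the feature values (dominant colour wavelength in nm, sugar content in %) are invented for illustration.

```python
def train_centroids(samples):
    """samples: list of (features, label); returns label -> mean feature vector."""
    sums, counts = {}, {}
    for feats, label in samples:
        acc = sums.setdefault(label, [0.0] * len(feats))
        for i, v in enumerate(feats):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in acc] for lbl, acc in sums.items()}

def classify(centroids, feats):
    """Pick the label whose centroid is closest in squared Euclidean distance."""
    return min(centroids,
               key=lambda lbl: sum((a - b) ** 2 for a, b in zip(centroids[lbl], feats)))

# Features: (dominant colour wavelength in nm, sugar content in %). Invented data.
training = [
    ((640, 10.4), "apple"), ((630, 10.0), "apple"),
    ((600, 9.0), "orange"), ((590, 9.4), "orange"),
]
model = train_centroids(training)
```

In the blockchain framing, it is exactly the `training` list (the provenance of each labeled sample) that the ledger would make auditable.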
It's a project combining hardware and software; the goal is to make a smart watch based on the ESP8266 chip. The smart watch has many features: time display, alarm, brightness adjustment, text scrolling, weather display, voice control, and gesture recognition. The first four features (time display, alarm alert, brightness adjustment, and text scrolling) are based on local programming. The basic idea for these is to study the ESP8266 documentation to learn how to set up the circuit, get the real time, and control the OLED screen, bus, light sensors, etc. The last three features depend mainly on cloud coding and are all based on a client/server architecture. For weather display, the ESP8266 served as a client. A connection was built via the Google Geolocation API between the smart watch (client) and the Google Weather Service server: the client sends a JSON string including location information to the server and gets the weather information back. For voice control, the ESP8266 was configured as a server via ngrok, and an Android application was used as the client. This app transfers the voice signal into text commands via the Google Speech API; the app (client) connects to the smart watch (server) by HTTP request and uses the translated commands to control the smart watch. For gesture recognition, the ESP8266 was the client and the server was an EC2 Linux instance set up in AWS. To teach the smart watch to identify gestures, both a training and a predicting process needed to be done. MongoDB is used to keep the large amount of accelerometer training data. An SVM model was selected for training, with the data for each gesture labeled. After training, if someone holds the smart watch and moves it in one of the preset gestures, the smart watch can map the accelerometer data to a specific gesture. The final recognition accuracy is 98%.
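Before the SVM sees anything, a gesture pipeline like the one above typically collapses each window of raw accelerometer samples into a fixed-length feature vector. The feature choice below (mean and standard deviation per axis) is an assumption for illustration, not taken from the project.

```python
import math

def window_features(samples):
    """Collapse a window of (x, y, z) accelerometer samples into a fixed-length
    feature vector (mean and standard deviation per axis) suitable as SVM input.
    A real pipeline would likely add more features (energy, correlations, FFT bins)."""
    feats = []
    n = len(samples)
    for axis in range(3):
        vals = [s[axis] for s in samples]
        mean = sum(vals) / n
        std = math.sqrt(sum((v - mean) ** 2 for v in vals) / n)
        feats.extend([mean, std])
    return feats
```

Each labeled gesture then contributes many such vectors to the training set, and prediction runs the same extraction on live windows.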
Quantitative Precipitation Estimation (QPE) based on weather radar observations plays a significant role in the understanding of weather events, especially in real time, where fast-evolving phenomena like convective storm cells can be dangerous. We wish to demonstrate QPE using deep learning as an alternative to the empirical relationship equations between rainfall rate and reflectivity that were developed in the past. QPE using radar reflectivity is one of the possible applications of deep learning in the weather radar field. Preprocessing this data and saving it in real time in the cloud would let users skip the time-consuming preprocessing step and get directly to the deep learning phase. To train and test deep learning models with radar data, we must align radar and rain gauge data in space and time. This preprocessing requires time- and resource-consuming processes that involve downloading, extracting, gridding, and aligning the radar data with respect to every gauge in the region. If this preprocessed dataset were readily available in real time, deep learning could be easily performed on it by anyone without going through the heavy computations required in the process. EarthCube's CHORDS tool is a real-time data service that can be used to store preprocessed data in the cloud so that it can be accessed whenever and wherever required. In this work, we demonstrate the preprocessing steps: accessing WSR-88D radar and NASA-TRMM rain gauge data, Cartesian gridding of the radar data, and aligning the radar data with gauge data in real time. The aligned data is stored in the cloud using CHORDS so that it is readily available to users who wish to use it for deep learning. The notebook also demonstrates the procedure for storing and retrieving the dataset from the CHORDS server and an example of the deep learning process on the downloaded dataset.
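The empirical reflectivity-to-rainfall relations that deep learning is meant to replace are power laws of the form Z = a·R^b; the classic Marshall-Palmer coefficients are a = 200, b = 1.6. A minimal sketch of inverting that relation:

```python
def rain_rate(dbz: float, a: float = 200.0, b: float = 1.6) -> float:
    """Invert the empirical Z-R power law Z = a * R**b (Marshall-Palmer
    defaults a=200, b=1.6) to estimate rain rate R in mm/h from dBZ."""
    z = 10.0 ** (dbz / 10.0)   # dBZ -> linear reflectivity factor Z (mm^6/m^3)
    return (z / a) ** (1.0 / b)
```

At 30 dBZ this gives roughly 2.7 mm/h; different radars and rain regimes use different (a, b) pairs, which is precisely the limitation a learned model tries to overcome.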
tatimohammed
Near real-time weather data processing
Santosh-2003-sahoo
Real-Time Data Processing System for Weather Monitoring with Rollups and Aggregates
AshishRana28
An IoT-based system to recommend suitable crops using real-time soil and weather data processed with machine learning algorithms
yeshwanthlm
This is a hardware and software system for real-time monitoring and detection of forest fires. With its help, remote recognition of forest fires is possible, as well as high-accuracy positioning of the flame base. The hardware part of the system consists of a set of intelligent sensors installed inside the forest. The action range of the sensors is 250-500 m, depending on the RF device and type of sensor. Video cameras, infrared imagers, and other intelligent equipment are used as sensors. They detect fire areas by a number of measures and under different conditions. If a sensor detects a fire, the information is transmitted to the control unit via various communication channels: optical, radio, wire, GSM, etc. In this way, forest data are transmitted to the software part of the project, where they are processed and analyzed. The system automatically finds and identifies the fire area. Thereafter, the information is passed to special departments via a built-in alerting service, the Internet, and even mobile networks.
Characteristics of the system:
• Fire detection accuracy: up to 250 m
• Direction detection accuracy: 0.5°
• Possibility to integrate data from other information sources (weather and satellite information)
• Possibility of efficient scaling and broadening of the system for coverage range extension
• Number of users: unlimited
• Possibility to get information on mobile phones
• Automatic detection of potentially dangerous objects (smoke and flame)
Advantages of the system:
1. Automation of monitoring
2. Centrally managed monitoring of large areas
3. Opportunity to detect fires at an early stage and track their spread
4. High accuracy of fire detection
5. Decreased role of the human factor in detecting fires
6. Low cost of installation and operation compared with other forms of monitoring
7. Flexibility of the system depending on relief and customer needs
juitawde
WeatherScope is a web-based dashboard that fetches real-time weather data for multiple cities using a weather API. It processes the data to calculate statistics such as average, highest, and lowest temperatures 🌦️
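The average/highest/lowest statistics such a dashboard computes reduce to a small rollup over (city, temperature) pairs. A minimal stdlib sketch (the grouping shape is assumed, not taken from the project):

```python
def rollup(readings):
    """Per-city min/max/mean temperature from (city, temp_c) pairs."""
    by_city = {}
    for city, temp in readings:
        by_city.setdefault(city, []).append(temp)
    return {
        city: {"min": min(t), "max": max(t), "avg": sum(t) / len(t)}
        for city, t in by_city.items()
    }
```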
No description available
Hridya2001
Project files for a real-time weather data pipeline integrating AWS services and Snowflake for data processing, prediction, and analysis.
WANDA (Weather Analysis and Notification Data Assistant) is a Python/Node.js system that fetches and analyzes real-time weather data for any city. Using a state machine for workflow management, it determines notification needs based on current conditions. WANDA efficiently integrates data retrieval, processing, and user alerts in one package.
YasmineHabchi
This project streams real-time weather data using WeatherAPI, processes it with Apache Kafka and PySpark, and performs time series forecasting using a Deep LSTM model. It stores the processed data in Delta Lake and performs dynamic and non-dynamic time series forecasting using historical weather data to predict future patterns with the SARIMAX model.
mohamed-zakariya
A real-time sensor data platform designed to collect, process, and visualize information from environmental and traffic-related sensors. The system provides a centralized dashboard that delivers actionable insights on weather conditions, road congestion, and more, enabling smarter monitoring and decision-making.
HarshVardhanSh
Data Source: Satellite Image from the WIFIRE Project. WIFIRE is an integrated system for wildfire analysis, with specific regard to changing urban dynamics and climate. The system integrates networked observations, such as heterogeneous satellite data and real-time remote sensor data, with computational techniques in signal processing, visualization, modeling, and data assimilation to provide a scalable method to monitor phenomena such as weather patterns that can help predict a wildfire's rate of spread. You can read more about WIFIRE at: https://wifire.ucsd.edu/ In this example, we will analyze a sample satellite image dataset from WIFIRE using the NumPy library.
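The kind of per-channel NumPy analysis described can be sketched as follows; a random array stands in for the WIFIRE image, and the brightness threshold is an arbitrary illustrative value.

```python
import numpy as np

# Stand-in for a satellite image: a small random RGB array (height x width x 3).
# A real image would be loaded from the WIFIRE dataset instead.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)

channel_means = img.mean(axis=(0, 1))    # one mean per R, G, B channel
bright = img.mean(axis=2) > 200          # boolean mask of bright pixels
bright_fraction = bright.mean()          # share of bright pixels in the scene
```

Boolean masks like `bright` are the basic NumPy idiom for isolating regions of interest (e.g. candidate smoke or flame pixels) before further analysis.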
A real-time weather monitoring system for major Indian cities using the OpenWeatherMap API. It features interactive temperature charts and alerts for critical conditions. Built with React and TypeScript, it leverages Vite, Tailwind CSS, React Query, and Recharts for efficiency and responsiveness, ensuring scalability and maintainability.
amritrout
Weather-Monitor is a real-time data processing system designed to monitor weather conditions and provide summarized insights using rollups and aggregates.
mbataiev
The Microservices Weather Analyzer Application is a sophisticated and robust weather monitoring and analysis system designed to provide real-time weather data processing, storage, analytics, and notifications.
msreekanth02
An intelligent weather application powered by AI and NLP. Get real-time weather data, AI-powered analysis, and conversational recommendations using advanced natural language processing.
Pranay0205
The Real-time Weather Data Analysis System is a cloud-based platform leveraging AWS services to collect, process, and analyze weather data from multiple sources in real-time. This system provides actionable insights for various applications including agriculture, transportation, and emergency response services.
Bayzid03
🌦️ An automated Airflow pipeline that extracts real-time weather data, transforms it through ETL processes, and loads it into PostgreSQL for analysis. This project showcases workflow orchestration, data processing, and database operations using industry-standard tools.
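An Airflow pipeline of this shape usually keeps the transform as a pure function and wires it into a DAG. A minimal sketch, where the DAG id, schedule, and payload field names are assumptions, not the project's actual configuration:

```python
def transform(raw: dict) -> dict:
    """Transform step of the ETL: flatten the API payload and convert units.
    Field names are illustrative, not taken from the project."""
    return {
        "city": raw["name"],
        "temp_c": round(raw["main"]["temp"] - 273.15, 1),
        "description": raw["weather"][0]["description"],
        "recorded_at": raw["dt"],
    }

if __name__ == "__main__":
    # DAG wiring sketch only -- requires an Airflow installation and a
    # configured PostgreSQL connection for the load step.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    with DAG("weather_etl", start_date=datetime(2024, 1, 1),
             schedule="@hourly", catchup=False) as dag:
        PythonOperator(task_id="extract_transform_load",
                       python_callable=lambda: None)  # E/T/L would be chained here
```

Keeping `transform` free of Airflow imports lets it be unit-tested outside the scheduler.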
nhawtanhy
A big data project that builds an end-to-end real-time pipeline for collecting, processing, and analyzing weather data in Vietnam. The system leverages Apache Kafka and Apache Spark for streaming and batch processing, integrates with distributed storage for scalability, and applies forecasting models to predict short-term weather trends.
mariaelhoudaigui
This project implements a Big Data pipeline for real-time monitoring and analysis of weather data. Its goal is to continuously collect meteorological information, process and analyze it efficiently, and provide actionable insights for tracking weather patterns and trends over time. The system is designed to ensure reliable data flow and timely processing.
BearHuddleston
PipelineIQ is a modern data processing and analysis platform that transforms raw data into actionable insights through a seamless pipeline architecture. The platform integrates real-time data from cryptocurrency and weather APIs, processes it efficiently, and leverages AI to deliver comprehensive analysis.
Harsh5225
A scalable Spring Boot backend service that fetches, processes, and caches real-time weather data using the Weather API. Built with clean layered architecture and optimized persistence to deliver low-latency, production-ready APIs.
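Caching real-time weather responses typically means a time-to-live (TTL) entry keyed by city. The project itself is Spring Boot, but the idea can be sketched language-agnostically in Python; the injectable clock is a design choice that makes expiry testable without sleeping.

```python
import time

class TTLCache:
    """Minimal time-to-live cache for API responses. The clock is injectable
    so the expiry logic can be tested without real waiting."""
    def __init__(self, ttl_seconds: float, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock
        self._store = {}

    def put(self, key, value):
        self._store[key] = (value, self.clock())

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, stored_at = entry
        if self.clock() - stored_at > self.ttl:
            del self._store[key]   # expired: evict and report a miss
            return None
        return value
```

On a miss the service would call the upstream Weather API and `put` the fresh response, keeping latency low for repeated lookups of the same city.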
rishraks
Weather Forecast Application – A full-stack application that provides real-time and forecasted weather data. The backend, built with Spring Boot (WebFlux), fetches and processes weather information from external APIs, while the frontend, developed in Next.js, offers a clean and responsive user interface for viewing the forecasts.