Found 20 repositories (showing 20)
aws-samples
This workshop shows you how to build a web application that demonstrates how easy it is to create data-driven web applications, all with no servers. You will build a serverless web application that lets users search for popular tourist destinations. The application will use AWS AppSync and the AWS Serverless platform to provide real-time weather analysis of the indexed destinations.
a weather station using Raspberry Pi, weather sensors, AWS Cloud, and Machine learning for data analysis
abhinabasadhu
• Create a website that displays numerical data, predictions about the numerical data, and the results of sentiment analysis.
• The numerical data will be obtained from web services; it cannot be obtained from web scraping. For example, it could be product price data from web services, stock prices, exchange rate prices, weather, football results, etc.
• The text data for sentiment analysis will be obtained from web services, such as the Twitter API or Facebook Graph.
• Machine learning will be used to make predictions about future values of the data.
• Also display synthetic data that we will provide, to check data visualization and machine learning.
• All third-party data will be stored in the cloud.
• The front end of the website only has to display visualizations of the data, predictions about the data, and the results of the sentiment analysis. No other functionality is required.
• The code that downloads data from web services and uploads it to the cloud must be written in TypeScript.
• The website will be hosted on the cloud using serverless technology. Lambda functions on the server can be written in any programming language (JavaScript is recommended).
• The front end of the website can use ordinary JavaScript or a JavaScript framework.
• WebSockets will push new data items to subscribed clients.
• The coursework and teaching materials will be based on Amazon Web Services (AWS). You may use a different cloud provider; however, we will only be able to provide very limited support for projects that are based on a different cloud provider.
This project implements a batch data ingestion and transformation pipeline for car rental data using Python, PySpark, Airflow, GCP Dataproc, and Snowflake.
prayagnshah
This project showcases the implementation of a data pipeline using Apache Airflow. Leveraging the OpenWeather API, it efficiently fetches real-time weather data and performs ETL processing. Results are seamlessly stored in AWS S3 buckets for further analysis. Moreover, integrated Slack notifications ensure I receive timely alerts.
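The transform step in an Airflow pipeline like this one can be sketched as a plain Python function, assuming the OpenWeather current-weather JSON layout (the `main`, `wind`, and `dt` fields); the DAG wiring, API call, and S3 upload are omitted so the sketch stays self-contained.

```python
import json
from datetime import datetime, timezone

def transform_weather(payload: dict) -> dict:
    """Flatten an OpenWeather current-weather response into a tabular record."""
    return {
        "city": payload["name"],
        "temp_c": round(payload["main"]["temp"] - 273.15, 2),  # the API returns Kelvin by default
        "humidity_pct": payload["main"]["humidity"],
        "wind_mps": payload["wind"]["speed"],
        "observed_at": datetime.fromtimestamp(payload["dt"], tz=timezone.utc).isoformat(),
    }

# Example response trimmed to the fields used above
sample = {
    "name": "Halifax",
    "main": {"temp": 288.15, "humidity": 80},
    "wind": {"speed": 5.1},
    "dt": 1700000000,
}
print(json.dumps(transform_weather(sample), indent=2))
```

In an Airflow DAG this function would typically run inside a PythonOperator task, with the flattened records then written to S3.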
Rajender55
Orchestration of data ingestion from a weather API using Airflow, with storage and analysis in AWS.
vidhi990
An interactive weather forecasting website developed using AWS SageMaker and DynamoDB, implementing a complete data science pipeline and sentiment analysis.
This project is an overview of a Weather Data Analysis Pipeline that extracts live weather data from weather APIs and loads it into Redshift after the required transformations. The project uses AWS services such as S3, CodeBuild, Airflow, Glue, and Redshift.
This repository demonstrates how to build an automated weather data pipeline using Apache Airflow to extract data from the OpenWeatherMap API and store it in an AWS S3 bucket for further analysis.
Our project builds a real-time data pipeline that collects weather data from an external API, processes it using AWS services, and loads it into Snowflake for instant analysis. It’s a fully automated, serverless solution designed for scalable and efficient weather data insights.
hassandsriaz
SparkClimate Pakistan is a data processing pipeline designed to improve weather data accessibility and reliability. Built using Apache Spark on AWS EMR, this project automates data ingestion, transformation, and storage to support climate analysis and decision-making.
Naga-Manohar-Y
This data pipeline is designed to extract weather data from the OpenWeather API, store it in Amazon S3, process it using AWS Glue, and finally load the processed data into Amazon Redshift for further analysis. The project uses Amazon MWAA to orchestrate the entire process and AWS CodeBuild for CI/CD.
iamshivapandey
In this project, I fetch live weather data using RapidAPI. A Kafka producer streams the data to a Kafka consumer, which dumps it into an AWS S3 bucket. The data is then loaded from S3 using PySpark DataFrames, and after cleaning, filtering, etc., the cleaned data is saved to HDFS in CSV format for further analysis.
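The cleaning-and-filtering stage of a pipeline like this can be sketched without a Spark cluster; the version below uses plain Python on a list of records, and the field names (`city`, `temp_c`) and validity thresholds are illustrative assumptions, not the repository's actual schema.

```python
import csv
import io

def clean_weather_records(records):
    """Drop rows with missing temperatures and filter out implausible readings."""
    cleaned = []
    for row in records:
        temp = row.get("temp_c")
        if temp is None:
            continue  # missing reading: drop the row
        if not -90.0 <= float(temp) <= 60.0:
            continue  # outside plausible surface temperatures: treat as sensor/API noise
        cleaned.append({"city": row["city"], "temp_c": float(temp)})
    return cleaned

def to_csv(records) -> str:
    """Serialize cleaned records to CSV, the format the project writes to HDFS."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["city", "temp_c"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

raw = [
    {"city": "Lahore", "temp_c": 31.5},
    {"city": "Karachi", "temp_c": None},  # dropped: missing value
    {"city": "Quetta", "temp_c": 999.0},  # dropped: implausible
]
print(to_csv(clean_weather_records(raw)))
```

In PySpark the same logic would be a `filter` on the DataFrame followed by `df.write.csv(...)`.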
diegovillatoromx
This repository contains a project dedicated to the analysis of meteorological data using simulated datasets and AWS services. The goal is to build a dashboard that visualizes climate patterns and facilitates the interpretation of weather data for informed decision-making.
ragini-karunanithi
This project demonstrates a real-time data pipeline leveraging AWS services like DynamoDB, EventBridge, and Lambda, along with Snowflake. It integrates a weather API to ingest live data into DynamoDB and seamlessly streams it to Snowflake using Snowpipe for further analysis and reporting.
bhavanachitragar
This project uses AWS Lambda to fetch live weather data from an API and store it in DynamoDB. Changes in DynamoDB trigger another Lambda function to format and store the data in CSV files on Amazon S3. Snowflake then automatically ingests this data from S3 using Snowpipe for real-time analysis and visualization, ensuring efficient data processing.
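The second Lambda in a design like this, which formats DynamoDB stream records as CSV for S3, can be sketched as follows. The attribute names are assumptions, and the S3 upload is replaced by a return value so the sketch is self-contained; the type-descriptor wrapping (`{"S": ...}`, `{"N": ...}`) is how DynamoDB stream images actually arrive.

```python
import csv
import io

def stream_records_to_csv(event: dict) -> str:
    """Convert DynamoDB-stream INSERT records into a CSV string for upload to S3."""
    rows = []
    for rec in event.get("Records", []):
        if rec.get("eventName") != "INSERT":
            continue  # only export new weather readings
        img = rec["dynamodb"]["NewImage"]
        # DynamoDB stream images wrap each value in a type descriptor
        rows.append({
            "city": img["city"]["S"],
            "temp_c": img["temp_c"]["N"],
            "observed_at": img["observed_at"]["S"],
        })
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["city", "temp_c", "observed_at"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

event = {"Records": [{
    "eventName": "INSERT",
    "dynamodb": {"NewImage": {
        "city": {"S": "Berlin"},
        "temp_c": {"N": "12.4"},
        "observed_at": {"S": "2024-01-01T00:00:00Z"},
    }},
}]}
print(stream_records_to_csv(event))
```

In the deployed function, the returned string would be written to S3 (e.g. via `boto3`'s `put_object`), from where Snowpipe picks it up.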
OlawumiSalaam
This project demonstrates the creation of an automated ETL pipeline using Apache Airflow on an AWS EC2 instance. The pipeline extracts real-time weather data from the OpenWeather API, performs transformations, and loads the processed data into an Amazon S3 bucket for storage and further analysis.
roniketraut
This project implements an automated end-to-end ETL data pipeline that extracts daily weather information for a predefined list of global cities, transforms the collected data, and loads it into an AWS S3 bucket for storage and further analysis. The entire workflow is orchestrated using Apache Airflow.
An IoT Weather Monitoring System uses sensors (e.g., DHT22 for temperature/humidity, BMP180 for pressure) connected to a microcontroller (e.g., ESP8266). The microcontroller reads sensor data and transmits it to a cloud platform (e.g., AWS IoT Core) for storage and visualization, enabling real-time environmental monitoring and analysis.
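The device-to-cloud step in a system like this amounts to packaging one round of sensor readings as a JSON payload and publishing it to an MQTT topic on AWS IoT Core. The sketch below builds such a payload in Python; the topic name, device ID, and field layout are illustrative assumptions, and the firmware-side sensor reads and MQTT publish are omitted.

```python
import json
from datetime import datetime, timezone

def build_iot_payload(device_id: str, temp_c: float,
                      humidity_pct: float, pressure_hpa: float) -> str:
    """Package one round of sensor readings as a JSON MQTT payload."""
    return json.dumps({
        "device_id": device_id,
        "temperature_c": temp_c,       # from the DHT22
        "humidity_pct": humidity_pct,  # from the DHT22
        "pressure_hpa": pressure_hpa,  # from the BMP180
        "ts": datetime.now(timezone.utc).isoformat(timespec="seconds"),
    })

# A device would publish this to a topic such as "weather/station1/readings"
payload = build_iot_payload("station1", 21.7, 55.0, 1013.2)
print(payload)
```

On the cloud side, an AWS IoT rule would typically route messages on that topic into storage (e.g. DynamoDB or S3) for visualization.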