Found 18 repositories (showing 18)
hk60906632
Diabetic Retinopathy (DR) is an eye-related disease that reduces the integrity of the blood vessels in the retinal layers, leading to retinal blood vessel leakage [2]. Sodium Fluorescein Angiography (FA) is widely used to monitor the leakage or permeability of the vessels by imaging the back of the eye, and has important diagnostic value. Gamez [2], a PhD student at the University of Bristol, performed FA on mice. Gamez [2] manually extracted fluorescent intensity data from the resulting FA videos and plotted a graph of the fluorescence intensity ratio (FIR) versus time to obtain the gradient, which is the solute flux (ΔIf/Δt). These data were then used to assist the development of a Fick's-Law-adapted equation, P = (ΔIf/Δt)/(ΔC × A), to obtain the permeability of the vessel. The obstacle of this method was that the manual data-capturing process was too time-consuming. The method also required many manual adjustments because of camera movement caused by the heartbeat of the mice and their eyeball motion. The camera movement also produced blurry, unsharp images in the FA videos, which led to inaccurate fluorescent intensity readings. This project developed a more intelligent way of capturing the data using OpenCV with Python. The project first experimented with K-means clustering to segment the exchange-vessel groups and the large-vessel group out of the FA frames to obtain the FIR for the FIR-versus-time graph. Two K-means settings were compared: one used random initial centers, and the other used the centers found by K-means on the previous frame as the initial centers for the current frame (reuse-center K-means clustering). The experiments found that random-initial-center K-means produced a stable FIR when the maximum iteration count was 7 or above, and that the best epsilon (stopping accuracy) was 0.1.
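The two initialization settings can be illustrated with a minimal NumPy sketch (this is not the project's actual OpenCV `cv2.kmeans` code; the function name, the toy frames, and the parameter choices here are illustrative assumptions):

```python
import numpy as np

def kmeans_1d(pixels, k, max_iter=7, eps=0.1, init_centers=None, seed=0):
    """Minimal 1-D k-means on pixel intensities. init_centers=None gives the
    random-initial-centers setting; passing the previous frame's centers
    gives the reuse-center setting."""
    rng = np.random.default_rng(seed)
    if init_centers is None:
        # draw k distinct intensity values as the random initial centers
        centers = rng.choice(np.unique(pixels), size=k, replace=False).astype(float)
    else:
        centers = np.asarray(init_centers, dtype=float).copy()
    for _ in range(max_iter):
        # assign each pixel to its nearest center, then recompute centers
        labels = np.argmin(np.abs(pixels[:, None] - centers[None, :]), axis=1)
        new = np.array([pixels[labels == j].mean() if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.max(np.abs(new - centers)) < eps:  # epsilon stopping criterion
            centers = new
            break
        centers = new
    labels = np.argmin(np.abs(pixels[:, None] - centers[None, :]), axis=1)
    return centers, labels

# reuse-center variant: warm-start each toy "frame" with the previous centers
frames = [np.concatenate([np.full(50, 30.0), np.full(50, 200.0)]) + i
          for i in range(3)]
centers = None
for f in frames:
    centers, labels = kmeans_1d(f, k=2, max_iter=7, init_centers=centers)
```

Warm-starting from the previous frame's centers converges in very few iterations because consecutive FA frames are similar, which matches the shorter execution times reported above.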
Maximum iteration counts below 7 could not be used because the FIR-versus-time graph showed a large amount of noise and severe deformation. Conversely, reuse-center K-means showed no deformation or noise on the FIR-versus-time graph even when the maximum iteration count was 7 or below, and it had a much shorter execution time than random-initial-center K-means. The difference in the gradient of the FIR-versus-time graph between the two K-means settings was then examined further. Random-initial-center K-means showed fluctuation in the gradient value when the maximum iteration count was between 7 and 15. Reuse-center K-means showed either an ascending or descending trend in the gradient value when the maximum iteration count was below 7, and the gradient value stabilized when the maximum iteration count was between 7 and 15. Reuse-center K-means clustering was therefore implemented in the final software, with a default maximum iteration count of 7 to prioritize gradient accuracy over execution time, while allowing the user to lower the maximum iteration count to reduce execution time. The project then experimented with blurry-frame classification using Sobel edge detection. Convolution with the Sobel derivative operator was performed on each FA frame to obtain an edge sharpness value. The edge-sharpness-versus-frame-number graphs were examined for all videos, and a clear separation between sharp and blurry frames was found in the edge sharpness values: sharp frames had higher edge sharpness and blurry frames had lower edge sharpness. A piece of code was created to loop over all the data points in the edge-sharpness-versus-frame-number graph to classify sharp and blurry frames.
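The Sobel edge-sharpness measure can be sketched as follows (a plain-NumPy correlation standing in for OpenCV's Sobel filter; the function name, the mean-gradient-magnitude summary, and the toy stripe images are illustrative assumptions):

```python
import numpy as np

def edge_sharpness(frame):
    """Edge sharpness of one frame: mean gradient magnitude after
    correlating the image with the 3x3 Sobel derivative kernels."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = frame.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(3):                   # accumulate the 3x3 correlation
        for j in range(3):
            patch = frame[i:i + h - 2, j:j + w - 2]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    return float(np.mean(np.hypot(gx, gy)))

# a crisp stripe pattern scores higher than a smoothed copy of itself
sharp = np.tile(np.tile(np.array([0.0, 0.0, 255.0, 255.0]), 8), (32, 1))
blurry = (sharp + np.roll(sharp, 1, axis=1) + np.roll(sharp, -1, axis=1)) / 3
```

Sharp frames keep large intensity jumps at vessel edges, so their mean gradient magnitude is higher; blurring flattens the jumps and lowers the score, which is the separation the graphs above rely on.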
The code first checks whether the range of several neighboring data points (PtPbox) is larger than a specific value (the tolerance value); if so, the data point needs a sharpness check, which takes the mean of several neighboring data points (meanBox) and checks whether the current data point's edge sharpness is lower or higher than that mean. Lower means a blurry frame and higher means a sharp frame. A series of experiments found the optimal values to be a PtPbox of 20, a meanBox of 10, and a tolerance value of 0.1, with no histogram equalization required. The sharp-frame identification accuracy was above 80% and the blurry-frame identification accuracy was above 96% for all the tested FA videos. All these experimental codes were then connected through a graphical user interface based on Python with PyQt4. Finally, PyInstaller was used to package the Python code into a stand-alone Microsoft Windows executable for Gamez [2] to use.
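The PtPbox/meanBox scheme described above can be sketched like this (a hedged reconstruction from the description, not the project's actual code; the function name, centered windows, and the sharp-by-default fallback when the neighborhood range is within tolerance are assumptions):

```python
import numpy as np

def classify_frames(sharpness, ptp_box=20, mean_box=10, tol=0.1):
    """Label each frame sharp (True) or blurry (False) from its edge
    sharpness, following the PtPbox / meanBox scheme: only when the
    neighborhood's range exceeds the tolerance is the point compared
    against its local mean."""
    sharpness = np.asarray(sharpness, dtype=float)
    n = len(sharpness)
    labels = np.ones(n, dtype=bool)          # assumption: default to sharp
    for i in range(n):
        lo, hi = max(0, i - ptp_box // 2), min(n, i + ptp_box // 2 + 1)
        if np.ptp(sharpness[lo:hi]) > tol:   # neighborhood varies enough?
            mlo = max(0, i - mean_box // 2)
            mhi = min(n, i + mean_box // 2 + 1)
            labels[i] = sharpness[i] >= sharpness[mlo:mhi].mean()
    return labels

# sharp frames near 1.0 with a run of blurry frames near 0.2
s = np.array([1.0] * 10 + [0.2] * 5 + [1.0] * 10)
flags = classify_frames(s)
```

With the defaults above (PtPbox 20, meanBox 10, tolerance 0.1) the low-sharpness run is flagged blurry while the surrounding frames stay sharp.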
gouthamiexcelr
Cloud computing is, in essence, sharing and storing data in a way that helps businesses replace high infrastructure expenses with low variable costs that scale with the business. Businesses do not need to plan for and acquire many advanced servers, and they can deliver results quickly by accessing servers in the cloud virtually within a few minutes. The cloud trend is rapidly transforming information technology away from the on-premises computing and software approach, where organizations must assemble and manage the resources themselves in data centres. Businesses have moved from buying or leasing to the new way, cloud computing, where they rent computing, software, and services instead of buying them, and the cloud provider manages all the undifferentiated heavy lifting of the underlying infrastructure. As more businesses move to the cloud, they are flocking to AWS, providing money the company can reinvest into launching new services and features every day. There is a lot of cloud noise out there, causing confusion for potential buyers of cloud computing about whom to buy from: should they go with AWS, Microsoft, Google, Oracle, IBM, or others? The list of cloud suppliers grows longer every day, and every cloud supplier hypes some sort of victory or benefit for customers. Many cloud competitors are quickly trying to build their own clouds as replicas of AWS, while others have abandoned the cloud altogether. But many of these vendors are internally conflicted, trying to protect and extend the runway for their existing legacy products, which stifles their innovation. Amazon offered new web services in the form of IT infrastructure services known as Amazon Web Services for cloud computing. Initially, AWS was primarily used by small developers as a cheap way to test things or run a simple website. Now most companies that started with small experimental apps on AWS are moving the core of their growing businesses there.
The early rise of AWS was driven by developers building small and cool apps. AWS's focus on its customers means it is not afraid to invent new technology itself. AWS's competitive strategy became clear, and it is twofold. First, keep inventing and releasing more services that solve customers' problems faster than the competition. Second, don't be afraid to change, which means cannibalizing existing offerings with new, easier, faster, more cost-effective, and more efficient ones. The question for businesses looking to move to the cloud is to determine which technology is genuinely cloud technology and which is just retrofitted legacy technology with the term "cloud" affixed to its name. As businesses start seeing the benefits of the cloud and the financial results that come with it, they want more cloud, not less. AWS's thundering pace of innovation and scale, which has created more capability for customers, has also increased its total addressable market. AWS launches more than a thousand features and services every year to make itself easier to use. Security and privacy concerns are obstructing cloud deployment in Malaysia. The most significant issue is comprehending, and then minimizing, the risks associated with cloud deployment. This problem is greater in off-premise clouds, where contents are stored on the cloud vendor's premises. Although data owners have full control over the infrastructure, hardware resources, and software systems of their traditional physical servers or on-premise clouds, it is nontrivial to manage the risk of letting contents sit on third-party premises. In this situation, the data owner has limited means to monitor his or her property on the virtual servers in the absence of advanced, detailed cloud-monitoring tools. Another challenge that decelerates cloud adoption is information disclosure from government agencies.
The market wave of the cloud will create a great deal of value, and Amazon is betting that AWS, along with its partner ecosystem of developers and companies, will capture most of that value. An industry shift happens with every big technology innovation, whenever something becomes much more cost-effective and much faster to get done. People consume more technology, with a limitless supply of ideas. More and more people are actually using the cloud and building more things, and will continue to in the future.
TheGameStopsNow
TISA is a novel algorithm for aligning and comparing financial time series in a way that is invariant to non-linear transformations (like different volatility regimes or trends) and robust to noise. It combines segmentation with dynamic programming to find the optimal alignment between two series.
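TISA's own details aren't reproduced here, but the dynamic-programming alignment idea it builds on can be illustrated with classic dynamic time warping (a generic DP sketch, explicitly not the TISA algorithm; the function name and toy series are illustrative):

```python
import numpy as np

def dtw_align(a, b):
    """Dynamic-time-warping alignment cost between two series: the DP
    table D[i, j] holds the cheapest cost of aligning a[:i] with b[:j]."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend from a match, an insertion, or a deletion
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# a series and a time-shifted copy align more cheaply than pointwise comparison
x = np.sin(np.linspace(0, 6, 60))
y = np.sin(np.linspace(0, 6, 60) - 0.4)
```

Because the DP may stretch or compress either series locally, the alignment cost never exceeds the pointwise L1 distance, which is the kind of invariance to local distortions the description above aims at.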
DaethRa
End-to-End Data Pipeline (n8n, LLM, Supabase) and Statistical EDA (Python) to separate real tech trends from noise in the Data Analyst job market (2022-2026).
maira-prog
Code for the paper "Improving Forecasts of Climatic Variables with Noise or Sharp Transitions through Trend-Aware Neural Networks"
JJFrisch
Algorithms to analyze data with noise or complex speeds. Deals with acceleration, speed, trends, and maxima and minima. Based on an AP Physics lab.
Jashan122005
This project involves time series decomposition to analyze daily website traffic. Using Python (statsmodels) or R, we clean the data and apply additive or multiplicative models to isolate the underlying Trend, Seasonality (weekly cycles), and Residuals (noise).
Repository for the Braced Fourier Continuation and Regression (BFCR) algorithm, a novel and computationally efficient means of finding nonlinear regressions or trend lines in arbitrary data sets with a non-trivial amount of noise.
yobuuubub
The WriteWay is a youth journalism platform for clear, credible reporting. Young writers publish stories with editorial review and strong safety standards. It is built for depth, clarity, and accountability over noise, metrics, or viral trends.
xfordx058
Trendiyo is a curated affiliate discovery web platform that highlights trending products from marketplaces like Shopee and TikTok Shop. It focuses on clean UI, fast performance, and product curation, helping users discover what's popular without noise or clutter.
aarthiraju2003-creator
This project focuses on designing, implementing, and interpreting an advanced deep learning model for multi-step time series forecasting, emphasizing both high predictive accuracy and model explainability. A complex multivariate time series dataset (with seasonality, trend, and noise) is generated or acquired.
ShafinMirkar
Vubix is a modern video-sharing platform built for creators and viewers who crave freedom, creativity, and community. Whether you're uploading your first vlog, streaming live gameplay, or exploring trending content, Vubix offers a fast, clean, and user-friendly experience—without the noise.
This project implements a complete deep learning pipeline for forecasting complex, multivariate time-series data using a stacked LSTM neural network. The dataset is programmatically generated to simulate realistic non-stationary behavior with seasonality, trend, and noise—similar to energy consumption or financial market data.
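The programmatic data generation and windowing this pipeline relies on can be sketched as follows (the function names, parameter values, and lookback/horizon choices are illustrative assumptions, not the repository's code):

```python
import numpy as np

def make_series(n=1000, n_vars=3, seed=0):
    """Synthetic non-stationary multivariate series: per-variable linear
    trend, a sinusoidal seasonal cycle, and Gaussian noise."""
    rng = np.random.default_rng(seed)
    t = np.arange(n)
    trend = rng.uniform(0.01, 0.05, n_vars)[None, :] * t[:, None]
    season = np.sin(2 * np.pi * t[:, None] / rng.integers(24, 168, n_vars))
    noise = rng.normal(0, 0.3, (n, n_vars))
    return trend + season + noise

def to_windows(series, lookback=48, horizon=12):
    """Slice the series into (input window, multi-step target) pairs,
    the supervised shape a stacked LSTM forecaster trains on."""
    X, Y = [], []
    for i in range(len(series) - lookback - horizon + 1):
        X.append(series[i:i + lookback])
        Y.append(series[i + lookback:i + lookback + horizon])
    return np.stack(X), np.stack(Y)

data = make_series()
X, Y = to_windows(data)
```

Each sample is a `(lookback, n_vars)` input mapped to a `(horizon, n_vars)` multi-step target, matching the multi-step forecasting setup described above.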
shams01-bit
NewzPitara delivers fast, reliable, and well-curated updates from across the world. It combines top headlines, trending stories, and real-time weather insights in one clean dashboard. The platform focuses on accuracy, clarity, and speed—so users get the most important information without noise or clutter.
RichieGarafola
Time series decomposition breaks a single time series down into different parts, each representing a pattern that you can try to model and predict. The patterns usually fall into three categories: trend, seasonality, and noise. Time series decomposition models are additive, multiplicative, or some combination of both.
Flaskhan
This is the code used in the study Khan H., Laas A., Marcé R., Obrador B., "Major effects of alkalinity on the relationship between metabolism and dissolved inorganic carbon dynamics in lakes". The code was used in that study to isolate the diel metabolic signal (a 24-hour seasonal pattern) from high-frequency measurement time series of dissolved oxygen and dissolved inorganic carbon. It can be used to extract seasonal patterns, trends, or noise signals from a time series using Singular Spectrum Analysis (SSA).
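The SSA decomposition this repository applies can be sketched in a minimal form (a generic illustration, not the study's code; the function name, window length, and component grouping are assumptions — in practice grouping is chosen by inspecting the singular spectrum):

```python
import numpy as np

def ssa(x, L, groups):
    """Basic SSA: embed the series into a trajectory (Hankel) matrix,
    take its SVD, and reconstruct each group of components by
    anti-diagonal averaging."""
    N = len(x)
    K = N - L + 1
    X = np.column_stack([x[k:k + L] for k in range(K)])  # column k = x[k:k+L]
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    out = []
    for idx in groups:
        Xg = (U[:, idx] * s[idx]) @ Vt[idx, :]   # rank-|idx| partial sum
        comp = np.zeros(N)
        counts = np.zeros(N)
        for i in range(L):                       # anti-diagonal averaging
            for k in range(K):
                comp[i + k] += Xg[i, k]
                counts[i + k] += 1
        out.append(comp / counts)
    return out

# toy trend + seasonal series split into two leading component groups
t = np.arange(200)
x = 0.05 * t + np.sin(2 * np.pi * t / 20)
c1, c2 = ssa(x, L=40, groups=[[0, 1], [2, 3]])
```

Reconstructing with all components recovers the original series exactly; picking subsets of components is what lets SSA separate seasonal patterns, trends, and noise as described above.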