Found 151 repositories (showing 30)
In this repository, you can find the notebooks and data for the lessons on data-driven building behaviour prediction and simulation offered in the context of the "Energy and Environmental Technologies for Building System" course at Politecnico di Milano.
reddyprasade
Preparing: Technical Skills

Here are the essential skills that a Machine Learning Engineer needs, organized into groups; within each group are topics that you should be familiar with. Study tip: copy and paste this list into a document and save it to your computer for easy referral.

Computer Science Fundamentals and Programming
- Data structures: Lists, stacks, queues, strings, hash maps, vectors, matrices, classes & objects, trees, graphs, etc.
- Algorithms: Recursion, searching, sorting, optimization, dynamic programming, etc.
- Computability and complexity: P vs. NP, NP-complete problems, big-O notation, approximate algorithms, etc.
- Computer architecture: Memory, cache, bandwidth, threads & processes, deadlocks, etc.

Probability and Statistics
- Basic probability: Conditional probability, Bayes rule, likelihood, independence, etc.
- Probabilistic models: Bayes nets, Markov decision processes, hidden Markov models, etc.
- Statistical measures: Mean, median, mode, variance, population parameters vs. sample statistics, etc.
- Proximity and error metrics: Cosine similarity, mean-squared error, Manhattan and Euclidean distance, log-loss, etc.
- Distributions and random sampling: Uniform, normal, binomial, Poisson, etc.
- Analysis methods: ANOVA, hypothesis testing, factor analysis, etc.

Data Modeling and Evaluation
- Data preprocessing: Munging/wrangling, transforming, aggregating, etc.
- Pattern recognition: Correlations, clusters, trends, outliers & anomalies, etc.
- Dimensionality reduction: Eigenvectors, Principal Component Analysis, etc.
- Prediction: Classification, regression, sequence prediction, etc.; suitable error/accuracy metrics.
- Evaluation: Training-testing split, sequential vs. randomized cross-validation, etc.

Applying Machine Learning Algorithms and Libraries
- Models: Parametric vs. nonparametric, decision tree, nearest neighbor, neural net, support vector machine, ensemble of multiple models, etc.
- Learning procedure: Linear regression, gradient descent, genetic algorithms, bagging, boosting, and other model-specific methods; regularization, hyperparameter tuning, etc.
- Tradeoffs and gotchas: Relative advantages and disadvantages, bias and variance, overfitting and underfitting, vanishing/exploding gradients, missing data, data leakage, etc.

Software Engineering and System Design
- Software interface: Library calls, REST APIs, data collection endpoints, database queries, etc.
- User interface: Capturing user inputs & application events, displaying results & visualization, etc.
- Scalability: Map-reduce, distributed processing, etc.
- Deployment: Cloud hosting, containers & instances, microservices, etc.

Move on to the final lesson of this course to find lots of sample practice questions for each topic!
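For the proximity and error metrics listed above, a minimal pure-Python sketch (no library dependencies; the function names are just illustrative) might look like:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def mean_squared_error(y_true, y_pred):
    """Average of squared differences between targets and predictions."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def manhattan(a, b):
    """Sum of absolute coordinate-wise differences (L1 distance)."""
    return sum(abs(x - y) for x, y in zip(a, b))

def euclidean(a, b):
    """Straight-line distance between two points (L2 distance)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
```

For example, `manhattan([0, 0], [3, 4])` gives 7 while `euclidean([0, 0], [3, 4])` gives 5.0, which makes the difference between the two distances concrete.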
Artificial Intelligence and Machine Learning have empowered our lives to a large extent. The advancements made in this space have revolutionized our society and continue to make it a better place to live in. The two terms are often used in the same context, which leads to confusion: AI is the broader concept of machines making smart decisions, whereas Machine Learning is a sub-field of AI in which machines learn patterns from input data in order to make decisions. In this blog, we will dissect each term and understand how Artificial Intelligence and Machine Learning are related to each other.

What is Artificial Intelligence?

The term Artificial Intelligence was first coined in 1956 by John McCarthy at an AI conference. In layman's terms, Artificial Intelligence is about creating intelligent machines that can perform human-like actions. AI is not a modern-day phenomenon; in fact, it has been around since the advent of computers. The only thing that has changed is how we perceive AI and define its applications in the present world. The exponential growth of AI in the last decade or so has affected every sphere of our lives. From a simple Google search that returns the best results for a query to the creation of Siri and Alexa, Artificial Intelligence is one of the significant breakthroughs of the 21st century.

The four types of Artificial Intelligence are:
Reactive AI – This type of AI lacks historical data and reacts entirely to the action taken at the moment. It works on the principle of deep reinforcement learning, where a reward is given for a successful action and a penalty for an unsuccessful one. Google's AlphaGo defeated expert Go players using this approach.
Limited Memory – In the case of limited memory, past data is continually added to memory.
For example, when selecting the best restaurant, past locations would be taken into account and suggestions made accordingly.
Theory of Mind – This type of AI is yet to be built, as it involves dealing with human emotions and psychology. Face and gesture detection come close, but nothing is advanced enough to understand human emotions.
Self-Aware – This is the future advancement of AI, which could form self-representations. Such machines could be conscious and super-intelligent.

Two of the most common uses of AI are in the fields of Computer Vision and Natural Language Processing. Computer Vision is the study of identifying objects, covering tasks such as face recognition and real-time object detection. Detecting such movements can go a long way in analyzing the sentiments conveyed by a human being. Natural Language Processing, on the other hand, deals with textual data to extract insights or sentiments from it. From chatbot development to speech recognition systems like Amazon's Alexa or Apple's Siri, all use Natural Language Processing to extract relevant meaning from data. It is one of the most popular fields of AI and has found its usefulness in every organization. One other application of AI that has gained popularity in recent times is self-driving cars, which use reinforcement learning techniques to learn the best moves and identify obstacles or blockages on the road. Many automobile companies are gradually adopting the concept of self-driving cars.

What is Machine Learning?

Machine Learning is a state-of-the-art subset of Artificial Intelligence that lets machines learn from past data and make accurate predictions. Machine Learning has been around for decades, and the first ML application to become popular was email spam filter classification. The system is trained with a set of emails labeled as 'spam' and 'not spam', known as training instances.
Then a new set of unseen emails is fed to the trained system, which categorizes them as 'spam' or 'not spam.' These predictions are made by regression and classification algorithms such as Linear Regression, Logistic Regression, Decision Tree, Random Forest, XGBoost, and so on. The suitability of these algorithms varies based on the problem statement and the data set in operation. Along with these basic algorithms, a sub-field of Machine Learning that has gained immense popularity in recent times is Deep Learning. Deep Learning, however, requires enormous computational power and works best with a massive amount of data. It uses neural networks, whose architecture is loosely inspired by the human brain.

Machine Learning can be subdivided into three categories:
Supervised Learning – In supervised learning problems, both the input features and the corresponding target variable are present in the dataset.
Unsupervised Learning – The dataset is not labeled in an unsupervised learning problem, i.e., only the input features are present, not the target variable. The algorithms need to find separate clusters in the dataset based on certain patterns.
Reinforcement Learning – In this type of problem, the learner is rewarded for every correct move and penalized for every incorrect move.

The applications of Machine Learning are diversified across domains like banking, healthcare, retail, etc. One use case in the banking industry is predicting the probability of a borrower defaulting on a credit loan given their past transactions, credit history, debt ratio, annual income, and so on. In healthcare, Machine Learning is often used to predict a patient's length of stay in the hospital, the likelihood of occurrence of a disease, abnormal patterns in cells, etc. Many software companies have incorporated Machine Learning into their workflows to speed up the process of testing.
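The spam-filter workflow described above is commonly implemented with a naive Bayes classifier. Here is a minimal pure-Python sketch of that approach (the training emails are made up, and a real filter would use far richer features):

```python
import math
from collections import Counter

def train(emails):
    """emails: list of (text, label) pairs, label being 'spam' or 'not spam'."""
    word_counts = {"spam": Counter(), "not spam": Counter()}
    label_counts = Counter()
    for text, label in emails:
        label_counts[label] += 1
        word_counts[label].update(text.lower().split())
    return word_counts, label_counts

def classify(text, word_counts, label_counts):
    """Pick the label with the highest log-probability for the text."""
    vocab = set(word_counts["spam"]) | set(word_counts["not spam"])
    total = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        # log prior + per-word log likelihoods with add-one (Laplace) smoothing
        score = math.log(label_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in text.lower().split():
            score += math.log((word_counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```

Trained on a handful of labeled emails such as `("win money now", "spam")` and `("meeting agenda attached", "not spam")`, the classifier then assigns one of the two labels to each new, unseen email, exactly as the training/prediction split described above.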
Various manual, repetitive tasks are being replaced by machine learning models.

Comparison Between AI and Machine Learning

Machine Learning is the subset of Artificial Intelligence that has taken the advancement of AI to a whole new level. The idea of letting computers learn by themselves from the voluminous data being generated from various sources in the present world is what led to the emergence of Machine Learning. In Machine Learning, the concept of neural networks plays a significant role in allowing a system to learn on its own while maintaining speed and accuracy. A group of neural nets lets a model rectify its prior decisions and make more accurate predictions the next time. Artificial Intelligence is about acquiring knowledge and applying it to ensure success rather than accuracy; it makes the computer intelligent enough to make smart decisions on its own, akin to the decisions made by a human being. The more complex the problem, the better suited AI is to solving it. Machine Learning, on the other hand, is mostly about acquiring knowledge and maintaining better accuracy rather than success; its primary aim is to learn from data to automate specific tasks. The possibilities around Machine Learning and neural networks are endless. A set of sentiments can be understood from raw text. A machine learning application can also listen to music, and even play an appropriate piece of music based on a person's mood. NLP, a field of AI that has made some ground-breaking innovations in recent years, uses Machine Learning to understand the nuances in natural language and learn to respond accordingly. Different sectors like banking, healthcare, manufacturing, etc., are reaping the benefits of Artificial Intelligence, particularly Machine Learning. Several tedious tasks are getting automated through ML, which saves both time and money.
Machine Learning is being consistently oversold by marketers these days, even before it has reached its full potential. AI can be seen as something old by marketers who believe Machine Learning is the Holy Grail of analytics. The future in which we see human-like AI is not far off; the rapid advancement in technology has taken us closer than ever to that inevitability, and the recent progress in working AI is largely down to how Machine Learning operates. Both Artificial Intelligence and Machine Learning have their own business applications, and their usage depends entirely on the requirements of an organization. AI is an age-old concept, with Machine Learning picking up the pace in recent times. Companies like TCS and Infosys are yet to unleash the full potential of Machine Learning and are trying to incorporate ML into their applications to keep pace with the rapidly growing analytics space.

Conclusion

The hype around Artificial Intelligence and Machine Learning is such that various companies and even individuals want to master the skills without even knowing the difference between the two. Often both terms are misused in the same context. To master Machine Learning, one needs a natural intuition about data, the ability to ask the right questions, and the judgment to pick the correct algorithms when building a model. It often doesn't require much computational capacity. AI, on the other hand, is about building intelligent systems, which requires advanced tools and techniques and is often pursued in big companies like Google, Facebook, etc. There is a whole host of resources to master Machine Learning and AI. The Data Science blogs of Dimensionless are a good place to start. There are also online Data Science courses which cover the various nitty-gritty details of Machine Learning.
Welcome to 6.86x Machine Learning with Python–From Linear Models to Deep Learning. Machine learning methods are commonly used across engineering and sciences, from computer systems to physics. Moreover, commercial sites such as search engines, recommender systems (e.g., Netflix, Amazon), advertisers, and financial institutions employ machine learning algorithms for content recommendation, predicting customer behavior, compliance, or risk. As a discipline, machine learning tries to design and understand computer programs that learn from experience for the purpose of prediction or control. In this course, you will learn about principles and algorithms for turning training data into effective automated predictions. We will cover: Representation, over-fitting, regularization, generalization, VC dimension; Clustering, classification, recommender problems, probabilistic modeling, reinforcement learning; On-line algorithms, support vector machines, and neural networks/deep learning. You will be able to: Understand principles behind machine learning problems such as classification, regression, clustering, and reinforcement learning Implement and analyze models such as linear models, kernel machines, neural networks, and graphical models Choose suitable models for different applications Implement and organize machine learning projects, from training, validation, parameter tuning, to feature engineering You will implement and experiment with the algorithms in several Python projects designed for different practical applications. You will expand your statistical knowledge to not only include a list of methods, but also the mathematical principles that link these methods together, equipping you with the tools you need to develop new ones.
andrewmogbolu2
Blockchain and AI are on just about every chief information officer's watchlist of game-changing technologies that stand to reshape industries. Both technologies come with immense benefits, but both also bring their own challenges for adoption. It is also fair to say that the hype surrounding these technologies individually may be unprecedented, so the thought of bringing these two ingredients together may be viewed by some as brewing a modern-day version of IT pixie dust. At the same time, there is a logical way to think about this mash-up that is both sensible and pragmatic. Today, AI is for all intents and purposes a centralized process. An end user must have extreme faith in the central authority to produce a trusted business outcome. By decentralizing the three key elements of AI — that is, data, models, and analytics — blockchain can deliver the trust and confidence often needed for end users to fully adopt and rely on AI-based business processes. Let's explore how blockchain is poised to enrich AI by bringing trust to data, models and analytics.

Your data is your data

Many of the world's most notable AI technology services are centralized — including Amazon, Apple, Facebook, Google, as well as the Chinese companies Alibaba, Baidu and Tencent. Yet all have encountered challenges in establishing trust among their eager but somewhat cautious users. How can a business provide assurance to its users that its AI has not overstepped its bounds? Imagine if these AI services could produce a "forensic report," verified by a third party, to prove to you, beyond a reasonable doubt, how and when businesses are using your data once they are ingested. Imagine further that your data could be used only if you gave permission to do so. A blockchain ledger can be used as a digital rights management system, allowing your data to be "licensed" to the AI provider under your terms, conditions and duration.
The ledger would act as an access management system, storing the proofs and permissions by which a business can access and use the user's data.

Trusted AI models

Consider the example of using blockchain technology as a means of providing trusted data and provenance of training models for machine learning. In this case, we've created a fictitious system to answer the question of whether a fruit is an apple or an orange. This question-answering system that we build is called a model, and this model is created via a process called training. The goal of training is to create an accurate model that answers our questions correctly most of the time. Of course, to train a model, we need to collect data to train on — for this example, that could be the color of the fruit (as a wavelength of light) and the sugar content (as a percentage). With blockchain, you can track the provenance of the training data as well as see an audit trail of the evidence that led to the prediction of why a particular fruit is considered an apple versus an orange. A business can also prove that it is not "juicing up" its books by tagging fruit more often as apples, if that is the more expensive of the two fruits.

Explaining AI decisions

The European Union has adopted a law requiring that any decision made by a machine be readily explainable, on penalty of fines that could cost companies billions of dollars. The EU General Data Protection Regulation (GDPR), which came into force in 2018, includes a right to obtain an explanation of decisions made by algorithms and a right to opt out of some algorithmic decisions altogether. Massive amounts of data are being produced every second — more data than humans have the ability to assess and use as the basis for drawing conclusions. However, AI applications are capable of assessing large data sets and many variables, while learning about or connecting those variables relevant to their tasks and objectives.
For this very reason, AI continues to be adopted in various industries and applications, and we are relying more and more on its outcomes. It is essential, however, that any decisions made by AI are still verified for accuracy by humans. Blockchain can help clarify the provenance, transparency, understanding, and explanation of those outcomes and decisions. If decisions and associated data points are recorded via transactions on a blockchain, the inherent attributes of blockchain will make auditing them much simpler. Blockchain is a key technology that brings trust to transactions in a network; therefore, infusing blockchain into AI decision-making processes could be the element needed to achieve the transparency necessary to fully trust the decisions and outcomes derived from AI.

Blockchain and the Internet of Things

More than a billion intelligent, connected devices are already part of today's IoT. The expected proliferation of hundreds of billions more places us at the threshold of a transformation sweeping across the electronics industry and many other areas. With the advancement of IoT, industries are now able to capture data, gain insight from the data, and make decisions based on the data. Therefore, a lot of "trust" is placed in the information obtained. But the real question is: do we really know where these data came from, and should we be making decisions and transacting based on data we cannot validate? For example, did the weather data really originate from a sensor in the Atlantic Ocean, and did the shipping container really not exceed the agreed temperature limit? The IoT use cases are massive, but they all share the same issue with trust. IoT with blockchain can bring real trust to captured data. The underlying idea is to give devices, at the time of their creation, an identity that can be validated and verified throughout their lifecycle with blockchain.
There is great potential for IoT systems in blockchain technology capabilities that rely on device identity protocols and reputation systems. With a device identity protocol, each device can have its own blockchain public key and send encrypted challenge and response messages to other devices, thereby ensuring a device remains in control of its identity. In addition, a device with an identity can develop a reputation or history that is tracked by a blockchain. Smart contracts represent the business logic of a blockchain network. When a transaction is proposed, these smart contracts are autonomously executed within the guidelines set by the network. In IoT networks, smart contracts can play a pivotal role by providing automated coordination and authorization for transactions and interactions. The original idea behind IoT was to surface data and gain actionable insight at the right time. For example, smart homes are a thing of the present and most everything can be connected. In fact, with IoT, when something goes wrong, these IoT devices can even take action — for example, ordering a new part. We need a way to govern the actions taken by these devices, and smart contracts are a great way to do so. In an ongoing experiment I have followed in Brooklyn, New York, a community is using a blockchain to record the production of solar energy and enable the purchase of excess renewable energy credits. The device itself has an identity and builds a reputation through its history of records and exchange. Through the blockchain, people can aggregate their purchasing power more easily, share the burden of maintenance, and trust that devices are recording actual solar production. As IoT continues to evolve and its adoption continues to grow, the ability to autonomously manage devices and actions taken by devices will be essential. Blockchain and smart contracts are positioned well to integrate those capabilities into IoT.
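The challenge-and-response pattern behind device identity can be sketched in a few lines. This is a toy illustration only: it uses an HMAC over a shared secret as a simplified stand-in for the blockchain public/private-key signatures described above, and every name in it is invented for the example:

```python
import hashlib
import hmac
import os

class Device:
    """Toy IoT device identity. A shared secret stands in for the
    public/private key pair a real blockchain identity would use."""

    def __init__(self, device_id: str, secret: bytes):
        self.device_id = device_id
        self._secret = secret

    def respond(self, challenge: bytes) -> bytes:
        # Prove possession of the key by MACing the fresh challenge.
        return hmac.new(self._secret, challenge, hashlib.sha256).digest()

def verify(device: Device, known_secret: bytes) -> bool:
    """Verifier side: issue a fresh random challenge (a nonce, so old
    responses cannot be replayed) and check the device's response."""
    challenge = os.urandom(32)
    response = device.respond(challenge)
    expected = hmac.new(known_secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(response, expected)
```

A device that holds the right key passes verification; any other device fails, which is the property a reputation system can then build its history on.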
weiyx15
short term load forecasting for course project of POWER SYSTEM PREDICTION
Artifacts created during development of land price prediction system in Advanced Pattern Recognition Course at Waseda University Graduate School
hsaunchenlu
This repository contains the in-class projects I did in the machine learning course. The project topics include PM2.5 prediction, annual salary prediction, image sentiment classification, content classification, and a movie recommender system. The ML techniques used in these projects are linear regression, logistic regression, CNN, RNN, and Matrix Factorization.
Python data products are powering the AI revolution. Top companies like Google, Facebook, and Netflix use predictive analytics to improve the products and services we use every day. Take your Python skills to the next level and learn to make accurate predictions with data-driven systems and deploy machine learning models with this four-course Specialization from UC San Diego. This Specialization is for learners who are proficient with the basics of Python. You'll start by creating your first data strategy. You'll also develop statistical models, devise data-driven workflows, and learn to make meaningful predictions for a wide range of business and research purposes. Finally, you'll use design thinking methodology and data science techniques to extract insights from a wide range of data sources. This is your chance to master one of the technology industry's most in-demand skills. Python Data Products for Predictive Analytics is taught by Professor Ilkay Altintas, Ph.D. and Julian McAuley. Dr. Altintas is a prominent figure in the data science community and the designer of the highly popular Big Data Specialization on Coursera. She has helped educate hundreds of thousands of learners on how to unlock value from massive datasets.
wolfvoid
HNU - Intelligent Transportation Prediction System (2024Summer for a Course Project)
john777100
Final project of course "Digital System Design" supervised by Prof. An-Yeu Wu(with branch prediction and compression support)
JSFernandes
Developed for the Wireless Systems/Advanced Topics in Networking course in Sapienza Università di Roma, this is a Solar Energy Harvesting prediction system in Java that uses data collected by a mote running TinyOS and cloud coverage forecast provided by PredictionIO to predict how much solar energy will be harvested in the near future.
Shreyash1811
In this project we joined together various datasets consisting of data on student demographics, course selection, interaction with provided resources, and past scores in other similar courses. The objective of the project was to predict a student's likelihood of failing a certain course before the end of the semester and provide them support to turn the prediction around. This will not only improve the lives of many students but also grow the university's reputation through a higher passing rate and better education system.
This is a team project from the grad-level course ECE 536, Digital Control Systems, at NCSU. The team built an autonomous robot that followed a track with lanes, curves, dashed and solid lines, and a single line, and could navigate an intersection. An Arduino Uno controller was used in conjunction with an array of infrared sensors. We compared controller designs by writing each controller in Matlab, then converting it to a .ino file. The following controllers were compared: Current Observer, Infinite-Horizon Linear Quadratic Regulator, PID, Prediction Observer, Root Locus, and SISO Lead.
Dhanuaravinth
Hey everyone, welcome back again. This video is about installing Proteus Design Suite version 8.10 Professional on the Windows 10 operating system. If you have any doubts, let me know in the comment box below, and if any improvements are needed, kindly leave your suggestions in the comments after watching the video; they help me improve my upcoming content. Every single video on my channel is made with love and hard work, so don't forget to leave a like. :)

Learn free online courses on Great Learning: use this referral link to start learning new courses today and get 200 GL coins on your first sign-up: https://www.greatlearning.in/academy?referrer_code=GLMQORIELE7H0

Our website: https://twinprediction.blogspot.com/
Support my YouTube channel: http://www.youtube.com/c/TwinPrediction

Check out these videos:
- Kicad v5.1.9 Installation: https://youtu.be/G2QNxP1rxIA
- PSIM v9.1.4 Pro Installation: https://youtu.be/pE20VfRWWdU
- Matlab 2021a Installation: https://youtu.be/ix8KbWW7DV8
- Ngspice Installation (Windows): https://youtu.be/3XploAG2Ejg
- Ngspice Installation (Linux): https://youtu.be/iIa_jKtqXHY
- Altium Designer Tutorial - Creating a New Project - Part 1: https://youtu.be/FIzwY0MqyKk
- Altium Designer Tutorial - Creating Header for Beginners - Part 2: https://youtu.be/dajn--6kxaE
- Altium Designer Tutorial - Creating Header for Beginners - Part 3: https://youtu.be/pitRtYzSwsw
- Autocad Installation: https://youtu.be/FIzwY0MqyKk
- Altium Designer 20 Installation: https://youtu.be/7CGxSAFaiZ4
- Figma Installation: https://youtu.be/R6ourCY4i1I
- Android Studio Tutorial Part 1 - Creating a New Project: https://youtu.be/lo9NBVApcQw
- Flutter Visual Studio Code 1.49 Installation: https://youtu.be/uZr4qmu4QCg
- Flutter Android Studio 1.22 Installation: https://youtu.be/WQRNjJ0YP4i1l
- Matlab R2020b Installation: https://youtu.be/ad23_Z7077g
- MPLAB X IDE Installation: https://youtu.be/8-zj355YMHg
- Samsung SSD Upgrade with Data Migration: https://youtu.be/bW6XXGvLF3c
- Arduino IDE Installation: https://youtu.be/bDm9kXfnR_c
- Autodesk Fusion 360 Installation: https://youtu.be/rXZ5D5uWIpY
- Code Composer Studio Installation: https://youtu.be/q8dWSOcJczA
- Matlab 2020a Installation: https://youtu.be/B2oQbQDsABE
- Blender Installation: https://youtu.be/6gVY4Tz4eew
- Android Studio Installation: https://youtu.be/Iw46v8zXcpE
- Autodesk Eagle Installation: https://youtu.be/Whb56MEhksI
- Animation Composer Installation: https://youtu.be/pPzsIDUfqUw

Connect with us on social media:
- Facebook: https://www.facebook.com/sachin.aravinth.777
- Twitter: https://twitter.com/Twinprediction
- Instagram: https://www.instagram.com/twin_prediction/
- LinkedIn: https://www.linkedin.com/in/dhanu-aravinth-84b65b176 / https://www.linkedin.com/in/twin-prediction-871711200/
- Dribbble: https://dribbble.com/twin_prediction
- Blogger: https://twinprediction.blogspot.com/
lystun
This is a student course prediction system for tertiary institutions. It uses regression analysis to predict the course a student is likely to be offered, based on the student's performance in the required examinations (WAEC, UTME, Post-UTME) and the patterns schools follow in granting admissions.
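The regression idea behind such a system can be sketched as follows. Everything here is invented for illustration (the scores, the fitted coefficients, and the course cutoffs are not taken from the actual system):

```python
def fit_simple(xs, ys):
    """Ordinary least squares for y = a*x + b with a single feature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Made-up training data: UTME score -> composite admission score.
utme = [180, 220, 250, 280, 310]
admission = [40.0, 52.0, 61.0, 70.0, 79.0]
a, b = fit_simple(utme, admission)

def likely_course(utme_score):
    """Predict a composite score, then map it to a course via cutoffs
    (the cutoff table is hypothetical)."""
    score = a * utme_score + b
    cutoffs = [(70, "Medicine"), (60, "Engineering"), (50, "Sciences")]
    for cutoff, course in cutoffs:
        if score >= cutoff:
            return course
    return "General Studies"
```

A real system would use several features (WAEC, UTME, Post-UTME) and each school's historical admission patterns, but the fit-then-threshold structure stays the same.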
Maruf-Ahmad-khan
No description available
JahnviKadia
NFT Price prediction system project in IR course
No description available
adityal10
Stock price predictions for the course Financial Systems and Markets
helcig
PoC for a context-aware PKM system using NER, NLP, and vector embeddings (Postgres Vector) to categorize notes, with Node2Vec for link prediction. (Knowledge Graph course, TU Vienna)
Farrukhkhalid
Project Overview

In this project, you will apply the skills you have acquired in this course to operationalize a Machine Learning Microservice API. You are given a pre-trained sklearn model that has been trained to predict housing prices in Boston according to several features, such as the average number of rooms in a home and data about highway access, teacher-to-pupil ratios, and so on. You can read more about the data, which was initially taken from Kaggle, on the data source site. This project tests your ability to operationalize a Python Flask app — in a provided file, app.py — that serves out predictions (inference) about housing prices through API calls. This project could be extended to any pre-trained machine learning model, such as those for image recognition and data labeling.

Housing price prediction.

Project Tasks

Your project goal is to operationalize this working machine learning microservice using Kubernetes, an open-source system for automating the management of containerized applications. In this project you will:
- Test your project code using linting
- Complete a Dockerfile to containerize the application
- Deploy your containerized application using Docker and make a prediction
- Improve the log statements in the source code of the application
- Configure Kubernetes and create a Kubernetes cluster
- Deploy a container using Kubernetes and make a prediction
- Upload a complete GitHub repo with CircleCI to indicate that your code has been tested
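The provided app.py uses Flask and the real pre-trained sklearn model; as a rough, dependency-free illustration of what "serving predictions through API calls" means, here is a sketch using only the Python standard library, with made-up coefficients standing in for the trained model:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict_price(rooms: float, ptratio: float) -> float:
    """Stand-in for the pre-trained model: a linear function of two
    Boston-housing-style features (the coefficients are invented)."""
    return 9.1 * rooms - 0.65 * ptratio + 2.0

class PredictHandler(BaseHTTPRequestHandler):
    """Answers POST requests carrying JSON features with a JSON prediction."""

    def do_POST(self):
        length = int(self.headers["Content-Length"])
        body = json.loads(self.rfile.read(length))
        price = predict_price(body["rooms"], body["ptratio"])
        payload = json.dumps({"prediction": price}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

# To serve: HTTPServer(("", 8000), PredictHandler).serve_forever()
```

A client would then POST `{"rooms": 6.0, "ptratio": 15.0}` and read the prediction from the JSON response, which is the same request/response shape the Flask app exposes once containerized and deployed.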
saitejagroove
This is part of the CS418 Data Science course, where we deal with two tasks on a movie dataset. The first task is genre prediction using a One-vs-Rest classifier, and the second is a recommendation system.
Danilo-Lapa11
This project is intended for the IF977 software engineering course. We will create a system using Streamlit that provides predictions computed with machine learning over time series of shares from the Brazilian stock exchange B3, retrieved via an API.
FadiKais1
Projects and assignments from the Recommender Systems course, focusing on memory-based and model-based collaborative filtering, similarity metrics, evaluation methods, and algorithmic design. Includes full implementations of user-based CF, item-based CF, vector similarity computation (cosine/Pearson), prediction functions, and recommendation ranking.
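A minimal sketch of the user-based CF prediction described above, assuming ratings are stored as nested dicts (user -> item -> rating) and using plain cosine similarity over co-rated items:

```python
import math

def cosine(u, v):
    """Cosine similarity between two users over the items both rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    nu = math.sqrt(sum(u[i] ** 2 for i in common))
    nv = math.sqrt(sum(v[i] ** 2 for i in common))
    return dot / (nu * nv)

def predict_rating(ratings, user, item):
    """Similarity-weighted average of other users' ratings for the item.
    Returns None when no neighbour has rated the item."""
    num = den = 0.0
    for other, other_ratings in ratings.items():
        if other == user or item not in other_ratings:
            continue
        sim = cosine(ratings[user], other_ratings)
        num += sim * other_ratings[item]
        den += abs(sim)
    return num / den if den else None
```

Item-based CF follows the same shape with the roles of users and items swapped, and Pearson correlation can replace cosine by mean-centering each user's ratings first.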
m1ndl0ss
Academic Performance Forest is a machine learning system built entirely in pure Java to predict student grades using a Random Forest regressor implemented from scratch (no ML libraries). Trained on 16,994 real student course grades, the model achieves 61% lower error than baseline, with 95% of predictions within ±1.16 grade points. The system inclu
Saicharan-Banothu
I am enrolled in an agentic AI course, mastering the design of autonomous systems. I'm learning to build AI agents that can reason, plan, and execute complex tasks independently, moving beyond simple prediction. This program is bridging advanced AI theory with practical, real-world applications for creating sophisticated, goal-driven models.
arpittutor
Machine Learning, the most exciting branch of Artificial Intelligence, is all around us in this modern world. Just as Facebook suggests the stories in your feed, Machine Learning brings out the power of data in new ways. It works on the development of computer programs that can access data and perform tasks automatically through predictions and detections, enabling computer systems to learn and improve from experience continuously. Intellipaat is offering an industry-specific, best-in-class Machine Learning Training in Bangalore which mainly focuses on key modules such as Python, Algorithms, Statistics & Probability, Supervised & Unsupervised Learning, Decision Trees, Random Forests, Linear & Logistic Regression, etc. As for the definition of Machine Learning: it is a core sub-area of Artificial Intelligence (AI), and Machine Learning applications learn from experience (well, data) like humans do, without direct programming. https://intellipaat.com/machine-learning-certification-training-course-bangalore/
Data analytics isn't just about the future; it is being put to use at this very moment in all businesses. It forms an integral part of the company, and its professionals are paid highly for their part. Here are reasons why joining data analytics training in Gurgaon is a viable option. After completing the Data Analytics Course, you will be able to:
- Understand Scala & Apache Spark implementation
- Run Spark operations on the Spark shell
- Understand the Spark driver & its related worker nodes
- Integrate Spark with Flume
- Set up a data pipeline using Apache Flume, Apache Kafka & Spark Streaming
- Work with Spark RDDs and Spark Streaming
- Spark MLlib: create classifiers & recommendation systems using MLlib
- Spark Core concepts: creation of RDDs (parallel RDDs, MappedRDD, HadoopRDD, JdbcRDD)
- Spark architecture & components
- Spark SQL experience with CSV, XML & JSON
- Read data from different Spark sources
- Spark SQL & DataFrames
- Develop and implement various machine learning algorithms in daily practice & live environments
- Build recommendation systems and classifiers
- Perform various types of analysis (prediction & regression)
- Implement plotting & graphs using various machine learning libraries
- Import data from HDFS & implement various machine learning models
- Build different neural networks using NumPy and TensorFlow
- Power BI visualization, components and transformations
- DAX functions
- Data exploration and mapping
- Design dashboards
- Time series, aggregation & filters

Placement: Gyansetu provides a complimentary placement service to all students. The Gyansetu placement team consistently works on industry collaborations and associations, which help our students find their dream job right after completing the training.

Why choose us? Gyansetu trainers are well known in the industry; they are highly qualified and currently working in top MNCs. We provide interaction with faculty before the course starts. Our experts help students learn the technology from the basics; even if you are not good at basic programming, don't worry, we will help you. Faculty will help you prepare project reports & presentations. Students will be provided mentoring sessions by experts.
HarinDulneth
No description available