Found 874 repositories (showing 30)
shirosaidev
Stock market analyzer and predictor using Elasticsearch, Twitter, News headlines and Python natural language processing and sentiment analysis
uclatommy
Real-time sentiment analysis in Python using Twitter's streaming API
realpython
Let's perform Twitter sentiment analysis using Python, Docker, Elasticsearch, and Kibana!
ujjwalkarn
tutorial for sentiment analysis on Twitter data using Python
anubhavanand12qw
Written in Python 3.6.5 using Jupyter Notebook. This program fetches live data from Twitter using Tweepy, then cleans the tweets (e.g. by removing special characters) and performs sentiment analysis on them, plotting the results for better visualization. It then fetches the stock price from Yahoo Finance and adds it to the dataset to perform prediction. Several machine learning algorithms (random forest, MLPClassifier, logistic regression) are trained on the dataset; predictions on unseen data are then plotted against the real data to assess accuracy.
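A minimal sketch of the tweet-cleaning step described above, using only Python's re module (the regexes and the example tweet are illustrative, not taken from the original notebook):

```python
import re

def clean_tweet(text):
    """Remove @mentions, URLs, and special characters from a tweet."""
    text = re.sub(r"@\w+", "", text)             # drop @username handles
    text = re.sub(r"https?://\S+", "", text)     # drop links
    text = re.sub(r"[^A-Za-z0-9\s]", "", text)   # drop special characters
    return re.sub(r"\s+", " ", text).strip()     # collapse whitespace

print(clean_tweet("$TSLA to the moon!!! @elonmusk https://t.co/xyz"))  # → "TSLA to the moon"
```

The cleaned text can then be passed to a sentiment scorer or vectorizer downstream.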
roshancyriacmathew
This project walks you through creating a Twitter sentiment analysis model in Python. Sentiment analysis identifies how people feel about various topics; here we analyse sentiment towards Pfizer vaccines, using a dataset of tweets available on Kaggle. Machine learning classifies the collected tweets into the sentiments present in them: positive, negative, and neutral. Several classifiers are compared to see which gives the best model accuracy.
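The classifier comparison described above could be sketched with scikit-learn roughly as follows; the toy tweets stand in for the Kaggle vaccine dataset, and the particular classifiers chosen here are assumptions, not necessarily the project's actual models:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Toy stand-in for the Kaggle vaccine tweets (not the real dataset).
tweets = ["great vaccine experience", "feeling terrible after dose",
          "no side effects so far", "worst decision ever",
          "very happy with the shot", "awful reaction today"] * 5
labels = ["positive", "negative", "neutral", "negative", "positive", "negative"] * 5

# Fit each candidate classifier on bag-of-words features and compare accuracy.
for clf in (MultinomialNB(), LogisticRegression(max_iter=1000), LinearSVC()):
    pipe = make_pipeline(CountVectorizer(), clf)
    score = cross_val_score(pipe, tweets, labels, cv=3).mean()
    print(f"{clf.__class__.__name__}: {score:.2f}")
```

On a real dataset the cross-validated scores would differ between classifiers, which is the basis for picking the best model.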
kenchang408
Based on the first programming assignment, Twitter Sentiment Analysis in Python, from the Coursera.org course Introduction to Data Science. I would like to track my progress on the source code for this project, and contributions after the class is over are welcome. It also serves as a tutorial on using the Twitter API in Python for sentiment analysis.
AshwanthRamji
Sentiment Analysis using Python, Twitter API and Hive
ajayshewale
This project addresses sentiment analysis on Twitter: predicting the sentiment of a given Twitter post using Python. Sentiment analysis can detect many different emotions in text, but this report considers only the three major ones: positive, negative, and neutral. The training dataset was small (just over 5,900 examples) and highly skewed, which made building a good classifier considerably harder. After creating many custom features, using bag-of-words representations, and applying the Extreme Gradient Boosting algorithm, a classification accuracy of 58% was achieved. Applications include analysing public sentiment for firms gauging the market response to their products, predicting political elections, and forecasting socioeconomic phenomena such as the stock exchange.
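A hedged sketch of the bag-of-words pipeline described above, using scikit-learn's GradientBoostingClassifier as a stand-in for Extreme Gradient Boosting (the project itself used XGBoost on its ~5,900-tweet dataset; the corpus below is a toy illustration):

```python
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_extraction.text import CountVectorizer

# Tiny illustrative corpus; the project's real dataset is not reproduced here.
texts = ["love this phone", "hate the battery", "it arrived today",
         "love the screen", "hate waiting", "it is a phone"] * 10
labels = ["positive", "negative", "neutral",
          "positive", "negative", "neutral"] * 10

vec = CountVectorizer()                       # bag-of-words features
X = vec.fit_transform(texts).toarray()

clf = GradientBoostingClassifier(n_estimators=50).fit(X, labels)
print(clf.predict(vec.transform(["love the screen"]).toarray()))
```

The real project added many custom features on top of the bag-of-words columns before boosting, which is where most of the accuracy gains on skewed data would come from.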
ginking
Archimedes 1 is a sentiment-based trading bot, heavily influenced by forked existing bots with a few enhancements here and there. It was built to understand how such bots work so that we could roll them forward, in our own manner, into our own complete AI-based trading system (Archimedes 2.0). The bot watches the tweets of followed accounts and waits for them to mention any publicly traded companies. When they do, sentiment analysis determines whether the opinions are positive or negative toward those companies, and the bot automatically executes trades on the relevant stocks according to the expected market reaction. The code is written in Python and is meant to run on a Google Compute Engine instance. It uses the Twitter Streaming API (the newer version) to get notified whenever tweets of interest arrive. Entity detection and sentiment analysis are done with Google's Cloud Natural Language API, the Wikidata Query Service provides the company data, and the TradeKing (Ally) API does the stock trading.

The main module defines a callback where incoming tweets are handled and starts streaming the user's feed:

    def twitter_callback(tweet):
        companies = analysis.find_companies(tweet)
        if companies:
            trading.make_trades(companies)
            twitter.tweet(companies, tweet)

    if __name__ == "__main__":
        twitter.start_streaming(twitter_callback)

The core algorithms are implemented in the analysis and trading modules. The former finds mentions of companies in the text of a tweet, figures out their ticker symbols, and assigns them a sentiment score. The latter chooses a trading strategy: either buy now and sell at close, or sell short now and buy to cover at close. The twitter module handles streaming and tweeting out the summary.

Follow these steps to run the code yourself:

1. Create a VM instance. Check out the quickstart to create a Cloud Platform project and a Linux VM instance with Compute Engine, then SSH into it for the steps below. The predefined machine type g1-small (1 vCPU, 1.7 GB memory) seems to work well.

2. Set up auth. The authentication keys for the different APIs are read from shell environment variables; each service has its own steps to obtain them.

Twitter: Log in to your Twitter account and create a new application. Under the Keys and Access Tokens tab for your app you'll find the Consumer Key and Consumer Secret. Export both to environment variables:

    export TWITTER_CONSUMER_KEY="<YOUR_CONSUMER_KEY>"
    export TWITTER_CONSUMER_SECRET="<YOUR_CONSUMER_SECRET>"

If you want the tweets to come from the same account that owns the application, simply use the Access Token and Access Token Secret on the same page. If you want to tweet from a different account, follow the steps to obtain an access token. Then export both to environment variables:

    export TWITTER_ACCESS_TOKEN="<YOUR_ACCESS_TOKEN>"
    export TWITTER_ACCESS_TOKEN_SECRET="<YOUR_ACCESS_TOKEN_SECRET>"

Google: Follow the Google Application Default Credentials instructions to create, download, and export a service account key:

    export GOOGLE_APPLICATION_CREDENTIALS="/path/to/credentials-file.json"

You also need to enable the Cloud Natural Language API for your Google Cloud Platform project.

TradeKing (Ally): Log in to your TradeKing (Ally) account and create a new application. Behind the Details button for your application you'll find the Consumer Key, Consumer Secret, OAuth (Access) Token, and OAuth (Access) Token Secret. Export them all to environment variables:

    export TRADEKING_CONSUMER_KEY="<YOUR_CONSUMER_KEY>"
    export TRADEKING_CONSUMER_SECRET="<YOUR_CONSUMER_SECRET>"
    export TRADEKING_ACCESS_TOKEN="<YOUR_ACCESS_TOKEN>"
    export TRADEKING_ACCESS_TOKEN_SECRET="<YOUR_ACCESS_TOKEN_SECRET>"

Also export your TradeKing (Ally) account number, which you'll find under My Accounts:

    export TRADEKING_ACCOUNT_NUMBER="<YOUR_ACCOUNT_NUMBER>"

3. Install dependencies. There are a few library dependencies, which you can install using pip:

    $ pip install -r requirements.txt

4. Run the tests. Verify that everything is working as intended by running the tests with pytest:

    $ export USE_REAL_MONEY=NO && pytest *.py --verbose

5. Run the benchmark. The benchmark report shows how the current implementation of the analysis and trading algorithms would have performed against historical data. You can run it again to benchmark any changes you may have made:

    $ ./benchmark.py > benchmark.md

6. Start the bot. Enable real orders that use your money:

    $ export USE_REAL_MONEY=YES

Then have the code start running in the background:

    $ nohup ./main.py &

License: Archimedes (edits under Invacio) is based on a framework by Max Braun, licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0. Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
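Since the auth step reads every credential from shell environment variables, a small helper for verifying that they are all set before starting the bot might look like this (an illustrative sketch, not part of the original codebase):

```python
import os

# The variable names exported in the auth step above.
REQUIRED_KEYS = [
    "TWITTER_CONSUMER_KEY", "TWITTER_CONSUMER_SECRET",
    "TWITTER_ACCESS_TOKEN", "TWITTER_ACCESS_TOKEN_SECRET",
    "GOOGLE_APPLICATION_CREDENTIALS",
    "TRADEKING_CONSUMER_KEY", "TRADEKING_CONSUMER_SECRET",
    "TRADEKING_ACCESS_TOKEN", "TRADEKING_ACCESS_TOKEN_SECRET",
    "TRADEKING_ACCOUNT_NUMBER",
]

def check_credentials(env=os.environ):
    """Return the names of any credential variables that are missing or empty."""
    return [key for key in REQUIRED_KEYS if not env.get(key)]

# With only one key set, the other nine are reported as missing.
print(check_credentials({"TWITTER_CONSUMER_KEY": "abc"}))
```

Running such a check at startup fails fast with a clear message instead of erroring mid-stream on the first API call.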
emumba-com
Live Twitter sentiment analysis using Python, Apache Spark Streaming, Kafka, NLTK, SocketIO
astranovasky
Twitter real-time sentiment analysis using Spark Structured Streaming and Python
MohamedAfham
A Twitter Sentiment Analysis model developed using python and NLTK (NLP Library)
This project is on Twitter sentiment analysis, combining lexicon-based and machine learning approaches. A supervised lexicon-based approach for extracting sentiment from tweets was implemented, and various supervised machine learning approaches were tested using scikit-learn in Python, including Decision Tree and Naive Bayes classifiers.
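One way such a lexicon-plus-machine-learning combination can work is to append lexicon scores to bag-of-words features before training a classifier; the sketch below illustrates that idea under that assumption, with a toy lexicon and corpus rather than the project's actual resources:

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Toy lexicon; real hybrid systems use resources such as VADER or SentiWordNet.
POSITIVE = {"good", "great", "love"}
NEGATIVE = {"bad", "awful", "hate"}

def lexicon_score(text):
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

texts = ["great service", "awful delay", "love it",
         "hate this", "good trip", "bad food"] * 5
labels = [1, 0, 1, 0, 1, 0] * 5  # 1 = positive, 0 = negative

vec = CountVectorizer().fit(texts)

def featurize(batch):
    """Bag-of-words columns plus two clipped lexicon columns
    (MultinomialNB requires non-negative features)."""
    bow = vec.transform(batch).toarray()
    scores = [lexicon_score(t) for t in batch]
    lex = np.array([[max(s, 0), max(-s, 0)] for s in scores])
    return np.hstack([bow, lex])

clf = MultinomialNB().fit(featurize(texts), labels)
print(clf.predict(featurize(["great food", "awful trip"])))  # → [1 0]
```

Splitting the lexicon score into separate positive and negative columns is what keeps the features non-negative for the multinomial model.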
npanwar
Suicide Ideation in Society - A Multivariate Analysis: Twitter sentiment analysis using Python to gauge individuals' suicidal thoughts by classifying the sentiment of their tweets as positive or negative.
:airplane: A simple pipeline to retrieve, analyze and visualize Tweets in real time with Elasticsearch and Python
Twitter sentiment analysis using Python, Elasticsearch, and Kibana!
prabhusarath
Empirical Analysis of Predictive Algorithms for Collaborative Filtering, constructing a Social Network using Twitter Data, Community Detection and Link Prediction using Facebook ‘Like’ Data, Categorizing Movie Reviews based on Sentiment Analysis, Content-based Recommendation Algorithm using Python, Pandas, Numpy and scikit-learn.
piyushvarshney
Twitter Sentiment analysis with polarity and subjectivity using Python
jijopjames
Predicting US Presidential Election 2020 Result Using Twitter Sentiment Analysis with Python.
NishthaChaudhary
Natural language processing (NLP) is an exciting branch of artificial intelligence (AI) that allows machines to break down and understand human language. I plan to walk through text pre-processing techniques, machine learning techniques and Python libraries for NLP. Text pre-processing techniques include tokenization, text normalization and data cleaning. Once in a standard format, various machine learning techniques can be applied to better understand the data. This includes using popular modeling techniques to classify emails as spam or not, or to score the sentiment of a tweet on Twitter. Newer, more complex techniques can also be used such as topic modeling, word embeddings or text generation with deep learning. We will walk through an example in Jupyter Notebook that goes through all of the steps of a text analysis project, using several NLP libraries in Python including NLTK, TextBlob, spaCy and gensim along with the standard machine learning libraries including pandas and scikit-learn.
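The tokenization, normalization, and stop-word removal steps described above can be sketched in plain Python (the stop-word list here is a tiny illustrative subset; libraries like NLTK ship a much fuller one):

```python
import re

# Illustrative subset of English stop words.
STOPWORDS = {"the", "a", "an", "is", "are", "to", "and", "of", "in"}

def preprocess(text):
    """Tokenize, lowercase, and drop stop words: basic text normalization."""
    tokens = re.findall(r"[a-z']+", text.lower())    # tokenization + case folding
    return [t for t in tokens if t not in STOPWORDS]  # stop-word removal

print(preprocess("The movie IS a masterpiece of suspense!"))
# → ['movie', 'masterpiece', 'suspense']
```

Once every document is reduced to a normalized token list like this, it is in the "standard format" the walkthrough mentions and can be fed to any of the modeling techniques.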
gauravssnl
Twitter Sentiment Analysis using Python
itlinhph
[Web Mining] Vietnamese sentiment analysis on Twitter data, using the Twitter API and Python
triandicAnt
Sentiment analysis of IMDB movie reviews and Twitter data using Logistic Regression and Naive Bayes classifiers from Python's machine learning package scikit-learn.
vaitybharati
Assignment-11-Text-Mining-01-Elon-Musk: perform sentiment analysis on the Elon Musk tweets (Exlon-musk.csv). Text preprocessing: remove both leading and trailing characters; remove empty strings (which Python treats as False); join the list back into one string; remove Twitter username handles (@usernames) from the text and rejoin; remove punctuation; remove https/URLs within the text; convert into text tokens (tokenization); remove stopwords; normalize the data; stemming (optional); lemmatization. Feature extraction: bag-of-words with CountVectorizer, CountVectorizer with N-grams (bigrams and trigrams), and a TF-IDF vectorizer. Finally: generate a word cloud, perform Named Entity Recognition (NER), and run emotion mining / sentiment analysis.
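Two of the preprocessing steps listed above (removing @username handles and URLs, then building N-grams) can be sketched as follows; the regexes and the example tweet are illustrative, not from the assignment's notebook:

```python
import re

def strip_handles_and_urls(text):
    """Remove @username handles and https/URL fragments from tweet text."""
    text = re.sub(r"@\w+", "", text)          # removes @usernames
    text = re.sub(r"https?://\S+", "", text)  # removes URLs
    return re.sub(r"\s+", " ", text).strip()

def ngrams(tokens, n):
    """Generate word n-grams (bigrams when n=2, trigrams when n=3)."""
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

clean = strip_handles_and_urls("@elonmusk Tesla hits record highs https://t.co/abc")
print(ngrams(clean.lower().split(), 2))
# → ['tesla hits', 'hits record', 'record highs']
```

In the assignment itself the N-gram step is handled by CountVectorizer's ngram_range parameter; this sketch just shows what those features look like.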
anupriya1519
Sentiment Analysis on Twitter Data
shubhampadole68
Millions of people use Twitter to express emotions such as happiness, sadness, and anger. Sentiment analysis is about detecting emotions, opinions, assessments, and attitudes, taking into consideration the way humans think, and it classifies emotions into classes such as positive or negative. Nowadays, industries are interested in using textual data for semantic analysis to extract people's views about their products and services; sentiment analysis helps them gauge customer satisfaction so they can improve their services accordingly. To obtain text data, they turn to social media platforms. Many social media sites, such as Google Plus, Facebook, and Twitter, allow users to express opinions, views, and emotions about topics and events. The microblogging site Twitter is expanding rapidly among online social networks, with about 200 million users. Twitter was founded in 2006 and is currently the most famous microblogging platform; in 2017, 2 million users shared 8.3 million tweets in a single hour. Twitter users post their thoughts, emotions, and messages on their profiles as tweets, each limited to 140 characters. Twitter sentiment analysis is based on the field of natural language processing (NLP). For tweet text, we use NLP techniques such as tokenizing words and removing stopwords (I, me, my, our, your, is, was, etc.); NLP also helps preprocess the data by cleaning the text and removing special characters and punctuation marks. Sentiment analysis matters because it reveals trends in people's emotions on specific topics through their tweets.
Twitter sentiment analysis of popular CEOs using Python and Tableau
A demo Twitter streaming and sentiment analysis app showcasing RHT AMQ Streams (Kafka) and MongoDB, served through a Python backend API and a JavaScript frontend. The app runs on OpenShift and gains persistence through OpenShift Container Storage (rook-ceph).
Machine learning application created in Python that uses a recurrent neural network (LSTM) for time-series predictions, with Twitter sentiment analysis included for better accuracy.