Twitter Sentiment Analysis Pipeline to Database and Slack Notification
Automate Twitter sentiment analysis on #OnThisDay tweets. This workflow fetches tweets, stores them in MongoDB, analyzes sentiment with Google Cloud Natural Language, saves results to PostgreSQL, and notifies Slack based on sentiment score.
About This Workflow
Overview
This n8n workflow automates collecting tweets tagged '#OnThisDay' and analyzing their sentiment, chaining several n8n nodes into an ETL (Extract, Transform, Load) pipeline. It fetches matching tweets from Twitter, stores them in MongoDB, and passes each tweet's text to Google Cloud Natural Language, which returns a sentiment score and magnitude; these results are saved alongside the tweet text in a PostgreSQL database. Finally, an IF node checks the sentiment score against a threshold, and when the condition is met, a notification is sent to Slack containing the tweet text and its sentiment analysis results.
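As a mental model, the node sequence can be sketched as plain Python. Everything here is an illustrative stand-in, not part of n8n or Google's API: `fake_sentiment` mimics the shape of the Natural Language result (score in [-1.0, 1.0], non-negative magnitude), and the MongoDB/PostgreSQL inserts are just list appends.

```python
def fake_sentiment(text):
    """Stand-in for the Google Cloud Natural Language call.
    The real API returns a score in [-1.0, 1.0] and a magnitude >= 0."""
    positive = {"love", "great", "happy"}
    words = [w.strip(".,!?") for w in text.lower().split()]
    hits = sum(w in positive for w in words)
    score = min(1.0, 3.0 * hits / max(len(words), 1))
    return score, float(hits)

def run_pipeline(tweets, threshold=0.2):
    """Mirror the node chain: fetch -> store -> analyze -> persist -> notify."""
    persisted, notifications = [], []
    for tweet in tweets:                          # Twitter node output
        record = dict(tweet)                      # MongoDB insert stand-in
        score, magnitude = fake_sentiment(tweet["text"])
        record["score"], record["magnitude"] = score, magnitude  # Set node
        persisted.append(record)                  # Postgres insert stand-in
        if score > threshold:                     # IF node condition
            notifications.append(                 # Slack node message
                f"Positive #OnThisDay tweet ({score:.2f}): {record['text']}")
    return persisted, notifications
```

In the actual workflow each of these stages is a separate node, so swapping the storage layer or notification target means reconfiguring one node rather than rewriting the pipeline.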
Key Features
- Automated Tweet Fetching: Schedules and retrieves tweets based on a specific hashtag.
- Data Storage & Processing: Utilizes MongoDB for initial tweet storage and PostgreSQL for structured sentiment data.
- Advanced Sentiment Analysis: Integrates with Google Cloud Natural Language API for sophisticated text sentiment analysis.
- Conditional Notifications: Sends real-time alerts to Slack based on predefined sentiment score conditions.
- Customizable Integration: Easily adaptable to other social media platforms or text sources.
How To Use
- Configure Credentials: Set up your Twitter, MongoDB, PostgreSQL, Google Cloud Natural Language, and Slack API credentials within n8n.
- Set Up Cron Trigger: Define the schedule for running the workflow (e.g., daily at 6 AM) using the Cron node.
- Connect Twitter: Configure the Twitter node to search for '#OnThisDay' tweets.
- Store in MongoDB: Set up the MongoDB node to insert the fetched tweet data.
- Analyze Sentiment: Configure the Google Cloud Natural Language node to process the tweet text.
- Store Results: Use the Set node to extract and format sentiment score and magnitude, then the Postgres node to save this data along with the tweet text.
- Implement Conditional Logic: Configure the IF node to check the sentiment score against your desired threshold.
- Send Slack Notifications: Set up the Slack node to send alerts to your desired channel with relevant tweet and sentiment details when the condition is met.
- Activate Workflow: Enable the workflow to start the automation.
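For the Set and IF steps above, the relevant values live under `documentSentiment` in the Natural Language API's analyzeSentiment response. A minimal stand-alone sketch (the response dict is hand-built here, not a live API call, and the 0.2 threshold is just an illustrative choice):

```python
# Hand-built response in the shape of an analyzeSentiment result:
# documentSentiment.score is in [-1.0, 1.0]; magnitude is >= 0.
response = {"documentSentiment": {"score": 0.6, "magnitude": 1.8}}

THRESHOLD = 0.2  # illustrative cutoff; pick one that suits your channel

def extract_sentiment(resp):
    """Mirror the Set node: flatten score and magnitude for the Postgres insert."""
    doc = resp["documentSentiment"]
    return {"score": doc["score"], "magnitude": doc["magnitude"]}

fields = extract_sentiment(response)
should_notify = fields["score"] >= THRESHOLD  # IF node condition
```

Score measures overall polarity while magnitude measures emotional intensity, so storing both lets you distinguish a mildly positive tweet from an emphatic one when tuning the notification threshold later.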
Workflow JSON
{
"id": "134a24af-94dd-408b-877c-7f9f717b10c5",
"name": "Twitter Sentiment Analysis Pipeline to Database and Slack Notification",
"nodes": 0,
"category": "Data Processing & Automation",
"status": "active",
"version": "1.0.0"
}

Note: This is a sample preview. The full workflow JSON contains node configurations, credential placeholders, and execution logic.
About the Author
Crypto_Watcher
Web3 Developer
Automated trading bots and blockchain monitoring workflows.
Related Workflows
Discover more workflows you might like
Automate Fitness API Weekly Reports with n8n
This workflow automates the generation of weekly fitness API reports. It uses a Webhook Trigger to receive data, processes it with Text Splitter and Embeddings, stores it in Supabase, and uses an OpenAI Chat Model and RAG Agent to generate a report, finally logging it to Google Sheets. Error handling is included with a Slack Alert.
Automate Habit Form Weekly Summary with Webhook and Supabase
This n8n workflow automates the processing of weekly habit form submissions. It uses a Webhook Trigger to receive data, a Text Splitter and Embeddings node to prepare it for vector storage, and Supabase Insert to store it. A RAG Agent then summarizes the data, which is appended to a Google Sheet via Append Sheet, with Slack Alert for errors.