Warning
This project is shut down. It was a good concept; maybe it will get a second life in the future.
Convenient Red Bull price tracker made for Tbilisi, Georgia. The project covers both data collection and presentation, with the latter delegated to a Telegram bot and Grafana.
As of now it supports the two largest delivery providers: Wolt and Glovo.
The project utilizes:
- Scripting [Python, SQL]
- Containerization [Docker, Docker Hub, AWS ECS (EC2 based), AWS ECR]
- CI/CD automation [GitHub Actions, AWS CodePipeline, AWS CodeBuild, AWS CodeDeploy, AWS Lambda, AWS API Gateway]
- Data storage [AWS RDS for PostgreSQL, JSON]
- User Interfaces [Grafana, Telegram Bot API]
Special thanks to TheSpeedX for the proxies!
Plans:
- Bot: Deploy bot in AWS
- Parser: Deploy parser in GitHub Actions with auto-commits for new JSON export files
- Parser: Enable parsing for the most popular Tbilisi districts with EN/GE product name variations
- DB: Enable product_volume column and autofill on Postgres side by parsing names
- CI/CD: Implement GitHub Actions for tracker and AWS CodePipeline for bot
- CI/CD: Report AWS CodePipeline status and show it as a badge
- Proxy: make requests via proxies for run_tracker_github.yml
- DB: Implement Redis for data cache hits and user responses
- DB/Parser: Add 2 Nabiji grocery store
- Bot: Add product/store links to the bot output
- Bot: Allow users to define daily schedule time
- Bot: Dynamic checking of the requests_data.json file
- Grafana: Create average and lowest fluctuation charts
- Security: Apply Trivy GitHub Actions
I've made a bot for immediate or daily reports: @RedBullTrackerBot. Here is how it works:
Demo video: RedBullTrackerBot.mov
I wanted something to glance at and see the numbers changing, so I built a Grafana dashboard for that:
Available here: Red Bull Dynamics
The script begins by loading its configuration from a JSON file named requests_data.json. This file contains the essential configuration, including the physical locations and platform-specific request information.
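As a rough illustration, the configuration load might look like the sketch below; apart from locations_async, the key names (such as a platforms section and its fields) are assumptions made for illustration, not the project's actual schema.

```python
import json

# Load the tracker configuration from the file described above.
with open("requests_data.json", encoding="utf-8") as config_file:
    config = json.load(config_file)

# locations_async is mentioned in this README; the "platforms" key and the
# per-platform fields below are assumptions for illustration only.
locations = config.get("locations_async", [])
platforms = config.get("platforms", {})

for name, platform in platforms.items():
    print(name, platform.get("method", "GET"), platform.get("url"))
```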
For each platform specified in the configuration, the script makes HTTP requests to the platform's API. The type of request (GET or POST) and the necessary headers are defined per platform in the configuration file.
The script utilizes the locations_async array from the configuration to dynamically adjust the request parameters for geolocation, enabling localized price tracking across different areas of Tbilisi.
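A minimal sketch of such a request loop, assuming the `requests` library and hypothetical config fields (`method`, `url`, `headers`, `body`) plus `lat`/`lon` geolocation parameters; the real platform APIs and config schema will differ:

```python
import requests

def fetch_platform(platform: dict, location: dict) -> dict:
    """Issue one request for a single platform at a single location.

    `platform` is assumed to carry the method, URL, headers and optional body
    from requests_data.json; `location` is one entry of locations_async.
    """
    method = platform.get("method", "GET").upper()
    headers = platform.get("headers", {})

    # Hypothetical geolocation parameters; each real platform API differs.
    geo = {"lat": location["lat"], "lon": location["lon"]}

    if method == "POST":
        response = requests.post(
            platform["url"],
            headers=headers,
            json={**platform.get("body", {}), **geo},
            timeout=30,
        )
    else:
        response = requests.get(platform["url"], headers=headers, params=geo, timeout=30)

    response.raise_for_status()
    return response.json()
```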
@RedBullTrackerBot also reads requests_data.json; this is where it gets the current list of all supported districts.
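For example, the bot could derive its list of districts from the same file roughly like this (the `district` field on each locations_async entry is an assumption):

```python
import json

def load_districts(path: str = "requests_data.json") -> list[str]:
    """Return the district names the bot can offer to its users."""
    with open(path, encoding="utf-8") as config_file:
        config = json.load(config_file)
    # Assumed: each locations_async entry names the district it covers.
    return sorted({location["district"] for location in config.get("locations_async", [])})
```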
Upon receiving the response from each platform, the script parses the returned data for Red Bulls. This typically includes the product name, price, and any other pertinent details provided by the platform; each platform's response is parsed to extract only the necessary fields.
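A sketch of that filtering step; the response fields (`items`, `name`, `price`) are placeholders, since each platform's real payload has its own structure:

```python
def extract_red_bulls(platform_name: str, payload: dict) -> list[dict]:
    """Keep only Red Bull entries and the fields the tracker needs."""
    products = []
    for item in payload.get("items", []):  # placeholder field name
        name = item.get("name", "")
        if "red bull" not in name.lower():
            continue
        products.append({
            "platform": platform_name,
            "product_name": name,
            "price": item.get("price"),  # placeholder field name
        })
    return products
```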
After parsing, the data is written to the repository folder ./export, where each file is timestamped in UTC+00:00, and also sent in bulk to the AWS RDS PostgreSQL database, ready to be picked up at the UI stage.
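The export-and-store step might look roughly like the following; the table and column names are assumptions, and psycopg2's execute_values is used here only as one common way to perform the bulk insert:

```python
import json
import os
from datetime import datetime, timezone

import psycopg2
from psycopg2.extras import execute_values

def export_and_store(rows: list[dict], dsn: str) -> None:
    """Write a timestamped JSON export and bulk-insert the same rows into PostgreSQL."""
    # 1) Timestamped export file in ./export (UTC+00:00, as described above).
    os.makedirs("export", exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H-%M-%S")
    with open(f"export/prices_{stamp}.json", "w", encoding="utf-8") as export_file:
        json.dump(rows, export_file, ensure_ascii=False, indent=2)

    # 2) Bulk insert into AWS RDS PostgreSQL; table and columns are assumed.
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        execute_values(
            cur,
            "INSERT INTO prices (platform, product_name, price, collected_at) VALUES %s",
            [(r["platform"], r["product_name"], r["price"], stamp) for r in rows],
        )
```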
The data parsing process is automated through GitHub Actions, whose configurations are stored in .github/workflows (the default location).
The workflow responsible for scheduled runs is run_tracker_github.yml.
The tracker image is rebuilt when the relevant tracker configuration is changed via git push. Similarly, when bot configuration is changed via git push, AWS CodePipeline rebuilds the bot image with the new configuration.
Meanwhile, AWS Lambda handles AWS CodePipeline status checks, and AWS API Gateway makes the status available for the badge.
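A minimal sketch of what such a Lambda handler could look like, using boto3's CodePipeline API; the pipeline name and the badge response shape are assumptions:

```python
import json

import boto3

codepipeline = boto3.client("codepipeline")

def lambda_handler(event, context):
    """Return the latest pipeline status so API Gateway can expose it for a badge."""
    # "redbull-bot-pipeline" is an assumed name, not the project's real pipeline.
    state = codepipeline.get_pipeline_state(name="redbull-bot-pipeline")

    # Status of the most recent execution of the first stage.
    first_stage = state["stageStates"][0]
    status = first_stage.get("latestExecution", {}).get("status", "Unknown")

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"status": status}),
    }
```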