Overview
This project focused on building an automated deal discovery platform for online marketplaces that helps users track listings, evaluate pricing trends, and identify promising deals. The system automates listing analysis, pricing checks, and real-time alerts, giving users faster access to undervalued items while reducing the manual effort involved in tracking and verifying marketplace listings.
Client Background
The client operates in the e-commerce and resale intelligence space. Their operations depend on timely access to online listings and the ability to identify profitable deals based on historical pricing. Their business model involves reselling items sourced from various online platforms, where spotting underpriced listings early can directly impact margins.
Previously, this process relied heavily on manual tracking and limited third-party tools, which led to inconsistent data capture, slower decision-making, and missed opportunities. They needed a solution that could automate data collection, adapt to different platforms, deliver actionable insights, and support long-term growth.

Challenges & Objectives
The project's challenges and objectives were as follows:
Challenges
- Many platforms require session handling, login credentials, and browser simulation to access listing data.
- The volume of data scraped across platforms was too large for manual processing or flat-file storage.
- Identifying underpriced listings required developing logic based on historical data rather than arbitrary thresholds.
- Alerts and insights needed to be easily accessible through a visual, filterable dashboard.
Objectives
- Build a web scraping tool for e-commerce price tracking across authenticated and public platforms.
- Store listing data, images, and metadata in a long-term, structured format using PostgreSQL.
- Develop logic to flag deals using basic price analytics and machine learning for detecting underpriced marketplace listings.
- Integrate real-time deal alerts for resale and flipping platforms via Telegram.
- Deliver a dashboard for monitoring item listings and price trends.

Approach & Implementation
The approach and the implementation of each solution were as follows:
Web Scraping and Automation
To support both public and login-protected marketplaces, the system was built using Playwright and Puppeteer. It simulated human browsing behaviour, handled logins, and used proxy rotation through tools like Multilogin. These measures reduced the risk of detection while maintaining long-term scraping stability. Custom scraping routines were written per platform, allowing for flexible selectors and routine maintenance.
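The "flexible selectors" idea above can be sketched without a browser: each platform gets its own selector configuration, so when a site changes its markup only the config table needs updating, not the scraper logic. The platform name, field names, and CSS classes below are illustrative (the project itself drove real browsers via Playwright and Puppeteer); this is a minimal, stdlib-only sketch of the per-platform registry pattern.

```python
from html.parser import HTMLParser

# Hypothetical per-platform selector registry: maps listing fields to the
# CSS class each site uses. Maintaining a scraper means editing this table.
SELECTORS = {
    "examplemarket": {"title": "item-title", "price": "item-price"},
}

class ListingParser(HTMLParser):
    """Collects text from tags whose class matches a configured selector."""
    def __init__(self, class_to_field):
        super().__init__()
        self.class_to_field = class_to_field  # CSS class -> field name
        self._current = None
        self.record = {}

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if cls in self.class_to_field:
            self._current = self.class_to_field[cls]

    def handle_data(self, data):
        if self._current:
            self.record[self._current] = data.strip()
            self._current = None

def parse_listing(platform, html):
    config = SELECTORS[platform]
    parser = ListingParser({cls: field for field, cls in config.items()})
    parser.feed(html)
    return parser.record

sample = '<div><span class="item-title">Road Bike</span><span class="item-price">120</span></div>'
print(parse_listing("examplemarket", sample))
```

In the real system this parsing step sits behind the browser-automation layer; the registry pattern is what lets routine selector maintenance stay cheap.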
Data Storage and Processing
Listing information was stored in a PostgreSQL database with proper indexing for fast retrieval. Metadata, images, timestamps, and item descriptions were organised for long-term access and trend analysis. This backend design allowed the platform to serve as a scalable online monitoring platform for pricing intelligence.
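The storage layer described above can be illustrated with a small schema. The project used PostgreSQL; the DDL below runs against SQLite purely so the sketch is self-contained, and the table name, columns, and index choices are assumptions rather than the client's actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE listings (
    id INTEGER PRIMARY KEY,
    platform TEXT NOT NULL,
    title TEXT,
    price REAL,
    image_url TEXT,
    scraped_at TEXT NOT NULL   -- ISO-8601 timestamp
);
-- Indexes chosen for the common access patterns: per-platform price
-- lookups for trend analysis, and time-ordered retrieval of recent rows.
CREATE INDEX idx_listings_platform_price ON listings (platform, price);
CREATE INDEX idx_listings_scraped_at ON listings (scraped_at);
""")
conn.execute(
    "INSERT INTO listings (platform, title, price, scraped_at) VALUES (?, ?, ?, ?)",
    ("examplemarket", "Road Bike", 120.0, "2024-01-01T00:00:00"),
)
rows = conn.execute(
    "SELECT title, price FROM listings WHERE platform = ?",
    ("examplemarket",),
).fetchall()
print(rows)
```

Indexing on the columns the trend queries filter and sort by is what keeps retrieval fast as the listing history grows.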
Deal Detection and Analysis
Using Python libraries like Pandas and NumPy, the team created logic to track historical pricing trends and detect anomalies. Listings that significantly deviated from baseline values were flagged. A lightweight machine learning model built with scikit-learn was added to improve deal prediction accuracy over time, helping refine what was considered a "good deal."
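A minimal sketch of the baseline-deviation rule: flag a listing whose price sits well below the historical median for its category. The 25% threshold and column names here are illustrative, not the client's tuned values, and the scikit-learn model mentioned above would sit on top of features like this.

```python
import pandas as pd

def flag_deals(history: pd.DataFrame, discount: float = 0.25) -> pd.DataFrame:
    """Flag rows priced at least `discount` below their category's median."""
    baseline = history.groupby("category")["price"].transform("median")
    out = history.copy()
    out["baseline"] = baseline
    out["is_deal"] = out["price"] <= (1 - discount) * baseline
    return out

history = pd.DataFrame({
    "category": ["bike"] * 5,
    "price": [100, 110, 95, 105, 60],  # 60 is well under the median of 100
})
flagged = flag_deals(history)
print(flagged.loc[flagged["is_deal"], "price"].tolist())  # [60]
```

A median baseline is more robust to outliers than a mean, which matters when the scraped history itself contains mispriced listings.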
Alerts and Integrations
A Telegram bot was developed to send real-time deal alerts. Alerts could be filtered by item type, location, price range, or custom parameters. This helped the client reduce the lag between listing appearance and response time.
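The filter-then-notify flow might look like the sketch below. The filter fields and matching rules are assumptions; `sendMessage` is the real Telegram Bot API method, but the token and chat ID are placeholders and the network call is defined without being executed here.

```python
import json
import urllib.request

def matches(listing, flt):
    """Check a flagged listing against a user's alert filter (illustrative rules)."""
    if flt.get("item_type") and listing["item_type"] != flt["item_type"]:
        return False
    if flt.get("max_price") is not None and listing["price"] > flt["max_price"]:
        return False
    if flt.get("location") and listing["location"] != flt["location"]:
        return False
    return True

def send_alert(token, chat_id, listing):
    # Telegram Bot API: POST /bot<token>/sendMessage with chat_id and text.
    url = f"https://api.telegram.org/bot{token}/sendMessage"
    payload = json.dumps({
        "chat_id": chat_id,
        "text": f"Deal: {listing['title']} at {listing['price']}",
    }).encode()
    req = urllib.request.Request(url, data=payload,
                                 headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req)  # live network call; not run in this sketch

listing = {"title": "Road Bike", "item_type": "bike", "price": 60, "location": "Austin"}
print(matches(listing, {"item_type": "bike", "max_price": 75}))  # True
```

Running the filter bot-side means each user only receives the subset of flagged deals they asked for, which is what closes the gap between listing appearance and response.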
Dashboard Interface
The frontend was built using HTML, JavaScript, and minimal UI components for clarity and responsiveness. The dashboard allowed users to view historical and live data, inspect flagged deals, and analyse pricing movements across platforms. Simple charts and filters gave structure to the raw data and improved decision-making.
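The price-trend charts need the raw scrape history condensed into plottable series. One plausible shape for that backend step, with assumed column names, is a daily-median resample that the frontend can render directly:

```python
import pandas as pd

# Raw scrape rows arrive at arbitrary times; resample into one median
# price point per day for the dashboard's trend chart.
raw = pd.DataFrame({
    "scraped_at": pd.to_datetime([
        "2024-01-01 09:00", "2024-01-01 18:00", "2024-01-02 10:00",
    ]),
    "price": [100.0, 110.0, 95.0],
})
trend = (raw.set_index("scraped_at")["price"]
            .resample("D").median()
            .rename("median_price"))
print(trend.tolist())  # [105.0, 95.0] -> one chart point per day
```

Aggregating server-side keeps the dashboard responsive even as the listing history grows into millions of rows.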

Results & Outcomes
The following were some notable outcomes of the project:
- The platform monitoring tools automated listing collection across several marketplaces, including complex ones like Facebook Marketplace
- Real-time alerts improved the client's ability to act quickly on profitable listings
- Historical data and live insights were centralized in one easy-to-use dashboard
- The scraping routines continued to operate even as site structures changed, thanks to the modular selector design
Key Takeaways
Here are the conclusions we can draw from the project:
- A modular marketplace monitoring system, supported by browser emulation and scraping logic, can reliably track high-volume listing data
- Combining structured databases with basic ML techniques provides a scalable way to detect pricing anomalies
- Real-time notifications reduce manual monitoring and help users act quickly on undervalued listings
- Simple, focused dashboards make large datasets easier to work with, especially in fast-paced resale environments

Conclusion
Relu Consultancy developed a system that helped the client automate their listing analysis and improve the accuracy and speed of deal detection across online marketplaces. With a combination of scraping logic, database structuring, alert systems, and frontend visualization, the platform turned raw data into actionable insight. It allowed the client to respond faster, track trends more consistently, and reduce manual oversight. The project demonstrated how well-planned platform monitoring tools can support resale workflows and scale alongside marketplace changes.