---
title: "Live Portfolio Analytics - Powered by MCP Servers - Open Source"
slug: open-so
date_published: 2025-05-09T13:44:54.753Z
original_url: https://www.tigzig.com/post/open-so
source: migrated
processed_at: 2025-12-03T13:30:00.000Z
---

# Live Portfolio Analytics - Powered by MCP Servers - Open Source

**Analyze any stock, crypto, metal, or oil symbol vs benchmark using 70+ KPIs & 15+ Charts, AI-powered technicals, and clean PDF + web reports.**

Live across 6 interfaces: web apps, ChatGPT, chat agents, Excel (xlwings Lite), and forms.

Each interface serves a different use case - from rapid-deploy to full-featured agents. Modular architecture makes it easy to plug components into any flow - agentic or not.

Built the backend MCP servers, agent flows, and user interfaces as reusable modular components - easy to plug into different use cases and mix across stacks.

Performance stats powered by Python QuantStats package (by Ran Aroussi, creator of yfinance). Technical chart analysis with Python Finta and Gemini Vision. MCP servers built with Tadata FastAPI-MCP package, web apps with React and NextJS, and MCP enabled agents on n8n & Flowise.

Fully modular and live - clone it, remix it, set up your own stack.

**MCP Servers are public:**

* QuantStats MCP
* YFinance MCP
* Technical Analysis MCP

Full build breakdown below.

## 1. WHAT IT DOES - Live Portfolio Analytics Stack

This is a working analytics stack delivering live performance reports, AI-driven technicals, and financial data pulls - powered by MCP, FastAPI, and modular agents.

**1. QuantStats Performance vs. Benchmark** - For any Yahoo Finance symbol - stocks, crypto, oil, metals. Over 70 metrics and 15+ charts: drawdowns, Sharpe/Sortino ratios, return distributions, correlations - delivered as a clean HTML report.

**2. AI Technical Analysis** - Technical analysis across daily and weekly timeframes. Chart analysis via Gemini Vision API, delivered as PDF and web reports with structured tables.

**3. Finance Data Pull** - Extract prices, profiles, and full financials from Yahoo Finance: 150+ fields of profile info, P&L, balance sheet, cash flow. Excel integration via xlwings Lite.

## 2. HOW TO USE - 6 Live Interfaces

The system runs across 6 interfaces, each tailored to a different use case:

1. Agentic - Custom UI (NextJS + n8n)
2. Agentic - ChatGPT (Custom GPT)
3. Agentic - Advanced (React-based with full analysis tools)
4. Agentic - Rapid Deploy (Flowise Native)
5. Excel - xlwings Lite
6. Form UI - HTML-JS-Jinja2

Ask the agent for guidance or start with a prebuilt prompt.

All support the same analytics setup - just different frontends and feature layers. See next section on interface strategy.

## 3. INTERFACE STRATEGY - Why 6, and Why Modular

The setup is designed around modular Gen AI-powered components - backend, agent layer, and UI - each one reusable and configurable depending on the use case. Once core processing is in place, it's easy to plug into different interfaces without rebuilding the logic.

The six interfaces aren't just demos - they show real deployment options, from lightweight forms to full-stack apps and AI agents. The options support a wide range of use cases - Section 6 on User Interfaces goes into detail on when to use which and the trade-offs involved.

To support this, I built three MCP-FastAPI servers and one standalone FastAPI server. These connect to agents running on n8n and Flowise, and frontends on React and Next.js. All components are connected via standard APIs, making them portable across tools - including third-party platforms.

In practice, modularity isn't always necessary. I sometimes deliver integrated solutions where the UI, logic, and agent live in the same build - faster for simpler use cases. But where reusability or scale is a factor, modular saves time, simplifies updates, and isolates risk.

This isn't about MCP or agents - it's about building practical, reusable analytics solutions that can plug into any interface or automation flow.

All live. Test it, clone it, build your own.

## 4. ARCHITECTURE - Modular, Component based, Reusable

The full stack is built around reusable components - each layer (frontend, agents, backend) is designed to plug into others with minimal setup. Here's how the architecture breaks down across interfaces and agents.

### 4.1. Modular - Component based: Why and When?

There are cases where a non-modular setup makes more sense. For example, in one client project I built a lightweight HTML-JS tool for Excel file import and manipulation, bundled with a browser-based SQLite agent and a simple NL-to-SQL chat interface - all integrated in a single app. In that setup, modularity would've just added unnecessary complexity.

But when I see components - UI, backend logic, agents - that can be reused, I default to separating them out. In another case, I built a custom UI connected to a Flowise agent and backend SQL service. Later, the client needed a second agent setup pointing to a different database. All I had to do was update the agent config and env variable - no UI rebuild, no backend changes.

Modular setups also help with debugging, iteration, and access control. I can isolate issues, restrict backend exposure, and upgrade parts independently.

I started building my technical analysis stack directly inside Excel with xlwings Lite. As it evolved, I split core processing into an MCP-FastAPI server - now the same logic runs across all UIs: web, forms, agents, GPT, Excel.

None of this is new - tech has done it for decades. Just sharing how modularity speeds up my own analytics builds, and when I choose to keep it simple.

### 4.2. Frontend options

Covered in detail in a separate section with notes on when to use what, and trade-offs based on my experience.

UI options include: Next.js, React, ChatGPT, form-based UI, Flowise native UI, and Excel

React and NextJS UIs are set up as reusable modules - they can connect to agent flows on Flowise, n8n, or any API-accessible setup. Just update the API endpoint in the env variable, match input/output formats, and it's live.

In the current setup, the n8n agent connects to the NextJS UI, and the Flowise agent connects to the React app. Both agents are MCP enabled and are in turn connected to MCP Servers.

### 4.3. Agent setups

The agent layer acts as the glue between UIs and backend MCP servers - handling workflows, orchestration, and logic routing. Here's how the n8n and Flowise setups are configured.

n8n and Flowise support webhooks and API endpoints - easy to plug into any interface and great for keeping the agent layer separate.

Both include MCP Clients with SSE support for remote MCP Servers. Just drop in the MCP Server URL and you're set. In the current setup, agents connect to multiple MCP Servers and database tools (Flowise).

Both are production-grade tools built for complex agentic workflows; n8n also covers non-agentic automation. Flowise supports Sequential Agents with LangGraph - enabling advanced orchestration with routing and step-wise execution. n8n is superb with its Agent node, wide platform integrations, HTTP node for API calls, and a solid set of routing and processing nodes.

### 4.4. Core engine

This is the processing brain of the system - everything from calculations to report formatting runs through this Python-based backend, wrapped in modular MCP-FastAPI services.

* QuantStats by Ran Aroussi (creator of yfinance)
* yfinance for market data
* Finta for technical indicators
* Matplotlib for charts
* Gemini Vision API for visual chart analysis
* ReportLab for PDF formatting

### 4.5. Backend

Three integrated MCP + FastAPI servers and one standalone FastAPI server (details in next section). All Python logic is wrapped in FastAPI and mounted on an MCP Server in a single deployment - which also serves the form UI.

I keep all reusable logic on FastAPI - easy to automate and connect across UIs, from ChatGPT to Excel to any custom UI. Tadata's FastAPI-MCP package makes it simple to mount MCP on any existing FastAPI setup.

**Connections:**

* n8n and Flowise agents connect to the MCP server via their native MCP Client nodes
* React and NextJS UIs connect to agents via API
* ChatGPT connected to FastAPI endpoints on the same integrated MCP-FastAPI server via Custom Actions OpenAPI schema
* Form UI connected to FastAPI endpoints on the same MCP-FastAPI server
* Excel connects to FastAPI endpoints through xlwings Lite

## 5. MCP SERVERS

The backend is split into four focused processing services - each one handles a specific piece of the analytics workflow, from financial data to report generation. All are exposed via MCP or API, built for reuse and quick integration.

There are three custom MCP Servers and one standalone FastAPI Server. The MCP servers are public - just plug and play. Add the URL to any SSE-enabled MCP client. Both n8n and Flowise have native nodes for this.

The servers are integrated MCP-FastAPI servers. I used Tadata's FastAPI-MCP package to mount MCP on top of FastAPI - just a few lines of code - brilliant package. A single deployment runs MCP + FastAPI + Form UI (HTML-JS-Jinja2), and works cleanly with both agentic and non-agentic setups.
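As a rough illustration of that mounting pattern, here's a minimal sketch based on the FastAPI-MCP package's documented usage - the `/report` route and its parameters are hypothetical stand-ins, not the live server's actual endpoints:

```python
def build_server():
    """Minimal sketch: mount MCP on top of an existing FastAPI app."""
    # imports kept inside the factory so the sketch stays self-contained
    from fastapi import FastAPI
    from fastapi_mcp import FastApiMCP

    app = FastAPI(title="QuantStats MCP")

    @app.get("/report", operation_id="quantstats_report")
    def report(symbol: str, benchmark: str = "SPY"):
        # placeholder - the real implementation runs the QuantStats workflow here
        return {"symbol": symbol, "benchmark": benchmark}

    # FastApiMCP exposes the existing FastAPI endpoints as MCP tools;
    # mount() serves the MCP transport alongside the plain REST routes
    mcp = FastApiMCP(app)
    mcp.mount()
    return app
```

The same app then answers both MCP clients (n8n, Flowise) and ordinary HTTP callers (forms, Excel, Custom GPT) from a single deployment.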

All the user interfaces connect to these integrated MCP-FastAPI servers - explained in the User Interfaces section.

### 5.1. QuantStats MCP Server

This is an MCP-FastAPI wrapper over the QuantStats package - connects to any frontend via MCP or API. Takes two Yahoo Finance symbols (one for the asset, one for the benchmark) plus a time range, and returns a formatted HTML report. Tables and charts are auto-generated directly by the package.
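The core of that wrapper can be sketched in a few lines - this assumes the `quantstats` package (which pulls prices via yfinance under the hood); the function name and defaults are illustrative, not the server's actual code:

```python
def quantstats_html_report(symbol: str, benchmark: str,
                           out_path: str = "report.html") -> str:
    """Sketch: symbol vs benchmark -> self-contained HTML tearsheet."""
    import quantstats as qs  # lazy import: needs network access when called

    # daily returns series for the asset, downloaded from Yahoo Finance
    returns = qs.utils.download_returns(symbol)

    # writes the full tearsheet (70+ metrics, 15+ charts) to out_path
    qs.reports.html(
        returns,
        benchmark=benchmark,            # second Yahoo Finance symbol
        output=out_path,
        title=f"{symbol} vs {benchmark}",
    )
    return out_path
```

The MCP/FastAPI layer then just validates inputs, calls a function like this, and serves the resulting file.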

The MCP-enabled agents connect to the MCP server; the other UIs use the FastAPI endpoints.

### 5.2. Technical Analysis MCP Server

This is a processing server that runs a multi-step workflow to generate technical analysis reports in PDF and web format. Takes a Yahoo Finance symbol and time range, returns AI-generated reports.

**Workflow steps:**

* Connects to Yahoo Finance MCP-FastAPI server to pull price data
* Converts daily prices to weekly dataframe
* Calculates technical indicators using Finta
* Connects to Gemini Vision API to get chart and technical analysis
* Connects to the ReportLab FastAPI server to generate the final PDF and web reports
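The daily-to-weekly conversion step above can be sketched with pandas - the weekly OHLCV frame is what the Finta indicator calls would then consume (column names assumed to match yfinance's output):

```python
import pandas as pd

def to_weekly(daily: pd.DataFrame) -> pd.DataFrame:
    """Collapse daily OHLCV bars into weekly bars (week ending Friday)."""
    return daily.resample("W-FRI").agg(
        {"Open": "first", "High": "max", "Low": "min",
         "Close": "last", "Volume": "sum"}
    ).dropna()

# synthetic example: 10 business days starting Mon 2024-01-01 -> 2 weekly bars
idx = pd.bdate_range("2024-01-01", periods=10)
daily = pd.DataFrame(
    {"Open": range(10), "High": range(1, 11), "Low": range(10),
     "Close": range(10), "Volume": [100] * 10},
    index=idx,
)
weekly = to_weekly(daily)  # 2 rows: weeks ending Jan 5 and Jan 12
```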

The MCP-enabled agents connect to the MCP server; the other UIs use the FastAPI endpoints.

### 5.3 Yahoo Finance MCP Server

This is an MCP-FastAPI wrapper over the yfinance package. Connects to any frontend via MCP or API. Takes a Yahoo Finance symbol and returns:

* Price data (JSON) - requires date range
* Company profile with 150+ fields
* Full financials: P&L, balance sheet, cash flow, and quarterly breakdowns
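A minimal sketch of the pull behind those three outputs, assuming the `yfinance` package - the helper name is hypothetical, and the exact set of profile fields varies by symbol:

```python
def fetch_profile_and_prices(symbol: str, start: str, end: str):
    """Sketch: one Yahoo Finance symbol -> prices, profile, financials."""
    import yfinance as yf  # lazy import: network access needed when called

    ticker = yf.Ticker(symbol)
    prices = ticker.history(start=start, end=end)   # daily OHLCV DataFrame
    profile = ticker.info                           # company profile dict
    financials = {
        "income_statement": ticker.financials,            # annual P&L
        "balance_sheet": ticker.balance_sheet,
        "cash_flow": ticker.cashflow,
        "quarterly_income": ticker.quarterly_financials,  # quarterly breakdown
    }
    return prices, profile, financials
```

The FastAPI endpoints would serialize these DataFrames to JSON before returning them to the agents or to Excel.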

The MCP-enabled agents connect to the MCP server; the other UIs use the FastAPI endpoints.

### 5.4 ReportLab Markdown to PDF-HTML FastAPI Server

This is a FastAPI processing server for generating custom formatted reports. PDF and HTML outputs are customized for this use case but can be adapted for others. ReportLab offers deep customization for PDFs, easily replicated in web reports using standard HTML-JS.

**Workflow:**

* Convert Markdown to HTML (using markdown)
* Parse HTML with BeautifulSoup for structure
* Use ReportLab to build styled PDF
* Style HTML output to match PDF
* Reference charts from static folder
* Auto-clean old files (older than 24 hrs) using FastAPI startup events + Starlette background tasks
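The Markdown-to-PDF leg of this workflow can be sketched as follows - it assumes the `markdown`, `beautifulsoup4`, and `reportlab` packages and handles only headings and paragraphs, whereas the live server applies much richer styling:

```python
def markdown_to_pdf(md_text: str, pdf_path: str) -> str:
    """Sketch: Markdown -> HTML -> parsed blocks -> styled PDF."""
    import markdown
    from bs4 import BeautifulSoup
    from reportlab.lib.pagesizes import A4
    from reportlab.lib.styles import getSampleStyleSheet
    from reportlab.platypus import Paragraph, SimpleDocTemplate, Spacer

    # step 1: Markdown -> HTML
    html = markdown.markdown(md_text, extensions=["tables"])
    # step 2: parse the HTML for structure
    soup = BeautifulSoup(html, "html.parser")

    # step 3: map HTML elements to ReportLab flowables
    styles = getSampleStyleSheet()
    story = []
    for el in soup.find_all(["h1", "h2", "h3", "p"]):
        if el.name == "h1":
            style = styles["Heading1"]
        elif el.name in ("h2", "h3"):
            style = styles["Heading2"]
        else:
            style = styles["BodyText"]
        story.append(Paragraph(el.get_text(), style))
        story.append(Spacer(1, 6))

    # step 4: build the PDF
    SimpleDocTemplate(pdf_path, pagesize=A4).build(story)
    return pdf_path
```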

I've set up a separate endpoint for technical analysis, with custom formatting for both PDF and HTML outputs. The same FastAPI server also includes a generic endpoint that takes Markdown content and returns a PDF with simpler formatting. It's deployed as an HTML-JS-Tailwind form UI that calls FastAPI endpoints - all served from a unified FastAPI server using Jinja2 templates.
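The auto-clean step from the workflow above is plain stdlib - a sketch of the file sweep, which the live server triggers from a FastAPI startup event / Starlette background task (the function name is illustrative):

```python
import os
import time

def clean_old_files(folder: str, max_age_hours: float = 24.0) -> int:
    """Delete files in `folder` older than max_age_hours; return count removed."""
    cutoff = time.time() - max_age_hours * 3600
    removed = 0
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        # only touch plain files whose last modification predates the cutoff
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed += 1
    return removed
```

Running it on the report/static folders keeps generated PDFs and charts from accumulating between deployments.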

## 6. USER INTERFACES

All processing and agent logic connects to live, working interfaces. Each UI is built on the same backend and agent layer - just optimized for different workflows, tools, or user preferences. The UI is just the entry point: some are lightweight and fast to deploy, others offer more control or customization.

### 6.1. Custom GPT

Custom GPT is usually my first choice when a UI is needed. You get an out-of-the-box interface, embedded agent, and Python code execution. Just supply a JSON OpenAPI schema to connect to any backend. Faster and cleaner than building even basic HTML-JS forms.

Limitations: no custom UI, and feature set is narrower than a React or NextJS build. You'll need a Plus account ($20/month) to create a Custom GPT, though free users can still access it with rate limits.

Live setup connects to QuantStats and Technical Analysis MCP-FastAPI servers via FastAPI endpoints.

### 6.2. Flowise native UI

Flowise native UI is a solid option when GPT isn't feasible. No UI build required - you get a ready-to-use chat interface that supports complex agent flows using LangGraph Sequential Agents, custom tools, and APIs. n8n provides a similar native UI as well.

### 6.3. Excel integration - xlwings Lite

xlwings Lite (by [Felix Zumstein](https://www.linkedin.com/in/felix-zumstein/) creator of xlwings) is a lightweight Excel add-in that runs full Python workflows inside Excel - no local Python install needed. It comes with a built-in editor, console, environment variables, and deep Excel integration.

One of the big benefits is that it allows easy connectivity to any CORS-enabled backend API service. This fits well with my setup - since I use FastAPI servers extensively - and also makes it easy to connect to LLM/AI endpoints and third-party server providers.

### 6.4. Simple Forms

In many cases, a simple form works best. I typically use Jinja2 templates (HTML-JS-Tailwind) for tight FastAPI integration. Sometimes Flask. Works well for fairly complex UIs too, with full JavaScript access and server-side rendering - env vars and routes stay hidden.
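A minimal sketch of that FastAPI + Jinja2 pattern - `form.html` and the template directory are placeholders, and the `TemplateResponse(request, name, context)` call assumes a recent Starlette (older versions pass the request inside the context dict instead):

```python
def build_form_app(template_dir: str = "templates"):
    """Sketch: a server-rendered form UI on FastAPI with Jinja2 templates."""
    from fastapi import FastAPI, Request
    from fastapi.templating import Jinja2Templates

    app = FastAPI()
    templates = Jinja2Templates(directory=template_dir)

    @app.get("/")
    def form_page(request: Request):
        # server-side rendering: env vars and backend routes never reach the browser
        return templates.TemplateResponse(request, "form.html", {"title": "Report"})

    return app
```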

### 6.5. Going full stack? React / NextJS

For complex apps and a polished UI, I use React or NextJS. The REX-3 Co-Analyst app has the QuantStats Agent on a separate tab, connected to the same Flowise agent from earlier - just wrapped inside a React interface.

### 6.6. NextJS

The biggest benefit of NextJS: env vars and API routes stay private, and it supports server-side rendering. That's a major security benefit. The NextJS portfolio analysis agent uses a leaner UI, but it's still set up in a modular way - can be connected to any API-enabled agent backend by just changing the environment variable.

Note: Vercel serverless functions time out at 60 seconds (300s on pro plan), so longer API calls need workarounds.

## 7. DEPLOYMENTS

NextJS and React apps are deployed on Vercel. All MCP + FastAPI servers, along with n8n and Flowise, run on a Dockerized setup via Coolify, deployed on a Hetzner VPS.

## 8. AI CODER: CURSOR

I use Cursor as my AI coding assistant across all builds - including this one. Every part of this stack - from UIs to FastAPI servers - was written and iterated using Cursor.

**Fine print**: This is not investment research or financial advice. It's a live working example showing how to stitch together AI, analytics, and infra into real outputs. The logic and analysis structure is based on a general-use setup - fully modifiable to fit your own requirements. Source code, backend, and app stack are open and adaptable. AI and humans can both make mistakes - always validate results.
