LangAlpha reframes AI-assisted investing around a core insight: real investment work is iterative and persistent, not one-shot question-and-answer.
Most AI finance tools today take a question, deliver an answer, and close the loop. LangAlpha instead treats investment research like a software codebase—persistent, cumulative, and built to evolve as new data arrives.
**The Problem at Scale**
Financial data creates a specific technical challenge. MCP tools designed for market data don't scale to typical investor workflows. A single tool call requesting five years of daily prices dumps tens of thousands of tokens into the context window. Data vendors package dozens of tools into single MCP servers, with schemas alone consuming 50,000+ tokens before any analysis begins.
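A back-of-envelope estimate makes the scale concrete. The per-row token figure below is an assumption (roughly what a JSON OHLCV record costs under common tokenizers), not a measured value from LangAlpha:

```python
# Rough token cost of returning daily price history as raw JSON.
# TOKENS_PER_ROW is a hypothetical average for one OHLCV record
# (date, open, high, low, close, volume) under a typical tokenizer.
TRADING_DAYS_PER_YEAR = 252
TOKENS_PER_ROW = 30

def price_history_tokens(years: int) -> int:
    """Estimate tokens consumed by one ticker's daily-price payload."""
    return years * TRADING_DAYS_PER_YEAR * TOKENS_PER_ROW

print(price_history_tokens(5))  # ~37,800 tokens for a single tool call
```

Even under conservative assumptions, one naive price-history call lands in the tens-of-thousands-of-tokens range, before a single line of analysis.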
**The Solution: Sandbox-First Tool Architecture**
LangAlpha auto-generates typed Python modules directly from MCP server schemas at workspace initialization. The agent imports them as normal Python libraries rather than calling them as tools. Only a one-line summary per server remains in the prompt.
With approximately 80 tools distributed across financial data servers, the prompt cost remains constant regardless of whether a server exposes 3 or 30 tools. The actual data processing happens in a sandboxed Python environment, not the LLM context window.
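The generation step can be sketched as follows. Everything here is illustrative: the `ToolSpec` shape, the `make_tool` helper, and the transport function are assumptions about what schema-to-module generation might look like, not LangAlpha's actual generated API:

```python
"""Sketch: turn an MCP tool schema entry into a plain Python function
that the agent imports like any library. Names are hypothetical."""
from dataclasses import dataclass
from typing import Callable

@dataclass
class ToolSpec:
    server: str   # MCP server the tool lives on
    name: str     # tool name from the schema
    doc: str      # docstring carried over from the schema description

def make_tool(spec: ToolSpec, transport: Callable[..., dict]) -> Callable[..., dict]:
    """Wrap one MCP tool as an importable function; the heavy payload
    stays in the sandbox instead of the LLM context window."""
    def tool(**kwargs) -> dict:
        return transport(spec.server, spec.name, kwargs)
    tool.__name__ = spec.name
    tool.__doc__ = spec.doc
    return tool

# Stand-in transport so the sketch runs without a live MCP server.
fake_transport = lambda server, name, args: {"server": server, "tool": name, "args": args}

get_daily_prices = make_tool(
    ToolSpec("market-data", "get_daily_prices", "Daily OHLCV for a ticker."),
    fake_transport,
)
print(get_daily_prices(ticker="NVDA", years=5))
```

Because only the one-line server summary reaches the prompt, adding tools to a server changes the generated module, not the context cost.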
**Persistent Workspaces**
Each workspace creates a dedicated sandbox environment with a persistent filesystem and an agent.md memory file. This compounds research across sessions. Users create workspaces organized by research goal—"Q2 rebalance," "data center demand deep dive," "energy sector rotation"—and return to find prior analysis, threads, and accumulated context intact.
The agent interviews users about their goals and style, produces an initial deliverable, and saves everything to the workspace. Tomorrow's analysis builds on yesterday's work without rebuilding context from scratch.
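The memory mechanism can be sketched as a dated append to the workspace's agent.md. The file layout and entry format below are assumptions for illustration, not LangAlpha's actual on-disk schema:

```python
# Sketch: persist a session note into the workspace memory file so the
# next session starts with prior context. Entry format is hypothetical.
import tempfile
from datetime import date
from pathlib import Path

def append_memory(workspace: Path, note: str) -> None:
    """Append a dated note to the workspace's agent.md memory file."""
    memo = workspace / "agent.md"
    memo.touch(exist_ok=True)
    with memo.open("a") as f:
        f.write(f"\n## {date.today().isoformat()}\n{note}\n")

ws = Path(tempfile.mkdtemp())
append_memory(ws, "User prefers DCF with a 10y horizon; focus on data-center capex.")
print((ws / "agent.md").read_text())
```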
**Feature Set**
LangAlpha includes progressive tool discovery (tools loaded as summaries, full documentation available on-demand in the workspace), programmatic tool calling for complex multi-step analysis, and skills for pre-built financial workflows—DCF models, earnings analysis, coverage-initiation reports, morning notes, and document generation.
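The progressive-discovery idea can be sketched as a two-tier catalog: one-line summaries go into the prompt, full docs stay in the workspace until requested. The catalog structure and server names are illustrative:

```python
# Sketch of progressive tool discovery. CATALOG contents are made up;
# the pattern is summaries-in-prompt, full docs fetched on demand.
CATALOG = {
    "market-data": {
        "summary": "Prices, fundamentals, and estimates.",
        "docs_path": "tools/market-data.md",  # full docs live in the workspace
    },
    "news": {
        "summary": "Headline search and sentiment.",
        "docs_path": "tools/news.md",
    },
}

def prompt_summaries(catalog: dict) -> str:
    """One line per server in the prompt — cost is flat in tool count."""
    return "\n".join(f"- {name}: {meta['summary']}" for name, meta in catalog.items())

print(prompt_summaries(CATALOG))
```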
The web interface provides inline financial charting, TradingView integration, real-time WebSocket market data, shareable conversations, and subagent monitoring. Users can set price-triggered automations, schedule recurring or one-shot tasks, and control the agent through Slack or Discord with full feature support.
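A price-triggered automation reduces to a threshold check that fires a saved action. The field names and the check-loop shape below are assumptions for illustration:

```python
# Sketch of a price-triggered automation: when the last trade crosses a
# threshold, run the stored action (e.g. kick off a saved analysis task).
from dataclasses import dataclass
from typing import Callable

@dataclass
class PriceTrigger:
    ticker: str
    threshold: float
    direction: str  # "above" or "below"
    action: Callable[[], None]

    def check(self, last_price: float) -> bool:
        hit = (last_price >= self.threshold if self.direction == "above"
               else last_price <= self.threshold)
        if hit:
            self.action()
        return hit

fired: list[str] = []
trig = PriceTrigger("NVDA", 120.0, "below", lambda: fired.append("rerun DCF"))
trig.check(125.0)  # above threshold: nothing happens
trig.check(118.5)  # crossed below: action fires
print(fired)
```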
**Infrastructure**
The backend runs on FastAPI with PostgreSQL (dual-pool setup for app data and LangGraph checkpoints), Redis for SSE event buffering and mid-workflow steering, and Daytona cloud sandboxes for execution. The system supports live steering—sending follow-up messages while agents work to redirect or clarify—and parallel async subagents with isolated context windows.
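The steering pattern can be sketched independently of the transport: a worker drains a steering channel between workflow steps and adjusts course. An in-memory queue stands in for Redis here, and the channel shape is an assumption:

```python
# Sketch of mid-workflow steering. The text says Redis buffers these
# messages; a stdlib queue stands in so the sketch runs standalone.
import queue

steering: "queue.Queue[str]" = queue.Queue()

def run_step(step: str) -> list[str]:
    """Execute one workflow step, applying any queued steering first."""
    redirects = []
    while not steering.empty():
        redirects.append(steering.get_nowait())
    # ... perform `step`, adjusted by `redirects` ...
    return redirects

steering.put("focus on gross margin, skip the macro section")
print(run_step("earnings-analysis"))
```

The key property is that steering is non-blocking for the sender: the user's follow-up lands in the buffer and takes effect at the next step boundary.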
LangAlpha uses a provider-agnostic LLM abstraction layer with automatic failover, meaning the same middleware stack and workflows function across multiple LLM backends.
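Automatic failover typically amounts to trying providers in priority order and falling through on error. The provider interface below (a prompt-in, string-out callable) is a simplification for illustration, not LangAlpha's actual abstraction layer:

```python
# Sketch of provider-agnostic failover: same call site, ordered backends.
from typing import Callable

def with_failover(providers: list) -> Callable[[str], str]:
    """Return a completion function that falls through on backend failure."""
    def complete(prompt: str) -> str:
        last_err = None
        for provider in providers:
            try:
                return provider(prompt)
            except Exception as err:  # any backend failure: try the next one
                last_err = err
        raise RuntimeError("all providers failed") from last_err
    return complete

def flaky(prompt: str) -> str:
    raise TimeoutError("backend unavailable")

def stable(prompt: str) -> str:
    return f"ok: {prompt}"

llm = with_failover([flaky, stable])
print(llm("summarize Q2"))  # first provider times out, second answers
```

Because workflows call `complete` rather than a vendor SDK, the same middleware stack runs unchanged when the backend list changes.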
The project was submitted to the Gemini 3 Hackathon, with development continuing beyond the original submission. Source: github.com/ginlix-ai/langalpha