DataLive2025

Forex Lab

Real-time trading bot platform on Deriv markets. Drag-and-drop visual bot builder (Blockly), live WebSocket charting, AI signal engine (RSI + Moving Averages + LLM analysis), risk management with profit/stop-loss targets.

React · TypeScript · Blockly · Deriv WebSocket API · Supabase · Lerna/NX monorepo · OpenAI/Claude APIs

What it is

Real-time trading bot platform on Deriv markets. Operators build strategies in a drag-and-drop visual editor (Blockly), run them against live WebSocket data, and let an AI signal engine (RSI + moving averages + LLM analysis) advise on trade entries. Live at app-forex-lab.pages.dev.

The problem

Off-the-shelf trading bots demand code or spreadsheet-formula DSLs that retail traders can't touch, and the few that ship a visual builder rarely combine it with live market data and risk-management rules in the same product.

What I built

The Blockly bot builder

Visual editor where strategies compose from blocks: indicators (RSI / MA), signals (cross / threshold), trade actions (buy / sell / close), and risk rules (profit target, stop-loss). The output is a runnable JS function evaluated against live ticks.
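The compiled output can be pictured as a pure function of indicator state and risk rules. This is an illustrative sketch, not the project's actual generated code; all type and field names here are assumptions:

```typescript
// Hypothetical shape of a strategy compiled from Blockly blocks.
type Signal = "buy" | "sell" | "hold" | "close";

interface StrategyContext {
  rsi: number;        // latest RSI(14) value
  maFast: number;     // fast moving average
  maSlow: number;     // slow moving average
  openProfit: number; // unrealised P/L of the open position
}

function exampleStrategy(ctx: StrategyContext): Signal {
  // Risk rules take priority over entry signals.
  if (ctx.openProfit >= 10) return "close"; // profit target hit
  if (ctx.openProfit <= -5) return "close"; // stop-loss hit

  // Entry signals: MA cross filtered by an RSI threshold.
  if (ctx.maFast > ctx.maSlow && ctx.rsi < 70) return "buy";
  if (ctx.maFast < ctx.maSlow && ctx.rsi > 30) return "sell";
  return "hold";
}
```

Because the strategy is just a function of a context object, the engine can call it once per incoming tick without the editor being loaded at all.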

Real-time WebSocket pipeline

Deriv's tick stream feeds an event bus that simultaneously updates charts, runs strategy evaluations, and persists state for backtesting. Each subscription is reference-counted so we don't open duplicate sockets per chart.
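The reference counting can be sketched as a small registry where only the first subscriber for a symbol opens an upstream stream and only the last one closes it. Class and method names are illustrative, and the real Deriv socket calls are stubbed out:

```typescript
type Listener = (tick: { symbol: string; quote: number }) => void;

class TickSubscriptions {
  private refs = new Map<string, { count: number; listeners: Set<Listener> }>();

  subscribe(symbol: string, listener: Listener): () => void {
    let entry = this.refs.get(symbol);
    if (!entry) {
      entry = { count: 0, listeners: new Set() };
      this.refs.set(symbol, entry);
      this.openSocketStream(symbol); // only the first subscriber opens a stream
    }
    const e = entry;
    e.count++;
    e.listeners.add(listener);

    // Unsubscribe handle: closes the upstream stream when the last ref drops.
    return () => {
      e.count--;
      e.listeners.delete(listener);
      if (e.count === 0) {
        this.refs.delete(symbol);
        this.closeSocketStream(symbol);
      }
    };
  }

  activeStreams(): number {
    return this.refs.size;
  }

  // Placeholders for the real WebSocket subscribe/forget calls.
  private openSocketStream(_symbol: string): void {}
  private closeSocketStream(_symbol: string): void {}
}
```

Two charts on the same symbol then share one upstream stream instead of opening two sockets.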

AI signal engine

Indicator outputs feed an LLM prompt (OpenAI / Claude) that summarises the current setup, surfaces risk, and explains the proposed action in plain English. Always advisory — humans pull the trigger.
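A minimal sketch of how indicator outputs might be folded into that prompt. The wording and field names are assumptions, not the shipped prompt:

```typescript
interface IndicatorSnapshot {
  symbol: string;
  rsi: number;
  maFast: number;
  maSlow: number;
  proposedAction: "buy" | "sell" | "hold";
}

// Builds an advisory-only prompt from the latest indicator snapshot.
function buildAdvisoryPrompt(s: IndicatorSnapshot): string {
  return [
    "You are a trading assistant. Advise only; never execute trades.",
    `Symbol: ${s.symbol}`,
    `RSI(14): ${s.rsi.toFixed(1)}`,
    `MA fast/slow: ${s.maFast.toFixed(4)} / ${s.maSlow.toFixed(4)}`,
    `Strategy proposes: ${s.proposedAction}`,
    "In plain English: summarise the setup, flag the main risk,",
    "and say whether the proposed action looks reasonable.",
  ].join("\n");
}
```

The same prompt string can go to either the OpenAI or Claude API, keeping the provider swappable.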

Monorepo

Lerna/NX monorepo splits the engine, UI, and shared types into packages so the engine can be tested headlessly and reused for backtesting jobs.
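An illustrative layout for that split (directory names are assumptions, not the repo's actual package names):

```
packages/
  engine/        # strategy evaluation + indicators, no DOM dependency
  ui/            # React app: charts, Blockly editor
  shared-types/  # Tick, Signal, Strategy interfaces used by both
```

Keeping the engine free of DOM dependencies is what makes headless testing and server-side backtest jobs possible.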

Engineering decisions

Why Blockly over a custom DSL

Blockly gives accessible drag-and-drop UX and a typed block schema with no invented syntax to maintain. A custom DSL would have looked impressive but added documentation and learning cost for users.

Why an advisory LLM, never autonomous

Ethically and legally, the bar for autonomous trade execution is high. The LLM explains and recommends; the user clicks. That's the right division of responsibility for retail.

Why Cloudflare Pages over Vercel

Pages' edge runtime gives lower latency for the WebSocket relay endpoint, which matters when ticks arrive at 5–10 per second per chart.

What I'd do differently

Move the strategy evaluation off the main thread sooner — Web Workers were retrofitted after I noticed UI jank under heavy tick loads. A v2 should also persist backtest results so users can compare strategy iterations without re-running.
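The worker retrofit boils down to keeping the per-tick evaluation as a pure, message-driven handler so it can run off the main thread. This sketch shows the message protocol with a toy momentum rule standing in for the compiled strategy; the actual `new Worker(...)` wiring is left as comments so the handler itself stays testable:

```typescript
type TickMsg = { kind: "tick"; quote: number; epoch: number };
type SignalMsg = { kind: "signal"; action: "buy" | "sell" | "hold" };

// Pure handler: what the worker runs for each posted tick.
// A real build would call the compiled Blockly strategy here.
function handleTick(msg: TickMsg, prevQuote: number): SignalMsg {
  const action =
    msg.quote > prevQuote ? "buy" : msg.quote < prevQuote ? "sell" : "hold";
  return { kind: "signal", action };
}

// Browser wiring (illustrative):
//   worker:  self.onmessage = (e) => postMessage(handleTick(e.data, last));
//   main:    worker.postMessage({ kind: "tick", quote, epoch });
//            worker.onmessage = (e) => renderSignal(e.data);
```

Because the handler never touches the DOM, heavy tick bursts queue up in the worker instead of janking the chart.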