
Probalytics

Clean, unified prediction market data. One API. Multiple access methods. Aggregate data from Polymarket, Kalshi, and more into a single normalized dataset. Choose how you access it: REST API, raw SQL, or bulk exports.

REST API

Query via HTTP with auth

SQL (ClickHouse)

Direct database connection

File Downloads

CSV, JSON, Parquet exports

The Problem

Prediction market data is fragmented across exchanges with different APIs, formats, and schemas. Polymarket needs blockchain indexing. Kalshi has its own REST API. Building cross-platform analysis means maintaining multiple integrations and handling data inconsistencies.
What we solved:
  • ✓ Unified API across exchanges
  • ✓ Normalized data schema
  • ✓ Historical data back to platform launch
  • ✓ Continuous updates
  • ✓ Multiple access methods for your workflow

Data Available

Two core datasets, continuously updated:

Markets

Every prediction market question: title, outcomes, category, status, open/close dates, resolution data, current prices

Trades

Every executed order: price, size, side, timestamp, maker/taker status
Historical records go back to each platform’s launch. Data refreshes every 5 minutes.
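A normalized trade record from the unified schema might look like the sketch below. The field names here are illustrative assumptions, not the actual column names; see the SQL Guide for the real schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Trade:
    """Hypothetical sketch of one normalized trade record."""
    exchange: str      # e.g. "polymarket" or "kalshi"
    market_id: str     # exchange-native market identifier
    price: float       # execution price in 0.0-1.0 (implied probability)
    size: float        # contracts traded
    side: str          # "buy" or "sell"
    is_maker: bool     # maker/taker status
    ts: datetime       # execution timestamp (UTC)

trade = Trade(
    exchange="kalshi",
    market_id="EXAMPLE-MARKET",
    price=0.62,
    size=100.0,
    side="buy",
    is_maker=False,
    ts=datetime(2024, 1, 15, 12, 0, tzinfo=timezone.utc),
)
```

Whatever the real column names turn out to be, the point of the normalization is that this one shape holds across every supported exchange.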

Supported Exchanges

| Exchange   | Type                 | Coverage                       | Status      |
|------------|----------------------|--------------------------------|-------------|
| Polymarket | Blockchain (Polygon) | Markets, trades, on-chain data | Live ✓      |
| Kalshi     | US-regulated         | Markets, trades                | Live ✓      |
| PredictIt  | US-based             | Markets, trades                | Coming soon |
| Metaculus  | Forecasting platform | Questions, predictions         | Coming soon |

Access Methods

Choose what fits your workflow:

REST API

Query via HTTP with simple authentication. Best for: real-time data, mobile apps, webhooks, quick integrations.
  • Authentication: API key in header
  • Rate limited: see docs
  • Response format: JSON
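A request to the REST API can be assembled like this. The base URL, the `/v1/markets` path, the query parameters, and the `X-API-Key` header name are all assumptions for illustration; confirm the real names in the API Reference.

```python
import urllib.parse

# Placeholder base URL -- substitute the real one from the docs.
BASE_URL = "https://api.probalytics.example"

def build_markets_request(api_key: str, status: str = "open", limit: int = 50):
    """Build the URL and headers for a markets query.

    Endpoint path, parameter names, and header name are illustrative
    assumptions, not the documented API surface.
    """
    params = urllib.parse.urlencode({"status": status, "limit": limit})
    url = f"{BASE_URL}/v1/markets?{params}"
    headers = {"X-API-Key": api_key, "Accept": "application/json"}
    return url, headers

url, headers = build_markets_request("YOUR_API_KEY")
# Hand url/headers to any HTTP client, e.g. requests.get(url, headers=headers)
```

Because responses are JSON, any HTTP client works; the key-in-header pattern keeps credentials out of the URL and out of server logs.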

SQL (ClickHouse)

Direct database connection. Best for: data analysis, batch operations, complex queries, dashboards.
  • Connect from: Python, Node.js, Go, DBeaver, etc.
  • Full SQL support: aggregations, joins, window functions
  • Performance: optimized for analytics
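A typical analytics query over the direct ClickHouse connection might look like this. The `trades` table and its column names are assumptions for illustration; the SQL Guide documents the real schema.

```python
# Daily volume per exchange over the last 30 days -- table and column
# names are assumed, not taken from the actual schema.
DAILY_VOLUME_QUERY = """
SELECT
    exchange,
    toDate(ts) AS day,
    count()    AS n_trades,
    sum(size)  AS volume
FROM trades
WHERE ts >= now() - INTERVAL 30 DAY
GROUP BY exchange, day
ORDER BY day DESC, exchange
"""

# From Python, one option is the clickhouse-connect driver:
#   import clickhouse_connect
#   client = clickhouse_connect.get_client(host=..., username=..., password=...)
#   rows = client.query(DAILY_VOLUME_QUERY).result_rows
```

`toDate` and `INTERVAL` arithmetic are standard ClickHouse SQL, so the same query works from DBeaver or any other client that speaks the ClickHouse protocol.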

File Downloads

Bulk exports. Best for: local analysis, research, backups, data science pipelines.
  • Formats: CSV, JSON, Parquet
  • Frequency: daily/weekly
  • Size: check quickstart
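Once downloaded, the exports are ordinary flat files. The snippet below parses an in-memory stand-in for a CSV trades export with only the standard library; the column names are illustrative assumptions, not the actual export layout.

```python
import csv
import io

# Two made-up rows standing in for a downloaded trades CSV.
sample_export = io.StringIO(
    "exchange,market_id,price,size,side\n"
    "polymarket,mkt-123,0.41,250,buy\n"
    "kalshi,EXAMPLE,0.62,100,sell\n"
)

rows = list(csv.DictReader(sample_export))
total_size = sum(float(r["size"]) for r in rows)
print(total_size)  # 350.0
```

For the Parquet exports, a columnar reader such as `pyarrow.parquet.read_table` or `pandas.read_parquet` is the usual choice, since Parquet preserves types and compresses far better than CSV.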

Use Cases

Trading

Build bots, alerts, dashboards. Track prices across platforms. Detect opportunities.

Research

Market efficiency analysis. Forecast accuracy studies. Information aggregation patterns.

Arbitrage

Find price spreads. Match markets across exchanges. Identify inefficiencies.
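With markets matched across exchanges, spread detection reduces to comparing quotes. A minimal sketch, using made-up prices and ignoring fees and order-book depth (both of which matter in practice):

```python
def best_spread(prices: dict[str, float]) -> tuple[str, str, float]:
    """Given YES prices for the same market on several exchanges,
    return (buy_on, sell_on, spread): buy where cheapest, sell where
    dearest. Fees, slippage, and depth are deliberately ignored here."""
    buy_on = min(prices, key=prices.get)
    sell_on = max(prices, key=prices.get)
    return buy_on, sell_on, prices[sell_on] - prices[buy_on]

# Hypothetical quotes for one matched market:
buy, sell, spread = best_spread({"polymarket": 0.58, "kalshi": 0.63})
print(buy, sell, round(spread, 2))  # polymarket kalshi 0.05
```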

Backtesting

Test strategies against historical data. Validate models. Performance analysis.
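A toy backtest over resolved historical markets can be this small. The rule, threshold, and data below are invented for illustration; a real backtest would pull entry prices from the trades dataset and outcomes from market resolution data.

```python
def backtest(history, threshold=0.40):
    """history: list of (entry_price, resolved_yes) pairs.

    Buy 1 YES contract whenever entry_price < threshold; a YES
    resolution pays 1.0, a NO resolution pays 0.0. Returns total P&L."""
    pnl = 0.0
    for price, resolved_yes in history:
        if price < threshold:
            pnl += (1.0 if resolved_yes else 0.0) - price
    return pnl

# Made-up sample: third trade is skipped because 0.55 >= threshold.
sample = [(0.35, True), (0.30, False), (0.55, True)]
print(round(backtest(sample), 2))  # 0.35
```

The same loop generalizes to position sizing and fee models; the historical depth of the datasets (back to each platform's launch) is what makes such replays possible.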

Next Steps

Ready to start? Jump to Quickstart to choose your access method and get working code in 5 minutes.

Quickstart

Get working code in 5 minutes

API Reference

Complete endpoint docs

SQL Guide

Tables, schemas, queries

Tutorials

Real examples: arbitrage, analysis, pipelines

Need Help?