Get direct access to the complete prediction market database for advanced analytics and complex queries.
Getting Started
Already have credentials? Jump to Connection Methods. New to Probalytics? Create ClickHouse credentials in the ClickHouse Credentials section at app.probalytics.io. See the Quickstart for setup.
Connection Details
- **Host**: clickhouse.probalytics.io
- **Port**: 9440 (secure native TCP) or 8443 (secure HTTPS)
- **Database**: probalytics
- **Authentication**: username and password from your dashboard
All connections are secured with TLS encryption. Use the `--secure` flag with the CLI, or enable TLS/SSL in your client.
Quick Exploration
SQL Workspace (Browser)
The fastest way to explore and test queries. Visit app.probalytics.io/sql and start querying immediately—no setup required.
Large result sets can crash your browser. Keep queries focused and use LIMIT to restrict results. For bulk exports or large analytics, use a programmatic connection instead.
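A focused exploratory query of the kind that works well in the browser workspace might look like this (the columns shown are drawn from the key fields listed for the markets table below):

```sql
-- Browse a small sample of markets; LIMIT keeps the result set browser-friendly
SELECT platform, title, status
FROM markets
LIMIT 20;
```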
Connection Methods
For production applications and large-scale analysis, use one of these programmatic connections:
**Python (clickhouse-driver)**

Native ClickHouse driver for best performance.

Install: `pip install clickhouse-driver`

```python
from clickhouse_driver import Client

client = Client(
    host='clickhouse.probalytics.io',
    port=9440,
    user='YOUR_USERNAME',
    password='YOUR_PASSWORD',
    database='probalytics',
    secure=True
)

# Test connection
result = client.execute('SELECT count() FROM markets;')
print(f"Total markets: {result[0][0]}")
```
**Python (clickhouse-connect)**

HTTP-based client with simpler setup.

Install: `pip install clickhouse-connect`

```python
import clickhouse_connect

client = clickhouse_connect.get_client(
    host='clickhouse.probalytics.io',
    port=8443,
    username='YOUR_USERNAME',
    password='YOUR_PASSWORD',
    database='probalytics'
)

# Test connection
result = client.query('SELECT count() FROM markets;')
print(f"Total markets: {result.result_rows[0][0]}")
```
**Node.js (@clickhouse/client)**

Install: `npm install @clickhouse/client`

```javascript
import { createClient } from '@clickhouse/client';

const client = createClient({
  url: 'https://clickhouse.probalytics.io:8443',
  username: 'YOUR_USERNAME',
  password: 'YOUR_PASSWORD',
  database: 'probalytics'
});

// Test connection (alias count() so the JSON key is predictable)
const result = await client.query({
  query: 'SELECT count() AS count FROM markets;',
  format: 'JSONEachRow'
});
const data = await result.json();
console.log(`Total markets: ${data[0].count}`);
```
**Go (clickhouse-go)**

Install: `go get github.com/ClickHouse/clickhouse-go/v2`

```go
package main

import (
	"context"
	"crypto/tls"
	"fmt"
	"log"

	"github.com/ClickHouse/clickhouse-go/v2"
)

func main() {
	conn, err := clickhouse.Open(&clickhouse.Options{
		Addr: []string{"clickhouse.probalytics.io:9440"},
		Auth: clickhouse.Auth{
			Database: "probalytics",
			Username: "YOUR_USERNAME",
			Password: "YOUR_PASSWORD",
		},
		TLS: &tls.Config{},
		Settings: clickhouse.Settings{
			"max_execution_time": 300,
		},
	})
	if err != nil {
		log.Fatal(err)
	}

	// Test connection
	var count uint64
	if err := conn.QueryRow(context.Background(),
		"SELECT count() FROM markets;").Scan(&count); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("Total markets: %d\n", count)
}
```
**DBeaver**

Popular GUI tool for SQL queries and exploration.

1. Download DBeaver Community
2. Create a new connection → search "ClickHouse"
3. Configure:
   - **Server Host**: clickhouse.probalytics.io
   - **Port**: 8443
   - **Database**: probalytics
   - **Username**: from your dashboard
   - **Password**: from your dashboard
   - **SSL**: enable SSL/TLS in connection settings
4. Test the connection → browse and query tables
**ClickHouse CLI**

For quick exploration and ad-hoc queries. See the ClickHouse Docs for installation.

```shell
clickhouse client \
  --host clickhouse.probalytics.io \
  --secure \
  --port 9440 \
  --user YOUR_USERNAME \
  --password YOUR_PASSWORD \
  --database probalytics
```

Or run a query directly:

```shell
clickhouse client \
  --host clickhouse.probalytics.io \
  --secure \
  --port 9440 \
  --user YOUR_USERNAME \
  --password YOUR_PASSWORD \
  --database probalytics \
  --query "SELECT count() FROM markets;"
```
Exploring the Database
List All Tables
```sql
SHOW TABLES FROM probalytics;
```
Describe Table Structure
```sql
DESCRIBE TABLE probalytics.markets;
DESCRIBE TABLE probalytics.trades;
```
Test Your Connection
```sql
SELECT
    'markets' AS table_name,
    count() AS row_count
FROM markets
UNION ALL
SELECT
    'trades' AS table_name,
    count() AS row_count
FROM trades;
```
Understanding ReplacingMergeTree
Tables use the ReplacingMergeTree engine for efficient updates and deduplication.
How it works:

- New versions of the same row replace old versions
- Deduplication is based on the `indexed_at` column (highest value wins)
- Background merges consolidate duplicates asynchronously
- Queries may return duplicates until merges complete
The FINAL Keyword
Use FINAL to force immediate deduplication:
```sql
-- May include duplicates (fast)
SELECT count() FROM markets;

-- Guaranteed deduplicated (slower)
SELECT count() FROM markets FINAL;
```
FINAL triggers synchronous deduplication which is significantly slower. Use it only when you need guaranteed unique results. For most analytics, skip FINAL and rely on background merges.
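When you need deduplicated results but want to avoid the cost of FINAL, aggregating with argMax over the row key is a common ClickHouse pattern. A sketch, assuming markets is keyed by an `id` column and versioned by `indexed_at` as described above:

```sql
-- Latest status per market without FINAL:
-- argMax picks the value from the row with the highest indexed_at
SELECT
    id,
    argMax(status, indexed_at) AS latest_status
FROM markets
GROUP BY id;
```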
Access Tiers & Limits
Your tier determines available data, query limits, and resource quotas:
| Feature | Starter | Pro | Custom |
|---|---|---|---|
| Tables | markets, trades | All tables | All tables |
| Historical Data | Last 30 days | Full history | Full history |
| Queries/hour | 100 | 1,000 | Unlimited |
| Max Execution Time | 5 minutes | 1 hour | Unlimited |
| Max Memory | 4 GB | 32 GB | Unlimited |
| Max Result Rows | 100,000 | 1 billion | Unlimited |
Your tier is set when creating ClickHouse credentials in the dashboard. Upgrade your plan to access higher tiers.
Available Tables
markets
Primary table for prediction market metadata.
- **Available to**: All tiers
- **Records**: One per market across platforms
- **Key fields**: platform, title, description, outcomes, dates, status

See the Tables reference for the complete schema.
trades
Individual trade records on each market.
- **Available to**: All tiers
- **Records**: One per trade executed
- **Key fields**: market_id, price, volume, side, trader_id, timestamps
- **Note**: Partitioned monthly by created_at for performance

See the Tables reference for the complete schema.
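Because trades is partitioned monthly by `created_at`, filtering on that column lets ClickHouse skip entire partitions. A sketch of a partition-pruned aggregation (the date range is illustrative, and the `volume` column is taken from the key fields listed above):

```sql
-- Daily traded volume for one month; the created_at filter
-- restricts the scan to a single monthly partition
SELECT
    toDate(created_at) AS day,
    sum(volume) AS total_volume
FROM trades
WHERE created_at >= '2024-01-01'
  AND created_at <  '2024-02-01'
GROUP BY day
ORDER BY day;
```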
chain_events
Raw blockchain events (internal use).
- **Available to**: Pro and Custom tiers only
- **Records**: Low-level chain events from smart contracts
- **Use case**: Advanced on-chain analysis

Not recommended for typical market analysis.
Next Steps
- Tables Reference: complete field definitions and data types
- Common Queries: ready-to-run query examples
- SQL Tips: performance optimization strategies