This quickstart guide shows you how to model complex multiscale systems, such as financial markets, with no reliance on training data.
The CIL framework models directional changes as a continuous causal chain, providing a deterministic view of system evolution without the need for historical training data.
In this guide you will learn:
- Identification: How to pinpoint system changes at their exact point of inception within the microscale.
- Tracking: How to follow an initial causal signal’s evolution across multiple scales with zero information loss.
Prerequisites
1. Create a Virtual Environment
We recommend using a virtual environment to prevent dependency conflicts with other projects.
On Windows:
python -m venv venv
venv\Scripts\activate
On macOS or Linux:
python3 -m venv venv
source venv/bin/activate
2. Install the Package
Once your environment is active, install the sumtyme Python library.
The sumtyme package provides the underlying engine for detecting directional changes and analysing multiscale systems without requiring external training datasets.
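Assuming the package is published on PyPI under the same name used for the import in the examples below, the typical install command is:
pip install sumtyme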
3. Verify Installation
You can quickly verify that the package is ready for use by checking the version in your terminal:
python -c "import sumtyme; print(sumtyme.__version__)"
Gold Price Volatility Analysis (Oct 2025)
| Metric | Description |
|---|---|
| Market Context | Gold reached record highs followed by an 11% correction. |
| Asset Under Review | SPDR Gold Trust (GLD) |
| Analysis Period | October 20 to October 28, 2025 |
| Peak Price | $403.30 (Recorded Oct 20, 19:59) |
| Trough Price | $357.62 (Recorded Oct 28, 09:08) |
| Maximum Drawdown | 11.32% |
Phase 1: Pinpointing Microscale Inception
Detect the exact moment a change starts at the micro-level before it is visible in macro data.
import pandas as pd
# 1. Fetch data
gold_data = pd.read_csv('https://raw.githubusercontent.com/sumteam/data_store/main/GLD/api_outputs/GLD_1s_reactive_outputs.csv', parse_dates=['datetime'])
# 2. Date to start analysis
analysis_start_date = pd.to_datetime("2025-10-20 20:00:00")
# 3. Filter data
mask = gold_data['datetime'] >= analysis_start_date
filtered_df = gold_data.loc[mask].reset_index(drop=True)
print(f"Starting simulation for {len(filtered_df)} data points...")
# 4. Simulate the API calls
filtered_data = filtered_df.to_dict('records')
for current_tick in filtered_data:
    # Extracting variables
    timestamp = current_tick['datetime']
    price = current_tick['open']
    chain_detected = current_tick.get('chain_detected')
    if chain_detected == -1:
        print(f"--- Event Detected at {timestamp} ---")
        print(f"Price: ${price}")
        break
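The loop above stops at the first negative chain. If you would rather review every detection in the window, a minimal variation using only the same columns is sketched below:
# Collect every tick flagged with a negative chain instead of stopping at the first.
negative_events = [
    (tick['datetime'], tick['open'])
    for tick in filtered_data
    if tick.get('chain_detected') == -1
]
print(f"Detected {len(negative_events)} negative-chain ticks")
# Preview the first few detections
for ts, px in negative_events[:5]:
    print(f"{ts} @ ${px}")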
Phase 2: Mapping Multiscale Signal Propagation
Follow the signal as it moves across scales, evolving from a minor fluctuation into a significant trend.
import sumtyme
client = sumtyme.client(apikey='YOUR_API_KEY')
# 1. Define the data hierarchy (Granularity Scales)
# Each tuple contains the URL to a specific timeframe's CSV and its label.
# This setup allows the system to analyse how events cascade from 1-second ticks up to the 10-minute timeframe.
scales = [
("https://raw.githubusercontent.com/sumteam/data_store/main/GLD/api_outputs/GLD_1s_reactive_outputs.csv", '1s'),
("https://raw.githubusercontent.com/sumteam/data_store/main/GLD/api_outputs/GLD_5s_reactive_outputs.csv", '5s'),
("https://raw.githubusercontent.com/sumteam/data_store/main/GLD/api_outputs/GLD_15s_reactive_outputs.csv", '15s'),
("https://raw.githubusercontent.com/sumteam/data_store/main/GLD/api_outputs/GLD_30s_reactive_outputs.csv", '30s'),
("https://raw.githubusercontent.com/sumteam/data_store/main/GLD/api_outputs/GLD_1m_reactive_outputs.csv", '1m'),
("https://raw.githubusercontent.com/sumteam/data_store/main/GLD/api_outputs/GLD_2m_reactive_outputs.csv", '2m'),
("https://raw.githubusercontent.com/sumteam/data_store/main/GLD/api_outputs/GLD_5m_reactive_outputs.csv", '5m'),
("https://raw.githubusercontent.com/sumteam/data_store/main/GLD/api_outputs/GLD_10m_reactive_outputs.csv", '10m'),
]
# 2. Execute Causal Mapping
# initial_chain_starts: the specific datetimes at which each chain first started.
# causal_chain_details: A detailed breakdown of how the signal moved across different timeframes.
initial_chain_starts, causal_chain_details = client.map_causal_chains(scales)
# 3. Output results for review
print("--- Chain Inception Points ---")
print(initial_chain_starts[initial_chain_starts['propagation_id'] == 'Chain_2'])
print("\n--- Detailed Causal Path Analysis ---")
print(causal_chain_details[causal_chain_details['propagation_id'] == 'Chain_2'])
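Chain_2 is used here as the example of interest. To see which propagation IDs your run actually produced before drilling into one, you can list them first; this assumes the returned objects behave like pandas DataFrames, as the filtering syntax above suggests:
# List the distinct causal chains identified across the scales
print(initial_chain_starts['propagation_id'].unique())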
Result
The CIL framework successfully identified the structural breakdown of the Gold (GLD) market within seconds of its microscale inception, well before the trend became visible to traditional macro indicators.
| Metric | Details |
|---|---|
| Detection Status | Negative Chain Detected |
| Initial Detection Price | $402.31 |
| Detection Timestamp | 2025-10-20 20:02:02 |
| Price Drop Before Detection | $0.99 (from $403.30) |
| Time to Detection | 2 minutes, 48 seconds |
| Drawdown Saved | 99.85% |
The robustness of the CIL approach is evidenced by propagation ID Chain_2. The signal’s ability to propagate through every timeframe confirms it was a systemic shift rather than random noise:
- Micro-confirmation (1s – 30s): The signal survived the initial volatility phase, confirming a structural directional shift at the earliest possible stage.
- Macro-realisation (1m – 10m): The chain remained intact across all scales; by the time it reached the 10m scale on Oct 28, the market had realised the full 11% correction.
- Zero Information Loss: Each scale transition maintained the original -1 (negative) directionality, validating the deterministic nature of the causal chain (a quick programmatic check is sketched below).
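As a quick programmatic check of the zero-information-loss claim, you can confirm that every row of Chain_2 in causal_chain_details carries the same negative direction. This is a minimal sketch only: the column names 'timeframe' and 'direction' are hypothetical placeholders, so substitute whatever fields map_causal_chains actually returns.
# Minimal sketch: verify that Chain_2 keeps its -1 directionality at every scale.
# 'timeframe' and 'direction' are hypothetical column names -- adjust them to
# match the actual output of map_causal_chains.
chain_2 = causal_chain_details[causal_chain_details['propagation_id'] == 'Chain_2']
assert (chain_2['direction'] == -1).all(), "Directionality changed between scales"
print(chain_2[['timeframe', 'direction']])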