Diffstat (limited to 'README.md')
 -rw-r--r--  README.md | 228
 1 file changed, 123 insertions(+), 105 deletions(-)
diff --git a/README.md b/README.md
index 0ff20d5..c1e020f 100644
--- a/README.md
+++ b/README.md
@@ -2,143 +2,161 @@
`transivent` is a Python library for detecting and analyzing transient events (spikes) in time-series data. It provides a flexible, configurable pipeline for processing waveform data, identifying events by signal-to-noise ratio, and visualizing the results. The library is designed to handle large files efficiently through chunked processing.
-## Quick Start
+Additionally, `transivent` includes tools for diffusion analysis of detected events, including Mean Square Displacement (MSD) calculation, autocorrelation analysis, and statistical visualization.
+
+**Brutalist Philosophy**: Simple, composable building blocks. Use `detect()` for custom data or `detect_from_wfm()` for Wfm files. All underlying components are available for advanced users. No unnecessary abstractions.
-The primary entrypoint for analysis is the `transivent.process_file` function. The analysis pipeline is controlled via a configuration dictionary, and the library provides functions to read waveform data from binary files with XML sidecars.
+## Quick Start
-Here is a brief example based on `example.py`:
+### For Custom Time-Series Data
```python
-from transivent import process_file, get_waveform_params
-
-# 1. Define the analysis configuration
-CONFIG = {
- "DATA_PATH": "path/to/your/data/",
- "SMOOTH_WIN_T": 10e-3,
- "DETECTION_SNR": 3,
- "MIN_EVENT_KEEP_SNR": 5,
- "SIGNAL_POLARITY": 1,
- "CHUNK_SIZE": 1_000_000, # Set to a value to enable chunking
- # ... and more
-}
-
-# 2. Define the measurement file
-measurement = {
- "data": "RefCurve_2025-07-17_0_065114.Wfm.bin",
-}
-
-# 3. Merge configs and get waveform parameters
-config = {**CONFIG, **measurement}
-params = get_waveform_params(
- config["data"], data_path=config["DATA_PATH"]
-)
+import numpy as np
+from transivent import detect
-# 4. Run the processing pipeline
-process_file(
- name=config["data"],
- data_path=config["DATA_PATH"],
- sampling_interval=params["sampling_interval"],
- smooth_win_t=config.get("SMOOTH_WIN_T"),
- detection_snr=config.get("DETECTION_SNR"),
- min_event_keep_snr=config.get("MIN_EVENT_KEEP_SNR"),
- signal_polarity=config.get("SIGNAL_POLARITY"),
- chunk_size=config.get("CHUNK_SIZE"),
- # ... other parameters
+# Load your data (CSV, NumPy, HDF5, etc.)
+t = np.linspace(0, 1, 100000) # Time in seconds
+x = np.random.randn(100000) * 0.1  # Signal (noise-only placeholder; substitute your measurements)
+
+# Detect events
+results = detect(
+ t, x,
+ name="My Data",
+ detection_snr=3.0,
+ signal_polarity=-1, # -1 for negative spikes, +1 for positive
)
-```
-## API Documentation
+# Access results
+events = results["events"]
+print(f"Found {len(events)} events")
+```
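Detection is threshold-based: samples whose deviation from a smoothed background exceeds `detection_snr` times the noise level are flagged. A minimal NumPy-only sketch of that idea (an illustration of the concept, not transivent's actual implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 10_000)
x = rng.normal(0.0, 0.1, t.size)
x[5_000:5_020] -= 2.0                      # inject one negative-going spike

# Smoothed background (moving average) and a robust noise estimate (MAD)
win = 101
bg = np.convolve(x, np.ones(win) / win, mode="same")
noise = 1.4826 * np.median(np.abs(x - np.median(x)))

# Flag samples deviating from the background by more than detection_snr * noise,
# in the direction selected by signal_polarity (-1 = negative spikes)
detection_snr, signal_polarity = 3.0, -1
mask = signal_polarity * (x - bg) > detection_snr * noise
print(f"{int(mask.sum())} samples above threshold")
```

The library additionally merges and widens contiguous crossings into events and applies a second `min_event_keep_snr` cut; see the docstrings for the exact behavior.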
-The public interface of `transivent` is defined in the main `__init__.py` package.
+### For Wfm Files (proprietary format with XML sidecar)
----
-### `analyze_thresholds`
```python
-def analyze_thresholds(x: np.ndarray, bg_clean: np.ndarray, global_noise: np.float32, detection_snr: float, min_event_keep_snr: float, signal_polarity: int) -> Tuple[np.ndarray, np.ndarray]
-```
-Analyze threshold statistics and create threshold arrays.
+from transivent import detect_from_wfm
-### `calculate_initial_background`
-```python
-def calculate_initial_background(t: np.ndarray, x: np.ndarray, smooth_n: int, filter_type: str = "gaussian") -> np.ndarray
-```
-Calculate initial background estimate.
+# Detect events in Wfm file
+results = detect_from_wfm(
+ name="data.Wfm.bin",
+ sampling_interval=5e-7, # seconds
+ data_path="/path/to/data/",
+ detection_snr=3.0,
+)
-### `calculate_smoothing_parameters`
-```python
-def calculate_smoothing_parameters(sampling_interval: float, smooth_win_t: Optional[float], smooth_win_f: Optional[float], min_event_t: float, detection_snr: float, min_event_keep_snr: float, widen_frac: float, signal_polarity: int) -> Tuple[int, int]
+events = results["events"]
+print(f"Found {len(events)} events")
```
-Calculate smoothing window size and minimum event length in samples.
-### `configure_logging`
-```python
-def configure_logging(log_level: str = "INFO") -> None
-```
-Configure loguru logging with specified level.
+## Advanced Usage
-### `create_oscilloscope_plot`
-```python
-def create_oscilloscope_plot(t: np.ndarray, x: np.ndarray, bg_initial: np.ndarray, bg_clean: np.ndarray, events: np.ndarray, detection_threshold: np.ndarray, keep_threshold: np.ndarray, name: str, detection_snr: float, min_event_keep_snr: float, max_plot_points: int, envelope_mode_limit: float, smooth_n: int, global_noise: Optional[np.float32] = None) -> OscilloscopePlot
-```
-Create oscilloscope plot with all visualization elements.
+For advanced workflows, individual building blocks are available in submodules. The `detect()` function internally uses these components:
-### `get_final_events`
```python
-def get_final_events(state: Dict[str, Any]) -> np.ndarray
-```
-Extract and finalise the list of detected events from the state.
+from transivent.analysis import (
+ calculate_initial_background,
+ calculate_clean_background,
+ detect_initial_events,
+ detect_final_events,
+ estimate_noise,
+)
-### `initialize_state`
-```python
-def initialize_state(config: Dict[str, Any]) -> Dict[str, Any]
+# Use these functions to build custom pipelines
```
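As an orientation for what the background step does, here is a NumPy-only sketch of a Gaussian-smoothed background estimate followed by subtraction (the kind of signal `calculate_initial_background` with `filter_type="gaussian"` produces; the library's exact windowing and parameters may differ):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.sin(np.linspace(0, 4 * np.pi, 5_000)) + rng.normal(0, 0.2, 5_000)

# Gaussian smoothing kernel: sigma in samples, truncated at +/- 3 sigma
sigma = 40
k = np.exp(-0.5 * (np.arange(-3 * sigma, 3 * sigma + 1) / sigma) ** 2)
k /= k.sum()

bg = np.convolve(x, k, mode="same")        # slowly varying background estimate
residual = x - bg                          # what event detection then thresholds
```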
-Initialise the state dictionary for processing.
-### `process_chunk`
-```python
-def process_chunk(data: Tuple[np.ndarray, np.ndarray], state: Dict[str, Any]) -> Dict[str, Any]
-```
-Process a single data chunk to find events.
+For diffusion analysis:
-### `process_file`
```python
-def process_file(name: str, sampling_interval: float, data_path: str, smooth_win_t: Optional[float] = None, smooth_win_f: Optional[float] = None, detection_snr: float = 3.0, min_event_keep_snr: float = 6.0, min_event_t: float = 0.75e-6, widen_frac: float = 10.0, signal_polarity: int = -1, max_plot_points: int = 10000, envelope_mode_limit: float = 10e-3, sidecar: Optional[str] = None, crop: Optional[List[int]] = None, yscale_mode: str = "snr", show_plots: bool = True, filter_type: str = "gaussian", filter_order: int = 2, chunk_size: Optional[int] = None) -> None
-```
-Process a single waveform file for event detection.
+from transivent.diffusion import (
+ extract_event_waveforms,
+ calculate_msd_parallel,
+ calculate_acf,
+ fit_diffusion_linear,
+)
-### `EventPlotter`
-```python
-class EventPlotter:
- def __init__(self, osc_plot: OscilloscopePlot, events: Optional[np.ndarray] = None, trace_idx: int = 0, bg_clean: Optional[np.ndarray] = None, global_noise: Optional[np.float32] = None, y_scale_mode: str = "raw")
-```
-Provides utility functions for plotting individual events or event grids.
+# After detecting events with detect()...
+results = detect(t, x)
+events = results["events"]
+bg_clean = results["bg_clean"]
-### `detect_events`
-```python
-def detect_events(time: np.ndarray, signal: np.ndarray, bg: np.ndarray, snr_threshold: np.float32 = np.float32(2.0), min_event_len: int = 20, min_event_amp: np.float32 = np.float32(0.0), widen_frac: np.float32 = np.float32(0.5), global_noise: Optional[np.float32] = None, signal_polarity: int = -1) -> Tuple[np.ndarray, np.float32]
+waveforms = extract_event_waveforms(t, x, events, bg_clean=bg_clean)
+# ... continue with diffusion analysis
```
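The MSD itself is a short computation; a NumPy-only sketch of the definition (`calculate_msd_parallel` presumably vectorizes and parallelizes this over many event waveforms, which is not shown here):

```python
import numpy as np

def msd(x, max_lag):
    """MSD(tau) = mean over t of (x[t + tau] - x[t])**2."""
    return np.array([np.mean((x[lag:] - x[:-lag]) ** 2)
                     for lag in range(1, max_lag + 1)])

# Sanity check on pure drift: x = v * t gives MSD(tau) = (v * tau)**2,
# whereas free diffusion would instead grow linearly, MSD(tau) ~ 2 * D * tau
v, dt = 2.0, 0.01
x = v * dt * np.arange(1_000)
m = msd(x, max_lag=10)
```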
-Detect events in signal above background with specified thresholds.
-### `merge_overlapping_events`
-```python
-def merge_overlapping_events(events: np.ndarray) -> np.ndarray
-```
-Merge overlapping events.
+## Public API
+
+The main entry points are `detect()` and `detect_from_wfm()`. For full API documentation, consult the docstrings in the source code or the type hints in `src/transivent/__init__.py`.
+
+Building blocks and advanced functions are available in submodules:
+- `transivent.analysis` - Background estimation, noise analysis, event detection
+- `transivent.event_detector` - Low-level detection algorithms
+- `transivent.diffusion` - Diffusion analysis tools (optional)
+- `transivent.io` - File I/O utilities
+
+## Examples
+
+The repository includes several example scripts demonstrating different use cases:
+
+- **`example_quick_start.py`** - Quick introduction to both `detect()` and `detect_from_wfm()`
+- **`example.py`** - Complete workflow for processing Wfm files with XML sidecars
+- **`example_custom_data.py`** - Demonstrates using transivent with any time-series data (CSV, NumPy arrays, etc.)
+- **`example_diffusion.py`** - Complete diffusion analysis workflow with visualization
+
+Each example is self-contained and can be run directly to see the library in action.
+
+## Migration from v1.0.0
+
+If you're upgrading from v1.0.0, here's how to update your code:
-### `get_waveform_params`
+### v1.0.0 → v2.0.0 Mapping
+
+**Old:**
```python
-def get_waveform_params(bin_filename: str, data_path: Optional[str] = None, sidecar: Optional[str] = None) -> Dict[str, Any]
+from transivent import process_file
+
+process_file(
+ name="data.Wfm.bin",
+ sampling_interval=5e-7,
+ data_path="/path/to/data/",
+ detection_snr=3.0,
+ show_plots=True,
+)
```
-Parse XML sidecar file to extract waveform parameters.
-### `rd`
+**New:**
```python
-def rd(filename: str, sampling_interval: Optional[float] = None, data_path: Optional[str] = None, sidecar: Optional[str] = None, crop: Optional[List[int]] = None) -> Tuple[np.ndarray, np.ndarray]
+from transivent import detect_from_wfm
+
+results = detect_from_wfm(
+ name="data.Wfm.bin",
+ sampling_interval=5e-7,
+ data_path="/path/to/data/",
+ detection_snr=3.0,
+ save_plots=True,
+)
+
+events = results["events"] # Now you get results back!
```
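If you need to keep old v1.0.0 call sites alive during a gradual migration, a thin adapter can translate the renamed keyword (`show_plots` → `save_plots`, the one rename visible above) and forward everything else. This shim is a hypothetical convenience, not part of transivent; the callable is injected so the sketch stays self-contained:

```python
def make_process_file_shim(detect_fn):
    """Wrap a v2-style detect_from_wfm-like callable behind the v1 process_file name."""
    renamed = {"show_plots": "save_plots"}  # v1 kwarg -> v2 kwarg

    def process_file(**kwargs):
        new_kwargs = {renamed.get(k, k): v for k, v in kwargs.items()}
        return detect_fn(**new_kwargs)      # v2 returns a results dict; v1 returned None

    return process_file

# With the real library you would pass detect_from_wfm; a stub shows the mapping:
captured = {}
shim = make_process_file_shim(lambda **kw: captured.update(kw) or kw)
shim(name="data.Wfm.bin", show_plots=True)
```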
-Read waveform binary file using sidecar XML for parameters.
-### `rd_chunked`
+### Key Changes
+
+| v1.0.0 | v2.0.0 | Notes |
+|--------|--------|-------|
+| `process_file()` | `detect_from_wfm()` | For Wfm files only |
+| No custom data support | `detect()` | New entry point for any data |
+| No return value (void) | Returns dict | All results in one place |
+| 37 public functions | 8 public functions | Much simpler API |
+| Plot on disk only | Plot in memory + disk | Access via `results["plot"]` |
+| Internal functions exposed | Submodules for building blocks | `transivent.analysis`, `transivent.diffusion` |
+
+### Building Blocks
+
+If you were using internal functions like `calculate_initial_background`, they're still available but now in submodules:
+
```python
-def rd_chunked(filename: str, chunk_size: int, sampling_interval: Optional[float] = None, data_path: Optional[str] = None, sidecar: Optional[str] = None) -> Generator[Tuple[np.ndarray, np.ndarray], None, None]
+# Old (no longer in main API)
+from transivent import calculate_initial_background
+
+# New
+from transivent.analysis import calculate_initial_background
```
-Read waveform binary file in chunks using sidecar XML for parameters. This is a generator function that yields chunks of data.