# transivent
`transivent` is a Python library for detecting and analysing transient, spike-like events in time-series data. It provides a flexible, configurable pipeline for processing waveform data, identifying events by signal-to-noise ratio, and visualising the results. The library is designed to handle large files efficiently through chunked processing.
## Quick Start
The primary entrypoint for analysis is the `transivent.process_file` function. The analysis pipeline is controlled via a configuration dictionary, and the library provides functions to read waveform data from binary files with XML sidecars.
Here is a brief example based on `example.py`:
```python
from transivent import process_file, get_waveform_params

# 1. Define the analysis configuration
CONFIG = {
    "DATA_PATH": "path/to/your/data/",
    "SMOOTH_WIN_T": 10e-3,
    "DETECTION_SNR": 3,
    "MIN_EVENT_KEEP_SNR": 5,
    "SIGNAL_POLARITY": 1,
    "CHUNK_SIZE": 1_000_000,  # Set to a value to enable chunking
    # ... and more
}

# 2. Define the measurement file
measurement = {
    "data": "RefCurve_2025-07-17_0_065114.Wfm.bin",
}

# 3. Merge configs and get waveform parameters
config = {**CONFIG, **measurement}
params = get_waveform_params(
    config["data"], data_path=config["DATA_PATH"]
)

# 4. Run the processing pipeline
process_file(
    name=config["data"],
    data_path=config["DATA_PATH"],
    sampling_interval=params["sampling_interval"],
    smooth_win_t=config.get("SMOOTH_WIN_T"),
    detection_snr=config.get("DETECTION_SNR"),
    min_event_keep_snr=config.get("MIN_EVENT_KEEP_SNR"),
    signal_polarity=config.get("SIGNAL_POLARITY"),
    chunk_size=config.get("CHUNK_SIZE"),
    # ... other parameters
)
```
## API Documentation
The public interface of `transivent` is defined in the package's top-level `__init__.py`.
---
### `analyze_thresholds`
```python
def analyze_thresholds(x: np.ndarray, bg_clean: np.ndarray, global_noise: np.float32, detection_snr: float, min_event_keep_snr: float, signal_polarity: int) -> Tuple[np.ndarray, np.ndarray]
```
Analyze threshold statistics and create threshold arrays.
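A common way to build such threshold arrays is to offset the cleaned background by a noise multiple in the direction of the signal polarity. The sketch below assumes that additive form and is only illustrative; the real `analyze_thresholds` may gather additional statistics or differ in detail:

```python
import numpy as np

def build_thresholds(bg_clean, global_noise, detection_snr,
                     min_event_keep_snr, signal_polarity):
    # Offset the background by k * noise in the polarity direction:
    # one array for detecting candidates, a stricter one for keeping them.
    detection = bg_clean + signal_polarity * detection_snr * global_noise
    keep = bg_clean + signal_polarity * min_event_keep_snr * global_noise
    return detection, keep

# With a flat zero background, 0.1 noise, and positive polarity, the
# detection threshold sits at 0.3 and the keep threshold at 0.5.
det, keep = build_thresholds(np.zeros(4), 0.1,
                             detection_snr=3.0,
                             min_event_keep_snr=5.0,
                             signal_polarity=1)
```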
### `calculate_initial_background`
```python
def calculate_initial_background(t: np.ndarray, x: np.ndarray, smooth_n: int, filter_type: str = "gaussian") -> np.ndarray
```
Calculate initial background estimate.
### `calculate_smoothing_parameters`
```python
def calculate_smoothing_parameters(sampling_interval: float, smooth_win_t: Optional[float], smooth_win_f: Optional[float], min_event_t: float, detection_snr: float, min_event_keep_snr: float, widen_frac: float, signal_polarity: int) -> Tuple[int, int]
```
Calculate smoothing window size and minimum event length in samples.
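The central conversion here is from a time-domain window (`smooth_win_t`) to a sample count via the sampling interval. A minimal sketch of that step, using a hypothetical `window_samples` helper (the real function also derives the minimum event length and handles the frequency-domain variant `smooth_win_f`):

```python
def window_samples(win_t: float, sampling_interval: float) -> int:
    """Convert a time-domain window length to an odd sample count.

    Hypothetical helper: many smoothing filters expect an odd window
    so that it is centred on each sample.
    """
    n = max(1, round(win_t / sampling_interval))
    return n if n % 2 == 1 else n + 1

# A 10 ms window sampled at 1 us intervals spans 10_000 samples,
# rounded up to the next odd number.
print(window_samples(10e-3, 1e-6))  # → 10001
```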
### `configure_logging`
```python
def configure_logging(log_level: str = "INFO") -> None
```
Configure loguru logging with specified level.
### `create_oscilloscope_plot`
```python
def create_oscilloscope_plot(t: np.ndarray, x: np.ndarray, bg_initial: np.ndarray, bg_clean: np.ndarray, events: np.ndarray, detection_threshold: np.ndarray, keep_threshold: np.ndarray, name: str, detection_snr: float, min_event_keep_snr: float, max_plot_points: int, envelope_mode_limit: float, smooth_n: int, global_noise: Optional[np.float32] = None) -> OscilloscopePlot
```
Create oscilloscope plot with all visualization elements.
### `get_final_events`
```python
def get_final_events(state: Dict[str, Any]) -> np.ndarray
```
Extract and finalise the list of detected events from the state.
### `initialize_state`
```python
def initialize_state(config: Dict[str, Any]) -> Dict[str, Any]
```
Initialise the state dictionary for processing.
### `process_chunk`
```python
def process_chunk(data: Tuple[np.ndarray, np.ndarray], state: Dict[str, Any]) -> Dict[str, Any]
```
Process a single data chunk to find events.
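Together, `initialize_state`, `process_chunk`, and `get_final_events` form a fold over chunks: build a state dict, feed each chunk through, then extract the result. The toy stand-ins below (`init_state`, `fold_chunk`, `finalize` are hypothetical names) only count samples and track maxima, but they show the shape of the pattern:

```python
def init_state(config):
    # Accumulator that survives across chunks.
    return {"n_samples": 0, "chunk_maxima": []}

def fold_chunk(data, state):
    # Each chunk is a (time, samples) pair; update the running state.
    t, x = data
    state["n_samples"] += len(x)
    state["chunk_maxima"].append(max(x))
    return state

def finalize(state):
    # Reduce the accumulated per-chunk results to a final answer.
    return max(state["chunk_maxima"])

state = init_state({})
for chunk in [([0, 1], [0.1, 0.5]), ([2, 3], [0.9, 0.2])]:
    state = fold_chunk(chunk, state)
print(state["n_samples"], finalize(state))  # → 4 0.9
```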
### `process_file`
```python
def process_file(name: str, sampling_interval: float, data_path: str, smooth_win_t: Optional[float] = None, smooth_win_f: Optional[float] = None, detection_snr: float = 3.0, min_event_keep_snr: float = 6.0, min_event_t: float = 0.75e-6, widen_frac: float = 10.0, signal_polarity: int = -1, max_plot_points: int = 10000, envelope_mode_limit: float = 10e-3, sidecar: Optional[str] = None, crop: Optional[List[int]] = None, yscale_mode: str = "snr", show_plots: bool = True, filter_type: str = "gaussian", filter_order: int = 2, chunk_size: Optional[int] = None) -> None
```
Process a single waveform file for event detection.
### `EventPlotter`
```python
class EventPlotter:
def __init__(self, osc_plot: OscilloscopePlot, events: Optional[np.ndarray] = None, trace_idx: int = 0, bg_clean: Optional[np.ndarray] = None, global_noise: Optional[np.float32] = None, y_scale_mode: str = "raw")
```
Provides utility functions for plotting individual events or event grids.
### `detect_events`
```python
def detect_events(time: np.ndarray, signal: np.ndarray, bg: np.ndarray, snr_threshold: np.float32 = np.float32(2.0), min_event_len: int = 20, min_event_amp: np.float32 = np.float32(0.0), widen_frac: np.float32 = np.float32(0.5), global_noise: Optional[np.float32] = None, signal_polarity: int = -1) -> Tuple[np.ndarray, np.float32]
```
Detect events in signal above background with specified thresholds.
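The core detection step can be sketched as threshold crossing plus a run-length filter. This simplified version omits the widening, amplitude filtering, and global-noise estimation that the real `detect_events` also performs, but illustrates the idea:

```python
import numpy as np

def detect_runs(signal, bg, noise, snr_threshold=2.0,
                min_event_len=2, signal_polarity=-1):
    """Flag samples whose deviation from the background, taken in the
    polarity direction, exceeds snr_threshold * noise, then keep
    contiguous runs of at least min_event_len samples. Returns a list
    of (start, end) index pairs with end exclusive."""
    deviation = signal_polarity * (signal - bg)
    above = deviation > snr_threshold * noise
    # Run boundaries appear as +1/-1 steps in the padded boolean mask.
    padded = np.concatenate(([False], above, [False])).astype(int)
    edges = np.flatnonzero(np.diff(padded))
    return [(int(s), int(e)) for s, e in zip(edges[::2], edges[1::2])
            if e - s >= min_event_len]

# Three consecutive samples clear the threshold; the lone sample at
# index 6 is rejected by min_event_len.
x = np.array([0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 1.0, 0.0])
print(detect_runs(x, np.zeros_like(x), noise=0.1, signal_polarity=1))  # → [(2, 5)]
```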
### `merge_overlapping_events`
```python
def merge_overlapping_events(events: np.ndarray) -> np.ndarray
```
Merge overlapping events.
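Merging overlapping intervals is a standard sort-and-coalesce pass; a sketch of the idea, assuming events are simple `[start, end]` pairs (the library's event array may carry extra columns):

```python
def merge_intervals(events):
    """Sort by start index, then coalesce any event whose start falls
    inside (or touches) the previous merged event."""
    merged = []
    for start, end in sorted(events):
        if merged and start <= merged[-1][1]:
            # Overlap: extend the previous event instead of appending.
            merged[-1][1] = max(merged[-1][1], end)
        else:
            merged.append([start, end])
    return merged

print(merge_intervals([[5, 9], [1, 4], [3, 6]]))  # → [[1, 9]]
```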
### `get_waveform_params`
```python
def get_waveform_params(bin_filename: str, data_path: Optional[str] = None, sidecar: Optional[str] = None) -> Dict[str, Any]
```
Parse XML sidecar file to extract waveform parameters.
### `rd`
```python
def rd(filename: str, sampling_interval: Optional[float] = None, data_path: Optional[str] = None, sidecar: Optional[str] = None, crop: Optional[List[int]] = None) -> Tuple[np.ndarray, np.ndarray]
```
Read waveform binary file using sidecar XML for parameters.
### `rd_chunked`
```python
def rd_chunked(filename: str, chunk_size: int, sampling_interval: Optional[float] = None, data_path: Optional[str] = None, sidecar: Optional[str] = None) -> Generator[Tuple[np.ndarray, np.ndarray], None, None]
```
Read waveform binary file in chunks using sidecar XML for parameters. This is a generator function that yields chunks of data.
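The chunked-reading pattern can be sketched as a generator around `np.fromfile`, so that only one chunk is ever resident in memory. The raw `float32` sample format and absence of a file header are assumptions here; the real `rd_chunked` takes those details from the XML sidecar:

```python
import os
import tempfile
import numpy as np

def read_chunks(path, chunk_size, sampling_interval, dtype=np.float32):
    """Yield (time, samples) arrays of at most chunk_size samples,
    reading the file sequentially so memory use stays bounded."""
    offset = 0
    with open(path, "rb") as f:
        while True:
            x = np.fromfile(f, dtype=dtype, count=chunk_size)
            if x.size == 0:
                break
            # Reconstruct the time axis from the running sample offset.
            t = (offset + np.arange(x.size)) * sampling_interval
            offset += x.size
            yield t, x

# Round-trip a small file to show the chunk boundaries.
path = tempfile.NamedTemporaryFile(suffix=".bin", delete=False).name
np.arange(10, dtype=np.float32).tofile(path)
sizes = [x.size for _, x in read_chunks(path, 4, 1e-6)]
os.remove(path)
print(sizes)  # → [4, 4, 2]
```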