path: root/README.md
author     Sam Scholten  2026-03-30 11:42:22 +1000
committer  Sam Scholten  2026-03-30 11:42:22 +1000
commit     637ddc52f4dc23ba3aa7cccef014aa85cab36b49 (patch)
tree       d9116fb184f32741bf1c8571ab6160be0b08acb3 /README.md
parent     5a7c47d626ff3fc1352b2036001e853ae211d1af (diff)
download   picostream-637ddc52f4dc23ba3aa7cccef014aa85cab36b49.tar.gz
           picostream-637ddc52f4dc23ba3aa7cccef014aa85cab36b49.zip
Release v1.0.0 (tag: v1.0)
Diffstat (limited to 'README.md')
-rw-r--r--  README.md  240
1 file changed, 71 insertions(+), 169 deletions(-)
diff --git a/README.md b/README.md
index d332ae0..e130832 100644
--- a/README.md
+++ b/README.md
@@ -1,216 +1,118 @@
# PicoStream
-High-performance data acquisition from PicoScope 5000a series to HDF5, with decoupled live visualisation. Designed for robust, high-speed logging where data integrity is critical.
+A fast, simple GUI for streaming from PicoScope 5000a series scopes. Built on [labdaemon](https://github.com/qnslab/labdaemon).
-## Quick Start
+## What's new in v1.0
-Install with `uv`:
+Complete rewrite with:
-```bash
-uv pip install -e .
-```
+- **Dual channels** — acquire A and B simultaneously (16-bit resolution supports a single channel only)
+- **Ring buffer** — configurable lookback (5–120s) with on-demand recording
+- **Pre-trigger capture** — include N seconds of data before you hit record
+- **Keep or discard** — stop recording and either save or trash the file
+- **VisPy plotting** — hardware-accelerated OpenGL, much faster than before
+- **Zarr format** — chunked storage, faster than HDF5 for large files
+- **PyQt6** — bumped from PyQt5
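The lookback and pre-trigger features above boil down to keeping a bounded window of recent chunks and prepending it when recording starts. A minimal sketch in Python (illustrative only; the class and parameter names are invented, not PicoStream's API):

```python
from collections import deque

class LookbackBuffer:
    """Toy ring buffer: retains only the most recent lookback_s seconds
    of fixed-duration chunks; on record, the retained chunks become the
    pre-trigger portion of the file."""

    def __init__(self, lookback_s: float, chunk_s: float):
        self.buf = deque(maxlen=int(lookback_s / chunk_s))

    def push(self, chunk):
        self.buf.append(chunk)  # oldest chunk drops off automatically

    def start_recording(self):
        return list(self.buf)   # pre-trigger data, oldest first

buf = LookbackBuffer(lookback_s=5, chunk_s=1)
for i in range(8):              # simulate 8 s of 1 s chunks
    buf.push(i)
pre = buf.start_recording()     # only the last 5 s survive
print(pre)                      # [3, 4, 5, 6, 7]
```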
-Install the [PicoSDK](https://www.picotech.com/downloads) (Linux users: see [Arch AUR wiki](https://aur.archlinux.org/packages/picoscope) for configuration).
+![PicoStream GUI](assets/images/screenshot.png)
-Launch the GUI:
+## Install
-```bash
-just gui
-# or: uv run python -m picostream.main
-```
+Requires Python ≥3.10 and an installed PicoSDK.
-Or use the CLI:
+Install PicoSDK from [picotech.com/downloads](https://www.picotech.com/downloads).
-```bash
-uv run picostream -s 62.5 -o data.hdf5 --plot
+Linux users: create `/etc/udev/rules.d/99-picoscope.rules`:
```
-
-## Key Features
-
-- **Producer-consumer architecture** with large shared memory buffer pool prevents data loss
-- **Decoupled live plotting** reads from HDF5 file, cannot interfere with acquisition
-- **Live-only mode** for continuous monitoring without filling disk
-- **Numba-accelerated decimation** for efficient visualisation of large datasets
-- **GUI and CLI interfaces** for interactive and scripted workflows
-
-## Usage
-
-### GUI Application
-
-```bash
-just gui
+SUBSYSTEM=="usb", ATTR{idVendor}=="0ce9", MODE="0666"
```
+Then run `sudo udevadm control --reload-rules && sudo udevadm trigger`.
-Configure acquisition parameters, start/stop capture, and view live data. Settings persist between sessions.
-
-### Building Standalone Executable
+From PyPI:
```bash
-just build-gui
+pip install picostream
```
-Executable appears in `dist/PicoStream` (or `dist/PicoStream.exe` on Windows).
-
-### Command-Line Interface
-
-Standard acquisition (saves all data):
+For development:
```bash
-uv run picostream -s 62.5 -o my_data.hdf5 --plot
+just sync
```
-Live-only mode (limits file size):
+## Run
+
+After pip install:
```bash
-uv run picostream -s 62.5 --plot --max-buff-sec 60
+picostream
```
-View existing file:
+During development:
```bash
-uv run python -m picostream.dfplot /path/to/data.hdf5
+just gui
```
-Run `uv run picostream --help` for all options.
-
-## Documentation
-
-### Architecture
-
-PicoStream uses a producer-consumer pattern to ensure data integrity:
-
-- **Producer (`PicoDevice`)**: Interfaces with PicoScope hardware, streams ADC data into shared memory buffers in a dedicated thread
-- **Consumer (`Consumer`)**: Retrieves filled buffers from queue, writes to HDF5, returns empty buffers to pool in separate thread
-- **Buffer Pool**: Large shared memory buffers (100+ MB) prevent data loss if disk I/O slows
-- **Live Plotter (`HDF5LivePlotter`)**: Reads from HDF5 file on disk, completely decoupled from acquisition
+## Quickstart
-This ensures the critical acquisition path is never blocked by disk I/O or GUI rendering.
+1. Enter the scope serial (or `MOCK` to simulate), then click **Connect**
+2. Set channels, voltage ranges, sample rate, resolution
+3. **Start Acquisition** (Space bar)
+4. **Start Recording** (R) when you want to save
+5. **Stop & Keep** (K) or **Stop & Discard** (Del)
-### Data Analysis with `PicoStreamReader`
+Recordings save to `~/picostream/` by default.
-The `PicoStreamReader` class provides efficient access to HDF5 files with on-the-fly decimation.
+## Crash Cleanup
-```python
-import numpy as np
-from picostream.reader import PicoStreamReader
+If the app crashes during recording, incomplete Zarr files remain in `~/picostream/`.
+These are useless. Delete them:
-with PicoStreamReader('my_data.hdf5') as reader:
- sample_rate_sps = 1e9 / reader.sample_interval_ns
- print(f"File contains {reader.num_samples:,} samples at {sample_rate_sps / 1e6:.2f} MS/s")
-
- # Iterate through file with 10x decimation
- for times, voltages_mv in reader.get_block_iter(
- chunk_size=10_000_000, decimation_factor=10, decimation_mode='min_max'
- ):
- print(f"Processed {voltages_mv.size} decimated points")
+```bash
+rm -rf ~/picostream/*.zarr
```
-### API Reference: `PicoStreamReader`
-
-#### Initialisation
-
-**`__init__(self, hdf5_path: str)`**
-
-Initialises the reader. File is opened when used as context manager.
-
-#### Metadata Attributes
-
-Available after opening:
+Change save location in the GUI (Record Controls → Change...).
-- `num_samples: int` - Total raw samples in dataset
-- `sample_interval_ns: float` - Time interval between samples (nanoseconds)
-- `voltage_range_v: float` - Configured voltage range (e.g., `20.0` for ±20V)
-- `max_adc_val: int` - Maximum ADC count value (e.g., 32767)
-- `analog_offset_v: float` - Configured analogue offset (Volts)
-- `downsample_mode: str` - Hardware downsampling mode (`'average'` or `'aggregate'`)
-- `hardware_downsample_ratio: int` - Hardware downsampling ratio
+## Shortcuts
-#### Methods
+| Key | Action |
+|-----|--------|
+| Space | Start/stop acquisition |
+| 1, 2 | Toggle channels |
+| R | Start recording |
+| K | Stop + keep |
+| Del | Stop + discard |
+| Ctrl+O | Open existing Zarr file |
-**`get_block_iter(self, chunk_size: int = 1_000_000, decimation_factor: int = 1, decimation_mode: str = "mean") -> Generator`**
+## Development
-Generator yielding `(times, voltages)` tuples for entire dataset. Recommended for large files.
-
-- `chunk_size`: Number of raw samples per chunk
-- `decimation_factor`: Decimation factor
-- `decimation_mode`: `'mean'` or `'min_max'`
-
-**`get_next_block(self, chunk_size: int, decimation_factor: int = 1, decimation_mode: str = "mean") -> Tuple | None`**
-
-Retrieves next sequential block. Returns `None` at end of file. Use `reset()` to restart.
-
-**`get_block(self, size: int, start: int = 0, decimation_factor: int = 1, decimation_mode: str = "mean") -> Tuple`**
-
-Retrieves specific block from file.
-
-- `size`: Number of raw samples
-- `start`: Starting sample index
-
-**`reset(self) -> None`**
-
-Resets internal counter for `get_next_block()`.
-
-### API Reference: `HDF5LivePlotter`
-
-PyQt5 widget for real-time visualisation of HDF5 files. Can be used standalone or embedded in custom applications.
-
-#### Initialisation
-
-**`__init__(self, hdf5_path: str = "/tmp/data.hdf5", update_interval_ms: int = 50, display_window_seconds: float = 0.5, decimation_factor: int = 150, max_display_points: int = 4000)`**
-
-- `hdf5_path`: Path to HDF5 file
-- `update_interval_ms`: Update frequency (milliseconds)
-- `display_window_seconds`: Duration of data to display (seconds)
-- `decimation_factor`: Decimation factor for display
-- `max_display_points`: Maximum points to display (prevents GUI slowdown)
-
-#### Methods
-
-**`set_hdf5_path(self, hdf5_path: str) -> None`**
-
-Updates monitored HDF5 file path.
-
-**`start_updates(self) -> None`**
-
-Begins periodic plot updates.
-
-**`stop_updates(self) -> None`**
-
-Stops periodic updates.
-
-**`save_screenshot(self) -> None`**
-
-Saves PNG screenshot. Called automatically when 'S' key is pressed.
+```bash
+just gui # run from source
+just test # run tests
+just test-cov # run tests with coverage
+just format # format all code
+just lint # lint code
+just typecheck # type check
+just build # build PyInstaller executable
+just clean # remove build artifacts
+just sync # install dependencies
+just update # update dependencies
+just help # see all commands
+```
-#### Usage Example
+## Acknowledgements
-```python
-from PyQt5.QtWidgets import QApplication
-from picostream.dfplot import HDF5LivePlotter
-
-app = QApplication([])
-plotter = HDF5LivePlotter(
- hdf5_path="my_data.hdf5",
- display_window_seconds=1.0,
- decimation_factor=100
-)
-plotter.show()
-plotter.start_updates()
-app.exec_()
-```
+This began as a fork of [JoshHarris2108/pico_streaming](https://github.com/JoshHarris2108/pico_streaming) (unlicensed). The original producer-consumer architecture and PicoSDK integration came from Josh's work.
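That producer-consumer pattern, in miniature: an acquisition thread fills a bounded buffer queue while a writer thread drains it, so slow disk writes apply back-pressure instead of blocking acquisition. A toy sketch (not PicoStream's actual code):

```python
import queue
import threading

q = queue.Queue(maxsize=8)  # bounded buffer pool between the threads
written = []

def producer():
    for i in range(20):     # stand-in for chunks streamed off the scope
        q.put(i)
    q.put(None)             # sentinel: acquisition finished

def consumer():
    while (chunk := q.get()) is not None:
        written.append(chunk)  # stand-in for the disk write

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(len(written))         # 20
```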
## Changelog
-### Version 0.2.0
-- Added live-only mode with `--max-buff-sec` option
-- Added GUI application for interactive control
-- Improved plotter to handle buffer resets gracefully
-- Added total sample count tracking across buffer resets
-- Skip verification step in live-only mode for better performance
-
-### Version 0.1.0
-- Initial release with core streaming and plotting functionality
+### v1.0.0
+Complete rewrite — new architecture, new file format, dual channels, GUI-only.
-## Acknowledgements
+### v0.2.0
+CLI + HDF5 + single channel + PyQt5/PyQtGraph.
-This package began as a fork of [JoshHarris2108/pico_streaming](https://github.com/JoshHarris2108/pico_streaming) (unlicensed). Thanks to Josh for the original architecture.
+### v0.1.0
+Initial release.