I have a general question about these SDRs: what is the latency between when the radio wave "hits" the dongle, and when it is detected by the software framework?
Buffers, buffers everywhere... It depends on the sample rate and the transport (USB, Ethernet, direct PCIe), but in general ~1 ms is achievable and ~10 µs is not.
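Back-of-the-envelope: one transfer buffer's worth of samples is a hard floor, since the host can't see a sample until the buffer holding it arrives. A quick sketch (the 4096-sample buffer is just an illustrative number, not any particular device's default):

    # The host can't see a sample until the transfer buffer containing it
    # has been filled and shipped, so one buffer is a hard latency floor.
    def buffer_latency_ms(samples_per_buffer: int, sample_rate_sps: float) -> float:
        return 1e3 * samples_per_buffer / sample_rate_sps

    # Illustrative 4096-sample USB transfer buffer at a few sample rates:
    for rate_sps in (250e3, 2e6, 20e6):
        print(f"{rate_sps / 1e6:5.2f} Msps -> {buffer_latency_ms(4096, rate_sps):8.3f} ms/buffer")
    #  0.25 Msps ->   16.384 ms/buffer
    #  2.00 Msps ->    2.048 ms/buffer
    # 20.00 Msps ->    0.205 ms/buffer

Even at 20 Msps a single 4096-sample buffer costs ~200 µs, which is why ~1 ms is realistic and ~10 µs is not.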
That's why bladeRF-wiphy ( https://news.ycombinator.com/item?id=25814237 ) implements the lower-level PHY on the FPGA: there's no way for host software to send an IEEE 802.11 ACK frame within 10 microseconds (one SIFS) of the end of the received frame, as the standard requires.
Depends on your buffer size, bandwidth, and what you're demodulating/decoding, but it's pretty fast. With a narrow bandwidth on a fast port and simple modulation/encoding, maybe a few milliseconds?
Counter-intuitively, it is often difficult to reach low latency at narrow bandwidths (i.e., low sample rates), because many SDR interfaces wait to fill a full transport packet (e.g., a full Ethernet packet) before sending it rather than shipping tiny partial packets, so at a low sample rate each packet takes longer to fill. The problem goes away above roughly 1-10 Msps, as the numbers below show.
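Concretely, with assumed numbers (4 bytes per complex int16 sample, a ~1500-byte Ethernet frame), the fill time for a single packet:

    # Time to fill one transport packet: the radio won't ship a packet
    # until it has enough samples to fill it.
    BYTES_PER_SAMPLE = 4      # complex int16 (I+Q), an assumed wire format
    PAYLOAD_BYTES = 1472      # UDP payload of a 1500-byte Ethernet frame, roughly

    samples_per_packet = PAYLOAD_BYTES // BYTES_PER_SAMPLE  # 368 samples
    for rate_sps in (100e3, 1e6, 10e6):
        fill_us = 1e6 * samples_per_packet / rate_sps
        print(f"{rate_sps / 1e6:5.2f} Msps -> packet fills in {fill_us:7.1f} us")
    #  0.10 Msps -> packet fills in  3680.0 us  (milliseconds before anything moves)
    #  1.00 Msps -> packet fills in   368.0 us
    # 10.00 Msps -> packet fills in    36.8 us  (packetization no longer dominates)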
A few ms, as others have said, but it's also a non-deterministic delay.
This is a problem for applications like time-of-flight measurement; one way to account for it is to loop a known signal from TX back to RX and measure where it shows up (sketch below).
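A minimal sketch of that loopback calibration, assuming you can route a known TX burst back into the RX path (cable, coupler, or even leakage): cross-correlate the capture against the known sequence and convert the peak offset to time. All names and numbers here are illustrative.

    import numpy as np

    def measure_loopback_delay(tx: np.ndarray, rx: np.ndarray, sample_rate: float) -> float:
        """Estimate TX->RX latency by finding where the known TX burst
        appears in the RX capture (matched filter / cross-correlation)."""
        corr = np.abs(np.correlate(rx, tx, mode="valid"))
        offset_samples = int(np.argmax(corr))
        return offset_samples / sample_rate

    # Illustrative usage with synthetic data: a PN-like burst buried in
    # noise at a known offset, recovered by the correlator.
    rng = np.random.default_rng(0)
    fs = 2e6                                             # 2 Msps, illustrative
    tx = rng.choice([-1.0, 1.0], size=512) + 0j          # known transmitted burst
    rx = (rng.normal(size=20000) * 0.1).astype(complex)  # noisy capture
    true_offset = 7000
    rx[true_offset:true_offset + tx.size] += tx
    print(f"measured delay: {measure_loopback_delay(tx, rx, fs) * 1e3:.3f} ms")
    # -> 3.500 ms (7000 samples at 2 Msps)

Since the delay is non-deterministic, you'd typically repeat this per session: it can shift with buffer settings and stream restarts.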