GSBStreamWriter

class baseband.gsb.base.GSBStreamWriter(fh_ts, fh_raw, header0=None, sample_rate=None, samples_per_frame=None, payload_nbytes=None, nchan=None, bps=None, complex_data=None, squeeze=True)

Bases: baseband.gsb.base.GSBStreamBase, baseband.base.base.StreamWriterBase

GSB format writer.

Encodes and writes sequences of samples to file.

Parameters
fh_ts : filehandle

For writing time stamps to storage.

fh_raw : filehandle, or nested tuple of filehandles

For writing raw binary data to storage. A single file is needed for rawdump, and a tuple for phased. For a nested tuple, the outer tuple determines the number of polarizations, and the inner tuple(s) the number of streams per polarization. E.g., ((polL1, polL2), (polR1, polR2)) for two streams per polarization. A single tuple is interpreted as streams of a single polarization.

header0 : GSBHeader

Header for the first frame, holding time information, etc.

sample_rate : Quantity, optional

Number of complete samples per second, i.e. the rate at which each channel of each polarization is sampled. If not given, it will be inferred assuming the time between frames is exactly 0.25165824 s.

samples_per_frame : int, optional

Number of complete samples per frame (possibly combining two files). Can give payload_nbytes instead.

payload_nbytes : int, optional

Number of bytes per payload (in each raw file separately). If both samples_per_frame and payload_nbytes are None, payload_nbytes is set to 2**22 (4 MiB).

nchan : int, optional

Number of channels. Default: 1 for rawdump, 512 for phased.

bps : int, optional

Bits per elementary sample, i.e. per real or imaginary component for complex data. Default: 4 for rawdump, 8 for phased.

complex_data : bool, optional

Whether data are complex. Default: False for rawdump, True for phased.

squeeze : bool, optional

If True (default), write accepts squeezed arrays as input, and adds any dimensions of length unity.
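To illustrate the nested-tuple convention for fh_raw, here is a minimal sketch using in-memory io.BytesIO objects as stand-ins for real raw-data filehandles (the names polL1 etc. are hypothetical, taken from the example in the parameter description):

```python
import io

# Stand-ins for real raw-data filehandles (hypothetical names; any
# binary-writable filehandles would do).
polL1, polL2 = io.BytesIO(), io.BytesIO()
polR1, polR2 = io.BytesIO(), io.BytesIO()

# Phased data: outer tuple = polarizations, inner tuples = streams.
fh_raw = ((polL1, polL2), (polR1, polR2))

# A single (non-nested) tuple: two streams of a single polarization.
fh_raw_single_pol = (polL1, polL2)

# Rawdump needs just one filehandle, no tuple.
fh_raw_rawdump = io.BytesIO()

npol = len(fh_raw)        # outer tuple -> 2 polarizations
nstream = len(fh_raw[0])  # inner tuple -> 2 streams per polarization
```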
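The sample-rate inference described for sample_rate can be checked with simple arithmetic. This sketch assumes the rawdump defaults (payload_nbytes = 2**22, bps = 4, real data, nchan = 1), which give 2**23 samples per frame:

```python
# Sketch of the sample-rate inference from the fixed time between
# frames (0.25165824 s) and the number of samples per frame.
frame_duration = 0.25165824   # seconds between frame timestamps
samples_per_frame = 2 ** 23   # 4 MiB payload, 2 real 4-bit samples/byte

sample_rate = samples_per_frame / frame_duration  # samples per second
# This works out to 100/3 MHz, the GSB rawdump sampling rate.
```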
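The relation between payload_nbytes and the number of complete samples per raw file can be sketched as follows. samples_per_payload is a hypothetical helper, not part of the baseband API, and for phased data the actual samples_per_frame may additionally combine two raw files per stream:

```python
def samples_per_payload(payload_nbytes, bps, complex_data, nchan):
    """Complete samples held by one payload of payload_nbytes bytes."""
    # A complete sample covers all channels; complex samples have two
    # elementary components (real and imaginary) of bps bits each.
    bits_per_sample = bps * (2 if complex_data else 1) * nchan
    return payload_nbytes * 8 // bits_per_sample

# Rawdump defaults: 4 bits per real sample, one channel.
rawdump = samples_per_payload(2 ** 22, bps=4, complex_data=False, nchan=1)

# Phased defaults: 8 bits per component, complex data, 512 channels.
phased = samples_per_payload(2 ** 22, bps=8, complex_data=True, nchan=512)
```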

Attributes Summary

bps

Bits per elementary sample.

complex_data

Whether the data are complex.

header0

First header of the file.

payload_nbytes

Number of bytes per payload, divided by the number of raw files.

sample_rate

Number of complete samples per second.

sample_shape

Shape of a complete sample (possibly subset or squeezed).

samples_per_frame

Number of complete samples per frame.

squeeze

Whether data arrays have dimensions with length unity removed.

start_time

Start time of the file.

time

Time of the sample pointer's current offset in file.

Methods Summary

close()

flush()

tell([unit])

Current offset in the file.

write(data[, valid])

Write data, buffering by frames as needed.

Attributes Documentation

bps

Bits per elementary sample.

complex_data

Whether the data are complex.

header0

First header of the file.

payload_nbytes

Number of bytes per payload, divided by the number of raw files.

sample_rate

Number of complete samples per second.

sample_shape

Shape of a complete sample (possibly subset or squeezed).

samples_per_frame

Number of complete samples per frame.

squeeze

Whether data arrays have dimensions with length unity removed.

If True, data read out has such dimensions removed, and data passed in for writing has them inserted.

start_time

Start time of the file.

See also time for the time of the sample pointer’s current offset.

time

Time of the sample pointer’s current offset in file.

See also start_time for the start time of the file.

Methods Documentation

close()
flush()
tell(unit=None)

Current offset in the file.

Parameters
unit : Unit or str, optional

Time unit the offset should be returned in. By default, no unit is used, i.e., an integer enumerating samples is returned. For the special string ‘time’, the absolute time is calculated.

Returns
offset : int, Quantity, or Time

Offset in current file (or time at current position).
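A sketch of the conversion tell performs for the special string 'time', using plain floats in place of the astropy Quantity and Time objects the real method returns (values illustrative, assuming the rawdump rate):

```python
# tell('time') returns the absolute time of the sample pointer:
# start time plus sample offset divided by the sample rate.
start_time = 0.0             # stand-in for the header0 time, in seconds
sample_rate = 100e6 / 3      # rawdump rate, samples per second
offset = 8388608             # one rawdump frame's worth of samples

elapsed = offset / sample_rate       # seconds since the file start
time_at_offset = start_time + elapsed
```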

write(data, valid=True)

Write data, buffering by frames as needed.

Parameters
data : ndarray

Piece of data to be written, with sample dimensions as given by sample_shape. This should be properly scaled to make best use of the dynamic range delivered by the encoding.

valid : bool, optional

Whether the current data are valid. Default: True.
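As an illustration of the scaling note for data: with the rawdump default of 4 bits per sample, the encoding can represent integer levels from -8 to 7, so input is typically scaled so that its spread maps onto those levels. The scale factor below is illustrative only, not a baseband API:

```python
import numpy as np

# Hedged sketch: scale unit-variance noise so it fills the dynamic
# range of a 4-bit encoding (levels -8..7) without heavy clipping.
rng = np.random.default_rng(42)
data = rng.standard_normal(8192).astype(np.float32)

scale = 2.0  # illustrative choice: roughly two levels per sigma
scaled = np.clip(np.round(data * scale), -8, 7)

# All values now fall within the representable 4-bit range.
assert scaled.min() >= -8 and scaled.max() <= 7
```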