Loggers

Logging functionality

There are two types of loggers: standard event loggers created with init_logger(), and the DataLogger class for logging numerical respiration data and control settings.

Data:

_LOGGERS

List of strings naming the loggers that have already been created.

Functions:

init_logger(module_name[, log_level, ...])

Initialize a logger for logging events.

update_logger_sizes()

Adjust each logger's maxBytes attribute so that the total across all loggers is prefs.LOGGING_MAX_BYTES

Classes:

DataLogger([compression_level])

Class for logging numerical respiration data and control settings to an hdf5 file (see the class entry below for the full file structure).

pvp.common.loggers._LOGGERS = ['pvp.common.prefs', 'pvp.alarm.alarm_manager']

List of strings naming the loggers that have already been created.

pvp.common.loggers.init_logger(module_name: str, log_level: Optional[int] = None, file_handler: bool = True) → logging.Logger[source]

Initialize a logger for logging events.

To keep logs sensible, you should usually initialize the logger with the name of the module that's using it, e.g.:

logger = init_logger(__name__)

If a logger has already been initialized (i.e. its name is in loggers._LOGGERS), return that logger;

otherwise, configure and return the logger such that:

  • its log level is set to prefs.LOGLEVEL

  • it formats logging messages with the logger name, time, and logging level

  • if a file handler is specified (the default), a logging.handlers.RotatingFileHandler is created according to parameters set in prefs

Parameters

  • module_name (str) – name of the module requesting the logger, typically passed as __name__

  • log_level (int, optional) – log level for the logger; if not given, prefs.LOGLEVEL is used

  • file_handler (bool, optional) – if True (the default), attach a rotating file handler configured according to parameters set in prefs
Returns

A logger for the calling module to use.

Return type

logging.Logger
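For example, a module-level event logger might be obtained and used like this (a minimal sketch; the messages are illustrative):

from pvp.common.loggers import init_logger

logger = init_logger(__name__)            # one logger per module
logger.info('controller started')         # formatted with logger name, time, and level
logger.debug('only emitted if prefs.LOGLEVEL permits')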

pvp.common.loggers.update_logger_sizes()[source]

Adjust each logger’s maxBytes attribute so that the total across all loggers is prefs.LOGGING_MAX_BYTES
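A minimal sketch of what this rebalancing could look like, assuming every logger in _LOGGERS has at most one RotatingFileHandler (an illustration of the idea, not the actual implementation):

import logging
from logging.handlers import RotatingFileHandler

def rebalance_max_bytes(logger_names, total_bytes):
    # Split the total byte budget evenly across the rotating file handlers
    # of the named loggers, so their sizes sum to total_bytes.
    per_logger = total_bytes // max(len(logger_names), 1)
    for name in logger_names:
        for handler in logging.getLogger(name).handlers:
            if isinstance(handler, RotatingFileHandler):
                handler.maxBytes = per_logger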

class pvp.common.loggers.DataLogger(compression_level: int = 9)[source]

Bases: object

Class for logging numerical respiration data and control settings. Creates an hdf5 file with this general structure:

/ (root)
|--- waveforms (group)
|    |--- time | pressure_data | flow_out | control_signal_in | control_signal_out | FiO2 | Cycle No.
|--- controls (group)
|    |--- (time, controlsignal)
|--- derived_quantities (group)
|    |--- (time, Cycle No, I_PHASE_DURATION, PIP_TIME, PEEP_time, PIP, PIP_PLATEAU, PEEP, VTE)
|--- program_information (group)
|    |--- (version & githash)

Public Methods:

close_logfile(): Flushes and closes the logfile.
store_waveform_data(SensorValues): Takes data from SensorValues, but DOES NOT FLUSH.
store_controls(): Store controls in the same file? TODO: Discuss
flush_logfile(): Flush the data into the file.

Initializes the continuous numerical logger class.

Parameters

compression_level (int, optional) – Compression level of the hdf5 file. Defaults to 9.
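A hedged sketch of the basic lifecycle (the store_* calls are shown schematically; see the method entries below):

from pvp.common.loggers import DataLogger

dl = DataLogger(compression_level=9)   # higher level = smaller file, more CPU
# ... call dl.store_waveform_data(...) / dl.store_control_command(...) while running ...
dl.flush_logfile()                     # write buffered datapoints to disk
dl.close_logfile()                     # flush and close when the run ends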

Methods:

__init__([compression_level])

Initializes the continuous numerical logger class.

_open_logfile()

Opens the hdf5 file and generates the file structure.

close_logfile()

Flushes & closes the open hdf file.

store_program_data()

Appends program metadata to the logfile: githash and version

store_waveform_data(sensor_values, ...)

Appends a datapoint to the file for continuous logging of streaming data.

store_control_command(control_setting)

Appends a datapoint to the event-table, derived from ControlSettings

store_derived_data(derived_values)

Appends a datapoint to the event-table, derived from the continuous data (PIP, PEEP etc.)

flush_logfile()

This flushes the datapoints to the file.

check_files()

Make sure that the files are not getting too large.

rotation_newfile()

This rotates through filenames, similar to a ringbuffer, to make sure that the program does not run out of space.

load_file([filename])

This loads an hdf5 file and returns its contents as a dictionary with the keys waveform_data, control_data, derived_data, and program_information.

log2mat([filename])

Translates the compressed hdf5 into a matlab file containing a matlab struct (see the full method entry below).

log2csv([filename])

Translates the compressed hdf5 into three csv files containing:

__init__(compression_level: int = 9)[source]

Initializes the continuous numerical logger class.

Parameters

compression_level (int, optional) – Compression level of the hdf5 file. Defaults to 9.

_open_logfile()[source]

Opens the hdf5 file and generates the file structure.

close_logfile()[source]

Flushes & closes the open hdf file.

store_program_data()[source]

Appends program metadata to the logfile: githash and version

store_waveform_data(sensor_values: SensorValues, control_values: ControlValues)[source]

Appends a datapoint to the file for continuous logging of streaming data. NOTE: Not flushed yet.

Parameters
  • sensor_values (SensorValues) – SensorValues to be stored in the file.

  • control_values (ControlValues) – ControlValues to be stored in the file.
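A hedged sketch of per-cycle use; sensor_values and control_values come from the running system and are shown only schematically:

dl.store_waveform_data(sensor_values, control_values)   # buffered in memory, not flushed
dl.flush_logfile()                                       # e.g. at the end of the breath cycle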

store_control_command(control_setting: ControlSetting)[source]

Appends a datapoint to the event-table, derived from ControlSettings

Parameters

control_setting (ControlSetting) – ControlSetting object, the content of which should be stored
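For illustration, recording a setting change might look like the sketch below (the ControlSetting constructor arguments are assumptions, not a confirmed API):

setting = ControlSetting(name='PIP', value=30)   # hypothetical fields and values
dl.store_control_command(setting)                # appended to the controls event-table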

store_derived_data(derived_values: DerivedValues)[source]

Appends a datapoint to the event-table, derived from the continuous data (PIP, PEEP etc.)

Parameters

derived_values (DerivedValues) – DerivedValues object, the content of which should be stored

flush_logfile()[source]

This flushes the datapoints to the file. To be executed every few seconds, e.g. at the end of a breath cycle.

check_files()[source]

Make sure that the files are not getting too large.

rotation_newfile()[source]

This rotates through filenames, similar to a ringbuffer, to make sure that the program does not run out of space.

load_file(filename=None)[source]

This loads an hdf5 file and returns its contents as a dictionary with the keys waveform_data, control_data, derived_data, and program_information.

Parameters

filename (str, optional) – Path to an hdf5 file. If none is given, uses the currently open file. Defaults to None.

Returns

Containing the data arranged as {"waveform_data": waveform_data, "control_data": control_data, "derived_data": derived_data, "program_information": program_data}

Return type

dictionary
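For example, reading a finished session back might look like this (the path is illustrative):

dl = DataLogger()
data = dl.load_file('/tmp/pvp_session.h5')   # omit the argument to use the currently open file
waveforms = data['waveform_data']
controls = data['control_data']
derived = data['derived_data']
info = data['program_information']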

log2mat(filename=None)[source]

Translates the compressed hdf5 into a matlab file containing a matlab struct. This struct has the same structure as the hdf5 file, but is not compressed. Use for any file:

dl = DataLogger()
dl.log2mat(filename)

The file is saved at the same path as a .mat file, where the content is represented as matlab structs.

Parameters

filename (str, optional) – Path to an hdf5 file. If none is given, uses the currently open file. Defaults to None.
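The resulting .mat file can then be read back outside of pvp, e.g. with SciPy (scipy is not a pvp dependency; this is only a reading example, and the path is illustrative):

from scipy.io import loadmat

mat = loadmat('/tmp/pvp_session.mat')   # illustrative path
print(mat.keys())                       # inspect the struct names mirrored from the hdf5 groups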

log2csv(filename=None)[source]
Translates the compressed hdf5 into three csv files containing:
  • waveform_data (measurement once per cycle)

  • derived_quantities (PEEP, PIP etc.)

  • control_commands (control commands sent to the controller)

This approximates the structure contained in the hdf5 file. Use for any file:

dl = DataLogger()
dl.log2csv(filename)

Parameters

filename (str, optional) – Path to an hdf5 file. If none is given, uses the currently open file. Defaults to None.
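The generated csv files can be inspected with any standard tool, for instance pandas (pandas is not a pvp dependency, and the exact output filenames, derived from the hdf5 filename, are an assumption here):

import pandas as pd

derived = pd.read_csv('/tmp/pvp_session_derived_quantities.csv')   # illustrative filename
print(derived.head())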