DynAIkonTrap.comms#

An interface for writing animal frames to disk or sending them to a server. The AbstractOutput combines one or more frames with the most appropriate sensor log(s) and outputs the result.

Functions

Output(settings, read_from)

Factory function that provides an implementation of the AbstractOutput based on the OutputMode of the settings argument.

Output(settings: OutputSettings, read_from: Tuple[Filter, SensorLogs]) → Union[Sender, Writer]#

Factory function that provides an implementation of the AbstractOutput based on the OutputMode of the settings argument.
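
A minimal usage sketch; only the Output call itself comes from this page, while the construction of the OutputSettings, Filter and SensorLogs objects is assumed to happen elsewhere in the application:

from DynAIkonTrap.comms import Output

def make_output(settings, animal_filter, sensor_logs):
    # `settings` is an OutputSettings instance; its OutputMode field decides
    # whether a Sender (send to a server) or a Writer (write to disk) is
    # returned.
    return Output(settings, (animal_filter, sensor_logs))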

Classes

AbstractOutput(settings, read_from)

A base class to use for outputting captured images or videos.

Sender(settings, read_from)

The Sender is a simple interface for sending the desired data to a server.

VideoCaption(sensor_logs, framerate)

Class to aid in generating captions for video output.

Writer(settings, read_from)

The Writer is a simple interface for writing the desired data to disk.

class AbstractOutput(settings: OutputSettings, read_from: Tuple[Filter, SensorLogs])#

A base class to use for outputting captured images or videos. The output_still() and output_video() functions should be overridden with output method-specific implementations.
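
As a rough sketch of such an override (the method names and parameters come from the entries below; everything inside the method bodies is illustrative only):

from io import StringIO
from typing import IO

from DynAIkonTrap.comms import AbstractOutput

class PrintOutput(AbstractOutput):
    """Illustrative subclass that only reports what it would output."""

    def output_still(self, image: bytes, time: float, sensor_log):
        # A real implementation would save or transmit the JPEG frame here.
        print(f"Still frame at {time}: {len(image)} bytes")

    def output_video(self, video: IO[bytes], caption: StringIO, time: float, **kwargs):
        # A real implementation would persist the MP4 and its caption track.
        print(f"Video at {time}; caption is {len(caption.getvalue())} characters")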

close()#
abstract output_still(image: bytes, time: float, sensor_log: SensorLog)#

Output a still image with its sensor data. The sensor data is provided via the sensor_log argument.

Parameters
  • image (bytes) – The JPEG image frame

  • time (float) – UNIX timestamp when the image was captured

  • sensor_log (SensorLog) – Log of sensor values at time frame was captured

abstract output_video(video: IO[bytes], caption: StringIO, time: float, **kwargs)#

Output a video with its metadata. The sensor data is provided via the video captions (caption); a call sketch follows the parameter list below.

Parameters
  • video (IO[bytes]) – MP4 video (codec: H.264 / MPEG-4 AVC Part 10)

  • caption (StringIO) – Caption of sensor readings as produced by VideoCaption.generate_sensor_json()

  • time (float) – UNIX timestamp when the video was captured
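
A sketch of calling these methods directly; in normal operation the pipeline pulls frames and sensor logs from read_from itself, and the helper name and arguments below are purely illustrative:

from io import StringIO
from typing import IO

def forward_capture(output, jpeg: bytes, mp4: IO[bytes], caption: StringIO,
                    captured_at: float, sensor_log) -> None:
    # `output` is any AbstractOutput implementation (Sender or Writer).
    output.output_still(image=jpeg, time=captured_at, sensor_log=sensor_log)
    output.output_video(video=mp4, caption=caption, time=captured_at)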

class Sender(settings: SenderSettings, read_from: Tuple[Filter, SensorLogs])#

The Sender is a simple interface for sending the desired data to a server.

output_still(image: bytes, time: float, sensor_log: SensorLog)#

Output a still image with its sensor data. The sensor data is provided via the sensor_log argument.

Parameters
  • image (bytes) – The JPEG image frame

  • time (float) – UNIX timestamp when the image was captured

  • sensor_log (SensorLog) – Log of sensor values at time frame was captured

output_video(video: IO[bytes], caption: StringIO, time: float, **kwargs)#

Output a video with its metadata. The sensor data is provided via the video captions (caption).

Parameters
  • video (IO[bytes]) – MP4 video (codec: H.264 / MPEG-4 AVC Part 10)

  • caption (StringIO) – Caption of sensor readings as produced by VideoCaption.generate_sensor_json()

  • time (float) – UNIX timestamp when the video was captured

class VideoCaption(sensor_logs: SensorLogs, framerate: float)#

Class to aid in generating captions for video output. The captions are based on the logged sensor readings.

Parameters
  • sensor_logs (SensorLogs) – The object containing the log of sensor readings

  • framerate (float) – Camera framerate

generate_sensor_json(timestamps: List[float]) → StringIO#

Generate JSON captions containing the sensor readings at given moments in time.

The format is as follows:

[
    {
        "start": 0,
        "end": 1,
        "log": {
            "EXAMPLE_SENSOR_1": {
                "value": 0.0,
                "units": "x"
            },
            "EXAMPLE_SENSOR_2": {
                "value": 0.0,
                "units": "x"
            }
        }
    },
    {
        "start": 1,
        "end": 5,
        "logs": {}
    }
]

The "start" and "end" correspond to the frame numbers in which the sensor logs are valid. The frame numbers are inclusive. It is not guaranteed that all frames are covered by logs. There may also be also be overlaps between entries if the exact timestamp where a new set of sensor readings becomes valid occurs during a frame.

Parameters

timestamps (List[float]) – Timestamps for every frame in the motion/animal sequence

Returns

The JSON captions wrapped in a StringIO, ready for writing to file

Return type

StringIO
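
A minimal sketch of producing and saving the captions, assuming sensor_logs is the SensorLogs instance passed to the constructor above and timestamps is the per-frame list described in the parameters; the file handling is illustrative:

from DynAIkonTrap.comms import VideoCaption

def save_json_captions(sensor_logs, timestamps, framerate: float, path: str) -> None:
    # Build the caption generator from the logged sensor readings.
    captioner = VideoCaption(sensor_logs, framerate)
    captions = captioner.generate_sensor_json(timestamps)
    # The returned StringIO is ready for writing to file.
    with open(path, "w") as f:
        f.write(captions.getvalue())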

generate_vtt_for(timestamps: List[float]) → StringIO#

Generate WebVTT captions containing the sensor readings at given moments in time.

Parameters

timestamps (List[float]) – Timestamps for every frame in the motion/animal sequence

Returns

The WebVTT captions ready to be sent to a server

Return type

StringIO
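
The WebVTT variant is used in the same way; a sketch that writes the captions next to an MP4 file (the .vtt naming convention is just an illustration):

def save_vtt_captions(captioner, timestamps, video_path: str) -> None:
    # `captioner` is a VideoCaption built as in the previous sketch.
    vtt = captioner.generate_vtt_for(timestamps)
    with open(video_path.rsplit(".", 1)[0] + ".vtt", "w") as f:
        f.write(vtt.getvalue())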

class Writer(settings: WriterSettings, read_from: Tuple[Filter, SensorLogs])#

The Writer is a simple interface for writing the desired data to disk.
output_still(image: bytes, time: float, sensor_log: SensorLog)#

Output a still image with its sensor data. The sensor data is provided via the sensor_log argument.

Parameters
  • image (bytes) – The JPEG image frame

  • time (float) – UNIX timestamp when the image was captured

  • sensor_log (SensorLog) – Log of sensor values at time frame was captured

output_video(video: IO[bytes], caption: StringIO, time: float, **kwargs)#

Output a video with its metadata. The sensor data is provided via the video captions (caption).

Parameters
  • video (IO[bytes]) – MP4 video (codec: H.264 / MPEG-4 AVC Part 10)

  • caption (StringIO) – Caption of sensor readings as produced by VideoCaption.generate_sensor_json()

  • time (float) – UNIX timestamp when the video was captured
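
Putting the pieces together for the disk-writing case, a rough end-to-end sketch; normally the Writer drives itself from read_from, and the construction of the WriterSettings and the origin of the frame timestamps and MP4 file are assumptions here:

from DynAIkonTrap.comms import VideoCaption, Writer

def write_clip(settings, animal_filter, sensor_logs, mp4_path: str,
               timestamps, framerate: float) -> None:
    # `settings` is a WriterSettings instance created elsewhere.
    writer = Writer(settings, (animal_filter, sensor_logs))
    caption = VideoCaption(sensor_logs, framerate).generate_sensor_json(timestamps)
    with open(mp4_path, "rb") as video:
        writer.output_video(video=video, caption=caption, time=timestamps[0])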