
Software camera sync algorithm #184

Open

wants to merge 33 commits into base: next
Conversation

davidplowman
Collaborator

This series of commits adds the rpi.sync software camera sync algorithm.

naushir and others added 27 commits September 25, 2024 15:35
Add a vendor control rpi::ScalerCrops that is analogous to the current
core::ScalerCrop, but can apply a different crop to each configured
stream.

This control takes a span of Rectangle structures - the order of
rectangles must match the order of streams configured by the application.

Signed-off-by: Naushir Patuck <[email protected]>
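The ordering rule above (the i-th rectangle applies to the i-th configured stream) can be sketched as follows. This is an illustrative sketch only, with a simplified `Rectangle` standing in for libcamera's type, not the actual pipeline handler code:

```cpp
#include <cassert>
#include <map>
#include <vector>

// Simplified stand-in for libcamera's Rectangle type.
struct Rectangle {
	int x, y, width, height;
	bool operator==(const Rectangle &o) const
	{
		return x == o.x && y == o.y && width == o.width && height == o.height;
	}
};

// The i-th rectangle in the ScalerCrops control value applies to the i-th
// configured stream, so pairing them up is a simple index walk. Extra
// rectangles beyond the number of streams are ignored.
std::map<unsigned int, Rectangle>
assignCrops(const std::vector<Rectangle> &crops, unsigned int numStreams)
{
	std::map<unsigned int, Rectangle> cropParams;
	for (unsigned int i = 0; i < crops.size() && i < numStreams; i++)
		cropParams[i] = crops[i];
	return cropParams;
}
```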
Do not cache the scalerCrop_ parameter. The cached value is used to
update the request metadata, but since this is not an expensive
operation (and can only occur once per frame), caching it is of limited
value.

This will simplify logic in a future commit where we can specify a
crop per-output stream.

Signed-off-by: Naushir Patuck <[email protected]>
…op()

This will be required when we program separate crop values to each ISP
output in a future commit.

Signed-off-by: Naushir Patuck <[email protected]>
In preparation for assigning separate crop windows for each stream, add
a new CropParams structure that stores the existing ispCrop_ and
ispMinCropSize_ as fields. Use a new std::map to store a CropParams
structure where the map key is the index of the stream configuration in
the CameraConfiguration vector.

At present, only a single CropParams structure will be set at key == 0 to
preserve the existing crop handling logic.

Signed-off-by: Naushir Patuck <[email protected]>
Add an ispIndex field to CropParams that is used to track
which ISP output (0/1) will be used for a given stream during
configuration.

Tracking this information is required for an upcoming change where crop
rectangles can be specified for each configured stream. Currently, the
value is fixed to 0.

Signed-off-by: Naushir Patuck <[email protected]>
At this point, the index is unused, but will be in a future commit where
we can set different crops on each ISP output.

Signed-off-by: Naushir Patuck <[email protected]>
Handle multiple scaler crops being set through the rpi::ScalerCrops
control. We now populate the cropParams_ map in the loop where we handle
the output stream configuration items. The key of this map is the index
of the stream configuration structure set by the application. This will
also be the same index used to specify the crop rectangles through the
ScalerCrops control.

CameraData::applyScalerCrop() has been adapted to look at either
controls::ScalerCrop or controls::rpi::ScalerCrops. The former takes
priority over the latter, and if present, will apply the same scaler
crop to all output streams.

Finally, return all crops through the same ScalerCrops control via
request metadata. The first configured stream's crop rectangle is also
returned via the ScalerCrop control in the request metadata.

Signed-off-by: Naushir Patuck <[email protected]>
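The priority rule described above (ScalerCrop, if present, wins over ScalerCrops and applies to every stream) can be sketched like this. A hypothetical illustration with simplified types, not the actual CameraData::applyScalerCrop() implementation:

```cpp
#include <cassert>
#include <map>
#include <optional>
#include <vector>

// Simplified stand-in for libcamera's Rectangle type.
struct Rect {
	int x, y, w, h;
	bool operator==(const Rect &o) const
	{
		return x == o.x && y == o.y && w == o.w && h == o.h;
	}
};

// If the single legacy ScalerCrop control is set, apply the same crop to
// all output streams; otherwise fall back to the per-stream ScalerCrops
// entries, keyed by stream configuration index.
std::map<unsigned int, Rect>
resolveCrops(const std::optional<Rect> &scalerCrop,
	     const std::vector<Rect> &scalerCrops,
	     unsigned int numStreams)
{
	std::map<unsigned int, Rect> result;
	if (scalerCrop) {
		for (unsigned int i = 0; i < numStreams; i++)
			result[i] = *scalerCrop;
	} else {
		for (unsigned int i = 0; i < scalerCrops.size() && i < numStreams; i++)
			result[i] = scalerCrops[i];
	}
	return result;
}
```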
sensorInfo_ currently gets populated in configureIPA(), but is possibly
referenced in platformConfigure() which is called first. Fix this by
populating sensorInfo_ straight after configuring the sensor in
configure(), ensuring the fields are valid in the call to
platformConfigure().

Signed-off-by: Naushir Patuck <[email protected]>
Ensure we index the CameraConfiguration incrementally when setting up
the raw and output streams.

Signed-off-by: Naushir Patuck <[email protected]>
Add a new subproject wrap file for the libpisp library located at
https://github.com/raspberrypi/libpisp

The libpisp library is used to configure the Raspberry Pi 5 Frontend
and Backend ISP components.

Signed-off-by: Naushir Patuck <[email protected]>
Reviewed-by: David Plowman <[email protected]>
Add the Raspberry Pi 5 ISP (PiSP) pipeline handler to libcamera. To
include this pipeline handler in the build, set the following meson
option:

meson configure -Dpipelines=rpi/pisp

Signed-off-by: Naushir Patuck <[email protected]>
Reviewed-by: David Plowman <[email protected]>
Add the Raspberry Pi 5 ISP (PiSP) IPA to libcamera. To include this IPA
in the build, set the following meson option:

meson configure -Dipas=rpi/pisp

Signed-off-by: Naushir Patuck <[email protected]>
Reviewed-by: David Plowman <[email protected]>
The IMX708 sensor driver advertises its module variants (narrow/wide angle lens,
IR block/pass) by modifying the media entity name string. So add duplicate
entries for each variant.

Signed-off-by: Nick Hollinghurst <[email protected]>
Signed-off-by: Naushir Patuck <[email protected]>
Reviewed-by: Naushir Patuck <[email protected]>
Reviewed-by: David Plowman <[email protected]>
Look for the RAW mandatory stream flag in the pipeline handler config
file. If this flag is set, it guarantees that the application will
provide buffers for Unicam Image, so override the minUnicamBuffers and
minTotalUnicamBuffers config parameters in the following way:

- If startup drop frames are required, allocate at least 1 internal buffer.
- If no startup drop frames are required, do not allocate any internal buffers.

Look for the Output 0 mandatory stream flag in the pipeline handler
config file. If this flag is set, it guarantees that the application
will provide buffers for the ISP, so do not allocate any internal
buffers for the device.

Add a new rpi_apps.yaml pipeline handler config file that enables both
these flags.  To use the file, set the following env variable for a
custom build:

export LIBCAMERA_RPI_CONFIG_FILE=/usr/local/share/libcamera/pipeline/rpi/vc4/rpi_apps.yaml

or for a packaged install:

export LIBCAMERA_RPI_CONFIG_FILE=/usr/share/libcamera/pipeline/rpi/vc4/rpi_apps.yaml

Signed-off-by: Naushir Patuck <[email protected]>
… buffers

V4L2 only allows up to VIDEO_MAX_FRAME frames to be queued at a time, so
if we reach this limit, store the framebuffers in a pending queue, and
try to enqueue once a buffer has been dequeued.

Signed-off-by: Naushir Patuck <[email protected]>
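The overflow handling described above can be sketched as follows, with the kernel device replaced by a simple counter (VIDEO_MAX_FRAME is 32 in the V4L2 headers). This is an illustrative sketch, not the pipeline handler's actual buffer code:

```cpp
#include <cassert>
#include <cstddef>
#include <queue>

// V4L2's limit on simultaneously queued buffers per device.
constexpr unsigned int kVideoMaxFrame = 32;

class BufferQueue
{
public:
	// Returns true if the buffer went straight to the device, false if
	// it had to wait in the pending queue because the device is full.
	bool queueBuffer(int index)
	{
		if (queuedCount_ < kVideoMaxFrame) {
			queuedCount_++;
			return true;
		}
		pending_.push(index);
		return false;
	}

	// When a buffer is dequeued a slot frees up, so move one pending
	// buffer (if any) onto the device.
	void bufferDequeued()
	{
		queuedCount_--;
		if (!pending_.empty()) {
			pending_.pop();
			queuedCount_++;
		}
	}

	unsigned int queued() const { return queuedCount_; }
	std::size_t pendingCount() const { return pending_.size(); }

private:
	unsigned int queuedCount_ = 0;
	std::queue<int> pending_;
};
```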
Use an r-value reference in set() and setLocked(), allowing more
efficient metadata handling with std::forward and std::move if needed.

Signed-off-by: Naushir Patuck <[email protected]>
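The shape of such a set() can be sketched with a forwarding reference, which lets callers std::move() large values in without a copy. A minimal sketch only, not the actual RPiController::Metadata class:

```cpp
#include <cassert>
#include <map>
#include <string>
#include <utility>

// Simplified metadata store keyed by tag name. The template parameter on
// set() is a forwarding reference: passing an lvalue copies, passing an
// rvalue (e.g. via std::move) moves.
class Metadata
{
public:
	template<typename T>
	void set(const std::string &tag, T &&value)
	{
		data_[tag] = std::forward<T>(value);
	}

	const std::string &get(const std::string &tag) { return data_[tag]; }

private:
	std::map<std::string, std::string> data_;
};
```

A caller can then write `metadata.set("blob", std::move(bigString))` and avoid copying the payload.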
This function erases a key/value pair from the metadata object.

Signed-off-by: Naushir Patuck <[email protected]>
This property (cfeDataBufferStrided) indicates if the CSI-2 hardware
writes to the embedded/metadata buffer directly, or if it treats the
buffer like an image buffer and strides the metadata lines.

Unicam writes this buffer strided, while the PiSP Frontend writes to it
directly. This information will be relevant to data parsers in the
helpers where the data is structured in lines.

Signed-off-by: Naushir Patuck <[email protected]>
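Why the stride matters to a parser can be illustrated as follows: the same lines of metadata are either packed back to back or placed at an image-buffer stride, and a parser has to step accordingly. A hypothetical sketch, not the actual camera helper code:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Gather numLines lines of metadata, each lineLength bytes long, from a
// buffer. For an unstrided buffer pass stride == lineLength; for a strided
// buffer (written like an image) pass the image stride instead.
std::vector<uint8_t> gatherLines(const std::vector<uint8_t> &buf,
				 unsigned int lineLength,
				 unsigned int numLines,
				 unsigned int stride)
{
	std::vector<uint8_t> out;
	for (unsigned int line = 0; line < numLines; line++) {
		auto start = buf.begin() + line * stride;
		out.insert(out.end(), start, start + lineLength);
	}
	return out;
}
```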
Add the following RPi vendor controls to handle Convolutional Neural
Network processing:

CnnOutputTensor
CnnOutputTensorInfo
CnnEnableInputTensor
CnnInputTensor
CnnInputTensorInfo
CnnKpiInfo

These controls will be used to support the new Raspberry Pi AI Camera,
using an IMX500 sensor with on-board neural network processing.

Signed-off-by: Naushir Patuck <[email protected]>
Add code to handle the new CNN vendor controls in the Raspberry Pi IPA.

The value of CnnInputTensorInfo is cached as it is the only stateful
input control.

All other controls are output controls, and their values are copied
directly from the rpiMetadata object if present. The camera helpers
populate the rpiMetadata object if the sensor supports on-board CNN
processing, such as the IMX500.

Signed-off-by: Naushir Patuck <[email protected]>
Add a CamHelper::setHwConfig() helper used by the IPA to set the
hardware configuration in use by the pipeline. This will be needed by
the IMX500 camera helper in a future commit to determine if the
metadata buffer is strided.

Signed-off-by: Naushir Patuck <[email protected]>
Add a Sony IMX500 camera helper to the IPA. This also includes support
for the on-chip CNN hardware accelerator and parsing of the neural
network data stream returned in the metadata buffer.

Add tuning files for both VC4 and PiSP platforms.

Signed-off-by: Naushir Patuck <[email protected]>
The camera sync algorithm uses the following new controls:

SyncMode - a camera can be a server or client
SyncWait - whether the sync point has been reached
SyncLag - how far away from synchronisation a camera was
SyncFrameWallClock - for passing wall clock time to the IPA.

Signed-off-by: David Plowman <[email protected]>
Signed-off-by: Arsen Mikovic <[email protected]>
Subsequent commits will add actual values to this queue, which we can
then use for sending wallclock timestamps over to the IPAs, where a
future "synchronisation algorithm" can use them.

Also add code to return the wallclock time to the application through
the frame metadata.

Signed-off-by: Naushir Patuck <[email protected]>
@davidplowman
Collaborator Author

@naushir For your enjoyment...!

@naushir naushir left a comment
Collaborator

Minor things, but otherwise looks good!

src/libcamera/pipeline/rpi/vc4/vc4.cpp
src/ipa/rpi/common/ipa_base.cpp
maxJitter_ = maxJitter;
minPts_ = minPts;
reset();
}
Collaborator

Why not move this to the constructor?

Also, maybe worth having maxJitter as a duration type? Although maybe not if the below calcs use int types.

Collaborator Author

Ah yes, I didn't get this quite right. I meant to call initialise() after reading the parameters from the tuning file, but forgot. So I'll put that in (also, things like maxJitter don't need to be member variables either any more).

I'll see if making maxJitter a duration helps me or not...

src/ipa/rpi/controller/rpi/sync.cpp
constexpr unsigned int kDefaultMinAdjustment = 50;
constexpr unsigned int kDefaultFitNumPts = 100;
constexpr unsigned int kDefaultFitMaxJitter = 500;
constexpr unsigned int kDefaultFitMinPts = 10;
Collaborator

Although libcamera now uses this k notation, none of our other algorithms do. Not really fussed either way.

Collaborator Author

Based on the recent AWB patches, I sense leaving the ks in may be preferable...

We add wallclock timestamps to the queue when we dequeue a camera
buffer from the CFE.

Signed-off-by: Naushir Patuck <[email protected]>
We add wallclock timestamps to the queue when we dequeue a camera
buffer from Unicam.

Signed-off-by: Naushir Patuck <[email protected]>
We add a base class for a "sync algorithm", and define its inputs and
outputs in the SyncStatus class.

We add the necessary plumbing to the base IPA code so as to arrange
for the necessary parameters to be made available to such an
algorithm, and also to handle the return values, passing them back as
necessary to the pipeline handler.

Signed-off-by: Naushir Patuck <[email protected]>
@davidplowman davidplowman force-pushed the sync branch 5 times, most recently from d0f9609 to 6f7e33e Compare October 4, 2024 08:58
In this implementation, the server sends data packets out onto the
network every 30 frames or so.

Clients listening for this packet will send frame length deltas back
to the pipeline handler to match the synchronisation of the server.

We can use wallclock timestamps so that the process will actually work
across networked Pis, but it does rely on those wallclocks being
properly synchronised. We de-jitter our wallclock measurements (as
they're made in userspace) to match the more accurate kernel
SensorTimestamp value.

When the server's advertised "ready time" is reached, both client and
server will signal this through metadata back to their respective
controlling applications.

Signed-off-by: David Plowman <[email protected]>
Signed-off-by: Arsen Mikovic <[email protected]>
Signed-off-by: Naushir Patuck <[email protected]>
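One common way to de-jitter userspace wallclock measurements against a steadier timestamp source is a least-squares linear fit of wallclock against the kernel SensorTimestamp, then reading smoothed values off the fitted line. This is an assumption about the approach (the tuning constants like kDefaultFitNumPts and kDefaultFitMaxJitter in the review quotes suggest a fit over recent points), not necessarily this PR's exact method:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Fit wallclock = a + b * sensorTimestamp by ordinary least squares over
// a window of recent samples.
struct Fit {
	double a, b;
};

Fit fitWallclock(const std::vector<double> &ts, const std::vector<double> &wc)
{
	double n = static_cast<double>(ts.size());
	double st = 0, sw = 0, stt = 0, stw = 0;
	for (std::size_t i = 0; i < ts.size(); i++) {
		st += ts[i];
		sw += wc[i];
		stt += ts[i] * ts[i];
		stw += ts[i] * wc[i];
	}
	double b = (n * stw - st * sw) / (n * stt - st * st);
	double a = (sw - b * st) / n;
	return { a, b };
}

// Replace a jittery raw wallclock reading with the value predicted by the
// fit at the (accurate) kernel sensor timestamp.
double dejitter(const Fit &f, double sensorTimestamp)
{
	return f.a + f.b * sensorTimestamp;
}
```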
Standard sync parameters are added to all tuning files.

Signed-off-by: David Plowman <[email protected]>
Standard sync parameters are added to all tuning files.

Signed-off-by: David Plowman <[email protected]>