diff mbox series

[v4,04/21] doc: media/v4l-drivers: Add Qualcomm Camera Subsystem driver document

Message ID 1502199018-28250-5-git-send-email-todor.tomov@linaro.org
State Accepted
Commit 15fc39aedf8e58654f954e1caf6b91680b1c4a77
Headers show
Series Qualcomm 8x16 Camera Subsystem driver | expand

Commit Message

Todor Tomov Aug. 8, 2017, 1:30 p.m. UTC
Add a document to describe Qualcomm Camera Subsystem driver.

Signed-off-by: Todor Tomov <todor.tomov@linaro.org>

---
 Documentation/media/v4l-drivers/qcom_camss.rst | 124 +++++++++++++++++++++++++
 1 file changed, 124 insertions(+)
 create mode 100644 Documentation/media/v4l-drivers/qcom_camss.rst

-- 
2.7.4

Comments

Todor Tomov Aug. 18, 2017, 7:53 a.m. UTC | #1
Hi Hans,

On 18.08.2017 10:45, Hans Verkuil wrote:
> Hi Todor,

> 

> A few small comments below:

> 

> On 08/08/2017 03:30 PM, Todor Tomov wrote:

>> Add a document to describe Qualcomm Camera Subsystem driver.

>>

>> Signed-off-by: Todor Tomov <todor.tomov@linaro.org>

>> ---

>>  Documentation/media/v4l-drivers/qcom_camss.rst | 124 +++++++++++++++++++++++++

>>  1 file changed, 124 insertions(+)

>>  create mode 100644 Documentation/media/v4l-drivers/qcom_camss.rst

>>

>> diff --git a/Documentation/media/v4l-drivers/qcom_camss.rst b/Documentation/media/v4l-drivers/qcom_camss.rst

>> new file mode 100644

>> index 0000000..4707ea7

>> --- /dev/null

>> +++ b/Documentation/media/v4l-drivers/qcom_camss.rst

>> @@ -0,0 +1,124 @@

>> +.. include:: <isonum.txt>

>> +

>> +Qualcomm Camera Subsystem driver

>> +================================

>> +

>> +Introduction

>> +------------

>> +

>> +This file documents the Qualcomm Camera Subsystem driver located under

>> +drivers/media/platform/qcom/camss-8x16.

>> +

>> +The current version of the driver supports the Camera Subsystem found on

>> +Qualcomm MSM8916 and APQ8016 processors.

>> +

>> +The driver implements V4L2, Media controller and V4L2 subdev interfaces.

>> +Camera sensor using V4L2 subdev interface in the kernel is supported.

>> +

>> +The driver is implemented using as a reference the Qualcomm Camera Subsystem

>> +driver for Android as found in Code Aurora [#f1]_.

>> +

>> +

>> +Qualcomm Camera Subsystem hardware

>> +----------------------------------

>> +

>> +The Camera Subsystem hardware found on 8x16 processors and supported by the

>> +driver consists of:

>> +

>> +- 2 CSIPHY modules. They handle the Physical layer of the CSI2 receivers.

>> +  A separate camera sensor can be connected to each of the CSIPHY module;

>> +- 2 CSID (CSI Decoder) modules. They handle the Protocol and Application layer

>> +  of the CSI2 receivers. A CSID can decode data stream from any of the CSIPHY.

>> +  Each CSID also contains a TG (Test Generator) block which can generate

>> +  artificial input data for test purposes;

>> +- ISPIF (ISP Interface) module. Handles the routing of the data streams from

>> +  the CSIDs to the inputs of the VFE;

>> +- VFE (Video Front End) module. Contains a pipeline of image processing hardware

>> +  blocks. The VFE has different input interfaces. The PIX input interface feeds

>> +  the input data to the image processing pipeline. Three RDI input interfaces

>> +  bypass the image processing pipeline. The VFE also contains the AXI bus

>> +  interface which writes the output data to memory.

> 

> Can you explain what PIX and RDI stand for?

> 

> I would also think it is a good idea to add a comment at the top of the various

> subdev sources that say a bit more than just "CSID Module".

> 

> A simple "CSID (CSI Decoder) Module" is enough. Just so the reader knows what

> it is all about.

> 

> Otherwise I don't have any more comments about this series.

> 

> I don't need a v5 for this, if you can just post one patch for this documentation

> and one patch improving the source comments as described above, then that's

> fine with me.


Thank you for the review again.
I'll post two additional patches to add explanations of the abbreviations.

> 

> Regards,

> 

> 	Hans

> 

>> +

>> +

>> +Supported functionality

>> +-----------------------

>> +

>> +The current version of the driver supports:

>> +

>> +- input from camera sensor via CSIPHY;

>> +- generation of test input data by the TG in CSID;

>> +- raw dump of the input data to memory. RDI interface of VFE is supported.

>> +  PIX interface (ISP processing, statistics engines, resize/crop, format

>> +  conversion) is not supported in the current version;

>> +- concurrent and independent usage of two data inputs - could be camera sensors

>> +  and/or TG.

>> +

>> +

>> +Driver Architecture and Design

>> +------------------------------

>> +

>> +The driver implements the V4L2 subdev interface. With the goal to model the

>> +hardware links between the modules and to expose a clean, logical and usable

>> +interface, the driver is split into V4L2 sub-devices as follows:

>> +

>> +- 2 CSIPHY sub-devices - each CSIPHY is represented by a single sub-device;

>> +- 2 CSID sub-devices - each CSID is represented by a single sub-device;

>> +- 2 ISPIF sub-devices - ISPIF is represented by a number of sub-devices equal

>> +  to the number of CSID sub-devices;

>> +- 3 VFE sub-devices - VFE is represented by a number of sub-devices equal to

>> +  the number of RDI input interfaces.

>> +

>> +The considerations to split the driver in this particular way are as follows:

>> +

>> +- representing CSIPHY and CSID modules by a separate sub-device for each module

>> +  allows to model the hardware links between these modules;

>> +- representing VFE by a separate sub-devices for each RDI input interface allows

>> +  to use the three RDI interfaces concurently and independently as this is

>> +  supported by the hardware;

>> +- representing ISPIF by a number of sub-devices equal to the number of CSID

>> +  sub-devices allows to create linear media controller pipelines when using two

>> +  cameras simultaneously. This avoids branches in the pipelines which otherwise

>> +  will require a) userspace and b) media framework (e.g. power on/off

>> +  operations) to  make assumptions about the data flow from a sink pad to a

>> +  source pad on a single media entity.

>> +

>> +Each VFE sub-device is linked to a separate video device node.

>> +

>> +The complete list of the media entities (V4L2 sub-devices and video device

>> +nodes) is as follows:

>> +

>> +- msm_csiphy0

>> +- msm_csiphy1

>> +- msm_csid0

>> +- msm_csid1

>> +- msm_ispif0

>> +- msm_ispif1

>> +- msm_vfe0_rdi0

>> +- msm_vfe0_video0

>> +- msm_vfe0_rdi1

>> +- msm_vfe0_video1

>> +- msm_vfe0_rdi2

>> +- msm_vfe0_video2

>> +

>> +

>> +Implementation

>> +--------------

>> +

>> +Runtime configuration of the hardware (updating settings while streaming) is

>> +not required to implement the currently supported functionality. The complete

>> +configuration on each hardware module is applied on STREAMON ioctl based on

>> +the current active media links, formats and controls set.

>> +

>> +

>> +Documentation

>> +-------------

>> +

>> +APQ8016 Specification:

>> +https://developer.qualcomm.com/download/sd410/snapdragon-410-processor-device-specification.pdf

>> +Referenced 2016-11-24.

>> +

>> +

>> +References

>> +----------

>> +

>> +.. [#f1] https://source.codeaurora.org/quic/la/kernel/msm-3.10/

>>

> 


-- 
Best regards,
Todor Tomov
Daniel Mack Aug. 25, 2017, 2:10 p.m. UTC | #2
Hi Todor,

Thanks a lot for working on the upstream support for this!

On 08/08/2017 03:30 PM, Todor Tomov wrote:
> +The Camera Subsystem hardware found on 8x16 processors and supported by the

> +driver consists of:

> +

> +- 2 CSIPHY modules. They handle the Physical layer of the CSI2 receivers.

> +  A separate camera sensor can be connected to each of the CSIPHY module;

> +- 2 CSID (CSI Decoder) modules. They handle the Protocol and Application layer

> +  of the CSI2 receivers. A CSID can decode data stream from any of the CSIPHY.

> +  Each CSID also contains a TG (Test Generator) block which can generate

> +  artificial input data for test purposes;

> +- ISPIF (ISP Interface) module. Handles the routing of the data streams from

> +  the CSIDs to the inputs of the VFE;

> +- VFE (Video Front End) module. Contains a pipeline of image processing hardware

> +  blocks. The VFE has different input interfaces. The PIX input interface feeds

> +  the input data to the image processing pipeline. Three RDI input interfaces

> +  bypass the image processing pipeline. The VFE also contains the AXI bus

> +  interface which writes the output data to memory.


[I'm working with the 4.9 Linaro downstream version of this code right now,
but at a glance the driver version there looks very much identical to
this one.]

Could you explain how ISPIF, CSID and CSIPHY are related?

I have a userspace test setup that works fine for USB webcams, but when
operating on any of the video devices exposed by this driver, the
lowlevel functions such as .s_power of the ISPIF, CSID, CSIPHY and the
sensor driver layers aren't called into.

The general setup seems to work fine though. The sensor is probed,
camss_subdev_notifier_complete() is called, and the v4l2 subdevices
exist. But the stream start is not propagated to the other layers, and
I'm trying to understand why.

My DTS looks something like this right now, and the hardware is an
APQ8016 board (Variscite DART SD410).

&i2c {
	cam0: ov5640@3c {
		compatible = "ovti,ov5640";
		reg = <0x3c>;

		// clocks, regulators, gpios etc are omitted

		port {
			cam0_ep: endpoint {
				clock-lanes = <1>;
				data-lanes = <0 2>;
				remote-endpoint = <&csiphy0_ep>;
			};
		};
	};
};

&camss {
	ports {
		port@0 {
			reg = <0>;
			csiphy0_ep: endpoint {
				clock-lanes = <1>;
				data-lanes = <0 1 2 3>;
				qcom,settle-cnt = <0xe>;
				remote-endpoint = <&cam0_ep>;
			};
		};
	};
};

Also, which video device should be opened when accessing the cameras on
each of the hardware ports? And what are the other two devices doing?

I'm sure I'm missing something trivial, but at least I can't find this
information in the documentation.


Thanks,
Daniel
Daniel Mack Aug. 29, 2017, 5:02 p.m. UTC | #3
Hi Todor,

On 08/28/2017 09:10 AM, Todor Tomov wrote:
> On 25.08.2017 17:10, Daniel Mack wrote:

>> I have a userspace test setup that works fine for USB webcams, but when

>> operating on any of the video devices exposed by this driver, the

>> lowlevel functions such as .s_power of the ISPIF, CSID, CSIPHY and the

>> sensor driver layers aren't called into.

> 

> Have you activated the media controller links? The s_power is called

> when the subdev is part of a pipeline in which the video device node

> is opened. You can see example configurations for the Qualcomm CAMSS

> driver on:

> https://github.com/96boards/documentation/blob/master/ConsumerEdition/DragonBoard-410c/Guides/CameraModule.md

> This will probably answer most of your questions.


Yes, it does, thank you! I didn't expect any manual setup to be necessary
at this level, given this sentence in the documentation:

> Runtime configuration of the hardware (updating settings while

> streaming) is not required to implement the currently supported

> functionality.


I hence assumed there is a fixed mapping that is partly derived from DTS
information etc. Anyway, this seems to work now, so thanks a bunch for
the pointer!
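(For readers who hit the same issue: the manual setup in question is the media-ctl link and format configuration described in the 96boards guide. A minimal sketch for the first RDI pipeline follows; the sensor entity name "ov5640 4-003c", the bus code and the resolution are assumptions for illustration, not taken from the guide.)

```shell
# Enable the links of the first RDI pipeline
# (entity names as exposed by the camss driver):
media-ctl -d /dev/media0 -l '"msm_csiphy0":1->"msm_csid0":0[1],
    "msm_csid0":1->"msm_ispif0":0[1],"msm_ispif0":1->"msm_vfe0_rdi0":0[1]'

# Set the same format on every pad along the pipeline; the sensor
# entity name, bus code and resolution below are assumptions:
media-ctl -d /dev/media0 -V '"ov5640 4-003c":0[fmt:UYVY8_2X8/1920x1080],
    "msm_csiphy0":0[fmt:UYVY8_2X8/1920x1080],
    "msm_csid0":0[fmt:UYVY8_2X8/1920x1080],
    "msm_ispif0":0[fmt:UYVY8_2X8/1920x1080],
    "msm_vfe0_rdi0":0[fmt:UYVY8_2X8/1920x1080]'
```

Only once a video node's full pipeline has active links does the driver call .s_power on the sub-devices along it, which explains the behaviour described earlier in the thread.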

Another thing that confused me for a while is the CCI driver, which also
exposes media pads. I have the cameras connected to a regular I2C bus
however, and it seems to work fine. That just leaves the question why
this CCI driver exists at all.

I also have some more questions, but they are even more platform
specific, so I'll rather post them in the 96boards forum.


Thanks again!
Daniel
Daniel Mack Oct. 16, 2017, 3:01 p.m. UTC | #4
Hi,

On 28.08.2017 09:10, Todor Tomov wrote:
> On 25.08.2017 17:10, Daniel Mack wrote:

>> Could you explain how ISPIF, CSID and CSIPHY are related?

>>

>> I have a userspace test setup that works fine for USB webcams, but when

>> operating on any of the video devices exposed by this driver, the

>> lowlevel functions such as .s_power of the ISPIF, CSID, CSIPHY and the

>> sensor driver layers aren't called into.

> 

> Have you activated the media controller links? The s_power is called

> when the subdev is part of a pipeline in which the video device node

> is opened. You can see example configurations for the Qualcomm CAMSS

> driver on:

> https://github.com/96boards/documentation/blob/master/ConsumerEdition/DragonBoard-410c/Guides/CameraModule.md

> This will probably answer most of your questions.


It did in fact, yes. Thanks again for the pointer.

I am however struggling getting a 4-lane OV13855 camera to work with
this camss driver, and I'd be happy to hear about similar setups that work.

In short, here's what my setup looks like:

1. I wrote a driver for the OV13855 sensor, based on the one for OV13858
but with updated register values. It announces
MEDIA_BUS_FMT_SBGGR10_1X10 as bus format which is what the sensor should
be sending, if I understand the specs correctly.


2. The DTS snippet for the endpoint connection looks like this:

&blsp_i2c6 {
	cam0: ov13855@16 {
		/* ... */
		port {
			cam0_ep: endpoint {
				clock-lanes = <1>;
				data-lanes = <0 2 3 4>;
				remote-endpoint = <&csiphy0_ep>;
			};
		};
	};
};

&camss {
	ports {
		port@0 {
			reg = <0>;
			csiphy0_ep: endpoint {
				clock-lanes = <1>;
				data-lanes = <0 2 3 4>;
				remote-endpoint = <&cam0_ep>;
			};
		};
	};
};

There are also no lane swaps or any intermediate components in hardware.
We've checked the electrical bits many times, and that end seems alright.


3. The pads and links are set up like this:

# media-ctl -d /dev/media0 -l \
    '"msm_csiphy0":1->"msm_csid0":0[1],"msm_csid0":1->"msm_ispif0":0[1],"msm_ispif0":1->"msm_vfe0_rdi0":0[1]'

# media-ctl -d /dev/media0 -V \
    '"ov13855 1-0010":0[fmt:SBGGR10_1X10/4224x3136 field:none],
     "msm_csiphy0":0[fmt:SBGGR10_1X10/4224x3136 field:none],
     "msm_csid0":0[fmt:SBGGR10_1X10/4224x3136 field:none],
     "msm_ispif0":0[fmt:SBGGR10_1X10/4224x3136 field:none],
     "msm_vfe0_rdi0":0[fmt:SBGGR10_1X10/4224x3136 field:none]'

Both commands succeed.


4. When streaming is started, the power consumption of the device goes
up, all necessary external clocks and voltages are provided and are
stable, and I can see a continuous stream of data on all 4 MIPI lanes
using an oscilloscope.


5. Capturing frames with the following yavta command doesn't work
though. The task is mostly stuck in the buffer dequeueing ioctl:

# yavta -B capture-mplane -c10 -I -n 5 -f SBGGR10P -s 4224x3136 /dev/video0

vfe_isr() does fire sometimes with VFE_0_IRQ_STATUS_1_RDIn_SOF(0) set,
but very occasionally only, and the frames do not contain data.

FWIW, an ov5640 is connected to port 1 of the camss, and this sensor
works fine.

I'd be grateful for any pointer about what I could investigate on.


Thanks,
Daniel
Todor Tomov Oct. 25, 2017, 12:07 p.m. UTC | #5
Hi Daniel,

On 16.10.2017 18:01, Daniel Mack wrote:
> Hi,

> 

> On 28.08.2017 09:10, Todor Tomov wrote:

>> On 25.08.2017 17:10, Daniel Mack wrote:

>>> Could you explain how ISPIF, CSID and CSIPHY are related?

>>>

>>> I have a userspace test setup that works fine for USB webcams, but when

>>> operating on any of the video devices exposed by this driver, the

>>> lowlevel functions such as .s_power of the ISPIF, CSID, CSIPHY and the

>>> sensor driver layers aren't called into.

>>

>> Have you activated the media controller links? The s_power is called

>> when the subdev is part of a pipeline in which the video device node

>> is opened. You can see example configurations for the Qualcomm CAMSS

>> driver on:

>> https://github.com/96boards/documentation/blob/master/ConsumerEdition/DragonBoard-410c/Guides/CameraModule.md

>> This will probably answer most of your questions.

> 

> It did in fact, yes. Thanks again for the pointer.

> 

> I am however struggling getting a 4-lane OV13855 camera to work with

> this camss driver, and I'd be happy to hear about similar setups that work.

> 

> In short, here's what my setup looks like:

> 

> 1. I wrote a driver for the OV13855 sensor, based on the one for OV13858

> but with updated register values. It announces

> MEDIA_BUS_FMT_SBGGR10_1X10 as bus format which is what the sensor should

> be sending, if I understand the specs correctly.

> 

> 

> 2. The DTS snippet for the endpoint connection look like this:

> 

> &blsp_i2c6 {

> 	cam0: ov13855@16 {

> 		/* ... */

> 		port {

> 			cam0_ep: endpoint {

> 				clock-lanes = <1>;

> 				data-lanes = <0 2 3 4>;

> 				remote-endpoint = <&csiphy0_ep>;

> 			};

> 		};

> 	};

> };

> 

> &camss {

> 	ports {

> 		port@0 {

> 			reg = <0>;

> 			csiphy0_ep: endpoint {

> 				clock-lanes = <1>;

> 				data-lanes = <0 2 3 4>;

> 				remote-endpoint = <&cam0_ep>;

> 			};

> 		};

> 	};

> };

> 

> There are also no lane swaps or any intermediate components in hardware.

> We've checked the electrical bits many times, and that end seems alright.

> 

> 

> 3. The pads and links are set up like this:

> 

> # media-ctl -d /dev/media0 -l

> '"msm_csiphy0":1->"msm_csid0":0[1],"msm_csid0":1->"msm_ispif0":0[1],"msm_ispif0":1->"msm_vfe0_rdi0":0[1]'

> 

> # media-ctl -d /dev/media0 -V '"ov13855

> 1-0010":0[fmt:SBGGR10_1X10/4224x3136

> field:none],"msm_csiphy0":0[fmt:SBGGR10_1X10/4224x3136

> field:none],"msm_csid0":0[fmt:SBGGR10_1X10/4224x3136

> field:none],"msm_ispif0":0[fmt:SBGGR10_1X10/4224x3136

> field:none],"msm_vfe0_rdi0":0[fmt:SBGGR10_1X10/4224x3136 field:none]'

> 

> Both commands succeed.

> 

> 

> 4. When streaming is started, the power consumption of the device goes

> up, all necessary external clocks and voltages are provided and are

> stable, and I can see a continuous stream of data on all 4 MIPI lanes

> using an oscilloscope.

> 

> 

> 5. Capturing frames with the following yavta command doesn't work

> though. The task is mostly stuck in the buffer dequeing ioctl:

> 

> # yavta -B capture-mplane -c10 -I -n 5 -f SBGGR10P -s 4224x3136 /dev/video0

> 

> vfe_isr() does fire sometimes with VFE_0_IRQ_STATUS_1_RDIn_SOF(0) set,

> but very occasionally only, and the frames do not contain data.

> 

> FWIW, an ov6540 is connected to port 1 of the camss, and this sensor

> works fine.

> 

> I'd be grateful for any pointer about what I could investigate on.

>


Everything that you have described seems correct.

As you say that frames do not contain any data, do
VFE_0_IRQ_STATUS_0_IMAGE_MASTER_n_PING_PONG
fire at all or not?

Do you see any interrupts on the ISPIF? Which?

Could you please share what hardware setup you have (mezzanine and camera module)?
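(One quick way to answer the interrupt questions above from userspace is to watch the interrupt counters across a capture attempt; whether the CAMSS lines show up under these names depends on how the interrupts are declared in the device tree, so treat the grep pattern as an assumption.)

```shell
# Snapshot CAMSS-related interrupt counters, attempt a capture, then
# compare; the name pattern is an assumption and may differ per DT.
grep -iE 'vfe|ispif|csid|csiphy' /proc/interrupts > /tmp/irq.before
yavta -B capture-mplane -c10 -I -n 5 -f SBGGR10P -s 4224x3136 /dev/video0
grep -iE 'vfe|ispif|csid|csiphy' /proc/interrupts > /tmp/irq.after
diff /tmp/irq.before /tmp/irq.after
```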


-- 
Best regards,
Todor Tomov

Patch

diff --git a/Documentation/media/v4l-drivers/qcom_camss.rst b/Documentation/media/v4l-drivers/qcom_camss.rst
new file mode 100644
index 0000000..4707ea7
--- /dev/null
+++ b/Documentation/media/v4l-drivers/qcom_camss.rst
@@ -0,0 +1,124 @@ 
+.. include:: <isonum.txt>
+
+Qualcomm Camera Subsystem driver
+================================
+
+Introduction
+------------
+
+This file documents the Qualcomm Camera Subsystem driver located under
+drivers/media/platform/qcom/camss-8x16.
+
+The current version of the driver supports the Camera Subsystem found on
+Qualcomm MSM8916 and APQ8016 processors.
+
+The driver implements the V4L2, Media controller and V4L2 subdev interfaces.
+Camera sensors which implement the kernel V4L2 subdev interface are supported.
+
+The driver is implemented using the Qualcomm Camera Subsystem driver for
+Android, as found in Code Aurora [#f1]_, as a reference.
+
+
+Qualcomm Camera Subsystem hardware
+----------------------------------
+
+The Camera Subsystem hardware found on 8x16 processors and supported by the
+driver consists of:
+
+- 2 CSIPHY modules. They handle the Physical layer of the CSI2 receivers.
+  A separate camera sensor can be connected to each of the CSIPHY modules;
+- 2 CSID (CSI Decoder) modules. They handle the Protocol and Application layer
+  of the CSI2 receivers. A CSID can decode a data stream from any CSIPHY.
+  Each CSID also contains a TG (Test Generator) block which can generate
+  artificial input data for test purposes;
+- ISPIF (ISP Interface) module. Handles the routing of the data streams from
+  the CSIDs to the inputs of the VFE;
+- VFE (Video Front End) module. Contains a pipeline of image processing hardware
+  blocks. The VFE has different input interfaces. The PIX input interface feeds
+  the input data to the image processing pipeline. Three RDI input interfaces
+  bypass the image processing pipeline. The VFE also contains the AXI bus
+  interface which writes the output data to memory.
+
+
+Supported functionality
+-----------------------
+
+The current version of the driver supports:
+
+- input from camera sensor via CSIPHY;
+- generation of test input data by the TG in CSID;
+- raw dump of the input data to memory. RDI interface of VFE is supported.
+  PIX interface (ISP processing, statistics engines, resize/crop, format
+  conversion) is not supported in the current version;
+- concurrent and independent usage of two data inputs - could be camera sensors
+  and/or TG.
+
+
+Driver Architecture and Design
+------------------------------
+
+The driver implements the V4L2 subdev interface. With the goal to model the
+hardware links between the modules and to expose a clean, logical and usable
+interface, the driver is split into V4L2 sub-devices as follows:
+
+- 2 CSIPHY sub-devices - each CSIPHY is represented by a single sub-device;
+- 2 CSID sub-devices - each CSID is represented by a single sub-device;
+- 2 ISPIF sub-devices - ISPIF is represented by a number of sub-devices equal
+  to the number of CSID sub-devices;
+- 3 VFE sub-devices - VFE is represented by a number of sub-devices equal to
+  the number of RDI input interfaces.
+
+The considerations to split the driver in this particular way are as follows:
+
+- representing CSIPHY and CSID modules by a separate sub-device for each module
+  allows modelling the hardware links between these modules;
+- representing the VFE by a separate sub-device for each RDI input interface
+  allows using the three RDI interfaces concurrently and independently, as
+  supported by the hardware;
+- representing the ISPIF by a number of sub-devices equal to the number of CSID
+  sub-devices allows creating linear media controller pipelines when using two
+  cameras simultaneously. This avoids branches in the pipelines which would
+  otherwise require a) userspace and b) the media framework (e.g. power on/off
+  operations) to make assumptions about the data flow from a sink pad to a
+  source pad on a single media entity.
+
+Each VFE sub-device is linked to a separate video device node.
+
+The complete list of the media entities (V4L2 sub-devices and video device
+nodes) is as follows:
+
+- msm_csiphy0
+- msm_csiphy1
+- msm_csid0
+- msm_csid1
+- msm_ispif0
+- msm_ispif1
+- msm_vfe0_rdi0
+- msm_vfe0_video0
+- msm_vfe0_rdi1
+- msm_vfe0_video1
+- msm_vfe0_rdi2
+- msm_vfe0_video2
+
+
+Implementation
+--------------
+
+Runtime configuration of the hardware (updating settings while streaming) is
+not required to implement the currently supported functionality. The complete
+configuration on each hardware module is applied on STREAMON ioctl based on
+the current active media links, formats and controls set.
+
+
+Documentation
+-------------
+
+APQ8016 Specification:
+https://developer.qualcomm.com/download/sd410/snapdragon-410-processor-device-specification.pdf
+Referenced 2016-11-24.
+
+
+References
+----------
+
+.. [#f1] https://source.codeaurora.org/quic/la/kernel/msm-3.10/