diff mbox series

[1/2] media: dt-bindings: media: i2c: Add MT9M114 camera sensor binding

Message ID 20200511233456.9722-1-laurent.pinchart@ideasonboard.com
State New
Series [1/2] media: dt-bindings: media: i2c: Add MT9M114 camera sensor binding

Commit Message

Laurent Pinchart May 11, 2020, 11:34 p.m. UTC
Add device tree binding for the ON Semiconductor MT9M114 CMOS camera
sensor.

Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
---
 .../bindings/media/i2c/onnn,mt9m114.yaml      | 188 ++++++++++++++++++
 MAINTAINERS                                   |   7 +
 2 files changed, 195 insertions(+)
 create mode 100644 Documentation/devicetree/bindings/media/i2c/onnn,mt9m114.yaml

Comments

Laurent Pinchart Feb. 6, 2022, 9:41 p.m. UTC | #1
Hi Sakari,

Reviving this old mail thread, as there are a few open questions. I
don't think they're blocking, though, so I'll post a new version of the
driver for review soon.

On Sun, May 17, 2020 at 01:38:08AM +0300, Laurent Pinchart wrote:
> On Wed, May 13, 2020 at 06:12:19PM +0300, Sakari Ailus wrote:
> > On Wed, May 13, 2020 at 03:36:40PM +0300, Laurent Pinchart wrote:
> > > On Wed, May 13, 2020 at 02:53:56PM +0300, Sakari Ailus wrote:
> > > > On Wed, May 13, 2020 at 03:00:27AM +0300, Laurent Pinchart wrote:
> > > > > On Wed, May 13, 2020 at 01:55:29AM +0300, Sakari Ailus wrote:
> > > > > > On Tue, May 12, 2020 at 02:34:56AM +0300, Laurent Pinchart wrote:
> > > > > > > The MT9M114 is a CMOS camera sensor that combines a 1296x976 pixel array
> > > > > > > with a 10-bit dynamic range together with an internal ISP. The driver
> > > > > > > exposes two subdevs, one for the pixel array and one for the ISP (named
> > > > > > > IFP for Image Flow Processor). Major supported features are
> > > > > > > 
> > > > > > > - Full configuration of analog crop and binning in the pixel array
> > > > > > > - Full configuration of scaling in the ISP
> > > > > > > - Automatic exposure and white balance
> > > > > > > - Manual exposure and analog gain
> > > > > > > - Horizontal and vertical flip
> > > > > > > 
> > > > > > > Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
> > > > > > > ---
> > > > > > >  MAINTAINERS                 |    1 +
> > > > > > >  drivers/media/i2c/Kconfig   |   10 +
> > > > > > >  drivers/media/i2c/Makefile  |    1 +
> > > > > > >  drivers/media/i2c/mt9m114.c | 2161 +++++++++++++++++++++++++++++++++++
> > > > > > >  4 files changed, 2173 insertions(+)
> > > > > > >  create mode 100644 drivers/media/i2c/mt9m114.c
> > > > > 
> > > > > [snip]
> > > > > 
> > > > > > > diff --git a/drivers/media/i2c/mt9m114.c b/drivers/media/i2c/mt9m114.c
> > > > > > > new file mode 100644
> > > > > > > index 000000000000..7f70ae2865b8
> > > > > > > --- /dev/null
> > > > > > > +++ b/drivers/media/i2c/mt9m114.c
> > > > > 
> > > > > [snip]
> > > > > 
> > > > > > > +static int mt9m114_pa_s_ctrl(struct v4l2_ctrl *ctrl)
> > > > > > > +{
> > > > > > > +	struct mt9m114 *sensor = pa_ctrl_to_mt9m114(ctrl);
> > > > > > > +	int ret = 0;
> > > > > > > +
> > > > > > > +	switch (ctrl->id) {
> > > > > > > +	case V4L2_CID_EXPOSURE_ABSOLUTE:
> > > > > > > +		mt9m114_write(sensor,
> > > > > > > +			      MT9M114_CAM_SENSOR_CONTROL_COARSE_INTEGRATION_TIME,
> > > > > > > +			      ctrl->val / MT9M114_LINE_LENGTH, &ret);
> > > > > > 
> > > > > > Hmm. I'd introduce a separate control for the fine exposure time. We'll
> > > > > > need it in any case, and this would also allow setting just the coarse
> > > > > > exposure time.
> > > > > > 
> > > > > > What do you think?
> > > > > 
> > > > > I've re-read the documentation of the V4L2_CID_EXPOSURE_ABSOLUTE
> > > > > control, and it's documented as being expressed in multiples of 100µs.
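
To make the unit problem concrete: converting a V4L2_CID_EXPOSURE_ABSOLUTE
value (expressed in 100 µs units) into sensor lines needs the pixel rate and
the line length, neither of which the control carries. A quick sketch with
illustrative numbers (not taken from the MT9M114 datasheet):

```c
#include <stdint.h>

/* Illustrative values only, not the MT9M114's actual timing. */
#define PIXEL_RATE  48000000u  /* pixel clock, in pixels per second */
#define LINE_LENGTH 1600u      /* line length, in pixels, blanking included */

/*
 * Convert an exposure time expressed in 100 µs units (the documented
 * V4L2_CID_EXPOSURE_ABSOLUTE unit) to whole sensor lines.
 */
static uint32_t exposure_100us_to_lines(uint32_t val)
{
	/* val / 10000 is the time in seconds; multiply first to keep precision. */
	return (uint32_t)((uint64_t)val * PIXEL_RATE / (10000u * LINE_LENGTH));
}
```

With these numbers one 100 µs step spans three lines, so a line-accurate
exposure setting cannot round-trip through this control without loss.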
> > > > 
> > > > It says "should". Indeed this is not the case for raw sensors. We could
> > > > update the documentation, I think.
> > > > 
> > > > > Clearly not a good fit here :-S The ov9650 driver suffers from the same
> > > > > problem. All the other sensor drivers use V4L2_CID_EXPOSURE, whose unit
> > > > > is not defined. Do we need to introduce V4L2_CID_EXPOSURE_COARSE and
> > > > > V4L2_CID_EXPOSURE_FINE ? It would get messy with so many ways to control
> > > > 
> > > > I'm not opposed to that in principle. But what do we do with all the
> > > > current drivers in that case? They use V4L2_CID_EXPOSURE_ABSOLUTE.
> > > 
> > > No, all sensor drivers except one use V4L2_CID_EXPOSURE, not
> > > V4L2_CID_EXPOSURE_ABSOLUTE. It's thus V4L2_CID_EXPOSURE that we would
> > > need to document as being expressed in lines for sensors (but in that
> > > case we'll likely have a large number of drivers that misuse it).
> > 
> > Right, we had two controls for exposure indeed. I think there could be
> > drivers that use V4L2_CID_EXPOSURE with another unit. Worth checking
> > perhaps. Either way, we could add that unit to the documentation of the
> > V4L2_CID_EXPOSURE as a recommendation or a requirement.
> 
> A quick check showed the following units:
> 
> - mt9m032, mt9p031, mt9t001, mt9v011, mt9v032, smiapp: lines
> - ov7670: not documented, seems to be lines
> - m5mols: not documented
> - s5k6aa, s5k5baf: µs
> - sr030pc30: ms
> 
> In the soc-camera drivers, we have
> 
> - mt9m001, mt9t031, mt9v022: 1/256 * total number of lines
> - ov6650: not documented
> 
> I wouldn't care too much about those though.
> 
> The control is also used in the pwc and gspca drivers, with unspecified
> units.
> 
> I think we could thus document the recommended units as a number of
> lines if desired.
> 
> > > > > the exposure time :-S Or should we document V4L2_CID_EXPOSURE as being
> > > > > expressed in lines for new drivers, and add V4L2_CID_EXPOSURE_FINE to be
> > > > > expressed in pixels ? What would two separate controls for coarse and
> > > > > fine exposures bring us, compared to expressing the exposure time in
> > > > > pixels ?
> > > > 
> > > > It takes time to do the writes over the I²C bus. At higher frame rates it
> > > > becomes increasingly risky, and the least risk is indeed often preferred.
> 
> By the way, the driver could still handle a single V4L2_CID_EXPOSURE
> control expressed in pixels efficiently, by only writing the registers
> that change.
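
For reference, the write-only-what-changed idea can be illustrated in plain C
(a sketch with a mocked register write and a hypothetical line length; the
driver would go through mt9m114_write() instead):

```c
#include <stdint.h>

#define LINE_LENGTH 1600u /* hypothetical line length, in pixels */

static unsigned int write_count; /* counts mocked register writes */

struct exposure_cache {
	uint16_t coarse; /* last coarse integration time written, in lines */
	uint16_t fine;   /* last fine integration time written, in pixels */
};

/* Stand-in for an I2C register write. */
static void mock_write(uint16_t val)
{
	(void)val;
	write_count++;
}

/* Split a pixel-unit exposure and write only the registers that changed. */
static void set_exposure(struct exposure_cache *c, uint32_t exposure_pixels)
{
	uint16_t coarse = exposure_pixels / LINE_LENGTH;
	uint16_t fine = exposure_pixels % LINE_LENGTH;

	if (coarse != c->coarse) {
		mock_write(coarse);
		c->coarse = coarse;
	}
	if (fine != c->fine) {
		mock_write(fine);
		c->fine = fine;
	}
}
```

Changing the exposure by less than a line then costs a single register write
instead of two.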
> 
> > > What do you propose then ? Adding V4L2_CID_EXPOSURE_FINE ? In addition
> > > to V4L2_CID_EXPOSURE ? How should V4L2_CID_EXPOSURE be documented ? And
> > > how should V4L2_CID_EXPOSURE_FINE be documented for that matter ?
> > 
> > The unit of the new control would need to be pixels.
> 
> Do you thus recommend updating the documentation of V4L2_CID_EXPOSURE to
> state the unit is lines, and creating a new V4L2_CID_EXPOSURE_FINE
> expressed in pixels ?

This is one of the open questions. In v2 I'll switch to the
V4L2_CID_EXPOSURE control and map it to the coarse integration time.

> > > > > > > +		mt9m114_write(sensor,
> > > > > > > +			      MT9M114_CAM_SENSOR_CONTROL_FINE_INTEGRATION_TIME,
> > > > > > > +			      ctrl->val % MT9M114_LINE_LENGTH, &ret);
> > > > > > > +		break;
> > > > > > > +
> > > > > > > +	case V4L2_CID_ANALOGUE_GAIN:
> > > > > > > +		/*
> > > > > > > +		 * The CAM_SENSOR_CONTROL_ANALOG_GAIN contains linear analog
> > > > > > > +		 * gain values that are mapped to the GLOBAL_GAIN register
> > > > > > > +		 * values by the sensor firmware.
> > > > > > > +		 */
> > > > > > > +		mt9m114_write(sensor, MT9M114_CAM_SENSOR_CONTROL_ANALOG_GAIN,
> > > > > > > +			      ctrl->val, &ret);
> > > > > > > +		break;
> > > > > > > +
> > > > > > > +	default:
> > > > > > > +		return -EINVAL;
> > > > > > > +	}
> > > > > > > +
> > > > > > > +	return ret;
> > > > > > > +}
> > > > > 
> > > > > [snip]
> > > > > 
> > > > > > > +static int mt9m114_pa_get_fmt(struct v4l2_subdev *sd,
> > > > > > > +			      struct v4l2_subdev_pad_config *cfg,
> > > > > > > +			      struct v4l2_subdev_format *fmt)
> > > > > > > +{
> > > > > > > +	struct mt9m114 *sensor = pa_to_mt9m114(sd);
> > > > > > > +
> > > > > > > +	fmt->format = *__mt9m114_pa_get_pad_format(sensor, cfg, fmt->pad,
> > > > > > > +						   fmt->which);
> > > > > > 
> > > > > > I believe these need to be serialised with e.g. a mutex. Same for set
> > > > > > below.
> > > > > 
> > > > > You're right, I'll fix that. All this is a bit inefficient though, as
> > > > the ioctls are already serialized in subdev_do_ioctl_lock(), so there
> > > > 
> > > > They are not, as lock is always NULL. Drivers are still responsible for
> > > > doing this. This would seem to need some kind of a rework; acquiring the
> > > > mutex should happen around the calls independently of whether they are made
> > > > through IOCTLs or from other drivers.
> > > 
> > > My point is that there are two layers of locking, which isn't a nice
> > > implementation. Locking in subdev drivers is too complicated in general.
> > > Out of scope for this patch series of course.
> > 
> > Agreed. The locking could be moved from drivers to the framework, but it
> > wouldn't be a trivial task. As of now, the framework does not provide
> > locking for sub-device nodes.
> 
> Clearly not a task I want to tackle now :-)
> 
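
A minimal sketch of the serialization Sakari asks for, assuming a `lock`
mutex is added to struct mt9m114 (kernel API, not runnable standalone; set_fmt
would take the same lock around its read-modify-write of the pad state):

```c
static int mt9m114_pa_get_fmt(struct v4l2_subdev *sd,
			      struct v4l2_subdev_pad_config *cfg,
			      struct v4l2_subdev_format *fmt)
{
	struct mt9m114 *sensor = pa_to_mt9m114(sd);

	mutex_lock(&sensor->lock);
	fmt->format = *__mt9m114_pa_get_pad_format(sensor, cfg, fmt->pad,
						   fmt->which);
	mutex_unlock(&sensor->lock);

	return 0;
}
```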
> > > > > would be double-locking, but that's required when the subdev operations
> > > > > are called within the kernel. Oh well... :-(
> > > > > 
> > > > > > > +	return 0;
> > > > > > > +}
> > > > > > > +
> > > > > > > +static int mt9m114_pa_set_fmt(struct v4l2_subdev *sd,
> > > > > > > +			      struct v4l2_subdev_pad_config *cfg,
> > > > > > > +			      struct v4l2_subdev_format *fmt)
> > > > > > > +{
> > > > > > > +	struct mt9m114 *sensor = pa_to_mt9m114(sd);
> > > > > > > +	struct v4l2_mbus_framefmt *format;
> > > > > > > +	struct v4l2_rect *crop;
> > > > > > > +	unsigned int hscale;
> > > > > > > +	unsigned int vscale;
> > > > > > > +
> > > > > > > +	crop = __mt9m114_pa_get_pad_crop(sensor, cfg, fmt->pad, fmt->which);
> > > > > > > +	format = __mt9m114_pa_get_pad_format(sensor, cfg, fmt->pad, fmt->which);
> > > > > > > +
> > > > > > > +	/* The sensor can bin horizontally and vertically. */
> > > > > > > +	hscale = DIV_ROUND_CLOSEST(crop->width, fmt->format.width ? : 1);
> > > > > > > +	vscale = DIV_ROUND_CLOSEST(crop->height, fmt->format.height ? : 1);
> > > > > > > +	format->width = crop->width / clamp(hscale, 1U, 2U);
> > > > > > > +	format->height = crop->height / clamp(vscale, 1U, 2U);
> > > > > > > +
> > > > > > > +	fmt->format = *format;
> > > > > > > +
> > > > > > > +	return 0;
> > > > > > > +}
> > > > > > > +
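
As a side note, the binning arithmetic in mt9m114_pa_set_fmt() can be
exercised in isolation. A userspace sketch (DIV_ROUND_CLOSEST and clamp are
reimplemented here for unsigned values only):

```c
/* Round-to-nearest integer division, like the kernel's DIV_ROUND_CLOSEST(). */
#define DIV_ROUND_CLOSEST(x, d) (((x) + (d) / 2) / (d))

#define CLAMP(v, lo, hi) ((v) < (lo) ? (lo) : (v) > (hi) ? (hi) : (v))

/*
 * Pick the output size for a requested size, given the crop rectangle.
 * The sensor can only bin by 1 or 2 in each direction, so the requested
 * size is rounded to the closest achievable binning factor.
 */
static void pa_scale(unsigned int crop_w, unsigned int crop_h,
		     unsigned int req_w, unsigned int req_h,
		     unsigned int *out_w, unsigned int *out_h)
{
	unsigned int hscale = DIV_ROUND_CLOSEST(crop_w, req_w ? req_w : 1);
	unsigned int vscale = DIV_ROUND_CLOSEST(crop_h, req_h ? req_h : 1);

	*out_w = crop_w / CLAMP(hscale, 1u, 2u);
	*out_h = crop_h / CLAMP(vscale, 1u, 2u);
}
```

A request for a quarter-size output thus falls back to 2x2 binning, the
closest the pixel array can do.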
> > > > > > > +static int mt9m114_pa_get_selection(struct v4l2_subdev *sd,
> > > > > > > +				    struct v4l2_subdev_pad_config *cfg,
> > > > > > > +				    struct v4l2_subdev_selection *sel)
> > > > > > > +{
> > > > > > > +	struct mt9m114 *sensor = pa_to_mt9m114(sd);
> > > > > > > +
> > > > > > > +	switch (sel->target) {
> > > > > > > +	case V4L2_SEL_TGT_CROP:
> > > > > > > +		sel->r = *__mt9m114_pa_get_pad_crop(sensor, cfg, sel->pad,
> > > > > > > +						    sel->which);
> > > > > > > +		return 0;
> > > > > > > +
> > > > > > > +	case V4L2_SEL_TGT_CROP_DEFAULT:
> > > > > > > +	case V4L2_SEL_TGT_CROP_BOUNDS:
> > > > > > > +		sel->r.left = 0;
> > > > > > > +		sel->r.top = 0;
> > > > > > > +		sel->r.width = MT9M114_PIXEL_ARRAY_WIDTH;
> > > > > > > +		sel->r.height = MT9M114_PIXEL_ARRAY_HEIGHT;
> > > > > > > +		return 0;
> > > > > > > +
> > > > > > 
> > > > > > Could you add NATIVE_SIZE target?
> > > > > 
> > > > > Sure. The sensor has optical dark pixels, but I don't see a way in the
> > > > > datasheet to read the dark lines out (they're used by the internal ISP).
> > > > > I will thus set the native size as equal to the crop bounds.
> > > > > 
> > > > > The V4L2 documentation could really benefit from clarifying the native
> > > > > size and crop bounds targets by the way, it's awfully underspecified
> > > > > (and as a result I would guess that 99% of the drivers get it wrong).
> > > > 
> > > > The crop bounds are effectively the same in this case. But the crop bounds
> > > > (and crop) targets are used in a lot of different cases, too.
> > > > 
> > > > This is indeed a little grey area as sensor implementations differ. Those
> > > > black pixels may still affect timing, but some devices probably don't even
> > > > document them.
> > > 
> > > That doesn't tell me how you think the crop bounds and native size
> > > should be defined when there are dark pixels :-) We can discuss it
> > > separately, but I think it requires a discussion.
> > 
> > Right now, I guess I'd just use what the sensor transmits on the bus, and
> > ignore what's inside. We don't have a way to report the rest to the user
> > space anyway, and that generally would only have a very slight impact on
> > timing.
> 
> It still leaves open questions though. It's quite common to have dummy
> lines between the optical dark region and the active region, as the
> transition between optical dark and active may span a few lines. Some
> sensors will read out and transmit those lines, some won't. If we
> include the optical dark lines in the native size, we could have a weird
> rectangle that contains lines that can't be transmitted. If we don't
> include the optical dark lines, we'll need a new API to describe them.
> 
> I think we need to revisit these crop targets, taking into account
> optical dark lines as well as other line types, and clearly specify what
> sensor drivers shall implement. It doesn't have to be done now.

This is also an open question.

> > > > ...
> > > > 
> > > > > > > +static int mt9m114_parse_dt(struct mt9m114 *sensor)
> > > > > > > +{
> > > > > > > +	struct fwnode_handle *fwnode = dev_fwnode(&sensor->client->dev);
> > > > > > > +	struct fwnode_handle *ep;
> > > > > > > +	int ret;
> > > > > > > +
> > > > > > > +	if (!fwnode)
> > > > > > > +		return -ENXIO;
> > > > > > > +
> > > > > > > +	ep = fwnode_graph_get_next_endpoint(fwnode, NULL);
> > > > > > > +	if (!ep) {
> > > > > > > +		dev_err(&sensor->client->dev, "No endpoint found\n");
> > > > > > > +		return -EINVAL;
> > > > > > > +	}
> > > > > > > +
> > > > > > > +	ret = v4l2_fwnode_endpoint_alloc_parse(ep, &sensor->bus_cfg);
> > > > > > 
> > > > > > Please initialise the bus type.
> > > > > 
> > > > > Are you fine initializing it to V4L2_MBUS_UNKNOWN ? I don't want to loop
> > > > > over v4l2_fwnode_endpoint_alloc_parse() for all supported bus types.
> > > > 
> > > > Feel free to propose alternatives. Either way, that is probably the most
> > > > simple thing you can do in a driver.
> > > > 
> > > > We don't want to add DT properties just to cover deficiencies in driver
> > > > implementation.
> > > 
> > > The implementation here doesn't depend on additional DT properties as
> > > far as I can tell. I thus don't see the problem with the current code.
> > 
> > Feel free to keep it as-is if it works with the redundant DT properties
> > removed.
> 
> I've tested it again without the DT properties, and everything works fine.
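
For the record, initializing the bus type before parsing amounts to a few
lines (a kernel-API sketch, not runnable standalone; the error path labels
are hypothetical):

```c
	/* Let the parser deduce the bus type from the endpoint properties. */
	sensor->bus_cfg.bus_type = V4L2_MBUS_UNKNOWN;

	ret = v4l2_fwnode_endpoint_alloc_parse(ep, &sensor->bus_cfg);
	if (ret < 0) {
		dev_err(&sensor->client->dev, "Failed to parse endpoint\n");
		goto done;
	}

	switch (sensor->bus_cfg.bus_type) {
	case V4L2_MBUS_CSI2_DPHY:
	case V4L2_MBUS_PARALLEL:
	case V4L2_MBUS_BT656:
		break;
	default:
		dev_err(&sensor->client->dev, "Unsupported bus type\n");
		ret = -EINVAL;
		goto done;
	}
```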
> 
> > ...
> > 
> > > > > > > +static struct i2c_driver mt9m114_driver = {
> > > > > > > +	.driver = {
> > > > > > > +		.owner	= THIS_MODULE,
> > > > > > > +		.name	= "mt9m114",
> > > > > > > +	},
> > > > > > > +	.probe		= mt9m114_probe,
> > > > > > > +	.remove		= mt9m114_remove,
> > > > > > > +	.id_table	= mt9m114_id,
> > > > > > 
> > > > > > No OF or ACPI ID table? Really?
> > > > > 
> > > > > I have no idea what ACPI IDs this device would have, and OF isn't
> > > > > required, the I2C subsystem strips the vendor prefix from the compatible
> > > > > string and matches against i2c_driver.id_table.
> > > > 
> > > > If this driver is intended to work on a DT based system, it needs to have a
> > > > compatible string associated with it. The I²C ID table is meant to be used
> > > > with e.g. platform data.
> > > 
> > > The driver works as-is on DT-based systems, that's what I have tested it
> > > with :-)
> > 
> > Please switch to use of_device_id table and the compatible string.
> 
> I'll add an OF device table, but the I2C id_table has to be kept in
> i2c_driver (there's no OF device table there). Is there any other
> benefit than module auto-loading ?

I'll drop the I2C device ID table and use the OF device ID table in v2.
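
For reference, the v2 change amounts to something like this (a sketch; the
final patch may differ):

```c
static const struct of_device_id mt9m114_of_ids[] = {
	{ .compatible = "onnn,mt9m114" },
	{ /* sentinel */ }
};
MODULE_DEVICE_TABLE(of, mt9m114_of_ids);

static struct i2c_driver mt9m114_driver = {
	.driver = {
		.name = "mt9m114",
		.of_match_table = mt9m114_of_ids,
	},
	.probe		= mt9m114_probe,
	.remove		= mt9m114_remove,
};
```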

Patch

diff --git a/Documentation/devicetree/bindings/media/i2c/onnn,mt9m114.yaml b/Documentation/devicetree/bindings/media/i2c/onnn,mt9m114.yaml
new file mode 100644
index 000000000000..2c3c691aacfd
--- /dev/null
+++ b/Documentation/devicetree/bindings/media/i2c/onnn,mt9m114.yaml
@@ -0,0 +1,188 @@ 
+# SPDX-License-Identifier: (GPL-2.0-only OR BSD-2-Clause)
+%YAML 1.2
+---
+$id: http://devicetree.org/schemas/media/i2c/onnn,mt9m114.yaml#
+$schema: http://devicetree.org/meta-schemas/core.yaml#
+
+title: ON Semiconductor 1/6-inch 720p CMOS Digital Image Sensor
+
+maintainers:
+  - Laurent Pinchart <laurent.pinchart@ideasonboard.com>
+
+description: |-
+  The ON Semiconductor MT9M114 is a 1/6-inch 720p (1.26 Mp) CMOS digital image
+  sensor with an active pixel-array size of 1296H x 976V. It is programmable
+  through an I2C interface and outputs image data over an 8-bit parallel or
+  1-lane MIPI CSI-2 connection.
+
+properties:
+  compatible:
+    const: onnn,mt9m114
+
+  reg:
+    description: I2C device address
+    enum:
+      - 0x48
+      - 0x5d
+
+  clocks:
+    description: EXTCLK clock signal
+    maxItems: 1
+
+  vdd-supply:
+    description:
+      Core digital voltage supply, 1.8V
+
+  vddio-supply:
+    description:
+      I/O digital voltage supply, 1.8V or 2.8V
+
+  vaa-supply:
+    description:
+      Analog voltage supply, 2.8V
+
+  reset-gpios:
+    description: |-
+      Reference to the GPIO connected to the RESET_BAR pin, if any (active
+      low).
+
+  # See ../video-interfaces.txt for more details
+  port:
+    type: object
+    properties:
+      endpoint:
+        type: object
+        properties:
+          bus-type:
+            enum: [4, 5, 6]
+
+          clock-lanes:
+            items:
+              - const: 0
+
+          data-lanes:
+            items:
+              - const: 1
+
+          bus-width:
+            items:
+              - const: 8
+
+          hsync-active:
+            items:
+              - const: 1
+
+          vsync-active:
+            items:
+              - const: 1
+
+        required:
+          - bus-type
+
+        allOf:
+          - if:
+              properties:
+                bus-type:
+                  const: 4
+            then:
+              properties:
+                bus-width: false
+                hsync-active: false
+                vsync-active: false
+
+          - if:
+              properties:
+                bus-type:
+                  const: 5
+            then:
+              properties:
+                clock-lanes: false
+                data-lanes: false
+
+          - if:
+              properties:
+                bus-type:
+                  const: 6
+            then:
+              properties:
+                clock-lanes: false
+                data-lanes: false
+                hsync-active: false
+                vsync-active: false
+
+        unevaluatedProperties: false
+
+required:
+  - compatible
+  - reg
+  - clocks
+  - vdd-supply
+  - vddio-supply
+  - vaa-supply
+  - port
+
+additionalProperties: false
+
+examples:
+  - |
+    #include <dt-bindings/gpio/gpio.h>
+
+    i2c0 {
+        #address-cells = <1>;
+        #size-cells = <0>;
+
+        sensor@48 {
+            compatible = "onnn,mt9m114";
+            reg = <0x48>;
+
+            clocks = <&clk24m 0>;
+
+            reset-gpios = <&gpio5 21 GPIO_ACTIVE_LOW>;
+
+            vddio-supply = <&reg_cam_1v8>;
+            vdd-supply = <&reg_cam_1v8>;
+            vaa-supply = <&reg_2p8v>;
+
+            port {
+                endpoint {
+                    bus-type = <4>;
+                    clock-lanes = <0>;
+                    data-lanes = <1>;
+                    remote-endpoint = <&mipi_csi_in>;
+                };
+            };
+        };
+    };
+
+  - |
+    #include <dt-bindings/gpio/gpio.h>
+
+    i2c0 {
+        #address-cells = <1>;
+        #size-cells = <0>;
+
+        sensor@5d {
+            compatible = "onnn,mt9m114";
+            reg = <0x5d>;
+
+            clocks = <&clk24m 0>;
+
+            reset-gpios = <&gpio5 21 GPIO_ACTIVE_LOW>;
+
+            vddio-supply = <&reg_cam_1v8>;
+            vdd-supply = <&reg_cam_1v8>;
+            vaa-supply = <&reg_2p8v>;
+
+            port {
+                endpoint {
+                    bus-type = <5>;
+                    bus-width = <8>;
+                    hsync-active = <1>;
+                    vsync-active = <1>;
+                    remote-endpoint = <&parallel_in>;
+                };
+            };
+        };
+    };
+
+...
diff --git a/MAINTAINERS b/MAINTAINERS
index 091ec22c1a23..61d2fb6d049e 100644
--- a/MAINTAINERS
+++ b/MAINTAINERS
@@ -11457,6 +11457,13 @@  T:	git git://linuxtv.org/media_tree.git
 F:	drivers/media/i2c/mt9m032.c
 F:	include/media/i2c/mt9m032.h
 
+MT9M114 ON SEMICONDUCTOR SENSOR DRIVER
+M:	Laurent Pinchart <laurent.pinchart@ideasonboard.com>
+L:	linux-media@vger.kernel.org
+S:	Maintained
+T:	git git://linuxtv.org/media_tree.git
+F:	Documentation/devicetree/bindings/media/i2c/onnn,mt9m114.yaml
+
 MT9P031 APTINA CAMERA SENSOR
 M:	Laurent Pinchart <laurent.pinchart@ideasonboard.com>
 L:	linux-media@vger.kernel.org