.. SPDX-License-Identifier: GPL-2.0

.. include:: <isonum.txt>

Qualcomm Camera Subsystem driver
================================

Introduction
------------

This file documents the Qualcomm Camera Subsystem driver located under
drivers/media/platform/qcom/camss.

The current version of the driver supports the Camera Subsystem found on
Qualcomm MSM8916/APQ8016 and MSM8996/APQ8096 processors.

The driver implements the V4L2, Media controller and V4L2 subdev interfaces.
Camera sensors using the V4L2 subdev interface in the kernel are supported.

The driver was implemented using the Qualcomm Camera Subsystem driver for
Android, as found in Code Aurora [#f1]_ [#f2]_, as a reference.


Qualcomm Camera Subsystem hardware
----------------------------------

The Camera Subsystem hardware found on 8x16 / 8x96 processors and supported by
the driver consists of:

- 2 / 3 CSIPHY modules. They handle the physical layer of the CSI2 receivers.
  A separate camera sensor can be connected to each of the CSIPHY modules;
- 2 / 4 CSID (CSI Decoder) modules. They handle the protocol and application
  layer of the CSI2 receivers. A CSID can decode the data stream from any of
  the CSIPHYs. Each CSID also contains a TG (Test Generator) block which can
  generate artificial input data for test purposes;
- ISPIF (ISP Interface) module. Handles the routing of the data streams from
  the CSIDs to the inputs of the VFE;
- 1 / 2 VFE (Video Front End) module(s). Each contains a pipeline of image
  processing hardware blocks. The VFE has different input interfaces. The PIX
  (Pixel) input interface feeds the input data to the image processing
  pipeline, which also contains a scale and crop module at the end. Three RDI
  (Raw Dump Interface) input interfaces bypass the image processing pipeline.
  The VFE also contains the AXI bus interface which writes the output data to
  memory.


Supported functionality
-----------------------

The current version of the driver supports:

- Input from camera sensor via CSIPHY;
- Generation of test input data by the TG in CSID;
- RDI interface of VFE

  - Raw dump of the input data to memory (see the format negotiation sketch
    after this list).

    Supported formats:

    - YUYV/UYVY/YVYU/VYUY (packed YUV 4:2:2 - V4L2_PIX_FMT_YUYV /
      V4L2_PIX_FMT_UYVY / V4L2_PIX_FMT_YVYU / V4L2_PIX_FMT_VYUY);
    - MIPI RAW8 (8bit Bayer RAW - V4L2_PIX_FMT_SRGGB8 /
      V4L2_PIX_FMT_SGRBG8 / V4L2_PIX_FMT_SGBRG8 / V4L2_PIX_FMT_SBGGR8);
    - MIPI RAW10 (10bit packed Bayer RAW - V4L2_PIX_FMT_SBGGR10P /
      V4L2_PIX_FMT_SGBRG10P / V4L2_PIX_FMT_SGRBG10P / V4L2_PIX_FMT_SRGGB10P /
      V4L2_PIX_FMT_Y10P);
    - MIPI RAW12 (12bit packed Bayer RAW - V4L2_PIX_FMT_SRGGB12P /
      V4L2_PIX_FMT_SGBRG12P / V4L2_PIX_FMT_SGRBG12P / V4L2_PIX_FMT_SBGGR12P);
    - (8x96 only) MIPI RAW14 (14bit packed Bayer RAW - V4L2_PIX_FMT_SRGGB14P /
      V4L2_PIX_FMT_SGBRG14P / V4L2_PIX_FMT_SGRBG14P / V4L2_PIX_FMT_SBGGR14P).

  - (8x96 only) Format conversion of the input data.

    Supported input formats:

    - MIPI RAW10 (10bit packed Bayer RAW - V4L2_PIX_FMT_SBGGR10P / V4L2_PIX_FMT_Y10P).

    Supported output formats:

    - Plain16 RAW10 (10bit unpacked Bayer RAW - V4L2_PIX_FMT_SBGGR10 / V4L2_PIX_FMT_Y10).

- PIX interface of VFE

  - Format conversion of the input data.

    Supported input formats:

    - YUYV/UYVY/YVYU/VYUY (packed YUV 4:2:2 - V4L2_PIX_FMT_YUYV /
      V4L2_PIX_FMT_UYVY / V4L2_PIX_FMT_YVYU / V4L2_PIX_FMT_VYUY).

    Supported output formats:

    - NV12/NV21 (two plane YUV 4:2:0 - V4L2_PIX_FMT_NV12 / V4L2_PIX_FMT_NV21);
    - NV16/NV61 (two plane YUV 4:2:2 - V4L2_PIX_FMT_NV16 / V4L2_PIX_FMT_NV61);
    - (8x96 only) YUYV/UYVY/YVYU/VYUY (packed YUV 4:2:2 - V4L2_PIX_FMT_YUYV /
      V4L2_PIX_FMT_UYVY / V4L2_PIX_FMT_YVYU / V4L2_PIX_FMT_VYUY).

  - Scaling support. Configuration of the VFE Encoder Scale module
    for downscaling with ratio up to 16x.

  - Cropping support. Configuration of the VFE Encoder Crop module.

- Concurrent and independent usage of two (8x96: three) data inputs -
  these can be camera sensors and/or the TG.
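
For illustration only, below is a minimal userspace sketch of negotiating one
of the formats listed above on a CAMSS capture video device node with the
VIDIOC_S_FMT ioctl. The default ``/dev/video0`` path is an assumption - the
node corresponding to the wanted VFE interface should be identified from the
media controller topology - and the buffer type is taken from VIDIOC_QUERYCAP
since it depends on whether the node exposes the single- or multi-planar
capture API.

.. code-block:: c

    /* Hypothetical example - build with: cc -o camss-setfmt camss-setfmt.c */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <unistd.h>

    #include <linux/videodev2.h>

    int main(int argc, char **argv)
    {
        /* Assumed default node; pass the actual CAMSS video node instead. */
        const char *node = argc > 1 ? argv[1] : "/dev/video0";
        struct v4l2_capability cap;
        struct v4l2_format fmt;
        int fd;

        fd = open(node, O_RDWR);
        if (fd < 0) {
            perror("open");
            return 1;
        }

        if (ioctl(fd, VIDIOC_QUERYCAP, &cap) < 0) {
            perror("VIDIOC_QUERYCAP");
            return 1;
        }

        memset(&fmt, 0, sizeof(fmt));
        if (cap.device_caps & V4L2_CAP_VIDEO_CAPTURE_MPLANE) {
            /* Multi-planar capture queue. */
            fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;
            fmt.fmt.pix_mp.width = 1920;
            fmt.fmt.pix_mp.height = 1080;
            fmt.fmt.pix_mp.pixelformat = V4L2_PIX_FMT_UYVY;
            fmt.fmt.pix_mp.field = V4L2_FIELD_NONE;
        } else {
            /* Single-planar capture queue. */
            fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            fmt.fmt.pix.width = 1920;
            fmt.fmt.pix.height = 1080;
            fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_UYVY;
            fmt.fmt.pix.field = V4L2_FIELD_NONE;
        }

        /*
         * The driver may adjust the requested format to what the active
         * media pipeline can actually produce.
         */
        if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) {
            perror("VIDIOC_S_FMT");
            return 1;
        }

        printf("negotiated %ux%u\n",
               fmt.type == V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE ?
                    fmt.fmt.pix_mp.width : fmt.fmt.pix.width,
               fmt.type == V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE ?
                    fmt.fmt.pix_mp.height : fmt.fmt.pix.height);

        close(fd);
        return 0;
    }

The hardware configuration derived from the negotiated format is applied on
STREAMON, as described in the Implementation section below.
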

Driver Architecture and Design
------------------------------

The driver implements the V4L2 subdev interface. In order to model the
hardware links between the modules and to expose a clean, logical and usable
interface, the driver is split into V4L2 sub-devices as follows (8x16 / 8x96):

- 2 / 3 CSIPHY sub-devices - each CSIPHY is represented by a single sub-device;
- 2 / 4 CSID sub-devices - each CSID is represented by a single sub-device;
- 2 / 4 ISPIF sub-devices - ISPIF is represented by a number of sub-devices
  equal to the number of CSID sub-devices;
- 4 / 8 VFE sub-devices - VFE is represented by a number of sub-devices equal
  to the number of input interfaces (3 RDI and 1 PIX for each VFE).

The reasons for splitting the driver in this particular way are as follows:

- representing the CSIPHY and CSID modules by a separate sub-device for each
  module allows the hardware links between these modules to be modelled;
- representing the VFE by a separate sub-device for each input interface allows
  the input interfaces to be used concurrently and independently, as supported
  by the hardware;
- representing the ISPIF by a number of sub-devices equal to the number of CSID
  sub-devices allows linear media controller pipelines to be created when two
  cameras are used simultaneously. This avoids branches in the pipelines which
  would otherwise require a) userspace and b) the media framework (e.g. for
  power on/off operations) to make assumptions about the data flow from a sink
  pad to a source pad on a single media entity.

Each VFE sub-device is linked to a separate video device node.

The media controller pipeline graph is as follows (with two / three OV5645
camera sensors connected):

.. _qcom_camss_graph:

.. kernel-figure:: qcom_camss_graph.dot
    :alt:   qcom_camss_graph.dot
    :align: center

    Media pipeline graph 8x16

.. kernel-figure:: qcom_camss_8x96_graph.dot
    :alt:   qcom_camss_8x96_graph.dot
    :align: center

    Media pipeline graph 8x96


Implementation
--------------

Runtime configuration of the hardware (updating settings while streaming) is
not required to implement the currently supported functionality. The complete
configuration of each hardware module is applied on the STREAMON ioctl, based
on the currently active media links and the configured formats and controls.

The output size of the scaler module in the VFE is configured with the actual
compose selection rectangle on the sink pad of the 'msm_vfe0_pix' entity.

The crop output area of the crop module in the VFE is configured with the
actual crop selection rectangle on the source pad of the 'msm_vfe0_pix' entity.
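
For illustration only, a minimal userspace sketch of setting both rectangles
with the VIDIOC_SUBDEV_S_SELECTION ioctl is given below. The subdev node path
and the pad numbering (pad 0 as the sink pad and pad 1 as the source pad of
'msm_vfe0_pix') are assumptions and should be taken from the actual media
controller topology; the pad formats are expected to have been configured
beforehand.

.. code-block:: c

    /* Hypothetical example - build with: cc -o vfe-selection vfe-selection.c */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <unistd.h>

    #include <linux/videodev2.h>
    #include <linux/v4l2-subdev.h>

    /*
     * Assumptions - verify against the actual topology (e.g. 'media-ctl -p'):
     * the msm_vfe0_pix entity is exposed through this subdev node, its sink
     * pad is pad 0 and its source pad is pad 1.
     */
    #define VFE_PIX_SUBDEV  "/dev/v4l-subdev10"
    #define VFE_PIX_SINK    0
    #define VFE_PIX_SRC     1

    static int set_selection(int fd, __u32 pad, __u32 target,
                             struct v4l2_rect r)
    {
        struct v4l2_subdev_selection sel;

        memset(&sel, 0, sizeof(sel));
        sel.which = V4L2_SUBDEV_FORMAT_ACTIVE;
        sel.pad = pad;
        sel.target = target;
        sel.r = r;

        if (ioctl(fd, VIDIOC_SUBDEV_S_SELECTION, &sel) < 0) {
            perror("VIDIOC_SUBDEV_S_SELECTION");
            return -1;
        }

        /* The driver may have adjusted the rectangle to hardware limits. */
        printf("pad %u: %ux%u @ (%d,%d)\n", pad,
               sel.r.width, sel.r.height, sel.r.left, sel.r.top);
        return 0;
    }

    int main(void)
    {
        struct v4l2_rect compose = { .left = 0, .top = 0,
                                     .width = 960, .height = 540 };
        struct v4l2_rect crop = { .left = 0, .top = 0,
                                  .width = 640, .height = 480 };
        int fd;

        fd = open(VFE_PIX_SUBDEV, O_RDWR);
        if (fd < 0) {
            perror("open");
            return 1;
        }

        /* Scaler output size: compose rectangle on the sink pad. */
        set_selection(fd, VFE_PIX_SINK, V4L2_SEL_TGT_COMPOSE, compose);

        /* Crop output area: crop rectangle on the source pad. */
        set_selection(fd, VFE_PIX_SRC, V4L2_SEL_TGT_CROP, crop);

        close(fd);
        return 0;
    }

The driver applies these rectangles to the VFE scale and crop hardware on
STREAMON, as described above.
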

Documentation
-------------

APQ8016 Specification:
https://developer.qualcomm.com/download/sd410/snapdragon-410-processor-device-specification.pdf
Referenced 2016-11-24.

APQ8096 Specification:
https://developer.qualcomm.com/download/sd820e/qualcomm-snapdragon-820e-processor-apq8096sge-device-specification.pdf
Referenced 2018-06-22.

References
----------

.. [#f1] https://source.codeaurora.org/quic/la/kernel/msm-3.10/
.. [#f2] https://source.codeaurora.org/quic/la/kernel/msm-3.18/