# Leopard Imaging LI-M021C-MIPI Stereo-Optic Cameras
## Camera Setup
The camera sensors should be connected to a ConnectTech Elroy carrier board.
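Once the board boots with the driver installed, each sensor should enumerate as a /dev/video node. This can be verified with standard tooling (assuming the v4l-utils package is installed):
```
ls /dev/video*
v4l2-ctl --list-devices
```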
## Features
* V4L2 Kernel Driver Version 2.0, supported on L4T 32.4.4
* V4L2 controls:
  * test pattern
  * individual gains
  * vertical/horizontal flip
  * flash control
* LibArgus and nvarguscamerasrc
* Resolutions supported:
  * 1280x720 @ 60fps
  * 1280x960 @ 45fps
* Gain, exposure, and framerate controls
* Camera synchronization
## Capture Tests
### Frame-rate Tests
* Set the framerate to 60 fps; the driver will configure the sensor accordingly:
```
gst-launch-1.0 nvarguscamerasrc sensor-id=0 aelock=true awblock=true ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)NV12,framerate=(fraction)60/1' ! fakesink
```
* Set the framerate to 45 fps; the driver will configure the sensor accordingly:
```
gst-launch-1.0 nvarguscamerasrc sensor-id=0 aelock=true awblock=true ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)NV12,framerate=(fraction)45/1' ! fakesink
```
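To check the framerate actually delivered, the same pipelines can be terminated in GStreamer's standard `fpsdisplaysink` instead of `fakesink`; with `-v` the measured fps is printed on the console (a generic GStreamer check, not specific to this driver):
```
gst-launch-1.0 -v nvarguscamerasrc sensor-id=0 aelock=true awblock=true ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)NV12,framerate=(fraction)60/1' ! fpsdisplaysink text-overlay=false video-sink=fakesink
```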
### UDP Streaming Test
#### Sender Endpoint
```
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)NV12,framerate=(fraction)60/1' ! omxh264enc control-rate=2 bitrate=8000000 ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! rtph264pay mtu=1400 ! udpsink host=$HOST_IP port=5000 sync=false async=false
```
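The sender pipeline references `$HOST_IP`; export it with the receiver's address before launching (the address below is only an example):
```
export HOST_IP=192.168.1.100
```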
#### Receiver Endpoint
```
gst-launch-1.0 udpsrc port=5000 ! "application/x-rtp,media=(string)video,payload=(int)96,clock-rate=(int)90000,encoding-name=(string)H264" ! rtph264depay ! queue ! avdec_h264 ! xvimagesink sync=true async=false
```
### Set Controls Test
Run a pipeline, then set gain and exposure controls using v4l2-ctl:
```
v4l2-ctl -d /dev/video1 -c exposure=14000
v4l2-ctl -d /dev/video1 -c gain=100
```
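To confirm which controls the driver exposes and their valid ranges before setting them, list them with v4l2-ctl (standard v4l2-ctl usage):
```
v4l2-ctl -d /dev/video1 --list-ctrls
```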
### Dual Synchronized Capture Test
First run the master pipeline and then the slave pipeline:
#### Master Pipeline
```
gst-launch-1.0 nvarguscamerasrc sensor-id=0 aelock=true awblock=true ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)NV12,framerate=(fraction)60/1' ! fakesink
```
#### Slave Pipeline
```
gst-launch-1.0 nvarguscamerasrc sensor-id=1 aelock=true awblock=true ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)NV12,framerate=(fraction)60/1' ! fakesink
```
It is not recommended to start both streams at the same time, because nvarguscamerasrc will fail if no buffers arrive within a defined timeout.
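For reference, the two pipelines can be sequenced from a single shell by backgrounding the master and waiting briefly before starting the slave (the 2-second delay is an arbitrary choice):
```
gst-launch-1.0 nvarguscamerasrc sensor-id=0 aelock=true awblock=true ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)NV12,framerate=(fraction)60/1' ! fakesink &
sleep 2
gst-launch-1.0 nvarguscamerasrc sensor-id=1 aelock=true awblock=true ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)NV12,framerate=(fraction)60/1' ! fakesink
```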
### V4L2 Capture Test
#### Master
```
v4l2-ctl -d /dev/video0 --set-fmt-video=width=1280,height=720,pixelformat=RG12 --set-ctrl bypass_mode=0 --stream-mmap
```
#### Slave
```
v4l2-ctl -d /dev/video1 --set-fmt-video=width=1280,height=720,pixelformat=RG12 --set-ctrl bypass_mode=0 --stream-mmap
```
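The same invocation can also dump a fixed number of raw frames to disk for offline inspection; `--stream-count` and `--stream-to` are standard v4l2-ctl options (the count and file name below are arbitrary):
```
v4l2-ctl -d /dev/video0 --set-fmt-video=width=1280,height=720,pixelformat=RG12 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=100 --stream-to=master_frames.raw
```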
### AE Synchronized
First run the master pipeline (auto exposure running freely) and then the slave pipeline (AE/AWB locked):
#### Master Pipeline
```
gst-launch-1.0 nvarguscamerasrc sensor-id=1 ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)NV12,framerate=(fraction)60/1' ! fakesink
```
#### Slave Pipeline
```
gst-launch-1.0 nvarguscamerasrc sensor-id=0 aelock=true awblock=true ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)NV12,framerate=(fraction)60/1' ! fakesink
```
### Run Script
Run the gain and exposure control script in the background:
```
./script_to_control_gain_exposure.sh &
```
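The script's contents are not reproduced here; the following is only a hypothetical sketch of what such a script could do, sweeping exposure and gain via v4l2-ctl while a capture pipeline runs (values and device node are illustrative):
```
#!/bin/bash
# Hypothetical sketch only; the actual script_to_control_gain_exposure.sh may differ.
while true; do
    for exp in 5000 10000 14000; do
        v4l2-ctl -d /dev/video1 -c exposure=$exp   # exposure value is illustrative
        v4l2-ctl -d /dev/video1 -c gain=100        # gain value is illustrative
        sleep 1
    done
done
```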
## Appendix
#### Auto Exposure
The AE control relies on the feedback that the camera sensor driver provides to the nvarguscamerasrc libraries, and on the custom DTSIs with the sensor parameter definitions.
The AE is interdependent with the digital gain; this gain is an operation in the driver and follows this setup:
```bash
/*
* Digital gain equation:
*
* RANGE: 1x, 7.97x
* STEPS: 1/32
*
* SCALE FACTOR = 3
*
* min_gain_val = 102
* max_gain_val = 160
* gain_factor = 3
*
* gain accepts mapping to range 32 - 53
*/
```
MT9M021 sensor datasheet:
* <a href="https://files.niemo.de/aptina_pdfs/MT9M021-M031_Developer_Guide.pdf">MT9M021 Developer Guide</a>

Here, `min_gain_val`, `max_gain_val`, and `step_gain_val` are fixed parameters in the cameras' DTSI (tegra186-tx2-spiri-camera.dtsi), set as per the datasheet. The `gain` control consists of the steps (1x + n * 1/32 for the register) scaled to an integer value for the gain register.
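As a rough illustration of that mapping (DTSI values from the block above; the driver's exact rounding may differ), dividing the control value by `gain_factor` gives the register step, where each step is 1/32 of unity gain:
```bash
# Illustration only: map the DTSI min/max control values to register steps.
gain_factor=3
for gain in 102 160; do
    step=$((gain / gain_factor))
    echo "gain=$gain -> register step $step (= $step/32 x)"
done
```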
#### Kernel Changes
The driver for the MT9M021 cameras consists of the following structure, which adds the DTB and kernel sources, along with the Makefiles that make them portable to a kernel source tree.
```bash
.
├── hardware
│   └── nvidia-spiri
│       └── platform
│           └── t18x
│               ├── common
│               │   └── kernel-dts
│               │       └── t18x-common-platforms
│               │           ├── tegra186-tx2-spiri-camera-base.dtsi
│               │           └── tegra186-tx2-spiri-camera.dtsi
│               └── quill
│                   └── kernel-dts
│                       ├── Makefile
│                       ├── tegra186-tx2-spiri-base.dts
│                       ├── tegra186-tx2-spiri-mPCIe.dts
│                       ├── tegra186-tx2-spiri-revF+.dts
│                       └── tegra186-tx2-spiri-USB3.dts
├── kernel
│   ├── kernel-4.9
│   │   └── arch
│   │       └── arm64
│   │           └── configs
│   │               └── tegra_defconfig
│   └── nvidia-spiri
│       ├── drivers
│       │   └── media
│       │       └── i2c
│       │           ├── Kconfig
│       │           ├── Makefile
│       │           ├── mt9m021.c
│       │           └── mt9m021_mode_tbls.h
│       └── include
│           └── media
│               └── mt9m021.h
└── README.md
```
To add the driver to the kernel, the following reference kernel files are patched to add the custom controls that the camera implements:
* kernel/nvidia/drivers/media/platform/tegra/camera/camera_common.c
* kernel/nvidia/drivers/media/platform/tegra/camera/tegracam_ctrls.c
* kernel/nvidia/include/media/camera_common.h
* kernel/nvidia/include/media/tegra-v4l2-camera.h
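After applying these patches and adding the new sources, the kernel and DTBs are rebuilt with the usual L4T 32.x cross-compilation steps (generic NVIDIA build flow, not specific to this repository; the output path and toolchain prefix are assumptions):
```bash
# Standard out-of-tree L4T kernel build; adjust paths and toolchain to your setup.
cd kernel/kernel-4.9
make ARCH=arm64 CROSS_COMPILE=aarch64-linux-gnu- O=$PWD/../kernel_out tegra_defconfig
make ARCH=arm64 CROSS_COMPILE=aarch64-linux-gnu- O=$PWD/../kernel_out -j$(nproc) Image dtbs
```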
#### Documentation
* <a href="https://nextcloud.spirirobotics.com/f/3369">CSI2 adapter board guide</a>
* <a href="https://nextcloud.spirirobotics.com/f/3382">Camera module data sheet</a>
* <a href="https://nextcloud.spirirobotics.com/f/3392">Camera sensor data sheet</a>
* <a href="https://nextcloud.spirirobotics.com/f/3396">Camera sensor development guide</a>
* <a href="https://nextcloud.spirirobotics.com/f/3386">MIPI bridge</a>