USB/CSI Camera for a Webcam on Stream
For Windows
You can do this by creating an RTSP server on your Windows host OS that the Stream container can then access. Just follow these steps:
- Create a folder on your host machine where the RTSP server files will reside.
- Go to the mediamtx releases repository and download the zip file.
- Once you have it in your local folder, extract it.
- Get your camera name. You can list all your audio/video devices using this command:

  ```
  ffmpeg -list_devices true -f dshow -i dummy
  ```

  From the results list, get the one you need to use, e.g.:

  ```
  ...
  [dshow @ 000001c636eadcc0] "c922 Pro Stream Webcam" (video)
  ...
  ```

  Copy the descriptive name in quotation marks (e.g., `c922 Pro Stream Webcam`) and paste it into the required field in the `mediamtx.yml` file. You can also find the camera name in the Windows Device Manager under the "Cameras" or "Imaging devices" section.
- Open the `mediamtx.yml` file in a text editor and make the following changes in the `paths` section:

  ```yaml
  paths:
    camera01:
      runOnInit: ffmpeg -f dshow -video_size 1280x720 -framerate 30 -i video="<YOUR_VIDEO_CAMERA_NAME>" -pix_fmt yuv420p -c:v libx264 -preset ultrafast -b:v 600k -f rtsp rtsp://localhost:$RTSP_PORT/$MTX_PATH
      runOnInitRestart: yes
  ```

  DirectShow (`dshow`) parameters used:

  - `-video_size`: By default `ffmpeg` receives input from the camera at a `640x480` resolution; with this parameter you can select the desired one. This example sets it to `1280x720`.
  - `-framerate`: Framerate in fps. This example sets it to `30`.

  Replace `<YOUR_VIDEO_CAMERA_NAME>` with the video camera name you want to use.
- Save the changes you made in the file.
- Execute the `mediamtx.exe` program. You should see Windows cmd output like this:

2024/02/06 23:55:52 INF MediaMTX v1.5.1
2024/02/06 23:55:52 INF configuration loaded from C:\Users\User\Desktop\mediamtx\mediamtx.yml
2024/02/06 23:55:52 INF [path camera01] runOnInit command started
2024/02/06 23:55:52 INF [RTSP] listener opened on :8554 (TCP), :8000 (UDP/RTP), :8001 (UDP/RTCP)
2024/02/06 23:55:52 INF [RTMP] listener opened on :1935
2024/02/06 23:55:52 INF [HLS] listener opened on :8888
2024/02/06 23:55:52 INF [WebRTC] listener opened on :8889 (HTTP), :8189 (ICE/UDP)
2024/02/06 23:55:52 INF [SRT] listener opened on :8890 (UDP)
ffmpeg version 6.1.1-essentials_build-www.gyan.dev Copyright (c) 2000-2023 the FFmpeg developers
built with gcc 12.2.0 (Rev10, Built by MSYS2 project)
configuration: --enable-gpl --enable-version3 --enable-static --pkg-config=pkgconf --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-zlib --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-sdl2 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-dxva2 --enable-d3d11va --enable-libvpl --enable-libgme --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libtheora --enable-libvo-amrwbenc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-librubberband
libavutil 58. 29.100 / 58. 29.100
libavcodec 60. 31.102 / 60. 31.102
libavformat 60. 16.100 / 60. 16.100
libavdevice 60. 3.100 / 60. 3.100
libavfilter 9. 12.100 / 9. 12.100
libswscale 7. 5.100 / 7. 5.100
libswresample 4. 12.100 / 4. 12.100
libpostproc 57. 3.100 / 57. 3.100
Input #0, dshow, from 'video=c922 Pro Stream Webcam':
Duration: N/A, start: 184827.343686, bitrate: N/A
Stream #0:0: Video: mjpeg (Baseline) (MJPG / 0x47504A4D), yuvj422p(pc, bt470bg/bt709/unknown), 1280x720, 30 fps, 30 tbr, 10000k tbn
Stream mapping:
Stream #0:0 -> #0:0 (mjpeg (native) -> h264 (libx264))
Press [q] to stop, [?] for help
[swscaler @ 0000016617efca00] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 000001661b3a7040] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0000016617efca00] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 000001661b3a6340] deprecated pixel format used, make sure you did set range correctly
[libx264 @ 00000166154bef80] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 00000166154bef80] profile Constrained Baseline, level 3.1, 4:2:0, 8-bit
[libx264 @ 00000166154bef80] 264 - core 164 r3172 c1c9931 - H.264/MPEG-4 AVC codec - Copyleft 2003-2023 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=18 lookahead_threads=3 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=25 scenecut=0 intra_refresh=0 rc=abr mbtree=0 bitrate=600 ratetol=1.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0
2024/02/06 23:55:52 INF [RTSP] [conn [::1]:52234] opened
2024/02/06 23:55:52 INF [RTSP] [session bbd94c07] created by [::1]:52234
2024/02/06 23:55:52 INF [RTSP] [session bbd94c07] is publishing to path 'camera01', 1 track (H264)
Output #0, rtsp, to 'rtsp://localhost:8554/camera01':
Metadata:
encoder : Lavf60.16.100
Stream #0:0: Video: h264, yuv420p(tv, bt470bg/bt709/unknown, progressive), 1280x720, q=2-31, 600 kb/s, 30 fps, 90k tbn
Metadata:
encoder : Lavc60.31.102 libx264
Side data:
cpb: bitrate max/min/avg: 0/0/600000 buffer size: 0 vbv_delay: N/A
Now that the RTSP server is running, you need to add the `--network=host` parameter when running the Stream container:

```
docker run ... --network=host platerecognizer/alpr-stream:latest
```
Once the Stream container is running, just open the Stream config.ini file and add the URL of the RTSP server running in the host machine to the url field in the camera section.
More information about the URL configuration here.
Make sure to use the correct URL.
Format: `rtsp://<HOST_IP_ADDRESS>:<RTSP_PORT>/<CAMERA_SERVER_PATH>`

Example: `rtsp://192.168.1.100:8554/camera01`

- `rtsp` is the service protocol.
- `192.168.1.100` is the Windows host machine IP address.
- `8554` is the default RTSP port.
- `camera01` is the camera path in the server.
In general, if you are running the Stream container in the same host machine as the RTSP server, you can use localhost or 127.0.0.1 as the host IP address.
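The URL format above can be assembled and sanity-checked with a small helper; a minimal sketch (the `build_rtsp_url` function is ours, not part of mediamtx or Stream):

```shell
# Hypothetical helper: assemble a camera URL in the format
# rtsp://<HOST_IP_ADDRESS>:<RTSP_PORT>/<CAMERA_SERVER_PATH>.
build_rtsp_url() {
  host="$1"; port="$2"; path="$3"
  printf 'rtsp://%s:%s/%s\n' "$host" "$port" "$path"
}

# Stream running on another machine on the LAN:
build_rtsp_url 192.168.1.100 8554 camera01   # -> rtsp://192.168.1.100:8554/camera01
# Stream container on the same host as the RTSP server:
build_rtsp_url 127.0.0.1 8554 camera01
```

Paste the resulting string into the `url` field of the camera section in `config.ini`.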
USB Camera For Linux
This tutorial will guide you through the process of setting up a USB camera (webcam) on a Linux machine to work with Plate Recognizer Stream software.
1. Identifying Camera and Supported Formats
First of all, you need to identify your camera and its supported formats, resolutions and framerates. Follow these steps:
- Make sure your camera is connected to your Linux machine and recognized by the OS. You can use the command `lsusb` to list all connected USB devices:

  ```
  $ lsusb
  Bus 002 Device 003: ID 046d:085c Logitech, Inc.
  Bus 001 Device 101: ID 0458:708f KYE Systems Corp. (Mouse Systems) FaceCam 1000X
  ```

  **Warning:** If you don't see your camera listed, make sure it is properly connected and that you have the necessary drivers installed.

  **Note:** Please write down the camera model, vendor ID and product ID, as this information will be used later in this section.
  In this example, we will use the KYE Systems Corp. FaceCam 1000X camera, so:

  - Camera Model: `FaceCam 1000X`
  - Vendor ID: `0458`
  - Product ID: `708f`
- Find the id of the camera, for example `/dev/video0`:

  ```
  $ ls -ltrh /dev/video*
  -rwxrw----+ 1 root root 0 Sep 24 21:15 /dev/video2
  crw-rw----+ 1 root video 81, 1 Oct 7 14:39 /dev/video1
  ```

  Because the camera is connected via USB, it should be listed as `/dev/videoX`, where `X` is a number starting from `0`. However, if you have multiple video devices connected, the number may be different.
- Using the `v4l2-ctl` command, you can list all the supported formats of your camera. This command is part of the `v4l-utils` package, which you may need to install first.

$ v4l2-ctl --list-formats-ext -d /dev/video1
Driver Info:
Driver name : uvcvideo
Card type : FaceCam 1000X: FaceCam 1000X
Bus info : usb-0000:00:14.0-2.2.4
Driver version : 6.14.11
Capabilities : 0x84a00001
Video Capture
Metadata Capture
Streaming
Extended Pix Format
Device Capabilities
Device Caps : 0x04200001
Video Capture
Streaming
Extended Pix Format
Media Driver Info:
Driver name : uvcvideo
Model : FaceCam 1000X: FaceCam 1000X
Serial :
Bus info : usb-0000:00:14.0-2.2.4
Media version : 6.14.11
Hardware revision: 0x00000004 (4)
Driver version : 6.14.11
Interface Info:
ID : 0x03000002
Type : V4L Video
Entity Info:
ID : 0x00000001 (1)
Name : FaceCam 1000X: FaceCam 1000X
Function : V4L2 I/O
Flags : default
Pad 0x01000007 : 0: Sink
Link 0x02000010: from remote pad 0x100000a of entity 'Processing 2' (Video Pixel Formatter): Data, Enabled, Immutable
Priority: 2
Video input : 0 (Camera 1: ok)
Format Video Capture:
Width/Height : 640/480
Pixel Format : 'MJPG' (Motion-JPEG)
Field : None
Bytes per Line : 0
Size Image : 921600
Colorspace : sRGB
Transfer Function : Rec. 709
YCbCr/HSV Encoding: ITU-R 601
Quantization : Default (maps to Full Range)
Flags :
Crop Capability Video Capture:
Bounds : Left 0, Top 0, Width 640, Height 480
Default : Left 0, Top 0, Width 640, Height 480
Pixel Aspect: 1/1
Selection Video Capture: crop_default, Left 0, Top 0, Width 640, Height 480, Flags:
Selection Video Capture: crop_bounds, Left 0, Top 0, Width 640, Height 480, Flags:
Streaming Parameters Video Capture:
Capabilities : timeperframe
Frames per second: 30.000 (30/1)
Read buffers : 0
User Controls
brightness 0x00980900 (int) : min=-255 max=255 step=1 default=0 value=0
contrast 0x00980901 (int) : min=0 max=30 step=1 default=15 value=15
saturation 0x00980902 (int) : min=0 max=127 step=1 default=32 value=32
hue 0x00980903 (int) : min=-16000 max=16000 step=1 default=0 value=0
white_balance_automatic 0x0098090c (bool) : default=1 value=1
gamma 0x00980910 (int) : min=20 max=250 step=1 default=100 value=100
power_line_frequency 0x00980918 (menu) : min=0 max=2 default=1 value=1 (50 Hz)
0: Disabled
1: 50 Hz
2: 60 Hz
white_balance_temperature 0x0098091a (int) : min=2800 max=6500 step=1 default=5000 value=5000 flags=inactive
sharpness 0x0098091b (int) : min=0 max=31 step=1 default=16 value=16
backlight_compensation 0x0098091c (int)    : min=0 max=2 step=1 default=0 value=0

**Note:** Please write down the supported formats, resolutions and framerates of your camera, as this information will be used later in this section.
In this example, we will use the `MJPG` format at a resolution of `640x480` and a framerate of `30 fps`.

You can also see the camera model and driver information at the top of the output. Make sure it matches the camera you want to use, as seen in the `lsusb` output above. Depending on your camera, you may see different formats, resolutions and framerates. We will discuss several methods to set the camera parameters later in this section.
The supported formats may include `YUYV`, `MJPG`, `H264`, `UYVY`, `YUV420`, `NV12`, `RGB3`, among others.
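As a convenience, the vendor and product IDs noted above can be extracted from `lsusb` output with a short script. A sketch, with the sample line hard-coded for illustration (in practice you would pipe `lsusb` through `grep` to select your camera's line):

```shell
# Sample lsusb line for the camera; replace the echo with
# something like: line=$(lsusb | grep 'FaceCam')
line='Bus 001 Device 101: ID 0458:708f KYE Systems Corp. (Mouse Systems) FaceCam 1000X'

# The "ID vvvv:pppp" token carries the vendor and product IDs.
ids=$(printf '%s\n' "$line" | sed -n 's/.* ID \([0-9a-f]*\):\([0-9a-f]*\).*/\1 \2/p')
vendor_id=${ids% *}
product_id=${ids#* }

echo "Vendor ID:  $vendor_id"   # 0458
echo "Product ID: $product_id"  # 708f
```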
2. Assigning a Persistent Camera device ID
When using a USB camera, the device id (e.g., /dev/video0) may change if you connect/disconnect other video devices or reboot the machine. To avoid this, we will create a udev rule that assigns a persistent name to your camera based on its vendor ID and product ID.
This step is optional but highly recommended: it saves you from having to update the camera device path every time the device id changes.
- Identify the udev attributes of your camera using the `udevadm` command:

$ udevadm info -a -n /dev/video1 | grep '{idVendor}\|{idProduct}\|{serial}\|{product}'
Udevadm info starts with the device specified by the devpath and then
walks up the chain of parent devices. It prints for every device found,
all attributes and all tags. ...
...
looking at device '/devices/pci0000:00/0000:00:14.0/usb1/1-2/1-2.2/1-2.2.4/1-2.2.4:1.0/video4linux/video1':
KERNEL=="video1"
SUBSYSTEM=="video4linux"
...
ATTR{name}=="FaceCam 1000X: FaceCam 1000X"
looking at parent device '/devices/pci0000:00/0000:00:14.0/usb1/1-2/1-2.2/1-2.2.4/1-2.2.4:1.0':
...
looking at parent device '/devices/pci0000:00/0000:00:14.0/usb1/1-2/1-2.2/1-2.2.4/1-2.2.4':
...
ATTRS{idVendor}=="0458"
ATTRS{idProduct}=="708f"
ATTRS{serial}=="200901010001"
ATTRS{product}=="FaceCam 1000X: FaceCam 1000X"
...

In this example, we will use the `idVendor` and `idProduct` attributes to create the udev rule. The output is very verbose, so make sure to look for the correct device:

- Match `KERNEL=="videoX"` and `SUBSYSTEM=="video4linux"`. `KERNEL` should match the `/dev/videoX` path you found earlier.
- As a general rule, look for the parent device whose `idVendor` and `idProduct` attributes match the camera you want to use.
- Pay attention that `model` and `name` should also match the camera you want to use.
- Use the `serial` attribute as well if available, especially if you have multiple cameras of the same model.
- Create a new file in `/etc/udev/rules.d/` named `99-alpr-camera.rules` with the following content:

  ```
  SUBSYSTEM=="video4linux", ATTRS{idVendor}=="0458", ATTRS{idProduct}=="708f", SYMLINK+="alpr-camera-1"
  ```

  **Note:** If you have multiple cameras of the same model, you can use the `serial` attribute to differentiate them. For example:

  ```
  SUBSYSTEM=="video4linux", ATTRS{idVendor}=="0458", ATTRS{idProduct}=="708f", ATTRS{serial}=="200901010001", SYMLINK+="alpr-camera-1"
  SUBSYSTEM=="video4linux", ATTRS{idVendor}=="0458", ATTRS{idProduct}=="708f", ATTRS{serial}=="200901010002", SYMLINK+="alpr-camera-2"
  ```

  Even if you have a single camera, it's good practice to use the `serial` attribute if your camera provides it.

  - Replace `0458` and `708f` with your camera's vendor ID and product ID, respectively.
  - You can name the rule file anything you want, but it must end with `.rules`.
  - You can also change `alpr-camera` in `SYMLINK` to any name you prefer. That name will be used as the persistent camera device id, i.e., `/dev/alpr-camera-1`.
- Reload the udev rules and trigger them:

  ```
  sudo udevadm control --reload-rules
  sudo udevadm trigger
  ```
- Verify that the symlink was created:

  ```
  $ ls -l /dev/alpr-camera-1
  lrwxrwxrwx 1 root root 6 Oct 7 20:03 /dev/alpr-camera-1 -> video1
  ```
Almost there! From now on, you can use /dev/alpr-camera-1 as the camera id; as long as the camera is connected, it will always point to your USB camera, regardless of changes in device ids.
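If you script your setup, the rule line above can be generated from the IDs collected earlier. A sketch (`make_udev_rule` is a made-up helper name, not a real tool):

```shell
# Sketch: print a udev rule line like the one in
# /etc/udev/rules.d/99-alpr-camera.rules from a vendor ID,
# product ID, and symlink name.
make_udev_rule() {
  printf 'SUBSYSTEM=="video4linux", ATTRS{idVendor}=="%s", ATTRS{idProduct}=="%s", SYMLINK+="%s"\n' "$1" "$2" "$3"
}

make_udev_rule 0458 708f alpr-camera-1
# To install it (requires root):
#   make_udev_rule 0458 708f alpr-camera-1 | sudo tee /etc/udev/rules.d/99-alpr-camera.rules
```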
The Stream container cannot use the webcam device directly, so it's recommended to set up an RTSP server on the host machine and use the RTSP stream URL in Stream's config.ini file.
3. Setting up RTSP Server on the Host Machine
To stream video from your USB camera to our Stream software, you have to set up an RTSP server on your host machine.
Depending on camera capabilities, such as formats available, you can choose one of the following methods to set up the RTSP server:
Using v4l2rtspserver
One of the simplest ways to set up an RTSP server is by using the v4l2rtspserver tool. You can install it from the package manager or build it from source.
This will only work if your camera supports MJPEG (MJPG) or H264 formats. You can check the supported formats using the v4l2-ctl --list-formats-ext -d /dev/alpr-camera-1 command as shown above.
- Download and install the v4l2rtspserver package:

  ```
  wget https://github.com/mpromonet/v4l2rtspserver/releases/download/v0.3.11/v4l2rtspserver-0.3.11-Linux-x86_64-Release.deb
  # wget https://github.com/mpromonet/v4l2rtspserver/releases/download/v0.3.11/v4l2rtspserver-0.3.11-Linux-arm64-Release.deb # for ARM64 architecture
  sudo apt update
  sudo apt install ./v4l2rtspserver-0.3.11-Linux-x86_64-Release.deb
  ```
- Create a systemd service file to manage the RTSP server:

  ```
  sudo nano /etc/systemd/system/v4l2-rtsp.service
  ```

  Add the following content to the service file:

  ```
  [Unit]
  Description=v4l2 RTSP server to forward ALPR camera 1 (USB) to Stream

  [Service]
  ExecStart=/bin/bash -c "v4l2rtspserver -W 1920 -H 1080 -F 30 -P 8554 -u alpr-camera-1 /dev/alpr-camera-1"
  Restart=always
  User=<YOUR_USERNAME>

  [Install]
  WantedBy=multi-user.target
  ```

  **Note:**

  - Replace `1920` and `1080` with the resolution supported by your camera (as identified earlier).
  - Replace `30` with the framerate supported by your camera (as identified earlier).
  - Depending on supported formats, you may need to add `-f MJPG` or `-f H264` to the `ExecStart` line to specify the format explicitly.
  - The RTSP path can be changed by modifying the `-u` parameter. In this example, we set it to `alpr-camera-1`, so the RTSP URL will be `rtsp://<HOST_IP_ADDRESS>:8554/alpr-camera-1`.
  - You can adjust the port by changing the `-P` parameter. The default RTSP port is `8554`.
  - Replace `<YOUR_USERNAME>` with your Linux username or any other user with access to the camera device.
- Reload systemd, enable, start, and check the status of the service:

  ```
  sudo systemctl daemon-reload
  sudo systemctl enable --now v4l2-rtsp.service
  sudo systemctl status v4l2-rtsp.service
  ```

  Make sure the service is running without errors:

  ```
  ● v4l2-rtsp.service - v4l2 RTSP server to forward ALPR camera 1 (USB) to Stream
       Loaded: loaded (/etc/systemd/system/v4l2-rtsp.service; enabled; vendor preset: enabled)
       Active: active (running) since Mon 2024-10-07 20:10:00 UTC; 10s ago
     Main PID: 12345 (bash)
        Tasks: 2 (limit: 4915)
       ...
  ```
- Verify that the RTSP server is running and accessible:

  ```
  $ curl rtsp://localhost:8554/alpr-camera-1
  RTSP/1.0 200 OK
  CSeq: 1
  Public: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE
  ```

  **Note:** We strongly recommend using a media player like VLC to test the RTSP stream. Open VLC, go to Media > Open Network Stream, and enter the URL `rtsp://localhost:8554/alpr-camera-1`.

  You can also use `ffplay` if `ffmpeg` is installed on your system:

  ```
  ffplay rtsp://localhost:8554/alpr-camera-1
  ```

  That should open a window displaying the video feed from your USB camera.

  **Warning:** If you encounter issues, such as the stream not playing, artifacts, low resolution, or a green or black screen, make sure your camera supports the specified format, resolution, and framerate. You may need to adjust the parameters in the `ExecStart` line of the systemd service file accordingly. Please refer to the `v4l2rtspserver` documentation for more details, and double-check the camera capabilities using the `v4l2-ctl --list-formats-ext -d /dev/alpr-camera-1` command as shown above.
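The RTSP check can also be scripted. This sketch only parses a captured reply to decide whether the server answered; the reply is hard-coded here, and in a real check you would capture the output of `curl -s rtsp://localhost:8554/alpr-camera-1` instead:

```shell
# Sample RTSP OPTIONS reply; capture it with curl in a real check.
reply='RTSP/1.0 200 OK
CSeq: 1
Public: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE'

# A healthy server answers the OPTIONS request with "RTSP/1.0 200 OK".
case "$reply" in
  "RTSP/1.0 200"*) status=up ;;
  *)               status=down ;;
esac
echo "rtsp server: $status"   # rtsp server: up
```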
Using mediamtx with ffmpeg
Another option is to use mediamtx with ffmpeg. This method provides more flexibility and can work with various camera formats (such as YUYV, MJPG, etc.) by leveraging ffmpeg's capabilities. It can be easily integrated into existing workflows.
- Download and extract the MediaMTX binary:

  Go to the mediamtx releases repository and download the appropriate archive for your Linux architecture (e.g., `mediamtx_v1.x.x_linux_amd64.tar.gz` for x86_64 or `mediamtx_v1.x.x_linux_arm64.tar.gz` for ARM64).

  ```
  # Example for x86_64
  wget https://github.com/bluenviron/mediamtx/releases/download/v1.15.1/mediamtx_v1.15.1_linux_amd64.tar.gz
  ## or for ARM64
  # wget https://github.com/bluenviron/mediamtx/releases/download/v1.15.1/mediamtx_v1.15.1_linux_arm64.tar.gz
  tar -xzf mediamtx_v1.15.1_linux_amd64.tar.gz
  # It's recommended to copy the binary to a directory in your PATH, e.g., /usr/local/bin
  sudo cp mediamtx /usr/local/bin/
  sudo chmod +x /usr/local/bin/mediamtx
  ```

  **Note:** Make sure that you have `ffmpeg` installed on your system, as it is required for this method. You can install it using your package manager, e.g., `sudo apt install ffmpeg`.
- Configure MediaMTX:

  Edit the `mediamtx.yml` configuration file. By default, it's in the same directory as the binary, or you can specify its path.

  Add or modify the `paths` section to include your camera stream:

  ```yaml
  paths:
    alpr-camera-1:
      runOnInit: ffmpeg -f v4l2 -video_size 1280x720 -framerate 30 -i /dev/alpr-camera-1 -pix_fmt yuv420p -c:v libx264 -preset ultrafast -b:v 600k -f rtsp rtsp://localhost:$RTSP_PORT/$MTX_PATH
      runOnInitRestart: yes
  ```

  **Note:**

  - Replace `1280x720` and `30` with the resolution and framerate supported by your camera (as identified earlier).
  - If your camera uses a format like YUYV, you may need to add `-input_format yuyv422` before `-i /dev/alpr-camera-1` to specify the input format explicitly (ffmpeg input options must precede `-i`).
  - Adjust the bitrate (`-b:v 600k`) and other ffmpeg parameters as needed for your setup.
- Create a systemd service file to manage MediaMTX:

  ```
  sudo nano /etc/systemd/system/mediamtx.service
  ```

  Add the following content:

  ```
  [Unit]
  Description=MediaMTX RTSP server for ALPR camera 1

  [Service]
  ExecStart=/usr/local/bin/mediamtx /path/to/mediamtx.yml
  Restart=always
  User=<YOUR_USERNAME>

  [Install]
  WantedBy=multi-user.target
  ```

  **Note:** Replace `/path/to/mediamtx.yml` with the actual path to your configuration file, and `<YOUR_USERNAME>` with a user that has permissions to access `/dev/alpr-camera-1` (typically a user in the `video` group).
- Reload systemd, enable, start, and check the status of the service:

  ```
  sudo systemctl daemon-reload
  sudo systemctl enable --now mediamtx.service
  sudo systemctl status mediamtx.service
  ```
- Verify that the service is active and the RTSP stream works as intended, as shown in the previous method.
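If you run several cameras, the `paths` section can be generated instead of written by hand. A sketch assuming the `/dev/alpr-camera-N` symlinks from section 2 (`emit_path` is a made-up helper; the resolution and framerate are fixed here for brevity):

```shell
# Sketch: emit one mediamtx.yml paths entry per camera symlink.
# $RTSP_PORT and $MTX_PATH are escaped so mediamtx expands them, not this shell.
emit_path() {
  name="$1"; device="$2"
  cat <<EOF
  $name:
    runOnInit: ffmpeg -f v4l2 -video_size 1280x720 -framerate 30 -i $device -pix_fmt yuv420p -c:v libx264 -preset ultrafast -b:v 600k -f rtsp rtsp://localhost:\$RTSP_PORT/\$MTX_PATH
    runOnInitRestart: yes
EOF
}

echo 'paths:'
emit_path alpr-camera-1 /dev/alpr-camera-1
emit_path alpr-camera-2 /dev/alpr-camera-2
```

Redirect the output into `mediamtx.yml` (or merge it into your existing configuration) and adjust the ffmpeg parameters per camera as needed.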
4. Configuring Stream to Use the Camera
Now that you have your camera set up and an RTSP server running, you need to configure Stream to use the camera.
- Update the `url` parameter in your `config.ini` with the RTSP URL from the running RTSP server:

  ```
  url = 'rtsp://localhost:8554/alpr-camera-1'
  ```
- Start Stream with `docker run`, setting your own `LICENSE_KEY`, `TOKEN`, and a volume for `/path/to/stream_dir`:

  ```
  # don't use --gpus all if docker complains about it, or if you don't have a GPU.
  docker run \
      --gpus all \
      --name stream \
      --net host \
      --restart="unless-stopped" \
      --user `id -u`:`id -g` \
      -v /path/to/stream_dir:/user-data \
      -e LICENSE_KEY=XXXXX \
      -e TOKEN=YYYYY \
      platerecognizer/alpr-stream:latest
  ```

  **Note:** For the Docker container to access the localhost RTSP server, you need to run the container with the `--net host` flag.
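Editing the `url` line can also be automated with `sed`. A sketch that works on a throwaway copy (the section and key layout here are illustrative; check your actual `config.ini` before scripting against it):

```shell
# Sketch: point the camera url in a config.ini copy at the local RTSP server.
# A temp copy is used so nothing real is modified; substitute your real
# config.ini path in practice.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
[camera-1]
active = yes
url = 'rtsp://old-host:8554/old-path'
EOF

sed -i "s|^url = .*|url = 'rtsp://localhost:8554/alpr-camera-1'|" "$cfg"
new_url=$(grep '^url' "$cfg")
echo "$new_url"   # url = 'rtsp://localhost:8554/alpr-camera-1'
rm -f "$cfg"
```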
CSI Camera For Raspberry Pi: Camera Module 2/3/4
We recommend updating your OS and packages before starting the process.
- Choose a format supported by the camera hardware (use the command `v4l2-ctl --list-formats-ext` from the `v4l-utils` apt package). For example: width=1280, height=1080, framerate=15/1, format=UYVY.
- Set the camera `url` in `config.ini` to:

  ```
  url = "libcamerasrc ! video/x-raw, width=AAAAA, height=BBBBB, framerate=CCCCC/1, format=DDDDD ! videoconvert ! video/x-raw,format=(string)BGR ! appsink max-buffers=5"
  ```
- Start Stream with `docker run`, setting your own `LICENSE_KEY`, `TOKEN`, and a volume for `/path/to/stream_dir`. Notice the new environment variables, volumes, and flags:

  ```
  docker run \
      -t \
      --privileged \
      --name stream \
      --restart="unless-stopped" \
      --user `id -u`:`id -g` \
      --group-add video \
      -v /run/udev:/run/udev \
      -v /path/to/stream_dir:/user-data \
      -e OPENCV_API_PREFERENCE=1800 \
      -e LICENSE_KEY=XXXXX \
      -e TOKEN=YYYYY \
      platerecognizer/alpr-stream:arm
  ```
Read more about GStreamer and libcamera if you need to toggle specific camera options or switch to another device.
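The AAAAA/BBBBB/CCCCC/DDDDD substitution above can be sketched as a tiny helper (`fill_pipeline` is a made-up name; it just fills in the template from step 2):

```shell
# Sketch: fill the libcamerasrc pipeline template with a chosen
# width, height, framerate, and format.
fill_pipeline() {
  width="$1"; height="$2"; fps="$3"; fmt="$4"
  printf 'libcamerasrc ! video/x-raw, width=%s, height=%s, framerate=%s/1, format=%s ! videoconvert ! video/x-raw,format=(string)BGR ! appsink max-buffers=5\n' \
    "$width" "$height" "$fps" "$fmt"
}

# Using the example values from step 1:
fill_pipeline 1280 1080 15 UYVY
```

Wrap the result in quotes and assign it to `url` in `config.ini`.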
Jetson USB/CSI Camera
To use onboard cameras, a few additional configurations are necessary.
- First, get your available camera resolutions:

  ```
  gst-launch-1.0 nvarguscamerasrc sensor-id=0
  ```
- Set the camera `url` in `config.ini` to:

  ```
  url = "nvarguscamerasrc ! video/x-raw(memory:NVMM), width=AAAAA, height=BBBBB, framerate=CCCCC/1, format=NV12 ! nvvidconv flip-method=0 ! video/x-raw,width=960, height=616 ! nvvidconv ! video/x-raw ! videoconvert ! video/x-raw,format=(string)BGR ! videoconvert ! appsink max-buffers=5"
  ```
Change AAAAA, BBBBB, and CCCCC to values available for your camera. The framerate (FPS) can be any value up to the maximum available for the specified resolution.
- Now you can run the application. Notice the new environment variable `-e OPENCV_API_PREFERENCE=1800` (1800 is the id of OpenCV's `cv::CAP_GSTREAMER` backend), as well as the new volume and device:

  ```
  docker run \
      -t \
      --runtime nvidia \
      --privileged \
      --name stream \
      --restart="unless-stopped" \
      --user `id -u`:`id -g` \
      --group-add video \
      -v /tmp/argus_socket:/tmp/argus_socket \
      -v /path/to/stream_dir:/user-data \
      -e OPENCV_API_PREFERENCE=1800 \
      -e LICENSE_KEY=XXXXX \
      -e TOKEN=YYYYY \
      platerecognizer/alpr-stream:jetson
  ```
For the commands above, make sure to:
- Change XXXXX to the License Key that we gave you.
- Change YYYYY to your Plate Recognizer Token.