In this FPV camera latency test I will share my results for over 20 cameras, as well as my method for measuring camera delay. FPV camera lag affects a pilot's reaction time in flight, and it is one of the most important considerations when selecting an FPV camera.
FPV Camera Latency Measurement
It's important to understand that the latency of an FPV camera is not constant; it fluctuates due to processing and the limited frame rate. Consistent latency is just as important as total latency: even if a camera has a low average latency, occasional high delays can affect overall performance.
Other parts of an FPV system can also introduce latency, such as the VTX, VRX, and display. Here is my test on this: 5.8GHz Analog VTX/VRX latency
I compiled the specifications of all FPV cameras for mini quad in this spreadsheet so you can compare them more closely.
This data was taken from the cameras I have; please use these results as a reference only. Values are in milliseconds (ms).
| Camera | Latency (Max) | Min | Average (500 Samples) |
|---|---|---|---|
| Runcam Swift Rotor Riot Edition | 37 | 17 | 27 |
| Runcam Swift 2 | 32 | 14 | 23 |
| Runcam Swift Mini | 34 | 15 | 24 |
| Runcam Micro Swift | 33 | 17 | 28 |
| Runcam Eagle 2 | 34 | 15 | 25 |
| Runcam Owl Plus | 25 | 9 | 18 |
| Runcam Owl 2 | 26 | 8 | 18 |
| Foxeer Arrow V2 | 38 | 16 | 27 |
| Micro Swift 2 | 35 | 19 | 28 |
| Aomway 650TVL CCD | 29 | 12 | 22 |
| Aomway 700TVL CMOS V2 | 23 | 6 | 14 |
| Runcam Sparrow (Full, Micro) | 24 | 6 | 15 |
| Runcam Eagle 2 Pro | 34 | 17 | 25 |
| Runcam Night Eagle 2 | 35 | 18 | 26 |
| Eachine SpeedyBee SEC | 43 | 20 | 34 |
| Foxeer Predator (Micro, Mini) | 24 | 6 | 15 |
| Micro Sparrow 2 | 24 | 6 | 15 |
| Foxeer Monster Pro Mini & Micro | 23 | 5 | 14 |
| Runcam Micro Swift 3 | 39 | 23 | 33 |
| Camera | Latency (Max) | Min | Average (500 Samples) |
|---|---|---|---|
| Runcam Split (V1, V2) | 51 | 23 | 38 |
| Runcam Split Mini | 51 | 23 | 37 |
| GoPro Hero 3 Silver | 132 | 103 | 117 |
| 808 Key Chain | 103 | 53 | 79 |
*HD Cameras are measured during recording.
Method of Testing Camera Latency
I will try to explain everything in simple terms; if you want to discuss further, please head over to my forum thread.
Why I use this method
Many reviewers online use a virtual digital clock/timer on an iPad or smartphone to measure the latency of an FPV camera or an HD camera's video output.
It might or might not work, but I think it is unreliable: the displays and cameras all have relatively low refresh rates, and they are probably not running at the same frequency, so they are not even in phase. For example, if everything runs at 60fps, the result could have an error of up to 16.7ms. That is significant when the total latency of the Runcam Swift is only around 30ms.
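To put a number on that worst-case error, here is a quick Python sketch (assuming all devices run at 60fps but are not phase-locked):

```python
# Worst-case error of the on-screen clock method: if the clock display,
# the camera under test, and the recording device all run at 60 fps but
# are not synchronised, a single reading can be off by up to one full
# frame period.
frame_rate_hz = 60
max_error_ms = 1000 / frame_rate_hz
print(f"Worst-case error: {max_error_ms:.1f} ms")  # 16.7 ms

# Compared to a camera with ~30 ms total latency (e.g. the Swift):
print(f"That is {max_error_ms / 30:.0%} of the camera's total latency")
```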
In my testing I try to take as many samples as possible for each camera to get the best possible result (currently around 500 samples at random intervals per camera).
How it works
In my testing setup, the FPV camera is connected directly to an LCD display. I put an LED in front of the camera; when it turns on, the display brightness increases. A photo-transistor (or LDR, light dependent resistor) in front of the monitor detects the brightness change. The time difference between turning on the LED and detecting the light is the latency of the FPV camera.
However, the display's latency might contribute some error to the measurements. Since I am running so many samples at different intervals, the results should be unaffected by phase shifts and remain very consistent. This data is useful for comparing which camera has the least latency.
The LED and LDR are both connected to an Arduino UNO microcontroller, so the time can be measured very accurately. The LED and LDR have delays as well: the LED takes time to turn on (i.e. to go from 0% to 100% brightness), and the LDR takes time to detect light. From my measurements this delay is only around 170µs, which is negligible compared to the delay of an FPV camera.
I connected the camera to the monitor directly, without a VTX/VRX, to avoid extra latency in the system which would add more uncertainty to the results.
The Arduino runs at a 10kHz sampling rate (one sample every 100µs).
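The measurement loop can be sketched in Python (the real rig runs on an Arduino; the latency range, sample count, and function names here are illustrative assumptions, not the actual firmware):

```python
import random
import statistics

SAMPLE_PERIOD_US = 100  # 10 kHz polling rate, matching the Arduino setup


def simulate_measurement(true_latency_us):
    """Simulate one LED-on -> light-detected measurement.

    The light sensor is polled every 100 us, so the measured value is
    quantised up to the next polling tick (at most 0.1 ms of error).
    """
    ticks = -(-true_latency_us // SAMPLE_PERIOD_US)  # ceiling division
    return ticks * SAMPLE_PERIOD_US


# Take 500 samples at random intervals; the true latency varies with
# where in the frame the LED change lands (the 15-35 ms range below is
# an assumption for illustration only).
samples_ms = []
for _ in range(500):
    true_latency_us = random.uniform(15_000, 35_000)
    samples_ms.append(simulate_measurement(true_latency_us) / 1000)

print(f"min {min(samples_ms):.1f} ms, "
      f"max {max(samples_ms):.1f} ms, "
      f"avg {statistics.mean(samples_ms):.1f} ms")
```

Because each sample starts at a random point within the video frame, averaging many samples washes out the phase between the LED, the camera, and the display.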
NTSC and PAL
My assumption was that the video format of the camera has an impact on latency. (Read this post about the difference between NTSC and PAL.)
NTSC is 30fps while PAL is 25fps, but because they are interlaced (in each frame, odd lines are drawn before even lines), their effective field rates are 60Hz (16.7ms per field) and 50Hz (20ms per field) respectively. So in theory NTSC should be about 3.3ms faster than PAL.
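The theoretical advantage works out directly from the field periods, as this small Python sketch shows:

```python
# Interlaced video draws two fields per frame, so the effective field
# rate is double the nominal frame rate.
ntsc_field_ms = 1000 / (30 * 2)  # 60 fields/s -> 16.7 ms per field
pal_field_ms = 1000 / (25 * 2)   # 50 fields/s -> 20.0 ms per field

print(f"NTSC field period: {ntsc_field_ms:.1f} ms")
print(f"PAL field period:  {pal_field_ms:.1f} ms")
print(f"Theoretical NTSC advantage: {pal_field_ms - ntsc_field_ms:.1f} ms")
```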
I wanted to verify whether NTSC is really faster than PAL, and the Eagle is very handy here as I can easily change the video format in its settings. So I tested the Eagle (CMOS) and Eagle 2 (CMOS) under both PAL and NTSC.
- With the Eagle, minimum latency decreases significantly under NTSC and average latency shows a moderate reduction, but there is little to no difference in max latency
- This was not the case with both cameras though: with the Eagle 2, latency seems to be the same under both video formats
So it seems the FPS difference doesn't always translate into a latency difference in our testing. But why was it different with the Eagle?
I checked with Runcam, and they suggested the difference in latency could be caused by the camera's algorithm, which might perform better with one video format than the other.
I wanted to test more cameras with NTSC and PAL, but all the other cameras I have are PAL and cannot be changed in the settings. In my latency testing I always try to use PAL whenever possible, since that's what I personally use.