0906 hardwire kit, voltage testing

tinman

After yanking out the hardwire kit, I decided to do some testing on it to see how well the power regulator performs.

First, the obligatory images of the PCB
[Attached: two photos of the hardwire kit PCB]

According to the post here by @Rayman.Chan , the regulator is supposed to be able to handle 5V @ 2A max (please correct me if I'm wrong). Not wanting to melt the regulator, I stopped the testing at 1.5A.

A regulator is supposed to maintain a stable output voltage from the lowest load current up to its maximum rated load current. Most vendors give a 10% tolerance, a very good design will specify 5%, and the best regulators are at 1%. Assuming a 10% tolerance on this regulator, the output should stay within 4.5V to 5.5V.

Well, what I expected and the actual outcome are a little different. Again, I stopped at 1.5A out of safety concerns. From the graph below, when the regulator first powers up, the output voltage is 5.4V, well within the expected 10% tolerance. But at 1.5A, still below the 2A rating, the output voltage has already dropped to 4.3V, below the expected 4.5V floor. This implies the regulator is probably a 15% - 20% tolerance part.

This is not that bad. According to @Rayman.Chan , the CAM should normally draw 700mA, which means that during normal operation the output voltage stays in the 5V range.
[Attached graph: output voltage vs. load current]
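Quick back-of-the-envelope check on those tolerance numbers (assuming a 5V nominal output, and using the 5.4V power-up and 4.3V @ 1.5A readings from the graph):

```python
# Rough check on the tolerance numbers, assuming a 5 V nominal output.
nominal = 5.0

# A +/-10% band around 5 V:
print(f"10% band: {nominal * 0.9:.2f} V .. {nominal * 1.1:.2f} V")   # 4.50 V .. 5.50 V

# Deviation implied by the measured extremes (5.4 V at power-up, 4.3 V at 1.5 A):
for v in (5.4, 4.3):
    print(f"{v} V is {abs(v - nominal) / nominal * 100:.0f}% from nominal")
# 5.4 V -> 8%, 4.3 V -> 14%, so calling it a 15-20% part is a fair guess
```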

Next comes the efficiency of the regulator. Ideally we would want 100% efficiency, but in the real world, getting to 90% is great and 80% is good enough. At the current the CAM should be drawing during normal operation, we are looking at ~77% efficiency. Not great, but OK.

[Attached graph: efficiency vs. load current]
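For anyone who wants to redo the math, efficiency is just output power over input power at the same load point. The input current below is my assumption, backed out from the ~77% figure rather than a logged reading:

```python
# Efficiency is output power over input power at the same load point.
def efficiency(v_in, i_in, v_out, i_out):
    return (v_out * i_out) / (v_in * i_in)

# Illustrative readings near the 700 mA operating point; the 0.355 A input
# current is an assumption backed out from the ~77% figure, not a logged value.
print(f"{efficiency(12.8, 0.355, 5.0, 0.7) * 100:.0f}%")   # ~77%
```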

Below is the raw data collected. The forum doesn't allow fancy tables, so I'm uploading the data as an image file. So what does this data tell us? Well, the input to the hardwire kit was set to 12.8V, well above any of the possible trip voltages on the kit. At the operating current of 700mA, the CAM is consuming 3.5W, while on the input side the regulator is drawing about 4.5W.

[Attached table: raw measurement data]
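One more thing the table lets you read off (rough numbers, using the rounded 3.5W out / 4.5W in figures):

```python
# Whatever the regulator doesn't deliver to the CAM ends up as heat in the kit.
p_out = 3.5            # W into the CAM at ~700 mA (from the table)
p_in  = 4.5            # W from the 12.8 V input (rounded)
print(f"dissipated in the kit: ~{p_in - p_out:.1f} W")
print(f"current drawn from the 12.8 V supply: ~{p_in / 12.8:.2f} A")
```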

The test was done using an Agilent 2902A as the 12.8V input. For the load, a Keithley 2460 was used. All voltage and current sensing was done using Kelvin connections.
 
To tap onto the USB connector, a micro-USB breakout board was used.
[Attached photo: micro-USB breakout board]
 
Where did you measure the voltages?

I would expect a drop if it is at the end of the power cable rather than on the PCB...

Edit: I guess that last post has answered the question!
You will be losing some in that board and the test leads too.

Can you check the output voltage on the PCB to see how much is lost in the cable?
 
To mimic what the CAM would see, the voltage is measured at the end of the cable. Technically, that is not the proper way to test regulator performance, but I wanted to see what the performance looks like from the CAM's point of view.
 
I highly doubt the loss in the cable is of any significance, but well, here it is:
@Cable -> 5.4392V
@PCB -> 5.4392V

A difference of 1.2mV
 
I highly doubt the loss in the cable is of any significance, but well, here it is:
@Cable -> 5.4392V
@PCB -> 5.4392V

A difference of 1.2mV

I don't see any difference, maybe typo?

@Cable -> 5.4392V
@PCB -> 5.4392V
 
My bad.
@Cable -> 5.4392V
@PCB -> 5.44042V

Using Agilent 34410A
 
My bad.
@Cable -> 5.4392V
@PCB -> 5.44042V

Using Agilent 34410A
If you measure it with no load then there will be no voltage drop!

Put your 1.5A load on it and then measure...

And then put the load that the camera actually gives, maybe 0.7A and measure...

5.44V is a little high; maybe be careful what you plug into that, unless it drops down to within the USB specification of 5.25V under load, which your table shows it will for most devices.
 
Where did you measure the voltages?
I would expect a drop if it is at the end of the power cable

If you measure it with no load then there will be no voltage drop!

Yes, you guys have strong electronics knowledge and skills.
There will be voltage loss on the cable because of the cable resistance.


5.44V is a little high; maybe be careful what you plug into that, unless it drops down to within the USB specification of 5.25V under load, which your table shows it will for most devices.

The no-load voltage is set to 5.4V +/-0.05V, so it is in tolerance; that is the MAX voltage when using this hardwire kit.
When the voltage at the cable connector drops to 4.2V, we consider it no longer usable.

The regulator is marked as 3A output in its specification, but we only believe what we get in our own testing.
 
The length of the USB cable is ~92" (~2.33 meters). Going by the USB 2.0 spec, for a 2m cable, the power wire should be 24 AWG. For a 24 AWG wire, the resistance should be 0.0842 ohm/m. Using the originally measured data for a 1A load, where the voltage seen at the USB connector is 4.768V, the loss through the cable should then be 196.19mV.
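For reference, the same 24 AWG estimate as a quick script:

```python
# Expected cable drop for the 24 AWG assumption, using the figures above.
length_m = 2.33          # ~92 inch cable
r_per_m  = 0.0842        # ohm per meter for 24 AWG
i_load   = 1.0           # A
r_wire   = length_m * r_per_m           # ~0.196 ohm per conductor
print(f"{r_wire * 1000:.1f} mOhm -> {r_wire * i_load * 1000:.1f} mV drop at 1 A")
```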

To see if this theory pans out, a new measurement was done with a current load of 1A:
@ PCB -> 5.447V
@ end of the test leads (load) -> 4.75V

A difference of 697mV. That doesn't seem right. But, to get the current load onto the USB cable, a micro-USB breakout board was used with two test leads. The test leads are an unknown here spec-wise, so a direct measurement was done on the header pins of the breakout board:
@ micro USB breakout board headers -> 4.904V

A difference of 543mV now. Maybe the traces of the breakout board are causing a loss. Again, the trace impedance, width, and length are a mystery without the layout file, so a direct measurement was done on the exposed pins of the USB connector:
@ exposed USB pin connector -> 4.914V

Not much, but the breakout board introduces a 10mV drop. The difference is now 533mV.

Why such a huge difference between the expected and the actual? One possible theory is that the wire is not 24 AWG, or that the insertion loss from the USB connector into the breakout board is very large.
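For my own sanity, here are the measurement points laid out in order, with my interpretation of where each reading was taken (the 4.75V figure being the one at the far end of the test leads):

```python
# The measured drop ladder at a 1 A load, working out from the regulator.
points = [
    ("PCB (regulator output)",       5.447),
    ("exposed micro-USB pin",        4.914),
    ("breakout board header",        4.904),
    ("end of test leads (load)",     4.750),
]
for (name_a, v_a), (name_b, v_b) in zip(points, points[1:]):
    print(f"{name_a} -> {name_b}: {(v_a - v_b) * 1000:.0f} mV")
```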
 
Damn, I need to get better hookup leads. One measured at 355 milli-ohm and the other at 488 milli-ohm.
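At 1A those leads by themselves eat a big chunk of the budget:

```python
# Drop across the hookup leads alone at a 1 A load.
r_leads = (0.355, 0.488)     # ohm, as measured
for r in r_leads:
    print(f"{r * 1000:.0f} mOhm lead -> {r * 1000:.0f} mV at 1 A")
print(f"both leads in the current loop: {sum(r_leads) * 1000:.0f} mV at 1 A")
```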
 
Going by the USB 2.0 spec, for a 2m cable, the power wire should be 24 AWG.
My guess is that the cable was chosen for a nice price, good quality and good flexibility, and then the output voltage of the regulator was set to give the correct voltage at the camera when the camera is drawing its average load.

A full-size, fully shielded USB power cable meeting the USB specification would be undesirable when you know it is being used for a dashcam, especially if it is sold for a specific dashcam. Although my 0906 is powered by a proper-spec USB 3 power cable!
 
I got curious about the big discrepancy between the expected and the measured values, so I went ahead and measured the resistance of the whole path: PCB -> connector -> breakout board -> test leads. On the 5V line the resistance is 518 milli-ohm, and the GND side has 558 milli-ohm. With the 1A draw, the calculated voltage drop on the 5V line alone would then be 518mV. That is much closer to the measured drop of 697mV.

The difference between the measured and calculated drops is now 179mV. The only probable cause left is that the GND level is no longer a true GND reference, due to the resistance of the GND wire.
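Same arithmetic in script form, with the numbers from the measurements above:

```python
# Reconciling the 697 mV measured drop with the measured path resistances (1 A load).
i_load = 1.0         # A
r_5v   = 0.518       # ohm, PCB -> connector -> breakout -> test lead, 5 V side
r_gnd  = 0.558       # ohm, same path on the GND side
drop_5v   = r_5v * i_load                 # ~518 mV expected on the 5 V line
remainder = 0.697 - drop_5v               # ~179 mV unaccounted for
print(f"5 V line drop: {drop_5v * 1000:.0f} mV, remainder: {remainder * 1000:.0f} mV")
# The remainder is being pinned on the GND reference shifting, which seems
# plausible given the ~558 mOhm sitting in the GND side of the same path.
```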
 
This is reasonable.

Take the data cable as an example: choosing a supplier is a challenge. Most data cables on the market are >1 ohm over 0.8m of wire (thin copper-clad steel conductors), which causes a big voltage drop when a camera is connected to a computer, so the camera will not enter mass storage mode. We sourced from a lot of vendors and finally got a cable with <0.5 ohm over 0.8m of wire (AWG28 copper). That is not a big difference on paper, but it has a big effect: it works or it doesn't.
 