The interesting part of reviewing an instrument is to actually test the instrument against the datasheet to understand how this one particular unit performs and determine how well it holds to the claimed specifications. As I now have a sizeable collection of test equipment, I felt it was only responsible to put it through a gauntlet of tests.
Voltage Reading Accuracy
One of the two primary measurement parameters is voltage, which can be verified more easily than some others. To do this, I employed the Keithley Model 2110 5.5-digit bench-top DMM as the voltage reference and the Keysight E36103A power supply as the voltage source. Both items were awarded from prior RoadTests and were selected because the 2110 had the best accuracy in my collection and the E36103A had excellent front-panel read-out accuracy and low output ripple to avoid skewing the results. It should be noted that this investigation is quasi-scientific, as the instruments used have not been recalibrated since leaving the factory, although cross-referencing readings suggests that the unit appears to be at least as accurate as my next-closest 4.5-digit DMM (Keysight U1461A).
Testing was automated using a pyvisa script which set the Keithley Model 2110 into manual triggering mode, auto-ranging DC volts, reading over 10 PLC (for maximum accuracy). The script set the BA6010 to manual triggering, slow mode with no averaging; the unit had already been pre-configured with my local power-line frequency (50Hz). The power supply was enabled and stepped through the full range of 0 to 20V in 1mV increments. At each point, the Model 2110 and BA6010 were triggered as close to simultaneously as possible through USB-TMC and the results read out and recorded for analysis.
While I could crudely compare reading differences, it’s important to understand each instrument has its own margin of error. To visualize this, I’ve plotted the delta along with the upper and lower boundaries for the delta based on the Keithley Model 2110’s reading accuracy specifications.
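To make the comparison concrete, the error envelope can be computed from the DMM's accuracy specification, which takes the usual ±(% of reading + % of range) form. Here is a minimal sketch; the spec figures and the 20% over-range headroom below are illustrative placeholders, not the Keithley 2110's actual one-year specifications, which should be taken from its datasheet:

```python
# Sketch: computing a DMM's allowable-error envelope for a set of readings.
# ASSUMPTION: the spec figures below are illustrative placeholders in the
# usual +/-(% of reading + % of range) form -- substitute the real values
# from the instrument's datasheet before drawing any conclusions.
SPEC = {  # range (V) -> (% of reading, % of range)
    0.1: (0.012, 0.004),
    1.0: (0.012, 0.004),
    10.0: (0.012, 0.004),
    100.0: (0.012, 0.004),
}

def dmm_range(reading):
    """Pick the smallest range that accommodates the reading (auto-ranging)."""
    for rng in sorted(SPEC):
        if abs(reading) <= rng * 1.2:  # assume typical 20% over-range headroom
            return rng
    return max(SPEC)

def error_limit(reading):
    """Allowable error (V) at this reading per the assumed specification."""
    rng = dmm_range(reading)
    pct_rdg, pct_rng = SPEC[rng]
    return (pct_rdg / 100.0) * abs(reading) + (pct_rng / 100.0) * rng

# Lower/upper envelope to plot alongside the reading deltas:
readings = [0.05, 0.5, 3.0, 9.0, 15.0]
envelope = [(r, -error_limit(r), error_limit(r)) for r in readings]
print(envelope)
```

The range-change steps in the envelope fall out naturally: as the reading crosses into a higher range, the "% of range" term jumps, producing the changing error boundaries visible in the plots.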
The lowest range of the BA6010 is the 6V range. Looking at the 0-6V readings, the Keithley uses its 0.1V, 1V and 10V ranges, resulting in a changing “envelope” of potential error. The mean line itself is close to the zero line with some clear deviations at 1.2V (due to a change in the DMM’s range) and 3V (likely due to an internal calibration data step of the BA6010). Overall, the graph is scaled from -3 to 3mV – the readings fall well within the expected 3mV error envelope for this range.
Zooming out to the 0-20V readings, the Keithley now also uses the 100V range resulting in another jump. Now, with the BA6010 in the 60V range, it seems that the BA6010 reads somewhat low by about 15-25mV. The trend, interestingly, mirrors the lower part of the range but scaled up. The low limit reading touches the specification limit at 20V, which implies that if the Keithley 2110 was right at the limit of its allowable error, the BA6010 would also be. However, as this is an unlikely situation, the raw difference line in blue is more likely the truth. While in specification, it is noticeably low.
To span the full range, I added a Manson HCS-3102 power supply providing 20V on its floating rails to obtain the span up to 40V. As the HCS-3102 is a switching supply, there is a slight amount of introduced ripple which could affect the results slightly.
By expanding to 40V, we can see that the low limit line does pass the limit just slightly, although the mean stays within. As a result, I think this unit is probably reading low but within specification. Mirroring the 6V range, at the 30V mark, the unit corrects itself and becomes much closer to the reported value from the Keithley 2110.
Not satisfied with this, and contrary to the manuals of my power supplies, I stacked another HCS-3102 providing 20V to be able to span to the complete 60V.
What follows is no great surprise – the voltage appears to get more accurate, trending back towards the zero-difference line. As a result, it seems that throughout the range, the BA6010 does meet the claimed voltage accuracy specifications; however, there do seem to be some “discontinuities” in the voltage around mid-range (3V/30V) which could be of interest.
Finally, just to placate the sceptics, I decided to also run the test with the Keithley 2110 testing the read-back accuracy of the Keysight against its readout specifications. Accordingly, the readout sits within a tight band of around +1mV to +1.5mV, well within the claimed accuracy boundaries of 0.05% + 5mV. The slight tilt and jump at 12V due to the range change does suggest there may be some error contributed by the Keithley 2110, but the magnitude of the error is very much in accordance with the claimed margin of error.
Resistance Reading Accuracy
The second primary parameter that the BA6010 reads is resistance. Unfortunately, this is more difficult to test: while I do have a B&K Precision Model 8600 Electronic Load with a constant resistance mode, it is a DC load, whereas the signal from the BA6010 is a 1kHz AC signal. As a result, I was not convinced that it would be a valid test.
Instead, I resorted to these “resistor cards” I built a while back for testing meter accuracy. These are just strips of veroboard with random resistors from my junk box spanning the range, whose values I could measure to compare meters.
This method has a number of caveats – the temperature of the resistors may change as a result of the test and handling and without knowing the temperature coefficient of the resistors, the reading values can differ for that reason alone. Furthermore, the contact shape and area of the probes can affect the reading, especially for smaller resistances.
Because of this, I decided to do an “indicative” test just to see whether there were any unexpected errors in resistance. While I could measure the resistors with the Keithley 2110 DMM, that uses a DC resistance measurement, whereas the resistors could behave slightly differently under AC (however unlikely). To exclude this possibility, I used my (own, personally bought) Agilent Technologies U1733C LCR meter in 1kHz mode. The meter itself has about 4-digit resolution, so it’s not really capable of verifying the absolute accuracy of the BA6010. As it is a two-wire measurement, the null-offset mode was used to minimise lead error as much as possible.
The absolute difference in resistance value was commendably small: not even an ohm up to 3kohm. Some of the larger variances are purely because of the lack of resolution of the U1733C, which was reading in ohms for the larger resistances.
If we look at it by percentage, we can see that most of the errors are under 0.4% with only one at near 1% that was probably due to contact resistance and two-wire measurement errors. As a result, the concordance between the two meters is quite high and I suspect the BA6010 is able to exceed the claimed accuracy in resistance, at least in the case of such “simple” resistors.
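The comparison itself is simple to reproduce: pair up the readings from the two meters and compute absolute and percentage differences. A minimal sketch (the values below are made-up placeholders, not my measured data):

```python
# Sketch: comparing paired resistance readings from two meters.
# ASSUMPTION: the example values are invented for illustration only.
def compare_readings(ref_readings, dut_readings):
    """Return (absolute, percentage) differences for paired readings."""
    return [(dut - ref, 100.0 * (dut - ref) / ref)
            for ref, dut in zip(ref_readings, dut_readings)]

ref = [9.94, 99.6, 1002.0, 3010.0]   # reference meter (ohms)
dut = [9.95, 99.7, 1001.0, 3012.0]   # meter under test (ohms)
for r, (abs_diff, pct_diff) in zip(ref, compare_readings(ref, dut)):
    print(f"{r:>8.1f} ohm: delta {abs_diff:+.2f} ohm ({pct_diff:+.3f}%)")
```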
Complex Impedance Reading Accuracy
Unfortunately, owing to my lack of a collection of inductors and capacitors, it’s not really possible for me to characterise the complex impedance reading accuracy of the unit. That being said, I did my best to understand the manual’s (at times) confusing explanation of calculating error and produced an Excel worksheet that calculates the actual error, which depends on a number of factors such as the reading and dissipation factor.
While the datasheet claims a typical error of around 5%, in the slow mode across a range of readings and dissipation factor values, it seems that the actual error is likely to be a lot less – peaking about 1.8% except for extremely small values of R.
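I won't reproduce the manual's exact formula here, but as an illustration of the general idea: LCR accuracy specifications commonly scale a basic accuracy figure by a factor involving the dissipation factor, so that lossier components are measured less accurately. The sketch below uses one common convention (a 1 + D multiplier above a threshold) purely as an assumption for illustration; the BA6010 manual's actual formula differs in detail and should be consulted for real error calculations.

```python
# Sketch: how LCR-meter error commonly grows with dissipation factor D.
# ASSUMPTION: the (1 + D) multiplier and the 0.1 threshold are a common
# industry convention, NOT the BA6010 manual's actual error formula.
def impedance_error_pct(basic_accuracy_pct, d_factor):
    """Scale a basic accuracy figure (%) by the dissipation factor."""
    if d_factor > 0.1:
        return basic_accuracy_pct * (1.0 + d_factor)
    return basic_accuracy_pct

# e.g. a 0.5% basic accuracy degrades as the part becomes lossier:
for d in (0.05, 0.2, 1.0):
    print(f"D = {d}: {impedance_error_pct(0.5, d):.3f}%")
```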
As for whether this actually accurately captures the performance of the instrument is something I’m not adequately equipped to test, but I suspect it does based on how well it performs on the resistance accuracy.
Output Voltage, Current and Waveforms
The output voltage from the unit depends somewhat on the impedance connected to the unit. Putting the output directly into a 10x scope input on the Rohde & Schwarz RTM3004 results in a signal that’s close to 2V peak-to-peak. These low voltages should make it safe for human contact. There is noticeable asymmetry in the waveforms as the signal is probably isolated by a transformer and is reaching saturation. Unloaded, the FFT of the waveform seems to show a number of harmonics.
When loaded, say with a 1.1 ohm resistor, the waveform takes a much cleaner shape and the FFT is much cleaner.
The output from the unit is interrupted at ~466ms intervals in slow mode, which probably coincides with the sampling rate. For convenience, I have included some earlier screenshots from my PicoScope 2205A, as its 1:1 probe is more appropriate for such weak signals.
Unfortunately, as I didn’t have any meter which could measure 1kHz AC currents directly, it’s easier just to refer to the datasheet for this information. The measurement current depends on the range, with a maximum of 100mA. The actual current can be seen on the BA6010’s display provided the secondary measurement feature is turned on. With such relatively low currents flowing, there shouldn’t be any concerns about sparks near batteries or damage to gold-plated connectors.
Standby and Active Power Consumption
As a reader of another RoadTest commented about the unreasonable power some instruments consume, and as a prior recipient of a Tektronix PA1000 Power Analyzer, I decided to do some standby power measurements using a synthetic pure-sine-wave inverter source with Variac voltage-trimming.
When in standby mode, the unit measured a standby power of 0.73595W, complying with the 1W limit in the EU when tested to the IEC62301 standard. Note the power factor value is not accurate in part due to my inverter set-up.
When active, testing with the same protocol for comparison, we see that the power consumption of the unit tends to trend slightly upward over time, which may explain the slow warm-up performance shown in the next section. Active power is about 10.5W, which is quite reasonable for a test instrument.
One point of interest is how the instrument “warms up” over time. I noticed a significant drift in readings when started from cold, so I decided to investigate this by leaving the instrument sitting overnight, powering it up in the morning and immediately logging the values from the instrument in R-V mode with slow sampling for highest accuracy. Throughout this period, room temperature change was negligible and remained within the 20±5°C envelope. The leads were shorted throughout the test, so voltage and resistance should (in theory) have read zero. Short compensation was turned off for this task to prevent its interference in read values.
The warm-up trend in voltage shows a noticeable change over time, varying by close to 2.4mV in all. Most of the change seems to have taken place within the first two hours, with the reading only stabilizing just before the five-hour mark, just shy of 1.5mV. This could be due to the warming-up of components with dissimilar-metal junctions, producing a thermal EMF effect.
It seems that the resistance readings are somewhat “noisy” in appearance but only move a small amount over the tested warm-up interval. As the change is rather small, it seems that after about one and a half hours, the unit’s readings are “as good as they get”.
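Given a log of timestamped readings, the "warmed up" point can be estimated programmatically as the time after which the reading stays within a tolerance band of its final settled value. A minimal sketch on synthetic data (the exponential drift curve below is made up for illustration; real data would come from the logged instrument readings):

```python
import math

# Sketch: estimating warm-up settling time from a drift log.
# ASSUMPTION: the exponential drift toward a 1.5mV offset is synthetic,
# invented to stand in for the real logged data.
t = [60.0 * i for i in range(360)]                          # 1 sample/min, 6 h
v = [1.5e-3 * (1.0 - math.exp(-ti / 3600.0)) for ti in t]   # drift toward 1.5mV

def settling_time(times, values, tol):
    """First time after which every reading stays within tol of the final value."""
    final = values[-1]
    for i, ti in enumerate(times):
        if all(abs(x - final) <= tol for x in values[i:]):
            return ti
    return None

print(settling_time(t, v, 0.1e-3))  # seconds until within 0.1mV of final
```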
Because of this, almost all of the testing I performed above was done with the unit warmed up for at least six hours or left continuously operating, even though the error induced is relatively small.
To examine the measurement rate, I used the EOM output with the unit running on INT trigger. Measuring the intervals between triggers using R-V mode with shorted leads and auto-ranging, I found that the measurement rate for 50Hz power input was:
- 88ms/11.4Hz in Fast Mode
- 268ms/3.7Hz in Medium Mode
- 468ms/2.1Hz in Slow Mode
This was at odds with the datasheet’s claimed 50 measurements per second in fast mode, 10 measurements per second in medium and 6.25 measurements per second in slow. The only way I was able to meet the ~50Hz measurement rate was to measure voltage only in fast mode with a fixed range (to maximise speed), and even then it clocked in at around 47.6Hz, so the claimed sample rates probably apply to this scenario alone.
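For reference, converting the measured EOM trigger intervals to reading rates is just a matter of taking the reciprocal:

```python
# Converting measured EOM trigger intervals (ms) to reading rates (Hz),
# using the intervals measured in R-V mode with auto-ranging.
intervals_ms = {"fast": 88, "medium": 268, "slow": 468}
rates_hz = {mode: round(1000.0 / ms, 1) for mode, ms in intervals_ms.items()}
for mode, rate in rates_hz.items():
    print(f"{mode}: {rate} readings/s")
```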
Putting the BA6010 through its paces, I was able to verify that my particular sample did seem to meet the voltage reading accuracy specifications, although with some discontinuities in the result and getting a little close to the limit especially towards the middle of the 60V range. The resistance reading accuracy as compared with a 4-digit handheld LCR meter showed a high level of concordance, suggesting to me that the BA6010’s resistance accuracy is quite good and potentially better than stated when testing simple resistors.
When measuring complex impedances, the accuracy depends on the reading and dissipation factor. While the manual was slightly confusing about this, I’ve attempted to distil it down into an Excel workbook along with a graph which shows that, in the slow mode for the majority of the range of values, error below 1.8% is achievable. Whether this is actually the case, I was not able to confirm.
The BA6010 seems to put out up to 2V peak-to-peak into a 10x oscilloscope probe and a maximum current of 100mA according to the datasheet, with voltage and current dependent on the measurement range and impedance connected. These levels should not be hazardous to humans, batteries or the gold-plating on the clips, which is good to know.
Standby power consumption for the instrument was tested to be 0.73595W, below the 1W limit set by the EU and quite a decent result. Active power consumption was tested to be approximately 10.5W, increasing slightly over time. Connected with this was the warm-up behaviour, which showed voltage drift of up to 2.4mV over six hours, with most of the movement completed within the first two hours and full stabilisation only after close to five hours. The resistance drifted less and was more stable overall; however, it seems that about one and a half hours is needed for the resistance reading to stabilise.
Testing the measurement rates showed that the datasheet’s claimed reading rates represent the best-case scenario of using the meter to measure voltage alone. Even then, the resulting read rates fall slightly short of the claimed 50Hz maximum in fast mode, reaching 47.6Hz in my tests. In R mode with a fixed range, the reading rate was around half of this. In R-V mode (as you might reasonably expect to use it) with auto-ranging enabled, I achieved 11.4Hz in fast mode, 3.7Hz in medium mode and 2.1Hz in slow mode, which is quite different from what the datasheet might have you believe.
Appendix: pyvisa Scripts
Please note, I have supplied my pyvisa code for reference; however, you may not be able to run it as-is: it uses absolute instrument references including serial numbers and hard-coded file paths, it may require the availability of other instruments, it may be written for Python 2 while you’re running Python 3 (or vice versa), and it may rely on Windows-only libraries. No liability is taken for this code – use it at your own risk.
Voltage Reading Accuracy
# B&K Precision BA6010 Testing of Voltage Reading Accuracy
# Using Keysight E36103A as supply, Keithley Model 2110 as reference
# Gough Lui (goughlui.com) - September 2018
import pyvisa as visa

resource_manager = visa.ResourceManager()
ins_ba6010 = resource_manager.open_resource('USB0::0x0471::0x6010::520L17113::INSTR')
ins_e36103a = resource_manager.open_resource('USB0::0x2A8D::0x0702::MY56156196::INSTR')
ins_k2110 = resource_manager.open_resource('USB0::0x05E6::0x2110::1374001::INSTR')
# Roll Call
print('Available:' + '\n' + ins_ba6010.query('*IDN?') + ins_e36103a.query('*IDN?') + ins_k2110.query('*IDN?'))
# Open Data Files
print('Opening a logfile ...')
data_log = r'D:\experimental\output.csv'
f = open(data_log, 'a')
# Set Up BA6010
print('Setting Up - BA6010')
ins_ba6010.query_delay = 0.1
# Set Up K2110
print('Setting Up - K2110')
# Begin Voltage Experiment
print('Begin Testing')
ins_e36103a.write('OUTP ON')
for x in range(0, 20001):
    ins_e36103a.write('VOLT:LEV:IMM:AMPL ' + str(x / 1000.0))
    mmvolts = ins_k2110.query_ascii_values('FETC?', separator='\r')
    bavolts = ins_ba6010.query_ascii_values('FETC?', separator=',')
    f.write(str(x / 1000.0) + ',' + str(mmvolts[0]) + ',' + str(bavolts[0]) + '\n')
    print('Completed step ' + str(x))
# Close Log
f.close()
# Turn off supply
ins_e36103a.write('OUTP OFF')
# Announce Completion
print('Script Completed!')
This blog is part of a series of posts for the B&K Precision BA6010 Battery Analyser RoadTest, where you will find all the links to the other parts of the review.