- The voltage and current should be measured (ideally) simultaneously so that the power at that specific point in time can be determined.
- The rate at which new voltage and current measurements are made needs to be high enough that the accumulated samples form an accurate representation of the source waveform. This is especially true if the current waveform has transients and nonlinearities, as it will with non-linear loads such as power electronics.
Both of these factors are highly dependent on the speed of the analog-to-digital converter (ADC). Assuming that multiple-channel simultaneous ADCs are not being used (which is the case for my system), the separation in time between the voltage and current measurements will be entirely a function of the conversion time: the measurement of the current can't start until the measurement of the voltage is complete. Similarly, the rate at which the 60 Hz voltage and current waveforms can be sampled is determined by how long it takes one voltage-current measurement pair to be made.
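To make that concrete: real power is just the average of the instantaneous voltage-times-current products, which is why each voltage sample wants a current sample taken as close to it in time as possible. Here's a minimal sketch of that accumulation; the pins, the mid-rail offset (512 counts), and the scale factors are placeholders, not my actual circuit:

```cpp
// Rough sketch: sample voltage then current back-to-back and accumulate
// the v*i product.  Pin numbers, the mid-rail offset, and the scale
// factors below are placeholders, not my actual hardware.
const int VOLTAGE_PIN = A0;
const int CURRENT_PIN = A1;
const float VOLTS_PER_COUNT = 0.32;   // set by the voltage divider
const float AMPS_PER_COUNT  = 0.05;   // set by the current sensor
const int SAMPLES_PER_CYCLE = 54;     // see the sample-rate math further down

void setup() {
  Serial.begin(9600);
}

void loop() {
  float sum = 0.0;
  for (int i = 0; i < SAMPLES_PER_CYCLE; i++) {
    // The current conversion can't start until the voltage conversion is
    // done, so the ADC conversion time sets how "simultaneous" the pair is.
    float v = (analogRead(VOLTAGE_PIN) - 512) * VOLTS_PER_COUNT;
    float a = (analogRead(CURRENT_PIN) - 512) * AMPS_PER_COUNT;
    sum += v * a;                     // instantaneous power
  }
  Serial.println(sum / SAMPLES_PER_CYCLE);  // average of v*i = real power
  delay(1000);
}
```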
For the sake of simplicity, the prototype system I'm building now is going to use the Arduino's on-board ADC. This ADC has a built-in multiplexer with a reputation for not being very fast, but since I'm planning on using an external multiplexer to accommodate all the power circuits I want to measure (the Arduino can only mux in 8 channels and my plan calls for 13-16), I can get away with using only a single channel on the on-board ADC.
The question that remains, then, is how fast the on-board ADC can do a conversion. My prowling of the internet turned up some very experienced people who have found a way to adjust the clock that the ADC uses when converting. The ADC clock runs at a scaled-down rate of the main system clock (set by a prescaler), and that scaling can be changed programmatically. Part of the documentation for the microprocessor used in my Arduino states:
> The ADC accuracy also depends on the ADC clock. The recommended maximum ADC clock frequency is limited by the internal DAC in the conversion circuitry. For optimum performance, the ADC clock should not exceed 200 kHz. However, frequencies up to 1 MHz do not reduce the ADC resolution significantly.
>
> Operating the ADC with frequencies greater than 1 MHz is not characterized.

Ignoring the actual results from the conversions, I decided to characterize the time it takes for the ADC to complete a given number of conversions. The results from this test are in the table below, showing the conversion time for varying numbers of conversions. The last column is the difference in the 1-million-conversion time compared to the previous prescaler value.
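For reference, here's a rough sketch of the kind of timing loop involved, assuming a stock ATmega328-based Arduino running at 16 MHz: the prescaler is set through the ADPS bits of ADCSRA (the documented mechanism for this), and micros() brackets a batch of analogRead() calls. This is illustrative rather than the exact test code.

```cpp
// Illustrative timing loop: set the ADC prescaler via the ADPS bits of
// ADCSRA (ATmega328), then time a batch of conversions with micros().
// The results are only stored, never used, just like the test described.
void setup() {
  Serial.begin(9600);

  // Divide-by-16 prescaler: 16 MHz / 16 = 1 MHz ADC clock.
  ADCSRA = (ADCSRA & ~((1 << ADPS2) | (1 << ADPS1) | (1 << ADPS0)))
           | (1 << ADPS2);

  const unsigned long COUNT = 100000UL;
  volatile int result;               // keeps the reads from being optimized away

  unsigned long start = micros();
  for (unsigned long i = 0; i < COUNT; i++) {
    result = analogRead(A0);         // one conversion, result just stored
  }
  unsigned long elapsed = micros() - start;

  Serial.print("us per conversion: ");
  Serial.println((float)elapsed / COUNT);
}

void loop() {}
```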
| Prescaler Value | 1k conv. | 10k conv. | 100k conv. | 1M conv. | Difference |
|-----------------|----------|-----------|------------|----------|------------|
| 2               | 7.52us   | 7.43us    | 7.42us     | 7.42us   | -          |
| 4               | 9.53us   | 9.32us    | 9.30us     | 9.30us   | 1.88us     |
| 8               | 12.78us  | 12.57us   | 12.55us    | 12.55us  | 3.25us     |
| 16              | 19.15us  | 19.04us   | 19.04us    | 19.03us  | 6.48us     |
| 32              | 32.19us  | 32.02us   | 32.00us    | 32.00us  | 13.00us    |
| 64              | 60.28us  | 60.24us   | 60.24us    | 60.24us  | 28.24us    |
| 128             | 112.24us | 112.02us  | 112.00us   | 112.00us | 51.76us    |
There are two reasons to feel confident in this data:
- The conversion time is reasonably consistent across the various numbers of conversions. There is not a dramatic difference in the calculated conversion time whether the loop contained one thousand conversions or one million.
- The difference in conversion times scales very nearly linearly with the prescaler value.
Based on these results and the documentation's note that the ADC should operate reasonably well up to 1 MHz, I am planning on setting the prescaler to 16, giving me that 1 MHz ADC clock. Doing the math: 16 channels at a conversion time of 19us per channel gives a total of 304us to sample all channels. With a 60 Hz waveform, this allows 54 samples across all channels in one cycle. This is the maximum conversion rate I should expect; the test above does nothing with the conversion results except store them. The arithmetic I need to do will slow the process down; characterizing that will be another test I'll have to do once the hardware is complete and I have a fuller start on the software.
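Here is the same arithmetic written out as a trivial sketch so it can be re-run for a different channel count or conversion time; the 19us figure is the measured prescaler-16 value from the table, and 16 channels is the high end of my 13-16 plan:

```cpp
// Sample-budget arithmetic from the paragraph above.
const float US_PER_CONVERSION = 19.0;              // measured at prescaler 16
const int   CHANNELS          = 16;                // high end of the 13-16 plan
const float LINE_PERIOD_US    = 1000000.0 / 60.0;  // one 60 Hz cycle, ~16667 us

void setup() {
  Serial.begin(9600);
  float usPerSweep     = US_PER_CONVERSION * CHANNELS;  // 304 us for all channels
  float sweepsPerCycle = LINE_PERIOD_US / usPerSweep;   // ~54.8 -> 54 full sweeps
  Serial.print("us to sweep all channels: ");
  Serial.println(usPerSweep);
  Serial.print("full sweeps per 60 Hz cycle: ");
  Serial.println((int)sweepsPerCycle);
}

void loop() {}
```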
You say you were (for the moment) ignoring the results of the conversions, which is fair.
Because you're counting on being able to sample each channel only once and getting an accurate result, I think you should test to make sure that's the case. I envision feeding a non-zero DC voltage into your mux, switching away to a different channel and waiting a bit, switching back and doing a conversion as fast as you can, then doing (say) a thousand conversions. Calculate the mean and SD of the thousand and find the Z-score of your first conversion to ensure that taking a single reading as quickly as possible really does give you a representative sample.
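For what it's worth, here is a rough sketch of the check being suggested; the mux-select routine and channel numbers are placeholders, since the real mux control isn't built yet:

```cpp
// Sketch of the suggested check: switch the mux away from a channel, switch
// back, take one reading as fast as possible, then take many more readings
// and see how many standard deviations the first one sits from their mean.
// The mux-select routine and channel numbers here are placeholders.
const int ADC_PIN = A0;              // mux output feeds this pin
const int N = 1000;

void selectMuxChannel(int ch) {
  // Placeholder: would drive the external mux select lines for channel 'ch'.
}

void setup() {
  Serial.begin(9600);

  selectMuxChannel(3);               // sit on some other channel for a bit
  delay(10);
  selectMuxChannel(0);               // switch back to the channel under test
  int first = analogRead(ADC_PIN);   // the single fast reading in question

  // Accumulate deviations from 'first' (keeps the float math well-behaved).
  float sum = 0, sumSq = 0;
  for (int i = 0; i < N; i++) {
    float d = analogRead(ADC_PIN) - first;
    sum   += d;
    sumSq += d * d;
  }
  float mean = sum / N;                        // mean deviation from 'first'
  float sd   = sqrt(sumSq / N - mean * mean);  // SD of the thousand readings

  // 'first' sits at deviation zero, so its Z-score is (0 - mean) / sd.
  float z = (sd > 0) ? (-mean / sd) : 0;
  Serial.print("Z-score of first reading: ");
  Serial.println(z);
}

void loop() {}
```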