author    | Ian Abbott <abbotti@mev.co.uk> | 2012-08-31 20:41:29 +0100
committer | Greg Kroah-Hartman <gregkh@linuxfoundation.org> | 2012-09-04 11:55:38 -0700
commit    | e6391a182865efc896cb2a8d79e07b7ac2f45b48
tree      | 2cf49d2f3517e2d67e684dea5444efe3063796c8
parent    | 4a7a4f95a5e15648f24a971b36b82adc36d2cb6b
staging: comedi: das08: Correct AI encoding for das08jr-16-ao
The element of `das08_boards[]` for the 'das08jr-16-ao' board has the
`ai_encoding` member set to `das08_encode12`. It should be set to
`das08_encode16`, the same as for the 'das08jr/16' board. After all, this board
has 16-bit AI resolution.
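
Concretely, the fix amounts to a one-line change in the 'das08jr-16-ao' entry of `das08_boards[]`. A sketch of the hunk, assuming the usual staging path for this driver (the hunk header is illustrative):

```diff
--- a/drivers/staging/comedi/drivers/das08.c
+++ b/drivers/staging/comedi/drivers/das08.c
@@ das08_boards[]: 'das08jr-16-ao' entry @@
-		.ai_encoding	= das08_encode12,
+		.ai_encoding	= das08_encode16,
```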
The description of the A/D LSB register at offset 0 in the user manual
"cio-das08jr-16-ao.pdf" seems incorrect, as it implies that the AI
resolution is only 12 bits. The diagrams of the A/D LSB and MSB
registers show 15 data bits and a sign bit, which matches what the
software expects for the `das08_encode16` AI encoding method.
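
For illustration, a minimal sketch (not the driver's actual code) of how a sample with 15 data bits plus a sign bit might be unpacked into the 16-bit value the `das08_encode16` method implies; the helper name and the sign convention are assumptions:

```c
#include <stdint.h>

/*
 * Hypothetical decoder for a sign-magnitude A/D sample: `msb` and
 * `lsb` are the raw register bytes, bit 7 of `msb` is assumed to be
 * the sign bit, and the remaining 15 bits are the magnitude.  The
 * result is offset binary, with 0x8000 representing zero volts.
 */
static uint16_t decode16_sample(uint8_t msb, uint8_t lsb)
{
	/* 15 data bits: 7 from the MSB register, 8 from the LSB */
	uint16_t magnitude = ((uint16_t)(msb & 0x7f) << 8) | lsb;

	if (msb & 0x80)		/* assumed: sign bit set => positive */
		return 0x8000 + magnitude;
	else
		return 0x8000 - magnitude;
}
```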
Cc: stable <stable@vger.kernel.org>
Signed-off-by: Ian Abbott <abbotti@mev.co.uk>
Signed-off-by: Greg Kroah-Hartman <gregkh@linuxfoundation.org>