Using libftdi 1.5 with an FT232BL, I've run into a situation in which
bit-bang mode causes the device to fill some internal receive buffer with
unwanted data that apparently cannot be purged using the normal buffer
flush routines.

My use case requires that one of the pins on the FT232 be used in async
bit-bang mode (BITMODE_BITBANG) for a couple of seconds, and then promptly
return to serial/FIFO mode to read data that the remote side will begin
sending. The problem: after returning to normal serial mode, calls to
ftdi_read_data() return immediately with bytes of value 0xFC or 0xFF
(which I'm guessing represent the bit-bang pin state?). Only after 256
such bytes have been read does ftdi_read_data() begin to return bytes
that actually arrived on the serial lines. The number of 0xFC/0xFF bytes
in the buffer seems to correspond to the amount of time spent in bit-bang
mode, but the buffer fills quickly -- all 256 bytes accumulate within a
few milliseconds.
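
For reference, here is a stripped-down sketch of the sequence I'm using
(the VID/PID, the choice of pin, and the baud rate are just placeholders
for what the real code does):

#include <stdio.h>
#include <unistd.h>
#include <ftdi.h>

int main(void)
{
    struct ftdi_context *ftdi = ftdi_new();
    unsigned char buf[256];

    if (ftdi_usb_open(ftdi, 0x0403, 0x6001) < 0) {   /* placeholder VID/PID */
        fprintf(stderr, "open failed: %s\n", ftdi_get_error_string(ftdi));
        return 1;
    }

    /* Drive one pin (D0 here, purely as an example) in async bit-bang mode */
    ftdi_set_bitmode(ftdi, 0x01, BITMODE_BITBANG);
    unsigned char level = 0x01;
    ftdi_write_data(ftdi, &level, 1);
    sleep(2);                            /* a couple of seconds in bit-bang mode */

    /* Return to normal serial mode and read what the remote side sends */
    ftdi_set_bitmode(ftdi, 0x00, BITMODE_RESET);
    ftdi_set_baudrate(ftdi, 9600);       /* placeholder baud rate */

    int n = ftdi_read_data(ftdi, buf, sizeof(buf));
    /* n > 0 immediately, but buf[] holds 0xFC/0xFF until 256 such bytes
       have been drained; only then does real serial data show up */
    printf("read %d bytes, first = 0x%02X\n", n, (unsigned)(n > 0 ? buf[0] : 0));

    ftdi_usb_close(ftdi);
    ftdi_free(ftdi);
    return 0;
}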

Calling ftdi_tciflush() after leaving bit-bang mode does not appear to have
any effect; ftdi_read_data() will still return the unwanted data first.
What can I do to purge only the bit-bang receive data after returning to
serial mode? Is there a way to prevent it from being buffered in the first
place?
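
For completeness, this is the cleanup I've been attempting after leaving
bit-bang mode (continuing the sketch above, with the same ftdi context
and buf):

    /* Back in serial mode; try to drop whatever bit-bang mode left behind */
    ftdi_set_bitmode(ftdi, 0x00, BITMODE_RESET);
    if (ftdi_tciflush(ftdi) < 0)
        fprintf(stderr, "tciflush failed: %s\n", ftdi_get_error_string(ftdi));

    /* The 0xFC/0xFF bytes are still returned ahead of the real serial data */
    int n = ftdi_read_data(ftdi, buf, sizeof(buf));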

--Colin


