On Wed, 2 Sep 2020 at 17:35, Philippe Mathieu-Daudé <f4...@amsat.org> wrote:
> Peter said "'clock' is basically meaningless for virt machines",
>
> I understand and agree. But how to make that explicit/obvious in
> the code, when a device expects a clock frequency/period?

When a particular *device* needs a clock, it presumably has a
defined purpose for it, and we can pick a suitable frequency
for it at that point.

> See for example hw/riscv/virt.c, it uses the following (confusing
> to me) in virt_machine_init():
>
>    serial_mm_init(system_memory, memmap[VIRT_UART0].base,
>        0, qdev_get_gpio_in(DEVICE(mmio_plic), UART0_IRQ), 399193,
>        serial_hd(0), DEVICE_LITTLE_ENDIAN);

In this case, the board has a model of a 16550A UART on it,
which uses its input clock to determine what the actual baud
rate is for particular guest settings of the divisor registers.
So we need to look at:
 * what does guest software expect the frequency to be?
 * what is a "good" frequency which gives the guest the best
   possible choices of baud rate?
and also at whether we need to tell the guest the frequency
via a device tree or other mechanism.

In some devices the input clock genuinely doesn't affect the
guest-visible behaviour, in which case we can pick an arbitrary
or conventional value, or just implement the device to work OK
without a clock connected.

Note also that we don't have to make a single decision for the
whole board -- we can run different devices with different clocks
if that makes sense.

Incidentally, the serial.c API seems to be slightly confused
about the difference between setting the baudbase as a value
in bps (i.e. the traditional 9600, 115200, etc values) and
setting a frequency (where 115200 bps corresponds to a
1.8432MHz clock, apparently). Luckily nobody uses the
serial_set_frequency() function :-)

thanks
-- PMM