[libre-riscv-dev] pipeline stages controlling delays
Luke Kenneth Casson Leighton
lkcl at lkcl.net
Sun Apr 7 04:45:56 BST 2019
i found a simpler case, involving just a single-stage pipeline with
buffering switched *off*. d_ready and d_valid are still staggered;
however, the latency between them turns out not to matter. what matters
is whether the "input ready" signal is permanently asserted HI (always
succeeds) or driven randomly (fails).
you can see in the attached trace that the top run is "success" and the
bottom is "fail". the n_i_ready signal is driven in both runs; however,
*only* when it is combined with d_ready does it produce the *actual*
"ready" signal (n_i_rdy_data). in the "success" case, n_i_ready is
always HI, which makes d_ready effectively "the" n_i_ready signal, and
thus gives us the OLD (known-good) behaviour. it's when d_ready is
combined with a randomly-driven n_i_ready that the problems start,
which makes no sense.
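to illustrate the combination being described (a plain-python sketch,
not the actual nmigen code from the attachment; the function name
combined_ready is hypothetical, only the signal names n_i_ready,
d_ready and n_i_rdy_data come from the message above):

```python
import random

def combined_ready(d_ready, n_i_ready):
    # the *actual* "ready" (n_i_rdy_data above): a data transfer may
    # only occur on a cycle where the stage's own d_ready AND the
    # downstream n_i_ready are both HI.
    return d_ready and n_i_ready

rng = random.Random(42)

# "success" case: n_i_ready held permanently HI. the AND collapses,
# so d_ready is effectively "the" n_i_ready signal.
for _ in range(100):
    d_ready = rng.choice([True, False])
    assert combined_ready(d_ready, n_i_ready=True) == d_ready

# "fail" case: n_i_ready driven randomly. the combined ready now
# deasserts on cycles where d_ready alone would have accepted data,
# which is the condition the trace shows failing.
hits = sum(combined_ready(rng.choice([True, False]),
                          rng.choice([True, False]))
           for _ in range(1000))
print(hits)  # count of cycles on which a transfer would occur
```

note that the AND itself is the standard ready/valid combining rule;
a correct stage should tolerate a randomly-deasserting downstream
ready, which is why the observed failure "makes no sense".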
-------------- next part --------------
A non-text attachment was scrubbed...
Size: 78046 bytes
Desc: not available