[Libre-soc-dev] dirty coding and dependency hell (was Re: daily kan-ban update 10nov2020)
whygee at f-cpu.org
Sat Nov 14 02:02:52 GMT 2020
On 2020-11-13 18:55, Jacob Lifshay wrote:
> On Fri, Nov 13, 2020, 04:49 Luke Kenneth Casson Leighton
> <lkcl at lkcl.net> wrote:
>> or throw it out and start again. and this is again perfectly fine
>> because the code is so ridiculously short.
>> the technique is called "rapid prototyping" and it is an extremely
>> valuable skill to acquire.
> Yeah, I've definitely done that before, though I've often used bash
> or python for that -- if it can't easily be done in bash the program is
> probably complex enough that you'd want to do a more principled design.
I've indeed been using bash exclusively for the "glue" of my own system.
VHDL can handle amazing things and deals with the complex stuff,
while bash makes it all work together with a pinch of grep/sed.
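A minimal sketch of that kind of glue, assuming an invented log format and file name (neither is from the post): run something, then let grep/sed decide whether it passed and extract the interesting detail.

```shell
#!/bin/sh
# Hypothetical glue: the log file and its message format are invented
# stand-ins for a simulator run; only the grep/sed idiom is the point.
printf 'note: stage 1 ok\nerror: assertion failed at t=40ns\n' > sim.log
if grep -q 'assertion failed' sim.log; then
    # pull the timestamp out of the failing line
    echo "FAIL at $(sed -n 's/.*at \(t=[0-9]*ns\).*/\1/p' sim.log)"
else
    echo "PASS"
fi
rm -f sim.log
```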
This dichotomy has worked pretty well for me so far:
* VHDL implements my CPU's assembler like a champ, because it handles
string operations among many other features that Verilog users can only
dream about (they instinctively turn to metalanguages or C/C++
layers). It also performs the static and dynamic analysis
of the netlist: I can inject faults in a given gate to check the BIST,
estimate the logic depth of a pipeline stage... And GHDL can compile
the VHDL source code into an actual stand-alone program!
* Bash is an overblown application-specific language that does much more
than it was meant to do, in ways too obscure to even care about; the
syntax is sometimes confusing, but everybody has a bash somewhere and it
gets the dirty job done. I even made a crude XML parser with it. Yuk.
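For flavour, here is what such a crude "XML parser" might look like -- a sketch in the same spirit, not the author's actual script; the file name and the <name> tag are invented:

```shell
#!/bin/sh
# Crude shell-only XML extraction (illustrative; breaks on attributes,
# nesting across lines, CDATA, etc. -- hence the "Yuk").
printf '<config><name>core0</name><name>core1</name></config>\n' > cfg.xml
# pull out each <name> element, then strip the tags
grep -o '<name>[^<]*</name>' cfg.xml | sed 's/<[^>]*>//g'
rm -f cfg.xml
```

This prints one element's text per line, which is usually enough for simple config extraction.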
Verification tests require running hundreds of independent instances of
the VHDL simulator, so yesterday I implemented a parallel thread
dispatcher. Some would say to use xargs or make -j instead, but this way
I get much finer control over the details, the performance gain is the
same, and the dependency count stays low.
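A minimal sketch of such a dispatcher in plain bash, under stated assumptions: run_one is a hypothetical stand-in for one simulator instance (e.g. one GHDL run), and the job list and slot count are invented. Only the throttling idiom is the point.

```shell
#!/bin/bash
# Keep at most MAXJOBS background jobs running at once.
MAXJOBS=4
run_one() {
    sleep 0.2                 # pretend to run one simulator instance
    echo "job $1 finished"
}
for job in 1 2 3 4 5 6 7 8; do
    while [ "$(jobs -r | wc -l)" -ge "$MAXJOBS" ]; do
        sleep 0.05            # all slots busy: poll until one frees up
    done
    run_one "$job" &
done
wait                          # drain the remaining jobs
echo "all 8 jobs done"
```

Polling `jobs -r` keeps it dependency-free; a fancier version could use `wait -n` (bash >= 4.3) to avoid the sleep loop.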
The point of using fewer tools and stretching them beyond what is
reasonable is called "simplicity". I've been hit by "dependency hell",
"deprecation horror" and all other sorts of nightmares and time wasters
over 25+ years. The fewer the languages and external dependencies, the
longer the code base remains relevant and the more useful and reusable
the work becomes.
I have tested my current system with someone else: the hardest part was
installing GHDL (an old, known issue). Had I used a third-party XML
parser, that dependency would have slowed things down and made things
uselessly harder, since the parser is not a critical part. The crazy
part is that we are now brain-wired to rely on the "classic UNIX/Linux
environment", but it constantly changes, and Microsoft throws even more
uncertainty into the mix with their WSL system, so nothing should be
taken for granted.
I want my code to work reasonably well on 10-year-old computers, and
just as well in 10 years' time, without having to invent crutches all
the time.
If you don't count Magic, most EDA tools have come and gone.
The field evolves so fast, and projects only last for as long as they
are commercially relevant, so holding on to an IC code base for more
than a few years rarely happens.
Another exception is the LEON family of softcores, which are written
in high-quality VHDL and have thrived since 1997. They have become
a "benchmark", even for evaluating EDA tools' performance or compliance.
I wish Libre-SoC could emulate this model.
I'm sure Luke and Jacob are very sensitive to these arguments.
I also understand very well that today's priority is "rush to tape-out".
I hope the rush will not let complex errors creep into the silicon.
Once the proto is successful, I hope you all will clean up the code
base and cut as many dependencies as possible, in order to make your
work more useful and used by others, bringing more eyeballs and
developers. Because the first and most prevalent obstacle to adoption is
dependency hell.
Good luck everybody, I'm still watching and excited to see the results!