
2022-12-19

Tools: redo (part 4) CFLAGS and friends, env/VAR, default.run.do

tags: software


Content

Part 0: Intro

Part 1: Hello, world!

Part 2: Automatic Recording of Dependencies on Header Files

Part 3: CFLAGS and friends, config.sh, compile.do

Part 4: CFLAGS and friends, env/VAR, default.run.do

Part 5: Auto-update BUILDDATE in version.h

Part 6: The yacc/bison problem: one call produces two artifacts

Part 7: Test: Generator for N source files


My code featured in this series can be found at

https://git.sr.ht/~ew/ew.redo/


Part 4: How about CFLAGS and friends? env/VAR, default.run.do


In part 3 I detailed one option for dealing with an equivalent of the CFLAGS et al. makefile variables. I could have stopped there, but I did not like that "compile" and all equivalent scripts sourcing config.sh were going to be rebuilt even if they did not pick up on a given change. Variables may be used separately in different parts of the build. So I wanted to have each variable in a separate file (which can be versioned), so that compile depends exactly on CC and CFLAGS, and not on LIBS or other existing but unused variables. Now the plan is quite different:


each variable is represented by a file env/VARNAME

the file's name represents the name of said variable, while

the file's content represents the value of said variable

compile.in is a template including %%VARNAME%% labels

default.run.do will read compile.in and produce compile.run, a) expanding all %%VARNAME%% labels found, and b) recording a dependency of compile.run on each env/VARNAME used

compile.run is called in default.o.do

link.run (as a separate use case) is called in hello.do
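As a concrete sketch of that layout: the CFLAGS value matches the one used below, while the value of CC is my assumption. Populating env/ and reading a variable back is plain shell:

```shell
# one file per variable; the file name is the variable name,
# the file content is its value
mkdir -p env
printf '%s\n' 'cc' > env/CC                       # assumed compiler
printf '%s\n' '-O2 -g -Wpedantic -Wall' > env/CFLAGS

# reading a variable back is a plain shell read
read CFLAGS < env/CFLAGS
echo "$CFLAGS"
```

Since each variable lives in its own file, redo can track each one individually.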



So let's look at CFLAGS and trace its way through the build. The value of CFLAGS is the content of the file env/CFLAGS:


-O2 -g -Wpedantic -Wall

CFLAGS is referenced in the template compile.in


# compile.in
%%CC%% %%CFLAGS%% -c -MMD -MF "$2.d"  -o "$3" "$2.c"
read DEPS < "$2.d"
redo-ifchange ${DEPS#*:}
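The last line of the template uses the "${DEPS#*:}" parameter expansion to strip the "target:" prefix from the .d file, leaving only the prerequisites for redo-ifchange. A standalone illustration (the file names are made up):

```shell
# a typical one-line .d file as produced by "cc -MMD"
DEPS='hello.o: hello.c hello.h util.h'
# remove the shortest prefix matching '*:', i.e. "hello.o:"
echo "${DEPS#*:}"
```

This prints just the prerequisite list, which is exactly the argument list redo-ifchange expects.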

The "%%VAR%%" notation is inspired by the documentation of Avery Pennarun's implementation of redo. The template is transformed into a .run script by default.run.do:


#!/bin/bash
# from $target.in create $target.run, using env/VAR files.
if [ -e "$2.in" ]
then
    # find out, which env/VARs are used by $2.in
    declare -g VARLIST=""
    for V in $( cd ./env || exit 99; ls -1 | grep -v '~$')
    do
        if [ $( grep -c "%%${V}%%" "$2.in") -gt 0 ]
        then
            VARLIST="${VARLIST} $V"
        fi
    done

    # depend on those found (not all available VARs); NB: greedy replace!
    redo-ifchange "$2.in" ${VARLIST// / env\/}

    # do "read VAR < env/VAR; ..." for all variables found
    eval $(for V in ${VARLIST}; do echo "read $V < env/$V; "; done)

    # generate sed stmt as array of s/// statements; beware of quoting!
    declare -a SEDLIST
    i=0
    for V in ${VARLIST}; do
        # use '|' delimiters, since ${!V} is likely to contain '/'
        SEDLIST[$i]="-e s|%%${V}%%|${!V}|g"
        i=$((i+1))
    done
    # edit template and create output file
    touch "$3" && chmod ug+x "$3" && \
    sed "${SEDLIST[@]}"  <"$2.in" >"$3"
else
    echo "$0: Fatal: don't know how to build $1..." >&2
    exit 99
fi

This script is not small, and it uses the "${!V}" bashism; therefore a shebang line calling bash is needed. How does the script operate? It scans the template for every variable defined in the ./env subdirectory. If a variable is used, its name is added to VARLIST. VARLIST is then used to generate a command list for sed that substitutes all occurrences of %%VAR%% with the content of the file env/VAR.
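To see the mechanism in isolation, here is a condensed, self-contained sketch of the same scan-and-substitute steps (the env/ values are examples; the real default.run.do additionally records the dependencies via redo-ifchange and sets the executable bit):

```shell
# requires bash (arrays)
# set up example inputs: two env/ files and a tiny template
mkdir -p env
printf '%s\n' 'cc' > env/CC
printf '%s\n' '-O2 -g -Wall' > env/CFLAGS
printf '%s\n' '%%CC%% %%CFLAGS%% -c -o "$3" "$2.c"' > compile.in

# scan: which env/ variables does the template actually use?
VARLIST=""
for V in $(ls env); do
    grep -q "%%${V}%%" compile.in && VARLIST="${VARLIST} ${V}"
done

# substitute: build one s/// command per used variable
declare -a SEDLIST
for V in ${VARLIST}; do
    read VAL < "env/${V}"
    # '|' delimiters, since the value may contain '/'
    SEDLIST+=(-e "s|%%${V}%%|${VAL}|g")
done
sed "${SEDLIST[@]}" < compile.in
```

This prints the expanded compile command: cc -O2 -g -Wall -c -o "$3" "$2.c" (the "$2"/"$3" positional parameters are left alone and only get their values when redo runs the generated script).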


We do gain accurate dependencies indeed:


shell$ redo-dot | grep env
        "compile.run" -> "env/CC"
        "compile.run" -> "env/CFLAGS"
        "link.run" -> "env/LIBS"
        "link.run" -> "env/LINK"

We also gain visible variables, since they are just files.


So, how do we fare? I just added another level of indirection, and I did not reduce complexity. It can be argued that this solution is maybe not what is absolutely needed. I am not convinced yet that this is the way to go.



Home
