• What’s Your Least Favourite Programming Language?

    From Lawrence D’Oliveiro@[email protected] to comp.lang.misc on Fri Oct 31 06:48:08 2025
    From Newsgroup: comp.lang.misc

    As their soundcheck question for 2024, Computerphile asked their
    interviewees what their least favourite programming language was <https://www.youtube.com/watch?v=03lRzf7iSiU>.

    The most popular answer was JavaScript, with 4 votes. There were 2
    votes for PHP, and one each for Lisp and Python.

    The Python-hater didn’t like dynamic typing. Given how many
    dynamically-typed languages there are (including lots older than
    Python), how come Python was the first one he thought of?

    As for JavaScript, I think it’s misunderstood. The only one who gave a
    reason for his dislike gave an outdated reason -- scope hoisting. That doesn’t have to apply any more, if you avoid “var” declarations, and
    also use strict mode to avoid implicit globals.
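
    A minimal sketch of the difference:

    "use strict"

    function demo()
      {
        let x = 1 /* block-scoped, and not usable before this line, unlike “var” */
        console.log(x)
        y = 2 /* without strict mode this silently creates a global;
                 with it, a ReferenceError */
      } /*demo*/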

    I imagine PHP would have got more votes, if more people had had to use
    it.

    One mentioned COBOL (which for him was worse than Fortran), but nobody
    thought of BASIC. I guess that is now so far in the past, many among
    the interviewees wouldn’t even have any memories of using it ...
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From David Brown@[email protected] to comp.lang.misc on Fri Oct 31 13:28:36 2025
    From Newsgroup: comp.lang.misc

    On 31/10/2025 07:48, Lawrence D’Oliveiro wrote:
    As their soundcheck question for 2024, Computerphile asked their
    interviewees what their least favourite programming language was <https://www.youtube.com/watch?v=03lRzf7iSiU>.

    The most popular answer was JavaScript, with 4 votes. There were 2
    votes for PHP, and one each for Lisp and Python.

    The Python-hater didn’t like dynamic typing. Given how many dynamically-typed languages there are (including lots older than
    Python), how come Python was the first one he thought of?

    As for JavaScript, I think it’s misunderstood. The only one who gave a reason for his dislike gave an outdated reason -- scope hoisting. That doesn’t have to apply any more, if you avoid “var” declarations, and also use strict mode to avoid implicit globals.

    I imagine PHP would have got more votes, if more people had had to use
    it.

    One mentioned COBOL (which for him was worse than Fortran), but nobody thought of BASIC. I guess that is now so far in the past, many among
    the interviewees wouldn’t even have any memories of using it ...

    I suppose it is all about which languages people know and use. It would
    be unreasonable to answer "Brainfuck", given that very few people have
    written code in it. (I expect more people have written Brainfuck
    interpreters than programs in Brainfuck.)

    So for the Python-hater, Python was presumably the only dynamically
    typed language they used. And very few people these days have much
    experience with COBOL.

    Did you have a "least favourite" language yourself?

    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From John Ames@[email protected] to comp.lang.misc on Fri Oct 31 08:17:42 2025
    From Newsgroup: comp.lang.misc

    On Fri, 31 Oct 2025 06:48:08 -0000 (UTC)
    Lawrence D’Oliveiro <[email protected]d> wrote:
    One mentioned COBOL (which for him was worse than Fortran), but nobody thought of BASIC. I guess that is now so far in the past, many among
    the interviewees wouldn’t even have any memories of using it ...
    And/or that anybody using BASIC in a modern context is gonna be using
    something like FreeBASIC, which is a vastly improved and perfectly
    reasonable little language compared to the early microcomputer BASICs.
    Re: Javascript, it's true that a lot of improvements have been made to
    it, but from a certain perspective it's all lipstick on a pig; there's
    been so many things slapped on to paper over some poor initial design
    decisions or chase trends in web development over the years that at
    this point it's a fossil shale of a language.
    Anyway, there's things to dislike about most any language, but there
    aren't too many that I'd universally condemn in my own assessment.
    Pascal gets a frowny-face for design decisions that should never, ever
    have made it past the initial draft of the first paper (making array
    size part of the type specification was braindead from the start - an impediment to good design *and* a burden on performance, all in the
    name of avoiding a problem that there were much better solutions for,)
    and while newer iterations have improved things somewhat, it would've
    been better to throw it out and re-do from scratch...*but* the FP folks
    do put a lot of effort into making it a full-featured and surprisingly
    portable platform for development.
    Another language that feels needlessly gross and tedious is Java - it's
    just *unreasonably* verbose, to the point where one is tempted to
    employ a macro preprocessor just to condense sesquipedalian nonsense
    like System.out.println() down to furshlugginer print() and so forth.
    Any language that *requires* an IDE with weapons-grade autocomplete
    deserves censure...but then it's somehow still the best solution we've
    got for write-once-run-anywhere development, which is maddening :/
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From bart@[email protected] to comp.lang.misc on Fri Oct 31 22:47:16 2025
    From Newsgroup: comp.lang.misc

    On 31/10/2025 15:17, John Ames wrote:
    On Fri, 31 Oct 2025 06:48:08 -0000 (UTC)
    Lawrence D’Oliveiro <[email protected]d> wrote:

    One mentioned COBOL (which for him was worse than Fortran), but nobody
    thought of BASIC. I guess that is now so far in the past, many among
    the interviewees wouldn’t even have any memories of using it ...

    And/or that anybody using BASIC in a modern context is gonna be using something like FreeBASIC, which is a vastly improved and perfectly
    reasonable little language compared to the early microcomputer BASICs.

    Re: Javascript, it's true that a lot of improvements have been made to
    it, but from a certain perspective it's all lipstick on a pig; there's
    been so many things slapped on to paper over some poor initial design decisions or chase trends in web development over the years that at
    this point it's a fossil shale of a language.

    Anyway, there's things to dislike about most any language, but there
    aren't too many that I'd universally condemn in my own assessment.
    Pascal gets a frowny-face for design decisions that should never, ever
    have made it past the initial draft of the first paper (making array
    size part of the type specification was braindead from the start - an impediment to good design *and* a burden on performance, all in the
    name of avoiding a problem that there were much better solutions for,)
    and while newer iterations have improved things somewhat, it would've
    been better to throw it out and re-do from scratch...*but* the FP folks
    do put a lot of effort into making it a full-featured and surprisingly portable platform for development.

    Another language that feels needlessly gross and tedious is Java - it's
    just *unreasonably* verbose, to the point where one is tempted to
    employ a macro preprocessor just to condense sesquipedalian nonsense
    like System.out.println() down to furshlugginer print() and so forth.

    Zig is worse ('i' is an integer):

    const std = @import("std");

    std.debug.print("{} {}\n", .{i, @sqrt(@as(f64, @floatFromInt(i)))});


    This is the equivalent in my language:

    println i, sqrt(i)

    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Lawrence D’Oliveiro@[email protected] to comp.lang.misc on Sat Nov 1 06:26:55 2025
    From Newsgroup: comp.lang.misc

    On Fri, 31 Oct 2025 08:17:42 -0700, John Ames wrote:

    Re: Javascript, it's true that a lot of improvements have been made
    to it, but from a certain perspective it's all lipstick on a pig;
    there's been so many things slapped on to paper over some poor
    initial design decisions or chase trends in web development over the
    years that at this point it's a fossil shale of a language.

    I don’t think it is. It has lexical binding, and functions as
    first-class objects! So has Python, but what else can you name that
    has that? Strict mode and let/const-in-place-of-var lets you get rid
    of a lot of the boneheadedness. The statement-continuation rule is
    weird, but manageable.

    My main use of it has been in web pages, where I maybe write a few
    hundred lines of it at a time, commonly less. From a recent bit of
    fun I did for a friend, here is the setup of the column headings for
    a table of data; the headings are clickable to set the table sort order:

    const row = document.createElement("tr")
    const cell = document.createElement("th")
    cell.textContent = ""
    row.appendChild(cell)
    let fieldindex = 0
    for (const name of fieldnames)
      {
        const cell = document.createElement("th")
        const sortbut = document.createElement("button")
        sortbut.setAttribute("onclick", "mymod.set_sort_order(" + fieldindex.toString() + ")")
        sortbut.textContent = name
        cell.appendChild(sortbut)
        row.appendChild(cell)
        ++fieldindex
      } /*for*/
    thead.appendChild(row)

    Here’s the function that sets the sort order (note the use of lexical binding):

    function set_sort_order(fieldindex)
      /* Note that each change of sort order is applied on top of
        the previous sort order. To start again from the default
        ordering, refresh the page. */
      {
        if (sortcol != fieldindex)
          {
            sortcol = fieldindex
            fieldvalues.sort
              (
                function (a, b)
                  {
                    const fieldtype = fieldtypes[fieldindex]
                    const conv = sort_conv[fieldtype]
                    const key_a = conv(a[fieldindex])
                    const key_b = conv(b[fieldindex])
                    return key_a < key_b ? -1 : key_a > key_b ? 1 : 0
                  } /*function*/
              )
            load_page()
          } /*if*/
      } /*set_sort_order*/

    See the reference to that “sort_conv” table? It is keyed off a column
    type, to define the appropriate sort order for that column. For
    example, a column of quarterly dates has values like “Q3'23” and “Q1'24”, and you want the former to sort before the latter. In a
    column of numbers with units, you want a value beginning “2G” to sort before one beginning “100M”. And so on.
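
    A cut-down sketch of what such a table might look like (the type names
    here are just made-up examples):

    const sort_conv =
        {
            "text" : s => s, /* plain string comparison */
            "quarter" : /* “Q3'23” becomes 23 * 4 + 3, so Q3'23 sorts before Q1'24 */
                s => parseInt(s.slice(3), 10) * 4 + parseInt(s.charAt(1), 10),
            "scaled" : /* numbers with units, e.g. “100M”, “2G” */
                s => parseFloat(s) * {"K" : 1e3, "M" : 1e6, "G" : 1e9}[s.charAt(s.length - 1)],
        } /*sort_conv*/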
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From BGB@[email protected] to comp.lang.misc on Mon Nov 3 14:05:12 2025
    From Newsgroup: comp.lang.misc

    On 10/31/2025 10:17 AM, John Ames wrote:
    On Fri, 31 Oct 2025 06:48:08 -0000 (UTC)
    Lawrence D’Oliveiro <[email protected]d> wrote:

    One mentioned COBOL (which for him was worse than Fortran), but nobody
    thought of BASIC. I guess that is now so far in the past, many among
    the interviewees wouldn’t even have any memories of using it ...

    And/or that anybody using BASIC in a modern context is gonna be using something like FreeBASIC, which is a vastly improved and perfectly
    reasonable little language compared to the early microcomputer BASICs.


    In one project, I ended up implementing a BASIC dialect partly inspired
    by early Unstructured BASIC dialects. Though, in this case, this was
    more because it allowed writing an interpreter in roughly 1000 lines of C.

    Though, I ended up extending it in some non-standard ways that didn't
    really match up with either 80s BASIC, or the direction the language
    went in the 90s.

    But, the use-case I had didn't really need it to go in the 90s
    direction, and keeping the interpreter small also did not favor going
    that direction (where the 90s dialects more went in the direction of
    trying to turn it into a more general purpose programming language).


    It went from 80s style Unstructured BASIC to:
    Unstructured BASIC, but with dynamic scoping and return values...
    And, CSG / vector stuff.

    Trying to do a clean-up of the core idea of what it became ended up
    sorta like this:
    https://pastebin.com/2pEE7VE8



    Re: Javascript, it's true that a lot of improvements have been made to
    it, but from a certain perspective it's all lipstick on a pig; there's
    been so many things slapped on to paper over some poor initial design decisions or chase trends in web development over the years that at
    this point it's a fossil shale of a language.


    Did another recent experiment, where I tried stripping JS back to a
    minimal core and implementing something "roughly kinda similar" but
    trying to write the interpreter in a way that minimized line count.

    Got something written in around 2700 lines of C (over around 3 days).

    FWIW: https://github.com/cr88192/bgbtech_misc/tree/master/vm_bs3l


    This reuses a few ideas from the design of the BASIC dialect, in places
    where it could save implementation complexity without trashing the
    language design too much.



    Was still lacking various language constructs (like "switch()"), but
    alas. I also went with dynamic scoping, mostly because I can implement
    dynamic scoping using less code than lexical scoping.

    Also, absent additional analysis steps or similar, and thus code
    complexity, full lexical scoping is prone to rapidly leak memory. The
    main way to avoid the massive memory leak is to detect cases of
    non-capture and either fall back to a C-like scoping model, or at least destroy the non-captured frames.

    This is a non-issue with dynamic scoping, even if, potentially, using
    dynamic scoping as the default is itself a foot gun (in terms of semantics).


    Still not really well tested. I didn't have an immediate use-case, so I
    more just threw it together as a test/proof of concept.


    It isn't really meant to be either well written, or particularly fast.
    It parses the source into an AST and then just sort of uses a tree-walking interpreter.

    Ultimately, it is maybe kinda moot, as pretty much any language/interpreter/compiler that sees significant use is almost
    invariably prone to turn into a bloated monstrosity.


    There are also limits.
    A while back, I had started an attempt to write a more traditional C
    compiler (with a vaguely GCC like design), but also trying to keep it
    under 30 kLOC (roughly a similar code footprint to the original version
    of the Doom engine). Sorta stalled out the project when I noted that I
    was already going to blow past this limit. I had also started to
    question whether it even makes sense to use the traditional strategy
    (of using fully disjoint stages, producing native code object files,
    and then running them through a linker).


    My existing C compiler (that I use in my own projects) uses an approach
    of first compiling to a stack-based IL, and then doing all of the native
    code generation in what would be the link stage. I am left suspecting
    this may actually be a more sensible approach (with the frontend stages existing more to compile from the source language to the IL used by the backend).



    I am also left to consider possible alternatives to C and C++. But, the
    best idea I have at the moment (that would fit my uses) would be to make
    a language sort of resembling a hybrid of C and C#. Though, some of the
    possible merits of such a language would be weakened if (for practical
    reasons) such a compiler also needs to be able to accept plain old C.
    The main argument in favor is that "C-like with some C# stuff glued on"
    would be simpler/cheaper to implement than an actual C++ compiler.

    Though, some notable cost savings would be possible by the compiler
    either not supporting C, or only supporting a restricted subset of C
    (maybe going as far as only allowing C code which is the least common denominator of both languages).

    Might be nice if one can have a compiler that:
    Can generate reasonably efficient native code binaries;
    Supports a more advanced OO language as well as C-like use cases;
    Fits the whole compiler / "toolchain" in under 100K lines.
    Where, 30K lines may be unrealistic, but at least 100K.

    My existing compiler is around 250K lines, bigger than I would like.


    Still pretty small though if compared with GCC or Clang, which both
    weigh in well into MLOC territory (and Clang basically wrecks my PC
    trying to build it). Like, even if it is good/fancy/whatever, I'm not
    inclined to try to compile something where compiling the compiler brings
    my PC to its knees for multiple hours at a time.

    I much prefer working on a compiler where the rebuild time is around 5
    seconds or so (or, around 30 seconds if building it with GCC).


    Like, seriously, we don't need the C and C++ compilers to be like "The
    One Ring": "One compiler to build them all! One compiler to link them!"

    Nor, the PC equivalent of "Forged in the fires of Mt. Doom", namely
    multi-hour build times for said compiler...

    Well, and if one is like, "Well, does your compiler generate better code
    than GCC or Clang, or target all of the same types of machines?", I can
    be like, "This is missing the point."

    More often it matters that a compiler exists, rather than which compiler
    it is, or even whether or not it is the "best" compiler (more "a
    compiler exists", and "preferably not complete garbage").



    But then there is the problem with C++: it is too much of a pain to
    write a C++ compiler, so inherently it becomes a choice-limiting issue.

    Ideally, one needs a language which can serve a similar role, but for
    which writing a compiler for it is less of a nightmarish undertaking.
    Well, and where the choice is not also "just use C".


    Well, in a similar way:
    Implementing a small JS-like interpreter in 2700 LOC stands no chance
    of replacing something like SpiderMonkey or V8, but expecting it to
    do so would also be missing the point.

    There are use-cases where one may want a small (if slow) interpreter,
    and where throwing some 500 kLOC beast of a VM at the problem is not an
    ideal strategy.



    Anyway, there's things to dislike about most any language, but there
    aren't too many that I'd universally condemn in my own assessment.
    Pascal gets a frowny-face for design decisions that should never, ever
    have made it past the initial draft of the first paper (making array
    size part of the type specification was braindead from the start - an impediment to good design *and* a burden on performance, all in the
    name of avoiding a problem that there were much better solutions for,)
    and while newer iterations have improved things somewhat, it would've
    been better to throw it out and re-do from scratch...*but* the FP folks
    do put a lot of effort into making it a full-featured and surprisingly portable platform for development.

    Another language that feels needlessly gross and tedious is Java - it's
    just *unreasonably* verbose, to the point where one is tempted to
    employ a macro preprocessor just to condense sesquipedalian nonsense
    like System.out.println() down to furshlugginer print() and so forth.
    Any language that *requires* an IDE with weapons-grade autocomplete
    deserves censure...but then it's somehow still the best solution we've
    got for write-once-run-anywhere development, which is maddening :/


    Agreed, even as someone who had previously designed/implemented a Java-like language...

    Well, there are reasons my current leaning would be to start from a base
    more resembling C#. In some ways, its design points make more sense
    (even if seemingly some of its choices look more like "take Java and
    make it look more like C++").


    For API design, it probably makes some sense to stick close to a C-like
    approach when possible. Like, try to aim for a "minimal but effective"
    API design, not so much the "try to micro-manage every possible thing a
    person might want to do in the language" approach that Java had used
    (while at the same time making it needlessly verbose to actually use
    said interfaces).

    Maybe also don't try to design the language with the seeming starting assumption that all the programmers are stupid and can't figure out how
    to do anything on their own; ...


    Though, granted, a custom non-standard C#-like language, absent
    widespread adoption, would not likely achieve a WORA-like goal.

    ...

    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Lawrence D’Oliveiro@[email protected] to comp.lang.misc on Mon Nov 3 20:29:53 2025
    From Newsgroup: comp.lang.misc

    On Mon, 3 Nov 2025 14:05:12 -0600, BGB wrote:

    Also, absent additional analysis steps or similar, and thus code
    complexity, full lexical scoping is prone to rapidly leak memory.

    You mean, you need to put call frames on the heap, not the stack? And
    that will expose any lack of robustness in your memory-management
    scheme?

    By the way, Python is clever enough to only put referenced outer-level variables within the closure.

    This is a non-issue with dynamic scoping, even if, potentially,
    using dynamic scoping as the default is itself a foot gun (in terms
    of semantics).

    This sounds like a “I could implement it much more efficiently if I
    didn’t have to do it correctly” kind of argument ...
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From BGB@[email protected] to comp.lang.misc on Mon Nov 3 14:55:31 2025
    From Newsgroup: comp.lang.misc

    On 11/3/2025 2:29 PM, Lawrence D’Oliveiro wrote:
    On Mon, 3 Nov 2025 14:05:12 -0600, BGB wrote:

    Also, absent additional analysis steps or similar, and thus code
    complexity, full lexical scoping is prone to rapidly leak memory.

    You mean, you need to put call frames on the heap, not the stack? And
    that will expose any lack of robustness in your memory-management
    scheme?


    Yes.

    Minimal memory management: Don't bother with GC;
    Minimal handling of lexical closures: Just put the frames on the heap,
    capture them if a closure is created, and don't bother with the case of
    a closure not being created.

    But, this is a bad scenario, as it results in an explosively bad memory leak...


    By the way, Python is clever enough to only put referenced outer-level variables within the closure.


    Yes, but:
    Detecting which variables are captured, or whether or not there is
    capture or non-capture, is the "additional analysis steps" thing...


    Likewise, the cheaper option is using a per-frame flag for capture-vs-non-capture (frame is freed if the "frame was captured" flag
    is not set). Still though, this is more code complexity than not using
    lexical scoping in the first place.


    Like, dynamic scoping doesn't need frames that may or may not be
    captured, can just put all of the variables onto a stack and push/pop
    this stack position per-frame.
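
    Rough sketch of the idea (in JS-ish form rather than the actual C; the
    names here are made up):

    const bindings = []   /* stack of {name, value} pairs, innermost last */

    function lookup(name)
      {
        for (let i = bindings.length - 1; i >= 0; --i)
          {
            if (bindings[i].name == name)
                return bindings[i]
          }
        throw new Error("unbound: " + name)
      }

    function call_with(locals, body)
      {
        const mark = bindings.length          /* remember the stack position */
        for (const name in locals)
            bindings.push({name : name, value : locals[name]})
        try
          {
            return body()                     /* callees see these bindings (dynamic scope) */
          }
        finally
          {
            bindings.length = mark            /* pop the whole frame on return */
          }
      }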


    This is a non-issue with dynamic scoping, even if, potentially,
    using dynamic scoping as the default is itself a foot gun (in terms
    of semantics).

    This sounds like a “I could implement it much more efficiently if I didn’t have to do it correctly” kind of argument ...

    It is cheap and simple...

    But, one thing it is not, is semantically equivalent to lexical scoping.


    Some of my past languages had both lexical and dynamic scoping (with
    lexical as the default). This can make more sense.

    But, as noted, the goal in the case of this interpreter was to use less
    code, and it probably would have taken maybe an additional 300 lines of
    code or so to add support for lexical scoping (that wouldn't just
    immediately leak all the RAM).

    ...


    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Lawrence D’Oliveiro@[email protected] to comp.lang.misc on Mon Nov 3 23:10:23 2025
    From Newsgroup: comp.lang.misc

    On Mon, 3 Nov 2025 14:55:31 -0600, BGB wrote:

    Some of my past languages had both lexical and dynamic scoping (with
    lexical as the default). This can make more sense.

    Dynamic binding simply isn’t that useful. Lexical binding just works more naturally the way people expect.
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From John Ames@[email protected] to comp.lang.misc on Mon Nov 3 15:56:48 2025
    From Newsgroup: comp.lang.misc

    On Mon, 3 Nov 2025 14:05:12 -0600
    BGB <[email protected]> wrote:

    And/or that anybody using BASIC in a modern context is gonna be
    using something like FreeBASIC, which is a vastly improved and
    perfectly reasonable little language compared to the early
    microcomputer BASICs.

    In one project, I ended up implementing a BASIC dialect partly
    inspired by early Unstructured BASIC dialects. Though, in this case,
    this was more because it allowed writing an interpreter in roughly
    1000 lines of C.

    Though, I ended up extending it in some non-standard ways that didn't
    really match up with either 80s BASIC, or the direction the language
    went in the 90s.

    But, the use-case I had didn't really need it to go in the 90s
    direction, and keeping the interpreter small also did not favor going
    that direction (where the 90s dialects more went in the direction of
    trying to turn it into a more general purpose programming language).

    Yeah, there've been a number of odd little one-off BASICs over the
    years. ASIC was one that struck me as particularly kooky - a shareware implementation that could compile to COM/EXE, but which lacked a number
    of operators/control structures :/

    https://en.wikipedia.org/wiki/ASIC_programming_language

    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From BGB@[email protected] to comp.lang.misc on Tue Nov 4 12:02:22 2025
    From Newsgroup: comp.lang.misc

    On 11/3/2025 5:10 PM, Lawrence D’Oliveiro wrote:
    On Mon, 3 Nov 2025 14:55:31 -0600, BGB wrote:

    Some of my past languages had both lexical and dynamic scoping (with
    lexical as the default). This can make more sense.

    Dynamic binding simply isn’t that useful. Lexical binding just works more naturally the way people expect.


    I am not saying that Dynamic is preferable to Lexical by any means, from
    a semantics POV, but if the goal is to implement a semi-practical
    interpreter while also using less code (and with less risk of memory
    leaks), it can make sense.


    There are a few contexts where dynamic scoping can be useful as well,
    just usually not as the default.

    Say, for example, if you had something analogous to stdin/stdout as dynamically scoped variables, then it makes IO redirection easy.
    Likewise for other sorts of state that (in C) are often held in global variables.


    They can also serve a similar role as TLS variables (though, in a more advanced compiler, they were usually implemented as an additional local save/restore mechanism on top of TLS variables). In this case, the role
    of dynamic variables can be overloaded to that of TLS variables.

    One can debate how to best specify it.
    dynvar x; //one possibility for a language like this
    //if 'var' became lexically scoped
    dynamic var x; //my original script language
    dynamic int x; //BS2
    __dynamic int x; //In C mode in BGBCC
    _Thread_local int x; //does basically same thing as __dynamic here

    Can note that C normally uses a 2-level scope scheme:
    Global variables;
    Local Variables;
    In this case, lambdas could either capture by-reference or by value.

    In this case, a C-like scoping model could also be implemented in
    relatively little additional code (probably with lambdas just bulk
    copying all the captured local variables; could ignore both dynamic and
    global variables if going to a C-like model).
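
    Roughly, the difference between the two capture models (shown here in
    JS itself, which captures by reference):

    let i = 1
    const by_ref = function () { return i }    /* capture by reference: sees later changes to i */
    const copy = i
    const by_val = function () { return copy } /* capture by value: snapshot taken at creation */
    i = 2
    /* by_ref() now gives 2, by_val() still gives 1 */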

    At the moment, "function()" doesn't capture anything.



    Though, if I get around to it, lexical scoping and "switch()" would
    probably be pretty high on the "makes sense to add" list.

    Though, the handling of switch would probably also be "kinda crap":
    Just walk downwards, skipping lines until the correct case label is encountered, then start running until a "break;" or "return();" is encountered.


    May also need to implement some form of memory management. Most likely
    options ATM are slabs and zone allocation. A traditional garbage
    collector would be asking too much here, and reference counting is a pain.

    In a zone allocator, likely any newly created objects would be initially
    in an "eval" zone, and when assigned to a global variable (or into an
    object in the "global" zone) would be re-tagged as global (with a
    graph-walk to also re-assign any pointed to objects). Once eval
    terminates, any variables still in the eval zone are destroyed.

    Would likely use a zone tagging scheme similar to that used in the Doom engine, where lower-numbered zones are "more global" (so zone promotion
    merely sets it to the lowest number).
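
    Rough sketch of the promotion logic (JS-ish form; "refs" here just
    stands in for however an object's outgoing references get enumerated):

    const Z_GLOBAL = 1, Z_EVAL = 2   /* lower number = more global */

    function promote(obj, zone)
      {
        if (obj.zone <= zone)
            return                   /* already at least this global (also stops cycles) */
        obj.zone = zone              /* promotion just sets the tag to the lower number */
        for (const ref of obj.refs)  /* graph-walk everything reachable from obj */
            promote(ref, zone)
      }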


    Still, TBD.

    ...

    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Lawrence D’Oliveiro@[email protected] to comp.lang.misc on Tue Nov 4 20:32:24 2025
    From Newsgroup: comp.lang.misc

    On Tue, 4 Nov 2025 12:02:22 -0600, BGB wrote:

    Say, for example, if you had something analogous to stdin/stdout as dynamically scoped variables, then it makes IO redirection easy.

    A better technique would be to save the current values of stdin and
    stdout, substitute different I/O streams during the execution of the inner code, then restore the outer values before returning. Offers better
    control and generalizes more readily.
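
    Something along those lines (a sketch; “output” here just stands in for
    the actual stream variable):

    let output = console.log /* stand-in for the current output stream */

    function with_output(temp_output, action)
      {
        const saved = output
        output = temp_output
        try
          {
            return action()
          }
        finally
          {
            output = saved /* restore the outer value, even on error */
          } /*try*/
      } /*with_output*/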

    They can also serve a similar role as TLS variables (though, in a more advanced compiler, they were usually implemented as an additional local save/restore mechanism on top of TLS variables). In this case, the role
    of dynamic variables can be overloaded to that of TLS variables.

    Now you’re trying to justify one mechanism by mixing it up with a
    different one.
    --- Synchronet 3.21a-Linux NewsLink 1.2