• Is it possible .....??

    From Daniel70@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Tue Mar 3 21:24:18 2026
    From Newsgroup: alt.os.linux

    Cross-posted to alt.os.linux and alt.comp.os.windows-11

    If I wanted to save a copy of what icons I have on my Computer's Desktop,
    I could press "Print Screen" or similar and a 'picture' of my desktop
    would be saved somewhere.

    However, I've just had one of my sisters ring to say she's coming over tomorrow afternoon so I can show her how to take a photo/screenshot of
    her Mobile Phone's screen.

    Is this natively possible?? (Don't quote me, but I think her phone is NOT
    an iPhone, if it matters.)

    (I'm thinking I'll take a photo of her Phone on my Phone and text it to
    her!)

    TIA
    --
    Daniel70
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Andy Burns@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Tue Mar 3 10:58:02 2026
    From Newsgroup: alt.os.linux

    Daniel70 wrote:

    I've just had one of my sisters ring to say she's coming over tomorrow afternoon so I can show her how to take a photo/screenshot of her Mobile Phone's screen.

    On most Android phones, press Vol- and Power simultaneously. If you want
    to get fancy, newer Android versions allow capturing an area larger or
    smaller than the actual screen (look for icons like ^/v or a pencil that
    appear at the bottom).
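
    Much the same can also be done from a PC over USB with adb. A rough sketch
    in Python, assuming adb is installed and on the PATH and USB debugging is
    enabled on the phone (the file names are just placeholders):

        # Pull a screenshot from an attached Android phone via adb.
        import subprocess

        remote = "/sdcard/screen.png"    # temporary file on the phone
        subprocess.run(["adb", "shell", "screencap", "-p", remote], check=True)
        subprocess.run(["adb", "pull", remote, "screen.png"], check=True)
        subprocess.run(["adb", "shell", "rm", remote], check=True)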
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Carlos E.R.@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Tue Mar 3 12:00:20 2026
    From Newsgroup: alt.os.linux

    On 2026-03-03 11:24, Daniel70 wrote:
    Cross-posted to alt.os.linux and alt.comp.os.windows-11

    If I wanted to save a copy of what icons I have on my Computers Desktop,
    I could press "Print Screen" or similar and a 'picture' of my desktop
    would be saved somewhere.

    However, I've just had one of my sisters ring to say she's coming over tomorrow afternoon so I can show her how to take a photo/screenshot of
    her Mobile Phone's screen.

    Is this natively possible?? (Don't quote me but I think her phone is NOT
    an Iphone, if it matters.)

    (I'm thinking I'll take a photo of her Phone on my Phone and text it to her!)

    It matters, and the exact model matters too.

    You should ask in the specific group: comp.mobile.android or misc.phone.mobile.iphone.
    --
    Cheers, Carlos.
    ES🇪🇸, EU🇪🇺;
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Daniel70@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Tue Mar 3 23:16:25 2026
    From Newsgroup: alt.os.linux

    On 3/03/2026 9:58 pm, Andy Burns wrote:
    Daniel70 wrote:

    I've just had one of my sisters ring to say she's coming over
    tomorrow afternoon so I can show her how to take a photo/screenshot
    of her Mobile Phone's screen.

    On most android phones, press vol- and power simultaneously,

    I'm in my sister's GOOD book!! ;-) Thank you, Andy.

    if you want to get fancy newer android versions allow capturing an
    area larger or smaller than the actual screen (see icons like ^/v or
    a pencil that appear at the bottom)
    --
    Daniel70
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Paul@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Tue Mar 3 07:32:50 2026
    From Newsgroup: alt.os.linux

    On Tue, 3/3/2026 5:24 AM, Daniel70 wrote:
    Cross-posted to alt.os.linux and alt.comp.os.windows-11

    If I wanted to save a copy of what icons I have on my Computers Desktop, I could press "Print Screen" or similar and a 'picture' of my desktop would be saved somewhere.

    However, I've just had one of my sisters ring to say she's coming over tomorrow afternoon so I can show her how to take a photo/screenshot of her Mobile Phone's screen.

    Is this natively possible?? (Don't quote me but I think her phone is NOT an Iphone, if it matters.)

    (I'm thinking I'll take a photo of her Phone on my Phone and text it to her!)

    TIA

    A Google search here put a Gemini summary at the top of the page.
    This is mostly a repeat of the answer Andy gave.

    *******
    AI Overview
    To take a screenshot on most Android devices, press and hold
    the Power and Volume Down buttons simultaneously for 1–2 seconds.
    The screen will flash, and the image will be saved to your gallery.

    For Samsung, you can also use a palm swipe gesture or the Edge panel.

    Common Methods

    Physical Buttons (Standard): Press Power + Volume Down at the same time.
    Samsung Gesture: Swipe the edge of your palm across the screen
    (must be enabled in settings).
    Samsung Edge Panel: Swipe open the Edge panel and select "Smart select".
    Google Assistant: Say "Hey Google, take a screenshot".
    Quick Settings: Pull down the notification shade and
    tap the "Screenshot" or "Capture" button.

    This video demonstrates how to take a screenshot using different methods on an Android device:
    [the AI does not insert a link with a video... likely sniffed my browser...]

    Tips

    Long Screenshot: Tap the "Capture more" button that appears after taking
    a screenshot to capture a scrolling page.
    Editing: Tap the preview in the bottom left corner to
    crop, edit, or share immediately.
    Location: Saved screenshots are found in the Gallery or Google Photos app.

    *******

    "How do I take a screenshot on my iPhone, iPad or iPod touch?
    hold down Sleep/Wake then immediately press and release Volume up.
    The screen flashes white.
    Your device captures the entire screen and saves it as a photo.
    "

    *******

    Operating System    Market Share (%)
    Android             70
    iOS                 30

    *******

    Paul

    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Daniel70@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Tue Mar 3 23:52:17 2026
    From Newsgroup: alt.os.linux

    On 3/03/2026 11:32 pm, Paul wrote:
    On Tue, 3/3/2026 5:24 AM, Daniel70 wrote:
    Cross-posted to alt.os.linux and alt.comp.os.windows-11

    If I wanted to save a copy of what icons I have on my Computers
    Desktop, I could press "Print Screen" or similar and a 'picture' of
    my desktop would be saved somewhere.

    However, I've just had one of my sisters ring to say she's coming
    over tomorrow afternoon so I can show her how to take a
    photo/screenshot of her Mobile Phone's screen.

    Is this natively possible?? (Don't quote me but I think her phone
    is NOT an Iphone, if it matters.)

    (I'm thinking I'll take a photo of her Phone on my Phone and text
    it to her!)

    TIA

    A Google search here, put a Gemini summary at the top of the page.
    This is mostly a repeat of the answer Andy gave.

    Thank you, Paul. I think I prefer HI (Human Intelligence) rather than AI (Artificial Intelligence). ;-P
    --
    Daniel70
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From J.O. Aho@[email protected] to alt.os.linux on Tue Mar 3 15:13:59 2026
    From Newsgroup: alt.os.linux

    On 03/03/2026 13.52, Daniel70 wrote:
    On 3/03/2026 11:32 pm, Paul wrote:

    A Google search here, put a Gemini summary at the top of the page.
    This is mostly a repeat of the answer Andy gave.

    Thank you, Paul. I think I prefer HI (Human Intelligence) rather than AI (Artificial Intelligence). ;-P

    Sadly there is still no AI; all you have is Large Language Models that just
    generate a flow of text that most likely hangs together based on the
    input the program got.
    --
    //Aho
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Lew Pitcher@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Tue Mar 3 14:30:52 2026
    From Newsgroup: alt.os.linux

    On Tue, 03 Mar 2026 21:24:18 +1100, Daniel70 wrote:

    Cross-posted to alt.os.linux and alt.comp.os.windows-11

    If I wanted to save a copy of what icons I have on my Computers Desktop,
    I could press "Print Screen" or similar and a 'picture' of my desktop
    would be saved somewhere.

    However, I've just had one of my sisters ring to say she's coming over tomorrow afternoon so I can show her how to take a photo/screenshot of
    her Mobile Phone's screen.

    Is this natively possible?? (Don't quote me but I think her phone is NOT
    an Iphone, if it matters.)

    (I'm thinking I'll take a photo of her Phone on my Phone and text it to her!)

    TIA

    I've seen three different and mutually exclusive ways to take screenshots
    on Android.

    1) press power and volume_down simultaneously
    2) using the edge of your hand, swipe the screen left to right
    3) pull down the "Quick Settings" menu, click the "Take Screenshot" icon

    # 2 and 3 seem to be device and Android version dependent.
    --
    Lew Pitcher
    "In Skills We Trust"
    Not LLM output - I'm just like this.
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Carlos E.R.@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Tue Mar 3 19:30:57 2026
    From Newsgroup: alt.os.linux

    On 2026-03-03 15:30, Lew Pitcher wrote:
    On Tue, 03 Mar 2026 21:24:18 +1100, Daniel70 wrote:


    I've seen three different and mutually exclusive ways to take screenshots
    on Android.

    1) press power and volume_down simultaneously
    2) using the edge of your hand, swipe the screen left to right
    3) pull down the "Quick Settings" menu, click the "Take Screenshot" icon

    # 2 and 3 seem to be device and Android version dependant.

    Mine is supposed to trigger a tiny menu on long press on the power
    button, and one of the entries should be screenshot. It no longer does
    that. Instead, "press power and volume_down simultaneously" works now.

    Also, swipe up from the bottom-centre edge, and I get a bunch of smaller
    displays of all the applications that are active; select the one you want,
    and tap a small button to take the shot.
    --
    Cheers, Carlos.
    ES🇪🇸, EU🇪🇺;
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Daniel70@[email protected] to alt.os.linux on Wed Mar 4 19:27:45 2026
    From Newsgroup: alt.os.linux

    On 4/03/2026 1:13 am, J.O. Aho wrote:
    On 03/03/2026 13.52, Daniel70 wrote:
    On 3/03/2026 11:32 pm, Paul wrote:

    A Google search here, put a Gemini summary at the top of the
    page. This is mostly a repeat of the answer Andy gave.

    Thank you, Paul. I think I prefer HI (Human Intelligence) rather
    than AI (Artificial Intelligence). ;-P

    Sadly still no AI, all you have is Large Language Models

    Ah!! There you go! Just showing off my ignorance!! ;-P

    that just generates a flow of text that most likely hang together
    based on the input the program got.
    --
    Daniel70
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From wasbit@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Wed Mar 4 09:24:53 2026
    From Newsgroup: alt.os.linux

    On 03/03/2026 12:52, Daniel70 wrote:
    snip <
    Thank you, Paul. I think I prefer HI (Human Intelligence) rather than AI (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial Intelligence
    a cow! :)
    --
    Regards
    wasbit
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Daniel70@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Wed Mar 4 22:12:18 2026
    From Newsgroup: alt.os.linux

    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip <
    Thank you, Paul. I think I prefer HI (Human Intelligence) rather than AI
    (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial Intelligence
    a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??
    --
    Daniel70
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Paul@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Wed Mar 4 15:25:16 2026
    From Newsgroup: alt.os.linux

    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip <
    Thank you, Paul. I think I prefer HI (Human Intelligence) rather than AI
    (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI)
    is the one that will join the "exclusive AI club".

    The first AI I ran into was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release 1966

    "ELIZA won a 2021 Legacy Peabody Award. A 2023 preprint reported that
    ELIZA beat OpenAI's GPT-3.5, the model used by ChatGPT at the time, in
    a Turing test study. However, it did not outperform GPT-4 or real humans."

    Now, that's just crazy talk. ELIZA must have written that :-)
    Nobody could mistake ELIZA for an AI. If anything, it was
    intended to show people what a "machine intelligence" would
    do. It would write sentences in response to the sentences
    you fed it (a big deal at the time). And that article points
    out that some people believed it was real.

    "[Weizenbaum] was surprised that some people, including his secretary,
    attributed human-like feelings to the computer program, a phenomenon that
    came to be called the ELIZA effect."

    I would say the LLM-AI comes closer, because at least its compositions
    have a theme, and the theme is consistent throughout a composition. But
    they also have a "voice" that is not very convincing. This one for example...

    "Let's dig in!"

    As if they'd just served a large helping of mashed potato
    and included a dish of butter and a fork.

    This is the future of AI. The arising from primordial slime.
                              ______
                     ______  | 20xx |
            ______  | 2021 | |      |
           | 1966 | |      | |      |
            ------   ------   ------
                              Running
           Crawling  Walking  With Scissors

    Paul
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Carlos E.R.@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Wed Mar 4 21:53:33 2026
    From Newsgroup: alt.os.linux

    On 2026-03-04 21:25, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip <
    Thank you, Paul. I think I prefer HI (Human Intelligence) rather than AI
    (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI)
    is the one that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    I met Eliza when a friend programmed it in BASIC on my Casio FX-850P. I
    was amazed.

    I think he had previously done it on a Sinclair ZX81.
    --
    Cheers, Carlos.
    ES🇪🇸, EU🇪🇺;
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From rbowman@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Thu Mar 5 02:03:30 2026
    From Newsgroup: alt.os.linux

    On Wed, 4 Mar 2026 15:25:16 -0500, Paul wrote:

    Now, that's just crazy talk. ELIZA must have written that :-) Nobody could
    mistake ELIZA for an AI. If anything, it was intended to show people
    what a "machine intelligence" would do. It would write sentences in
    response to the sentences you fed it (a big deal at the time). And that article points out, that some people believed it was real.

    I think I first ran into Eliza in a collection of CP/M software. I thought
    it was a spoof on the Rogers style therapy that was somewhat popular at
    the time.
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Andy Burns@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Thu Mar 5 07:51:24 2026
    From Newsgroup: alt.os.linux

    rbowman wrote:

    I think I first ran into Eliza in a collection of CP/M software. I thought
    it was a spoof on the Rogers style therapy that was somewhat popular at
    the time.

    "Tell me more about your mother ..."

    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Daniel70@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Thu Mar 5 19:59:14 2026
    From Newsgroup: alt.os.linux

    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human
    Intelligence) rather than AI (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial
    Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI) is the one
    that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release 1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966, No Way!!
    --
    Daniel70
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Carlos E.R.@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Thu Mar 5 10:17:54 2026
    From Newsgroup: alt.os.linux

    On 2026-03-05 09:59, Daniel70 wrote:
    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human
    Intelligence) rather than AI (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial
    Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI) is the one
    that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release   1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966, No Way!!

    I met her in the eighties.
    --
    Cheers, Carlos.
    ES🇪🇸, EU🇪🇺;
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Carlos E.R.@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Thu Mar 5 10:16:49 2026
    From Newsgroup: alt.os.linux

    On 2026-03-05 08:51, Andy Burns wrote:
    rbowman wrote:

    I think I first ran into Eliza in a collection of CP/M software. I
    thought
    it was a spoof on the Rogers style therapy that was somewhat popular at
    the time.

        "Tell me more about your mother ..."


    hee heh heh :-)
    --
    Cheers, Carlos.
    ES🇪🇸, EU🇪🇺;
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From David LaRue@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Thu Mar 5 10:45:19 2026
    From Newsgroup: alt.os.linux

    "Carlos E.R." <[email protected]d> wrote in news:2bun7mxjfm.ln2 @Telcontar.valinor:

    On 2026-03-05 09:59, Daniel70 wrote:
    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human
    Intelligence) rather than AI (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial
    Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI) is the one
    that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release   1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966, No Way!!

    I met her in the eighties.

    ELIZA was a simple program that could fool a lot of fools into thinking
    there was a human behind the curtain. It was interesting enough to me in
    1978 or so, when I encountered it, to look into its internals. It was so
    simple. I even added a few rules on my copy when it would make a mistake.
    Basically it could miss less common words and you just had to add a rule
    for that.
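
    A toy sketch of that kind of rule in Python, with made-up patterns and
    reflections purely to show the keyword-plus-template trick (Weizenbaum's
    actual script was much richer):

        # ELIZA-style responder: match a keyword pattern, reflect pronouns,
        # and fill the captured text into a canned reply.
        import random
        import re

        REFLECT = {"i": "you", "me": "you", "my": "your", "am": "are",
                   "you": "I", "your": "my", "are": "am"}

        RULES = [
            (r"\bmother\b", ["Tell me more about your mother ..."]),
            (r"\bi feel (.*)", ["Why do you feel {0}?",
                                "How long have you felt {0}?"]),
        ]

        def reflect(text):
            # Swap first and second person so the reply reads naturally.
            return " ".join(REFLECT.get(w, w) for w in text.lower().split())

        def respond(line):
            for pattern, replies in RULES:
                m = re.search(pattern, line, re.IGNORECASE)
                if m:
                    return random.choice(replies).format(*map(reflect, m.groups()))
            return "Please go on."          # no rule matched

        print(respond("Necessity is the mother of invention."))
        print(respond("I feel tired today"))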

    ELIZA was created by Joseph Weizenbaum as part of his doctoral
    thesis. The thesis was interesting to read. At the time I was just
    starting my adventure into creating software. I recently retired but it
    has been an interesting adventure.

    If ELIZA intrigues you, look into the writings of so many greats from that
    era. Read, learn, and use what you need. There are so many good ideas out
    there for you to expand on if you have the energy to use them wisely.
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Jasen Betts@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Thu Mar 5 12:33:31 2026
    From Newsgroup: alt.os.linux

    On 2026-03-05, Daniel70 <[email protected]> wrote:
    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human
    Intelligence) rather than AI (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial
    Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI) is the one
    that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release 1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966, No Way!!

    '66? Sounds legit.

    There were computers before UNIX, and languages before C
    --
    Jasen.
    🇺🇦 Слава Україні
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Mark Lloyd@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Thu Mar 5 19:13:43 2026
    From Newsgroup: alt.os.linux

    On Thu, 5 Mar 2026 07:51:24 +0000, Andy Burns wrote:

    rbowman wrote:

    I think I first ran into Eliza in a collection of CP/M software. I
    thought it was a spoof on the Rogers style therapy that was somewhat
    popular at the time.

    "Tell me more about your mother ..."

    I remember hearing of someone who entered "Necessity is the mother of invention." and got the above response.
    --
    Mark Lloyd
    http://notstupid.us/

    "Christendom has done away with Christianity without being quite aware
    of it." [Soren Kierkegaard, Time magazine, 16 December 1946]
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Mark Lloyd@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Thu Mar 5 19:16:16 2026
    From Newsgroup: alt.os.linux

    On Thu, 5 Mar 2026 19:59:14 +1100, Daniel70 wrote:

    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human Intelligence)
    rather than AI (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial
    Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI) is the one
    that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release 1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966, No Way!!

    It was in the eighties (it had to have been between 1982 and 1988) when I first encountered Eliza.
    --
    Mark Lloyd
    http://notstupid.us/

    "Christendom has done away with Christianity without being quite aware
    of it." [Soren Kierkegaard, Time magazine, 16 December 1946]
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From rbowman@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Thu Mar 5 19:30:19 2026
    From Newsgroup: alt.os.linux

    On Thu, 5 Mar 2026 19:59:14 +1100, Daniel70 wrote:

    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human Intelligence)
    rather than AI (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial
    Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI) is the one
    that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release 1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966, No Way!!

    Way.
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Daniel70@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Fri Mar 6 20:00:44 2026
    From Newsgroup: alt.os.linux

    On 5/03/2026 11:33 pm, Jasen Betts wrote:
    On 2026-03-05, Daniel70 <[email protected]> wrote:
    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human
    Intelligence) rather than AI (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial
    Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI) is the one
    that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release 1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966, No Way!!

    '66? Sounds legit.

    There were computers before UNIX, and languages before C

    Oh, sure, there were computers back in WWII times .... it's just that I've
    never thought of them as being in anything more widespread than Defence-type usage.
    --
    Daniel70
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Daniel70@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Fri Mar 6 20:06:11 2026
    From Newsgroup: alt.os.linux

    On 6/03/2026 6:30 am, rbowman wrote:
    On Thu, 5 Mar 2026 19:59:14 +1100, Daniel70 wrote:
    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human Intelligence)
    rather than AI (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial
    Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI) is the one
    that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release 1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966, No Way!!

    Way.

    Yeah! Way! Since my previous post I've remembered a work colleague
    putting together a Micro-B computer in about 1979/1980 ... so certainly 'Way'!!
    --
    Daniel70
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Carlos E.R.@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Fri Mar 6 11:35:08 2026
    From Newsgroup: alt.os.linux

    On 2026-03-06 10:00, Daniel70 wrote:
    On 5/03/2026 11:33 pm, Jasen Betts wrote:
    On 2026-03-05, Daniel70 <[email protected]> wrote:
    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human
    Intelligence) rather than AI (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial
    Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI) is the one
    that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release   1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966, No Way!!

    '66? Sounds legit.

    There were computers before UNIX, and languages before C

    Oh, sure, there were computers back in WWII-times .... just I've never considered them in more widespread than Defence-type usage.

    66 is not WWII times. It is the time of the Apollo missions, which had
    flight computers. Computers did exist, although huge. In the early 70s, there
    was a computer room at my father's job. Programmers tried things to find
    out what could be done with a computer.
    --
    Cheers, Carlos.
    ES🇪🇸, EU🇪🇺;
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Daniel70@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Fri Mar 6 22:03:49 2026
    From Newsgroup: alt.os.linux

    On 6/03/2026 9:35 pm, Carlos E.R. wrote:
    On 2026-03-06 10:00, Daniel70 wrote:
    On 5/03/2026 11:33 pm, Jasen Betts wrote:
    On 2026-03-05, Daniel70 <[email protected]> wrote:
    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human
    Intelligence) rather than AI (Artificial Intelligence).
    ;-P

    AI stands for Artificial Insemination. You can't
    Artificial Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM"
    instead??

    They use LLM-AI to signify "this is not the final or real
    one".

    An AI that achieves Artificial General Intelligence (AGI) is
    the one that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release 1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966,
    No Way!!

    '66? Sounds legit.

    There were computers before UNIX, and languages before C

    Oh, sure, there were computers back in WWII-times .... just I've
    never considered them in more widespread than Defence-type usage.

    66 is not WWII times. It is the times of the Apolo missions,

    Of course!! Of course!!

    Gee Whiz. I just hate it when someone shoots me down in flames ....
    soooo easily!!

    which had flight computers. Computers did exist, although huge. Early
    70s, there was a computer room at my father's job.

    And another trip down memory lane. As part of my Army Apprenticeship, I
    spent time at an Australian Army Signals Unit in 1976.

    They were the hub for Signals-switching for all Aust Army Units and, for
    that purpose, they had S.T.R.a.D. (Signals Reception Transmission and
    Distribution), which was (I think) six equipment bays, each ten or
    fifteen meters long.

    All flashing lights, etc.

    Programmers tried things to find out what could be done with a
    computer.
    --
    Daniel70
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From jjb@[email protected] to alt.os.linux on Fri Mar 6 17:09:41 2026
    From Newsgroup: alt.os.linux

    On 2026-03-06 12:03, Daniel70 wrote:
    On 6/03/2026 9:35 pm, Carlos E.R. wrote:
    On 2026-03-06 10:00, Daniel70 wrote:
    On 5/03/2026 11:33 pm, Jasen Betts wrote:
    On 2026-03-05, Daniel70 <[email protected]> wrote:
    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human
    Intelligence) rather than AI (Artificial Intelligence).
    ;-P

    AI stands for Artificial Insemination. You can't
    Artificial Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM"
    instead??

    They use LLM-AI to signify "this is not the final or real
    one".

    An AI that achieves Artificial General Intelligence (AGI) is
    the one that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release   1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966,
    No Way!!

    '66? Sounds legit.

    There were computers before UNIX, and languages before C

    Oh, sure, there were computers back in WWII-times .... just I've
    never considered them in more widespread than Defence-type usage.

    66 is not WWII times. It is the times of the Apolo missions,

    Of course!! Of course!!

    Gee Whiz. I just hate it when someone shoots me down in flames ....
    soooo easily!!

    which had flight computers. Computers did exist, although huge. Early
    70s, there was a computer room at my father's job.

    And another trip down memory line. As part of my Army Apprenticeship, I
    spent time at an Australian Army Signals Unit in 1976

    They were the Hub for Signals-switching for all Aust Army Units and for
    that purpose, they had S.T.R.a.D. (Signals Reception Transmission and Distribution) which was (I think) six equipment bays, each ten or
    fifteen meters long.

    All flashing lights, etc.

    Programmers tried things to find out what could be done with a
    computer.


    It was 1968 when I started programming a Univac 1005 system, a year
    later followed by a Siemens 4004 system. The 1005 was the size of a big
    cabinet, the 4004 sat in a big room. So yes, computers did exist and
    were used by non-military in 1968...
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Tim Slattery@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Fri Mar 6 11:09:54 2026
    From Newsgroup: alt.os.linux

    "Carlos E.R." <[email protected]d> wrote:

    Defence-type usage.

    66 is not WWII times. It is the times of the Apollo missions, which had
    flight computers. Computers did exist, although huge. Early 70s, there
    was a computer room at my father's job. Programmers tried things to find
    out what could be done with a computer.

    I was in Palo Alto (California) High School at that time. The high
    school was right next to the School District offices, so I was able to
    take a computer programming course. We were able to use the school
    district's IBM1620 to run our programs. A small (for the time)
    machine, and not very powerful, but it got me into programming. IBM
    360s and 370s were also around at that time, and many governments and
    companies used them. Micro computers that you could own yourself
    debuted in the 1970s.
    --
    Tim Slattery
    timslattery <at> utexas <dot> edu
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Kerr-Mudd, John@[email protected] to alt.os.linux,alt.comp.os.windows-11,alt.folklore.computers on Fri Mar 6 16:27:06 2026
    From Newsgroup: alt.os.linux

    On Fri, 06 Mar 2026 11:09:54 -0500
    Tim Slattery <[email protected]> wrote:

    "Carlos E.R." <[email protected]d> wrote:

    Defence-type usage.

    66 is not WWII times. It is the times of the Apollo missions, which had
    flight computers. Computers did exist, although huge. Early 70s, there
    was a computer room at my father's job. Programmers tried things to find
    out what could be done with a computer.

    I was in Palo Alto (California) High School at that time. The high
    school was right next to the School District offices, so I was able to
    take a computer programming course. We were able to use the school
    district's IBM1620 to run our programs. A small (for the time)
    machine, and not very powerful, but it got me into programming. IBM
    360s and 370s were also around at that time, and many governments and companies used them. Micro computers that you could own yourself
    debuted in the 1970s.

    Us nostalgic folk are over here: -->

    (Xposted, FU to afc)
    --
    Bah, and indeed Humbug.
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From rbowman@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Fri Mar 6 18:51:29 2026
    From Newsgroup: alt.os.linux

    On Fri, 6 Mar 2026 20:00:44 +1100, Daniel70 wrote:

    On 5/03/2026 11:33 pm, Jasen Betts wrote:
    On 2026-03-05, Daniel70 <[email protected]> wrote:
    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human Intelligence)
    rather than AI (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial
    Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI) is the one
    that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release 1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966, No
    Way!!

    '66? Sounds legit.

    There were computers before UNIX, and languages before C

    Oh, sure, there were computers back in WWII-times .... just I've never considered them in more widespread than Defence-type usage.

    https://en.wikipedia.org/wiki/Z3_(computer)
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From rbowman@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Fri Mar 6 19:06:26 2026
    From Newsgroup: alt.os.linux

    On Fri, 6 Mar 2026 22:03:49 +1100, Daniel70 wrote:

    On 6/03/2026 9:35 pm, Carlos E.R. wrote:
    On 2026-03-06 10:00, Daniel70 wrote:
    On 5/03/2026 11:33 pm, Jasen Betts wrote:
    On 2026-03-05, Daniel70 <[email protected]> wrote:
    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human Intelligence)
    rather than AI (Artificial Intelligence).
    ;-P

    AI stands for Artificial Insemination. You can't Artificial
    Intelligence a cow!

    AH!! Of course. Is that why people now speak of "LLM"
    instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI) is the
    one that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release 1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966,
    No Way!!

    '66? Sounds legit.

    There were computers before UNIX, and languages before C

    Oh, sure, there were computers back in WWII-times .... just I've never
    considered them in more widespread than Defence-type usage.

    66 is not WWII times. It is the times of the Apolo missions,

    Of course!! Of course!!

    Gee Whiz. I just hate it when someone shoots me down in flames ....
    soooo easily!!

    My first programming class was in '66, FORTRAN IV on an IBM System
    360/30. There were a few schools like Dartmouth that had started
    fledgling CS programs but this wasn't seen as a career path, only another
    tool to be used like our slide rules. There was also an analog computer
    lab just in case. Except in niche applications analog was on the way out.

    Good things never die, and analog is showing some promise in neuromorphic applications. After all, the brain is an electrochemical analog device.

    As for Eliza, there were CP/M versions running on 8080 and Z80 machines in
    the late '70s.

    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From rbowman@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Fri Mar 6 19:11:38 2026
    From Newsgroup: alt.os.linux

    On Fri, 06 Mar 2026 11:09:54 -0500, Tim Slattery wrote:

    "Carlos E.R." <[email protected]d> wrote:

    Defence-type usage.

    66 is not WWII times. It is the times of the Apollo missions, which had
    flight computers. Computers did exist, although huge. Early 70s, there
    was a computer room at my father's job. Programmers tried things to find
    out what could be done with a computer.

    I was in Palo Alto (California) High School at that time. The high
    school was right next to the School District offices, so I was able to
    take a computer programming course. We were able to use the school
    district's IBM1620 to run our programs. A small (for the time) machine,
    and not very powerful, but it got me into programming. IBM 360s and 370s
    were also around at that time, and many governments and companies used
    them. Micro computers that you could own yourself debuted in the 1970s.

    While I learned FORTRAN IV in the mid-60s I didn't have much interest in programming until the '70s. I'd worked with industrial control circuitry,
    all relay logic, that slowly went solid state, and ultimately to MCUs. One 8080 could replace a LOT of octal base relays. Logic is logic.
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Carlos E.R.@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Fri Mar 6 22:40:12 2026
    From Newsgroup: alt.os.linux

    On 2026-03-06 20:11, rbowman wrote:
    On Fri, 06 Mar 2026 11:09:54 -0500, Tim Slattery wrote:

    "Carlos E.R." <[email protected]d> wrote:

    Defence-type usage.

    66 is not WWII times. It is the times of the Apollo missions, which had
    flight computers. Computers did exist, although huge. Early 70s, there
    was a computer room at my father's job. Programmers tried things to find
    out what could be done with a computer.

    I was in Palo Alto (California) High School at that time. The high
    school was right next to the School District offices, so I was able to
    take a computer programming course. We were able to use the school
    district's IBM1620 to run our programs. A small (for the time) machine,
    and not very powerful, but it got me into programming. IBM 360s and 370s
    were also around at that time, and many governments and companies used
    them. Micro computers that you could own yourself debuted in the 1970s.

    While I learned FORTRAN IV in the mid-60s I didn't have much interest in programming until the '70s. I'd worked with industrial control circuitry,
    all relay logic, that slowly went solid state, and ultimately to MCUs. One 8080 could replace a LOT of octal base relays. Logic is logic.

    Once I built a 1-bit adder with relays, just for fun. Nobody appreciated
    the fun; the IBM PC clone era was in full blast.
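
    A one-bit full adder boils down to sum = a XOR b XOR carry-in, and
    carry-out = "any two of the three inputs". A quick sketch of that truth
    table in Python (just an illustration, not the relay wiring):

        # One-bit full adder: the same truth table a handful of relays can implement.
        def full_adder(a, b, cin):
            s = a ^ b ^ cin                          # sum bit
            cout = (a & b) | (a & cin) | (b & cin)   # carry out
            return s, cout

        # Print the whole truth table.
        for a in (0, 1):
            for b in (0, 1):
                for cin in (0, 1):
                    print(a, b, cin, "->", full_adder(a, b, cin))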
    --
    Cheers, Carlos.
    ES🇪🇸, EU🇪🇺;
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Carlos E.R.@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Fri Mar 6 22:43:00 2026
    From Newsgroup: alt.os.linux

    On 2026-03-06 19:51, rbowman wrote:
    On Fri, 6 Mar 2026 20:00:44 +1100, Daniel70 wrote:

    On 5/03/2026 11:33 pm, Jasen Betts wrote:
    On 2026-03-05, Daniel70 <[email protected]> wrote:
    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human Intelligence)
    rather than AI (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial
    Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI) is the one
    that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release 1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966, No
    Way!!

    '66? Sounds legit.

    There were computers before UNIX, and languages before C

    Oh, sure, there were computers back in WWII-times .... just I've never
    considered them in more widespread than Defence-type usage.

    https://en.wikipedia.org/wiki/Z3_(computer)

    Nice one. :-)
    --
    Cheers, Carlos.
    ES🇪🇸, EU🇪🇺;
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From vallor@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Sat Mar 7 06:03:59 2026
    From Newsgroup: alt.os.linux

    At Fri, 6 Mar 2026 22:43:00 +0100, "Carlos E.R."
    <[email protected]d> wrote:

    On 2026-03-06 19:51, rbowman wrote:
    On Fri, 6 Mar 2026 20:00:44 +1100, Daniel70 wrote:

    On 5/03/2026 11:33 pm, Jasen Betts wrote:
    On 2026-03-05, Daniel70 <[email protected]> wrote:
    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human
    Intelligence) rather than AI (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial
    Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI) is
    the one that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release 1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966,
    No Way!!

    '66? Sounds legit.

    There were computers before UNIX, and languages before C

    Oh, sure, there were computers back in WWII-times .... just I've
    never considered them in more widespread than Defence-type usage.

    https://en.wikipedia.org/wiki/Z3_(computer)

    Nice one. :-)

    Yes indeed -- sent me down a Wikipedia rabbit hole.
    --
    -v System76 Thelio Mega v1.1 x86_64 Mem: 258G
    OS: Linux 7.0.0-rc2 D: Mint 22.3 DE: Xfce 4.18 (X11)
    NVIDIA GeForce RTX 3090Ti (24G) (580.126.18)
    "Wesley Crusher, please report to airlock 5!"
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Paul@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Sat Mar 7 02:02:29 2026
    From Newsgroup: alt.os.linux

    On Fri, 3/6/2026 4:40 PM, Carlos E.R. wrote:
    On 2026-03-06 20:11, rbowman wrote:
    On Fri, 06 Mar 2026 11:09:54 -0500, Tim Slattery wrote:

    "Carlos E.R." <[email protected]d> wrote:

      Defence-type usage.

    66 is not WWII times. It is the times of the Apollo missions, which had
    flight computers. Computers did exist, although huge. Early 70s, there
    was a computer room at my father's job. Programmers tried things to find
    out what could be done with a computer.

    I was in Palo Alto (California) High School at that time. The high
    school was right next to the School District offices, so I was able to
    take a computer programming course. We were able to use the school
    district's IBM1620 to run our programs. A small (for the time) machine,
    and not very powerful, but it got me into programming. IBM 360s and 370s
    were also around at that time, and many governments and companies used
    them. Micro computers that you could own yourself debuted in the 1970s.

    While I learned FORTRAN IV in the mid-60s I didn't have much interest in
    programming until the '70s. I'd worked with industrial control circuitry,
    all relay logic, that slowly went solid state, and ultimately to MCUs. One
    8080 could replace a LOT of octal base relays. Logic is logic.

    Once I built a 1 bit adder with relays, just for fun. Nobody appreciated the fun, the IBM PC clone era was in full blast.


    Now that's something I've not tried: using relays for logic.

    That means you must be an expert at designing snubbers then :-)

    You should have gone for broke, and done a 4-bit adder.
    Then your next step would be a calculator made out
    of relays.

    We had a guy at work who liked to design asynchronous
    logic. His circuits always ran faster than everyone
    else's (because... they didn't wait for a clock edge).
    But designing those (without computer assistance) is
    a lot of work, as you need cover terms so stuff does
    not glitch. There were even commercial companies
    interested in the idea, but it kinda died out. At least
    he didn't break anything. I could trust him not to
    blow up a project. He wasn't a kook.

    *******

    Some engineers are known for their weird fixations with components.
    My manager hired a guy who was mostly non-communicative. I couldn't
    say there was a language barrier, as we never had any conversations
    with him.

    He was given a specification to work with, and he went off to design it.
    Months went by; he was wire-wrapping it in the lab and so on. Well,
    nobody pokes around someone else's design (unless it is design review
    time). And being non-communicative, he wasn't partnered with anyone,
    he didn't ask any questions and so on. In other words, no one at
    all was able to learn anything about exactly what he was doing.

    So one day, he tells the manager it is finished and it is running
    in the lab. The proof it is working is one red LED. If the LED
    is lit, it's working. If the LED is off, it's not working. (This is
    a bunch of status circuitry, monitoring logic operation.) Well,
    the thing was built entirely out of hex packs of transistors.
    All the gates (it's a digital logic function) were made from transistors.
    There was no jelly bean logic on the board. There must have been
    hundreds and hundreds of transistors. Then the guy says "he's leaving"
    and he is gone, just like that. We never heard from the manager exactly
    what he thought of this :-) But, another lesson learned about
    handling people.

    If you're going to make a fetish about designing with relays,
    don't tell anyone :-) And pretend to be non-communicative
    while you're building your contraption. Seems a good strategy.

    Paul
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Paul@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Sat Mar 7 02:50:05 2026
    From Newsgroup: alt.os.linux

    On Fri, 3/6/2026 4:43 PM, Carlos E.R. wrote:
    On 2026-03-06 19:51, rbowman wrote:
    On Fri, 6 Mar 2026 20:00:44 +1100, Daniel70 wrote:

    On 5/03/2026 11:33 pm, Jasen Betts wrote:
    On 2026-03-05, Daniel70 <[email protected]> wrote:
    On 5/03/2026 7:25 am, Paul wrote:
    On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
    On 4/03/2026 8:24 pm, wasbit wrote:
    On 03/03/2026 12:52, Daniel70 wrote:
    snip < Thank you, Paul. I think I prefer HI (Human Intelligence)
    rather than AI (Artificial Intelligence). ;-P

    AI stands for Artificial Insemination. You can't Artificial
    Intelligence a cow! :)

    AH!! Of course. Is that why people now speak of "LLM" instead??

    They use LLM-AI to signify "this is not the final or real one".

    An AI that achieves Artificial General Intelligence (AGI) is the one
    that will join the "exclusive AI club".

    The first AI I ran into, was a port of ELIZA.

    https://en.wikipedia.org/wiki/ELIZA

    Initial release   1966

    1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966, No
    Way!!

    '66? Sounds legit.

    There were computers before UNIX, and languages before C

    Oh, sure, there were computers back in WWII-times .... just I've never
    considered them in more widespread than Defence-type usage.

    https://en.wikipedia.org/wiki/Z3_(computer)

    Nice one. :-)


    And there are still people doing stuff like that.

    "How to Build a 4-Bit Computer on Breadboards Using Individual Transistors"

    https://www.youtube.com/watch?v=_eo8l7HP-9U

    That one does not do branching either.

    That's an expensive way to build a digital circuit,
    as jelly beans at one time were cheaper than
    transistors. Some of the jelly beans were $0.25.

    *******

    This one is more conventional. And considering the logic,
    runs at a decent speed. It's using a 74181 for its
    arithmetic ("ALU").

    https://www.bigmessowires.com/nibbler/

    Paul

    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Daniel70@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Sat Mar 7 20:53:50 2026
    From Newsgroup: alt.os.linux

    On 7/03/2026 6:06 am, rbowman wrote:
    On Fri, 6 Mar 2026 22:03:49 +1100, Daniel70 wrote:
    On 6/03/2026 9:35 pm, Carlos E.R. wrote:
    On 2026-03-06 10:00, Daniel70 wrote:
    On 5/03/2026 11:33 pm, Jasen Betts wrote:

    <Snip>

    There were computers before UNIX, and languages before C

    Oh, sure, there were computers back in WWII-times .... just I've never
    considered them in more widespread than Defence-type usage.

    66 is not WWII times. It is the times of the Apolo missions,

    Of course!! Of course!!

    Gee Whiz. I just hate it when someone shoots me down in flames ....
    soooo easily!!

    My first programming class was in '66, FORTRAN IV on an IBM System
    360/30. There were a few schools like Dartmouth that had started
    fledgling CS programs but this wasn't seen as a career path, only another tool to be used like our slide rules. There was also an analog computer
    lab just in case. Except in niche applications analog was on the way out.

    Good things never die and analog is showing some promise in neuromorphic applications. After all the brain is an electrochemical analog device.

    As I think I've mentioned elsewhere, I did my Electronics Apprenticeship
    in the Aust Army 1973-75. If, at the end of that three-year period, we
    had passed all our Trade Modules (Discrete Components all the way up to
    Transistors and Valves)/General Education Subjects/Military
    Training/Physical Training Assessments, there was a two-week period
    where we were given an Introduction to Integrated Circuits ....
    And/Or/Nand/Nor ... up to four gates per IC!!

    High Tech!!

    As for Eliza, there were CP/M versions running on 8080 and Z80 machines in the late '70s.

    Studied 8085/Z80/6809 programming 1989 --> 1991.
    --
    Daniel70
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From rbowman@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Sat Mar 7 19:25:18 2026
    From Newsgroup: alt.os.linux

    On Fri, 6 Mar 2026 22:40:12 +0100, Carlos E.R. wrote:

    Once I built a 1 bit adder with relays, just for fun. Nobody appreciated
    the fun, the IBM PC clone era was in full blast.


    https://codehiddenlanguage.com/Chapter08/
    https://codehiddenlanguage.com/Chapter14/

    Note that the pages are interactive. In chapter 8 he relates relay
    circuits to the logic symbols. In the text he says you could build an
    adder with relays -- a lot of relays.

    It's an interesting book by Charles Petzold, who wrote many Windows
    programming books. It reads like it was written for young adults but some
    of the concepts get pretty deep.
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From rbowman@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Sat Mar 7 19:57:36 2026
    From Newsgroup: alt.os.linux

    On Sat, 7 Mar 2026 02:02:29 -0500, Paul wrote:

    Now that's something I've not tried, is using relays for logic.

    That means you must be an expert at designing snubbers then

    You should have gone for broke, and done a 4 bit adder.
    Then your next step would be a calculator made out of relays.

    See my other post in the thread about Petzold's 'Code' and his
    interactive website.

    https://www.allaboutcircuits.com/worksheets/electromechanical-relay-logic/

    The circuits could get complex with inputs from pushbuttons, limit
    switches, electro-mechanical timers, and other hardware. You were building
    a state machine with 120 VAC components.

    Solid state slowly entered the industrial field. Square D, a major
    supplier of switchgear, came out with NORPAK. There was an assortment of
    modules and back planes to mount them in. Programming was done with wire
    jumpers similar to the DuPont wires used with solderless breadboards. They
    were more secure since you used something like an automatic center punch
    to set the taper pin.

    The problem was the NOR gate is the easiest to create with transistors.
    Try designing logic when all you have is NORs and inverters. You start
    talking to yourself.
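
    A quick illustration of what NOR-only design means: NOT, OR and AND can
    all be built from nothing but NOR via the usual De Morgan trick. A small
    Python sketch (just for illustration, not any particular NORPAK module):

        def NOR(a, b):
            return not (a or b)

        def NOT(a):               # a NOR gate with both inputs tied together
            return NOR(a, a)

        def OR(a, b):             # invert the NOR
            return NOT(NOR(a, b))

        def AND(a, b):            # De Morgan: a AND b = NOR(NOT a, NOT b)
            return NOR(NOT(a), NOT(b))

        # sanity check over every input combination
        for a in (False, True):
            for b in (False, True):
                assert AND(a, b) == (a and b)
                assert OR(a, b) == (a or b)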

    Next up was the programmable logic controller (PLC) which is used to this
    day. I never worked with them as I'd moved on to straight 8080/Z80 controllers.

    At the time the interface used the metaphor of relay-based ladder diagrams that were understood by industrial electricians. I was surprised when I interviewed a candidate who had experience with PLCs and he said ladder diagrams are still the most popular interface.
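
    For anyone who hasn't seen one, the classic start/stop "seal-in" rung is
    about the simplest thing a ladder diagram expresses. A rough Python sketch
    of one scan cycle (start, stop and motor are just illustrative names):

        def scan(start, stop, motor):
            # |--[ start ]--+--[/stop]--( motor )--|
            # |--[ motor ]--+
            # Start contact in parallel with the motor's own contact, in
            # series with a normally-closed stop contact, driving the coil.
            return (start or motor) and not stop

        motor = False
        motor = scan(start=True,  stop=False, motor=motor)  # press start -> runs
        motor = scan(start=False, stop=False, motor=motor)  # seal-in holds it on
        motor = scan(start=False, stop=True,  motor=motor)  # press stop -> drops out
        print(motor)  # False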

    Logic is logic. We built plastics molding systems and the hydraulic
    circuitry also implements logic with spool valves, check valves, and so
    forth. Air logic is the same. I've used that for explosive environments
    when you don't want sparks.

    https://fluidpowerjournal.com/design-efficient-air-logic-system/


    Fluidics is similar but can work with no moving parts at all.

    https://en.wikipedia.org/wiki/Fluidics

    It's all logic.

    https://en.wikipedia.org/wiki/Z3_(computer)


    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From rbowman@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Sat Mar 7 20:03:13 2026
    From Newsgroup: alt.os.linux

    On Sat, 7 Mar 2026 20:53:50 +1100, Daniel70 wrote:

    Studied 8085/Z80/6809 programming 1989 --> 1991.

    Kids! The 8080 came out in '74, followed by the Z80 in '76. I mostly
    worked with Z80s. It had a couple of extra instructions but was otherwise
    the same. Of course Intel had a patent on the assembler so Zilog had to do
    it differently.
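
    Same bytes, different words: a few examples (from memory, so don't quote
    me) of the identical opcode spelled the Intel way and the Zilog way, with
    Python just serving as a handy table:

        # opcode byte -> (Intel 8080 mnemonic, Zilog Z80 mnemonic)
        OPCODES = {
            0x78: ("MOV A,B",  "LD A,B"),
            0x3E: ("MVI A,d8", "LD A,n"),
            0xC3: ("JMP a16",  "JP nn"),
            0x76: ("HLT",      "HALT"),
        }

        for byte, (intel, zilog) in OPCODES.items():
            print(f"{byte:02X}: 8080 '{intel}'  ==  Z80 '{zilog}'")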

    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Carlos E.R.@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Sat Mar 7 22:04:28 2026
    From Newsgroup: alt.os.linux

    On 2026-03-07 08:02, Paul wrote:
    On Fri, 3/6/2026 4:40 PM, Carlos E.R. wrote:
    On 2026-03-06 20:11, rbowman wrote:
    On Fri, 06 Mar 2026 11:09:54 -0500, Tim Slattery wrote:

    "Carlos E.R." <[email protected]d> wrote:

      Defence-type usage.

    66 is not WWII times. It is the times of the Apollo missions, which had
    flight computers. Computers did exist, although huge. Early 70s, there
    was a computer room at my father's job. Programmers tried things to find
    out what could be done with a computer.

    I was in Palo Alto (California) High School at that time. The high
    school was right next to the School District offices, so I was able to
    take a computer programming course. We were able to use the school
    district's IBM 1620 to run our programs. A small (for the time) machine,
    and not very powerful, but it got me into programming. IBM 360s and 370s
    were also around at that time, and many governments and companies used
    them. Micro computers that you could own yourself debuted in the 1970s.
    While I learned FORTRAN IV in the mid-60s I didn't have much interest in
    programming until the '70s. I'd worked with industrial control circuitry,
    all relay logic, that slowly went solid state, and ultimately to MCUs. One
    8080 could replace a LOT of octal base relays. Logic is logic.

    Once I built a 1-bit adder with relays, just for fun. Nobody appreciated the fun; the IBM PC clone era was in full blast.


    Now that's something I've not tried, is using relays for logic.

    That means you must be an expert at designing snubbers then :-)

    Oh, not a problem at all on that board. I just used a single small
    breadboard, and miniature relays that could be inserted directly into it.
    And LEDs for lights.


    You should have gone for broke, and done a 4 bit adder.
    Then your next step would be a calculator made out
    of relays.

    I wanted to, but did not have that many relays.

    I lost the schematic, so I don't remember how I did it. I emulated XOR
    gates, I think.
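
    For the record, the usual textbook construction (which may or may not
    match the lost schematic) takes two XORs for the sum and AND/OR for the
    carry. A quick Python sketch:

        def full_adder(a, b, cin):
            s = a ^ b ^ cin                     # sum: two XOR gates
            cout = (a & b) | (cin & (a ^ b))    # carry out: AND/OR of partial terms
            return s, cout

        # check against plain arithmetic for every input combination
        for a in (0, 1):
            for b in (0, 1):
                for cin in (0, 1):
                    s, cout = full_adder(a, b, cin)
                    assert a + b + cin == s + 2 * cout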


    We had a guy at work who liked to design asynchronous
    logic. His circuits always ran faster than everyone
    else's (because... they didn't wait for a clock edge).

    Right!

    But designing those (without computer assistance) is
    a lot of work, as you need cover terms so stuff does
    not glitch. There were even commercial companies
    interested in the idea, but it kinda died out. At least
    he didn't break anything. I could trust him not to
    blow up a project. He wasn't a kook.
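
    A toy example of what those cover terms buy you (the textbook static-1
    hazard, not one of his actual circuits): f = a*b + (not b)*c can glitch
    low when a = c = 1 and b falls, because the inverter on b is a gate delay
    late; the redundant consensus term a*c covers the gap.

        def f_plain(a, b, not_b, c):        # not_b models the (late) inverter output
            return (a and b) or (not_b and c)

        def f_covered(a, b, not_b, c):      # same thing plus the consensus term a*c
            return f_plain(a, b, not_b, c) or (a and c)

        a = c = True
        b, stale_not_b = False, False       # b has just fallen; the inverter hasn't caught up
        print(f_plain(a, b, stale_not_b, c))    # False -> momentary glitch
        print(f_covered(a, b, stale_not_b, c))  # True  -> held up by the cover term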

    *******

    Some engineers are known for their weird fixations with components.
    My manager hired a guy who was mostly non-communicative. I couldn't
    say there was a language barrier, as we never had any conversations
    with him.

    He was given a specification to work with, and he went off to design it.
    Months went by; he was wire-wrapping it in the lab and so on. Well,
    nobody pokes around someone else's design (unless it is design review
    time). And being non-communicative, he wasn't partnered with anyone,
    he didn't ask any questions, and so on. In other words, no one at
    all was able to learn anything about exactly what he was doing.

    So one day, he tells the manager it is finished and it is running
    in the lab. The proof that it is working is one red LED. If the LED
    is lit, it's working. If the LED is off, it's not working. (This is
    a bunch of status circuitry, monitoring logic operation.) Well,
    the thing was built entirely out of hex packs of transistors.
    All the gates (it's a digital logic function) were made from
    transistors. There was no jelly bean logic on the board. There must
    have been hundreds and hundreds of transistors. Then the guy says
    "he's leaving" and he is gone, just like that. We never heard from
    the manager exactly what he thought of this :-) But, another lesson
    learned about handling people.

    !!


    If you're going to make a fetish about designing with relays,
    don't tell anyone :-) And pretend to be non-communicative
    while you're building your contraption. Seems a good strategy.

    LOL :-D
    --
    Cheers, Carlos.
    ES🇪🇸, EU🇪🇺;
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Carlos E.R.@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Sat Mar 7 22:09:08 2026
    From Newsgroup: alt.os.linux

    On 2026-03-07 20:57, rbowman wrote:
    On Sat, 7 Mar 2026 02:02:29 -0500, Paul wrote:

    Now that's something I've not tried, is using relays for logic.

    That means you must be an expert at designing snubbers then

    You should have gone for broke, and done a 4 bit adder.
    Then your next step would be a calculator made out of relays.

    See my other post in the thread about Petzold's 'Code' and his
    interactive website.

    https://www.allaboutcircuits.com/worksheets/electromechanical-relay-logic/

    The circuits could get complex with inputs from pushbuttons, limit
    switches, electro-mechanical timers, and other hardware. You were building
    a state machine with 120 VAC components.

    Solid state slowly entered the industrial field. Square D, a major
    supplier of switchgear, came out with NORPAK. There was an assortment of modules and back planes to mount them in. Programming was done with wire jumpers similar to the DuPont wires used with solderless breadboards. They were more secure since you used something like an automatic center punch to set the taper pin.

    The problem was the NOR gate is the easiest to create with transistors.
    Try designing logic when all you have is NORs and inverters. You start talking to yourself.

    Next up was the programmable logic controller (PLC) which is used to this day. I never worked with them as I'd moved on to straight 8080/Z80 controllers.

    I had a training course with them. Beautiful things.


    At the time the interface used the metaphor of relay-based ladder diagrams that were understood by industrial electricians. I was surprised when I interviewed a candidate who had experience with PLCs and he said ladder diagrams are still the most popular interface.

    Yes, it is curious.


    Logic is logic. We built plastics molding systems and the hydraulic
    circuitry also implements logic with spool valves, check valves, and so forth. Air logic is the same. I've used that for explosive environments
    when you don't want sparks.

    https://fluidpowerjournal.com/design-efficient-air-logic-system/


    At the PLC course I took, on the first day we built a machine with
    compressed air, valves and pistons, to see the idea.



    Fluidics is similar but can work with no moving parts at all.

    https://en.wikipedia.org/wiki/Fluidics

    It's all logic.

    https://en.wikipedia.org/wiki/Z3_(computer)


    --
    Cheers, Carlos.
    ES🇪🇸, EU🇪🇺;
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Carlos E.R.@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Sat Mar 7 22:13:48 2026
    From Newsgroup: alt.os.linux

    On 2026-03-07 20:25, rbowman wrote:
    On Fri, 6 Mar 2026 22:40:12 +0100, Carlos E.R. wrote:

    Once I built a 1-bit adder with relays, just for fun. Nobody appreciated
    the fun; the IBM PC clone era was in full blast.


    https://codehiddenlanguage.com/Chapter08/
    https://codehiddenlanguage.com/Chapter14/

    Note that the pages are interactive. In chapter 8 he relates relay
    circuits to the logic symbols. In the text he says you could build an
    adder with relays -- a lot of relays.

    Interesting!


    It's an interesting book by Charles Petzold, who wrote many Windows
    programming books. It reads like it was written for young adults but some
    of the concepts get pretty deep.

    I remember that name.
    --
    Cheers, Carlos.
    ES🇪🇸, EU🇪🇺;
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Daniel70@[email protected] to alt.os.linux,alt.comp.os.windows-11 on Mon Mar 9 21:26:29 2026
    From Newsgroup: alt.os.linux

    On 8/03/2026 7:03 am, rbowman wrote:
    On Sat, 7 Mar 2026 20:53:50 +1100, Daniel70 wrote:

    Studied 8085/Z80/6809 programming 1989 --> 1991.

    Kids! The 8080 came out in '74, followed by the Z80 in '76. I mostly
    worked with Z80s.

    In 1989, I knew I would be going on Full-Time Schooling to get my
    Associate Diploma of Engineering (Electronics), as it was required for
    further promotion within The Royal Australian Corps of Signals within
    the Australian Army, and I knew one of the subjects taught would be
    Micro-Processors (6809), so I thought I'd get a head start by doing
    the 8085 subject at night school.

    Oww! Oww! I told a lie!! ;-P Read On.

    Then, after I got out of the Army (1993), I did some relief, fill-in
    teaching and THEN was told I'd be teaching the Z80 to the next
    generation of Army Apprentices!!

    It had a couple of extra instructions but was otherwise the same.

    "had a couple of extra instructions"!! Off the top of my head, an 8085
    had, was it, 256 instructions (8-bit Op Code??) whereas the Z80 had 510
    (8-bit Op Code??) instructions (255 basic Op Codes plus a
    secondary/Alternate set of 255 Op Codes).

    The 256th Op Code told the System that the Byte that follows was an
    Alternate Set Op Code.
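
    Roughly that idea, though the real Z80 has several prefix bytes (CB, DD,
    ED, FD), each opening a further table of opcodes. A much-simplified
    Python sketch of the prefix trick, with only a handful of entries filled
    in:

        # Tiny, incomplete opcode tables -- just enough to show the idea.
        MAIN = {0x00: "NOP", 0x76: "HALT", 0x78: "LD A,B"}
        CB   = {0x07: "RLC A", 0x47: "BIT 0,A"}    # rotate/bit instructions

        def decode(byte_stream):
            it = iter(byte_stream)
            for byte in it:
                if byte == 0xCB:                    # prefix byte: the next byte is
                    yield CB.get(next(it), "?")     # looked up in the alternate table
                else:
                    yield MAIN.get(byte, "?")

        print(list(decode([0x00, 0xCB, 0x47, 0x78])))
        # ['NOP', 'BIT 0,A', 'LD A,B']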

    Of course Intel had a patent on the assembler so Zilog had to do it differently.

    Strangely, I thought Intel released the patents/design works for the
    8085/8086/80186/80286/80386/80486 to every manufacturer that wanted
    them so that 'they' could wipe out Apple's uProcessors ..... and then,
    with their Aim effectively achieved, Intel shut the gates so everyone
    HAD to buy their uProcessors from Intel.

    That led AMD and others to develop their own 586's/686's, etc!!

    Or something like that!!
    --
    Daniel70
    --- Synchronet 3.21d-Linux NewsLink 1.2