Cross-posted to alt.os.linux and alt.comp.os.windows-11
If I wanted to save a copy of what icons I have on my Computer's Desktop,
I could press "Print Screen" or similar and a 'picture' of my desktop
would be saved somewhere.
However, I've just had one of my sisters ring to say she's coming over
tomorrow afternoon so I can show her how to take a photo/screenshot of
her Mobile Phone's screen.
Is this natively possible? (Don't quote me, but I think her phone is NOT
an iPhone, if it matters.)
(I'm thinking I'll take a photo of her Phone on my Phone and text it to her!)
TIA
Daniel70 wrote:
On most Android phones, press Volume-down and Power simultaneously.
If you want to get fancy, newer Android versions allow capturing an
area larger or smaller than the actual screen (see the icons, like ^/v or
a pencil, that appear at the bottom).
On Tue, 3/3/2026 5:24 AM, Daniel70 wrote:
A Google search here put a Gemini summary at the top of the page.
This is mostly a repeat of the answer Andy gave.
On 3/03/2026 11:32 pm, Paul wrote:
Thank you, Paul. I think I prefer HI (Human Intelligence) rather than AI (Artificial Intelligence). ;-P
On Tue, 03 Mar 2026 21:24:18 +1100, Daniel70 wrote:
I've seen three different and mutually exclusive ways to take screenshots
on Android.
1) press Power and Volume-down simultaneously
2) using the edge of your hand, swipe the screen left to right
3) pull down the "Quick Settings" menu, click the "Take Screenshot" icon
#2 and #3 seem to be device- and Android-version dependent.
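There is also a fourth route if the phone can be connected to a PC with USB debugging enabled: driving Android's standard `screencap` tool over adb. A minimal Python sketch under that assumption (the `adb exec-out screencap -p` invocation is standard Android tooling; the output filename is just an example):

```python
import subprocess

def screencap_cmd():
    # 'adb exec-out screencap -p' streams the screen as a PNG straight
    # to stdout, so no temporary file is left behind on the phone.
    return ["adb", "exec-out", "screencap", "-p"]

def take_screenshot(outfile="screen.png"):
    # Requires a phone with USB debugging enabled and adb on the PATH.
    png = subprocess.run(screencap_cmd(), capture_output=True, check=True).stdout
    with open(outfile, "wb") as f:
        f.write(png)
```

This is handy when you want the screenshot on the PC anyway, rather than texting it off the phone afterwards.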
On 03/03/2026 13.52, Daniel70 wrote:
Sadly, there is still no AI; all you have is Large Language Models
that just generate a flow of text that most likely hangs together,
based on the input the program got.
On 03/03/2026 12:52, Daniel70 wrote:
AI stands for Artificial Insemination. You can't Artificial Intelligence
a cow! :)
On 4/03/2026 8:24 pm, wasbit wrote:
AH!! Of course. Is that why people now speak of "LLM" instead??
On Wed, 3/4/2026 6:12 AM, Daniel70 wrote:
They use LLM-AI to signify "this is not the final or real one".
An AI that achieves Artificial General Intelligence (AGI)
is the one that will join the "exclusive AI club".
The first AI I ran into was a port of ELIZA.
https://en.wikipedia.org/wiki/ELIZA
Initial release 1966
Now, that's just crazy talk. ELIZA must have written that. Nobody could
mistake ELIZA for an AI. If anything, it was intended to show people
what a "machine intelligence" would do. It would write sentences in
response to the sentences you fed it (a big deal at the time). And that
article points out that some people believed it was real.
I think I first ran into Eliza in a collection of CP/M software. I thought
it was a spoof on the Rogers-style therapy that was somewhat popular at
the time.
On 5/03/2026 7:25 am, Paul wrote:
1966?? You have GOT to be joking!! 1996, yeah, maybe, but 1966, No Way!!
rbowman wrote:
    "Tell me more about your mother ..."
On 2026-03-05 09:59, Daniel70 wrote:
I met her in the eighties.
On 2026-03-05, Daniel70 <[email protected]> wrote:
'66? Sounds legit.
There were computers before UNIX, and languages before C.
On Thu, 5 Mar 2026 19:59:14 +1100, Daniel70 wrote:
Way.
On 5/03/2026 11:33 pm, Jasen Betts wrote:
Oh, sure, there were computers back in WWII times .... it's just that I've
never considered them in more widespread than Defence-type usage.
On 2026-03-06 10:00, Daniel70 wrote:
'66 is not WWII times. It is the times of the Apollo missions,
which had flight computers. Computers did exist, although huge. Early
70s, there was a computer room at my father's job.
Programmers tried things to find out what could be done with a
computer.
On 6/03/2026 9:35 pm, Carlos E.R. wrote:
Of course!! Of course!!
Gee Whiz. I just hate it when someone shoots me down in flames ....
soooo easily!!
And another trip down memory lane. As part of my Army Apprenticeship, I
spent time at an Australian Army Signals Unit in 1976.
They were the hub for signals-switching for all Aust Army Units and, for
that purpose, they had S.T.R.a.D. (Signals Reception Transmission and
Distribution), which was (I think) six equipment bays, each ten or
fifteen meters long.
All flashing lights, etc.
"Carlos E.R." <[email protected]d> wrote:
I was in Palo Alto (California) High School at that time. The high
school was right next to the School District offices, so I was able to
take a computer programming course. We were able to use the school
district's IBM 1620 to run our programs. A small (for the time)
machine, and not very powerful, but it got me into programming. IBM
360s and 370s were also around at that time, and many governments and
companies used them. Microcomputers that you could own yourself
debuted in the 1970s.
On Fri, 06 Mar 2026 11:09:54 -0500, Tim Slattery wrote:
While I learned FORTRAN IV in the mid-60s, I didn't have much interest in
programming until the '70s. I'd worked with industrial control circuitry,
all relay logic, that slowly went solid state, and ultimately to MCUs. One
8080 could replace a LOT of octal-base relays. Logic is logic.
On Fri, 6 Mar 2026 20:00:44 +1100, Daniel70 wrote:
https://en.wikipedia.org/wiki/Z3_(computer)
On 2026-03-06 19:51, rbowman wrote:
Nice one. :-)
On 2026-03-06 20:11, rbowman wrote:
Once I built a 1-bit adder with relays, just for fun. Nobody appreciated the fun; the IBM PC clone era was in full blast.
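A relay adder like that boils down to ordinary gate logic, so the wiring can be checked on paper first. A minimal sketch in Python, where each function stands in for a relay contact arrangement (the function names and the 4-bit ripple extension are mine, for illustration):

```python
def XOR(a, b): return a ^ b   # two changeover contacts in opposition
def AND(a, b): return a & b   # two contacts in series
def OR(a, b):  return a | b   # two contacts in parallel

def full_adder(a, b, carry_in):
    # sum = a XOR b XOR carry_in; carry out whenever two of the inputs are 1
    partial = XOR(a, b)
    total = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return total, carry_out

def ripple_add(a_bits, b_bits):
    # Chain 1-bit adders into a ripple-carry adder, least-significant bit first.
    carry, out = 0, []
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out, carry
```

Chaining four of the 1-bit stages this way gives the 4-bit adder; the carry just ripples through, exactly as it would through the relay coils.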
On Fri, 6 Mar 2026 22:03:49 +1100, Daniel70 wrote:
My first programming class was in '66, FORTRAN IV on an IBM System
360/30. There were a few schools like Dartmouth that had started
fledgling CS programs, but this wasn't seen as a career path, only another
tool to be used like our slide rules. There was also an analog computer
lab, just in case. Except in niche applications, analog was on the way out.
Good things never die, and analog is showing some promise in neuromorphic
applications. After all, the brain is an electrochemical analog device.
As for Eliza, there were CP/M versions running on 8080 and Z80 machines in
the late '70s.
Studied 8085/Z80/6809 programming 1989 --> 1991.
On Fri, 3/6/2026 4:40 PM, Carlos E.R. wrote:
Now that's something I've not tried: using relays for logic.
That means you must be an expert at designing snubbers then :-)
You should have gone for broke and done a 4-bit adder.
Then your next step would be a calculator made out of relays.
We had a guy at work who liked to design asynchronous
logic. His circuits always ran faster than everyone
else's (because... they didn't wait for a clock edge).
But designing those (without computer assistance) is
a lot of work, as you need cover terms so stuff does
not glitch. There were even commercial companies
interested in the idea, but it kinda died out. At least
he didn't break anything. I could trust him not to
blow up a project. He wasn't a kook.
*******
Some engineers are known for their weird fixations with components.
My manager hired a guy; he was mostly non-communicative. I couldn't
say there was a language barrier, as we never had any conversations
with him.
He was given a specification to work with, and he went off to design it.
Months went by; he was wire-wrapping it in the lab and so on. Well,
nobody pokes around someone else's design (unless it is design review
time). And being non-communicative, he wasn't partnered with anyone,
he didn't ask any questions, and so on. In other words, no one at
all was able to learn anything about exactly what he was doing.
So one day, he tells the manager it is finished and it is running
in the lab. The proof it is working is one red LED. If the LED
is lit, it's working. If the LED is off, it's not working. (This is
a bunch of status circuitry, monitoring logic operation.) Well,
the thing was built entirely out of hex packs of transistors.
All the gates (it's a digital logic function) were made from transistors.
There was no jelly-bean logic on the board. There must have been
hundreds and hundreds of transistors. Then the guy says "he's leaving",
and he is gone, just like that. We never heard from the manager
exactly what he thought of this :-) But, another lesson learned about
handling people.
If you're going to make a fetish about designing with relays,
don't tell anyone :-) And pretend to be non-communicative
while you're building your contraption. Seems a good strategy.
On Sat, 7 Mar 2026 02:02:29 -0500, Paul wrote:
See my other post in the thread about Petzold's 'Code' and his
interactive website.
https://www.allaboutcircuits.com/worksheets/electromechanical-relay-logic/
The circuits could get complex, with inputs from pushbuttons, limit
switches, electro-mechanical timers, and other hardware. You were building
a state machine with 120 VAC components.
Solid state slowly entered the industrial field. Square D, a major
supplier of switchgear, came out with NORPAK. There was an assortment of
modules and back planes to mount them in. Programming was done with wire
jumpers, similar to the DuPont wires used with solderless breadboards. They
were more secure, since you used something like an automatic center punch
to set the taper pin.
The problem was that the NOR gate is the easiest to create with transistors.
Try designing logic when all you have is NORs and inverters. You start
talking to yourself.
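NOR is, however, functionally complete: every other gate can be derived from it, which is why a NOR-and-inverter kit like NORPAK was workable at all, if maddening. A quick sketch of the derivations (plain 0/1 signals; names are just the usual gate names):

```python
def NOR(a, b):
    # the one primitive gate available
    return 0 if (a or b) else 1

def NOT(a):
    # a NOR a inverts a
    return NOR(a, a)

def OR(a, b):
    # OR is just an inverted NOR
    return NOT(NOR(a, b))

def AND(a, b):
    # De Morgan: a AND b == (NOT a) NOR (NOT b)
    return NOR(NOT(a), NOT(b))
```

Correct, but every AND costs three NOR packages instead of one gate, which is roughly where the talking to yourself begins.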
Next up was the programmable logic controller (PLC), which is used to this
day. I never worked with them, as I'd moved on to straight 8080/Z80
controllers.
At the time, the interface used the metaphor of relay-based ladder diagrams,
which was understood by industrial electricians. I was surprised when I
interviewed a candidate who had experience with PLCs and he said ladder
diagrams are still the most popular interface.
Logic is logic. We built plastics molding systems, and the hydraulic
circuitry also implements logic with spool valves, check valves, and so
forth. Air logic is the same. I've used that for explosive environments
where you don't want sparks.
https://fluidpowerjournal.com/design-efficient-air-logic-system/
Fluidics is similar but can work with no moving parts at all.
https://en.wikipedia.org/wiki/Fluidics
It's all logic.
On Fri, 6 Mar 2026 22:40:12 +0100, Carlos E.R. wrote:
https://codehiddenlanguage.com/Chapter08/
https://codehiddenlanguage.com/Chapter14/
Note that the pages are interactive. In chapter 8 he relates relay
circuits to the logic symbols. In the text he says you could build an
adder with relays -- a lot of relays.
It's an interesting book by Charles Petzold, who wrote many Windows
programming books. It reads like it was written for young adults, but some
of the concepts get pretty deep.
On Sat, 7 Mar 2026 20:53:50 +1100, Daniel70 wrote:
Studied 8085/Z80/6809 programming 1989 --> 1991.
Kids! The 8080 came out in '74, followed by the Z80 in '76. I mostly
worked with Z80s.
It had a couple of extra instructions but was otherwise the same.
Of course Intel had a patent on the assembler, so Zilog had to do it
differently.