• terminal only for two weeks

    From Retrograde@fungus@amongus.com.invalid to comp.misc on Mon Nov 25 13:34:25 2024
    From Newsgroup: comp.misc

    From the «text is good enough» department:
    Title: Using (only) a Linux terminal for my personal computing in 2024
    Author: Thom Holwerda
    Date: Sun, 24 Nov 2024 22:13:32 +0000
    Link: https://www.osnews.com/story/141194/using-only-a-linux-terminal-for-my-personal-computing-in-2024/


    A month and a bit ago, I wondered if I could cope with a terminal-only computer[1].
    […]

    The only way to really find out was to give it a go.

    My goal was to see what it was like to use a terminal-only computer for my personal computing for two weeks, and more if I fancied it.
    ↫ Neil’s blog[2]

    I tried to do this too, once.

    Once.

    Doing everything from the terminal just isn’t viable for me, mostly because I didn’t grow up with it. Our family’s first computer ran MS-DOS (with a Windows
    3.1 installation we never used), and I’m pretty sure the experience of using MS-DOS as my first CLI ruined me for life. My mental model for computing didn’t
    start forming properly until Windows 95 came out, and as such, computing is inherently graphical for me, and no matter how many amazing CLI and TUI applications are out there – and there are many, many amazing ones – my brain
    just isn’t compatible with it.

    There are a few tasks I prefer doing with the command line, like updating my computers or editing system files using Nano, but for everything else I’m just
    faster and more comfortable with a graphical user interface. This comes down to not knowing most commands by heart, and often not even knowing the options and flags for the most basic of commands, meaning even very basic operations that people comfortable using the command line do without even thinking, take me ages.

    I’m glad any modern Linux distribution – I use Fedora KDE on all my computers –
    offers both paths for almost anything you could do on your computer, and unless I specifically opt to do so, I literally – literally literally – never have to
    touch the command line.

    Links:
    [1]: https://neilzone.co.uk/2024/10/could-i-cope-with-a-terminal-only-computer/ (link)
    [2]: https://neilzone.co.uk/2024/11/using-only-a-linux-terminal-for-my-personal-computing-in-2024/ (link)
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From nospam@nospam@example.net to comp.misc on Mon Nov 25 22:18:21 2024
    From Newsgroup: comp.misc

    This message is in MIME format. The first part should be readable text,
    while the remaining parts are likely unreadable without MIME-aware tools.

    --8323328-2134534386-1732569503=:16130
    Content-Type: text/plain; charset=UTF-8; format=flowed Content-Transfer-Encoding: 8BIT



    On Mon, 25 Nov 2024, Retrograde wrote:

    From the «text is good enough» department:
    Title: Using (only) a Linux terminal for my personal computing in 2024 Author: Thom Holwerda
    Date: Sun, 24 Nov 2024 22:13:32 +0000
    Link: https://www.osnews.com/story/141194/using-only-a-linux-terminal-for-my-personal-computing-in-2024/


    A month and a bit ago, I wondered if I could cope with a terminal-only computer[1].
    […]

    The only way to really find out was to give it a go.

    My goal was to see what it was like to use a terminal-only computer for my personal computing for two weeks, and more if I fancied it.
    ↫ Neil’s blog[2]

    I tried to do this too, once.

    Once.

    Doing everything from the terminal just isn’t viable for me, mostly because I
    didn’t grow up with it. Our family’s first computer ran MS-DOS (with a Windows
    3.1 installation we never used), and I’m pretty sure the experience of using
    MS-DOS as my first CLI ruined me for life. My mental model for computing didn’t
    start forming properly until Windows 95 came out, and as such, computing is inherently graphical for me, and no matter how many amazing CLI and TUI applications are out there – and there are many, many amazing ones – my brain
    just isn’t compatible with it.

    There are a few tasks I prefer doing with the command line, like updating my computers or editing system files using Nano, but for everything else I’m just
    faster and more comfortable with a graphical user interface. This comes down to
    not knowing most commands by heart, and often not even knowing the options and
    flags for the most basic of commands, meaning even very basic operations that people comfortable using the command line do without even thinking, take me ages.

    I’m glad any modern Linux distribution – I use Fedora KDE on all my computers –
    offers both paths for almost anything you could do on your computer, and unless
    I specifically opt to do so, I literally – literally literally – never have to
    touch the command line.

    Links:
    [1]: https://neilzone.co.uk/2024/10/could-i-cope-with-a-terminal-only-computer/ (link)
    [2]: https://neilzone.co.uk/2024/11/using-only-a-linux-terminal-for-my-personal-computing-in-2024/ (link)


    Fascinating experiment. I would not be able to do it. I need a browser to
    run my business, manage my finances etc. so terminal only, while nice,
    would be very difficult without some serious programming and hacking
    around problems.
    --8323328-2134534386-1732569503=:16130--
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Mon Nov 25 21:52:59 2024
    From Newsgroup: comp.misc

    On 25 Nov 2024 13:34:25 GMT, Retrograde wrote:

    This comes down to not knowing most commands by heart,
    and often not even knowing the options and flags for the most basic of commands ...

    Don’t need to. Type “man «cmd»” to see all the details of the options available for any external command. I do this all the time.

    I’m glad any modern Linux distribution – I use Fedora KDE on all my computers – offers both paths for almost anything you could do on your computer, and unless I specifically opt to do so, I literally –
    literally literally – never have to touch the command line.

    Also, running a command line through a GUI terminal emulator lets you take advantage of cut/copy/paste between windows, which is a feature not
    available on a pure-command-line system.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From John McCue@jmccue@qball.jmcunx.com to comp.misc on Tue Nov 26 03:13:32 2024
    From Newsgroup: comp.misc

    Retrograde <fungus@amongus.com.invalid> wrote:
    From the ?text is good enough? department:
    Title: Using (only) a Linux terminal for my personal computing in 2024 Author: Thom Holwerda
    Date: Sun, 24 Nov 2024 22:13:32 +0000
    Link: https://www.osnews.com/story/141194/using-only-a-linux-terminal-for-my-personal-computing-in-2024/


    A month and a bit ago,?I wondered if I could cope with a terminal-only computer[1].
    [?]

    The only way to really find out was to give it a go.

    I am glad you tried, sure it was a nice and very different
    experience.

    <snip>

    Doing everything from the terminal just isn't viable for me,
    mostly because I didn't grow up with it.

    Fair enough, but at least you tried to see what things were
    like for us old people. But yes, big changes like this are
    hard to deal with.

    I started before DOS existed on minis and I remember when
    GUIs became a thing. I had to be dragged kicking and
    screaming into that environment :) Still I pretty much live
    in Xterms and only need a GUI for browsing and html email.

    <snip>

    Nice post!
    --
    csh(1) - "An elegant shell, for a more... civilized age."
    - Paraphrasing Star Wars
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Mike Spencer@mds@bogus.nodomain.nowhere to comp.misc on Tue Nov 26 03:18:45 2024
    From Newsgroup: comp.misc


    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    Also, running a command line through a GUI terminal emulator lets you take advantage of cut/copy/paste between windows, which is a feature not available on a pure-command-line system.

    The command line is like language. The GUI is like shopping.

    Turns out, lots of my highly educated friends aren't all that good
    with language. :-o

    A windowing system is not in itself what most people mean by GUI and
    is, yes, a huge leap forward over plain command-line terminals.

    I do use a GUI browser and, occasionally, a GUI image editing device.
    I can imagine that audio/video editing my work best in a full GUI.

    But my default is a simple window manager (twm) on top of X with
    numerous xterms open or iconified, some running things like dmesg -w,
    one with root access etc.

    I took one look, long ago, at Windows 95 and moved straight to Linux.
    Took one look at KDE (shopping) and found twm.

    FWIW,
    --
    Mike Spencer Nova Scotia, Canada

    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From yeti@yeti@tilde.institute to comp.misc on Tue Nov 26 09:22:50 2024
    From Newsgroup: comp.misc

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote:

    Also, running a command line through a GUI terminal emulator lets you take advantage of cut/copy/paste between windows, which is a feature not available on a pure-command-line system.

    I still can use Cut&Paste on Linux's "real VTs" but I'd prefer a
    decorationless fullscreen XTerm over those if I would try to work
    GUIfree for a while because of easier size switching, Sixel and TeK40xx graphics.

    Screen and Tmux would offer (keyboard driven) Cut&Paste.

    There now may be framebuffer terminals with most of the features of
    XTerm, but testing those still is crying for attention in my eternally
    growing (™Dark Energy Inside!™) to do list. *sigh!*
    --
    1. Hitchhiker 5: (101) "You just come along with me and have a good
    time. The Galaxy's a fun place. You'll need to have this fish in your
    ear."
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From nospam@nospam@example.net to comp.misc on Tue Nov 26 10:22:22 2024
    From Newsgroup: comp.misc



    On Tue, 26 Nov 2024, John McCue wrote:

    Retrograde <fungus@amongus.com.invalid> wrote:
    From the ?text is good enough? department:
    Title: Using (only) a Linux terminal for my personal computing in 2024
    Author: Thom Holwerda
    Date: Sun, 24 Nov 2024 22:13:32 +0000
    Link: https://www.osnews.com/story/141194/using-only-a-linux-terminal-for-my-personal-computing-in-2024/


    A month and a bit ago,?I wondered if I could cope with a terminal-only
    computer[1].
    [?]

    The only way to really find out was to give it a go.

    I am glad you tried, sure it was a nice and very different
    experience.

    <snip>

    Doing everything from the terminal just isn't viable for me,
    mostly because I didn't grow up with it.

    Fair enough, but at least you tried to see what things were
    like for us old people. But yes, big changes like this are
    hard to deal with.

    I started before DOS existed on minis and I remember when
    GUIs became a thing. I had to be dragged kicking and
    screaming into that environment :) Still I pretty much live
    in Xterms and only need a GUI for browsing and html email.

    Through the wonders of alpine, atleast you can do html email in the
    terminal as well! =)

    I use the gui for web browsing, reading pdf:s and libreoffice. The rest
    sits in the terminal (email, programming/scripting, tinkering, reading
    text files).

    I have been thinking about moving the reading part of web browsing into
    the terminal as well, but haven't found a browser I'm happy with. Modern
    web sites tend to become too messed up when viewed in the terminal. Maybe
    it would be possible to write a kind of "pre-processor" that formats web
    sites with a text based browser in mind?

    <snip>

    Nice post!


    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From yeti@yeti@tilde.institute to comp.misc on Tue Nov 26 12:15:23 2024
    From Newsgroup: comp.misc

    D <nospam@example.net> wrote:

    I have been thinking about moving the reading part of web browsing
    into the terminal as well, but haven't found a browser I'm happy
    with.

    I use Elinks, Emacs/EWW and W3m, but none of them can replace the scary
    fullfat browsers. They seem to just fit Smolweb stuff (FTP, Gemini,
    Gopher and similar).

    Maybe it would be possible to write a kind of "pre-processor" that
    formats web sites with a text based browser in mind?

    Despite me finding this solution really scary, something like that
    indeed exists:

    <https://www.brow.sh/>
    --
    4. Hitchhiker 11:
    (72) "Watch the road!'' she yelped.
    (73) "Shit!"
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From nospam@nospam@example.net to comp.misc on Tue Nov 26 16:36:07 2024
    From Newsgroup: comp.misc



    On Tue, 26 Nov 2024, yeti wrote:

    D <nospam@example.net> wrote:

    I have been thinking about moving the reading part of web browsing
    into the terminal as well, but haven't found a browser I'm happy
    with.

    I use Elinks, Emacs/EWW and W3m, but none of them can replace the scary fullfat browsers. They seem to just fit Smolweb stuff (FTP, Gemini,
    Gopher and similar).

    True.

    Maybe it would be possible to write a kind of "pre-processor" that
    formats web sites with a text based browser in mind?

    Despite me finding this solution really scary, something like that
    indeed exists:

    <https://www.brow.sh/>

    Ah yes... I've seen this before! I did drop it due to its dependency on
    FF, but the concept is similar. My idea was to aggressively filter a web
    page before passing it on to elinks or similar.

    Perhaps rewriting it a bit in order to avoid the looooooong list of menu options or links that always come up at the top of the page, before the content of the page shows after a couple of page downs (this happens for instance if I go to wikipedia).

    Instead parsing it, and adding those links at the bottom, removing
    javascript, and perhaps passing on only the text. Well, those are only
    ideas. Maybe I'll try, maybe I won't. Time will tell! =)
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Tue Nov 26 21:24:41 2024
    From Newsgroup: comp.misc

    On Tue, 26 Nov 2024 09:22:50 +0042, yeti wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote:

    Also, running a command line through a GUI terminal emulator lets you
    take advantage of cut/copy/paste between windows, which is a feature
    not available on a pure-command-line system.

    I still can use Cut&Paste on Linux's "real VTs" but I'd prefer a decorationless fullscreen XTerm over those if I would try to work
    GUIfree for a while because of easier size switching, Sixel and TeK40xx graphics.

    But then it becomes difficult to have more than one terminal session open
    at once.

    I typically have about two dozen.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Tue Nov 26 21:28:23 2024
    From Newsgroup: comp.misc

    On 26 Nov 2024 03:18:45 -0400, Mike Spencer wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    Also, running a command line through a GUI terminal emulator lets you
    take advantage of cut/copy/paste between windows, which is a feature
    not available on a pure-command-line system.

    The command line is like language. The GUI is like shopping.

    Did you learn in Comp Sci about the concept of “abstract machines”? To program a computer, you start with the bare hardware, and add layers of software on top of that, each creating a new “abstract machine” that is easier to use for narrower and narrower classes of problems, albeit less flexible than the machine layer below.

    The command line is itself such an abstract machine, and you can create additional layers on top of that by writing shell scripts.

    GUIs, on the other hand, are not suited to having any additional layers
    built on top of them. They are designed to be used by humans, and that’s that. Attempts to automate GUI operations tend not to work very well.

    Took one look at KDE (shopping) and found twm.

    KDE Konsole is probably the most versatile of all the GUI terminal
    emulators.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From not@not@telling.you.invalid (Computer Nerd Kev) to comp.misc on Wed Nov 27 07:52:52 2024
    From Newsgroup: comp.misc

    D <nospam@example.net> wrote:
    On Tue, 26 Nov 2024, yeti wrote:
    D <nospam@example.net> wrote:
    I have been thinking about moving the reading part of web browsing
    into the terminal as well, but haven't found a browser I'm happy
    with.

    I use Elinks, Emacs/EWW and W3m, but none of them can replace the scary
    fullfat browsers. They seem to just fit Smolweb stuff (FTP, Gemini,
    Gopher and similar).

    True.

    I like seeing useful images, so prefer Dillo and Links (the latter
    does support display via the framebuffer so you can run it
    graphically without X).

    Maybe it would be possible to write a kind of "pre-processor" that
    formats web sites with a text based browser in mind?

    Despite me finding this solution really scary, something like that
    indeed exists:

    <https://www.brow.sh/>

    Ah yes... I've seen this before! I did drop it due to its dependency on
    FF, but the concept is similar. My idea was to aggressively filter a web page before passing it on to elinks or similar.

    Perhaps rewriting it a bit in order to avoid the looooooong list of menu options or links that always come up at the top of the page, before the content of the page shows after a couple of page downs (this happens for instance if I go to wikipedia).

    Lucky if it's just a couple of page-downs, I can easily be
    hammering the button on some insane pages where 10% is the actual
    content and 90% is menu links. Often it's quicker to press End
    and work up from the bottom, but many websites have a few pages of
    junk at the bottom too now, so you have to hunt for the little
    sliver of content in the middle.

    Instead parsing it, and adding those links at the bottom, removing javascript, and perhaps passing on only the text.

    A similar approach is taken by frogfind.com, except rather than
    parsing the links and putting them at the end, it detetes them,
    which makes it impossible to navigate many websites. It does the
    other things you mention, but the link rewriting would probably be
    the hardest part to get right with a universal parser.

    Site-specific front-ends are a simpler goal. This is a list of ones
    that work in Dillo, and therefore without Javascript: https://alex.envs.net/dillectory/

    Of course then you have the problem of them breaking as soon as the
    target site/API changes or blocks them.
    --
    __ __
    #_ < |\| |< _#
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Mike Spencer@mds@bogus.nodomain.nowhere to comp.misc on Tue Nov 26 17:57:53 2024
    From Newsgroup: comp.misc


    D <nospam@example.net> writes:

    On Tue, 26 Nov 2024, yeti wrote:

    <https://www.brow.sh/>

    Ah yes... I've seen this before! I did drop it due to its dependency on
    FF, but the concept is similar. My idea was to aggressively filter a web page before passing it on to elinks or similar.

    Perhaps rewriting it a bit in order to avoid the looooooong list of menu options or links that always come up at the top of the page, before the content of the page shows after a couple of page downs (this happens for instance if I go to wikipedia).

    Instead parsing it, and adding those links at the bottom, removing javascript, and perhaps passing on only the text. Well, those are only ideas. Maybe I'll try, maybe I won't. Time will tell! =)

    I've done this for a few individual sites that I visit frequently.

    + A link to that site resides on my browser's "home" page.

    + That home page is a file in ~/html/ on localhost.

    + The link is actually to a target-specific cgi-bin Perl script on
    localhost where Apache is running, restricted to requests from
    localhost.

    + The script takes the URL sent from the home page, rewrites it for
    the routable net, sends it to the target using wget and reads all
    of the returned data into a variable.

    + Using Perl's regular expressions, stuff identified (at time of
    writing the script) as unwanted is elided -- js, style, svg,
    noscript etc. URLs self-referencing the target are rewritten to
    to be sent through the cgi-bin script.

    + Other tweaks peculiar to the specific target...

    + Result is handed back to the browser preceded by minimal HTTP
    headers.

    So far, works like a charm. Always the potential that a target host
    will change their format significantly. That has happened a couple of
    times, requiring fetching an unadorned copy of the target's page,
    tedious reading/parsing and edit to the script.

    This obviously doesn't work for those sites that initially send a
    dummy all-js page to verify that you have js enabled and send you a condescending reproof if you don't. Other server-side dominance games
    a potential challenge or a stone wall.

    Writing a generalized version, capable of dealing with pages from random/arbitrary sites is a notion perhaps worth pursuing but clearly
    more of a challenge than site-specific scripts. RSN, round TUIT etc.
    --
    Mike Spencer Nova Scotia, Canada
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From nospam@nospam@example.net to comp.misc on Wed Nov 27 10:51:41 2024
    From Newsgroup: comp.misc



    On Tue, 27 Nov 2024, Computer Nerd Kev wrote:

    D <nospam@example.net> wrote:
    On Tue, 26 Nov 2024, yeti wrote:
    D <nospam@example.net> wrote:
    I have been thinking about moving the reading part of web browsing
    into the terminal as well, but haven't found a browser I'm happy
    with.

    I use Elinks, Emacs/EWW and W3m, but none of them can replace the scary
    fullfat browsers. They seem to just fit Smolweb stuff (FTP, Gemini,
    Gopher and similar).

    True.

    I like seeing useful images, so prefer Dillo and Links (the latter
    does support display via the framebuffer so you can run it
    graphically without X).

    For some reason, I never managed to get the framebuffer to work. Have no
    idea why. =( I would like to get it to work though. Dillo was a good tip!
    I did play with it for a bit, but then forgot about it. Maybe the reason
    was a lack of tabs or buffers. I think links or maybe it was elinks, had a
    way for me to replicate tabs or vi buffers in the browser. It was super convenient!

    Basically my ideal would be to move all my "reading" to a text based
    browser, so that I would only have to keep work related stuff in the
    massive GUI browser. All the other 60+ tabs, would live in the text
    browser where I would reference them when needed.

    Maybe it would be possible to write a kind of "pre-processor" that
    formats web sites with a text based browser in mind?

    Despite me finding this solution really scary, something like that
    indeed exists:

    <https://www.brow.sh/>

    Ah yes... I've seen this before! I did drop it due to its dependency on
    FF, but the concept is similar. My idea was to aggressively filter a web
    page before passing it on to elinks or similar.

    Perhaps rewriting it a bit in order to avoid the looooooong list of menu
    options or links that always come up at the top of the page, before the
    content of the page shows after a couple of page downs (this happens for
    instance if I go to wikipedia).

    Lucky if it's just a couple of page-downs, I can easily be
    hammering the button on some insane pages where 10% is the actual
    content and 90% is menu links. Often it's quicker to press End
    and work up from the bottom, but many websites have a few pages of
    junk at the bottom too now, so you have to hunt for the little
    sliver of content in the middle.

    I know... as a perfectionist this does not go down well with me. ;)

    Instead parsing it, and adding those links at the bottom, removing
    javascript, and perhaps passing on only the text.

    A similar approach is taken by frogfind.com, except rather than
    parsing the links and putting them at the end, it detetes them,
    which makes it impossible to navigate many websites. It does the
    other things you mention, but the link rewriting would probably be
    the hardest part to get right with a universal parser.

    Did not know about frogfind! This could be a great start to improve the readability! In my home brew rss2email script, I automatically create archive.is links, so that when I want to read articles behind paywalls, archive.is is already built in.

    I imagine that I could whip up something similar, running page through http://frogfind.com/read.php?a=xyz... !

    Site-specific front-ends are a simpler goal. This is a list of ones
    that work in Dillo, and therefore without Javascript: https://alex.envs.net/dillectory/

    Of course then you have the problem of them breaking as soon as the
    target site/API changes or blocks them.

    This is the truth!
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From nospam@nospam@example.net to comp.misc on Wed Nov 27 10:54:50 2024
    From Newsgroup: comp.misc



    On Tue, 26 Nov 2024, Mike Spencer wrote:


    D <nospam@example.net> writes:

    On Tue, 26 Nov 2024, yeti wrote:

    <https://www.brow.sh/>

    Ah yes... I've seen this before! I did drop it due to its dependency on
    FF, but the concept is similar. My idea was to aggressively filter a web
    page before passing it on to elinks or similar.

    Perhaps rewriting it a bit in order to avoid the looooooong list of menu
    options or links that always come up at the top of the page, before the
    content of the page shows after a couple of page downs (this happens for
    instance if I go to wikipedia).

    Instead parsing it, and adding those links at the bottom, removing
    javascript, and perhaps passing on only the text. Well, those are only
    ideas. Maybe I'll try, maybe I won't. Time will tell! =)

    I've done this for a few individual sites that I visit frequently.

    + A link to that site resides on my browser's "home" page.

    + That home page is a file in ~/html/ on localhost.

    + The link is actually to a target-specific cgi-bin Perl script on
    localhost where Apache is running, restricted to requests from
    localhost.

    + The script takes the URL sent from the home page, rewrites it for
    the routable net, sends it to the target using wget and reads all
    of the returned data into a variable.

    + Using Perl's regular expressions, stuff identified (at time of
    writing the script) as unwanted is elided -- js, style, svg,
    noscript etc. URLs self-referencing the target are rewritten to
    to be sent through the cgi-bin script.

    + Other tweaks peculiar to the specific target...

    + Result is handed back to the browser preceded by minimal HTTP
    headers.

    So far, works like a charm. Always the potential that a target host
    will change their format significantly. That has happened a couple of
    times, requiring fetching an unadorned copy of the target's page,
    tedious reading/parsing and edit to the script.

    This obviously doesn't work for those sites that initially send a
    dummy all-js page to verify that you have js enabled and send you a condescending reproof if you don't. Other server-side dominance games
    a potential challenge or a stone wall.

    Writing a generalized version, capable of dealing with pages from random/arbitrary sites is a notion perhaps worth pursuing but clearly
    more of a challenge than site-specific scripts. RSN, round TUIT etc.

    Brilliant! You are a poet Mike!

    Frogfind.com was a great start! I would love to have some kind of crowd sourced html5->html1 - javascript - garbage script.

    I also wondered if another approach might just be to take the top 500
    sites and base it on that? Or even looking through my own history, take
    the top 100.

    Due to the bad development of the net, it seems like a greater and greater part of our browsing takes place on ever fewer numbers of sites.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From not@not@telling.you.invalid (Computer Nerd Kev) to comp.misc on Thu Nov 28 06:44:48 2024
    From Newsgroup: comp.misc

    D <nospam@example.net> wrote:
    On Tue, 27 Nov 2024, Computer Nerd Kev wrote:
    D <nospam@example.net> wrote:
    On Tue, 26 Nov 2024, yeti wrote:
    D <nospam@example.net> wrote:
    I have been thinking about moving the reading part of web browsing
    into the terminal as well, but haven't found a browser I'm happy
    with.

    I use Elinks, Emacs/EWW and W3m, but none of them can replace the scary >>>> fullfat browsers. They seem to just fit Smolweb stuff (FTP, Gemini,
    Gopher and similar).

    True.

    I like seeing useful images, so prefer Dillo and Links (the latter
    does support display via the framebuffer so you can run it
    graphically without X).

    For some reason, I never managed to get the framebuffer to work. Have no idea why. =( I would like to get it to work though.

    I guess the framebuffer is working for the console, otherwise it
    will probably be a low-res BIOS character display like in DOS. So
    either a permissions problem or do you know that you need to start
    Links with the "-g" option?

    Dillo was a good tip!
    I did play with it for a bit, but then forgot about it. Maybe the reason
    was a lack of tabs or buffers. I think links or maybe it was elinks, had a way for me to replicate tabs or vi buffers in the browser. It was super convenient!

    Links doesn't do tabs, eLinks might but I haven't used it much.
    Dillo has tabs, but isn't great for managing huge numbers of them
    (although I avoid trying to do that anywhere).
    --
    __ __
    #_ < |\| |< _#
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From yeti@yeti@tilde.institute to comp.misc on Thu Nov 28 05:54:00 2024
    From Newsgroup: comp.misc

    not@telling.you.invalid (Computer Nerd Kev) wrote:

    Links doesn't do tabs, eLinks might

    Elinks does.


    ... but now for something completely different:

    Have you seen Twin?

    <https://github.com/cosmos72/twin>
    --
    1. Hitchhiker 5: (101) "You just come along with me and have a good
    time. The Galaxy's a fun place. You'll need to have this fish in your
    ear."
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Mike Spencer@mds@bogus.nodomain.nowhere to comp.misc on Thu Nov 28 01:41:56 2024
    From Newsgroup: comp.misc


    D <nospam@example.net> writes:

    Brilliant! You are a poet Mike!

    I'm doubtful that poetry can be done in Perl. Maybe free verse in
    Lisp.

    Frogfind.com was a great start! I would love to have some kind of crowd sourced html5->html1 - javascript - garbage script.

    Do note that Frogfind delivers URLs that send your click back to
    Frogfind to be proxied. I assume that's how you get de-enshitified
    pages in response to clicking a link returned from a search.

    Here's a curiosity:

    Google also sends all of your clicks on search results back through
    Google. I assume y'all knew that.

    If you search for (say):

    leon "the professional"

    you get:

    https://www.google.com/url?q=https://en.wikipedia.org/wiki/L%25C3%25A9on:_The_Professional&sa=U&ved=2ahUKEwi [snip tracking hentracks/data]

    Note that the "real" URL which Google proposes to proxy for you
    contains non-ASCII characters:

    en.wikipedia.org/wiki/L%25C3%25A9on:_The_Professional

    Wikipedia does *not* *have* a page connected to that URL! But if you
    click the link and send it back through Google, you reach the right
    Wikipedia page that *does* exist:

    en.wikipedia.org/wiki/Leon:_The_Professional

    AFAICT, when spidering the net, Google finds the page that *does*
    exist, modifies it according to (opaque, unknown) rules of orthography
    and delivers that to you. When you send that link back through
    Google, Google silently reverts the imposed orthographic "correction"
    so that the link goes to an existing page.

    Isn't the weird?

    Try it. Copy the "real" URL from such a Google response, eliding
    everything before (and including) "?q=" and after (and including) the
    first "&", paste it into your browser. Wikipedia will politely tell
    you that no such page is available and offer search suggestions.
    Revert the non-ASCII "e with a diacritical mark" to 'e' (mutatis
    mutandem for similar Google "hits") and it will work.

    I also wondered if another approach might just be to take the top 500
    sites and base it on that? Or even looking through my own history, take
    the top 100.

    Now there's a project suitable for AI: train the NN to treat a response containing stuff you don't want ever to see as a failure. Grovel
    repetitively through terabytes of HTML and finally come up with a
    generalized filter solution.

    Due to the bad development of the net, it seems like a greater and
    greater part of our browsing takes place on ever fewer numbers of
    sites.
    --
    Mike Spencer Nova Scotia, Canada
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Thu Nov 28 06:42:10 2024
    From Newsgroup: comp.misc

    On 28 Nov 2024 01:41:56 -0400, Mike Spencer wrote:

    AFAICT, when spidering the net, Google finds the page that *does*
    exist, modifies it according to (opaque, unknown) rules of orthography
    and delivers that to you.

    It adds an entirely unnecessary extra level of URL quoting.

    Trying your example through a redirection-removal script I hacked
    together:

    ldo@theon:unredirect> ./unredirect 'https://www.google.com/url?q=https://en.wikipedia.org/wiki/L%25C3%25A9on:_The_Professional&sa=U&ved=2ahUKEwi'
    https://en.wikipedia.org/wiki/L%25C3%25A9on:_The_Professional

    Wrong.

    ldo@theon:unredirect> ./unredirect --unquote 'https://www.google.com/url?q=https://en.wikipedia.org/wiki/L%25C3%25A9on:_The_Professional&sa=U&ved=2ahUKEwi'
    https://en.wikipedia.org/wiki/L%C3%A9on:_The_Professional

    Right.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From nospam@nospam@example.net to comp.misc on Thu Nov 28 10:52:26 2024
    From Newsgroup: comp.misc



    On Wed, 28 Nov 2024, Computer Nerd Kev wrote:

    D <nospam@example.net> wrote:
    On Tue, 27 Nov 2024, Computer Nerd Kev wrote:
    D <nospam@example.net> wrote:
    On Tue, 26 Nov 2024, yeti wrote:
    D <nospam@example.net> wrote:
    I have been thinking about moving the reading part of web browsing >>>>>> into the terminal as well, but haven't found a browser I'm happy
    with.

    I use Elinks, Emacs/EWW and W3m, but none of them can replace the scary >>>>> fullfat browsers. They seem to just fit Smolweb stuff (FTP, Gemini, >>>>> Gopher and similar).

    True.

    I like seeing useful images, so prefer Dillo and Links (the latter
    does support display via the framebuffer so you can run it
    graphically without X).

    For some reason, I never managed to get the framebuffer to work. Have no
    idea why. =( I would like to get it to work though.

    I guess the framebuffer is working for the console, otherwise it
    will probably be a low-res BIOS character display like in DOS. So
    either a permissions problem or do you know that you need to start
    Links with the "-g" option?

    Ahh... ok, that might explain it. If it is console only, then it might not work in my terminal emulator, and -g just opens a window in X.

    I would have liked for it to shows images in the terminal, but maybe I
    need to find another terminal emulator for that to work? I think I use the default one that comes with xfce.

    Dillo was a good tip!
    I did play with it for a bit, but then forgot about it. Maybe the reason
    was a lack of tabs or buffers. I think links or maybe it was elinks, had a >> way for me to replicate tabs or vi buffers in the browser. It was super
    convenient!

    Links doesn't do tabs, eLinks might but I haven't used it much.
    Dillo has tabs, but isn't great for managing huge numbers of them
    (although I avoid trying to do that anywhere).

    Hmm, I should revisit that. I did manage to hack together something
    similar to buffers, but don't remember at the moment what I did exactly.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From nospam@nospam@example.net to comp.misc on Thu Nov 28 10:56:54 2024
    From Newsgroup: comp.misc



    On Thu, 28 Nov 2024, Mike Spencer wrote:


    D <nospam@example.net> writes:

    Brilliant! You are a poet Mike!

    I'm doubtful that poetry can be done in Perl. Maybe free verse in
    Lisp.

    Is it true that Lisp is the secret name of god?

    Frogfind.com was a great start! I would love to have some kind of crowd
    sourced html5->html1 - javascript - garbage script.

    Do note that Frogfind delivers URLs that send your click back to
    Frogfind to be proxied. I assume that's how you get de-enshitified
    pages in response to clicking a link returned from a search.

    Yes, I noted that.

    Here's a curiosity:

    Google also sends all of your clicks on search results back through
    Google. I assume y'all knew that.

    Haven't used google in a long time, I use ddg.gg or startpage.com instead.
    As far as I can see based on a quick glance, they do no rewrites of the
    urls.

    Isn't the weird?

    I imagine it is done to record it and to help build your profile somehow, which can then be sold to advertisers?

    I also wondered if another approach might just be to take the top 500
    sites and base it on that? Or even looking through my own history, take
    the top 100.

    Now there's a project suitable for AI: train the NN to treat a response containing stuff you don't want ever to see as a failure. Grovel repetitively through terabytes of HTML and finally come up with a
    generalized filter solution.

    Maybe. I would be afraid of it becoming conscious and developing a will of its own! ;)

    Due to the bad development of the net, it seems like a greater and
    greater part of our browsing takes place on ever fewer numbers of
    sites.


    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Anssi Saari@anssi.saari@usenet.mail.kapsi.fi to comp.misc on Thu Nov 28 12:45:46 2024
    From Newsgroup: comp.misc

    Retrograde <fungus@amongus.com.invalid> writes:

    Doing everything from the terminal just isn’t viable for me, mostly because I
    didn’t grow up with it.

    I guess I was lucky, I was exposed to a bewildering variety of computers
    as I grew up in the 80s. There was the myriad of home computers, a lot
    of Commodores and Speccys but also Sharps and MSXs and whatever. Some
    CP/M machines at school, there were also some early Windows PCs there,
    then the GUIs like Atari ST and Amiga's Workbench, sometimes Macs.

    90s, I went to the University. They had MS-DOS PCs and text terminals
    connected to Unix machines. Some Sun and HP Unix workstations too but
    those were for more advanced students only for which I got access
    later. Funny contrast, in '91 I got a summer job in a university
    department which was all Macs. Looking back, it seems so radical that I
    had dual displays and a "huge" 17" monitor to work on way back
    then. Even if the other display was the minimal one integrated to the
    boxy Mac.

    In the meantime, my home computing went from a Commodore 64 to MS-DOS
    PC, then dual booting that with OS/2 and some Linux experiments. Games
    went to Windows so that MS-DOS became Windows 98 and XP and 7 and
    10. Late 90s Linux experiments became permanent when I learned of Debian Stable. OS/2 disappeared when picking supported hardware for it got too tiresome.

    Work, started mid-90s with Sun Unix workstations until I was kicked to
    Office land. That was an awful time and when I escaped, it's been much
    the same, Windows PC on the desk, Unix and later Linux server
    somewhere. Oh, one job actually provided a Linux workstation under the
    desk in addition to a Windows laptop but that was one time.

    But to the topic, text only in 2024? I don't think so. Web browsing and
    email, just no. Sure I just used Lynx on a Linux server at work to check
    the proxy settings are correct and I do use mutt to teach misses to my
    spam filter but that's pretty much it. For me, the email I get is HTML
    with pictures from commercial sources. Very little personal
    correspondence over email these days and mailing lists I get via NNTP
    and gmane.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From not@not@telling.you.invalid (Computer Nerd Kev) to comp.misc on Fri Nov 29 06:17:21 2024
    From Newsgroup: comp.misc

    D <nospam@example.net> wrote:
    On Wed, 28 Nov 2024, Computer Nerd Kev wrote:
    D <nospam@example.net> wrote:
    On Tue, 27 Nov 2024, Computer Nerd Kev wrote:
    D <nospam@example.net> wrote:
    On Tue, 26 Nov 2024, yeti wrote:
    I use Elinks, Emacs/EWW and W3m, but none of them can replace the scary >>>>>> fullfat browsers. They seem to just fit Smolweb stuff (FTP, Gemini, >>>>>> Gopher and similar).

    True.

    I like seeing useful images, so prefer Dillo and Links (the latter
    does support display via the framebuffer so you can run it
    graphically without X).

    For some reason, I never managed to get the framebuffer to work. Have no >>> idea why. =( I would like to get it to work though.

    I guess the framebuffer is working for the console, otherwise it
    will probably be a low-res BIOS character display like in DOS. So
    either a permissions problem or do you know that you need to start
    Links with the "-g" option?

    Ahh... ok, that might explain it. If it is console only, then it might not work in my terminal emulator, and -g just opens a window in X.

    Certainly, in X it'll always be in a separate window.

    I would have liked for it to shows images in the terminal, but maybe I
    need to find another terminal emulator for that to work? I think I use the default one that comes with xfce.

    W3m displays images in XTerm and other terminal emulators, so that
    might be what you want for a browser. I'm not sure if there's a
    list of terminal emulators that support image display from it.
    This page mentions that some require changes to the configuration: https://wiki.archlinux.org/title/W3m
    --
    __ __
    #_ < |\| |< _#
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From nospam@nospam@example.net to comp.misc on Thu Nov 28 22:05:14 2024
    From Newsgroup: comp.misc



    On Thu, 29 Nov 2024, Computer Nerd Kev wrote:

    D <nospam@example.net> wrote:
    On Wed, 28 Nov 2024, Computer Nerd Kev wrote:
    D <nospam@example.net> wrote:
    On Tue, 27 Nov 2024, Computer Nerd Kev wrote:
    D <nospam@example.net> wrote:
    On Tue, 26 Nov 2024, yeti wrote:
    I use Elinks, Emacs/EWW and W3m, but none of them can replace the scary >>>>>>> fullfat browsers. They seem to just fit Smolweb stuff (FTP, Gemini, >>>>>>> Gopher and similar).

    True.

    I like seeing useful images, so prefer Dillo and Links (the latter
    does support display via the framebuffer so you can run it
    graphically without X).

    For some reason, I never managed to get the framebuffer to work. Have no >>>> idea why. =( I would like to get it to work though.

    I guess the framebuffer is working for the console, otherwise it
    will probably be a low-res BIOS character display like in DOS. So
    either a permissions problem or do you know that you need to start
    Links with the "-g" option?

    Ahh... ok, that might explain it. If it is console only, then it might not >> work in my terminal emulator, and -g just opens a window in X.

    Certainly, in X it'll always be in a separate window.

    I would have liked for it to shows images in the terminal, but maybe I
    need to find another terminal emulator for that to work? I think I use the >> default one that comes with xfce.

    W3m displays images in XTerm and other terminal emulators, so that
    might be what you want for a browser. I'm not sure if there's a
    list of terminal emulators that support image display from it.
    This page mentions that some require changes to the configuration: https://wiki.archlinux.org/title/W3m

    I did go back to play with elinks today, and it does seem like the text
    based browser that gets absolutely closest to what I need with the ability
    to auto save sessions.

    I think that together wish frogfind.com I have found my temporary
    solution for the terminal! It is also trivial to migrate my open "reading tabs" from firefox to elinks by just doing a save all open tabs, and then massaging the exported bookmarks file a bit and then just open all of the sites from the command line. =)
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From yeti@yeti@tilde.institute to comp.misc on Fri Nov 29 02:19:10 2024
    From Newsgroup: comp.misc

    not@telling.you.invalid (Computer Nerd Kev) wrote:

    W3m displays images in XTerm and other terminal emulators, so that
    might be what you want for a browser. I'm not sure if there's a
    list of terminal emulators that support image display from it.
    This page mentions that some require changes to the configuration: https://wiki.archlinux.org/title/W3m

    I think W3M seems to put another window layer atop the terminal to
    display images. It works, but my main use case for W3M is as man page
    viewer W3MMAN (aliased to man), so I don't care much for it's image capabilities.

    Elinks has a `./configure` option to enable Sixels, which I did, and I
    see the generated binary being linked to `libsixel`, found the run-time
    option to enable Sixel graphics, but I never see any images displayed.

    <https://github.com/rkd77/elinks>

    If someone succeeds with this, please ping me.
    --
    Die Partei | Martin Sonneborn | Die Partei
    Die Partei | Gespräch am Küchentisch, Teil II | Die Partei
    Die Partei | <https://www.youtube.com/watch?v=2C21SJd5SVE> | Die Partei
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From nospam@nospam@example.net to comp.misc on Fri Nov 29 10:38:06 2024
    From Newsgroup: comp.misc



    On Fri, 29 Nov 2024, yeti wrote:

    not@telling.you.invalid (Computer Nerd Kev) wrote:

    W3m displays images in XTerm and other terminal emulators, so that
    might be what you want for a browser. I'm not sure if there's a
    list of terminal emulators that support image display from it.
    This page mentions that some require changes to the configuration:
    https://wiki.archlinux.org/title/W3m

    I think W3M seems to put another window layer atop the terminal to
    display images. It works, but my main use case for W3M is as man page
    viewer W3MMAN (aliased to man), so I don't care much for it's image capabilities.

    Elinks has a `./configure` option to enable Sixels, which I did, and I
    see the generated binary being linked to `libsixel`, found the run-time option to enable Sixel graphics, but I never see any images displayed.

    <https://github.com/rkd77/elinks>

    If someone succeeds with this, please ping me.


    Thank you for mentioning it. I will have a look!
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From nospam@nospam@example.net to comp.misc on Fri Nov 29 22:39:12 2024
    From Newsgroup: comp.misc



    On Fri, 29 Nov 2024, D wrote:



    On Fri, 29 Nov 2024, yeti wrote:

    not@telling.you.invalid (Computer Nerd Kev) wrote:

    W3m displays images in XTerm and other terminal emulators, so that
    might be what you want for a browser. I'm not sure if there's a
    list of terminal emulators that support image display from it.
    This page mentions that some require changes to the configuration:
    https://wiki.archlinux.org/title/W3m

    I think W3M seems to put another window layer atop the terminal to
    display images. It works, but my main use case for W3M is as man page
    viewer W3MMAN (aliased to man), so I don't care much for it's image
    capabilities.

    Elinks has a `./configure` option to enable Sixels, which I did, and I
    see the generated binary being linked to `libsixel`, found the run-time
    option to enable Sixel graphics, but I never see any images displayed.

    <https://github.com/rkd77/elinks>

    If someone succeeds with this, please ping me.


    Thank you for mentioning it. I will have a look!


    I tried elinks with frogfind.com and I discovered that the best way to
    kind of replicate buffers are to start elinks with all the sites I have on
    my reading list (elinks $(cat links.txt)). In the links.txt I have
    prefixed all my sites with frogfind.com.

    I then discovered that they all entered the global history file, and in
    that file I can search among the sites.

    So all sites are opened in invisible tabs, and I can search for them in
    either the globalhistory, or I can make sure they are all saved as
    bookmarks, and drop the tabs altogether.

    Frogfind makes it fairly palatable!
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From candycanearter07@candycanearter07@candycanearter07.nomail.afraid to comp.misc on Sat Nov 30 01:20:04 2024
    From Newsgroup: comp.misc

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 21:52 this Monday (GMT):
    On 25 Nov 2024 13:34:25 GMT, Retrograde wrote:

    This comes down to not knowing most commands by heart,
    and often not even knowing the options and flags for the most basic of
    commands ...

    Don’t need to. Type “man «cmd»” to see all the details of the options
    available for any external command. I do this all the time.

    I’m glad any modern Linux distribution – I use Fedora KDE on all my
    computers – offers both paths for almost anything you could do on your
    computer, and unless I specifically opt to do so, I literally –
    literally literally – never have to touch the command line.

    Also, running a command line through a GUI terminal emulator lets you take advantage of cut/copy/paste between windows, which is a feature not available on a pure-command-line system.


    You can technically emulate that with screen or a similar multiplexer.
    --
    user <candycane> is generated from /dev/urandom
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From yeti@yeti@tilde.institute to comp.misc on Sat Nov 30 04:22:57 2024
    From Newsgroup: comp.misc

    candycanearter07 <candycanearter07@candycanearter07.nomail.afraid> wrote:

    You can technically emulate that with screen or a similar multiplexer.

    Apropos similar: The funniest multiplexer I saw was Neercs.

    <https://github.com/cacalabs/neercs>
    <https://www.youtube.com/watch?v=7d33Pu2OW7k>
    <https://www.youtube.com/watch?v=sQr42LjaNCY>

    Was it ever officially finished and released?
    --
    I do not bite, I just want to play.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Sat Nov 30 03:52:19 2024
    From Newsgroup: comp.misc

    On Sat, 30 Nov 2024 01:20:04 -0000 (UTC), candycanearter07 wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 21:52 this Monday (GMT):

    Also, running a command line through a GUI terminal emulator lets you
    take advantage of cut/copy/paste between windows, which is a feature
    not available on a pure-command-line system.

    You can technically emulate that with screen or a similar multiplexer.

    A GUI lets you do that between different apps, not just terminal
    emulators, as well.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From candycanearter07@candycanearter07@candycanearter07.nomail.afraid to comp.misc on Sun Dec 1 20:40:03 2024
    From Newsgroup: comp.misc

    yeti <yeti@tilde.institute> wrote at 03:40 this Saturday (GMT):
    candycanearter07 <candycanearter07@candycanearter07.nomail.afraid> wrote:

    You can technically emulate that with screen or a similar multiplexer.

    Apropos similar: The funniest multiplexer I saw was Neercs.

    <https://github.com/cacalabs/neercs>
    <https://www.youtube.com/watch?v=7d33Pu2OW7k>
    <https://www.youtube.com/watch?v=sQr42LjaNCY>

    Was it ever officially finished and released?


    Honestly, that looks super cool and it's a shame it doesn't seem like it
    was finished.
    --
    user <candycane> is generated from /dev/urandom
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From candycanearter07@candycanearter07@candycanearter07.nomail.afraid to comp.misc on Sun Dec 1 20:40:04 2024
    From Newsgroup: comp.misc

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 03:52 this Saturday (GMT):
    On Sat, 30 Nov 2024 01:20:04 -0000 (UTC), candycanearter07 wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 21:52 this Monday (GMT):

    Also, running a command line through a GUI terminal emulator lets you
    take advantage of cut/copy/paste between windows, which is a feature
    not available on a pure-command-line system.

    You can technically emulate that with screen or a similar multiplexer.

    A GUI lets you do that between different apps, not just terminal
    emulators, as well.


    I'm sure you can set something up with xclip if you really need that.
    --
    user <candycane> is generated from /dev/urandom
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Sun Dec 1 23:24:44 2024
    From Newsgroup: comp.misc

    On Sun, 1 Dec 2024 20:40:04 -0000 (UTC), candycanearter07 wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 03:52 this Saturday (GMT):

    On Sat, 30 Nov 2024 01:20:04 -0000 (UTC), candycanearter07 wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 21:52 this Monday (GMT):

    Also, running a command line through a GUI terminal emulator lets you
    take advantage of cut/copy/paste between windows, which is a feature
    not available on a pure-command-line system.

    You can technically emulate that with screen or a similar multiplexer.

    A GUI lets you do that between different apps, not just terminal
    emulators, as well.

    I'm sure you can set something up with xclip if you really need that.

    But xclip requires a GUI, does it not?
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From candycanearter07@candycanearter07@candycanearter07.nomail.afraid to comp.misc on Mon Dec 2 02:00:03 2024
    From Newsgroup: comp.misc

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 23:24 this Sunday (GMT):
    On Sun, 1 Dec 2024 20:40:04 -0000 (UTC), candycanearter07 wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 03:52 this Saturday (GMT):

    On Sat, 30 Nov 2024 01:20:04 -0000 (UTC), candycanearter07 wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 21:52 this Monday (GMT): >>>>
    Also, running a command line through a GUI terminal emulator lets you >>>>> take advantage of cut/copy/paste between windows, which is a feature >>>>> not available on a pure-command-line system.

    You can technically emulate that with screen or a similar multiplexer.

    A GUI lets you do that between different apps, not just terminal
    emulators, as well.

    I'm sure you can set something up with xclip if you really need that.

    But xclip requires a GUI, does it not?


    So does running GUI apps. For terminal apps, using a multiplexer
    copy/paste should be fine.
    --
    user <candycane> is generated from /dev/urandom
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Mon Dec 2 05:41:24 2024
    From Newsgroup: comp.misc

    On Mon, 2 Dec 2024 02:00:03 -0000 (UTC), candycanearter07 wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 23:24 this Sunday (GMT):
    On Sun, 1 Dec 2024 20:40:04 -0000 (UTC), candycanearter07 wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 03:52 this Saturday
    (GMT):

    On Sat, 30 Nov 2024 01:20:04 -0000 (UTC), candycanearter07 wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 21:52 this Monday
    (GMT):

    Also, running a command line through a GUI terminal emulator lets
    you take advantage of cut/copy/paste between windows, which is a
    feature not available on a pure-command-line system.

    You can technically emulate that with screen or a similar
    multiplexer.

    A GUI lets you do that between different apps, not just terminal
    emulators, as well.

    I'm sure you can set something up with xclip if you really need that.

    But xclip requires a GUI, does it not?

    So does running GUI apps.

    If you’re running a GUI, you might as well use full-function GUI
    cut/copy/paste, which is more general than anything provided within a
    character-based multiplexer, anyway.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Oregonian Haruspex@no_email@invalid.invalid to comp.misc on Wed Dec 4 06:11:40 2024
    From Newsgroup: comp.misc

    Emacs EWW seems to work with a large number of sites these days. I try
    to do everything in Emacs. Of course, for some stuff like shopping and
    banking, a modern (aka bloated) browser is necessary. But Emacs is also
    a TUI, not strictly a terminal program.

    There is something serene about text as your interface. If I could get
    Amazon, eBay, and my bank to work properly in EWW, I wouldn’t even
    launch a browser, ever.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Wed Dec 4 06:42:40 2024
    From Newsgroup: comp.misc

    On Wed, 4 Dec 2024 06:11:40 -0000 (UTC), Oregonian Haruspex wrote:

    But Emacs is also a TUI, not strictly a terminal program.

    It can display graphics. It has long been able to run under X11. I
    currently use a GTK build that works under Wayland.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From candycanearter07@candycanearter07@candycanearter07.nomail.afraid to comp.misc on Wed Dec 4 14:30:03 2024
    From Newsgroup: comp.misc

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 06:42 this Wednesday (GMT):
    On Wed, 4 Dec 2024 06:11:40 -0000 (UTC), Oregonian Haruspex wrote:

    But Emacs is also a TUI, not strictly a terminal program.

    It can display graphics. It has long been able to run under X11. I
    currently use a GTK build that works under Wayland.


    But does it support JS?
    --
    user <candycane> is generated from /dev/urandom
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Thu Dec 5 01:46:43 2024
    From Newsgroup: comp.misc

    On Wed, 4 Dec 2024 14:30:03 -0000 (UTC), candycanearter07 wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 06:42 this Wednesday
    (GMT):

    On Wed, 4 Dec 2024 06:11:40 -0000 (UTC), Oregonian Haruspex wrote:

    But Emacs is also a TUI, not strictly a terminal program.

    It can display graphics. It has long been able to run under X11. I
    currently use a GTK build that works under Wayland.

    But does it support JS?

    This being Emacs, the answer would be “very likely”.

    But ... relevance being?
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From yeti@yeti@tilde.institute to comp.misc on Thu Dec 5 06:34:51 2024
    From Newsgroup: comp.misc

    candycanearter07 <candycanearter07@candycanearter07.nomail.afraid> wrote:

    But does it support JS?

    EWW?

    ------------------------------------------------------------------------
    Although EWW and shr.el do their best to render webpages in GNU Emacs
    some websites use features which can not be properly represented or are
    not implemented (e.g., JavaScript).
    ------------------------------------------------------------------------
    ( (eww.info)Basics )
    --
    I do not bite, I just want to play.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From not@not@telling.you.invalid (Computer Nerd Kev) to comp.misc on Sun Dec 8 07:52:33 2024
    From Newsgroup: comp.misc

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
    On Wed, 4 Dec 2024 14:30:03 -0000 (UTC), candycanearter07 wrote:
    Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 06:42 this Wednesday
    (GMT):
    On Wed, 4 Dec 2024 06:11:40 -0000 (UTC), Oregonian Haruspex wrote:

    But Emacs is also a TUI, not strictly a terminal program.

    It can display graphics. It has long been able to run under X11. I
    currently use a GTK build that works under Wayland.

    But does it support JS?

    This being Emacs, the answer would be "very likely".

    But ... relevance being?

    You snipped the relevance yourself, as usual:

    On Wed, 4 Dec 2024 06:11:40 -0000 (UTC), Oregonian Haruspex wrote:
    If I could get Amazon, eBay, and my bank to work properly in
    EWW I wouldn't even launch a browser, ever.

    I don't know about Emacs, but for TUI browsers with JavaScript
    support, ELinks is one that I'm aware of. However, like the
    experimental JS support in NetSurf, it doesn't seem to be advanced
    enough to be useful (although, unlike NetSurf, ELinks uses Mozilla's
    SpiderMonkey JS engine, so I'm not exactly sure what makes it so
    difficult to get right).
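
    If anyone wants to experiment, ELinks' JS is toggled in elinks.conf
    (assuming a build compiled with ECMAScript support):

      set ecmascript.enable = 1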
    --
    __ __
    #_ < |\| |< _#
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From root@NoEMail@home.org to comp.misc on Sun Dec 8 14:11:04 2024
    From Newsgroup: comp.misc

    Computer Nerd Kev <not@telling.you.invalid> wrote:

    I don't know about Emacs, but for TUI browsers with JavaScript
    support, ELinks is one that I'm aware of. However, like the
    experimental JS support in NetSurf, it doesn't seem to be advanced
    enough to be useful (although, unlike NetSurf, ELinks uses Mozilla's
    SpiderMonkey JS engine, so I'm not exactly sure what makes it so
    difficult to get right).


    I regard ELinks as worthless. At best, I hope it is a work in
    progress. I haven't tried NetSurf, but I have tried implementing,
    via jsdom, specific fetch routines for different sites of interest.
    I have found that even sites that contain JSON data do not provide
    consistent (across sites) methods of fetching the data. It is
    worse when the data are not organized as JSON but are distributed
    in ways unique to the specific site.

    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Bozo User@anthk@disroot.org to comp.misc on Sun Jan 12 23:01:23 2025
    From Newsgroup: comp.misc

    On 2024-12-08, root <NoEMail@home.org> wrote:
    Computer Nerd Kev <not@telling.you.invalid> wrote:

    I don't know about Emacs, but for TUI browsers with JavaScript
    support, ELinks is one that I'm aware of. However, like the
    experimental JS support in NetSurf, it doesn't seem to be advanced
    enough to be useful (although, unlike NetSurf, ELinks uses Mozilla's
    SpiderMonkey JS engine, so I'm not exactly sure what makes it so
    difficult to get right).


    I regard ELinks as worthless. At best, I hope it is a work in
    progress. I haven't tried NetSurf, but I have tried implementing,
    via jsdom, specific fetch routines for different sites of interest.
    I have found that even sites that contain JSON data do not provide
    consistent (across sites) methods of fetching the data. It is
    worse when the data are not organized as JSON but are distributed
    in ways unique to the specific site.


    Once you have a Gopher/Gemini browser, along with yt-dlp, the web can go away.

    Try these under lynx:

    gopher://magical.fish
    gopher://gopherddit.com
    gopher://sdf.org
    gopher://hngopher.com

    gemini://gemi.dev (head to news waffle)
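
    Invocation is as simple as, e.g.:

      $ lynx gopher://magical.fish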

    Magical Fish is a HUGE portal, and even a 386 would be
    able to use its services. You have a news source,
    a translator, stock prices, weather, Wikipedia over Gopher,
    Gutenberg, torrent search...

    Have fun.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Bozo User@anthk@disroot.org to comp.misc on Sun Jan 12 23:01:24 2025
    From Newsgroup: comp.misc

    On 2024-11-25, Retrograde <fungus@amongus.com.invalid> wrote:
    [...]

    In my case, I use cwm+uxterm+a bunch of cli/tui apps, such as profanity,
    catgirl, mocp... and the only X programs I use are sxiv, mpv, and mupdf.
    Oh, and GV for a random PostScript file. That's it.

    If you want, I can post my setup. It's megafast.
    Ah, no, I forgot: xload and xlock, which just lie there.
    Anyway, it's like an advanced terminal from a different future.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Salvador Mirzo@smirzo@example.com to comp.misc on Sun Jan 12 22:03:06 2025
    From Newsgroup: comp.misc

    Bozo User <anthk@disroot.org> writes:

    [...]

    In my case, I use cwm+uxterm+a bunch of cli/tui apps, such as profanity,
    catgirl, mocp... and the only X programs I use are sxiv, mpv, and mupdf.
    Oh, and GV for a random PostScript file. That's it.

    I too run cwm+uxterm! But then I add GNU Emacs on top.

    Thanks for mentioning mupdf---fast and nice. I wonder if it can display
    the outline of a pdf (if available).
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From nospam@nospam@example.net to comp.misc on Mon Jan 13 10:46:53 2025
    From Newsgroup: comp.misc



    On Sun, 12 Jan 2025, Bozo User wrote:

    On 2024-12-08, root <NoEMail@home.org> wrote:
    Computer Nerd Kev <not@telling.you.invalid> wrote:

    I don't know about Emacs, but for TUI browsers with JavaScript
    support, ELinks is one that I'm aware of. However, like the
    experimental JS support in NetSurf, it doesn't seem to be advanced
    enough to be useful (although, unlike NetSurf, ELinks uses Mozilla's
    SpiderMonkey JS engine, so I'm not exactly sure what makes it so
    difficult to get right).


    I regard ELinks as worthless. At best, I hope it is a work in
    progress. I haven't tried NetSurf, but I have tried implementing,
    via jsdom, specific fetch routines for different sites of interest.
    I have found that even sites that contain JSON data do not provide
    consistent (across sites) methods of fetching the data. It is
    worse when the data are not organized as JSON but are distributed
    in ways unique to the specific site.


    Once you have a Gopher/Gemini browser, along with yt-dlp, the web can go away.

    Try these under lynx:

    gopher://magical.fish
    gopher://gopherddit.com
    gopher://sdf.org
    gopher://hngopher.com

    gemini://gemi.dev (head to news waffle)

    Magical Fish is a HUGE portal, and even a 386 would be
    able to use its services. You have a news source,
    a translator, stock prices, weather, Wikipedia over Gopher,
    Gutenberg, torrent search...

    Have fun.

    I imagine it would be very easy to write scripts to pull in whatever
    regular www site you might like and move it to gopher. I found it sad
    that Gemini came into being and split the energies between Gopher and
    Gemini.
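
    Something like this, perhaps (only a sketch; the URL and the gopher
    root are placeholders, assuming curl and lynx are installed):

      $ curl -s https://example.com/article.html | lynx -stdin -dump > /var/gopher/article.txt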

    I will have to remember magical.fish. Gopher works beautifully in links!
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From nospam@nospam@example.net to comp.misc on Mon Jan 13 10:48:17 2025
    From Newsgroup: comp.misc



    On Sun, 12 Jan 2025, Salvador Mirzo wrote:

    Bozo User <anthk@disroot.org> writes:

    [...]

    In my case, I use cwm+uxterm+a bunch of cli/tui apps, such as profanity,
    catgirl, mocp... and the only X programs I use are sxiv, mpv, and mupdf.
    Oh, and GV for a random PostScript file. That's it.

    I too run cwm+uxterm! But then I add GNU Emacs on top.

    Thanks for mentioning mupdf---fast and nice. I wonder if it can display
    the outline of a pdf (if available).


    I use qpdf. It has sessions, and is fairly lightweight.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Salvador Mirzo@smirzo@example.com to comp.misc on Mon Jan 13 16:24:27 2025
    From Newsgroup: comp.misc

    D <nospam@example.net> writes:

    On Sun, 12 Jan 2025, Salvador Mirzo wrote:

    Bozo User <anthk@disroot.org> writes:

    [...]

    In my case, I use cwm+uxterm+a bunch of cli/tui apps, such as profanity,
    catgirl, mocp... and the only X programs I use are sxiv, mpv, and mupdf.
    Oh, and GV for a random PostScript file. That's it.

    I too run cwm+uxterm! But then I add GNU Emacs on top.

    Thanks for mentioning mupdf---fast and nice. I wonder if it can display
    the outline of a pdf (if available).


    I use qpdf. It has sessions, and is fairly lightweight.

    Wonderful! Pretty nice as well. Very easy to use. Now, it can't seem
    to use lpr for printing? That's how I print. :) But I can work around it
    by figuring out how to tell lpr to tell my printer to only print a few
    pages I'm interested in and then use the command line. Thanks for
    mentioning qpdf.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From not@not@telling.you.invalid (Computer Nerd Kev) to comp.misc on Tue Jan 14 06:52:03 2025
    From Newsgroup: comp.misc

    D <nospam@example.net> wrote:
    On Sun, 12 Jan 2025, Bozo User wrote:
    Once you have a Gopher/Gemini browser, along with yt-dlp, the web can go away.

    Try these under lynx:

    gopher://magical.fish
    gopher://gopherddit.com
    gopher://sdf.org
    gopher://hngopher.com

    gemini://gemi.dev (head to news waffle)

    Magical Fish is a HUGE portal, and even a 386 would be
    able to use its services. You have a news source,
    a translator, stock prices, weather, Wikipedia over Gopher,
    Gutenberg, torrent search...

    Have fun.

    I imagine it would be very easy to write scripts to pull in whatever
    regular www site you might like and move it to gopher.

    If it has a friendly API, and that doesn't change every month. I
    notice Gopherddit.com is broken: it just says "Subreddit not found"
    for everything. Not that I care to read Reddit anyway.

    I will have to remember magical.fish. Gopher works beautifully in links!

    No Gopher support in Links; I guess you mean ELinks or Lynx.
    --
    __ __
    #_ < |\| |< _#
    --- Synchronet 3.20a-Linux NewsLink 1.2
  • From nospam@nospam@example.net to comp.misc on Tue Jan 14 18:50:51 2025
    From Newsgroup: comp.misc



    On Mon, 13 Jan 2025, Salvador Mirzo wrote:

    D <nospam@example.net> writes:

    On Sun, 12 Jan 2025, Salvador Mirzo wrote:

    Bozo User <anthk@disroot.org> writes:

    [...]

    In my case, I use cwm+uxterm+a bunch of cli/tui apps, such as profanity,
    catgirl, mocp... and the only X programs I use are sxiv, mpv, and mupdf.
    Oh, and GV for a random PostScript file. That's it.

    I too run cwm+uxterm! But then I add GNU Emacs on top.

    Thanks for mentioning mupdf---fast and nice. I wonder if it can display
    the outline of a pdf (if available).


    I use qpdf. It has sessions, and is fairly lightweight.

    Wonderful! Pretty nice as well. Very easy to use. Now, it can't seem
    to use lpr for printing? That's how I print. :) But I can work around it
    by figuring out how to tell lpr to tell my printer to only print a few
    pages I'm interested in and then use the command line. Thanks for
    mentioning qpdf.


    You're welcome! =)
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From nospam@nospam@example.net to comp.misc on Tue Jan 14 18:54:15 2025
    From Newsgroup: comp.misc



    On Mon, 14 Jan 2025, Computer Nerd Kev wrote:

    D <nospam@example.net> wrote:
    On Sun, 12 Jan 2025, Bozo User wrote:
    Once you have a Gopher/Gemini browser, along with yt-dlp, the web can go away.
    Try these under lynx:

    gopher://magical.fish
    gopher://gopherddit.com
    gopher://sdf.org
    gopher://hngopher.com

    gemini://gemi.dev (head to news waffle)

    Magical Fish is a HUGE portal, and even a 386 would be
    able to use its services. You have a news source,
    a translator, stock prices, weather, Wikipedia over Gopher,
    Gutenberg, torrent search...

    Have fun.

    I imagine it would be very easy to write scripts to pull in whatever
    regular www site you might like and move it to gopher.

    If it has a friendly API, and that doesn't change every month. I
    notice Gopherddit.com is broken: it just says "Subreddit not found"
    for everything. Not that I care to read Reddit anyway.

    I will have to remember magical.fish. Gopher works beautifully in links!

    No Gopher support in Links; I guess you mean ELinks or Lynx.

    This is correct. I meant ELinks. Apologies for the confusion.
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Salvador Mirzo@smirzo@example.com to comp.misc on Wed Jan 15 22:10:38 2025
    From Newsgroup: comp.misc

    Salvador Mirzo <smirzo@example.com> writes:

    [...]

    I use qpdf. It has sessions, and is fairly lightweight.

    Wonderful! Pretty nice as well. Very easy to use. Now, it can't seem
    to use lpr for printing? That's how I print. :) But I can work around it
    by figuring out how to tell lpr to tell my printer to only print a few
    pages I'm interested in and then use the command line. Thanks for
    mentioning qpdf.

    I suspect my mental model of how things actually work is wrong. I thought perhaps
    there would be a command line such as ``lpr --pages 7-14''. Now I
    believe a program like evince generates a PostScript of the pages you
    asked it to and then sends this complete PostScript document of the
    pages you requested to a pipe or file on disk that lpr sends to the
    printer. So, if qpdf doesn't do the same, I'm out of luck in terms of
    printing with lpr. But I think I can find a program that takes page
    ranges and transformations like scaling and produces a PostScript
    document that I can send to lpr, so I can use qpdfview and use the
    command line to print stuff out.
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Rich@rich@example.invalid to comp.misc on Thu Jan 16 04:15:53 2025
    From Newsgroup: comp.misc

    Salvador Mirzo <smirzo@example.com> wrote:
    Salvador Mirzo <smirzo@example.com> writes:

    [...]

    I use qpdf. It has sessions, and is fairly lightweight.

    Wonderful! Pretty nice as well. Very easy to use. Now, it can't seem
    to use lpr for printing? That's how I print. :) But I can work around it
    by figuring out how to tell lpr to tell my printer to only print a few
    pages I'm interested in and then use the command line. Thanks for
    mentioning qpdf.

    I suspect my mental model of how things actually work is wrong. I thought perhaps
    there would be a command line such as ``lpr --pages 7-14''. Now I
    believe a program like evince generates a PostScript of the pages you
    asked it to and then sends this complete PostScript document of the
    pages you requested to a pipe or file on disk that lpr sends to the
    printer.

    Yes, selecting "which pages" happens before the result gets sent to lpr
    (or cups).

    But I think I can find a program that takes page ranges and
    transformations like scaling and produces a PostScript document that
    I can send to lpr, so I can use qpdfview and use the command line to
    print stuff out.

    If you are dealing with PDF files, then pdftk
    <https://en.wikipedia.org/wiki/PDFtk> works very well for doing various
    transforms on PDF files (including selecting a subset of pages, which
    do not all have to be contiguous).

    If you have actual PostScript files, you can use Ghostscript from the
    command line to "distill" them to PDF (note Ghostscript's "pdfwrite"
    output driver) and then use pdftk for further transforming.
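
    For example (file names are placeholders):

      $ pdftk in.pdf cat 7-14 output out.pdf    # keep only pages 7-14
      $ gs -sDEVICE=pdfwrite -o out.pdf in.ps   # distill PS to PDF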

    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Computer Nerd Kev@not@telling.you.invalid to comp.misc on Thu Jan 16 15:58:27 2025
    From Newsgroup: comp.misc

    Salvador Mirzo <smirzo@example.com> wrote:
    Salvador Mirzo <smirzo@example.com> writes:
    Wonderful! Pretty nice as well. Very easy to use. Now, it can't seem
    to use lpr for printing? That's how I print. :) But I can work around it
    by figuring out how to tell lpr to tell my printer to only print a few
    pages I'm interested in and then use the command line. Thanks for
    mentioning qpdf.

    I suspect my mental model of how things actually work is wrong. I
    thought perhaps there would be a command line such as ``lpr --pages
    7-14''. Now I believe a program like evince generates a PostScript of
    the pages you asked it to and then sends this complete PostScript
    document of the pages you requested to a pipe or file on disk that lpr
    sends to the printer. So, if qpdf doesn't do the same, I'm out of luck
    in terms of printing with lpr. But I think I can find a program that
    takes page ranges and transformations like scaling and produces a
    PostScript document that I can send to lpr, so I can use qpdfview and
    use the command line to print stuff out.

    If you want a PostScript file of a page range from a PDF, convert the
    PDF to PostScript first, then use psselect from psutils. Or use the
    "save marked" function in gv, which I personally use as my default
    PDF viewer.
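
    Something along these lines (a sketch, assuming poppler's pdftops
    and psutils are installed):

      $ pdftops in.pdf - | psselect -p7-14 | lpr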
    --
    __ __
    #_ < |\| |< _#
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From yeti@yeti@tilde.institute to comp.misc on Thu Jan 16 11:42:29 2025
    From Newsgroup: comp.misc

    I haven't yet managed to get JS (and Sixels) running with ELinks, but
    there is:

    <https://sr.ht/~bptato/chawan/>

    JS works at least a bit, maybe just enough for Gitea?

    <https://dev1galaxy.org/viewtopic.php?pid=53922#p53922>

    Despite allowing JS and cookies I couldn't use Google[0].

    Fossil's menu does open with JS disabled, but I cannot select stuff in
    there. With JS allowed it doesn't even open.

    <https://www.fossil-scm.org>

    I see frequent changes in Chawan, so maybe this is the one to watch:
    stuff that glitches today may be working tomorrow.

    ____________

    [0]: But meh ... there are alternatives[1].

    [1]: DDG
    <https://duckduckgo.com/>
    FrogFind
    <http://www.frogfind.com/>
    --
    4. Hitchhiker 11:
    (72) "Watch the road!'' she yelped.
    (73) "Shit!"
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Tue Jan 21 05:31:45 2025
    From Newsgroup: comp.misc

    On Wed, 15 Jan 2025 22:10:38 -0300, Salvador Mirzo wrote:

    I thought perhaps there would be a command line such as
    ``lpr --pages 7-14''.

    <https://manpages.debian.org/lp(1)>:

    -P page-list
    Specifies which pages to print in the document. The list can
    contain a list of numbers and ranges (#-#) separated by
    commas, e.g., "1,3-5,16". The page numbers refer to the output
    pages and not the document's original pages - options like
    "number-up" can affect the numbering of the pages.
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Ivan Shmakov@ivan@siamics.netREMOVE.invalid to comp.misc on Thu Jan 23 19:33:36 2025
    From Newsgroup: comp.misc

    On 2025-01-16, Salvador Mirzo wrote:

    I suspect my mental model of how things actually work is wrong. I thought
    perhaps there would be a command line such as ``lpr --pages 7-14''.

    As has already been pointed out in this thread, CUPS, a fairly
    common choice of printer spooler on GNU/Linux systems,
    provides an lp(1) command that does have just such an option.

    Now I believe a program like evince generates a PostScript of
    the pages you asked it to and then sends this complete PostScript
    document of the pages you requested to a pipe or file on disk
    that lpr sends to the printer.

    AIUI, traditional lpd(8) / lpr(1) do require the file to be
    preprocessed in such a way before it is submitted for printing,
    but even then, they do /not/ require the file to be
    PostScript: it's possible to set up the respective filters to
    accept other formats, such as PDF.

    So, if qpdf doesn't do the same, I'm out of luck in terms of
    printing with lpr. But I think I can find a program that takes
    page ranges and transformations like scaling and produces a
    PostScript document that I can send to lpr, so I can use qpdfview
    and use the command line to print stuff out.

    I'm not too familiar with qpdf(1) (and I don't think I've ever
    used qpdfview [*]), but it does have a --pages option. E. g.:

    $ qpdf --empty --pages in.pdf 5-8 -- out.pdf
    $ qpdf in.pdf --pages . 5-8 -- out.pdf

    (The second variant preserves the input document metadata,
    which probably isn't of much use for printing anyway.)

    ... A somewhat little-known fact is that once uncompressed, PDF
    is largely a text file (perhaps unsurprising, given it comes
    from the same company that created PostScript), though employing
    byte offsets rather unrestrictedly.

    qpdf(1) has a --qdf option that undoes compression and annotates
    the file in such a way that the companion fix-qdf program can
    fix the byte offsets, at least in certain cases, thus allowing the
    PDF file to be edited with a text editor. (Though probably using
    a library, such as PDF::API2 for Perl, would be more practical
    than trying to, say, adapt sed(1) for automated edits in this case.)
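
    E. g. (file names being placeholders):

      $ qpdf --qdf in.pdf editable.pdf
      $ vi editable.pdf                 # hand-edit the uncompressed PDF
      $ fix-qdf editable.pdf > out.pdf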

    [*] Given a choice, I tend to prefer HTML. If the document I'm
    interested in is only available in a PDF version, I tend to
    use pdftotext(1). If that fails to produce a legible version,
    I resort to Zathura, preferring it mostly for its UI.
    --- Synchronet 3.20c-Linux NewsLink 1.2