From the «text is good enough» department:
Title: Using (only) a Linux terminal for my personal computing in 2024 Author: Thom Holwerda
Date: Sun, 24 Nov 2024 22:13:32 +0000
Link: https://www.osnews.com/story/141194/using-only-a-linux-terminal-for-my-personal-computing-in-2024/
A month and a bit ago, I wondered if I could cope with a terminal-only computer[1].
[…]
The only way to really find out was to give it a go.
My goal was to see what it was like to use a terminal-only computer for my personal computing for two weeks, and more if I fancied it.
↫ Neil’s blog[2]
I tried to do this too, once.
Once.
Doing everything from the terminal just isn’t viable for me, mostly because I
didn’t grow up with it. Our family’s first computer ran MS-DOS (with a Windows
3.1 installation we never used), and I’m pretty sure the experience of using
MS-DOS as my first CLI ruined me for life. My mental model for computing didn’t
start forming properly until Windows 95 came out, and as such, computing is inherently graphical for me, and no matter how many amazing CLI and TUI applications are out there – and there are many, many amazing ones – my brain
just isn’t compatible with it.
There are a few tasks I prefer doing with the command line, like updating my computers or editing system files using Nano, but for everything else I’m just
faster and more comfortable with a graphical user interface. This comes down to
not knowing most commands by heart, and often not even knowing the options and
flags for the most basic of commands, meaning even very basic operations that people comfortable using the command line do without even thinking, take me ages.
I’m glad any modern Linux distribution – I use Fedora KDE on all my computers –
offers both paths for almost anything you could do on your computer, and unless
I specifically opt to do so, I literally – literally literally – never have to
touch the command line.
Links:
[1]: https://neilzone.co.uk/2024/10/could-i-cope-with-a-terminal-only-computer/ (link)
[2]: https://neilzone.co.uk/2024/11/using-only-a-linux-terminal-for-my-personal-computing-in-2024/ (link)
Retrograde <fungus@amongus.com.invalid> wrote:
From the «text is good enough» department:
Title: Using (only) a Linux terminal for my personal computing in 2024
Author: Thom Holwerda
Date: Sun, 24 Nov 2024 22:13:32 +0000
Link: https://www.osnews.com/story/141194/using-only-a-linux-terminal-for-my-personal-computing-in-2024/
A month and a bit ago, I wondered if I could cope with a terminal-only
computer[1].
[…]
The only way to really find out was to give it a go.
I am glad you tried, sure it was a nice and very different
experience.
<snip>
Doing everything from the terminal just isn't viable for me,
mostly because I didn't grow up with it.
Fair enough, but at least you tried to see what things were
like for us old people. But yes, big changes like this are
hard to deal with.
I started before DOS existed on minis and I remember when
GUIs became a thing. I had to be dragged kicking and
screaming into that environment :) Still I pretty much live
in Xterms and only need a GUI for browsing and html email.
<snip>
Nice post!
I have been thinking about moving the reading part of web browsing
into the terminal as well, but haven't found a browser I'm happy
with.
Maybe it would be possible to write a kind of "pre-processor" that
formats web sites with a text based browser in mind?
D <nospam@example.net> wrote:
I have been thinking about moving the reading part of web browsing
into the terminal as well, but haven't found a browser I'm happy
with.
I use Elinks, Emacs/EWW and W3m, but none of them can replace the scary fullfat browsers. They seem to just fit Smolweb stuff (FTP, Gemini,
Gopher and similar).
Maybe it would be possible to write a kind of "pre-processor" that
formats web sites with a text based browser in mind?
Despite me finding this solution really scary, something like that
indeed exists:
<https://www.brow.sh/>
Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
Also, running a command line through a GUI terminal emulator lets you
take advantage of cut/copy/paste between windows, which is a feature
not available on a pure-command-line system.
I can still use cut & paste on Linux's "real VTs", but if I were to try
working GUI-free for a while I'd prefer a decorationless fullscreen XTerm
over those, because of easier size switching plus Sixel and Tek 40xx graphics.
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
Also, running a command line through a GUI terminal emulator lets you
take advantage of cut/copy/paste between windows, which is a feature
not available on a pure-command-line system.
The command line is like language. The GUI is like shopping.
Took one look at KDE (shopping) and found twm.
On Tue, 26 Nov 2024, yeti wrote:
D <nospam@example.net> wrote:
I have been thinking about moving the reading part of web browsing
into the terminal as well, but haven't found a browser I'm happy
with.
I use Elinks, Emacs/EWW and W3m, but none of them can replace the scary
fullfat browsers. They seem to just fit Smolweb stuff (FTP, Gemini,
Gopher and similar).
True.
Maybe it would be possible to write a kind of "pre-processor" that
formats web sites with a text based browser in mind?
Despite me finding this solution really scary, something like that
indeed exists:
<https://www.brow.sh/>
Ah yes... I've seen this before! I did drop it due to its dependency on
FF, but the concept is similar. My idea was to aggressively filter a web page before passing it on to elinks or similar.
Perhaps rewriting it a bit in order to avoid the looooooong list of menu options or links that always come up at the top of the page, before the content of the page shows after a couple of page downs (this happens for instance if I go to wikipedia).
Instead parsing it, and adding those links at the bottom, removing javascript, and perhaps passing on only the text. Well, those are only ideas. Maybe I'll try, maybe I won't. Time will tell! =)
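That aggressive pre-filter could be roughed out with plain sed before a page
ever reaches elinks. A minimal sketch -- the `clean_page` name and the sample
pipeline are illustrative, not an existing tool:

```shell
#!/bin/sh
# clean_page: strip <script>...</script> blocks and inline event
# handlers from HTML on stdin, so a text browser only sees the
# markup worth rendering.
clean_page() {
    sed -e '/<script/,/<\/script>/d' \
        -e 's/ on[a-z]*="[^"]*"//g'
}

# Example: filter a page before handing it to a text browser, e.g.
#   curl -s https://example.net/ | clean_page | elinks -dump /dev/stdin
printf '%s\n' '<p onclick="x()">hi</p>' '<script>' 'bad()' '</script>' \
    | clean_page
```

Moving the navigation links to the bottom would need a real HTML parser
rather than sed, which is probably where this stops being a one-liner.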
D <nospam@example.net> wrote:
On Tue, 26 Nov 2024, yeti wrote:
D <nospam@example.net> wrote:
I have been thinking about moving the reading part of web browsing
into the terminal as well, but haven't found a browser I'm happy
with.
I use Elinks, Emacs/EWW and W3m, but none of them can replace the scary
fullfat browsers. They seem to just fit Smolweb stuff (FTP, Gemini,
Gopher and similar).
True.
I like seeing useful images, so prefer Dillo and Links (the latter
does support display via the framebuffer so you can run it
graphically without X).
Maybe it would be possible to write a kind of "pre-processor" that
formats web sites with a text based browser in mind?
Despite me finding this solution really scary, something like that
indeed exists:
<https://www.brow.sh/>
Ah yes... I've seen this before! I did drop it due to its dependency on
FF, but the concept is similar. My idea was to aggressively filter a web
page before passing it on to elinks or similar.
Perhaps rewriting it a bit in order to avoid the looooooong list of menu
options or links that always come up at the top of the page, before the
content of the page shows after a couple of page downs (this happens for
instance if I go to wikipedia).
Lucky if it's just a couple of page-downs; I can easily be
hammering the button on some insane pages where 10% is the actual
content and 90% is menu links. Often it's quicker to press End
and work up from the bottom, but many websites have a few pages of
junk at the bottom too now, so you have to hunt for the little
sliver of content in the middle.
Instead parsing it, and adding those links at the bottom, removing
javascript, and perhaps passing on only the text.
A similar approach is taken by frogfind.com, except rather than
parsing the links and putting them at the end, it deletes them,
which makes it impossible to navigate many websites. It does the
other things you mention, but the link rewriting would probably be
the hardest part to get right with a universal parser.
Site-specific front-ends are a simpler goal. This is a list of ones
that work in Dillo, and therefore without Javascript: https://alex.envs.net/dillectory/
Of course then you have the problem of them breaking as soon as the
target site/API changes or blocks them.
D <nospam@example.net> writes:
On Tue, 26 Nov 2024, yeti wrote:
<https://www.brow.sh/>
Ah yes... I've seen this before! I did drop it due to its dependency on
FF, but the concept is similar. My idea was to aggressively filter a web
page before passing it on to elinks or similar.
Perhaps rewriting it a bit in order to avoid the looooooong list of menu
options or links that always come up at the top of the page, before the
content of the page shows after a couple of page downs (this happens for
instance if I go to wikipedia).
Instead parsing it, and adding those links at the bottom, removing
javascript, and perhaps passing on only the text. Well, those are only
ideas. Maybe I'll try, maybe I won't. Time will tell! =)
I've done this for a few individual sites that I visit frequently.
+ A link to that site resides on my browser's "home" page.
+ That home page is a file in ~/html/ on localhost.
+ The link is actually to a target-specific cgi-bin Perl script on
localhost where Apache is running, restricted to requests from
localhost.
+ The script takes the URL sent from the home page, rewrites it for
the routable net, sends it to the target using wget and reads all
of the returned data into a variable.
+ Using Perl's regular expressions, stuff identified (at time of
writing the script) as unwanted is elided -- js, style, svg,
noscript etc. URLs self-referencing the target are rewritten to
to be sent through the cgi-bin script.
+ Other tweaks peculiar to the specific target...
+ Result is handed back to the browser preceded by minimal HTTP
headers.
So far, works like a charm. Always the potential that a target host
will change their format significantly. That has happened a couple of
times, requiring fetching an unadorned copy of the target's page,
tedious reading/parsing and edit to the script.
This obviously doesn't work for those sites that initially send a
dummy all-js page to verify that you have js enabled, and send you a
condescending reproof if you don't. Other server-side dominance games
are a potential challenge or a stone wall.
Writing a generalized version, capable of dealing with pages from random/arbitrary sites is a notion perhaps worth pursuing but clearly
more of a challenge than site-specific scripts. RSN, round TUIT etc.
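The elide-and-rewrite pass described above can also be sketched in shell;
here TARGET and the /cgi-bin/clean path are made-up placeholders standing in
for the post's site-specific Perl setup:

```shell
#!/bin/sh
# Rough equivalent of the proxy's cleanup pass: delete script/style
# blocks and point the target's self-referencing links back through
# the local cgi-bin script.  TARGET and /cgi-bin/clean are invented
# names for illustration.
TARGET='https://news.example.com'

rewrite_page() {
    sed -e '/<script/,/<\/script>/d' \
        -e '/<style/,/<\/style>/d' \
        -e "s|href=\"$TARGET|href=\"/cgi-bin/clean?url=$TARGET|g"
}

# In the post's setup the input would come from wget, roughly:
#   wget -q -O - "$TARGET/page" | rewrite_page
printf '<a href="%s/a">x</a>\n' "$TARGET" | rewrite_page
```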
On Tue, 27 Nov 2024, Computer Nerd Kev wrote:
D <nospam@example.net> wrote:
On Tue, 26 Nov 2024, yeti wrote:
D <nospam@example.net> wrote:
I have been thinking about moving the reading part of web browsing
into the terminal as well, but haven't found a browser I'm happy
with.
I use Elinks, Emacs/EWW and W3m, but none of them can replace the scary
fullfat browsers. They seem to just fit Smolweb stuff (FTP, Gemini,
Gopher and similar).
True.
I like seeing useful images, so prefer Dillo and Links (the latter
does support display via the framebuffer so you can run it
graphically without X).
For some reason, I never managed to get the framebuffer to work. Have no idea why. =( I would like to get it to work though.
Dillo was a good tip!
I did play with it for a bit, but then forgot about it. Maybe the reason
was a lack of tabs or buffers. I think links or maybe it was elinks, had a way for me to replicate tabs or vi buffers in the browser. It was super convenient!
Links doesn't do tabs, eLinks might
Brilliant! You are a poet Mike!
Frogfind.com was a great start! I would love to have some kind of crowd-sourced html5->html1, minus-javascript, minus-garbage script.
I also wondered if another approach might just be to take the top 500
sites and base it on that? Or even looking through my own history, take
the top 100.
Due to the bad development of the net, it seems like a greater and
greater part of our browsing takes place on an ever smaller number of
sites.
AFAICT, when spidering the net, Google finds the page that *does*
exist, modifies it according to (opaque, unknown) rules of orthography
and delivers that to you.
D <nospam@example.net> wrote:
On Tue, 27 Nov 2024, Computer Nerd Kev wrote:
D <nospam@example.net> wrote:
On Tue, 26 Nov 2024, yeti wrote:
D <nospam@example.net> wrote:
I have been thinking about moving the reading part of web browsing
into the terminal as well, but haven't found a browser I'm happy
with.
I use Elinks, Emacs/EWW and W3m, but none of them can replace the scary
fullfat browsers. They seem to just fit Smolweb stuff (FTP, Gemini,
Gopher and similar).
True.
I like seeing useful images, so prefer Dillo and Links (the latter
does support display via the framebuffer so you can run it
graphically without X).
For some reason, I never managed to get the framebuffer to work. Have no
idea why. =( I would like to get it to work though.
I guess the framebuffer is working for the console, otherwise it
will probably be a low-res BIOS character display like in DOS. So
either a permissions problem or do you know that you need to start
Links with the "-g" option?
Dillo was a good tip!
I did play with it for a bit, but then forgot about it. Maybe the reason
was a lack of tabs or buffers. I think links or maybe it was elinks, had a
way for me to replicate tabs or vi buffers in the browser. It was super
convenient!
Links doesn't do tabs, eLinks might but I haven't used it much.
Dillo has tabs, but isn't great for managing huge numbers of them
(although I avoid trying to do that anywhere).
D <nospam@example.net> writes:
Brilliant! You are a poet Mike!
I'm doubtful that poetry can be done in Perl. Maybe free verse in
Lisp.
Frogfind.com was a great start! I would love to have some kind of crowd
sourced html5->html1 - javascript - garbage script.
Do note that Frogfind delivers URLs that send your click back to
Frogfind to be proxied. I assume that's how you get de-enshitified
pages in response to clicking a link returned from a search.
Here's a curiosity:
Google also sends all of your clicks on search results back through
Google. I assume y'all knew that.
Isn't that weird?
I also wondered if another approach might just be to take the top 500
sites and base it on that? Or even looking through my own history, take
the top 100.
Now there's a project suitable for AI: train the NN to treat a response containing stuff you don't want ever to see as a failure. Grovel repetitively through terabytes of HTML and finally come up with a
generalized filter solution.
Due to the bad development of the net, it seems like a greater and
greater part of our browsing takes place on ever fewer numbers of
sites.
On Wed, 28 Nov 2024, Computer Nerd Kev wrote:
D <nospam@example.net> wrote:
On Tue, 27 Nov 2024, Computer Nerd Kev wrote:
D <nospam@example.net> wrote:
On Tue, 26 Nov 2024, yeti wrote:
I use Elinks, Emacs/EWW and W3m, but none of them can replace the scary
fullfat browsers. They seem to just fit Smolweb stuff (FTP, Gemini,
Gopher and similar).
True.
I like seeing useful images, so prefer Dillo and Links (the latter
does support display via the framebuffer so you can run it
graphically without X).
For some reason, I never managed to get the framebuffer to work. Have no
idea why. =( I would like to get it to work though.
I guess the framebuffer is working for the console, otherwise it
will probably be a low-res BIOS character display like in DOS. So
either a permissions problem or do you know that you need to start
Links with the "-g" option?
Ahh... ok, that might explain it. If it is console only, then it might not work in my terminal emulator, and -g just opens a window in X.
I would have liked for it to show images in the terminal, but maybe I
need to find another terminal emulator for that to work? I think I use
the default one that comes with xfce.
D <nospam@example.net> wrote:
On Wed, 28 Nov 2024, Computer Nerd Kev wrote:
D <nospam@example.net> wrote:
On Tue, 27 Nov 2024, Computer Nerd Kev wrote:
D <nospam@example.net> wrote:
On Tue, 26 Nov 2024, yeti wrote:
I use Elinks, Emacs/EWW and W3m, but none of them can replace the scary
fullfat browsers. They seem to just fit Smolweb stuff (FTP, Gemini,
Gopher and similar).
True.
I like seeing useful images, so prefer Dillo and Links (the latter
does support display via the framebuffer so you can run it
graphically without X).
For some reason, I never managed to get the framebuffer to work. Have no
idea why. =( I would like to get it to work though.
I guess the framebuffer is working for the console, otherwise it
will probably be a low-res BIOS character display like in DOS. So
either a permissions problem or do you know that you need to start
Links with the "-g" option?
Ahh... ok, that might explain it. If it is console only, then it might not
work in my terminal emulator, and -g just opens a window in X.
Certainly, in X it'll always be in a separate window.
I would have liked for it to show images in the terminal, but maybe I
need to find another terminal emulator for that to work? I think I use
the default one that comes with xfce.
W3m displays images in XTerm and other terminal emulators, so that
might be what you want for a browser. I'm not sure if there's a
list of terminal emulators that support image display from it.
This page mentions that some require changes to the configuration: https://wiki.archlinux.org/title/W3m
not@telling.you.invalid (Computer Nerd Kev) wrote:
W3m displays images in XTerm and other terminal emulators, so that
might be what you want for a browser. I'm not sure if there's a
list of terminal emulators that support image display from it.
This page mentions that some require changes to the configuration:
https://wiki.archlinux.org/title/W3m
W3M seems to put another window layer atop the terminal to
display images. It works, but my main use case for W3M is as the man page
viewer W3MMAN (aliased to man), so I don't care much for its image
capabilities.
Elinks has a `./configure` option to enable Sixels, which I did, and I
see the generated binary being linked to `libsixel`, found the run-time option to enable Sixel graphics, but I never see any images displayed.
<https://github.com/rkd77/elinks>
If someone succeeds with this, please ping me.
On Fri, 29 Nov 2024, yeti wrote:
not@telling.you.invalid (Computer Nerd Kev) wrote:
W3m displays images in XTerm and other terminal emulators, so that
might be what you want for a browser. I'm not sure if there's a
list of terminal emulators that support image display from it.
This page mentions that some require changes to the configuration:
https://wiki.archlinux.org/title/W3m
I think W3M seems to put another window layer atop the terminal to
display images. It works, but my main use case for W3M is as man page
viewer W3MMAN (aliased to man), so I don't care much for its image
capabilities.
Elinks has a `./configure` option to enable Sixels, which I did, and I
see the generated binary being linked to `libsixel`, found the run-time
option to enable Sixel graphics, but I never see any images displayed.
<https://github.com/rkd77/elinks>
If someone succeeds with this, please ping me.
Thank you for mentioning it. I will have a look!
On 25 Nov 2024 13:34:25 GMT, Retrograde wrote:
This comes down to not knowing most commands by heart,
and often not even knowing the options and flags for the most basic of
commands ...
Don’t need to. Type “man «cmd»” to see all the details of the options
available for any external command. I do this all the time.
I’m glad any modern Linux distribution – I use Fedora KDE on all my
computers – offers both paths for almost anything you could do on your
computer, and unless I specifically opt to do so, I literally –
literally literally – never have to touch the command line.
Also, running a command line through a GUI terminal emulator lets you take advantage of cut/copy/paste between windows, which is a feature not available on a pure-command-line system.
You can technically emulate that with screen or a similar multiplexer.
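For what it's worth, GNU screen's paste buffer can also be pushed through an
exchange file, which gives a crude copy/paste between windows and even between
separate sessions. A minimal ~/.screenrc sketch (the path shown is screen's
default exchange file):

```
# ~/.screenrc -- share the paste buffer through a file
bufferfile /tmp/screen-exchange

# Workflow:
#   C-a [          enter copy mode; mark text, Enter copies it
#   C-a ]          paste the buffer into the current window
#   C-a :writebuf  dump the buffer to /tmp/screen-exchange
#   C-a :readbuf   load that file back into the buffer (any session)
```

tmux has the analogous save-buffer/load-buffer commands if that's your
multiplexer.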
Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 21:52 this Monday (GMT):
Also, running a command line through a GUI terminal emulator lets you
take advantage of cut/copy/paste between windows, which is a feature
not available on a pure-command-line system.
You can technically emulate that with screen or a similar multiplexer.
candycanearter07 <candycanearter07@candycanearter07.nomail.afraid> wrote:
You can technically emulate that with screen or a similar multiplexer.
Apropos similar: The funniest multiplexer I saw was Neercs.
<https://github.com/cacalabs/neercs>
<https://www.youtube.com/watch?v=7d33Pu2OW7k>
<https://www.youtube.com/watch?v=sQr42LjaNCY>
Was it ever officially finished and released?
On Sat, 30 Nov 2024 01:20:04 -0000 (UTC), candycanearter07 wrote:
Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 21:52 this Monday (GMT):
Also, running a command line through a GUI terminal emulator lets you
take advantage of cut/copy/paste between windows, which is a feature
not available on a pure-command-line system.
You can technically emulate that with screen or a similar multiplexer.
A GUI lets you do that between different apps, not just terminal
emulators, as well.
Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 03:52 this Saturday (GMT):
On Sat, 30 Nov 2024 01:20:04 -0000 (UTC), candycanearter07 wrote:
Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 21:52 this Monday (GMT):
Also, running a command line through a GUI terminal emulator lets you
take advantage of cut/copy/paste between windows, which is a feature
not available on a pure-command-line system.
You can technically emulate that with screen or a similar multiplexer.
A GUI lets you do that between different apps, not just terminal
emulators, as well.
I'm sure you can set something up with xclip if you really need that.
On Sun, 1 Dec 2024 20:40:04 -0000 (UTC), candycanearter07 wrote:
Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 03:52 this Saturday (GMT):
On Sat, 30 Nov 2024 01:20:04 -0000 (UTC), candycanearter07 wrote:
Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 21:52 this Monday (GMT):
Also, running a command line through a GUI terminal emulator lets you
take advantage of cut/copy/paste between windows, which is a feature
not available on a pure-command-line system.
You can technically emulate that with screen or a similar multiplexer.
A GUI lets you do that between different apps, not just terminal
emulators, as well.
I'm sure you can set something up with xclip if you really need that.
But xclip requires a GUI, does it not?
Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 23:24 this Sunday (GMT):
On Sun, 1 Dec 2024 20:40:04 -0000 (UTC), candycanearter07 wrote:
Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 03:52 this Saturday
(GMT):
On Sat, 30 Nov 2024 01:20:04 -0000 (UTC), candycanearter07 wrote:
Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 21:52 this Monday
(GMT):
Also, running a command line through a GUI terminal emulator lets
you take advantage of cut/copy/paste between windows, which is a
feature not available on a pure-command-line system.
You can technically emulate that with screen or a similar
multiplexer.
A GUI lets you do that between different apps, not just terminal
emulators, as well.
I'm sure you can set something up with xclip if you really need that.
But xclip requires a GUI, does it not?
So does running GUI apps.
But Emacs is also TUI, not strictly a terminal program.
On Wed, 4 Dec 2024 06:11:40 -0000 (UTC), Oregonian Haruspex wrote:
But Emacs is also TUI, not strictly a terminal program.
It can display graphics. It has long been able to run under X11. I
currently use a GTK build that works under Wayland.
Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 06:42 this Wednesday
(GMT):
On Wed, 4 Dec 2024 06:11:40 -0000 (UTC), Oregonian Haruspex wrote:
But Emacs is also TUI, not strictly a terminal program.
It can display graphics. It has long been able to run under X11. I
currently use a GTK build that works under Wayland.
But does it support JS?
On Wed, 4 Dec 2024 14:30:03 -0000 (UTC), candycanearter07 wrote:
Lawrence D'Oliveiro <ldo@nz.invalid> wrote at 06:42 this Wednesday
(GMT):
On Wed, 4 Dec 2024 06:11:40 -0000 (UTC), Oregonian Haruspex wrote:
But Emacs is also TUI, not strictly a terminal program.
It can display graphics. It has long been able to run under X11. I
currently use a GTK build that works under Wayland.
But does it support JS?
This being Emacs, the answer would be "very likely".
But ... relevance being?
If I could get Amazon, eBay, and my bank to work properly in
EWW I wouldn't even launch a browser, ever.
I don't know about Emacs, but for TUI browsers with Javascript
support ELinks is one that I'm aware of. However like the
experimental JS support in Netsurf it doesn't seem to be advanced
enough to be useful (although unlike Netsurf, ELinks uses Mozilla's SpiderMonkey JS engine, so I'm not exactly sure what makes it so
difficult to get right).
Computer Nerd Kev <not@telling.you.invalid> wrote:
I don't know about Emacs, but for TUI browsers with Javascript
support ELinks is one that I'm aware of. However like the
experimental JS support in Netsurf it doesn't seem to be advanced
enough to be useful (although unlike Netsurf, ELinks uses Mozilla's
SpiderMonkey JS engine, so I'm not exactly sure what makes it so
difficult to get right).
I regard ELinks as worthless. At best, I hope it is a work in
progress. I haven't tried Netsurf, but I have tried implementing,
via jsdom, specific fetch routines for different sites of interest.
I have found that even sites that contain json data do not provide
consistent (across sites) methods of fetching the data. It is
worse when the data are not as organized as json data, but it is
distributed in unique ways for the specific site.
In my case, I use cwm+uxterm+a bunch of cli/tui apps, such as profanity, catgirl, mocp... and the only X software I use are sxiv, mpv and mupdf.
Oh, and GV for a random PostScript file. That's it.
On 2024-12-08, root <NoEMail@home.org> wrote:
Computer Nerd Kev <not@telling.you.invalid> wrote:
I don't know about Emacs, but for TUI browsers with Javascript
support ELinks is one that I'm aware of. However like the
experimental JS support in Netsurf it doesn't seem to be advanced
enough to be useful (although unlike Netsurf, ELinks uses Mozilla's
SpiderMonkey JS engine, so I'm not exactly sure what makes it so
difficult to get right).
I regard ELinks as worthless. At best, I hope it is a work in
progress. I haven't tried Netsurf, but I have tried implementing,
via jsdom, specific fetch routines for different sites of interest.
I have found that even sites that contain JSON data do not provide
consistent (across sites) methods of fetching it. It is worse when
the data are not organized as JSON but are distributed in ways unique
to the specific site.
Once you get a Gopher/Gemini browser, along with yt-dlp, the web can go away.
Try these under lynx:
gopher://magical.fish
gopher://gopherddit.com
gopher://sdf.org
gopher://hngopher.com
gemini://gemi.dev (head to news waffle)
Magical Fish is a HUGE portal, and even a 386 would be able to use
its services. You get a news source, a translator, stock prices,
weather, Wikipedia over gopher, Gutenberg, torrent search...
Have fun.
Bozo User <anthk@disroot.org> writes:
[...]
In my case, I use cwm+uxterm+a bunch of cli/tui apps, such as profanity,
catgirl, mocp... and the only X software I use are sxiv, mpv and mupdf.
Oh, and GV for a random PostScript file. That's it.
I too run cwm+uxterm! But then I add the GNU EMACS on top.
Thanks for mentioning mupdf---fast and nice. I wonder if it can display
the outline of a pdf (if available).
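As it happens, the outline question can be answered from the terminal too: mupdf ships with a companion tool, mutool, whose `show` command can dump a PDF's table of contents. A hedged sketch, assuming a mupdf build that includes mutool; `document.pdf` is a placeholder filename:

```shell
# mutool ships alongside mupdf; 'show <file> outline' prints the
# document's outline (table of contents), if the file carries one.
mutool show document.pdf outline
```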
On Sun, 12 Jan 2025, Salvador Mirzo wrote:
Bozo User <anthk@disroot.org> writes:
[...]
In my case, I use cwm+uxterm+a bunch of cli/tui apps, such as profanity, catgirl, mocp... and the only X software I use are sxiv, mpv and mupdf.
Oh, and GV for a random PostScript file. That's it.
I too run cwm+uxterm! But then I add the GNU EMACS on top.
Thanks for mentioning mupdf---fast and nice. I wonder if it can display
the outline of a pdf (if available).
I use qpdf. Has sessions, and is fairly light weight.
On Sun, 12 Jan 2025, Bozo User wrote:
Once you get a Gopher/Gemini browser, along with yt-dlp, the web can go away.
Try these under lynx:
gopher://magical.fish
gopher://gopherddit.com
gopher://sdf.org
gopher://hngopher.com
gemini://gemi.dev (head to news waffle)
Magical Fish is a HUGE portal, and even a 386 would be able to use
its services. You get a news source, a translator, stock prices,
weather, Wikipedia over gopher, Gutenberg, torrent search...
Have fun.
I imagine it would be very easy to write scripts to pull in whatever regular www site you might like and move it to gopher.
I will have to remember magical.fish. Gopher works beautifully in links!
D <nospam@example.net> writes:
On Sun, 12 Jan 2025, Salvador Mirzo wrote:
Bozo User <anthk@disroot.org> writes:
[...]
In my case, I use cwm+uxterm+a bunch of cli/tui apps, such as profanity, catgirl, mocp... and the only X software I use are sxiv, mpv and mupdf.
Oh, and GV for a random PostScript file. That's it.
I too run cwm+uxterm! But then I add the GNU EMACS on top.
Thanks for mentioning mupdf---fast and nice. I wonder if it can display the outline of a pdf (if available).
I use qpdf. Has sessions, and is fairly light weight.
Wonderful! Pretty nice as well. Very easy to use. Now, it can't seem
to use lpr for printing? That's how I print. :) But I can work around it
by figuring out how to tell lpr to tell my printer to only print a few
pages I'm interested in and then use the command line. Thanks for
mentioning qpdf.
D <nospam@example.net> wrote:
On Sun, 12 Jan 2025, Bozo User wrote:
Once you get a Gopher/Gemini browser, along with yt-dlp, the web can go away.
Try these under lynx:
gopher://magical.fish
gopher://gopherddit.com
gopher://sdf.org
gopher://hngopher.com
gemini://gemi.dev (head to news waffle)
Magical Fish it's a HUGE portal and even a 386 would be
able to use the services. You have a news source,
a translator, stock prices, weather, wikipedia over gopher,
Gutenberg, torrent search...
Have fun.
I imagine it would be very easy to write scripts to pull in whatever
regular www site you might like and move it to gopher.
If it has a friendly API and that doesn't change every month. I
notice Gopherddit.com is broken, it just says "Subreddit not found"
for everything. Not that I care to read Reddit anyway.
I will have to remember magical.fish. Gopher works beautifully in links!
No Gopher support in Links, I guess you mean ELinks or Lynx.
I use qpdf. Has sessions, and is fairly light weight.
Wonderful! Pretty nice as well. Very easy to use. Now, it can't seem
to use lpr for printing? That's how I print. :) But I can work around it
by figuring out how to tell lpr to tell my printer to only print a few
pages I'm interested in and then use the command line. Thanks for
mentioning qpdf.
Salvador Mirzo <smirzo@example.com> writes:
[...]
I use qpdf. Has sessions, and is fairly light weight.
Wonderful! Pretty nice as well. Very easy to use. Now, it can't seem
to use lpr for printing? That's how I print. :) But I can work around it
by figuring out how to tell lpr to tell my printer to only print a few
pages I'm interested in and then use the command line. Thanks for
mentioning qpdf.
I suspect I imagine wrong how things actually work. I thought perhaps
there would be a command line such as ``lpr --pages 7-14''. Now I
believe a program like evince generates a PostScript of the pages you
asked it to and then sends this complete PostScript document of the
pages you requested to a pipe or file on disk that lpr sends to the
printer.
But I think I can find a program that takes page ranges and
transformations like scaling and produces a PostScript document that
I can send to lpr, so I can use qpdfview and use the command line to
print stuff out.
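For what it's worth, both halves of this can usually be done from the command line without evince in the middle. A hedged sketch, assuming a CUPS print system and, optionally, qpdf or psutils installed; `document.pdf` and `document.ps` are placeholder filenames:

```shell
# CUPS's lpr accepts a page-ranges option directly, so no
# intermediate file is needed (this is CUPS-specific, not BSD lpr):
lpr -o page-ranges=7-14 document.pdf

# Alternatively, extract the pages first with qpdf, then print the result:
qpdf --empty --pages document.pdf 7-14 -- pages-7-14.pdf
lpr pages-7-14.pdf

# For PostScript input, psselect (from psutils) does the page selection:
psselect -p7-14 document.ps | lpr
```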
Salvador Mirzo <smirzo@example.com> writes:
Wonderful! Pretty nice as well. Very easy to use. Now, it can't seem
to use lpr for printing? That's how I print. :) But I can work around it
by figuring out how to tell lpr to tell my printer to only print a few
pages I'm interested in and then use the command line. Thanks for
mentioning qpdf.
I suspect I imagine wrong how things actually work. I thought perhaps
there would be a command line such as ``lpr --pages 7-14''. Now I
believe a program like evince generates a PostScript of the pages you
asked it to and then sends this complete PostScript document of the
pages you requested to a pipe or file on disk that lpr sends to the
printer. So, if qpdf doesn't do the same, I'm out of luck in terms of printing with lpr. But I think I can find a program that takes page
ranges and transformations like scaling and produces a PostScript
document that I can send to lpr, so I can use qpdfview and use the
command line to print stuff out.
On 2025-01-16, Salvador Mirzo wrote:
I suspect I imagine wrong how things actually work. I thought
perhaps there would be a command line such as ``lpr --pages 7-14''.
Now I believe a program like evince generates a PostScript of
the pages you asked it to and then sends this complete PostScript
document of the pages you requested to a pipe or file on disk
that lpr sends to the printer.
So, if qpdf doesn't do the same, I'm out of luck in terms of
printing with lpr. But I think I can find a program that takes
page ranges and transformations like scaling and produces a
PostScript document that I can send to lpr, so I can use qpdfview
and use the command line to print stuff out.