.*?<\/div><\/div><\/li>',
dmap => {
## Default Atom
link => 'class=s-item__link href=(.*?)\?.*?>',
title => '
(?:)?(.*?)<\/h3>',
published => 's-item__listingDate">(.*?)<\/span><\/span>',
content => '',
## Custom fields used in applyCustomFormatting()
price => '(.*?)<\/span>',
postage => '(.*?) postage<\/span>',
buyprice => '(.*?)<\/span>',
buyitnow => '(.*?)<\/span>'
}
}
);
```
You must create a profile such as 'ebay_watcher' for each site you
wish to parse data for. This entry must exist in the program itself
and also in the feeds() and parse() functions in ~/.sfeed/sfeedrc,
as per the example. It is important that the 'itm' regular
expression encompasses each item you wish to capture.
*Note:* setting the DEBUG constant to 1 will be useful while
experimenting. Using the Firefox developer tools to view the HTML
layout of a page and select the containers you're interested in is
helpful for creating the regular expressions.
## Default fields
Treat the 'Default Atom' section as the required fields for making a
basic Atom feed. The content regular expression is left blank as it
will be populated later in the script with whatever is matched by
the 'itm' regular expression.
## Custom fields
The 'Custom fields' are totally optional and allow a great deal of
flexibility. You can use the applyCustomFormatting() function to
manipulate any of the captured fields.
The example below shows how dates can be manipulated and how the
title can be amended to include other data:
```
sub applyCustomFormatting
{
my ($src, @entry) = @_;
foreach my $entry (@entry)
{
if ($src =~ /^ebay/)
{
# Fix and format date
my ($date, $time) = split(/ /, $entry->{'published'});
$date .= qx#echo -n `date +"-%Y"`#;
$entry->{'published'} = qx#echo -n `date --date="$date $time" +"%a, %d %b %Y %T %z"`#;
# Append stuff to title
$entry->{'title'} = "[BID $entry->{'price'} $entry->{'postage'}] - $entry->{'title'}";
if ($entry->{'buyitnow'})
{
$entry->{'title'} = "[BUY $entry->{'buyprice'}]$entry->{'title'}";
}
}
}
}
```
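Shelling out to `date` twice per entry works, but the same transformation can be done without external commands. A minimal sketch of the equivalent logic, in Python for illustration, assuming listing dates of the form '21-Mar 14:30' (the exact input format here is an assumption):

```python
from datetime import datetime

def fix_published(published, year=None):
    """Append the (assumed current) year to a year-less listing date
    and re-format it in RFC-822 style, with no external `date` calls."""
    if year is None:
        year = datetime.now().year
    day_month, time_part = published.split(" ")   # e.g. '21-Mar', '14:30'
    dt = datetime.strptime(f"{day_month}-{year} {time_part}", "%d-%b-%Y %H:%M")
    return dt.strftime("%a, %d %b %Y %H:%M:%S")
```

The same approach in Perl could use Time::Piece's strptime/strftime, avoiding two qx# # round-trips per entry.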
[1](https://codemadness.org/sfeed.html)
DOS/Windows:
RAWRITE.EXE
After practically a day of trying, this allowed me to run the utility
to set the date, and then spend several more hours trying to guess
the correct hard drive type from the 47 available options...
### Installing NetBSD
With only a 120mb (unformatted) hard drive and 6mb of RAM I knew that
this would be a challenge. I was not wrong!
NetBSD 4.0.1 was the last release to support 80386 processors. The
minimum requirements were 4mb of RAM and 50mb of hard disk space,
though there is a disclaimer of sorts: "we do not know of anyone
running with a system quite this minimal today." I tried; the memory
requirement was too high...
Going back a little further, to 3.1.1, there is a boot-tiny.fs
floppy image which had a GENERIC_TINY kernel. This booted and got me
to the installation menu. I proceeded through the install, partitioned
the hard disk (manually set the geometry because it was incorrect) and
tried to configure the networking but I could not get the network to
come up. Eventually I gave up and shut down the machine. Shutting down
gave me opportunity to swap out the Kingston KNE2000 for a 3com 3c509.
I tried installing again. This time I succeeded in configuring the
network card and, with some modification of URL paths due to these
old BSDs being archived, I could now install the basic package sets
(base, kernel and etc). I finished up the install and rebooted.
On booting up it just stopped, as if there was no bootloader installed.
I remembered that I had manually altered the drive geometry because it
was reading incorrectly from the BIOS. I thought maybe this was the
problem so I rebooted to perform the install again, this time leaving
the disk as recognized by the BIOS. Unfortunately this reduced the
available drive size to around 70mb which was not enough to do the
install.
I decided to try selecting another drive type using the utility disk
to see if I could find one that showed more space when running the
installer.
The result was much better, with closer to 100mb available without
manually changing the disk geometry. I made a swap partition of 12mb
(twice the ram) and used the rest for /. I proceeded to install the
sets via the network and finished up the install.
As I saw the NetBSD boot loader come up on screen and scroll through
to a login prompt, I was so happy! My efforts had not been in vain and
I had managed to get NetBSD installed on a 386 class computer with only
6mb of ram and in less than 100mb of hard disk space. [8]A success!
There is still some tweaking required but I'm happy with how it's gone
so far. Now I just need to get a dvorak keymap on there...
## Update 24/1/2020
I modified the network configuration to use DHCP. This required setting
dhclient=YES in /etc/rc.conf and modifying the static ifconfig line
in /etc/ifconfig.ep0 to read 'media 10baseT'. I discovered during the
install that media type is not detected automatically on some older
multi-media (BNC, AUI, 10baseT) network cards.
The keymap has been a struggle. Apparently the GENERIC_TINY kernel
doesn't have wsmux, which would ordinarily link to wskbd. The end fix
was 'wsconsctl -f /dev/wskbd0 -k -w encoding=us.dvorak'. I have added
the line to /etc/rc.conf for now to set it at boot.
ksh needed a .kshrc to make arrow keys and command history work:
```
set -o emacs
HOME=/home/
export HOME
HISTFILE=$HOME/.sh_history
export HISTFILE
```
While not essential, the above tweaks have made further configuration
of the system much more efficient.
## Update 30/1/2021
The CF to IDE adaptor arrived today. I connected everything up and
hoped that a CF to SD card adaptor with a 2gb SD card would do for
testing. However, the system refuses to boot with the card fitted.
I have ordered a 512mb CF card to see if it is the card or the adaptor
that is the problem.
## Update 04/2/2021
The 512mb CF card arrived today, so I tested the CF adapter with the
new card. After some time spent trying the various drive types I found
one that seemed to give me close to the full drive capacity (type 41).
After installing DOS it successfully booted from the CF card.
Now that I was sure everything was going to work, I could start work
to mount the CF adapter. The adapter came only with a steel bracket
to mount it into the back of an ATX case. I figured that I could bend
and drill the [9]bracket to fit into the drive rails without drilling
any extra holes. This worked quite well until the point I realized
the power connector was too close to the rail to fit. After some time
thinking of a solution, it struck me that I could unsolder the connector
and place it on the back of the adapter which would make it face away
from the rail. I was pleased with the [10]final assembly.
I then cut one connector off an old HD ribbon cable for a best fit as
the original was too short to reach the CF adapter. I then trial fitted
and tested the adapter in situ. All was well, so I decided to install
Windows 3.11 for workgroups. This was a little tricky as I had to
modify the disk images to add support for the Compaq Plasma screen
and CGA adapter. This makes the installation more straightforward,
though, as you don't have to install drivers from alternative disks
should it need to be reinstalled in future. To complete the retrofit
I installed a Sound Blaster 1.5 that came with the machine, and
installed Prince of Persia and Lemmings, which would allow me to test
the card.
After making an IRQ jumper change everything worked great. I was quite
impressed by the sound from this old card, despite nearly deafening
myself when testing it with earphones!
I reassembled the unit and gave it a final test by playing a few levels
of Lemmings with some powered speakers plugged in. The experience took
me back to my teens when we had an Amstrad PC1512.
## Conclusion
It was nice to have retrofitted the Compaq and have it working again
without harming the integrity of the original system. I had ambitions
when I started the project about maybe using it myself and installing
WIFI and speakers in it. But really it belongs in a museum or with
someone who will use it and appreciate it more. That is why I installed
DOS and Windows 3.11 rather than NetBSD in the end. I set it up for
a new owner rather than for myself as through the process I sadly
realized that I would rarely, if ever, use it.
1. [Compaq Portable 386](gopher://gopherpedia.com/0/Compaq%20Portable%20386)
2. [Cat 5 to WIFI bridge](https://www.amazon.co.uk/Convert-Ethernet-Wireless-Repeater-Multi-Functional/dp/B07PMR673M)
3. [Original battery location](gopher://gopher.icu/I/phlog/Computing/images/old-battery.jpg)
4. [Trial fitting new battery](gopher://gopher.icu/I/phlog/Computing/images/trial-fit.jpg)
5. [Finished assembly](gopher://gopher.icu/I/phlog/Computing/images/new-battery.jpg)
6. [Utility disk programs (3.5" & 5.25")](gopher://gopher.icu/9/files/compaq-portable-386-utils.zip)
7. [3.5" 720k floppy utility disk image](gopher://gopher.icu/9/files/compaq386-cmosdiskimg.zip)
8. [NetBSD Installed](gopher://gopher.icu/I/phlog/Computing/images/install.jpg)
9. [CF Card Adapter Bracket](gopher://gopher.icu/I/phlog/Computing/images/cf-bracket.jpg)
10. [CF Assembly](gopher://gopher.icu/I/phlog/Computing/images/cf-assembly.jpg)
ratpoison
lxterminal -> urxvt
bash -> mksh
Copied over my .mkshrc and .Xresources files.
> A place, cavern, or pit where stones are dug from the earth, or
> separated, as by blasting with gunpowder, from a large mass of rock.
> Hunted or slaughtered game, or any object of eager pursuit.
Quarry contains a number of components:
1. Crawler/indexer (quarry.pl)
2. Gopher search, front end to search index (search.dcgi)
3. Wrapper for quarry.pl to process pending host index requests
(indexPending.pl)
4. Sitemap generator (generateSitemap.pl)
5. Host and selector maintenance (checkHosts.pl)
Requirements:
* Perl
* curl
* MariaDB/MySQL
Try it: gopher://gopher.icu/1/quarry
Get it: git clone git://gopher.icu/quarry
## 1. Crawler/indexer
The indexer will by default visit every link on a gopher site and
store the type, link-title, selector, hostname and port in the
'selectors' table. It will do this only for those types defined in
HARVEST_TYPES.
The robots.txt standard file format is supported and honoured. A
bespoke sitemap file format is also supported and will be used to
populate the database if found.
There are a number of parameters which can be set at the top of
the file to change its behaviour:
DEBUG (Default 1)
Display verbose status messages.
MAX_DEPTH (Default 5)
Defines the maximum number of levels of recursion.
IGNORE_ROBOTS (Default 0)
Ignores robots.txt and any directives therein.
IGNORE_SITEMAP (Default 0)
Ignores the sitemap and instead indexes the site by recursion.
CRAWL_DELAY (Default 2)
Default delay in seconds between requests. This parameter is
overridden by robots.txt if found.
HARVEST_TYPES (Default '10Igs9')
Defines the gopher types captured.
TRAVERSE_DOMAINS (Default 0)
*Best avoided* as sitemap and robots are only parsed for the
start domain. It is better to index each host individually and
use alternative means of host discovery.
REINDEX (Default 0)
Removes all selectors for host before re-indexing.
Usage: quarry.pl some-gopher-domain.net
The port can optionally be specified, e.g. some-gopher-domain.net:7070.
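The interplay of MAX_DEPTH and recursion boils down to a depth check before each descent. A sketch in Python for illustration; `fetch_links` is a hypothetical callback standing in for quarry.pl's actual request and link-extraction code:

```python
MAX_DEPTH = 5

def crawl(selector, fetch_links, depth=0, seen=None):
    """Depth-limited recursive crawl: record each selector once and
    recurse into its children until MAX_DEPTH levels down."""
    if seen is None:
        seen = set()
    if depth > MAX_DEPTH or selector in seen:
        return seen
    seen.add(selector)
    for child in fetch_links(selector):
        crawl(child, fetch_links, depth + 1, seen)
    return seen
```

The `seen` set also protects against link loops, which are common on gopher menus that link back to their parent.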
## 2. Gopher search
This provides a front end to the index generated by quarry.pl.
Features include:
* General search
* Image search
* Sound search
* Video search
* Submit site to be indexed
The current search function is basic and tries to match the search
string against the selector or title fields and returns any that
match. This will change once metadata is added and implemented in
the search.
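That matching amounts to a single LIKE query against the 'selectors' table. A sketch in Python, using the stdlib sqlite3 module as a stand-in for MariaDB; the column names follow the crawler description above:

```python
import sqlite3

# In-memory stand-in for the quarry database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE selectors "
             "(type TEXT, title TEXT, selector TEXT, host TEXT, port INTEGER)")
conn.executemany("INSERT INTO selectors VALUES (?, ?, ?, ?, ?)", [
    ("0", "Gopher search engines", "/phlog/search.md", "gopher.icu", 70),
    ("1", "Photos", "/images", "gopher.icu", 70),
])

def search(conn, term):
    """Match the search string against the selector or title fields."""
    like = f"%{term}%"
    return conn.execute("SELECT host, selector, title FROM selectors "
                        "WHERE title LIKE ? OR selector LIKE ?",
                        (like, like)).fetchall()
```

Parameter binding (the `?` placeholders) matters here, since the search term comes straight from user input.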
## 3. Wrapper
This program simply looks in the 'pending' database table for hosts
submitted to be indexed, via the gopher search front end, and passes
them to quarry.pl to be indexed.
Usage: indexPending.pl
## 4. Sitemap generator
The sitemap generator uses data from the index generated by quarry.pl.
The reasons for the sitemap are twofold:
1. Efficiency: downloading a single index file rather than crawling.
2. The format supports additional metadata:
* Description
* Categories
* Keywords
These extra metadata fields can be used to greatly enhance search
results.
Example of records:
```
Type: 1
Selector: /contact.dcgi
Host: gopher.icu
Port: 70
LinkName: Contact
Description: My contact details
Categories:
Keywords:
--------
Type: 1
Selector: /gutenberg
Host: gopher.icu
Port: 70
LinkName: Gutenberg (unofficial book and audio search interface)
Description: Gopher search interface to the official Gutenberg book
repository
Categories: Books
Keywords: Books
--------
```
Usage: generateSitemap.pl some-gopher-domain.net > sitemap.txt
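A reader for this record format only needs to split on the '--------' separator and partition each 'Field: value' line; treating colon-less lines as continuations of the previous field (like the wrapped Description above) is an assumption here. A sketch in Python for illustration:

```python
def parse_sitemap(text):
    """Split a quarry sitemap into records: '--------' separates
    records, 'Field: value' lines populate them, and lines without a
    colon are treated as continuations of the previous field."""
    records = []
    for chunk in text.split("--------"):
        fields, last = {}, None
        for line in chunk.strip().splitlines():
            if ":" in line:
                key, _, value = line.partition(":")
                last = key.strip()
                fields[last] = value.strip()
            elif last:
                fields[last] += " " + line.strip()
        if fields:
            records.append(fields)
    return records
```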
## 5. Host and selector maintenance
Only basic host checking has been implemented using 'hostcheck'.
Currently the checkHosts.pl script checks each host twice in a 24
hour period. If the host fails two consecutive checks then it is
flagged as inactive and its selectors will not display in search
results. If, on a subsequent check, the host has recovered then it
is again flagged as active.
Hostcheck: git clone git://gopher.icu/hostcheck
## 6. IndexNow
IndexNow[1] is an easy way for website owners to instantly inform
search engines about content changes on their website.
It has been implemented in a basic way to allow submission of a
single URI per request:
curl -s 'gopher://gopher.icu/7/quarry/indexnow.dcgi?url=&key='
[1](https://www.indexnow.org/)
There is also a GUTINDEX.ALL and GUTINDEX.zip containing a
consolidated catalogue listing. I chose this latter file to parse
and build a searchable index.
Parsing the file was not without its trials, with multibyte
characters causing me some frustrations. However, perseverance
allowed me to get what I needed from the file and build the index.
Creating the file path URIs from the book numbers was fairly easy. An example of how
to resolve the number to a directory was helpfully given in the README
file.
For example, book 1190: you split the digits and each becomes a
directory except the last, giving /1/1/9/1190. For single-digit book
numbers you begin at zero, so book 9 would be /0/9.
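That directory rule is simple to express in code. A sketch in Python for illustration:

```python
def book_path(number):
    """Gutenberg directory for a book number: one directory per digit
    except the last; single-digit books start at zero, e.g. /0/9."""
    digits = str(number)
    if len(digits) == 1:
        return f"/0/{digits}"
    return "/" + "/".join(digits[:-1]) + "/" + digits
```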
## Gopher menu
Initially I tried to link directly to the text files. Unfortunately
the naming convention was not consistent and, after guessing and
programmatically trying to verify the existence of the files, I
decided to just link to the directory. This was a much cleaner
solution.
When I noticed there were [2]audio and video files as well I conceded
that this was the best option.
The end result is a [3]gopher menu which provides a search function
that searches the catalogue for matching authors and titles. It also
shows the five most recent additions to the catalogue and five most
downloaded, with a link at the bottom of each section to list more.
## Update 01/06/2021 - Atom and RSS new book feeds added.
After a request on IRC for a feed of newly added books, both [4]Atom
and [5]RSS feeds have now been created.
The Atom feed can be accessed directly from a link on the front page
of the gutenberg interface.
1. [Gutenberg Gopher Server](gopher://dante.pglaf.org/1)
2. These can be searched for by searching 'Audio:' and 'Film:'.
3. [Gutenberg gopher interface](gopher://gopher.icu/1/gutenberg)
4. [Atom new book feed](gopher://gopher.icu/0/gutenberg/newbooks.atom.xml)
5. [RSS new book feed](gopher://gopher.icu/0/gutenberg/newbooks.rss.xml)
i3wm - I started using a tiling window manager to maximise
usable screen real estate and minimise distractions, along with CLI
application replacements for any GUI ones that I happened to still be
using.
I have recently been toying with the idea of giving ratpoison a try.
My typical desktop[0].
### Terminal
xterm -> urxvt - Urxvt is a lot smaller in size and has some nice
extras. It allows you to configure URI recognition so you can click
on URIs in any terminal application to open them. Using this in
combination with a plumber program, which decides on the correct
application to open them with, makes for a very slick integrated
feel: I click on youtube links and it opens them with mpv, I click on
gopher links and it opens them in lynx, http and it opens vimb etc.
Kudos to __20h__ for the plumber utilities.
I use tmux in combination with the terminal application to
essentially give myself virtual workspaces with different
applications so I can easily switch between them. Also it means I can
detach tmux and restart my window manager if there's an issue and
reattach without having to re-open everything.
### Editor
Vim - I use vim for everything and use vim keys to control other
applications wherever possible (vimb, ksh, lynx, ...).
For years I used vim commercially and without a clue as to its power,
not even scratching the surface of its features.
A text editor is a vital part of your tool kit, invest the time and
learn to use it well.
### Email
Thunderbird -> Mutt - It was a coin flip between alpine and mutt. I
found a few good tutorials on setting up the mutt config file so
after all the time invested in setting it up how I like it, I've
stuck with it.
### RSS
Newsboat -> sfeed_curses - Brilliant RSS feed reader, kudos to
Evil_Bob for such a great piece of software. I use it every day and
it saves me from the tyranny of the WWW by bringing content I'm
interested in to me.
### WWW + Gopher
Firefox -> Lynx/vimb - I find myself using Lynx more and more as it's
a great text based browser which also supports Gopher out of the box.
All modern graphical web browsers are huge and bloated and there's no
way around it. I started to have some ethical as well as usability
issues with Firefox. Fortunately, after a bit of searching, keeping
in mind my love of vim keys, I discovered a webkit based browser
called vimb. For the most part it has been nice to use and has
changed the way I browse the web. I no longer leave lots of inactive
tabs open.
### IRC
Xchat -> weechat
### Music
Rhythmbox -> mocp
### Video
VLC -> mpv + yt-dlp
## Programming languages
Perl was the first programming language I learned, and my general
go-to language, but it has quite a large footprint. There are worse
options, but I know now that there are also lighter alternatives. I
will certainly be looking at using Lua in place of Perl for future
projects. Although in the meantime I have found myself using shell a
lot more for small jobs.
I should also put some real effort into improving my C programming
as many projects are written using it in the open source world.
Forth looks like an intriguing, minimalist, low level language; maybe I
will dabble with it a bit too at some point.
## Conclusion
I hope the above will give some inspiration for those looking to
reduce and simplify their hardware and software footprint. If you
manage to have a more focused computing experience or get a few more
years service out of your existing system from anything I've written
then, to me, it's been worthwhile.
If you have any suggestions or alternatives to any of the above
software, that you think I should take a look at, then please get in
touch.
[0](gopher://gopher.icu/I/files/desktop.png)
```
#include <stdio.h>

int main(void)
{
	printf("Hello, world!\n");
	return 0;
}
```
## Day 5
Well, I started out with the intention of continuing with my C
exercises but somehow that was put aside quite quickly.
Quarry[3] became the focus of the day as I was very aware that I'd
neglected it for some time. I hadn't re-indexed any of the sites I
had already in there for months and I had never gone back and removed
dead links. That was something I had intended to fix some time ago
but hadn't got around to.
I managed to crash my webserver while running the re-indexing script,
which caused me a bit of a panic as I had to log in to the VM host for
the first time and use vmctl. It was all a bit daunting as I couldn't
email for help because my email is hosted on the server. Doh!
This is the first day that I've just used the computer without making
any tweaks to configuration or adding any software.
## Day 6 and 7
Not much to report really. I spent day 6 continuing my work on quarry
and most of day 7 I was out. When I did return home I continued my
work on quarry and read about the experiences of other people taking
the challenge, on gopher of course!
## Conclusion
I didn't get off to the best start in spite of putting about a week
into preparing the system I intended to use for the challenge.
I was surprised that OpenBSD performs as well as it does on this
system. I fully expected NetBSD to be my only option but OpenBSD was,
for me anyway, a much more polished experience.
Unfortunately I didn't manage to get hardware acceleration working,
or I may well have been able to prove YouTube to be watchable at low
resolutions.
It was nice to have a fresh setup for the first time in several years
to experiment with. I had used ratpoison for my last challenge
earlier in the year but hadn't put much effort into configuring it
last time. In the end I found myself using tmux as my window manager
and ratpoison windows as virtual desktops; one for local and one for
my vps.
The temptation to install tin (for usenet) and sfeed (for rss/atom
feeds) was resisted so I was better able to stay focused on doing
something useful. The older system has just enough latency to make
you consider more consciously what you are about to do.
When all is said and done, I have quite enjoyed the challenge on this
occasion. I have proved that I can be productive using this 27 year
old computer and I don't feel that I have been lacking for anything.
[1](gopher://gopher.icu/0/phlog/Computing/grouch-occ-prep.md)
[2](gopher://gopher.icu/0/phlog/Computing/Smart-terminal-not-client.md)
[3](gopher://gopher.icu/0/phlog/Computing/Quarry.md)
vga adapter
and put the raid card back in and tried again. It worked!
What a palaver! I'd been pulling my hair out all because the HDMI
connector is disabled if you fit a card in the pci-e x16 slot. Just
to be sure I connected the SAS cable and configured the drives. Sure
enough everything worked as it should and the day, and my sanity, has
been saved!
## Update 09/08/2023
The saga continues. After my last update it seemed that I was home
free and only had to install my OS. Well unfortunately that wasn't
the end of it. I tried to install OpenBSD but it turns out there is
no driver support for that card (Adaptec ASR-5405Z). I was back to
square one...
At this point everything came into question, should I install
FreeBSD which does have a driver? Should I just go back to Linux
which also has a driver, or should I just go back to my old computer
with OpenBSD as I could see that the p410 raid card was supported?
I tried the p410 raid card in the Xeon for a second time and it
seemed to work at first, I could see it initializing the raid. But
just as I thought I was out of the woods, and it should have started
reading from the disk, the screen went a light grey and everything
appeared to hang. I tried a couple more times just to be sure it
wasn't some weird timing issue but to no avail.
I actually conceded defeat at this point and installed OpenBSD to the
HP microserver but the pci-e Sound Blaster sound card wouldn't
work...
Now more determined to stick with the plan, I put the 3ware raid card
back in the Xeon and began to use it to make sure it was going to be
viable as my daily driver. Everything appeared to work OK and it was
fairly snappy, apart from some noticeable lag of the disk subsystem.
As I saw it I had 3 options:
1) Buy a supported SAS raid card, knowing I would lose HDMI
2) A hybrid system - SSD for OS and PCI raid with HDD for storage
3) SSD only - ditch the raid, small SSD for OS and second bigger
one for storage
In an attempt to speed the system up, future-proof it and reduce its
energy consumption, I have decided upon option 3. This will consist
of one 120Gb SSD main drive and a 500Gb SSD for storage.
I can't realistically ever see myself filling 2Tb. I managed to back
up everything I have onto the 500Gb SSD with quite a bit of space to
spare and I know there is a lot that could be deleted...
[1](gopher://gopher.icu/0/phlog/Computing/Stop-IT-waste.md)
[2](gopher://gopher.icu/I/images/xeon.png)
> Starting on February 22, 2024, you can no longer use Google
> Groups (at groups.google.com) to post content to Usenet groups,
> subscribe to Usenet groups, or view new Usenet content. You can
> continue to view and search for historical Usenet content posted
> before February 22, 2024, on Google Groups.
In my view Google's Groups integration with Usenet practically
destroyed it. As they do with most things, they assimilate and
destroy wholesome technologies from within.
## What's so good about Usenet?
There are several good reasons to make efforts to revive Usenet:
* It is the original decentralised, federated forum.
* It uses a simple protocol designed to work over low bandwidth and
impermanent communications links, like dial-up.
* News reading clients are ubiquitous and part of many email clients,
even if you don't realize it.
* The small technical barrier to entry keeps undesirables away, or
  at least that's how it should play out once Google ceases to make
  it easy for web users to access Usenet.
* Forums on websites come and go with the fortunes and interests of
their often corporate owners. Because of the federated nature of
Usenet, the information therein continues to live on.
* ISPs no longer routinely provide access to Usenet.
Is Eternal September[2] finally coming to an end? No longer are ISPs
routinely providing access to news servers, and from February Google
will stop providing access to people via the web. This should return
the sovereignty of Usenet to those who should rightfully have a claim
to it.
There has never been a better time to reclaim the wonderful resource
that is Usenet.
## How do I access Usenet?
Please read the wonderful guide[3] written by Matto.
[1](https://www.theregister.com/2023/12/18/google_ends_usenet_links/)
[2](gopher://gopherpedia.com/0/Eternal%20September)
[3](gopher://box.matto.nl:70/0/usenet-getting-started-guide.txt)
> 3.5 Building clients
>
> If a client does not understand what a say, type 'B' item (not a
> core item) is, then it may simply ignore the item in the directory
> listing; the user never even has to see it. Alternatively, the
> item could be displayed as an unknown type.
To put your content within a non-core type is therefore not advised.
Additionally, there is a commit[3] to the client code which notes
that it was a deliberate change to "Skip over type 'i' items". Maybe
they foresaw the potential for misuse, or it was already being
misused, and decided to address it in this way?
[3](https://raw.githubusercontent.com/jgoerzen/gopher/master/doc/client.changes)
### Gopher clients are not terminals
Please do not use terminal escape codes to attempt colours and
artwork in gopher. They may work in your particular client but
generally they don't and look horrendous to the casual user[4] [5].
## Conclusion
The saddest part for me is that many of these individuals appear to
be in IT or technical disciplines and have some proficiency.
Unfortunately they choose either not to read the RFC, to blatantly
ignore it, or to disregard best practice. Don't be that person ...
$ pandoc myfile.md -o myfile.pdf
Unfortunately something was missing and the conversion failed with an
error. After a short search for the error message, and test
installing other packages, I discovered that texlive_base-2023 and
texlive_texmf-minimal-2023 were also required.
## Workflow
I write the documents in vim, using markdown in the same way that I
would for my phlog, and then run pandoc on the file to output it in
PDF format.
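A tiny wrapper saves retyping the output name each time. md2pdf and pdf_name are hypothetical helpers of my own, assuming pandoc and the texlive packages above are installed:

```shell
# Hypothetical convenience wrappers around the pandoc invocation above.
pdf_name() { printf '%s\n' "${1%.md}.pdf"; }      # myfile.md -> myfile.pdf
md2pdf()   { pandoc "$1" -o "$(pdf_name "$1")"; } # requires pandoc + texlive
```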
## Summary
I use vim for all my text editing and wanted a solution where I could
continue to do that. This solution is fast and avoids any interaction
with a word processor, while achieving the same result. I'm quite
satisfied with it.

# Computing-in-2024-Review
gopher://gopher.icu/0/phlog/Computing/Computing-in-2024-Review.md
2024-12-31T14:23:34+01:00

# Computing-like-1995
gopher://gopher.icu/0/phlog/Computing/Computing-like-1995.md
2024-12-04T14:42:50+01:00

# Display-lines-and-UMN-gopher
gopher://gopher.icu/0/phlog/Computing/Display-lines-and-UMN-gopher.md
2025-02-25T17:30:02+01:00

> [XXXX] '
When accounting for the entire line length UMN gopher reserves 13
characters for the left margin, even though it doesn't always use
them. Essentially it calculates whether to truncate your line as
follows: line <= COLS-13.
If your display line is 69 characters, under the 70 allowed by the
RFC, it still comes up 2 characters short on an 80 character display
(80 - 13 = 67)...
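That rule can be sketched in shell. COLS and the 13-character margin are as described above; check is an illustrative name:

```shell
# Sketch of UMN gopher's truncation rule: a display line fits only
# if it is no longer than COLS minus the 13-character margin.
COLS=80
check() {
    if [ "$1" -le $((COLS - 13)) ]; then
        echo "fits"
    else
        echo "truncated by $(($1 - (COLS - 13))) characters"
    fi
}

check 69    # -> truncated by 2 characters
check 67    # -> fits
```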
I've never had cause to look at the source for UMN gopher before or
really critically assess the user interface.
## Initial observations
* There is a lot of wasted space within the margin.
'--> ' could be replaced with terminal foreground/background
invert, to indicate the selected item. This would save 3 or 4
characters in one swoop.
* The margin width is variable depending on the index number.
I think this should be fixed, as when it expands it pushes the
menu title across to the right, reducing usable space.
## Summary
Is it a bug, is it a feature? Should it be condemned forevermore,
or should it be fixed?
Lynx suffers in the same regard when it comes to wasted space and
variable width info margin. I actually think it's worse...
Even with all its quirks and occasional crashes, it is still my
preferred client.
## References
[1](gopher://box.matto.nl/0/shorter-lines-in-gophermap.txt)
[2](https://baymard.com/blog/line-length-readability)
[3](https://github.com/jgoerzen/gopher.git)
[4](https://github.com/jgoerzen/gopher/blob/master/gopher/manager.c)

# UMN-gopher-revisited
gopher://gopher.icu/0/phlog/Computing/UMN-gopher-revisited.md
2025-03-02T19:54:10+01:00

# UMN-gopher-revisited-pt2
gopher://gopher.icu/0/phlog/Computing/UMN-gopher-revisited-pt2.md
2025-03-03T22:15:00+01:00

# DICT-dictionary-server-protocol
gopher://gopher.icu/0/phlog/Computing/DICT-dictionary-server-protocol.md
2025-03-12T10:49:26+01:00

> $ telnet dict.org 2628
> Trying 199.48.130.6...
> Connected to dict.org.
> Escape character is '^]'.
Type 'help'.
> help
> 113 help text follows
> DEFINE database word -- look up word in database
> MATCH database strategy word -- match word in database using str..
> SHOW DB -- list all accessible databases
> SHOW DATABASES -- list all accessible databases
> SHOW STRAT -- list available matching strategies
> SHOW STRATEGIES -- list available matching strategies
> SHOW INFO database -- provide information about the da..
> SHOW SERVER -- provide site-specific information
> OPTION MIME -- use MIME headers
> CLIENT info -- identify client to server
> AUTH user string -- provide authentication information
> STATUS -- display timing information
> HELP -- display this help information
> QUIT -- terminate connection
>
> The following commands are unofficial server extensions for debu..
> only. You may find them useful if you are using telnet as a cli..
> If you are writing a client, you MUST NOT use these commands, si..
> they won't be supported on any other server!
>
> D word -- DEFINE * word
> D database word -- DEFINE database word
> M word -- MATCH * . word
> M strategy word -- MATCH * strategy word
> M database strategy word -- MATCH database strategy word
> S -- STATUS
> H -- HELP
> Q -- QUIT
> .
> 250 ok
To see a list of all available databases.
> show db
> 110 166 databases present
> gcide "The Collaborative International Dictionary of English v.0.48"
> wn "WordNet (r) 3.0 (2006)"
> moby-thesaurus "Moby Thesaurus II by Grady Ward, 1.0"
> elements "The Elements (07Nov00)"
> vera "V.E.R.A. -- Virtual Entity of Relevant Acronyms (February 2016)"
> jargon "The Jargon File (version 4.4.7, 29 Dec 2003)"
> foldoc "The Free On-line Dictionary of Computing (30 December 2018)"
> easton "Easton's 1897 Bible Dictionary"
> hitchcock "Hitchcock's Bible Names Dictionary (late 1800's)"
> bouvier "Bouvier's Law Dictionary, Revised 6th Ed (1856)"
> devil "The Devil's Dictionary (1881-1906)"
> world02 "CIA World Factbook 2002"
> gaz2k-counties "U.S. Gazetteer Counties (2000)"
> gaz2k-places "U.S. Gazetteer Places (2000)"
> gaz2k-zips "U.S. Gazetteer Zip Code Tabulation Areas (2000)"
> ...
To do a basic word search of all databases.
> d gopher
To do a word search of a specific database.
> d foldoc gopher
> 150 1 definitions retrieved
> 151 "gopher" foldoc "The Free On-line Dictionary of Computing (3..
> gopher
>
> A {distributed} document retrieval sys..
If you want to script something or just use it from the command line
then you can do something like the following:
> $echo "d foldoc gopher" | nc -N dict.org 2628 | less
## Conclusion
There are many dictionaries available so this is a very useful
resource.
I thought about writing a gopher interface to DICT, but maybe it's
worth learning a bit about the protocol[2]. I may come back to it if
there is any interest...
## References
[1](gopher://gopherpedia.com:70/0/DICT)
[2](gopher://gopher.icu/0/files/rfc/rfc2229.txt)

# Stabilizing
gopher://gopher.icu/0/phlog/Computing/Stabilizing.md
2025-06-01T00:35:13+02:00

# Make-your-own-tools
gopher://gopher.icu/0/phlog/Computing/Make-your-own-tools.md
2025-06-09T10:25:40+02:00

'''
/bin/logbook.sh
alias LG='lb_log'
alias LF='lb_find'
'''
> $ LF DL4ST
> 2025-06-07 19:18:11 DL4ST 10.120 30m CW 579 4..
The above shows all contacts for DL4ST.
> $ LG G0GSY
The above will add the call to the logfile and open it with your
$EDITOR, falling back to vi if it's not defined.
Defaults for band and mode are set using environment variables,
LB_BAND and LB_MODE in the logbook.sh[0] file, which also contains
the shell functions that make it work. I currently have the variables
set to 40m and CW respectively, but you can export them again to
dynamically change them.
> $ export LB_MODE=SSB
When you add a contact the frequency is pre-filled based on the band
of operation, like so: 7.### for 40m, allowing you to do a simple
search and replace with the exact frequency.
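The band-to-prefix mapping might look something like this. lb_freq is an illustrative name of my own, not a function from logbook.sh; 7.### for 40m matches the example in the post and the others follow the same convention:

```shell
# Illustrative sketch of pre-filling a frequency prefix from the band.
lb_freq() {
    case "$1" in
        40m) echo '7.###'  ;;
        30m) echo '10.###' ;;
        20m) echo '14.###' ;;
        *)   echo '#.###'  ;;
    esac
}
```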
It will probably evolve over time, but for now it covers the basics
of my needs.
[0](gopher://gopher.icu/0/files/logbook.sh)

# Phlog-questions-challenge:-technology-edition
gopher://gopher.icu/0/phlog/Computing/Phlog-questions-challenge:-technology-edition.md
2025-04-16T01:08:40+02:00

# UMN-gopher-no-more
gopher://gopher.icu/0/phlog/Computing/UMN-gopher-no-more.md
2025-06-16T01:32:38+02:00

> $ echo "" | nc gopher.icu 70 | ./degopher.awk | less
## References
[0](gopher://gopher.icu/0/phlog/Computing/Display-lines-and-UMN-gopher.md)
[1](gopher://gopher.icu/0/phlog/Computing/Smart-terminal-not-client.md)
[2](gopher://gopher.icu/0/files/degopher.awk)

# UNIX-Primitivism
gopher://gopher.icu/0/phlog/Computing/UNIX-Primitivism.md
2025-06-16T16:34:42+02:00

# URI-open--plumber-and-xclip
gopher://gopher.icu/0/phlog/Computing/URI-open--plumber-and-xclip.md
2025-06-18T21:41:18+02:00

> $ xclip -o
If the above prints out the selected text or URI then you can proceed
to install the plumber[1] and configure it to your requirements.
## Plumber configuration
Configuration involves exporting environment variables to determine
the application to be used for opening the specific URI/file type:
PLUMB_IMAGE=
PLUMB_MEDIA=
PLUMB_GOPHER=
PLUMB_TXTGOPHER=
PLUMB_PDF=
PLUMB_FILEMANAGER=
PLUMB_WEB=
PLUMB_TXTWEB=
PLUMB_FEED=
PLUMB_WAIS=
PLUMB_CSO=
PLUMB_NEWS=
PLUMB_NEX=
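For example, with some possible values. The PLUMB_* names come from the post; the programs chosen here are illustrative, not recommendations:

```shell
# Example values only; substitute your own preferred applications.
export PLUMB_IMAGE=feh        # image viewer
export PLUMB_MEDIA=mpv        # audio/video player
export PLUMB_PDF=mupdf        # PDF viewer
export PLUMB_WEB=firefox      # graphical web browser
export PLUMB_TXTWEB=lynx      # text-mode web browser
export PLUMB_GOPHER=sacc      # gopher client
```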
*Note:* If you are into self-flagellation you could instead
use xdg-open.
Create a key binding to trigger calling the plumber with the contents
of the xclip buffer.
## An example for ratpoison
Edit .ratpoisonrc and add the following:
> bind o exec plumb $(xclip -o)
After restarting ratpoison you should be able to press Ctrl-t then o
to open selected URIs.
## References
[0](gopher://gopher.icu/0/phlog/Computing/Smart-terminal-not-client.md)
[1](git://r-36.net/plumber)

# Gophering-without-a-client
gopher://gopher.icu/0/phlog/Computing/Gophering-without-a-client.md
2025-06-21T16:48:55+02:00

> gopen gopher.icu
If the link contains a search URI then it prompts you for entry.
> gopen gopher.icu/7/quarry
> Search:
For every link you open a new shell is spawned so you back-pedal
through your navigation by just exiting less. This might work better
inside something like the suckless tabbed application?
## Summary
The scripts are very crude and can be improved a lot; I have already
discovered that the degopher filter doesn't handle 'i' type lines
well. Still, this way of navigating gopher makes all URIs actionable
from the command line in a common way, and I have taken the
same approach with NEX.
## References
[0](gopher://gopher.icu/0/phlog/Computing/UMN-gopher-no-more.md)
[1](gopher://gopher.icu/0/files/degopher.awk)
[2](gopher://gopher.icu/0/phlog/Computing/URI-open--plumber-and-xclip.md)
[3](gopher://gopher.icu/0/files/gopen)

# urxvt-no-more
gopher://gopher.icu/0/phlog/Computing/urxvt-no-more.md
2025-06-26T23:46:03+02:00

# OpenBSD-web-browsers
gopher://gopher.icu/0/phlog/Computing/OpenBSD-web-browsers.md
2025-06-27T19:23:47+02:00

> noto-emoji-20241002
> noto-fonts-24.9.1v0
> nspr-4.36
> nss-3.110
> libudev-openbsd-20230921p0
> libxslt-1.1.43p0
> epoll-shim-0.0.20240608
> wayland-1.23.1
> libxkbcommon-1.8.1
> dconf-0.40.0p2
> adwaita-icon-theme-legacy-46.2p0
> adwaita-icon-theme-47.0p0
> gtk+3-3.24.49
> gtk+3-cups-3.24.49
> xdg-utils-1.2
1.134G of disk usage in my estimation. The noto-fonts are enormous
and the major bulk of the installation.
I have been using this browser since I started using OpenBSD on my
daily driver, around 3 years. There is no Firefox for 32bit, so this
was really my only choice and I've stuck with it until today.
Performance is acceptable and it seems to cope with most websites.
Unfortunately every so often a website will cause it to go into a
death spiral which will freeze my system if I don't catch it quickly.
## Vimb
Dependencies:
> iso-codes-4.17.0:webkitgtk41-2.48.3
> iso-codes-4.17.0:gst-plugins-bad-1.26.0
> iso-codes-4.17.0:gst-plugins-good-1.26.0
> iso-codes-4.17.0:gst-plugins-base-1.26.0
> iso-codes-4.17.0
> mozilla-dicts-en-GB-1.3p1:enchant2-2.8.2
> mozilla-dicts-en-GB-1.3p1:hunspell-1.7.2
> mozilla-dicts-en-GB-1.3p1
> pango-1.56.3:gtk+3-3.24.49
> wayland-1.23.1:libxkbcommon-1.8.1
> wayland-1.23.1
> orc-0.4.32
> lame-3.100p2:libbs2b-3.1.0p5
> lame-3.100p2:twolame-0.4.0
> lame-3.100p2:libsndfile-1.2.2p0
> libunbound-1.22.0:geoclue2-2.7.2
> libunbound-1.22.0:libsoup3-3.6.5
> libunbound-1.22.0:glib2-networking-2.80.1p0
> flac-1.5.0
> libnotify-0.8.6
> hyphen-2.8.8p0
> aspell-0.60.8.1p0
> taglib-1.13.1
> woff2-1.0.2p0
> dbus-daemon-launch-helper-1.16.2
> epoll-shim-0.0.20240608
> libavif-1.1.1
> wavpack-5.6.0p0
> cdparanoia-3.a9.8p5
> json-glib-1.10.6
> soundtouch-2.3.3
> duktape-2.7.0p1:libproxy-0.5.9p2
> duktape-2.7.0p1
> libxslt-1.1.43p0
> harfbuzz-icu-11.0.0
> opencore-amr-0.1.6
> at-spi2-core-2.54.1
> gsettings-desktop-schemas-47.1p0
> gstreamer1-1.26.0
> dbus-1.16.2v0:avahi-glib-0.8p3
> libshout-2.4.5
> libpsl-0.21.1p0
> graphene-1.10.8p1
> flite-2.2
> dconf-0.40.0p2
> adwaita-icon-theme-legacy-4...:adwaita-icon-theme-47.0p0
> adwaita-icon-theme-legacy-46.2p0
400MB of disk usage in my estimation. Dependency listing looks worse
than it is with regard to overall size of installation.
Unfortunately performance on my system sucked, YMMV. It didn't last
more than a couple of minutes before it was uninstalled.
## Firefox-ESR
Dependencies:
> nspr-4.36
> nss-3.110
> adwaita-icon-theme-legacy-46.2p0
> adwaita-icon-theme-47.0p0
> at-spi2-core-2.54.1
> epoll-shim-0.0.20240608
> wayland-1.23.1
> libxkbcommon-1.8.1
> dconf-0.40.0p2
> gtk+3-3.24.49
300MB of disk usage in my estimation.
I have moral issues with the Mozilla Foundation but performance wise,
it still seems to be the best of a bad bunch.
## Wayland?
Not so long ago wayland wasn't even a thing; now it is seemingly a
requirement for significant pieces of software. Leverage? Is this
what people do when they want to force their beliefs and choices
upon others?
Redhat with systemd and then Firefox with Rust. Now all web browsers
seem to come bundled with wayland, whether you want it or not...
## Summary
This trip down the rabbit hole was prompted by the OpenBSD 7.7
update. As time goes on it seems to be bloating and the update failed
due to my running out of available disk space. This forced me to comb
through the system, uninstall anything I wasn't using and check the
sizes of the larger packages. That of course meant the web browser.
I use vim key-bindings wherever I can get them, so I added the vimium
plug-in as I had done with ungoogled-chromium.
Emerging from the rabbit hole I'm back on firefox-esr. The
performance to bloat ratio is better than the rest.

# Legacy
gopher://gopher.icu/0/phlog/Computing/Legacy.md
2025-07-10T01:34:42+02:00

# ed-vi-vim-sam-vis-editor
gopher://gopher.icu/0/phlog/Computing/ed-vi-vim-sam-vis-editor.md
2025-07-12T01:39:37+02:00

# password-store
gopher://gopher.icu/0/phlog/Computing/password-store.md
2025-07-18T00:38:20+02:00

> $ export PASSWORD_STORE="$HOME/.password-store"
Don't forget to add the above export to your .kshrc, or whatever,
so that it is set next time you come to use it.
If all is well the following will list all your password files.
> $ pw
The following will list files based on a partial match if there
is more than one, or it will ask you to enter your password to
decrypt the file.
> $ pw
If only one was found, after entering the password, it will be
copied to your pastebuffer for pasting.
## Flags
There are two additional flags that can be used:
-n When supplied with a filename, will generate a random
password, create a new password file and copy the password to your
pastebuffer.
> $ pw -n
-v Will list files based on a partial match if there is more
than one, or it will ask you to enter your password to decrypt
the file.
If only one was found, after entering the password, the file will
be printed to stdout.
> $ pw -v
This is useful if you have several pieces of information within the
file you need access to.
At the top of the pw script is a variable, PB_SECONDS. This determines
the number of seconds after which the pastebuffer is overwritten;
the default is 30 seconds.
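The mechanism can be sketched like this. A temporary file stands in for the pastebuffer so the sketch is self-contained; pw itself copies via xclip, and pb_copy is an illustrative name, not part of the script:

```shell
# Sketch of the PB_SECONDS idea: copy a secret, then overwrite the
# "pastebuffer" (a temp file here) after PB_SECONDS seconds.
PB_SECONDS=1
PB=$(mktemp)
pb_copy() {
    printf '%s' "$1" > "$PB"              # "copy" the secret in
    ( sleep "$PB_SECONDS"; : > "$PB" ) &  # overwrite it after the delay
}
```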
## References
[1]( gopher://ams.jay.scot/0/phlog/016.txt )
[2]( gopher://gopher.icu/0/files/pw )

# rc-shell-and-new-gopher-server
gopher://gopher.icu/0/phlog/Computing/rc-shell-and-new-gopher-server.md
2025-07-26T02:03:48+02:00

> 1: rc-1.7.4p1
> 2: rc-1.7.4p1-editline
> 3: rc-1.7.4p1-readline
As I like modal commandline editing I gave both of the latter a try.
I settled on the readline variety as it seemed more compatible
with what I had become used to with ksh's vi-mode.
I started by creating an .rcrc file and converting my existing
.kshrc to rc.
This was a very helpful learning experience to start to familiarize
myself with the syntax, as was reading Tom Duff's paper on the rc
shell[1].
I modified my prompt to include the directory path using the examples
from the paper. Then in short order I converted my aliases to
functions and ported the remainder of my ksh functionality.
By comparison the resulting file looks a lot neater.
## gopher server (gophrc)
Encouraged by what I had learned so far, I wanted to continue
familiarizing myself with the language.
The best way for me to do that is by creating some small project;
I am the type that learns through doing.
I remembered I had started writing a gopher server in shell that
ran from inetd but had run into some issues and mostly given up on
the idea after discovering Katolaz had already created one called
gosher[2].
So, for the past couple of days I have been chipping away at making
*gophrc*[3] a gopher server that runs from inetd and that can work
with my existing geomyidae formatted menus and cgi programs.
You are reading this file from that experimental server.
## Summary
It has been a great learning experience and I am a big fan of
home brew software.
I feel that rc shell has a lot of potential, but that I am still
missing a lot regarding nuances of the language.
If you know of any more documentation than what I've linked below,
then please drop me a line.
If you spot anything that doesn't work or things I could have done
better in rc shell, then likewise please let me know.
## References
[1]( http://9p.io/sys/doc/rc.html )
[2]( gopher://katolaz.net/1/software/gosher )
[3]( gopher://gopher.icu/0/files/gophrc )

# The-UNIX-way--a-transcript
gopher://gopher.icu/0/phlog/Computing/The-UNIX-way--a-transcript.md
2025-08-11T22:01:21+02:00

# Gopher-cleaning
gopher://gopher.icu/0/phlog/Computing/Gopher-cleaning.md
2025-07-27T02:05:15+02:00

# Timezones
gopher://gopher.icu/0/phlog/Computing/Timezones.md
2024-12-29T01:07:13+01:00

# Down-the-rabbit-hole--9front
gopher://gopher.icu/0/phlog/Computing/Down-the-rabbit-hole--9front.md
2025-08-18T00:23:56+02:00

> % /bin/kbmap
Right click to select the keymap, then press 'q' to quit.
To begin installation:
> % inst/start
configfs - will ask you to choose a filesystem type; for my purposes
gefs (good enough file system) was recommended.
partdisk - you can delete and create a partition for your system.
Here I ran into a problem with the following stage, 'prepdisk',
which is meant to divide the allocated partition. It failed repeatedly
until I left a couple of sectors free at the beginning of the disk
during this stage. So, I couldn't allocate from 0 - .
prepdisk - automatically divides the partition or you can manually
override. I just went with what it recommended.
mountfs - reams/formats the partitions and mounts them
configdist - choose the location of the install media
confignet - choose manual/auto (dhcp).
tzsetup - choose timezone (GMT)
bootsetup - install mbr and mark partition active (yes to both).
finish - reboot
## Post install
First thing I needed to do was make my keymap selection permanent.
To do this edit $home/lib/profile, depending on your level of bravery
your choices of editor are ed, sam or acme.
Beware, there are different sections in profile which are run
depending on whether you are logging in from a local terminal or as a cpu.
I made the mistake of adding the line to set the keyboard at the
very top of the file and it really didn't like it when I connected
to it using drawterm from another computer. Make sure local changes
go under the terminal section.
To get a list of keymaps:
> % ls /sys/lib/kbmap
Add the following line to your profile under the terminal section:
> cp /sys/lib/kbmap/ /dev/kbmap
## ssh to other systems
> % auth/rsagen -t 'service=ssh' >$home/lib/sshkey
> % auth/rsa2ssh $home/lib/sshkey >$home/lib/sshkey.pub
> % cat $home/lib/sshkey >/mnt/factotum/ctl
I then copied the public key to .ssh/authorized_keys on my remote
machine.
*WARNING* Do not try to ssh to a remote machine from an ordinary
9front terminal! It hung and crashed rio in my case.
For vt emulation there is a vt program which you must start before
trying to use ssh:
> % vt -xb
Then:
> % ssh user@remotehost
I will probably create a function in my profile to override ssh,
so that this all happens with a single command.
## Summary
This is as much a reference for myself as it is maybe of some use
to someone else. My memory isn't great so having something to refer
to, should I have to do it again, is quite useful.
The current state of play is that I have a 9front system that I can
connect to using drawterm from my usual openbsd desktop and I can
connect from the 9front system to my vps.
Old habits die hard. It is evident, when it's not there, that I
use less a lot. What is significant about the 9systems is not what
is included, but what has been removed.
less/more is a good example of this, you don't need a pager if your
terminal window behaves as a pager. Terminal windows by default do
not scroll to the bottom like a unix terminal would, so man pages
etc appear to just be cat'd onto the screen.
This small adjustment from using less to using cat and the screen
not scrolling is taking some getting used to, and this is just one
small illustration.
Coming from linux/unix it is frustrating because while you know it
is different, it looks very familiar and even some of the same
programs exist, yet don't quite behave the same.
If you do get stuck without less and it becomes too much there is
'p' which is a pager. It is tempting to alias it, and a number of
other things, to more familiar names. But then you would be convincing
yourself more that it is what it is not.
Much of the above install and configuration information was gathered
from the 9front Wiki[1] and this excellent Plan 9 Desktop Guide[2].
## References
[1]( https://wiki.9front.org )
[2]( https://pspodcasting.net/dan/blog/2019/plan9_desktop.html )

# Down-the-rabbit-hole--9front-pt2
gopher://gopher.icu/0/phlog/Computing/Down-the-rabbit-hole--9front-pt2.md
2025-08-20T03:18:15+02:00

> % "
To run the last command again:
> % ""
Now for the interesting part.
To recall a command matching a pattern:
> % "
To run a command matching a pattern:
> % ""
If you're not satisfied with that, then how about this.
All the text you have in your window is in /dev/text. Want to list
previous commands?
> % grep term% /dev/text
*NOTE* Each window is its own entity with its own environment and
/dev/text, so you will only have contextual history in that window.
If you mount some resource in one window, don't expect it to be
available in another.
## Keyboard remapping
I use modal editing not only in my editor but also on the commandline.
Due to this I use the escape key quite heavily and have in recent
years got into the habit of remapping caps-lock to escape.
Even though I don't use modal editing in 9front I still use it on
my server and so want the additional escape key where I'm used to
it.
In order to do this I had to first discover what keys were being
pressed. 9front does not seem to use familiar key codes in its
kbmap files. In order to discover the codes in the correct format
I had to use evdump(1) which appears to be very similar to xev under
xf86.
Once I had learned that escape was '0 1 0x1b' and caps-lock was
'0 58 0xf017', I looked to figure out how to add a mapping for
caps-lock to escape. Looking inside /sys/lib/ascii told me what I
needed to know: '0 1 ^['. So all that was required was to add
'0 58 ^[' to my modified kbmap.
I had copied the dvorak map to dvorak-riow, so as not to destroy
the original with my tinkering. To check if the new file worked,
all that was required was to:
> % cat /sys/lib/kbmap/dvorak-riow > /dev/kbmap
Sure enough caps-lock now behaved as escape.
With that confirmed I edited my user profile to reflect the change
of keymap for next login.
## Keyboard shortcuts
ctrl+a - jump to start of line
ctrl+b - jump to text output point (useful to enter next command)
ctrl+e - jump to end of line
ctrl+d - seems same as linux/unix and closes terminal
ctrl+f - autocomplete file names
ctrl+h - delete letter backwards
ctrl+w - delete word backwards
ctrl+u - delete text from cursor back to the output point
## Still To-Do
* Map a key to Kmod4 to use riow (I have no windows key!)
* Check sound, fix if not working. (I can see audio device)
* Change rio window colours (I like white on black)
* Get full resolution (1920x1200x32), currently at 1600x1200x32 vesa.
* Start winwatch and riow on login
winwatch is a taskbar where you can minimize your windows.

# Down-the-rabbit-hole--9front-pt3
gopher://gopher.icu/0/phlog/Computing/Down-the-rabbit-hole--9front-pt3.md
2025-08-23T21:53:03+02:00

> 0 56 0xf868 # alt to windows key
> 2 56 0xf015 # alt-gr to alt
## Getting sound working
Fairly easy fix from reading the manual.
The headphone socket was easily identifiable, once that had been
enabled I had sound and near deafened myself. Echoing a more
reasonable value into /dev/volume reduced the sound to a more sensible
level.
Both settings were added to my $home/lib/profile:
> echo pin 27 > /dev/audioctl
> echo 70 > /dev/volume
## Change rio windows to white on black
Today I was reading through a 9front added feature list and discovered
that rio had a flag to do this. It was simply a matter of adding -b
to the rio startup line in my profile.
## riow and winwatch on login
Now that I once again have virtual desktops I don't feel I need
winwatch.
winwatch was really just a crutch to save me being overwhelmed by too
many windows and losing everything.
The past few days have been a good reminder why I prefer tiling window
managers and virtual desktops. They really do make keeping your work
space organized much easier.
## Screen resolution
I did a little digging today and I think my graphics card is supported
by igfx. The output[1] seems to suggest that igfx should allow me
full resolution at 1920x1200. However, on trying to initialize it, I
get this error:
> term% aux/vga -m igfx -l '1920x1200x32'
> aux/vga: main: igfx@1920x1200x32 not in /lib/vgadb
## Summary
There is a bit of satisfaction, and relief, to actually now use the
system for doing something useful. This text was written in acme to a
9fs filesystem, mounted over ssh, from my OpenBSD server.
No longer do I need to be connected to the server via ssh to edit the
files directly or scp the files across from my local system. The
process now is as simple as writing a file to a local filesystem.
The mail attachment downloads directory is also shared in the same
way, so that I have easy local access. No more scp downloading email
attachments!
[1]( gopher://gopher.icu/0/files/output-of-aux-vga--vga.txt )

# Down-the-rabbit-hole--9front-pt4
gopher://gopher.icu/0/phlog/Computing/Down-the-rabbit-hole--9front-pt4.md
2025-08-29T11:46:59+02:00

> % ls /dev/sd*
Then simply:
> % mount /srv/ext4 /n/ /dev/sd/linux
I added the following to my lib/profile:
> ## Archive HDD
> ARCH_DRIVE=/dev/sdU1c3b2/linux
> if (test -e $ARCH_DRIVE) {
> ext4srv
> mount /srv/ext4 /n/archive $ARCH_DRIVE
> }
## Ham radio logging script migration
Even though I had ported my ksh script to rc in advance, it seems that
I was still using linuxisms:
* No printf on 9front; used echo instead, as no formatting was needed.
* date formatting is quite different:
> date -uf 'Y-MM-DD hh:mm:ss'
vs
> date -u '+%Y-%m-%d %H:%M:%S'
I think you'll agree that the date formatting in 9front is cleaner.
## password-store
This was mild torture. Due to already having migrated the ksh script
to rc on my OpenBSD system, I was expecting a fairly painless
migration, but no.
Things that were valid syntax in the OpenBSD implementation of rc were
not valid on 9front. I had lots of errors about concatenation of null
lists, or something like that... I found that using $"var instead of
$var seemed to alleviate the errors and get things working.
I also had to replace sending data to xclip with /mnt/wsys/snarf and
use an alternative to gpg to encrypt and decrypt[1] the file.
## Summary
The only way to truly learn another system, and discover if it can meet
your needs, is to use it how you intend to use it and do the things you
would ordinarily do day to day.
## References
[1]( https://pspodcasting.net/dan/blog/2019/plan9_desktop.html#pim_pass )

# Confession
gopher://gopher.icu/0/phlog/Computing/Confession.md
2025-10-09T12:08:51+02:00

# Electrolytic-Capacitors
gopher://gopher.icu/0/phlog/Computing/Electrolytic-Capacitors.md
2025-11-13T01:09:19+01:00