A feed aggregator that only polls feeds when they've been updated.
Björn Wärmedal · 9d4f711920 "Removed unused import" · 3 months ago
```
|\
\ \ @
/\_\/
 _|_
_|___|___
ANTENNA
```
This is a feed aggregator with a twist. It doesn't have a list of feeds that it repeatedly checks. Instead it takes feed URLs as user input, and checks newly submitted feeds every few minutes. No more useless hammering of dead feed files.
Most gemlogs live only a brief life, or post seldom and irregularly. A common feed aggregator relies on the consumer to curate a list of feeds, all of which will be fetched again and again and again at some interval. Few of them (most of the time none of them) have any new entries since last check. Some disappear, causing timeouts. I find this incessant blind polling a waste of resources.
This became very apparent when Solderpunk decided that the central CAPCOM installation should no longer poll every feed it knew of, but only a random subset of 100 of them each month. At the time of this writing, CAPCOM looks like a ghost town.
I believe that a better way to aid discoverability in any community is to let publishers push their content to where people are looking for it. And the gemini community is not yet too large for a central hub of information.
Antenna runs on Python 3 and mostly uses modules available in the standard library. Two exceptions I remember are `feedparser` and `sqlite3`. Please tell me if you try to run this and run into any undocumented requirements.
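As a quick sanity check before running anything, a snippet like this reports whether both modules named above can be found (the check itself is my addition, not part of Antenna):

```python
# Report whether the two modules Antenna needs are importable.
# sqlite3 ships with CPython; feedparser comes from PyPI.
import importlib.util

for mod in ("sqlite3", "feedparser"):
    found = importlib.util.find_spec(mod) is not None
    print(f"{mod}: {'ok' if found else 'missing - install it first'}")
```

If `feedparser` is reported missing, `python3 -m pip install feedparser` is the usual way to get it.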
The current code base makes a few assumptions that may or may not be true for your system:

* Code and public files live in the folders `antenna` and `public_gemini`, respectively.
* The user that runs the `queuefeeds.py` script has both read and write access to the SQLite3 database and the folder `antenna`, which the database file will be in.
* There are files called `about.gmi`, `log`, and `submit` in the `public_gemini` folder, because the generated page will link to them.

My setup is a useful reference. It looks like this:
```
~antenna/
|
+--- antenna/
|    |
|    +--- README.md
|    +--- LICENSE
|    +--- antennaDB.py
|    +--- ingestfeeds-wrapper.sh
|    +--- queuefeed.py
|    +--- ingestfeeds.py
|    +--- antenna.log
|    +--- antenna.sqlite
|    +--- blocklist.txt
|
+--- public_gemini/
     |
     +--- index.gmi
     +--- about.gmi
     +--- submit
     +--- log
```
Nothing outside of `~antenna/public_gemini/` is publicly reachable. The file `about.gmi` is a handwritten page about my instance. The `index.gmi` file is generated by Antenna. The two scripts `log` and `submit` change the working directory to `~antenna/antenna/` and then run `tail -n 50 antenna.log` and `./queuefeed.py`, respectively.
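The bodies of those two scripts can be sketched as shell functions like the ones below. `ANTENNA_DIR` is my addition for illustration; the real scripts simply hard-code the `~antenna/antenna/` path described above.

```shell
# Hypothetical sketch of the "log" and "submit" CGI scripts, written as
# functions so the path is easy to adjust.
ANTENNA_DIR="${ANTENNA_DIR:-$HOME/antenna}"

# Show the last 50 lines of the aggregator log.
log_cgi() {
    cd "$ANTENNA_DIR" && tail -n 50 antenna.log
}

# Hand the request off to the feed-queueing script.
submit_cgi() {
    cd "$ANTENNA_DIR" && ./queuefeed.py
}
```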
Clone this repo:

```shell
git clone https://notabug.org/tinyrabbit/gemini-antenna.git antenna
```
Enter the directory and create a database:

```shell
cd antenna
python3
>>> import antennaDB
>>> antennaDB.AntennaDB.createDB()
```
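For context, a `createDB()`-style bootstrap amounts to little more than the stdlib `sqlite3` calls below. The table and columns here are hypothetical placeholders; the real schema is defined in `antennaDB.py`.

```python
# Minimal sketch of a createDB()-style bootstrap using the stdlib sqlite3
# module. The "entries" table and its columns are made up for illustration;
# Antenna's actual schema lives in antennaDB.py.
import sqlite3

def create_db(path="antenna.sqlite"):
    con = sqlite3.connect(path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS entries "
        "(url TEXT PRIMARY KEY, updated TEXT, title TEXT)"
    )
    con.commit()
    return con

con = create_db(":memory:")  # in-memory here; Antenna writes antenna.sqlite
```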
Make sure that the user that executes the `queuefeeds.py` script has read and write permissions to the directory `antenna` as well as `antenna/antenna.sqlite`.
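One way to satisfy that permission requirement is sketched below. `DB_DIR` assumes the layout shown earlier; on a real multi-user install you would also `chown` the tree to whichever user runs `queuefeeds.py`.

```shell
# Ensure the database directory and file exist and are user-writable.
# DB_DIR is an assumption matching the layout shown earlier.
DB_DIR="${DB_DIR:-$HOME/antenna}"
mkdir -p "$DB_DIR"
touch "$DB_DIR/antenna.sqlite"
chmod u+rwx "$DB_DIR"
chmod u+rw  "$DB_DIR/antenna.sqlite"
```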
Create a cron job that runs `ingestfeeds.py` via the wrapper (substitute the user that should run the ingest job and the directory `ingestfeeds-wrapper.sh` is in):

```shell
echo "*/10 * * * * antenna /home/antenna/antenna/ingestfeeds-wrapper.sh" | sudo tee /etc/cron.d/antenna-ingestion
```
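For reference, a wrapper along the following lines would match what that cron entry expects. This is a hypothetical sketch written out to a file so it can be inspected; the real `ingestfeeds-wrapper.sh` ships in the repo.

```shell
# Write a hypothetical ingestfeeds-wrapper.sh: it changes into the code
# directory and runs the ingester, so the cron entry needs no knowledge
# of paths inside the repo. The real wrapper ships with Antenna.
cat > "$HOME/ingestfeeds-wrapper.sh" <<'EOF'
#!/bin/sh
cd /home/antenna/antenna || exit 1
exec python3 ./ingestfeeds.py
EOF
chmod +x "$HOME/ingestfeeds-wrapper.sh"
```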
If there are any specific domains you'd like to block from publishing to your Antenna, list them in a file named `blocklist.txt`, one URL per line, each starting with `gemini://` (or another scheme and separator, depending on what you'd like to block).
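For example, a `blocklist.txt` blocking two capsules might look like this (the domains below are placeholders, not real offenders):

```
gemini://spam.example.com/
gemini://flood.example.org/
```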