~elais/elais.codes

9bf4a0c631b3b521995487900eef945b8d728d06 — Elais Player 4 months ago 69bd4c4
make links l33t
2 files changed, 56 insertions(+), 60 deletions(-)

M haunt.scm
M posts/how-i-deploy-this-site/index.md
M haunt.scm => haunt.scm +5 -6
@@ -69,13 +69,13 @@
(define %header-menu
  `(div (@ (class "sidebar"))
        (a (@ (href "/"))
-           "(home) page")
+           "(home) pag3")
        (a (@ (href "https://git.sr.ht/~elais"))
-           "software (repo)")
+           "s0ftwar3 (repo)")
        (a (@ (href "/feed.xml"))
-           "(rss) feed")
+           "(rss) f33d")
        (a (@ (href "/about.html"))
-           "about (me)")))
+           "ab0ut (me)")))

(define (site-layout site title body)
  `((doctype "html")


@@ -117,8 +117,7 @@
    (center "---")))

(define (collection-template site title posts prefix)
-  `(
-    ,(map
+  `(,(map
      (lambda (post)
        (let ((uri
               (string-append

M posts/how-i-deploy-this-site/index.md => posts/how-i-deploy-this-site/index.md +51 -54
@@ -15,26 +15,26 @@ package manager for my distribution (also called guix) and has a lot of neat
features built around building things in reproducible environments. Haunt is a
static site generator written in guile scheme, which allows me to reuse tooling
that I've set up to manage my guix system. I primarily use sourcehut to
-store my repositories for its web interface's simplicity and approach to using
-existing tools rather than tools that lock in users, but it does not include a
-static site hosting service, so I decided to deploy my site to IPFS mostly
-because it is shiny and new.
+store my repositories, but it unfortunately does not include a
+static site hosting service, so I decided to deploy my site to IPFS because it's
+shiny and I did not want to manage a server.

-## Guix, A Most Advanced Waste of Time
+## Setting Up The Environment With Guix

The three of you who read my inaugural post know that I nurture an unhealthy
infatuation with this package manager. Briefly, it is like nix except written
in guile scheme and committed to only adding free software to its official
-repositories.
-
-I first create a manifest for my development environment in my project's root
-directory called `guix.scm` and activate it using the command `guix environment
---ad-hoc -m guix.scm`. This is similar to creating a `shell.nix` file and
-running the command `nix-shell` in that `guix environment` spawns a new shell
-that includes the packages described in the manifest and does not pollute the
-user's package profile, like a generalized directory-local nvm. One can also
-activate these shells automatically and export their environment to an editor of
-choice using direnv.
+repositories. I personally believe it has a more intuitive interface than nix
+and feature parity where it matters most (yak-shaving).
+
+The first thing I created when starting this project was a manifest. The file is
+called `guix.scm` and I activate it using the command `guix environment --ad-hoc
+-m guix.scm`. This is similar to creating a `shell.nix` file and running the
+command `nix-shell` in that `guix environment` spawns a subshell that includes
+the packages described in the manifest and does not pollute the user's package
+profile, like a generalized nvm. One can also activate these
+shells automatically and export their environment to an editor of choice using
+direnv.
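The direnv trick mentioned above can be a one-line `.envrc`. A sketch, not taken from this repository, assuming direnv and guix are both installed:

```shell
# .envrc -- hypothetical direnv hook: export the manifest's search
# paths into the current shell instead of spawning a subshell
eval "$(guix environment --ad-hoc -m guix.scm --search-paths)"
```

Run `direnv allow` once and the packages from `guix.scm` land on PATH whenever you enter the project directory.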

```scheme
;;; guix.scm


@@ -54,19 +54,20 @@ choice using direnv.
## Building A Static Website With Haunt

Haunt is a simple static site generator that allows authors to treat their
-websites as guile scheme programs. It includes a command line utility that only has
-two commands: `haunt build` and `haunt serve`. The former converts markdown (or skriblio)
-files to html and the latter sets up a server primarily for local development.
+websites as guile scheme programs. It includes a shell utility that only has two
+commands: `haunt build` and `haunt serve`. The former converts markdown (and
+more) files to html and the latter sets up a server primarily for local
+development.
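The entire authoring loop therefore comes down to two commands (a usage sketch; both commands appear elsewhere in this post):

```shell
# compile the markdown under posts/ into a static site
haunt build

# serve it locally for development; -w watches for changes and rebuilds
haunt serve -w
```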


### Using Scheme

-Haunt provides several procedures for declarative site generation. These
-include procedures for creating site metadata, builders for post tagging and
-feeds, and to load static resources.
+Haunt provides several procedures for declarative site generation. These include
+procedures for creating site metadata, feeds, post templates, and static
+resource loaders, standard shit really.

```scheme
-;;; procedure for generating a site
+;;; a site declaration procedure
(site #:title "Elais Codes"
      #:domain "elais.codes"
      #:default-metadata


@@ -83,7 +84,7 @@

Posts and pages are templated with SXML, which is an alternative syntax for XML
that uses S-expressions. Since HTML is practically a subset of XML, this allows
-me to embed my site's templates directly in scheme code using backquotes.
+the site's templates to be embedded directly in scheme code using backquotes.

```scheme
;; the backquote (`) character signals that in the expression that


@@ -109,38 +110,35 @@ me to embed my site's templates directly in scheme code using backquotes.
At the time of writing, this site's content is written in markdown. Each
article is stored in the `${PWD}/posts` directory. Once it's time to publish a
new version of this site I run `haunt build` and, barring any errors, this
-generates a static site that is stored in `${PWD}/site`. There is not much more
-to it than that. However, like any cultist wielding the eldritch horror of emacs
-as a daily driver I would prefer to use org-mode as my primary means of editing
-*all* text. Eventually I will get around to writing a conversion script that
-takes org files and uses markdown as an intermediate format before building the
-site; I've just been lazy about it.
+generates a static site that is stored in `${PWD}/site`. During development I
+run `haunt serve -w`, which serves the static content and adds a watcher for
+changes; it is very lightweight and serves the content in no time.

## Deploying with IPFS

-IPFS is an adolescent peer-to-peer protocol for storing and sharing data in a
+[IPFS](https://ipfs.io/) is an adolescent peer-to-peer protocol for storing and sharing data in a
distributed file system. I use it primarily because sourcehut does not host
static websites like github or gitlab and I wanted to play with something new.
-Making a site deployed to IPFS accessible to the world wide web requires two
+Making an IPFS-deployed site accessible to the world wide web requires two
services, one for pinning and another for DNS resolution.
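The DNS side is typically handled with DNSLink: a TXT record on a `_dnslink.` subdomain that maps the domain to the site's current content hash. A sketch, with a placeholder CID:

```
_dnslink.elais.codes. 300 IN TXT "dnslink=/ipfs/<root-cid>"
```

A gateway that understands DNSLink can then resolve `elais.codes` to the pinned content; publishing a new version of the site means updating this one record.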

### Pinning a Static Website

-Content addresses in IPFS are immutable and will always find data if it is still
-on someone's node in the network. However data on nodes are treated as a cache
+Content addresses in IPFS are immutable and will always find data if said data
+is still on a node in the network. However, data on nodes is treated as a cache
by default and when the node fills up it runs a garbage collector, emptying
-its cache to make room for more data. "Pinning" tells a node that the data it
-is hosting is important and should not be thrown away during garbage collection.
-So to make sure my deployed site is always available I use a pinning service
-called Pinata.
+its cache to make room for more data. "Pinning" tells a node that the data it is
+hosting is important and should not be thrown away during garbage collection. So
+to make sure my deployed site is always available and doesn't get garbage
+collected I use a pinning service called [Pinata](https://pinata.cloud).
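The semantics Pinata automates can be seen with the stock ipfs CLI on a local node (illustrative only; `<cid>` stands for the hash that `ipfs add` prints):

```shell
# add the built site recursively; -Q prints only the root CID
ipfs add -rQ site/

# pin that CID so garbage collection will not evict it
ipfs pin add <cid>

# run the GC: unpinned blocks are reclaimed, pinned ones survive
ipfs repo gc
```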

Pinata "pins" my content, which shifts the burden of maintaining and monitoring
an ipfs node to them and their highly available public nodes rather than keeping
-it on me and having to have an always on machine in my house. Though I have to
+it on me and having to manage an always-on machine. Though I have to
sign up to use their service, the api is dead simple to use and they don't
charge users until they reach 1 GB of pinned content. Since my site weighs in at
-~90kb and I don't have any images (yet) it's going to be a while before I hit
-that limit.
+~90kb and won't be growing much anytime soon, it's going to be a while before I
+hit that limit.

```bash
# pinata's api environment variables that I store in a .env file


@@ -176,7 +174,7 @@ IPFS_DEPLOY_CLOUDFLARE__API_TOKEN=<api-token>

So far we have created and built *elais.codes* and set up hosting platforms
for it. Now we need to deploy the damn thing. To do this we use a
-nodejs package called ipfs-deploy, which has been hinted at in some of the
+nodejs utility called ipfs-deploy, which has been hinted at in some of the
previous code blocks. `ipfs-deploy` only requires a `.env` file with the correct
api tokens and credentials, and deploying is a one-liner.



@@ -214,24 +212,23 @@ publish: build
	guix environment --ad-hoc -m guix.scm -- npx ipfs-deploy -p pinata -d cloudflare -O site/
```

-I use `guix environment` to run the commands in a subshell with all the
+I use `guix environment` to run the commands in a subshell with all of their
dependencies. I'm not sure if I can run `guix environment` once and then exit
-after the build is finished. I haven't tried and the time it takes to create the
-subshell is trivial, in my opinion.
+after the build is finished; I've never tried, and this works, so I don't care.
+The time it takes to create the subshell is trivial.

-## Future Work
+## Conclusion

-There are a few things I would like to add in the future with regards to
-building this site. For instance, I would like to write my posts using org-mode.
-Like most people who uses the eldritch horror that is emacs for writing prose in
-addition to code, org-mode is my first choice when editing text. I have pandoc
-listed as a dependency but haven't actually cared enough to write the script to
-take a bunch of org-mode files and convert them to markdown as an intermediate
-step before running `haunt build`.
+There are a few things I would like to add in the future. For instance, I would
+like to write my posts using org-mode. Like most people who use the eldritch
+horror that is emacs for writing prose in addition to code, org-mode is my first
+choice when editing text. I have pandoc listed as a dependency but haven't
+actually cared enough to write the script to take a bunch of org-mode files and
+convert them to markdown as an intermediate step before running `haunt build`.
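That script would likely be only a few lines of pandoc. A hypothetical sketch (the file layout is assumed, not taken from this repository):

```shell
# convert every org file under posts/ to markdown, then build as usual
find posts -name '*.org' | while read -r f; do
  pandoc --from org --to markdown "$f" --output "${f%.org}.md"
done
haunt build
```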

Also, you'll notice that I use two proprietary platforms, Pinata and Cloudflare,
as part of my deployment process. Proprietary platforms are not ideal and
can represent a threat to user privacy and security. I'd like to look into
platforms that are more free but also don't carry the burden of self-hosting,
-but for this project I went with what worked. There are hopefully better
-solutions for people who care about software freedom.
+but for this project I went with what worked. This is not a solution for those
+who strictly adhere to the idea of software freedom.