Half-Elf on Tech

Thoughts From a Professional Lesbian

Tag: hugo

  • Cookie Consent on Hugo


    Hugo is my favourite static site generator. I use it on a site I originally created in 1996 (yes, FLF is about to be 30!). Over the last 6 months, I’ve been totally overhauling the site from top to bottom, and one of the long-term goals I had was to add in Cookie Consent.

    Hugo Has Privacy Mode

One of the nice things about Hugo is that it has a built-in handler for Privacy Mode.

I have everything set to respect Do Not Track and use Privacy Mode whenever possible. It lightens my load a lot.
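For the record, that's just a few lines in Hugo's config. A minimal sketch of the relevant privacy settings in hugo.yaml (only the services a typical site might use; yours will vary):

    privacy:
      googleAnalytics:
        respectDoNotTrack: true
      youtube:
        privacyEnhanced: true   # serve embeds from youtube-nocookie.com
      vimeo:
        enableDNT: true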

    Built Into Hinode: CookieYes

The site makes use of Hinode, which has built-in support for cookie consent… kind of. It uses the CookieYes service, which I get, but I hate. I don't want to offload things to a service. In fact, part of the reason I moved off WordPress and onto Hugo for the site was GDPR.

I care deeply about privacy. People have a right to privacy, and to opt in to tracking. A huge part of that is minimizing the amount of data from your own websites that gets sent around to other people or saved on your own servers and services!

    Obviously I need to know some things. I need to know how many mobile users there are so I can make it better. I need to know what pages have high traffic so I can expand them. If everyone is going to a recap page only to try and find a gallery, then I need to make those more prominent.

    In other words, I need Analytics.

    And the best analytics? Still Google.

    Sigh.

    Alternatively: CookieConsent

I did my research. I checked a lot of services (free and paid), I looked into solutions people have implemented for Hugo, and then I thought: there has to be a simpler tool for this.

    There is.

    CookieConsent.

    CookieConsent is a free, open-source (MIT) mini-library, which allows you to manage scripts — and consequently cookies — in full GDPR fashion. It is written in vanilla js and can be integrated in any web platform/framework.

    And yes, you can integrate with Hugo.

    How to Add CookieConsent to Hugo

    First, download it. I have node set up to handle a lot of things, so I went with the easy route:

    npm i vanilla-cookieconsent@3.1.0

    Next, I have to add the dist files to my site. I added in a command to my package.json:

    "build:cookie": "cp node_modules/vanilla-cookieconsent/dist/cookieconsent.css static/css/cookieconsent.css && cp node_modules/vanilla-cookieconsent/dist/cookieconsent.umd.js static/js/cookieconsent.umd.js",
    

If you're familiar with Hinode, you may notice I'm not using the suggested way to integrate JS. If I were doing this in pure Hinode, I'd be copying the files to assets/js/critical/functional/ instead of my static folder.

    I tried. It errors out:

    Error: error building site: EXECUTE-AS-TEMPLATE: failed to transform "/js/critical.bundle-functional.js" (text/javascript): failed to parse Resource "/js/critical.bundle-functional.js" as Template:: template: /js/critical.bundle-functional.js:210: function "revisionMessage" not defined
    

    I didn’t feel like debugging the whole mess.

Anyway, once you get those files in, you need to make one more special JS file: your configuration (or initialization) file. And if you look at the configuration directions, they're a little lacking.

Instead of that, go look at their Google Example! This gives you everything you need to comply with Google Tag Manager Consent Mode, which matters to me. I copied that into /static/js/cookieconsent-init.js and customized it; for example, I don't have ads, so I left that section out.
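To give you an idea, here's a trimmed sketch of that init file. It follows the shape of CookieConsent's Google Consent Mode example, minus the ads category; the wording and category names are illustrative, not my exact production file:

    // /static/js/cookieconsent-init.js -- trimmed, illustrative sketch
    import '/js/cookieconsent.umd.js';

    // Google Consent Mode: default to denied until the user opts in.
    window.dataLayer = window.dataLayer || [];
    function gtag() { dataLayer.push(arguments); }
    gtag('consent', 'default', { analytics_storage: 'denied' });

    CookieConsent.run({
        categories: {
            necessary: { enabled: true, readOnly: true },
            analytics: {},
        },
        // When the user consents, flip the Google signal to granted.
        onConsent: () => {
            if (CookieConsent.acceptedCategory('analytics')) {
                gtag('consent', 'update', { analytics_storage: 'granted' });
            }
        },
        language: {
            default: 'en',
            translations: {
                en: {
                    consentModal: {
                        title: 'We use cookies',
                        description: 'Analytics cookies are set only if you accept.',
                        acceptAllBtn: 'Accept',
                        acceptNecessaryBtn: 'Reject',
                        showPreferencesBtn: 'Manage preferences',
                    },
                    preferencesModal: {
                        title: 'Cookie preferences',
                        acceptAllBtn: 'Accept all',
                        acceptNecessaryBtn: 'Reject all',
                        savePreferencesBtn: 'Save preferences',
                        sections: [
                            { title: 'Strictly Necessary', linkedCategory: 'necessary' },
                            { title: 'Analytics', linkedCategory: 'analytics' },
                        ],
                    },
                },
            },
        },
    });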

    Add Your JS and CSS

    I already have a customized header (/layouts/partials/head/head.html) for unrelated reasons, but if you don’t, copy the one from Hinode core over and add in this above the call for the SEO file:

    <link rel="stylesheet" href="/css/cookieconsent.css">

    Then you’ll want to edit /layouts/partials/templates/script.html and add in this at the bottom:

    <script type="module" src="/js/cookieconsent-init.js"></script>

    Since your init file contains the call to the main consent code, you’re good to go!

    The Output

    When you visit the site, you’ll see this:

[Screenshot of the cookie consent banner.]

Now, there's a typo in this one; it should say "That means if you click "Reject" right now, you won't get any Google Analytics cookies." I fixed it before I pushed anything to production. But I made sure to specify that so people know right away.

    If you click on manage preferences, you’ll get the expanded version:

[Screenshot of the expanded cookie preferences.]

The language is dry as the desert because it has to meet Google's requirements.

    As for ‘strictly necessary cookies’?

    At this time we have NO necessary cookies. This option is here as a placeholder in case we have to add any later. We’ll notify you if that happens.

    And how will I notify them? By using Revision Management.
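That's built into CookieConsent as well: bump the revision number in the config and returning visitors get re-prompted. Roughly (the number itself is arbitrary):

    CookieConsent.run({
        // Bump this when the cookie policy changes; visitors who
        // consented to an older revision will be asked again.
        revision: 2,
        // ... rest of the configuration as above
    });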

  • Small Hugo, Big Images


In working on my Hugo-powered gallery, I ran into some interesting issues, one of which came from my theme.

I use a Bootstrap-powered theme called Hinode. Hinode is incredibly powerful, but it's also very complicated and confusing, as its documentation is still in the alpha stage. It's like the early days of other web apps, which means a lot of what I'm trying to do is trial and error. Don't ask me when I learned about errorf, okay?

    My primary issues are all about images, sizing and filing them.

    Image Sizes

    When you make a gallery, logically you want to save the large image as the zoom in, right? Click to embiggen. The problem is, in Hinode, you can load an image in a few ways:

    1. Use the standard old img tag
    2. Call the default Hugo shortcode of {{< figure >}}
    3. Call a Hinode shortcode of {{< image >}}
    4. Use a partial

Now, that last one is a little weird, but basically you can't use a shortcode inside a theme file. While WordPress has a do_shortcode() method, in Hugo you use partial calls. And you have to know not only the exact file, but whether your theme even uses partials! Some don't, and you're left reconstructing the whole thing.

    Hinode has the shortcodes in partials and I love them for it! To call an image using the partial, it looks like this:

{{- partial "assets/image.html" (dict
    "url" $imgsrc
    "ratio" "1x1"
    "wrapper" "mx-auto"
    "title" $title)
-}}
    

That call will generate webp versions of my image, saved to the static image folder (which is a post of its own), complete with source sets so it's handy and responsive.

What it isn't is resized. Meaning if I used that code, I'd end up with the actual huge-ass image being used. Now, imagine I have a gallery with 30 images. That's 30 big-ass images. Not good. Not good for speed, not good for anyone.

    I ended up making my own version of assets/image.html (called lightbox-image.html) and in there I have this code:

{{ with resources.Get $imagefile }}
    {{ $image    = .Fill "250x250" }}
    {{ $imagesrc = $image.RelPermalink }}
{{ end }}
    

If the file is local (which is what that Get call is checking), it uses the file ($imagefile is the 'path' to the file) to make a 250×250 sized version and then grabs that new permalink to use.

{{ if $imagefile }}
    <img src="{{ $imagesrc }}" class="img-fluid img-lightbox" alt="{{ $title }}" data-toggle="lightbox" data-gallery="gallery" data-src="{{ $imagefile }}">
{{ end }}
    

    Boom!

    This skips over all the responsive resizing, but then again I don’t need that when I’m making a gallery, do I?
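One Hugo gotcha worth calling out: those fragments assign with = rather than :=, so the variables have to be declared before the with block; using := inside it would create new, scoped copies that vanish at the {{ end }}. Here's a condensed sketch of how the pieces of my lightbox-image.html hang together (the dict parameter names are illustrative):

    {{/* layouts/partials/assets/lightbox-image.html -- condensed sketch */}}
    {{ $imagefile := .url }}
    {{ $title     := .title }}
    {{ $image     := "" }}
    {{ $imagesrc  := "" }}

    {{/* Local file: make a 250x250 crop and use its permalink */}}
    {{ with resources.Get $imagefile }}
        {{ $image    = .Fill "250x250" }}
        {{ $imagesrc = $image.RelPermalink }}
    {{ end }}

    {{ if $imagefile }}
        <img src="{{ $imagesrc }}" class="img-fluid img-lightbox" alt="{{ $title }}" data-toggle="lightbox" data-gallery="gallery" data-src="{{ $imagefile }}">
    {{ end }}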

    Remote Image Sizes

Now let's add in a wrinkle. What if it's a remote image? What if I passed the URL of a remote image? For this, you need to know that on build, that Hinode code will download the image locally, because local images load faster. I can't use the same Get; I need GetRemote. But now I have a new issue!

    Where are the images saved? In the img folder. No subfolders, the one folder. And I have hundreds of images to add.

Mathematically speaking, you can put about four billion files in a folder before the filesystem complains. But if you've ever tried to find a specific file in a folder that large, you've seriously reconsidered your career trajectory. And practically speaking, the more files, the slower the processing.

Anyone else remember when GoDaddy announced a maximum of 1024 files in a folder on their shared hosting? While I question the long-term efficacy of that, I do try to limit my files. I know that the Get/GetRemote calls will tack a randomized name onto the end, but I'd like them to be organized.

    Since I’m calling all my files from my assets server (assets.example.com), I can organize them there and replicate that in my build. And my method to do that is as follows:

{{ if eq $image "" }}
    {{- $imageurl = . | absURL -}}
    {{- $imagesrc = . | absURL -}}

    {{ $dir := (urls.Parse $imageurl).Path }}

    {{ with resources.GetRemote $imageurl | resources.Copy $dir }}
        {{ with .Err }}
            {{ warnf "%s" . }}
        {{ else }}
            {{ $image    = . }}
            {{ $imageurl = $image.Permalink }}
            {{ $image    = $image.Fill "250x250" }}
            {{ $imagesrc = $image.RelPermalink }}
        {{ end }}
    {{ end }}
{{ end }}
    

I know that shit is weird. It pairs with the earlier code: if the image variable never got set, you know the image wasn't local. So I start by getting the image URL from 'this' (that's what the period is) as an absolute URL. Then I use the path of that URL to generate my local folder path! When I use the Copy command with the pipe, it will automatically use that path as the destination. So, to make up an example, a remote file at assets.example.com/shows/recap-01.jpg gets copied to /shows/recap-01.jpg in the build.

    Conclusion

You can totally make resized images in Hugo. While I wish it were easier to do, most people aren't as worried as I am about storing images in their repository, so it's less of an issue. Also, galleries on Hugo are fairly rare.

    I’m starting some work now with Hinode for better gallery-esque support, and we’ll see where that goes. Maybe I’ll get a patch in there!

  • Hugo and a Lot of Images


One issue with Hugo is that the way I'm deploying it is via GitHub Actions, which means every time I want to update, the site has to be totally rebuilt. The primary flaw in that process is that when Hugo builds a lot of images, it takes a lot of time. About 8 minutes.

The reason Hugo takes this long is that every time it runs a build, it regenerates all the images and resizes them. This is not a bad thing, since Hugo smartly caches everything in the /resources/_gen/ folder, which is not synced to GitHub, and when you run builds locally it doesn't take half as long.
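(If you're replicating this: keeping that cache out of the repo is one line in .gitignore, assuming the default cache location.)

    # .gitignore
    /resources/_gen/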

Now, this speed is about the same whether the images are local (as in, stored in the repository) or remote (which is where mine are located, on assets.example.com), because regardless, it has to build the resized images. This only runs on a build, since it's only needed for a build. Once the content is on the server, it's unnecessary.

The obvious solution to my speed issues would be to include the folder in GitHub, only I don't want to store any images on GitHub if I can help it (legal reasons: if there's a DMCA, it's easier to nuke them from my own storage). The less obvious solution is how we got here.

    The Basic Solution

    Here’s your overview:

    1. Checkout the repo
    2. Install Hugo
3. Run the repo installer (all the dependencies, etc.)
    4. Copy the files from ‘wherever’ to the Git resource
    5. Run the build (which will use what’s in the resource folder to speed it up)
    6. Copy the resources folder content back down to the server

    This would allow me to have a ‘source of truth’ and update it as I push code.

    The Setup

To start with, I had to decide where to upload the content. The folder is (right now) about 500 megs, and that's only going to get bigger. Thankfully I have a big VPS, and I was previously hosting around 30 gigs there, so I'm pretty sure this will be okay.

But the 'where' specifics needed a little more than that. I went with a subdomain (call it secretplace.example.com), and in there is a folder called /resources/_gen/.

Next: what did I want to upload for starters? I went with only uploading the static CSS files, because my plan involves pushing things back down after I re-run the build.

Then comes the downloading. Did you know there's nearly no documentation about how to rsync from a remote source to your GitHub Actions instance? It doesn't help that the words are all pretty generic, and search engines think "Oh, you want to know about rsync and a GitHub Action? You must want to sync from your action to your server!" No, thank you, I wanted the opposite.

While there's a nifty wrapper for syncing over SSH for GitHub, it only works one way. In order to do it the other way, you have to understand the actual issue that action is solving. The ssh-deploy action isn't solving rsync at all; that's baked into the action image (assuming you're using ubuntu…). No, what the action solves is the mishegas of adding in your SSH details (the key, the known hosts, etc.).

    I could use that action to copy back down to the server, but if you’re going to have to solve the issue once, you may as well use it all the time. Once that’s solved, the easy part begins.

    Your Actions

    Once we’ve understood where we’re going, we can start to get there.

    I’ve set this up in my ci.yml, which runs on everything except production, and it’s a requirement for a PR to pass it before it can be merged into production. I could skip it (as admin) but I try very hard not to, so I can always confirm my code will actually push and not error when I run it.

    name: 'Preflight Checks'
    
on:
  push:
    # A lone negative pattern under 'branches' never matches anything,
    # so branches-ignore is the supported way to exclude production.
    branches-ignore:
      - production

concurrency:
  group: ${{ github.ref }}-ci
  cancel-in-progress: true
    
    jobs:
      preflight-checks:
        runs-on: ubuntu-latest
    
        steps:
          - name: Do a git checkout including submodules
            uses: actions/checkout@v4
            with:
              submodules: true
    
          - name: Install SSH Key
            uses: shimataro/ssh-key-action@v2
            with:
              key: ${{ secrets.SERVER_SSH_KEY }}
              known_hosts: unnecessary
    
          - name: Adding Known Hosts
            run: ssh-keyscan -H ${{ secrets.REMOTE_HOST }} >> ~/.ssh/known_hosts
    
          - name: Setup Hugo
            uses: peaceiris/actions-hugo@v3
            with:
              hugo-version: 'latest'
              extended: true
    
          - name: Setup Node and Install
            uses: actions/setup-node@v4
            with:
              node-version-file: '.nvmrc'
              cache: 'npm'
    
          - name: Install Dependencies
            run: npm install && npm run mod:update
    
          - name: Lint
            run: npm run lint
    
          - name: Make Resources Folder locally
            run: mkdir resources
    
          - name: Download resources from server
            run: rsync -rlgoDzvc -i ${{ secrets.REMOTE_USER }}@${{ secrets.REMOTE_HOST }}:/home/${{ secrets.REMOTE_USER }}/${{ secrets.HUGO_RESOURCES_URL }}/ resources/
    
          - name: Test site
            run: npm run tests
    
          - name: Copy back down all the regenerated resources
            run: rsync -rlgoDzvc -i --delete resources/ ${{ secrets.REMOTE_USER }}@${{ secrets.REMOTE_HOST }}:/home/${{ secrets.REMOTE_USER }}/${{ secrets.HUGO_RESOURCES_URL }}/
    

Obviously this is geared towards Hugo. My npm run tests command is home-grown: it runs a build and then some tests on said build. It's separate from the linting, which comes with my theme. Because it runs a build, this is where I can make use of my pre-built resources.
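I won't reproduce my test tooling here, but as a purely hypothetical sketch, a package.json script along these lines does the same kind of job: build, then check the output (htmltest here is a stand-in link checker; my real tests are home-grown):

    "scripts": {
        "tests": "hugo --minify && htmltest -c .htmltest.yml public/"
    }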

    You may notice I set known_hosts to ‘unnecessary’ — this is a lie. They’re totally needed but I had a devil of a time making it work at all, so I followed the advice from Zell, who had a similar headache, and put in the ssh-keyscan command.

    When I run my deploy action, it only runs the build (no tests), but it also copies down the resources folder to speed it up. I only copy it back up on testing for the teeny speed boost.
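For completeness, the deploy-side steps are a subset of the CI ones above; something like this (abridged, using the same secrets):

    - name: Download resources from server
      run: rsync -rlgoDzvc -i ${{ secrets.REMOTE_USER }}@${{ secrets.REMOTE_HOST }}:/home/${{ secrets.REMOTE_USER }}/${{ secrets.HUGO_RESOURCES_URL }}/ resources/

    - name: Build Hugo
      run: hugo --minify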

    Results

    Before all this, my builds took 8 to 9 minutes.

After, they took 1 to 2, which is way better. Originally I only had it down to 4 minutes, but I was using wget to test things (and that's generally not a great idea; it's slow). Once I switched to rsync, it's incredibly fast. The Hugo build is still the slowest part, but it's around 90 seconds.

  • Why NOT WordPress?


    There’s a website I’ve been running since 1996.

    Yes, I know, I’m an Internet Old.

    1996. That’s 7 years before WordPress was a thing. So it’s not surprising this site was (at one point) moved from ‘something else’ to WordPress. Actually a lot of something-elses over the nearly 30 years of its existence. I moved it over to WP around 2005 (WordPress 1.5) and pretty much left it there for years.

    Now it’s different. Now the site is 100% powered by Hugo.

    Why Did I Stop Using WordPress?

To understand this decision, you have to keep in mind that the site had been in three parts for about 20 years.

    1. The Blog, where announcements were made, etc, powered by WordPress
    2. The Image Gallery, which had … images (about 20 Gigs), powered by netPhotoGraphics
    3. The Wiki/Library, which is the documentation, powered by Hugo

    Well, this year I got a nasty-gram and was forced to shut down the gallery. The simple truth was yes, the gallery included images that legally I didn’t have the right to use. No excuses. But the company involved was kind enough to work out a partial situation. I’m still in the middle of moving what images I can keep into a new home, but while that’s going on, I had a chance to sit down and face reality.

The gallery, you see, was the biggest feature of the site. Next was the Wiki/Library, and the blog was pretty much just announcements. There was a forum; it was removed ages ago. There was BuddyPress; ditto. People management just isn't fun.

There was also the matter of cross-linked data. Oh my, did a lot of images appear on both the blog and the gallery. I was going to have to purge the old blog posts en masse anyway, so at that point, I asked myself that big question.

    Do I want to move the library to WP, or the blog to Hugo?

    Consider the following:

    1. I was going to have to manually curate nearly 30 years of blog posts (took a few thousand down to about 50)
    2. I already had a running Hugo site and was familiar with it (it has over 1600 files)
    3. If I ported to WP, I would have to rebuild my data setup for how the data is output
    4. Importing blog posts as text only is incredibly easy

    With that in mind, it seemed obvious. Hugo.

    What’s Different with Hugo?

    Obviously I lose the ability to write a blog post and press publish. I have to add a new file, manually link it to my new image, and push to GitHub, where it’s auto-deployed to the site in question. The process for any data is basically this:

    1. Create a branch on my GitHub Repo
    2. Add the new content
    3. Merge the branch into Production

    At that point a GitHub action takes over.

    Beyond that, however, there are some things you take for granted with WP. Like the ease of a mailing list with Jetpack. Now, I did export my Jetpack subscribers and I’m working on a solution there, but yeah, that was a big hit. There’s also the matter of auto-deploying content to socials. But… honestly that’s been pretty much shit-and-miss lately, what with Facebook and Twitter being what they are.

    But all the ‘easy’ stuff? Well Hugo has RSS Feeds, it can process images as it builds (though that will cause your deployments to take longer), it’s open source, and best of all? The output is static HTML.

    Go ahead, try and hack that.

    How Hard Was It?

    Honestly, it took me about 3 days to pick a new theme, apply it, move my basic content over, and start rebuilding the blog. Migrating blog posts took me about 3 weeks. The hardest part was realizing I was going to have to write some complex Hugo Mod code to include my gallery with lightbox code, but I banged that out in an evening.

    There were frustrating moments. The Hugo community is significantly smaller than WordPress (I mean, whose isn’t?) and some of the code is a little on the ‘understood’ level (by which I mean things aren’t always spelled out, they assume you know what they’re talking about). In a way, it’s like using WordPress back in 2006 all over again, and look at where that’s taken me!

    I’m very happy with the result. I picked a ‘fancy’ theme, called Hinode, and it came with Dark Mode built in. I ported over my custom code for recaps (I have a whole star rating system) and started building out topical small galleries where I could.

    If I was a newbie to the web world? This would have been impossible. Then again, a lot of the work I’m doing in WP would be impossible for a newbie. About the only tool I’ve used where I think it’d be easier would be … Maybe MediaWiki? But only because you can build templates from the editor backend.

    Even with Full Site Editing, WordPress would have been a bear and a half.

    Historical Notes

    The ‘Library’ was once on MediaWiki because I had this idea to be a public repository anyone could edit. Only I kept getting attacked by spammers, so I turned off registration. Then I had to apply all sorts of plugins, only MediaWiki didn’t allow you to self-update like WordPress, and I had to write scripts and it was just a pain.

    I rebuilt it all as Hugo about 6 years ago, and I really enjoyed it. GoLang is not something I’m familiar with, and sometimes the language drives me to drink, but so does PHP.

    The Gallery used to be a home-grown SHTML setup, which then moved to a now defunct project, Gallery, and then to ZenPhoto, and finally to NetPhotoGraphics after ZenPhoto decided to be more than just a photo library. NetPhotoGraphics is hella fun to use, and I even built an embed tool for it, so you could paste a link into WP.

    I did that with Hugo as well, and I’ll probably port that back to the new site sooner or later.

    It Is Sad Though

    Basically this site has been a part of my dev growth from day one. I wouldn’t be working in WordPress were it not for this site, and I owe it a lot. Moving to Hugo is the end of an era, and it is a bit sad. But at the same time, I feel like I’m now in even more control over everything, and I’m making a leaner, faster, website every day.

    I have no regrets for the steps I’ve taken on the way, and none about this move. It’s nice to not have to worry about updates all the time. After all, what’s on the site is just HTML.

    I do miss being able to schedule posts though…

  • oEmbedding Hugo


    No, not Hugo …

[Image: Hugo Weaving as "Mitzi Del Bra" from "The Adventures of Priscilla, Queen of the Desert".]

    No no. Hugo

[Image: the Hugo logo.]

I've been using Hugo to power a part of a website for quite a while now. Since 2015 I've had a static site, a simple library with around 2000 posts, that I wanted to be a static, less-possible-to-be-hacked site. Its purpose was to be an encyclopedia, and as a Hugo-powered site, it works amazingly.

    But… I do use WordPress, and sometimes I want to link and embed.

    Integrations Are Queen

    It helps to have a picture of how I built things. Back in *cough* 1995, the site was a single HTML page. By 1996 it was a series of SHTML (yeah) with a manually edited gallery. Fast-forward to 2005 and we have a gallery in the thousands and a full blown wiki.

    Now. Never once did I have integrated logins. While I love it for the ease of … me, I hate it from a security standpoint. Today, the blog is powered by WordPress and the gallery by NetPhotoGraphics and the ‘wiki’ by Hugo (I call it a library now). Once in a while I’ll post articles or transcripts or recaps over on the library and I want to cross link to the blog to tell people “Hey! New things!”

    But… from a practical standpoint, what are the options?

    1. A plain ol’ link
    2. A ‘table’ list of articles/transcripts/etc by name with links
    3. oEmbed

    Oh yes. Option 3.

    oEmbed and Hugo is Complex

Since Hugo is a static HTML generator, you have to create faux 'endpoints'; you cannot make a dynamic JSON generator per post. Most of what you'll find when you google for oEmbed and Hugo is how to make it read oEmbed (like "adding a generic oEmbed handler for Hugo"). I wanted the other way around, so I broke down what I needed to do:

    1. Make the ‘oembed’ JSON
    2. Make the returning iframe
    3. Add the link/alternate tag to the regular HTML

Unlike with NetPhotoGraphics, where I could make a single PHP file that generated the endpoints and the JSON and the iframe, I had to approach it from a different angle with Hugo, and ask myself "How do I want the 'endpoints' to look?"

See, you actually can make a pseudo-endpoint of example.com/json/link/to/page, which would generate the iframe from example.com/link/to/page, and then example.com/oembed/link/to/page, but this comes with a weird cost. You will actually end up having multiple folders on your site, and you'd want to make an .htaccess to block things.

This has to do with how Hugo (and most static site generators) make pages. See, if I wanted to make a page for 'about', I would go into /posts/ and make a file called about.md with the right headers. But that doesn't make a file called about.html; it actually makes a folder in my public_html directory called about, with a file in there named index.html. That's basic web directory stuff, though.

    But Hugo has an extra trick, which allows you to make custom files. Most people use it to make AMP pages and they explain the system like this:

A page can be output in as many output formats as you want, and you can have an infinite amount of output formats defined as long as they resolve to a unique path on the file system. In the above table, the best example of this is AMP vs. HTML. AMP has the value amp for Path so it doesn't overwrite the HTML version; e.g. we can now have both /index.html and /amp/index.html.

    Except… your ‘unique path’ doesn’t have to be a path! And you can customize it to kick out differently named files. So instead of /index.html and /amp/index.html I could do /index-amp.html in the same location.

    So that means my options were:

    1. A custom folder (and subfolders) for every post per ‘type’ of output
    2. Subfiles in the already existing folder

    I picked the second and here’s how:

    Output Formats

    The secret sauce for Hugo is making a new set of output formats.

    outputFormats:
      iframe:
        name: "iframe"
        baseName: "iframe"
        mediaType: "text/html"
        isHTML: true
      oembed:
        name: "oembed"
        baseName: "oembed"
        mediaType: "application/json"
        isPlainText: true

By omitting the path value and telling it that my baseName is iframe or oembed, I'm telling Hugo not to make a new folder, but to rename the files! Instead of making /iframe/index.html and /iframe/about/index.html, I'm making /about/iframe.html (and, for the JSON format, /about/oembed.json)!

    Boom.

    The next trick was to tell Hugo what ‘type’ of content should use those new formats:

    outputs:
      home: [ "HTML", "JSON", "IFRAME", "OEMBED" ]
      page: [ "HTML", "IFRAME", "OEMBED" ]
      section: [ "HTML", "IFRAME", "OEMBED" ]

    Home also has a JSON which is something I use for search. No one else needs it.

    New Template Files

    I’ll admit, this took me some trial and error. In order to have Hugo generate the right files, and not just a copy of the main index, you have to add new template files. Remember those basenames?

    • index.oembed.json
    • index.iframe.html

    Looks pretty obvious, right? The iframe file is the HTML for the iframe. The oembed is the JSON for oembed discovery. Those go right into the main layouts folder of your theme. But… I ended up having to duplicate things in order to get everything working and that meant I also made:

    • /_default/baseof.iframe.html
    • /_default/baseof.oembed.json
    • /_default/single.iframe.html
    • /_default/single.json

Now, if you're wondering "Why is it named single.json?", I don't know. What I know is if I named it any other way, I got this error:

    WARN: found no layout file for “oembed” for layout “single” for kind “page”: You should create a template file which matches Hugo Layouts Lookup Rules for this combination.

    So I did that and it works. I also added in these:

    • /section/section.iframe.html
    • /section/section.oembed.json

    Since I make heavy use of special sections, that was needed.

    The Template Files

    They actually all look pretty much the same.

    There’s the oembed JSON:

    {
      "version": "1.0",
      "provider_name": "{{ .Site.Title }}",
      "provider_url": "{{ .Site.BaseURL }}",
      "type": "rich",
      "title": "{{ .Title }} | {{ .Site.Title }}",
      "url": "{{ .Permalink }}",
      "author_name": "{{ if .Params.author }}{{ .Params.author }}{{ else }}Anonymous{{ end }}",
      "html": "<iframe src=\"{{ .Permalink }}iframe.html\" width=\"600\" height=\"200\" title=\"{{ .Title }}\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" class=\"hugo-embedded-content\"></iframe>"
    }
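Rendered for a hypothetical page at example.com/about/, that template spits out something like:

    {
      "version": "1.0",
      "provider_name": "My Library",
      "provider_url": "https://example.com/",
      "type": "rich",
      "title": "About | My Library",
      "url": "https://example.com/about/",
      "author_name": "Anonymous",
      "html": "<iframe src=\"https://example.com/about/iframe.html\" width=\"600\" height=\"200\" title=\"About\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" class=\"hugo-embedded-content\"></iframe>"
    }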
    
    

    And there’s the iframe HTML:

    <!DOCTYPE html>
    <html lang="en-US" class="no-js">
    <head>
    	<title>{{ .Title }} &middot; {{ .Site.Title }}</title>
    	<base target="_top" />
    	<style>
    		{{ partial "oembed.css" . | safeCSS }}
    	</style>
    	<meta name="robots" content="noindex, follow"/>
    	<link rel="canonical" href="{{ .Permalink }}" />
    </head>
    <body class="hugo hugo-embed-responsive">
    	<div class="hugo-embed">
    		<p class="hugo-embed-heading">
    			<a href="{{ .Permalink }}" target="_top">{{ .Title }}</a>
    		</p>
    		<div class="hugo-embed-excerpt">
    			{{ .Summary }}...
    		</div>
    		<div class="hugo-embed-footer">
    			<div class="hugo-embed-site-title">
    				<a href="{{ .Site.BaseURL }}" target="_top">
    					<img src="/images/oembed-icon.png" width="32" height="32" alt="{{ .Site.Title }}" class="hugo-embed-site-icon"/>
    					<span>{{ .Site.Title }}</span>
    				</a>
    			</div>
    		</div>
    	</div>
    </body>
    </html>
    
    

    Note: I set summaryLength: 10 in my config to limit the summary to something manageable. And no, you’re not mis-reading that, the library generally has no images.

    And then in my header code for the ‘normal’ html pages:

{{ if not .Params.notoembed }}
{{ "<!-- oEmbed -->" | safeHTML }}
{{/* .Permalink already ends in a slash, so no extra "/" before oembed.json */}}
<link rel="alternate" type="application/json+oembed" href="{{ .Permalink }}oembed.json"/>
{{ end }}
    

I wanted to leave a way to mark certain pages as non-embeddable, and while I'm not using it at the moment, the logic remains.
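Using it is just a front matter flag (hypothetical page, obviously):

    ---
    title: "Some Page I Don't Want Embedded"
    notoembed: true   # suppresses the oEmbed discovery link above
    ---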

Does it Work?

    Of course!

    Nice, quick, to the point.

  • Hugo Deployment via GitHub Actions


    For a long time I’ve been managing deployment of a Hugo site via a few home-grown scripts and actions. All that has changed.

    The Setup

    So let’s get our setup explained:

    1. Hugo is used to manage a library of data
    2. The posts and the theme are in the same repo, but stored under ‘Content’ (which has data, posts, and static) and themes (which has … the theme)
    3. Most of the changes are in the post content

    Okay, now this is not a conversation about why (or why not) use Hugo. I like it for my pretty much text-only wiki type content, in that it lets me keep things organized and usable from everything including my iPad.

But this use of Hugo comes with a cost. One of the reasons people love CMS tools like WordPress is that you can edit in your browser, and frankly, that's super easy. Using a static site builder, you have to run the static site build command somewhere. For a while I had a local 'deployme' command that did this:

    $ hugo -F -s /home/username/Development/site/hugo
    $ rsync -aCt --delete --exclude-from '/home/username/Development/rsync-exclude.txt' --force --omit-dir-times -e ssh /home/username/Development/site/hugo/ user@example.com:/home/username/domain/hugo
    

    Not super complicated, right? I have it on my laptop but it means I can’t always push code. Like if I edit directly on Github or on my iPad.

    Normally I’d look into something like Codeship (which I’ve talked about before) but … I thought it was high time I sat down and made things simpler.

    What’s Simpler?

    In this case, simpler means “fewer moving parts that could go wrong.”

See, I love Codeship; it lets me do a lot of cool things, but it's also a little fragile and (honestly) sucky when it comes to Hugo. Creating a server and running hugo took longer than it did on my laptop. A lot longer. Many minutes longer.

    If I ran it locally it was a few seconds:

    Start building sites …
    hugo v0.87.0+extended darwin/arm64 BuildDate=unknown
    
                       |  EN
    -------------------+-------
      Pages            | 1762
      Paginator pages  |    0
      Non-page files   |    0
      Static files     |   92
      Processed images |    0
      Aliases          |    2
      Sitemaps         |    1
      Cleaned          |    0
    
    Built in 1562 ms
    

    When I did it on Codeship it would be 5-14 minutes! That’s crazy, right? Which is why I moved off Codeship and down to local. But that came with the cost of limiting when and where I could run anything. While my code is super simple, it’s also silo’d and that’s bad.

In order to achieve simplicity, what I really needed was code that runs from GitHub, or on the server where the site is. Back in 'the day' I installed hugo on the server, but also Git! That meant I pushed to my git repo, which was on the same server, and I used post-commit hooks to deploy. I've toyed around with a few iterations, but then I moved to a new server where I didn't install Go because … I didn't need it.

    And that means here, simple is:

    • runnable from anywhere
    • automated
    • restricted when needed
    • not crossing multiple services

    Which led me to Github actions.

    Github Actions

    This is a service from Github.

    GitHub Actions makes it easy to automate all your software workflows, now with world-class CI/CD. Build, test, and deploy your code right from GitHub. Make code reviews, branch management, and issue triaging work the way you want.

    In other words, Github saw us all using Travis and Codeship and thought “We could do that and keep people here, right?”

    But Actions goes beyond just automation. The Actions interface allows you to run tests, builds, checks, and, yes, deploys. It’s an order of magnitude faster than tools like Codeship because it’s a stripped down, basic interface. It’s also controlled by Github so you don’t need more access than committing code.

    There are some cons, though. One of the headaches with Codeship was that when Hugo updated, Codeship might just … stop working right. So you had to find the magic sauce to make it work. With Github Actions, you’re using ‘actions’ built by other people a lot of the time, and if you’re familiar with the drama that happened in npm a while ago, you may share my fear of “What if someone else deletes their action…?”

Yeah, I have concerns.

    main.yml

    Here’s my code:

    name: 'Generate and deploy'
    
    on:
      push:
        branches: [ production ]
    
    jobs:
      deploy-website:
        runs-on: ubuntu-latest
        steps:
          - name: Do a git checkout including submodules
            uses: actions/checkout@v2
            with:
              submodules: true
    
          - name: Setup Hugo
            uses: peaceiris/actions-hugo@v2
            with:
              hugo-version: 'latest'
              # extended: true
    
          - name: Build Hugo
            run: hugo --minify
    
          - name: Deploy to Server
            uses: easingthemes/ssh-deploy@main
            env:
              SSH_PRIVATE_KEY: ${{ secrets.SERVER_SSH_KEY }}
              ARGS: "-rlgoDzvc -i"
              SOURCE: "public/"
              REMOTE_HOST: ${{ secrets.REMOTE_HOST }}
              REMOTE_USER: ${{ secrets.REMOTE_USER }}
              TARGET: "/home/username/domain/library/"
              #EXCLUDE: "/dist/, /node_modules/"
    

There are a number of alternatives, but I picked peaceiris/actions-hugo because that developer is well known and respected. And while there are all-in-one Hugo build-and-deploy actions, I decided to separate the steps because I liked peaceiris' code. This meant I needed an rsync or SSH deployment. I settled on easingthemes/ssh-deploy because they strongly encourage the use of secrets, and that's a good sign to me. Also, it's heavily recommended by Flywheel, and I cannot imagine them being reckless.

The only 'gotcha' I had was that the directions about how to set up SSH were not great.

    To make it work, you need to create a pem key on the server:

    ssh-keygen -m PEM -t rsa -b 4096
    

    Then you need to put that key in a secret (I named mine SERVER_SSH_KEY). But what they don’t mention quite as clearly is what this means:

    Private key part of an SSH key pair. The public key part should be added to the authorized_keys file on the server that receives the deployment.

Yes, they're saying "the public key for your own server has to be in the authorized_keys file on that same server." And yes, that's a weird thing to say, but there it is. That means you copy your own key from your server at ~/.ssh/id_rsa.pub (the .pub is the part that tells you this is a PUBLIC key) and paste it onto the end of ~/.ssh/authorized_keys on the same server. Yes, it's really funky.
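In practice, it boils down to something like this on the server (the deploy_key filename is my choice here, not the action's):

    # Generate a PEM-format key pair for the action to use
    ssh-keygen -m PEM -t rsa -b 4096 -f ~/.ssh/deploy_key

    # Authorize the public half for logins to this same server
    cat ~/.ssh/deploy_key.pub >> ~/.ssh/authorized_keys

    # The private half (~/.ssh/deploy_key) is what goes into the
    # SERVER_SSH_KEY secret on GitHub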

    My ongoing concerns

    I mentioned that there are security issues. I spend a lot of time in WordPress plugin land, where people could commit nefarious code to a plugin any day. Some do. I’ve banned enough people from .org for stupid stuff like that. And some of Github’s advice matches my own: the only way to be safe is to do the damn research yourself.

    But that’s not really something most people can do. And it’s something for a longer post in general. My short list of concerns right now is:

    • the action I’m using is deleted
    • the action is edited and breaks my flow/injects malware
    • the action is used to steal my credentials

There are ways to mitigate this:

    • I can use actions made and managed and maintained by Github only (those are under the Actions org) — those are unlikely to be deleted and can be trusted as much as Github
    • I can make copies of the actions I want to use (and periodically remember to update them…)
    • I can make use of encrypted secrets to hide sensitive information

    But. It’s a risk. And I know it.