I use NetPhotoGraphics to handle a 2.5 gig gallery, spanning back 20 or so years. The gallery used to be a home grown PHP script, then it was Gallery, then Gallery 2, then ZenPhoto, and now NetPhotoGraphics (which ostensibly is a fork of ZenPhoto, but diverged in a way I’m more supportive of).
Anyway. I use this gallery in conjunction with a WordPress site. I’ll post news on WordPress and link to the gallery. But for years, to do that my choices were:
make a text link
make a photo which is a link
copy all the thumbnails over and link each one
Those all suck. Especially the third, since you can’t (out of the box) custom link images in a gallery in WordPress and frankly I don’t like any of the plugins.
Once upon a time, I used a ZenPhoto plugin, but it’s been abandoned for years and stopped working a while ago. I needed something that had an elegant fallback (i.e. if you uninstall the plugin) and seriously thought about forking the WordPress plugin…
But then I had a better idea.
Why oEmbed?
oEmbed is an industry standard. By having your app (Flickr, Twitter, your WordPress blog) offer a custom endpoint, someone can embed it easily into their own site! WordPress has supported many embeds for a long time, but as of 2015, it’s included oEmbed Discovery. That’s why you can paste in a link to Twitter, and WordPress will automagically embed it!
I maybe wrote an oembed plugin for another CMS so I could embed things into WordPress… Because the other option was a MASSIVE complex WP Plugin and FFS why not?
(Note: I shut down my twitter account in November ‘22 when it was taken over by a narcissist who brought back abuse.)
I just pasted the URL https://twitter.com/Ipstenu/status/1441950326777540609 in and WordPress automagically converts it to a pretty embed. About the only social media company you can’t do that with is Facebook, who requires you to make an app (I use Jetpack for it). Anyway, point being, this is also how tools like Slack or Discord know to embed your content when you paste in a link!
By making an oEmbed endpoint, I allow my site to become more shareable and more engageable, which is a net positive for me. If I do it right, out of the box it’ll allow anyone with a WordPress site (i.e. me) to paste in a URL to my gallery and have it look pretty! Win win!
The NetPhotoGraphics Plugin
Now. I’m a terrible designer, so I literally copied the design WordPress itself uses for embeds and spun up a (relatively) fast solution: oEmbed for NetPhotoGraphics.
The code is one file (oembed.php) which goes in the /plugins/ folder in your NetPhotoGraphics install. Then you activate the plugin and you’re done. There is only one thing to customize: the ‘gallery’ icon. By default it grabs a little NPG logo, but if you put a /images/oembed-icon.png image in your gallery, it’ll use that.
And does it work? Here’s how the first version looked on a live page:
I wanted to limit the images since sometimes I have upwards of 200 (look, episodes of CSI are a thing for me). And frankly pasting in a URL to the gallery is a lot easier than drilling down on a list of a hundred albums. This is exactly what I needed.
Since creating that, I’ve worked with the NetPhotoGraphics developer, who helped me make it better.
One Bug and a Future
There’s room to grow here. Thanks to S. Billard, it’s gotten a lot more flexible. You can override the basic design with your own theme, you can replace the icons, and there are even options to adjust the size of the iframes. Part of me thinks it could use a nicer design, maybe a single-photo Instagram-style embed instead of what I have, but that’s not my forte. Also I have yet to get around to putting in ‘share’ options. (Pull Requests welcome!)
And yes, I know the security isn’t ‘enough’ but I wasn’t able to get it to work how I wanted due to a weird bug. You see, I did run into a rare quirk with WordPress due to how I built out the site. IF you have your gallery in a subfolder under/beside a WordPress install AND you try to embed the gallery into that WordPress site, you MAY find out WP thinks your embed is WordPress and not NPG.
In my case, I have:
example.com – WordPress
example.com/gallery – NetPhotoGraphics
I guess WordPress reads a little too deep into who’s WP and who’s not, which resulted in me making this WordPress filter:
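The gist of the filter, sketched here with placeholder URLs rather than my exact production code, is to register the gallery as an explicit oEmbed provider so discovery stops assuming it’s WordPress:

```php
// Sketch with placeholder URLs: tell WordPress that anything under the gallery
// path has its own oEmbed endpoint, so discovery doesn't assume it's WordPress.
function my_npg_oembed_provider( $providers ) {
	$providers['#https?://example\.com/gallery/.*#i'] = array( 'https://example.com/gallery/?oembed=true', true );
	return $providers;
}

if ( function_exists( 'add_filter' ) ) {
	add_filter( 'oembed_providers', 'my_npg_oembed_provider' );
}
```

The function name, gallery path, and endpoint are all stand-ins for whatever your own install uses.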
The posts and the theme are in the same repo, but stored under ‘Content’ (which has data, posts, and static) and themes (which has … the theme)
Most of the changes are in the post content
Okay, now this is not a conversation about why (or why not) use Hugo. I like it for my pretty much text-only wiki type content, in that it lets me keep things organized and usable from everything including my iPad.
But this use of Hugo comes with a cost. One of the reasons people love CMS tools like WordPress is that you can edit in your browser, and frankly that’s super easy. Using a static site builder, you have to run (somewhere) the static site build command. For a while I had a ‘deployme’ local command that did this:
Not super complicated, right? I have it on my laptop but it means I can’t always push code. Like if I edit directly on Github or on my iPad.
Normally I’d look into something like Codeship (which I’ve talked about before) but … I thought it was high time I sat down and made things simpler.
What’s Simpler?
In this case, simpler means “fewer moving parts that could go wrong.”
See I love Codeship, it lets me do a lot of cool things, but it’s also a little fragile and (honestly) sucky when it comes to Hugo. Creating a server and running hugo took longer than it did on my laptop. A lot longer. Many minutes longer.
If I ran it locally it was a few seconds:
Start building sites …
hugo v0.87.0+extended darwin/arm64 BuildDate=unknown
| EN
-------------------+-------
Pages | 1762
Paginator pages | 0
Non-page files | 0
Static files | 92
Processed images | 0
Aliases | 2
Sitemaps | 1
Cleaned | 0
Built in 1562 ms
When I did it on Codeship it would be 5-14 minutes! That’s crazy, right? Which is why I moved off Codeship and down to local. But that came with the cost of limiting when and where I could run anything. While my code is super simple, it’s also silo’d and that’s bad.
In order to achieve simplicity, what I really needed was code that runs from Github. Or on the server where the site is. Back in ‘the day’ I installed hugo on the server, but also Git! That meant I pushed to my git repo, which was on the same server, and used post-commit hooks to deploy. I’ve toyed around with a few iterations, but then I moved to a new server where I didn’t install Go because … I didn’t need it.
And that means here, simple is:
runnable from anywhere
automated
restricted when needed
not crossing multiple services
Which led me to Github actions.
Github Actions
This is a service from Github.
GitHub Actions makes it easy to automate all your software workflows, now with world-class CI/CD. Build, test, and deploy your code right from GitHub. Make code reviews, branch management, and issue triaging work the way you want.
In other words, Github saw us all using Travis and Codeship and thought “We could do that and keep people here, right?”
But Actions goes beyond just automation. The Actions interface allows you to run tests, builds, checks, and, yes, deploys. It’s an order of magnitude faster than tools like Codeship because it’s a stripped down, basic interface. It’s also controlled by Github so you don’t need more access than committing code.
There are some cons, though. One of the headaches with Codeship was that when Hugo updated, Codeship might just … stop working right. So you had to find the magic sauce to make it work. With Github Actions, you’re using ‘actions’ built by other people a lot of the time, and if you’re familiar with the drama that happened in npm a while ago, you may share my fear of “What if someone else deletes their action…?”
Yeah, I have concerns.
main.yml
Here’s my code:
name: 'Generate and deploy'

on:
  push:
    branches: [ production ]

jobs:
  deploy-website:
    runs-on: ubuntu-latest
    steps:
      - name: Do a git checkout including submodules
        uses: actions/checkout@v2
        with:
          submodules: true

      - name: Setup Hugo
        uses: peaceiris/actions-hugo@v2
        with:
          hugo-version: 'latest'
          # extended: true

      - name: Build Hugo
        run: hugo --minify

      - name: Deploy to Server
        uses: easingthemes/ssh-deploy@main
        env:
          SSH_PRIVATE_KEY: ${{ secrets.SERVER_SSH_KEY }}
          ARGS: "-rlgoDzvc -i"
          SOURCE: "public/"
          REMOTE_HOST: ${{ secrets.REMOTE_HOST }}
          REMOTE_USER: ${{ secrets.REMOTE_USER }}
          TARGET: "/home/username/domain/library/"
          #EXCLUDE: "/dist/, /node_modules/"
There are a number of alternatives, but I picked peaceiris/actions-hugo because that developer is well known and respected. And while there are all-in-one Hugo build-and-deploy actions, I decided to separate the steps because I liked peaceiris’ code. This meant I needed an rsync or SSH deployment. I settled on easingthemes/ssh-deploy because they strongly encouraged the use of secrets, and that’s a good sign to me. Also it’s heavily recommended by Flywheel, and I cannot imagine them being reckless.
The only ‘gotcha’ I had was that the directions about how to set up SSH were not great.
To make it work, you need to create a pem key on the server:
ssh-keygen -m PEM -t rsa -b 4096
Then you need to put that key in a secret (I named mine SERVER_SSH_KEY). But what they don’t mention quite as clearly is what this means:
Private key part of an SSH key pair. The public key part should be added to the authorized_keys file on the server that receives the deployment.
Yes, they’re saying “the public key for your own server has to be on the authorized_keys for the server.” And yes, that’s a weird thing to say, but there it is. That means you copy your own key from your server at ~/.ssh/id_rsa.pub (the .PUB is the part that explains this is a PUBLIC key) and you paste that in to the end of ~/.ssh/authorized_keys on the same server. Yes, it’s really funky.
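In shell terms, that’s literally this, run on the server itself (assuming the default key names):

```shell
# Append the server's own public key to its own authorized_keys file.
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
```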
My ongoing concerns
I mentioned that there are security issues. I spend a lot of time in WordPress plugin land, where people could commit nefarious code to a plugin any day. Some do. I’ve banned enough people from .org for stupid stuff like that. And some of Github’s advice matches my own: the only way to be safe is to do the damn research yourself.
But that’s not really something most people can do. And it’s something for a longer post in general. My short list of concerns right now is:
the action I’m using is deleted
the action is edited and breaks my flow/injects malware
the action is used to steal my credentials
There are ways to mitigate this:
I can use actions made and managed and maintained by Github only (those are under the Actions org) — those are unlikely to be deleted and can be trusted as much as Github
I can make copies of the actions I want to use (and periodically remember to update them…)
I’ve been using Tiny Tiny RSS for … well years. Almost a decade. I like it a lot, the interface is nice and pretty to use. But there have always been some serious lingering issues with it.
The developer is very opinionated, to the point of aggression
The development is Docker-only, to the point that non-Docker support is non-existent
Dropping support for ‘non-modern’ browsers means Safari is not supported
Now I’m opinionated, and I can be curt and blunt at times. And I work with a lot (A LOT) of people who are similar. I do plugin reviews for WordPress.org — trust me, I know from opinionated developers. I have lost track of the time I’ve spent arguing with prima donnas who cannot fathom that their code might not be god’s gift to the universe.
The majority of people, thankfully, are not like that. They recognize no one is perfect, they understand that sometimes you have to make allowances in your code for the sake of a system, and most of all they aren’t aggro when told “no.” (If you find yourself getting pissed off, BTW, when someone reviews your code, yes, I’m talking about you.)
Anyway. Andrew Dolgov is an amazing developer, a talented one at that. But he has a very ‘my way or GTFO’ kind of flow, and since it’s a single-man project, I really do get that. And for the time that he happily supported PHP on whatever, I didn’t care. The code worked, he didn’t have any strong opinions that offended me (like being a Nazi sympathizer, and yes, I’ve ditched software I love for that), and so what if he was a bit prickly?
But… He’s Docker all in. And I like Docker, but I don’t want to run it all the time, and certainly not for a flippin’ RSS reader that is PHP and SQL and that’s it. As time went on, it got harder and harder and harder to manage and maintain a slight fork, to the point that it’s just not worth it.
The Replacement: FreshRSS
FreshRSS. It’s a barebones, simple, easy to install RSS reader. How easy? It’s practically a ‘famous five minute install.’
The install was to download the latest release, unzip it on my server, and then I went to the URL where I’d installed it ( i.e. https://example.com ) and entered the DB credentials. Then I made a new account and boom. Done.
Much like with TTRSS, I have to set up a cron job to run the refresh, which I set to hourly:
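The crontab entry looks something like this (the PHP binary and install path are assumptions; adjust them for your server):

```shell
# Refresh all FreshRSS feeds at the top of every hour.
0 * * * * php /path/to/FreshRSS/app/actualize_script.php > /tmp/FreshRSS.log 2>&1
```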
Don’t waste time with the various plugins; they’re not supported and, in my experience, don’t work. Also, if you’re mystified trying to find out how to export from Tiny Tiny RSS, it’s not just you. I had to trawl through the forums to find an example that didn’t work, but it did link me to the code, and I was able to figure it out from there.
Once you have that, save the OPML file and pop over to FreshRSS and import. It will keep your categories and everything.
Yeah, that was it!
The Tweaks
Most of the settings are fine as is. I turned off the option to mark as read when I scroll by (I regularly use unread to know what I need to handle next):
I also added in a filter to mark a specific feed as read unless it mentions a keyword, which was as easy as adding a filter of -intitle:keyword to that feed.
Recently Gravity Forms was added to a site I work on. Now, I’ve never used it before, so I was hands off (except for changing the email it sent to) and I know pretty much nothing at all about it. But what I do know is that there’s a real jerk out there who’ll spam it, given a chance.
Unlike other contact form plugins out there, Gravity Forms comes with built in free integration with Akismet! But, like pretty much every other plugin out there, it does not integrate with my disallowed keys.
I’m a big proponent of not reinventing the wheel, and I strongly feel that being able to block someone from comments and contact forms should be a done deal. I opted to mark people who do this as spam, instead of a rejection, so they will never know if I ever saw their email or not. This is a questionable use of the spam settings, but at the same time, it’s been a rough couple of years.
The Process
Since the disallowed_keys list contains emails and words, the first thing I wanted to do was strip out everything that wasn’t an email or an @-domain. That means foobar@example.com is a valid entry, and @spammers-r-us.com is a valid entry, but foobar on its own is not. I run through my disallowed list and add everything valid to an array in a new variable.
Before I can pass through the email, though, I need to remove any periods from the username. You see, Gmail allows you to use foobar and foo.bar and fo.o.b.a.r all as the same valid username on your email. Yes, all of those would go to the same person. To get around this, I remove all periods and make a clean username.
Also I have to consider the reality of jerks, who do things like foobar+cheater@example.com — Gmail allows you to use the + sign to get clever and isolate emails, which I use myself to track what sign-up spams me. At the same time, I don’t want people to get around my blocks, so I have to strip everything following the plus-sign from the email.
While I’m doing this, I’ll save the domain as its own variable, because that will allow me to check if @spammers-r-us.com is on my list or not.
Once I’ve got it all sorted, I do an in-array: if either the exact (clean) email is in the array, or the exact @-domain is in the array, it’s spam and I reject.
The Code
add_action( 'gform_entry_is_spam_1', 'my_spam_filter_gform_entry_is_spam_1', 10, 3 );

function my_spam_filter_gform_entry_is_spam_1( $is_spam, $form, $entry ) {

	// If this is already spam, we're gonna return and be done.
	if ( $is_spam ) {
		return $is_spam;
	}

	// Email is field 2.
	$email = rgar( $entry, '2' );

	// Build a list of valid emails & domains from disallowed_keys.
	$disallowed_emails = array();
	$disallowed_array  = explode( "\n", get_option( 'disallowed_keys' ) );

	// Make a list of spammer emails and domains.
	foreach ( $disallowed_array as $spammer ) {
		if ( is_email( $spammer ) ) {
			// This is an email address, so it's valid.
			$disallowed_emails[] = $spammer;
		} elseif ( strpos( $spammer, '@' ) !== false ) {
			// This contains an @ so it's probably a whole domain.
			$disallowed_emails[] = $spammer;
		}
	}

	// Break apart email into parts.
	$emailparts = explode( '@', $email );
	$username   = $emailparts[0];        // i.e. foobar
	$domain     = '@' . $emailparts[1];  // i.e. @example.com

	// Remove all periods (i.e. foo.bar > foobar ).
	$clean_username = str_replace( '.', '', $username );

	// Remove everything AFTER a + sign (i.e. foobar+spamavoid > foobar ).
	$clean_username = strstr( $clean_username, '+', true ) ? strstr( $clean_username, '+', true ) : $clean_username;

	// Rebuild email now that it's clean.
	$clean_email = $clean_username . '@' . $emailparts[1];

	// If the email OR the domain is an exact match in the array, then we know this is a spammer.
	if ( in_array( $clean_email, $disallowed_emails, true ) || in_array( $domain, $disallowed_emails, true ) ) {
		return true;
	}

	// If we got all the way down here, we're not spam!
	return false;
}
Of Note…
Before you use this yourself, you will need to customize two things!
gform_entry_is_spam_1 is actually the specific form I’m checking. Form ID 1. Customize that to match your form ID.
$email = rgar( $entry, '2' ); — you may have noticed I put ’email is field 2′ as a note above it. That’s because email is the second field on form 1, so I hard grabbed it. If yours is different, change that.
Also … I actually broke this out into two files, one that just checks “Is this a spammer?” and the Gravity Forms file, so the latter calls spammers.php and checks the email against the is_spammer() function. The reason I did that is because I need to run this same check on Jetpack’s contact form. Both call the same function to know if someone is evil.
Note: While this post is about using Algolia, the irony is that shortly after I posted it, I removed Algolia. The reason being, InstantSearch counts as a separate search per letter used — that means I was about to skyrocket over my allowance and hit the thousands-a-month. I feel their pricing was quite unclear about this. But hey, now you know!
My friends, it’s been a while. And if you follow my rants on Twitter (this is not a suggestion that you should), you saw I faced off with ElasticSearch, my nemesis.
Moons ago, I attempted to use ElasticSearch to make a site I run faster. At the time, I struggled with search ranking and all those things. Then it broke with Jetpack and made my server core-dump. So in 2016 I tossed it in the can and walked away. After all, I didn’t need it. WP’s search was sufficient.
Fast-forward four years and, with around 12k posts to search, guess what isn’t so okay anymore?
What’s Wrong With WordPress Search?
Nothing.
When you search on WordPress, it uses SQL queries to check in and find all instances of ‘a thing’ (whatever it is you searched for). So logically if you have a lot of posts (or a lot of content, be that in the form of a few huge posts or a high number of smaller ones), you’re going to experience slower searches.
Also WordPress’ search isn’t customizable. You can’t tell it “Don’t search page X” or even “Prioritize post titles over content.” This leads to some odd results.
But realistically neither of those issues are ‘wrong.’ Those are broad choices made to support 80% of WordPress users.
This means your question of “What’s wrong with search?” is really “Are there specific cases wherein the default search won’t be the best choice for me?” And those two issues? They’re why. If your site is large (or getting there) and if you need to ‘weigh’ search results to prioritize A over B, then this post is for you.
Solving The Right Problem
While the first thing you always look at is “What do I need to solve?” by the time you get around to ranting about how WP search sucks, you kind of know where to start. That is, either search is too slow or you need to customize it. Or both.
If you need to customize your search results, I recommend you look at plugins like Relevanssi, which does a great job of handling that. However, there are two critical flaws with most (if not all) self-hosted plugin solutions: they make your database big, and they don’t scale forever. Let’s be clear here, a bigger DB is not going to help your speed issues. It becomes harder to back up and more fragile. Relevanssi is refreshingly honest about this, warning you that your DB will triple in size, but also making sure you know that it won’t work over 50k posts.
Subsequently, a large site means you need to start looking at services. Algolia, Swiftype, ElasticSearch, and Solr are all amazing, viable, services. Some have plugins for easy WordPress integration, some do not. Some are open source, some are not. Some let you build your own… Let me just show you:
Name | Open Source | Service | Roll Your Own | Plugin
---- | ----------- | ------- | ------------- | ------
Algolia | No | Yes | No | Unofficial
Swiftype | No | Yes | No | Official
Elastic | Yes | Yes | Yes | Unofficial
Solr | Yes | No | Yes | Unofficial
You get the idea. Lots of options. And I did not pick Elastic (who owns Swiftype now, BTW).
You see, Elastic is more than just a search. It’s really a whole database of your content. This means you can hook into it to speed up WordPress queries for long/large tables. But … That isn’t my problem. My problem is just search.
Services are Spendy
I ended up using a service because I was going bonkers. Seriously. The ‘directions’ for both Solr and Elastic are really terrible. They go in with some assumptions that you’ve done similar things (I haven’t) and don’t have what I would call an ‘intro doc.’ I got a lot further with Solr than with Elastic, but the WP integration was weird. And Elastic … no.
Installing it was weirdly easy. The problem was I could not find any information about configuring it. People say “You must secure it.” Okay, sure, I can do that… But no one sat and explained why you want the nodes, what they do together, why you want them on separate servers (or even that you do) and .. Honestly I wanted to throw my laptop out the window.
It does, mind you, bring up an important note. Search storage and Elastic services are expensive. Even Jetpack, who offers a bundled Elasticsearch integration (yes, that’s what Jetpack Search is) would cost a site with 10,000 posts around $600 a year. Even using Amazon’s Elasticsearch it’s going to run you a lot. How much? Well if you just toss in their defaults and accept the large settings, it’s to the tune of $22 a day. Give or take. Small settings for my site? Around $8 a day.
ElasticPress (whom I do recommend if you’re using Elastic) starts at $79 a month. Jetpack’s new search is free for small sites, but for mine (again, we’re over 10k posts) it would begin at $60.
Algolia though … 12k records is about $3 a month. And it’s all search.
Enter Algolia
The name meaning delights me:
inability to speak due to mental deficiency or a manifestation of dementia.
Because when you are searching, you often feel like you’re losing your mind and you have a problem. Search is hard okay? There’s a reason AltaVista, Lycos, Yahoo, and now Google are important. Searching is crazy weird and hard and sometimes it’s faster to go “lezwatchtv ACTOR NAME” than to search on our site.
That was not good at all.
Algolia is one of the more straightforward setups I’ve had in a while.
I decided to make my records smaller. Algolia only cares about the number of records, not the size, as long as each record is under 10k. I have a lot of meta data and a lot of records. If I was to index everything, I’d be around 15k records, which isn’t bad but I really only needed about 12k of them.
One of the odd things the plugin does is that it uses separate indexes for Auto-Complete. So I could store all my searchable posts and all my shows, characters, etc etc. Which would make for 50k records, and I didn’t want that. Sure it makes some aspects of search easier, but I knew I could do this a better way.
I started by making a plugin and removing some records:
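A sketch of that tiny plugin (the filter names here are from my memory of the Algolia plugin’s hooks, so verify them against your installed version):

```php
// Assumed filter names from the Algolia plugin; check your version's hooks.
function my_prefix_never_index( $should_index, $item ) {
	// Never index users or taxonomy terms.
	return false;
}

if ( function_exists( 'add_filter' ) ) {
	add_filter( 'algolia_should_index_user', 'my_prefix_never_index', 10, 2 );
	add_filter( 'algolia_should_index_term', 'my_prefix_never_index', 10, 2 );
}
```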
This tells the plugin “Never index users or taxonomies.” Most of you will want the taxonomies! I don’t, mostly because they don’t really impact how people search. And yes, I did study my logs. No one cares who wrote what for my site, and that’s okay.
Refining Search Results
Next I needed to make sure that autocomplete (which I use) and the search page both put the right content to the top.
There is one and only one ‘flaw’ with Algolia, and that’s that they don’t make it easy to define a ‘perfect’ match. I have a case where I have 5 post types (posts, pages, shows, actors, characters) and there’s crossover. If I search for “One Day at a Time” I get everything that mentions it. Which is not what I wanted. And while the title of the post I wanted to find was the TV show “One Day at a Time”, it was bringing up my blog posts (and the page!) first.
This was solvable because the plugin is amazing. I filtered and told it what attributes to remove:
add_filter( 'algolia_post_shared_attributes', 'my_prefix_algolia_attributes', 10, 2 );
add_filter( 'algolia_searchable_post_shared_attributes', 'my_prefix_algolia_attributes', 10, 2 );

function my_prefix_algolia_attributes( array $attributes, WP_Post $post ) {
	// Remove things we're not using to make it easier.
	$remove_array = array( 'taxonomies_hierarchical', 'post_excerpt', 'post_modified', 'comment_count', 'menu_order', 'taxonomies', 'post_author', 'post_mime_type' );

	foreach ( $remove_array as $remove_this ) {
		if ( isset( $attributes[ $remove_this ] ) ) {
			unset( $attributes[ $remove_this ] );
		}
	}

	return $attributes;
}
This ensured my records stayed small enough, because I had some math to do.
The my_prefix_algolia_attributes() function also needed to promote certain posts over others, so I added in a switch using some data I’d already saved:
// Add Data for individual ranking.
switch ( $post->post_type ) {
	case 'post_type_shows':
		// Base score on show score + 50.
		$attributes['score'] = round( get_post_meta( $post->ID, 'lezshows_the_score', true ), 2 );
		$attributes['score'] = 50 + (int) $attributes['score'];
		break;
	case 'post_type_characters':
		$attributes['score'] = 150;
		break;
	case 'post_type_actors':
		$attributes['score'] = 150;
		break;
	default:
		$attributes['score'] = 0;
		break;
}
This adds an attribute of ‘score’ based on post type, which I could weight up or down as I wanted.
Then I went into Algolia’s admin, and this is where the search tool becomes a champ. Under Indices -> Configuration, I changed up the Ranking and Sorting:
Their default is: ["typo","geo","words","filters","proximity","attribute","exact","custom"]
Mine is: ["exact","score","post_title","attribute","post_type_label","typo","proximity","words","is_sticky","post_date"]
This actually handled 90% of what I needed without any custom tweaks or rules.
But weight, there’s more!
Those ‘Attributes’ are the searchable parts of the attributes I was messing with in the refinements section. Most of what they’re used for is helping rank and sort the relevant data to make sure Sara Lance is on top. Which she always is. But I also wanted to make some related data show up.
By default, the searchable attributes were title and content. I added in a new attribute called lwtv_meta and in it I added more data. When the index is built for a character (say), it adds a list of all the actors who play the character and all the shows they’re on into that meta attribute. Then I added that attribute to the search. This means if you look for “Legends of Tomorrow” you will see our girl Sara Lance.
That has a small side effect though… Where’s the show!?
So I still have some kinks to work out, but the point is that with a couple tweaks and some extra data, I got everything set up in 3 days. Bonus? The plugin came with templates I quickly tweaked to match my theme. And I’m bad at design!
So PHP 7.4 is rolling out, WP is hawking it, and you know your code is fine so you don’t worry. Except you start getting complaints from your users! How on earth could that be!? Your plugin is super small and simple, why would this happen? You manage to get your hands on the error messages and it’s weird:
NOTICE: PHP message: PHP Warning: Illegal string offset 'TERM' in /wp-content/plugins/foobar/search-form.php on line 146
NOTICE: PHP message: PHP Warning: ksort() expects parameter 1 to be array, string given in /wp-content/plugins/foobar/search-form.php on line 151
NOTICE: PHP message: PHP Warning: Invalid argument supplied for foreach() in /wp-content/plugins/foobar/search-form.php on line 153
NOTICE: PHP message: PHP Warning: session_start(): Cannot start session when headers already sent in /wp-content/plugins/foobar/config.php on line 12
But that doesn’t make any sense, because your code is pretty straightforward. Let’s take the errors one at a time.
Error 1: Illegal string offset
Illegal string offset 'TERM' in /wp-content/plugins/foobar/search-form.php on line 146
We’re clearly trying to save things into an array, but the ‘illegal string offset’ warning means you’re actually trying to use a string as an array.
An example of how we might force this error is as follows:
$fruits_basket = array(
	'persimmons' => 1,
	'oranges'    => 5,
	'plums'      => 0,
);

echo $fruits_basket['persimmons']; // echoes 1

$fruits_basket = "a string";

echo $fruits_basket['persimmons']; // illegal string offset error

$fruits_basket['peaches'] = 2; // this will also throw the same error in your logs
Simply, you cannot treat a string as an array. Makes sense, right? The second example (peaches) fails because you re-set $fruits_basket to a string, and once it’s a string, you have to re-declare it as an array before you can treat it like one.
But with our error, we can see line 146 is $name_array[ $term->name ] = $child; and that should be an array, right?
Well, yes, provided $name_array is an array. Hold on to that, and let’s look at error 2.
Error 2: ksort expects an array
The second error is that the function wanted an array and got a string:
NOTICE: PHP message: PHP Warning: ksort() expects parameter 1 to be array, string given in /wp-content/plugins/foobar/search-form.php on line 151
We use ksort() to sort the order of an array and here it’s clearly telling us “Buddy, $name_array isn’t an array!” Now, one fix here would be to edit line 149 to be this:
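Sketching it from memory, the guard was something like this:

```php
$name_array = ''; // pretend this is the bad empty-string default

// The guard: only run ksort() when $name_array really is an array.
if ( is_array( $name_array ) ) {
	ksort( $name_array );
}
// $name_array stays untouched (and unsorted) because it wasn't an array.
```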
That makes sure it doesn’t try to do array tricks on a non-array, but the question is … why is that not an array to begin with? Hold on to that again, we want to look at the next problem…
Error 3: Invalid argument
Now that we’ve seen the other two, you probably know what’s coming here:
NOTICE: PHP message: PHP Warning: Invalid argument supplied for foreach() in /wp-content/plugins/foobar/search-form.php on line 153
This is foreach() telling us that the argument you passed isn’t an array. Again.
What Isn’t An Array?
We’re forcing the variable $name_array to be an array on line 146. Or at least we thought we were.
From experience, using $name_array[KEY] = ITEM; was just fine from PHP 5.4 up through 7.3, but as soon as I updated a site to 7.4, I got that same error all over.
The issue was resolved by changing line 141 to this: $name_array = array();
Instead of defaulting $name_array as empty with '', I used the empty array(), which makes it an array.
An alternative is this: $name_array = (array) '';
This casts the variable as an array. Since the array is meant to be empty here, it’s not really an issue either way.
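Putting the whole fix together, here’s a minimal sketch (the variable names are lifted from the error messages; the rest is invented for illustration):

```php
$name_array = array(); // declared as an array up front: the line 141 fix

// Now the later lines (146, 151, 153) behave.
$name_array['beta']  = 'second';
$name_array['alpha'] = 'first';

ksort( $name_array ); // sorts by key: alpha, then beta

foreach ( $name_array as $name => $child ) {
	// ... build the search form from the sorted entries ...
}
```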
Trying to use values of type null, bool, int, float or resource as an array (such as $null["key"]) will now generate a notice.
The lesson here is that PHP 7.4 is finally behaving in a strict fashion with regards to data types. Whenever a non-array variable is used like an array, you get an error, because you never said “Mother May I…” and declared it an array.
Whew.
Now to be fair, this was a warning previously, but a lot of us (hi) missed it.
So. Since WordPress is pushing PHP 7.4, go check all your plugins and themes for that and clean it up. Declare an array or break.
Oh and that last error? Headers already sent? Went away as soon as we fixed the variable.