WordPress’ Post name Permalinks, Block Editor, and JSON Error

Following the right path makes finding clues easier.

As stated in many posts on this site, there is no traffic-generating goal here. When I first set up the site, the permalinks either defaulted to “Plain” or I set them to “Plain”; I don’t remember which. Every URL for a page, post, comment, or view ended with something like /?p=59 or /?p=128. It kept the URLs short but provided no other real benefit.

Recently I began to think about changing the permalinks to title style. I now appreciate having some description of the page in the URL and wanted to enable it for this site. Plus, a descriptive URL is supposed to help with search engine ranking. Simple enough: change the Permalink Structure from Plain to Post name.

Not so simple.

The change saves without error and the site functions as expected when browsed afterward. The URLs all include the title rather than ending in /?p=xx. All seems well. Then edit an existing draft or published post, or create a new one. When saving as a draft or attempting to publish, an invalid JSON message is displayed. The edits, or the new post, will not be saved and cannot be published!

I started tracking down “not a valid JSON response” with “WordPress” and got plenty of hits. As is usual when there are lots of hits, there is agreement across articles about many of the settings to review for a solution. And none of the common solutions applied! Either the conditions were already as specified, or making the change didn’t make saving edits in the block editor possible. The same was true for the solutions that weren’t common to all the articles.

I turned off the block editor and used the classic editor. Posts could be edited and saved, and new posts could be created and saved as drafts or published. The site was working, and with Post name permalinks, but nothing could be edited or created with the block editor; the classic editor had to be used.

Trying to get a handle on the problem, I reviewed the server logs and the web server logs, enabled and reviewed the WordPress logs, and monitored the console output in my browser. I also tried editing from different operating systems and with different browsers. The problem was consistent regardless of the OS or browser used while editing. All I was able to find was that files under a wp-json path weren’t being found.

With this information I searched the server from / for a path that includes wp-json. There is no path with that string. That gave the troubleshooting a new search focus: rather than WordPress and the JSON error, WordPress and a missing wp-json module. Eventually I found some articles that recommended AllowOverride all be enabled for the web server’s directory statement.

I tested and it worked! Post name Permalinks could be enabled and the block editor worked as expected. But I couldn’t reconcile enabling that directive for all sites on the server. Fortunately the <Directory> statement can live inside a <VirtualHost> statement. This site’s <VirtualHost> statement now contains a <Directory> statement with AllowOverride all. Restart the server and *boom* the site works and editing in the block editor works.
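In hindsight the connection makes sense: with Post name permalinks, WordPress routes pretty URLs, including the /wp-json/ REST route the block editor saves through, via the rewrite rules in its .htaccess file, and Apache ignores .htaccess unless AllowOverride permits it. A minimal sketch of the change; the DocumentRoot path is a placeholder, not this site’s actual path, and the TLS directives are omitted:

<VirtualHost *:443>
    ServerName wp.boba.org
    DocumentRoot /var/www/wp.boba.org

    <Directory /var/www/wp.boba.org>
        # Let WordPress' .htaccess rewrite rules (permalinks, /wp-json/) take effect.
        AllowOverride all
        Require all granted
    </Directory>
</VirtualHost>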

It is surprising to me that not one of the WordPress-related troubleshooting articles I found, which agreed on at least a handful of steps among them, ever mentioned the Apache <Directory> statement.

The troubleshooting tree that finally worked, tracking down reasons for the missing wp-json path, wasn’t hinted at in the initial error message. Searches with “WordPress” and “the response is not a valid JSON response” kept turning up the same potential causes, none of which included discovering the missing wp-json, how to troubleshoot it, or that a web server configuration requirement, not a WordPress setting, would resolve the issue.

When troubleshooting, keep digging, and try different searches if what first turns up doesn’t help. Keep digging until the corrective action is found, even if the cause, or messages pointing to it, can’t be found. The Apache log level was set to info and there was nothing with the term “wp-json” in any log.

Cloud Storage, overGrive, and my own cloud?

Host my own cloud? Part of the journey is here. Not a full-blown private cloud yet.

Syncing with external (cloud) directories is such a common thing. Providers have big incentives to lock you into their platform and don’t always provide a straightforward or full-featured way to connect if you’re not using their connection tool. And there are security considerations that affect the method(s) available for connecting to an account.

I’ve had a Gmail account for ages because I’ve had Android phones. I got in the habit of using Dropbox on the phone as convenient storage for documents on the phone and my computers, various Linux flavors, and Windows. Then Dropbox changed its policy and limited free accounts to two connected devices. Now I needed a way for my second computer (primary computer + phone hit the device limit) to sync files.

overGrive to the rescue! A perpetual license, with plenty of personal-use seats, for something like $5 back in 2020. Buy once, install on each PC, and have full Google Drive sync on my local drives. Make a change on any device, save the file, open it on another device, and edit the synced copy with the latest changes.

I change my computer’s OS from time to time, or do other things that require applications like overGrive to be reinstalled, which involves reauthenticating overGrive with Google. Reinstalling has always gone without a hitch, leaving Google Drive syncing on the PC. I’ve done this several times over the years with no issue, all on the same original perpetual license.

When I needed to reinstall back in January because of one of those system changes, overGrive couldn’t authenticate. Google made some changes so the overGrive authentication (along with other apps using the same mechanism) no longer worked. Fortunately I was at a point where I didn’t regularly switch PCs and so wasn’t relying on Google Drive sync so much.

For a while the folks at The Fan Club had a page up explaining they didn’t know when the issue would be resolved. Google had changed the procedure and cost of licensing, and they weren’t forecasting when, or if, the issues would be resolved.

A recent trip to The Fan Club revealed the problem description page was gone, replaced by instructions for setting up the Google authentication on your own. I tried them and got authentication set up. Like many guides written for new services, the illustrations, label names, and functional paths of the actual website were not the same, or in the same order. But overGrive was working again.

It still leaves Google Drive as a manual sync for files I want on all devices. So there’s still a risk I cause a sync conflict between Dropbox, which is “primary,” and Google Drive, which is meant as a one-way copy from Dropbox.

Solutions that come to mind are a paid Dropbox account so more devices can connect, switching over to Google Drive for all devices, or hosting my own cloud. There are plenty of options for hosting my own cloud: FileRun, Nextcloud, ownCloud, Seafile, TrueNAS SCALE, and others. And there is some appeal in knowing no one is monitoring my cloud use.

From Ubuntu/Zorin to Debian/openSUSE

Driven away from Ubuntu… by snaps

Ubuntu has been on my primary computer (initially a desktop, then a laptop) for years. Yes, so many years that at one time my primary computer was a desktop. On my backup laptop I’ve used a few different distributions, but primarily Zorin.

The one thing in common among the distributions I’ve tried was being Ubuntu based. That meant lots of features driven by what Canonical was doing with Ubuntu. Then Canonical introduced snaps. For my use, snaps have been frustrating. I believe it was Ubuntu 20.04 when snap packages became the default for some apps, and more and more packages have defaulted to snaps since.

The things that frustrated me, and continued to frustrate me until switching to Debian in 2025: the snap daemon would often indicate updates were needed but then refuse to update. Snaps also broke any modification to program launch shortcuts, or made the modifications difficult or impossible (or at least beyond my willingness to invest the time) to implement, where updates never broke customizations while the app packaging was still .deb. And, oh geeze, the loopback devices! Output goes from a third or a half screen when mount is issued to more than a screen full, which just makes it unnecessarily difficult to track down what you’re looking for. All of this and more caused me to start seriously looking for distributions that don’t include snap, or at least don’t include or enable it by default.
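To illustrate the mount clutter: every installed snap shows up as its own squashfs loop mount. A quick sketch of ways to count the noise or skip past it (the filesystem types listed are examples, not a complete set):

# count the squashfs loop mounts snaps add to the output
mount | grep -c '/snap/'

# list only the "real" filesystems, leaving out the snap loops
findmnt -t ext4,vfat,btrfs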

What I’ve ended up doing is migrating my primary laptop to Debian and my backup to openSUSE.

There have been a few bumps in the migration, mostly because of my unfamiliarity with both Debian and openSUSE. But hey, any time the OS is changed there are some bumps, even when upgrading to a newer version of the same OS.

At this point I won’t be going back to Ubuntu for a while. I’m getting comfortable with Debian on my primary and with openSUSE on the backup. The initial draft of this post was created on my backup laptop, on openSUSE, from a coffee shop, connecting to my home server. The home server is still Ubuntu but, with the exception of Let’s Encrypt, there are no snaps in use on it.

Blocking, blocking, blocking

No visitors, so why not?

This site has been up for a few years now. Very few (hardly any) visitors. That’s fine. This is really just a place for me to make notes about tech that’s on my mind. Without a job there are fewer situations I find myself having to resolve, so there’s less to write about.

wp.boba.org is on the Internet though, so of course it gets hit by bots. And since commenting without creating a login is permitted, the bots attempt to post spam. Comments need to be approved before they’re displayed, so I see, and reject, all of the spam. The source IP is usually in Russia, but spam comments also come from Kazakhstan, Belarus, Iran, Amsterdam, Saudi Arabia, Kuwait, Dubai, China, and VPNs that originate in Stockholm and London, among other places.

For a while I didn’t bother about it and simply marked those comments as spam so they never showed up on the site. Lately though I’ve changed my approach a bit. Since I’m not trying to make a popular site, and the likelihood of getting real comments from any of the networks the spam comes from is infinitesimal, I decided to start blocking the networks that spam comments come from.

The interesting thing is that once I began blocking networks, spam comments became a bit more frequent. Each time from a new network, of course, because the firewall was updated for each new spam source.

The spam being more frequent is a subjective measure, but when the first block rule went in it was a while before another spam comment showed up. After that new network was blocked, the interval to the next spam comment was shorter than the interval from the first to the second. It seems as if, once a site where spam can be posted is detected, that IP or URL is shared among spammers so they can all take a crack at it.

I’ve also found how to add Internet block lists to the firewall. There are hundreds of thousands of IPs that are blocked, and the lists are updated daily. Even so, and much to my surprise, after adding the block lists the only blocks I see in the log are from the spammer networks. With hundreds of thousands of IPs in the block lists I would have thought some would show up in the log; none have so far. That’s a good thing, but still a surprise.
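This post doesn’t detail the tooling, but as a sketch of one common way to load a published block list, using ipset with iptables (the list URL is a placeholder, not one of the lists actually in use here):

# create a set sized for large lists, then fill it from a downloaded list
ipset create blocklist hash:net -exist
curl -s https://example.com/blocklist.netset | grep -v '^#' |
while read -r net; do
    ipset add blocklist "$net" -exist
done

# drop inbound traffic from anything in the set
iptables -I INPUT -m set --match-set blocklist src -j DROP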

Today’s blocked networks follow below (a sketch of blocking them follows the list). It will probably be a day or two before there are others to add. Don’t expect updates. Hmmm…….

37.99.32.0/20
37.99.48.0/20
37.99.80.0/21
37.221.0.0/24
45.88.76.0/22
46.8.10.0/23
46.151.28.0/24
46.161.11.0/24
62.113.118.0/24
77.238.237.0/24
80.239.140.192/27
84.17.48.0/23
84.38.188.0/24
87.249.136.0/22
91.84.100.96/27
91.201.113.0/24
93.183.92.0/24
178.172.152.0/24
178.217.99.0/24
179.43.128.0/18
183.0.0.0/10
185.173.37.0/24
188.126.89.64/27
192.42.116.192/27
194.32.122.0/24
195.2.70.0/24
195.181.174.0/23
212.34.128.0/24
212.34.141.0/24
212.34.148.0/24
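For reference, a sketch of blocking each listed network individually, assuming the CIDRs above are saved one per line in a file and that the firewall is ufw (the actual firewall used here isn’t named in this post):

# insert a deny rule at the top of the ruleset for each network
while read -r net; do
    ufw insert 1 deny from "$net" to any
done < blocked-networks.txt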

Windows 11 Pro setup

First time for me doing the out-of-box experience with Windows 11 Pro preloaded on new hardware.

The ThinkPad X13 2-in-1 Gen 5 is a very nice laptop. While completing initial power-on and setup, since it was Windows Pro, I opted for a local admin account and then associated it with a Microsoft account.

Once on the desktop I installed KeePassXC, Calibre, AeroAdmin, Firefox, iTunes, and Kdenlive. I used Kdenlive to create a few videos describing settings or usage for some of the apps. YouTube links are sprinkled through this post.

The person getting the laptop has decided to step up their online credential game. Rather than using variations of a base password, they want to learn to use and manage more complex ones. Bravo, I say. And check out the videos I made about using KeePassXC to do that: Login with KeePassXC and KeePassXC, Updating a Password.

Calibre is an e-book library manager. I included it because it has become very useful to me and I encourage everyone to try it. All those household documents, appliance manuals, car owner manuals, serial numbers, and VINs can be cataloged and available at your fingertips. It’s also a great way to organize training guides, magazine and web articles, etc. Maybe my calibre library info system review will pique your interest.

Sometimes the hardest part of giving remote support is getting the recipient to recognize the steps that need to be taken to complete the connection. This AeroAdmin guide is my attempt to clarify that.

Then Firefox was added because, after some updates, Edge would no longer log in to some Microsoft websites. That prevented access to some account info, among other things. So, install Firefox, and with it successfully log in to every Microsoft site that Edge would not. Firefox is there because a backup to Edge is necessary.

And what resolved Edge’s problems with Microsoft’s own sites? Disabling all Edge security features for any Microsoft.com, Office.com, or Live.com domain. And after all that, “Device encryption” couldn’t be enabled because it didn’t recognize that the Microsoft account was logged in. It clearly was, as demonstrated by access to OneDrive, Microsoft 365, and other integrated features after logging on to the desktop, with no more credential prompts for any of those services.

It seems Microsoft tries to soften the blow when enabling device encryption fails with messages like “Oops something went wrong” and “it was probably us.” That sets a light mood and is a relief at first. But after having the problem for more than a week, it is disturbing that nothing has changed.

That didn’t get resolved before the laptop was delivered to its owner.

Confirmed, Movies Updates Work

House of cards, but with the stack set up it is easier.

Like many things that appear on a computer screen, there is a long chain of events that must happen successfully for what is on screen to be what is supposed to be there. The various “Movies” tables on this website are one example.

I got some DVDs for Christmas, including a very nice Seven Samurai DVD with extras. My movies database had it marked as a movie I want, so it appeared in the Movies I Want tables that are on two pages of this website.

Change the setting in the database so it’s a movie I have, and the title should now be found only in the two Movies I Have tables on the website and no longer in the two Movies I Want tables. One change at the source triggers four changes on the website.

In the case of the Movies tables, for changes to the movies database to appear in the tables, the updated data must be exported to two files, one listing “Movies I Have” and the other “Movies I Want”. Those exported files update the source lists the Movies tables refer to. And finally, a sync tool from the wpTables publisher must run against the source lists to update the Movies tables on the website.

Making changes to the movies database is infrequent, a few times a year at most. Remembering the process each time is a challenge, but now the data extract and link refresh steps are automated, which makes most of the process happen without the need to remember anything (or I can look at the code if I wish to remember).

The link update code as a cron job…

# m h  dom mon dow   command
*/15 * * * * wget -q -O - "https://wp.boba.org/wp-admin/admin-ajax.php?action=wdtable_update_cache&wdtable_cache_verify=<hex-number>"

Export updates for source lists …

<?php
// Export each movies view to a CSV file; the website's Movies tables
// use these files as their source lists.
$server = "<host>";
$username = "<user-name>";
$password = "<pwd>";
$database_name = "<movies>";

$link_myMovies = mysqli_connect($server, $username, $password, $database_name)
    or die("Unable to connect: " . mysqli_connect_error());

$Views = array("movies_i_want", "movies_i_have");
$out_path = "/var/tmp/";

foreach ($Views as $view)
{
    // Pull every row from the view.
    $str_query = "SELECT * FROM $view";
    $query_result = mysqli_query($link_myMovies, $str_query)
        or die("Query failed for $view: " . mysqli_error($link_myMovies));
    $columns_total = mysqli_num_fields($query_result);

    // Collect the column names for the CSV header row.
    $col_names = array();
    for ($i = 0; $i < $columns_total; $i++)
    {
        $Heading = mysqli_fetch_field_direct($query_result, $i);
        array_push($col_names, $Heading->name);
    }

    // Write the header, then one CSV line per row.
    $fileOut = fopen("$out_path$view.csv", 'w')
        or die("Unable to open $out_path$view.csv");
    fputcsv($fileOut, $col_names);

    while ($row = mysqli_fetch_array($query_result, MYSQLI_NUM))
    {
        fputcsv($fileOut, array_values($row));
    }

    fclose($fileOut) or die("Unable to close $out_path$view.csv");
}

mysqli_close($link_myMovies);
?>

Up again, but not public yet

Well, except, you’re reading this so it is public.

I lost interest in maintaining this server and website when I lost my job and couldn’t get another. The server’s OS is Ubuntu, the web server is Apache, and the CMS is WordPress. It had been running for a number of years without issue. I have never thought of it as “production” because I don’t rely on it for anything; it’s just a test bed to familiarize myself with the software stack and gain some understanding of its setup and administration. I’m self-hosting on an old computer repurposed as a server.

One other thing I experimented with is DNS. I wanted to be able to get to the server as wp.boba.org whether from the public Internet or my home network. That worked fine for years with BIND9 and isc-dhcp.

I developed the habit of running upgrades periodically without testing. If there was a problem, no big deal: not production, so figure out the issue, repair, and proceed. Problems happened a few times with that approach and were always easily rectified.

DNS on the server stopped working after an upgrade. I tried many things and couldn’t figure out why. Rather than roll back the upgrade or restore the system from a backup, I kept mucking with it to try to get it to work. No success. Eventually I just lost interest and let the server go dark. I wasn’t working, so there was no one to talk tech with about my server project, and there seemed no point in fixing it.

After a while I did want to dip my toe in the water again. I decided to rebuild the server and bring all components up to the latest release. I still couldn’t get BIND9 DNS to work, and searching BIND9 issues I found other Ubuntu users were also having problems with it. After searching for alternate DNS servers I decided to try dnsmasq. That got me a working DNS on my home network, and that got me to the point of having the server up and publicly available again.
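A minimal sketch of the kind of dnsmasq setup that covers the split view, answering wp.boba.org with the server’s LAN address for local clients; the addresses and DHCP range are hypothetical, not my actual network values:

# /etc/dnsmasq.conf
domain-needed
bogus-priv
# local clients get the LAN address for the website's name
address=/wp.boba.org/192.168.1.10
# dnsmasq can also take over DHCP from isc-dhcp
dhcp-range=192.168.1.100,192.168.1.200,12h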

All development of the server configuration and settings was done on a virtual machine (VM) in a virtual network with virtual clients; VirtualBox is the hypervisor. Once everything worked as expected I migrated the server VM to a physical host. That took surprisingly little tweaking: network addresses had to be changed from the virtual network settings to the home network settings, and a different Ethernet device name entered where needed. That was about it to migrate from a virtual to a physical server.

For all the world to see, in all its underwhelming glory, wp.boba.org is back. Enjoy.

Mount an external LVM drive

It’s easy. Just took a while to recall.

The original server was hardware, installed from a thumb drive ISO, with LVM set up during the install.

The new server was developed as a VirtualBox VM and uses ext4. It now runs on a different drive in the original server; the LVM drive was set aside.

I wanted to get at some info on the LVM drive, but trying to mount the external LVM drive ran into many dead ends at first.

And of course it was simply a question of installing the correct file system drivers. In this case # apt update && apt install lvm2, and the volume can be mounted read/write.
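For the record, a sketch of the sequence once lvm2 is installed; the volume group and logical volume names are placeholders, since mine aren’t noted here:

# scan for volume groups on the newly attached drive
vgscan
# activate the volume group so its logical volumes appear under /dev
vgchange -ay
# list the logical volumes to find the one to mount
lvs
# mount it read/write (names are hypothetical)
mount /dev/<volume-group>/<logical-volume> /mnt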

I will keep the old drive around for a while in its external housing. I’m sure there will be times I want to find stuff to pluck off it. But I need to put a label on it with a hard date for it to be DBANed.

Rehoming all my Domains, Oh My!

Domains and registrars, and services, and what?

Google is selling their domain registration business to Squarespace. If your web server is at a dynamic Internet address, the address needs to be monitored so the name server can be updated when it changes. Squarespace name servers won’t accept dynamic updates.

Monitoring the network for the router’s public Internet address changing, and updating Google Domains’ name server, was done with Google-provided DDNS instructions and settings. Squarespace, the provider Google is selling its domain name business to, does not support DDNS. Once Squarespace is actually managing a domain, it will keep the old Internet address information in its name server but provides no way to automate updates. So once the domain is on Squarespace and my Internet provider changes my modem’s address, access to this website by name goes down unless I’ve set up another way to keep the website address updated.

Two ways I found to avoid this: move to a registrar that supports DDNS, like Namecheap, or find a DNS provider that supports DDNS and doesn’t require registering a domain with them, like FreeDNS (at afraid.org, yes, but don’t be), and use their name servers as custom name servers with the domain registrar. The second approach requires two service providers for each domain, a registrar and a DNS service.

Registrars charge a fee for migrating a domain to them. Not much, but if you can just change a setting and avoid paying to move to a different registrar, why not do that?

“THAT”, in this case, means leaving the domains with Google, updating the name servers on Google’s domain registration record to the FreeDNS name servers, and then keeping the Internet address updated on the FreeDNS name servers.
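Keeping the address updated can be as simple as a cron job hitting FreeDNS’s dynamic update URL. A sketch in the same style as my other cron entries, with the account token as a placeholder (the endpoint is FreeDNS’s documented update URL, not something configured here yet):

# m h  dom mon dow   command
*/10 * * * * wget -q -O /dev/null "https://freedns.afraid.org/dynamic/update.php?<update-token>"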

I’ve moved one domain to Namecheap to see how I like it, an $11 move. It will give me hands-on time with a third domain control panel, after Google Domains and Squarespace.

For the others, I’ve created records on FreeDNS, updated the name server records on Google Domains, and will start using the Squarespace control panel to manage them when they transfer from Google. Squarespace doesn’t support DDNS, but as long as custom name servers are supported the move from Google Domains should go without a hitch.

I haven’t moved boba.org yet. I want to interact with the other sites a bit before deciding whether to use FreeDNS and their name servers with Squarespace domain registration, or to move to a registrar that supports DDNS with its own name servers.

I do have to spend time out of the house to interact with the sites through the new DNS / name server setups. Sure, I could do it through the phone if I turn off the WiFi, but LTE isn’t very good here and I don’t like a phone screen for web browsing. If LTE were good I could tether the computer to the phone and browse the sites on the PC as I’d like. Kind of lucky, the weak signal; more fun to go out. Maybe find a coffee shop in a mall, buy a cup, sit in one of the seats and figure out how to choose the better option, then compare the details and make the choice.

Goodbye Google Domains!! ?? !!

…hello Namecheap DDNS or, hmm, domain hosting too?

This domain, boba.org, is on a server I control, behind a dynamic IP address. Google Domains provides the domain hosting and supports DDNS, which made it easy to have Google name servers be authoritative, keep the A record updated, and manage the physical server.

Now Google’s giving up the domain name business, along with all the convenient features they bundled, like DDNS, redirects, privacy, etc.

It’s being transferred to Squarespace. And Squarespace doesn’t include DDNS or offer it as a bundle.

Still need a way to update the domain record with the new address when it changes, BUT that can’t be done with Squarespace nameservers.

Checking whether the domain record can have a name server but no A record with an IP. IF SO, the domain record points to a name server that can be updated, e.g. Namecheap’s free DNS, and the domain continues to function when the IP changes even though the new domain host doesn’t offer dynamic IP updating.
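If that pans out, the update side could be handled by something like ddclient pointed at Namecheap’s DDNS endpoint. A sketch, not something running here yet; the token is a placeholder, and Namecheap’s DDNS password is a separate token, not the account password:

# /etc/ddclient.conf
protocol=namecheap
use=web, web=dynamicdns.park-your-domain.com/getip
server=dynamicdns.park-your-domain.com
login=boba.org
password='<ddns-token>'
@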

Will see what happens and update…