fogbound.net




Sat, 19 Sep 2020

Goodbye Google

— SjG @ 11:08 am

I’ve finally removed the Google Ads and analytics from this site.

Many years ago, I thought it mattered where people went on the site and which posts were most useful. I also had the delusion that there would be enough visits that the ads might help pay for the hosting.

Ah, to be young, naïve, and full of hope. That ship has certainly sailed.

Anyway, there’s no point in cluttering up the blog with surveillance crap. I’m just sorry I left it there for so long.


Sat, 12 Sep 2020

Those pesky /usr/local/include headers under MacOS Catalina

— SjG @ 11:04 am

Here’s another of those “system upgrade moves stuff around” problems. My work iMac seems to be suffering a slow disk failure. It gets slower and slower as it tries to run.

Cue restoring Time Machine backups onto a new iMac. A lot of stuff just works. But things like building xsendfile for the Apache development server under MacPorts threw lots of errors. The compiler couldn’t find the headers:

mod_xsendfile.c fatal error: stdio.h: no such file or directory

There are lots of suggestions out there to run xcode-select --install, but I’d already done that.

Turns out that Xcode has stopped storing the SDKs in /Library/Developer/CommandLineTools/SDKs and moved them to /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs. I spent a bit of time trying to figure out how to pass that path to the MacPorts version of apxs2 to use the new include path, but eventually gave up and just did the hacky thing:
ln -s /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs /Library/Developer/CommandLineTools/SDKs
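
For reference, these two commands report where the active toolchain and SDK actually live, which is an easy way to double-check the paths above before resorting to the symlink:

xcode-select -p          # print the active developer directory
xcrun --show-sdk-path    # print the macOS SDK path the compilers will use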

Another issue, just so I remember next time, was that the web server and the command line were mysteriously running different versions of PHP. The key, of course, was that I had forgotten to run port select, e.g., for this project, I needed port select php php70.
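
For my future self, the longhand MacPorts commands look roughly like this (a sketch; the available variant names depend on which PHP ports are actually installed):

sudo port select --list php        # show the installed PHP variants and which one is active
sudo port select --set php php70   # point the php command at the 7.0 build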


Tue, 25 Aug 2020

JetBrains/PHPStorm losing SVN

— SjG @ 10:47 am

OK, maybe the cool kids aren’t using Subversion (SVN) any more, but we are for a set of established projects. Rather than rebuild our test infrastructure and our live deployment system, we’re sticking with stuff that works. I’m old enough to adopt the “if it ain’t broke, don’t fix it” mentality.

Ah, but then when I upgraded PHPStorm to a recent version, it done got broke. The JetBrains products have long had an uneasy relationship with SVN. Originally, a client was built-in. Then, they switched to shelling out to an external executable. The authentication was sometimes dicey. That being said, when it works, it’s really, really nice. Having all of the revision control tools like histories, version comparisons, branching, the shelf, etc., available within the source editor is really convenient and time-saving.

Anyway, I upgraded, and suddenly, none of the SVN stuff worked. The right-click menu didn’t even show the Subversion submenu, and the options from the VCS menu didn’t work as expected.

Here’s how I eventually solved the problem. It turns out that (big surprise) I haven’t been doing things quite like everyone else does. I have a Project in PHPStorm, which is essentially three separate SVN working copies under a single directory, e.g.,

/Users/samuelg/projects/this_particular_project
/Users/samuelg/projects/this_particular_project/main_site
/Users/samuelg/projects/this_particular_project/dependent_site
/Users/samuelg/projects/this_particular_project/mobile_app

In the above example, main_site is a working copy checked out from https://svn.myrepo.com/project/whatever/trunk, dependent_site is checked out from https://svn.anotherrepo.com/path/whatever/trunk, and mobile_app is checked out from https://svn.yetanother.com/project/whatever/trunk. This actually works really well: you can edit stuff in main_site and dependent_site, and when it comes time to check in, PHPStorm will commit to both repositories. Now, of course, best practice would be to have the dependent sites in the same repository, but for various historical reasons and external dependencies, this is not possible.
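
A quick way to sanity-check this layout is to ask each working copy which repository it tracks. This is just a sketch using the example paths above; svn info --show-item needs Subversion 1.9 or newer:

cd /Users/samuelg/projects/this_particular_project
for d in main_site dependent_site mobile_app; do
  (cd "$d" && echo "$d -> $(svn info --show-item url)")
done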

So, it used to be that PHPStorm would understand if you registered the <Project> to use Subversion as its revision control system:

[Screenshot: Setting VCS at the <Project> level]

However, after a period of pulling out more hair and trying different things, I learned that now I have to register each separate working space individually:

[Screenshot: Setting the VCS for each working space]

So there it is.


Tue, 12 May 2020

Remembering things for my future forgetful self: developer’s edition

— SjG @ 3:57 pm

I need to record a bunch of training tutorials for a web site. I want to use consistent, fake data, so I set up a local copy of the site. I dumped out the database, and created my test data. As the site evolves, I can continue to check out revisions and run migrations, but still have my customized test data.

So far, so good. But a problem arises. When recording, the URL for my demo site shows up, which will confuse the customer. There’s an easy fix for this, of course: change my demo site to respond to the true site’s hostname, and map that hostname to 127.0.0.1 in my /etc/hosts file.
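
The mapping itself is a single line appended to /etc/hosts (here, as in the rest of this post, domain.com stands in for the real hostname):

echo "127.0.0.1    domain.com" | sudo tee -a /etc/hosts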

It works great! Except that I’m forgetful. I’ll be adding a new feature on my dev machine, get it approved on the staging server, and eventually check it out of revision control to launch it on the live server. Then I’ll check the live site and get confused. Where are my changes? I may go so far as to start looking at the filesystem of the live server and issuing informational commands in the revision control system, which will only confuse me more, since they show the changes are live. Eventually, I’ll notice that some of the data is demo-like data, and then I’ll kick myself for having wasted half an hour on something so stupid. I’ll comment out the line in my /etc/hosts, and everything will be back to normal… until it comes time to record new tutorials.

It’s time to remind future me. The first step is in the terminal. I added the following to my .zshrc file:

# count /etc/hosts lines that map domain.com to localhost, and how many of those
# are commented out; warn only when an active (uncommented) mapping exists
export inhost=`egrep '127\.0\.0\.1\s+domain.com' /etc/hosts | wc -l`
export commented=`egrep '#127\.0\.0\.1\s+domain.com' /etc/hosts | wc -l`
if [[ "$inhost" -eq 1 && "$commented" -eq 0 ]]
then
    echo " ____  _______        ___    ____  _____ _"
    echo "| __ )| ____\ \      / / \  |  _ \| ____| |"
    echo "|  _ \|  _|  \ \ /\ / / _ \ | |_) |  _| | | "
    echo "| |_) | |___  \ V  V / ___ \|  _ <| |___|_|"
    echo "|____/|_____|  \_/\_/_/   \_\_| \_\_____(_)"
    echo ""
    echo "-----------------------------------------"
    echo "domain.com is remapped in /etc/hosts!!! "
    echo "-----------------------------------------"
fi
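
To pick up the change in a terminal that’s already open (rather than waiting for the next new window), re-source the file:

source ~/.zshrc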

This works well. When I’ve edited /etc/hosts, my terminal comes up like this:

[Screenshot: Terminal with warning]

But, future me may not go directly to the terminal to diagnose the issue. Future me may foolishly try to compare the current state of the site to the dev or staging version. Never fear! I can save future me from this idiocy.

First, I set my standard test browser to open up new windows and tabs with a “home page” instead of a blank page. That “home page” is http://localhost. I set up the default virtual host, and created an index page with the following bare-bones code:

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>Home</title>
</head>
<body>
<?php
// gethostbyname() goes through the system resolver, so it sees the
// 127.0.0.1 override when the /etc/hosts remap is active
if (gethostbyname('domain.com') == '127.0.0.1')
{ ?>
  <h1 style="color:red">WARNING!</h1>
  <pre style="color:red">
-----------------------------------------
domain.com is remapped in /etc/hosts!!!
-----------------------------------------
  </pre>
  <?php } else { ?> we good.<?php } ?>
  <h2>Search:</h2>
  <form class="search-form-home__form" action="https://startpage.com/sp/search" id="search" method="post">
    <div>
       <input id="q" maxlength="2048" name="query" type="text" value=""/>
       <input type="hidden" name="cat" value="web"/>
       <input type="submit"/>
    </div>
  </form>
</body>
</html>

Now when future me opens up a new browser tab or window, he’ll be greeted with a nearly blank page with a search box. Or possibly a warning!

[Screenshot: Warning in browser]

So, by spending a bunch of time now, I can save future me a bunch of time (and frustration). It’s not clear that it will be worth it, but it was definitely worth trying.


Tue, 7 Apr 2020

One-liner to get a directory’s worth of video times

— SjG @ 10:58 am

The ffmpeg family of programs is arcane but incredibly powerful for handling video and video metadata. I needed to get the run times of a collection of videos. Here’s a handy one-liner that creates output suitable for import into a spreadsheet:

for i in *.mp4; do q=$(ffprobe -i "$i" -show_entries format=duration -v quiet -of csv="p=0"); echo "$i, $q"; done

Sample run:
$ cd ~/work/training_videos
$ for i in *.mp4; do q=$(ffprobe -i "$i" -show_entries format=duration -v quiet -of csv="p=0"); echo "$i, $q"; done
First_steps.mp4, 70.868000
Create_a_project.mp4, 134.320000
Loading_libraries.mp4, 45.442000
...
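
If raw seconds are awkward in the spreadsheet, ffprobe’s -sexagesimal flag prints durations as HH:MM:SS.microseconds instead; it’s the same loop with one extra flag:

for i in *.mp4; do q=$(ffprobe -i "$i" -show_entries format=duration -v quiet -of csv="p=0" -sexagesimal); echo "$i, $q"; done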