fogbound.net




Tue, 17 May 2022

Linux Command Line Magic

— SjG @ 12:24 pm

In day-to-day operations, circumstances often arise where you need simple answers to fairly complicated questions. In the best scenario, the information is available to you in some structured way, like in a database, and you can come up with a query (e.g., “what percentage of our customers in January spent more than $7.50 on two consecutive Wednesdays” is something you could probably query). In other scenarios, the information is not as readily available, or not in a structured format.

One nice thing about Linux and Unix-like operating systems is that the filesystem can be interrogated by chaining various tools together to make it cough up the information you need.

For example, I needed to copy the assets from a digital asset management (DAM) system to a staging server to test a major code change. The wrinkle is that the DAM is located on a server with limited monthly bandwidth. So my challenge: what was the right number of files to copy down without exceeding the bandwidth cap?

So, to start out with, I use some simple commands to determine what I’m dealing with:

$ ls -1 asset_storage | wc -l
10384

$ du -hs asset_storage
409G	asset_storage

So that first command lists all the files in the “asset_storage” directory, with the -1 flag saying to list one file per line; that output is then piped into the word-count command with the -l flag, which says to count lines. The second command tells me the storage requirement, with the -h flag asking for human-readable units.

I’ve got a problem. Over 10,000 files totalling over 400G of storage, and my data cap is, say, 5G. The first instinct is to say, “well, the average file size is 40M, so I may only be able to copy about 125 files.” However, we know that’s wrong: there are some big video files and many small image thumbnails in there. So what if I only copy the smaller files?
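As a sanity check on that instinct, the arithmetic fits in a couple of one-liners (round decimal units; bc truncates integer division):

$ echo '409000 / 10384' | bc
39

$ echo '5000 / 40' | bc
125

So call it a 40M average, and about 125 average-sized files under a 5G cap. Now, about those smaller files: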

$ find asset_storage -size -10M -print0 | xargs -0 du -hc | tail -n1
630M	total

Look at that beautiful sequence. Just look at it! The find command looks in the asset_storage directory for files smaller than 10M. The list it creates gets passed into the disk usage command via the super-useful xargs command, which takes the list output by one command and uses it as input parameters to another. To be safe with weird characters (i.e., things that could cause trouble by being interpreted by the shell, like single quotes or parens or dollar signs), we use the -print0 flag on find (which terminates each result with a null byte instead of a newline) and the -0 flag on xargs, which tells it to expect those null terminators. This passes the list of small files to du with the -h (human-readable) and -c (produce a grand total) flags. du gives output for each file and for the sum total, but we only want the sum, so we pipe it into the tail command to grab just that last line.
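If you’ve never been bitten by the whitespace problem, here’s a minimal illustration (with a made-up file in an otherwise empty directory). Without null terminators, xargs splits the name on spaces and du is asked about three fragments that don’t exist; with them, the full name arrives intact:

$ touch 'cover art (final).jpg'

$ find . -name '*.jpg' | xargs du -h
(du complains about ./cover, art, and (final).jpg)

$ find . -name '*.jpg' -print0 | xargs -0 du -h
(du reports on the one real file)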

So if we only include files under 10M, we can transfer them without getting close to our data cap. But what percentage of the files will be included?

$ find asset_storage -size -10M -print | wc -l
7708

Again, the find command looks in the asset_storage directory for files smaller than 10M and each line is passed into the word count as before. So if we include only files smaller than 10M, we get 7,708 of the 10,384 files, or just under 75% of them! Hooray!

But when I started to create the tar file to transfer the files, something was wrong! The tar file was 2G and growing! Control C! Control C! What’s going on here?

What was wrong? Well, this is where it gets into the weeds a bit. It took me longer than I’d like to admit to track down. The operating system limits how long a command line can be, and xargs enforces its own limit on top of that. If the argument list it receives would exceed those limits, xargs splits the input and invokes the destination command multiple times, each time with a chunk of the list. So in my example above, the file list from find was overflowing the xargs buffer, and the du command was called multiple times:

$ find asset_storage -size -10M -print0 | xargs -0 du -hc | grep -i total
6.1G	total
630M	total

My tail command was seeing that second total, and missing the first one! To make the computation work the way I’d wanted, I had to allocate more command-line length to xargs (the maximum you can set is system-dependent, and can be found with xargs --show-limits):

$ find asset_storage -size -10M -print0 | xargs -0 -s2000000 du -hc | grep -i total
6.6G	total
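Incidentally, you can watch this batching behavior in miniature with the -n flag, which forces xargs to split after a fixed number of arguments (in my case it was the byte limit doing the splitting, but the effect is the same):

$ seq 1 9 | xargs -n3 echo
1 2 3
4 5 6
7 8 9

Each output line is a separate invocation of echo, just as each “total” above came from a separate invocation of du.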

Playing with the file size threshold, I was finally able to determine that my ideal target was files under 5M, which still gave me 68% of the files and kept the final transfer down to about 3G.
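As an aside, if your find is GNU find, you can sidestep the xargs limits entirely by printing each file’s size in bytes and summing with awk (note this counts apparent size rather than du’s notion of disk usage, so the totals will differ slightly):

$ find asset_storage -size -5M -printf '%s\n' | awk '{ sum += $1 } END { printf "%.1fG\n", sum / 1024^3 }'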

In summary, do it this way:

$ find asset_storage -size -5M -print0 | xargs -0 -s2000000 du -hc | tail -n1
2.9G	total

$ find asset_storage -size -5M -print | wc -l
7094

$ find asset_storage -size -5M -print0 | xargs -0 -s2000000 tar czf dam_image_backup.tgz
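One last note: if your tar is GNU tar, you can skip xargs here too, and with it any risk of tar being invoked more than once and silently overwriting its own archive. The --null -T - combination tells tar to read a null-delimited file list from standard input:

$ find asset_storage -size -5M -print0 | tar czf dam_image_backup.tgz --null -T -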


Sun, 3 Apr 2022

Hummingbird Season

— SjG @ 2:10 pm

It’s Springtime in the Quagg Garden, and several hummingbirds have declared it their own exclusive territory. When I take my lunch, I am greeted by spectacular aerial displays: both mating rituals and territorial defense. Less dramatic, yet equally expressive (sonically), the mockingbirds seem to be nesting nearby. Plenty of finches and bushtits and crows come by too.


Sun, 13 Mar 2022

The Programming Curse

— SjG @ 10:28 am

Programming is fun. You can be off doing some chore and get this idea … “hey, wouldn’t it be cool if I could just have the computer help me with this …”

So, you come up with an idea, and you think through the first few steps, and throw together a script. Then you play with it, and you get excited. It works, sort of, but you can see ways to make it better. You make changes and discover a better approach to the problem — so you implement that, and before you know it, you’ve spent an evening or an afternoon. It’s exciting to watch your ideas turn into something.

But computers get ever more complex, and interfacing with them grows ever more powerful and complicated in turn. Keeping pace with that complexity are more and more powerful development tools. This is a double-edged sword: you can easily do amazing things that would once have been very difficult, but getting set up can be more challenging, and when things go wrong, it’s harder to figure out why.

For some ideas, getting to the point of coding is still as easy as entering php -a or python and starting to type. For other ideas, though, there is the dreaded setup problem. I call this phenomenon “The Programming Curse.”

For example, I had an idea for a phone app that I wanted to prototype. In the old days, I’d have had to break out Xcode and learn Swift and all of the iOS libraries. Today, however, I can use more familiar (to me) web technologies, and build an app using the Ionic Framework. Now I have a toolchain that includes at least nodejs, the Ionic framework, Ruby Gems, and Xcode. I know very little about any of these things’ internals, and I really don’t want to know a lot about them. I just want to explore my code idea!

Sadly, I have to learn something about the internals. My first attempt to install the toolchain failed deep inside a nodejs package setup. After extensive googling, I found that it’s because one of the components is not the latest version (but there’s a reason for that [1]).

Maybe I’ve just gotten old, or maybe I’m just lazy. I’m certainly not the first to gripe about this phenomenon [2]. It just dampens the fun when, during that excited “wouldn’t it be cool” phase, I have to spend hours getting a functional development environment together instead of actually getting to write code.

[1] The problem is that I support a phone app that was written in an earlier version of the Ionic framework, and it depends on a Cordova plug-in that’s no longer supported. The plug-in still works, but I can’t update my development environment for my new project because the dependencies would clobber my ability to produce new builds of my old project. Could that be resolved by selectively holding back some packages to previous versions? Maybe. Three or four hours’ worth of effort in that direction didn’t get me anywhere, other than dependency hell.
For my web-only projects, I use products like Docker to keep a fully isolated development environment per project. Since Ionic depends on nodejs, which installs globally (and since I need Xcode to perform the final build), I haven’t found a way to do that here. I guess I could if I made some macOS virtual machines, but it seems like a lot of overhead.

[2] Fifteen years ago, David Brin wrote an article, “Why Johnny Can’t Code,” extolling the virtues of BASIC. I find myself grudgingly agreeing — not about his specific language objections (I don’t know why he felt Perl or Python are any further from the metal than BASIC), but about how and why it should be easier to write small programs.


Fri, 11 Mar 2022

Of House Mountains and AR

— SjG @ 12:01 pm

Many, many years ago, a Swiss exchange student introduced us to the concept of a “house mountain.” It’s sort of the landscape view equivalent of home base: the mountain that you see from wherever “home” is.

Separately, I just came across a discussion of augmented reality applications, which reminded me of the outstanding PeakFinder web site and mobile app. I first encountered PeakFinder in 2013 when I was loading up my first iPhone. It was one of the two applications that showed me the enormous promise of augmented reality (the other being the original Star Walk). I was able to install PeakFinder on my phone and identify peaks when hiking in the Sierra Nevada, on a trip to the Atacama desert in Chile, from a ferry crossing Horseshoe Bay in British Columbia, and in many other places.

In general, I find I use PeakFinder without the AR mode: I just point my phone around the horizon and see the peaks labeled, recognizing them by their basic shapes. But if you want to know what that peak is in a picture you took last year, PeakFinder has a neat feature where you can import your photo and then overlay the data. It requires that GPS coordinates were embedded in the picture, or that you can find the spot where you took the picture on a map. Tilting the camera off level and/or lens distortion can make the overlay approximate, but it’s almost always good enough!

[Photo: Shot from the train in Alberta, Canada, as we approached Jasper]
[Photo: Volcanos and Laguna Miscanti, Chile]

So, marrying the concepts of AR and house mountains: PeakFinder lets you generate the view from any arbitrary point and even keep it as a “favorite.” Even if you aren’t in a place, or don’t have a picture from that location, you can see your house mountain, like this view of my childhood house mountain.

[Photo: AR House Mountain]


Sat, 19 Feb 2022

Welcome to the Neighborhood

— SjG @ 2:30 pm

We have a new resident, though he’ll probably move on along soon enough.

[Photo: Newly eclosed monarch boy]