fogbound.net




Mon, 28 Mar 2005

Photographing Teapots

— SjG @ 9:36 pm

After trying a lot of different approaches, I am now getting close to the results I want when photographing teapots. I figured it would be worthwhile to share my system in the hopes that it’s helpful to someone else, and with the idea I might get some suggestions from others.

First, the goals:

  • A clean image that shows detail of the piece.
  • An image that’s reasonably accurate in color reproduction.
  • An image that gives a little drama to the piece, rather than a cold, clinical look.

The Lighting Equipment:
I’ve tried several approaches. The one that I find best (so far) is not the cheapest approach. It involves about $300 in lighting equipment, not including the cost of any photographic gear.

Teapot Photo Stage

I started with a kit from Table Top Studios. It included a 30″ light tent and a two-light set. I bought a graduated backdrop from another photographic supply house, which I needed to trim to size — Table Top Studios now sells a custom-sized backdrop which seems to be ideal. One other word of praise for Table Top — I was facing a deadline when one of the bulbs burned out. They were extremely helpful and overnighted me a free replacement, so I was able to make the deadline. That made me a loyal customer from here on out.

I set up the tent on a card table in the overstuffed Meier Quagg Library. I installed the nylon sweep, and used clothespins to fasten the graduated backdrop to it. Because the light tent has a lip, I raised the front portion of the sweep using the two-volume Oxford English Dictionary. The 1982 version is perfectly sized; you might want to use something else of the same general shape and size.

Before placing a teapot upon this stage, I metered off of an 18% gray card, oriented vertically. Then, I tried it with a teapot. I did a lot of experimentation, bracketing, spot-metering, etc., and, to my simultaneous delight and dismay, found that for both digital and film, Nikon’s Matrix Metering was spot on for the best exposure. To get the best images, I stopped down to f/11, which necessitated a fairly long exposure time (on the order of a quarter second), which, of course, makes a tripod all the more necessary.
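
(For the arithmetic: if the meter calls for, say, 1/60 second at f/2.8, stopping down to f/11 costs four stops of light, so the shutter time has to stretch four stops to 1/4 second.)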

To see how these results compare with my previous efforts, compare the before and after images (obviously, different teapots!).


Fri, 18 Mar 2005

The Destruction of Da Derga’s Hostel

— SjG @ 5:09 pm

Anonymous, circa 1100, translated by Whitley Stokes; read as an e-book from BlackMask.com.

This is a curious old Irish tale, which seems to fall somewhere between standard historical epics and fairy tales. You can still hear quite a bit of the oral tradition in its structure, but it also has some surprises. The beginning is very much fairy tale, about how Conaire becomes king, and how he learns of his personal taboos. This portion is mystical and fantastical. It is followed immediately by the tale of how the good king brings peace and prosperity and then, in one grand binge, violates all his taboos. The tale then takes a short detour, setting up the Reavers (the agents of destruction), and giving us their history and descriptions, with each being more terrifying and strange than the previous. After this short detour, we take a very long detour, where these agents of destruction have resolved to destroy Da Derga’s hostel (where the king is spending the night). They review and catalog each individual within the hostel, sparing no details, and their seer predicts how many of the reavers will be slain by each. This is by far the longest section of the tale, and seems to have been a great opportunity for retellers to toss in their own creative additions. The actual destruction is something of an anticlimax.


Thu, 3 Mar 2005

Feedback Form Module, Newbie Software Engineering

— SjG @ 12:12 am

So I’ve just finished version 0.4 of a module for handling user feedback for CMS Made Simple. It allows users with administrator rights to create reasonably complex forms, with all the user interface objects that we’ve come to expect, and handle the submission of those forms in a variety of ways.

It’s been an interesting experience. Version 0.1 was your standard naive PHP implementation of an application within a framework. It was all one big script in one big file. It would do fairly simple database queries, stuff all the results into a big array, and then process array elements with big switch statements when it needed to customize output based upon UI object type.
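
A rough sketch of what that looked like, with invented field names and the database part stubbed out as a hard-coded array, would be something along these lines:

```php
<?php
// v0.1-style sketch: pretend $fields came back from one big query,
// then build the markup with a switch on the UI object type.
$fields = array(
    array('name' => 'email',    'type' => 'text'),
    array('name' => 'comments', 'type' => 'textarea'),
    array('name' => 'optin',    'type' => 'checkbox'),
);

$html = '';
foreach ($fields as $field) {
    switch ($field['type']) {
        case 'text':
            $html .= '<input type="text" name="' . $field['name'] . '" />' . "\n";
            break;
        case 'textarea':
            $html .= '<textarea name="' . $field['name'] . '"></textarea>' . "\n";
            break;
        case 'checkbox':
            $html .= '<input type="checkbox" name="' . $field['name'] . '" value="1" />' . "\n";
            break;
        // ...one case per UI object type, with similar switches elsewhere
        // for validation and for handling the submitted values
    }
}
echo $html;
?>
```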

Version 0.2 ported this basic functionality into an Object Oriented model. I found that there were a couple of complex decisions to make — should I handle the database storage and retrieval in an OO manner? And if I query a bunch of Input Objects from the database, how do I know what specific kind of object to instantiate since that data is contained in the database record? I guess the truly OO approach would be to use an OO database, or, next best approach, to have a separate table in the database for each kind of input object. But that’s not how I did it, probably to the detriment of my code (I instantiate a superclass object, then use the type details from the general object to create a new object which is the correct specific class.)
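
Concretely, the workaround I'm describing looks roughly like this (all class, property, and function names here are hypothetical stand-ins, not the module's actual API):

```php
<?php
// Load a generic Field first, inspect its type, then build an object of
// the correct specific class and populate it from the same row.
class Field {
    var $id;
    var $type;
    var $name;

    function LoadFromRow($row) {
        $this->id   = $row['id'];
        $this->type = $row['type'];
        $this->name = $row['name'];
    }
}

class TextField extends Field { /* text-input-specific rendering */ }
class CheckboxField extends Field { /* checkbox-specific rendering */ }

function CreateFieldFromRow($row) {
    $generic = new Field();
    $generic->LoadFromRow($row);

    switch ($generic->type) {
        case 'text':     $specific = new TextField();     break;
        case 'checkbox': $specific = new CheckboxField(); break;
        default:         return $generic;   // fall back to the superclass
    }
    $specific->LoadFromRow($row);  // re-populate the correctly-typed object
    return $specific;
}

$field = CreateFieldFromRow(array('id' => 1, 'type' => 'text', 'name' => 'email'));
?>
```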

Still, this showed how a procedural approach could save a lot of database activity over an OO approach. The data model comprises forms, which contain one or more fields, which have one or more options. In the procedural approach, I denormalized the database so that fields contain the form id, and field options contain both the field id and the form id. I could then grab everything I needed for a form in three queries. With the OO model, the number of queries is proportional to the number of fields. What’s more, there’s been a massive proliferation in the number of files required. While I worry about the web server having to load and parse all that stuff each time, I should probably have more faith in the PHP engine and the OS caching. As a number of people have said to me, I’m not playing on a TRS-80 with 4k RAM anymore. But I still feel like I should be programming as if resources were seriously limited.
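
For illustration, the three-query version looks something like this (table and column names are invented, and $db stands in for whatever database handle the framework provides):

```php
<?php
// One query per table, each keyed only on the form id, no matter how
// many fields or options the form has.
$form    = $db->GetRow('SELECT * FROM feedback_forms WHERE form_id = ?', array($form_id));
$fields  = $db->GetArray('SELECT * FROM feedback_fields WHERE form_id = ?', array($form_id));
$options = $db->GetArray('SELECT * FROM feedback_options WHERE form_id = ?', array($form_id));

// Stitch the options onto their fields in PHP instead of issuing one
// query per field.
$options_by_field = array();
foreach ($options as $option) {
    $options_by_field[$option['field_id']][] = $option;
}
foreach ($fields as $i => $field) {
    $fields[$i]['options'] = isset($options_by_field[$field['id']])
        ? $options_by_field[$field['id']]
        : array();
}
?>
```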

Similarly, when it came time (for version 0.4) to add localization of the code, it required some unpleasant contortions: each Input object needs to have access to a global collection of text strings (stored in a big hash) so it can present localized versions of messages. And I still need to go in and make sure that I’m actually handing around references rather than the copies that PHP likes to pass.
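
The reference-passing I have in mind looks something like the following (names are made up; in PHP 4, arrays are copied on ordinary assignment and parameter passing, so the ampersands are what keep every field pointing at the same hash):

```php
<?php
// One shared array of localized strings, handed to each input object by
// reference rather than copied.
$lang = array(
    'required' => 'This field is required.',
    'thanks'   => 'Thank you for your feedback.',
);

class InputObject {
    var $lang;

    // Keep a reference to the shared strings, not a private copy.
    function SetLang(&$lang) {
        $this->lang =& $lang;
    }

    function Message($key) {
        return isset($this->lang[$key]) ? $this->lang[$key] : $key;
    }
}

$field = new InputObject();
$field->SetLang($lang);
echo $field->Message('required');
?>
```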

Maybe this is just the yammerings of someone who should understand software design better. Clearly, there were some bad decisions made in the code, although I could argue about how bad they really are.

Another aspect that took me by surprise was how I could test code and have it perform perfectly, while other users reported errors and bugs immediately. In this case, the main culprit was not inattentiveness (in testing, anyway), but my PHP configuration. If you allow output buffering, PHP gracefully handles output before the headers have been sent. Not so if output buffering is disabled. So when my code would generate errors, my test configuration would blithely allow the error messages to be output but then clobber them with the expected page output. So while I thought output buffering was only a performance matter, it seems that for development it should be disabled. That way, those bugs cannot be so easily overlooked.
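
A minimal way to see the difference (the redirect target here is just an example):

```php
<?php
// With output_buffering = On in php.ini, the echo below is held in the
// buffer, so the header() call still succeeds and the stray output can
// later be discarded or overwritten without complaint. With
// output_buffering = Off, the same two lines trigger a "headers already
// sent" warning, and the bug announces itself immediately.
echo "stray debug output or an error message\n";
header('Location: /feedback/thanks');  // example redirect target
?>
```

Note that output_buffering can't be changed with ini_set() once a script is running; it has to be turned off in php.ini or a per-directory configuration like .htaccess.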
