fogbound.net

Wed, 9 May 2007

Extracting Scripts from HTML pages using Javascript

— SjG @ 1:56 pm

Here’s a weird one. There was a need to extract the contents of all Javascript <script> … </script> tags from an HTML page, using Javascript in an Ajax-y environment*. I tried a regular expression similar to the one published by Matt Mecham, but found that IE threw an error: it didn’t like the [^] construct.

So, since I knew that the pages this would need to process would be standard strings with nothing odd in them, I substituted [^\0]. That works in both Firefox and IE. I don’t know if it breaks under different encodings, though.

The other problem was conceptual — I didn’t remember that each call to regex.exec() gives you only one match (but gives you your submatches), and that you have to loop to collect the rest; I confused it with the behavior of string.match(), which (with the g flag) returns all of the matches at once but doesn’t give you your submatches. *sigh*
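A contrived little sketch of the difference (not from the real code):

var s = "<script>one</script><script>two</script>";
var re = /<script>(\w+)<\/script>/ig;

// string.match() with the g flag: every full match, but no submatches.
var all = s.match(re);   // ["<script>one</script>", "<script>two</script>"]

// regex.exec(): one match per call, submatches included; loop for the rest.
var m;
while ((m = re.exec(s)) != null)
{
    alert(m[1]);         // "one", then "two"
}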

So the code looks like this:

var reg = new RegExp("<script[^>]*>([^\\0]*?)<\\/script>", "ig");
var m2, i;
// each exec() call returns the next match (with its submatches) until it returns null
while ((m2 = reg.exec(http.responseText)) != null)
{
    for (i = 1; i < m2.length; i++)
    {
        // m2[i] is the captured contents of one script tag
        alert(i + ' (' + m2[i].length + ') ' + m2[i]);
        // do other stuff
    }
}

(Please note that WordPress seems insistent on munging that code. Spacing, in particular, might be corrupted.)

(* Note the use of the passive voice. To protect the innocent, we won’t say who needed it or why.)


Mon, 2 Apr 2007

Buffalo Terastation Problems

— SjG @ 4:20 pm

I’ve written here numerous dull tirades on the subject of backups. Well, here’s more.

We had my shiny new backup script working on the LAN, backing up all the servers to a Linux box with a 300GB hard drive. For extra security, we copied the backups out to a Buffalo Terastation, which also serves as our office fileserver. The Terastation is formatted as two 250GB shares (using RAID-1): one is the office fileshare, the other is for server backups.

Well, there was a slight *cough* stupid *cough* problem with one of my backup scripts over a weekend, which resulted in a recursive backup of a directory (doh!). This filled up the disk on the Linux box, but that didn’t stop the script from happily trying to copy it all to the Terastation (using lftp).

When I came in on Monday, the Terastation was not happy. It simultaneously reported that the drives were ~30% full and that it couldn’t find any disks at all. FTP connections were dropped immediately. We were able to copy a few files off of it from machines that still had the drive mounted via SMB, but then it would disconnect and vanish from that machine’s view of the network. This was not good. At some point, we thought it might be a good idea to try enabling another protocol to access the data, which had the unfortunate side effect of switching the Terastation admin interface into Japanese.

Tech support took about 20 minutes to answer the call, but they were courteous and helpful. Eventually, they concluded that the controller board was bad. To get a replacement, they charged our credit card the price of a new unit, and shipped it out, with the understanding that we’d swap the drives into the new unit, send back the old one, and get credited back the money. While this is not ideal, I can understand why they do it that way.

In any case, the new unit arrived today. I went through the effort of swapping the drives from one unit to the other (which is a lot more complicated than it should be, requiring a lot of screws). And voilà! Still a Japanese admin interface, and still no access to the data.

My working theory now is that the Terastation stores configuration data on the drives, and when the one share filled up, it corrupted the config data somehow. I’ll call tech support tomorrow and see what I can learn. *sigh*


Sun, 25 Mar 2007

Backups, cont.

— SjG @ 9:50 pm

OK. I’m a bonehead. The link I provided to my backup script tarball was broken. The link is fixed.

But wait! A new version of the scripts will be posted in a few days. It’s got some bug fixes and some new features. With it, the little birds really do sing more cheerfully, and the colors really will be brighter.

(As an aside … I don’t know why none of the people who clicked on the broken link bothered to send me an email or leave a comment to tell me there was a problem. Could that all have been robot traffic?)


Thu, 8 Mar 2007

Automated Backups – Updated!

— SjG @ 3:50 pm

[Update — fixed the link!]

Automated Backups are a good thing. Automated Backups make the little birds sing, the rainbows shine, and little fauns gambol about in beautiful green forests. When computers are backed up, the butterflies flutter, the flowers bloom, and the fruit from the trees tastes just a little sweeter. But when computers are not backed up, the universe becomes angry.

An angry universe is not a good thing. An angry universe makes little birds cry. An angry universe makes Cthulhu come and visit.

So. Automated backups. I’m partial to rdiff-backup because it allows me not only to back up data, but also to keep previous versions available. Backing up nightly doesn’t help if you accidentally overwrite the contents of a file with something and don’t notice for a day or two. But with rdiff-backup, you can restore the version from before the error.

Unfortunately, rdiff-backup really is designed for server-to-server backups, where each end of the transaction has shell access. Enter duplicity, a related project. It’s designed more for storing backups on servers that you don’t control and/or don’t trust. It allows encryption of your backup sets, and supports a wider variety of protocols (ftp, scp, s3, etc.).
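For anyone who hasn’t played with them, the basic invocations look roughly like this (the hostnames and paths are made up for illustration):

# rdiff-backup: mirror a directory over ssh, keeping reverse increments
rdiff-backup /var/www backuphost::/backups/www

# restore a file as it existed two days ago
rdiff-backup -r 2D backuphost::/backups/www/index.html /tmp/index.html

# duplicity: push encrypted backup sets over scp (or ftp, s3, ...)
duplicity /var/www scp://user@backuphost/backups/www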

So with a combination of these two tools, you can back up pretty much any POSIX-ish server to pretty much anything that you can ftp or ssh into. Still, it’d be nice if you could:

  • Check that the backups completed successfully, and get email confirming that success or warning on a failure.
  • Configure all of your various backups with a simple text file, rather than remembering the different command-line formats.
  • Create groups of options that can be applied to backup tasks.
  • Issue commands on the backup source and destinations before and/or after the backup (good for dumping a database into a flat file, for example, and then deleting the dump after it’s backed up).
  • Get email confirmation on completion of backups.
  • Have some tools to simplify the securing of the backup process.

For these reasons, I put together this backup script, which is basically a Ruby wrapper for rdiff-backup and duplicity. It’s almost entirely configured via two human-readable YAML files.

It’s flexible, reasonably simple to use, and comes without any guarantees whatsoever. Feel free to use it yourself!

DISCLAIMER: it’s as-is. Not to be used in place of a certified Cthulhu-deterrent. Use at your own risk. To quote the duplicity page: “[it] is not stable yet. It is thought to have a few bugs, but will work for normal usage, and should continue to work fine until you depend on it for your business or to protect important personal data.” — that goes for me too, only double.


Tue, 16 Jan 2007

Javascript in Photoshop

— SjG @ 9:21 pm

Ah, it was such a happy day when a Photoshop junkie and programmer geek discovered that Photoshop could be scripted with a Real Object-Oriented Language, like Javascript!

And yet, as time goes on, this happiness is mitigated. Egregious bugs (such as the failure of Selection.bounds) pop up, requiring internet searches to find the workarounds. Then there are unexpected things, like document units not being honored for selection translations.
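One way to sidestep the units weirdness (a rough sketch, not necessarily what the published workarounds do) is the old standby of forcing the ruler units to pixels for the duration of the script:

// remember the user's ruler units, work in pixels, then put things back
var savedUnits = app.preferences.rulerUnits;
app.preferences.rulerUnits = Units.PIXELS;

try
{
    // selection translations, bounds math, etc. now behave in pixel units
    app.activeDocument.selection.translate(10, 10);
}
finally
{
    app.preferences.rulerUnits = savedUnits;
}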

Even worse, it dawns on the programmer that many useful functions are missing — functions that seem like natural features for scripting Photoshop — like, say, getting the RGB value of an arbitrary pixel from an RGB document, or getting the transparency of said pixel, or, even, say, changing that value. Of course, there are workarounds for all of these (e.g., this). But why should I need fifteen extra lines of code?
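For the curious, the flavor of workaround I mean (sketched from memory, and not necessarily identical to the linked version) is the one-pixel-selection-plus-histogram trick:

// assumes ruler units are already set to pixels, as above
function getPixelRGB(doc, x, y)
{
    // select a single pixel...
    doc.selection.select([[x, y], [x + 1, y], [x + 1, y + 1], [x, y + 1]]);
    var rgb = [];
    for (var c = 0; c < 3; c++)
    {
        // ...then read each channel's histogram; the lone non-empty
        // bucket is that channel's value for the selected pixel
        var hist = doc.channels[c].histogram;
        for (var level = 0; level < 256; level++)
        {
            if (hist[level] > 0)
            {
                rgb[c] = level;
                break;
            }
        }
    }
    doc.selection.deselect();
    return rgb; // [red, green, blue]
}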

Then there are the errors in my own code. I find that aliasing is a big problem when I’m creating virtual triangles and then getting the pixels from within them. For example, my script for simplifying the creation of images like this:

[Image: tessel-test5-big.jpg]

has enough slop from the aliasing, combined with the rotations and translations, to yield problems like this:

[Image: tessel-test-detail.jpg]

So maybe it’s time for me to start implementing this kind of thing in a vector-based program. I’ve played with the demo of Intaglio, and it looks good. And it’s scriptable too, albeit only using AppleScript, which I’ll have to learn.