Joe Maller.com

RFC 822 Dates with AppleScript

Here’s a little AppleScript subroutine which converts date objects into correctly formatted RFC date strings: RFCdate() (click to open in Script Editor)

This is a timesaver for anything related to RSS, which requires that dates be in the RFC 822 format, e.g. Wed, 24 May 2006 01:30:22 -0400
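For anyone curious how the conversion works, here's a minimal sketch of such a handler. This is my illustration of the approach, not necessarily the exact source of the linked script:

on RFCdate(theDate)
    set dayNames to {"Sun", "Mon", "Tue", "Wed", "Thu", "Fri", "Sat"}
    set monthNames to {"Jan", "Feb", "Mar", "Apr", "May", "Jun", ¬
        "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"}
    set dayName to item (weekday of theDate as integer) of dayNames
    set monthName to item (month of theDate as integer) of monthNames
    set dayNum to text -2 thru -1 of ("0" & day of theDate)
    set t to time of theDate -- seconds since midnight
    set hh to text -2 thru -1 of ("0" & (t div hours))
    set mm to text -2 thru -1 of ("0" & (t mod hours div minutes))
    set ss to text -2 thru -1 of ("0" & (t mod minutes))
    set tz to time to GMT -- local offset from GMT in seconds
    if tz < 0 then
        set tzSign to "-"
        set tz to -tz
    else
        set tzSign to "+"
    end if
    set tzStr to tzSign & (text -2 thru -1 of ("0" & (tz div hours))) & ¬
        (text -2 thru -1 of ("0" & (tz mod hours div minutes)))
    return dayName & ", " & dayNum & " " & monthName & " " & ¬
        (year of theDate) & " " & hh & ":" & mm & ":" & ss & " " & tzStr
end RFCdate

Calling RFCdate(current date) returns a string like the example above.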


Avon Walk for Breast Cancer [research]

My cousin Sarah, a breast cancer survivor, will be participating in her first Avon Walk for Breast Cancer on July 8-9 in San Francisco. She has already met her fundraising goals, but it’d be great if she raised a little more.


Hey Adobe

Quark’s got a Universal Binary Beta, what have you got??

Working in CS2 on a MacBook Pro is slow and painful. We will probably need new hardware before CS3 ships, and that hardware will be Intel Macs. Hurry up or someone’s gonna take your market.


Original Star Wars and HiDef DVDs

John Gruber linked to the news that Lucas is giving in to demand (i.e. money) and will finally release the Star Wars trilogy on DVD in its original, unaltered theatrical form. Han shoots first. John writes:

Bastards. I broke down and finally bought the current DVD trilogy collection just a few months ago. Now I’ve got to pay for it yet again just to get the versions of the films that I really want.

Hold your wallet. The release will be in SD, despite the news that HD DVDs will start appearing in stores this summer. Lucas, who is as brilliant a money-maker as he is a horrible dialog-writer, will rake in tons of money selling an obsolete DVD. All the Star Wars films will be available again in HiDef soon enough. And too many people will buy them all for the third, fourth or fifth time.

Lucas isn’t alone. All the studios are flooding the DVD market right now to sell as much as possible before the switch to Blu-Ray or HD-DVD.

Supposedly the porn industry likes Blu-Ray best, but you’d have to be delusional to think physical pornography sales are anywhere near as strong as they were before the Internet. Still, porn is generally credited with choosing VHS over Betamax; at least this time they seem to be going with the better format.

Personally, I think HD-DVD may win this time. Not because it’s better, but because “Blu-Ray” is a dumb name. Ask any non-technical consumer what they want for their new HD TV: HD-DVDs or Blu-Ray discs. Sure, a certain percentage of people will have Blu-Ray explained to them (“see, they’re both HD”), but how many will buy on name alone?

Never mind that the name itself looks (and sounds) like blurry. Not what you want to hear when dropping a lot of money to replace a bunch of movies you bought in SD a few years ago.

And so the studios might be slitting their own throats. Movie attendance is down over the past several years, partly because movies are kind of boring now and partly because suspense and action movies keep dancing around what’s really scary. Yet the studios now want consumers to re-buy movies they just bought. The great DVD migration wasn’t that long ago; most DVDs were probably purchased within the past five years. Now Hollywood wants everyone to buy those movies yet again, and the studios are going to spend a fortune on conversion and manufacturing in a bet that consumers will buy the new discs. And if we don’t? Lots of bankruptcies.


NAB 2006 Wrapup

I did my annual whirlwind trip to Las Vegas for 40 hours of NAB.

It was great to see everyone and meet some new faces, though this year it seemed I missed more people than I saw. I did get to spend some good time talking with Christoph Vonrhein of CHV. Christoph is scary smart and is pushing FXScript way beyond what has been done before.

Buzzless

Not much to get really excited about this year. The whole show seemed to lack a certain energy it’s had in the past. If anything, it seems HD is fully here and people are now figuring out how to work with it.

Lots of people around the show seemed to be complaining about various aspects of Panasonic’s P2 cards. Everyone loves how the video looks, but they were down on the workflow. The complaints I overheard included the constant swapping of cards, their still-astronomical cost, the lack of third-party cards, the 8 GB maximum available size, and the lack of workable hacks (FireStore excepted); together they contributed to a general malaise about the format.

Far more hostile were the descriptions of working with HDV. It’s great that the format has a relatively low buy-in cost, but people were spitting blood about working with the files. HDV users seemed angry, frustrated and annoyed. A lot of them are also jealous of the Panasonic image quality; HDV looks mushy by comparison. I heard several people who use HDV cameras steering others away from them.

What that basically tells me is that the prosumer HD space is still very much up for grabs. Maybe that’s why the only perceptible buzz at the show was from RED Digital Cinema.

Except for RED

A lot of people seem to think RED is a joke. I’d be feeling the same way except that I know two people working at the company, Ted Schilowitz and Graeme Nattress. They’re both brilliant and dead serious about this camera. I have total confidence in them; this camera is no joke. Despite looking more like a weapon from Unreal Tournament than a camera, and with a $17,500 price tag, the RED camera will completely transform the digital filmmaking marketplace.

In the near term, HDV may yet become the interim standard. It’s just so much cheaper than anything else. Dealing with HDV files is really a software problem, so maybe Apple will shake a better workflow out of QuickTime and FCP, making HDV more fun to use.

Overall it was a great trip. Sometimes, working alone in my little dark office, I forget there are people using the stuff I make. I know how absurd that sounds, but it’s real. Meeting so many Joe’s Filters users and hearing stories about how my filters helped people was revitalizing. Thanks to everyone, there’s more stuff coming soon.


UTF-8 and high ASCII don’t mix

Part of my FXScript compiler works by joining two code chunks with a shell script. Each chunk lives in its own file and contains one “high-ASCII” character: a © symbol in one, and a ’ (typographically correct apostrophe) in the other. Those are processed with sed and joined with a few additional strings via echo and cat.

For several hours I was stumped because one of the two characters would be garbled after passing through the script.

Finally I noticed that one source file was encoded as ASCII and the other was UTF-8. When both were set to UTF-8, everything worked.

The iconv command converts files between encodings. I used the following script to convert a directory of ISO-8859-1 (Latin-1) text files to UTF-8:

# copy each file aside, then convert the copy back over the original
for f in *; do
    cp "$f" "$f.TMP"
    iconv -f LATIN1 -t UTF-8 "$f.TMP" > "$f"
done
rm *.TMP

Here’s a one-line version:

for f in *; do cp "$f" "$f.TMP"; iconv -f LATIN1 \
-t UTF-8 "$f.TMP" > "$f"; done; rm *.TMP

Just don’t run that more than once, or it will re-convert already-converted characters, which isn’t pretty. Iconv doesn’t buffer data, so attempting to convert a file in place results in a zero-length file; that’s why the script copies everything aside first. Converting back over the original files also keeps Subversion from freaking out and thinking the files were all new.

As much as it seems like something that should be detectable on the surface, 8-bit text encoding can’t be sniffed out.

It’s completely impossible to detect which of the 8-bit encodings is used without any further knowledge (for instance, of the language in use). …

If you need a formal proof of “undetectability”, here’s one: a valid ISO-8859-1 string is always a completely valid ISO-8859-2 (or -4, -5) string (they occupy exactly the same spots, 0xa1-0xff), i.e. you can never determine if some character not present in another set is actually used.

That’s the reason I couldn’t find a counterpart to iconv that would detect and return the encoding of a text file. An alternate solution would be to detect UTF-8 and not reconvert a file that’s already Unicode, but I think I’m done with this for now.
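If I ever do revisit it, the check could piggyback on iconv itself: iconv exits non-zero when its input isn’t valid in the source encoding, so a UTF-8-to-UTF-8 pass works as a rough validity test. A sketch:

# Sketch: only convert files that fail a UTF-8 round-trip. Pure ASCII
# passes (and needs no conversion); stray Latin-1 high bytes fail.
for f in *; do
    if ! iconv -f UTF-8 -t UTF-8 "$f" > /dev/null 2>&1; then
        cp "$f" "$f.TMP"
        iconv -f LATIN1 -t UTF-8 "$f.TMP" > "$f"
        rm "$f.TMP"
    fi
done

That guard would also make the script safe to run twice.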

For a beginning understanding of Unicode and text encoding, start with Joel Spolsky’s canonical article, The Absolute Minimum Every Software Developer Absolutely, Positively Must Know About Unicode and Character Sets (No Excuses!).


Shell scripts in AppleScript are illegible

I got my FXScript Compiler working on the new machine and pulling sources from Subversion without too much trouble. But I decided that my practice of embedding shell scripts in AppleScript kind of sucks. It’s just desperately ugly. Tools like sed are ugly enough on their own; slashing and escaping every other character makes them completely impossible to dissect after a few months.

For example, I print the following in the header of each file’s source code before compiling:

[tab] // [tab] Version: r145
[tab] // [tab] build200604181617
[tab] // [tab] April 18, 2006

One echo statement looks like this ($d and $b are already set and $b is formatted):

echo -e "\t//\tVersion: $d\n$b\n\t//\t`date '+%B %d, %Y'`\n\n"

Not exactly pretty, except in comparison to this:

echo -e \"\\t//\\tVersion: $d\\n$b\\n\\t//\
\\t`date '+%B %d, %Y'`\\n\\n\"

Echo gets the -e argument so the \t and \n escapes are actually interpreted.

The real killer is anything involving regular expressions. Say a matching pattern needs to match a string containing double-quotes inside a double-quoted sed pattern. Then this already slash-infested command:

sed -e "s/\([Ff]ilter[\t ]*\"[^\"]*\)\"/.../"

becomes this:

do shell script "sed -e \"s/\\([Ff]ilter[\\t ]*\
\\\"[^\\\"]*\\)\\\"/.../\""

No part of me wants anything to do with keeping track of that many backslashes. It’s slightly better when using sed with the -E extended regex flag, but still.
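One trick that would cut out the worst layer (a sketch, not what my compiler currently does): build the sed program in an AppleScript variable and let quoted form of handle the shell quoting, so only AppleScript’s own escapes remain:

-- "quoted form of" wraps a string in safe single quotes for the shell,
-- so the sed pattern only needs AppleScript-level escaping. The file
-- path is a made-up example.
set sedProgram to "s/\\([Ff]ilter[\t ]*\"[^\"]*\\)\"/.../"
do shell script "sed -e " & quoted form of sedProgram & ¬
    " " & quoted form of "/tmp/source.fxscript"

That still leaves sed’s own backslashes, but at least the shell layer disappears.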

In the ongoing pursuit of long-term legibility, I’m putting my shell script functions into individual files inside the Xcode project. More on how that works later.
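The shape of it is roughly this (a sketch with made-up paths, not the actual project layout):

-- The sed/echo work lives in its own .sh file inside the project, so
-- the AppleScript side only ever quotes file paths (both hypothetical).
set scriptPath to "/Users/joe/Projects/FXCompiler/scripts/header.sh"
set sourceFile to "/tmp/filter.fxscript"
do shell script "/bin/sh " & quoted form of scriptPath & ¬
    " " & quoted form of sourceFile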

[I know the main page template doesn’t work right when long strings break the column width, it’s on my long-term to do list]



