Joe Maller.com

Web Syntax Coloring

February 2011 Update: This post was originally published in 2007 and hasn’t aged well. For code snippet syntax coloring, I currently use Google’s Prettify script (prettify.js). The script is dynamically inserted by jQuery if there are code elements to style.

Recently I’ve been experimenting with two very different methods of syntax coloring source code in web pages. The first method uses Dan Webb’s CodeHighlighter JavaScript library to convert appropriately tagged content into syntax-colored code. It’s necessarily simple, but easily extensible. As an example, here are the CSS rules I’m using to style CodeHighlighter’s conversions:


code.html span.comment      { color: #999;}
code.html span.tag          { color: #07f;}
code.html span.string       { color: #080;}
code.html span.attribute    { color: #07f;}
code.html span.doctype      { color: #07f;}

code.css span.comment       { color: #999;}
code.css span.keywords      { color: #fd2;}
code.css span.selectors     { color: #0b0;}
code.css span.properties    { color: #66f;}
code.css span.units         { color: #33c;}
code.css span.urls          { color: #4a0;}

code.javascript span.comment    { color: #999;}
code.javascript span.brackets   { color: #07f;}
code.javascript span.string     { color: #4a0;}
code.javascript span.keywords   { color: #07f;}
code.javascript span.exp        { color: #808;}
code.javascript span.global     { color: #06e;}

The second method uses two more fantastic TextMate features, Create HTML From Selection and Create CSS from Current Theme. These two commands translate exactly what I’m seeing in TextMate into very precise and valid XHTML with accompanying CSS rules. The main disadvantage is the weight of the code: the 721 bytes of CSS above convert to nearly 36K of HTML and CSS rules. It’s a seriously heavy pile of span tags, but the cost is immediately outweighed by 148 very specific reasons. And that’s just bundles; there are dozens of great themes too.

Aaron Quint also deservingly gushes over these two commands.

Here’s the same CSS as above translated by TextMate:

code.html span.comment      { color: #999;}
code.html span.tag          { color: #07f;}
code.html span.string       { color: #080;}
code.html span.attribute    { color: #07f;}
code.html span.doctype      { color: #07f;}

code.css span.comment       { color: #999;}
code.css span.keywords      { color: #fd2;} 
code.css span.selectors     { color: #0b0;} 
code.css span.properties    { color: #66f;} 
code.css span.units         { color: #33c;} 
code.css span.urls          { color: #4a0;} 

code.javascript span.comment    { color: #999;}
code.javascript span.brackets   { color: #07f;}
code.javascript span.string     { color: #4a0;}
code.javascript span.keywords   { color: #07f;}
code.javascript span.exp        { color: #808;}
code.javascript span.global     { color: #06e;}

Just for the sake of comparison, below is a screenshot of how my code looks in TextMate. It’s not a perfect translation, but it’s a very good start:

Syntax Coloring CSS in TextMate

One of the purported advantages of the JavaScript method is that the source code remains unchanged. That’s sort of true, but not really. The JavaScript functions work by inserting a bunch of spans, so by the time the user sees it, the main difference between JavaScript-converted code and pre-processed code from TextMate is the detail (and weight) of the TextMate result. Also, any HTML would need to have its entities escaped, which is another step and a further degradation of the original code.

The main advantage then becomes convenience. A simple block of code doesn’t need to be run through TextMate (on the off-chance I’m writing somewhere else), it can be entered directly and tagged for styling without breaking focus.


NAB 2006 Wrapup

I did my annual whirlwind trip to Las Vegas for 40 hours of NAB.

Great to see everyone and meet some new faces. This year, though, it seemed I missed more people than I saw. However, I did get to spend some good time talking with Christoph Vonrhein of CHV. Christoph is scary smart and is pushing FXScript way beyond what has been done before.

Buzzless

Not much to get really excited about this year. The whole show seemed to lack a certain energy it’s had in the past. If anything it seems like HD is fully here and now people are figuring out how to work with it.

Lots of people around the show seemed to be complaining about various aspects of Panasonic’s P2 cards. Everyone loves how the video looks but is down on the workflow. The complaints I overheard, including the constant swapping of cards, the still-astronomical cost, the lack of third-party cards, the 8GB maximum available size, and the lack of workable hacks (FireStore excepted), all contributed to a general malaise about the format.

Far more hostile were the descriptions of working with HDV. It’s great that the format has a relatively low buy-in cost, but people were spitting blood about working with the files. HDV users seemed angry, frustrated and annoyed. A lot of them are also jealous of the Panasonic image quality; HDV looks mushy by comparison. I heard several people who use HDV cameras steering others away from the format.

What that basically tells me is that the prosumer HD space is still very much up for grabs. Maybe that’s why the only perceptible buzz at the show was from RED Digital Cinema.

Except for RED

A lot of people seem to think RED is a joke. I’d be feeling the same way except that I know two people working at the company, Ted Schilowitz and Graeme Nattress. They’re both brilliant and dead serious about this camera. I have total confidence in them; this camera is no joke. Despite looking more like a weapon from Unreal Tournament than a camera, and despite its $17,500 price tag, the RED camera will completely transform the digital filmmaking marketplace.

In the near term, HDV may yet become the interim standard; it’s just so much cheaper than anything else. Dealing with HDV files is really a software problem, so maybe Apple will shake a better workflow out of QuickTime and FCP, making HDV more fun to use.

Overall it was a great trip. Sometimes, working alone in my little dark office, I forget there are people using the stuff I make. I know how absurd that sounds, but it’s real. Meeting so many Joe’s Filters users and hearing stories about how my filters helped people was revitalizing. Thanks to everyone, there’s more stuff coming soon.


Shell scripts in AppleScript are illegible

I got my FXScript Compiler working on the new machine and pulling sources from Subversion without too much trouble. But I decided that my practice of embedding shell scripts in AppleScript kind of sucks. It’s just desperately ugly. Tools like sed are ugly enough on their own, slashing and escaping every other character just makes them completely impossible to dissect after a few months.

For example, I print the following in the header of each file’s source code before compiling:

[tab] // [tab] Version: r145
[tab] // [tab] build200604181617
[tab] // [tab] April 18, 2006

One echo statement looks like this ($d and $b are already set and $b is formatted):

echo -e "\t//\tVersion: $d\n$b\n\t//\t`date '+%B %d, %Y'`\n\n"

Not exactly pretty, except in comparison to this:

echo -e \"\\t//\\tVersion: $d\\n$b\\n\\t//\
\\t\`date '+%B %d, %Y'`\\n\\n\"

Echo is using the -e flag so the embedded backslash escapes (\t, \n) are interpreted as tabs and newlines.

The real killer is anything involving regular expressions. Say a pattern needs to match a string containing double-quotes inside a double-quoted sed expression. Then this already slash-infested command:

sed -e "s/\([Ff]ilter[\t ]*\"[^\"]*\)\"/.../"

becomes this:

do shell script "sed -e \"s/\\([Ff]ilter[\\t ]*\
\\\"[^\\\"]*\\)\\\"/.../\""

No part of me wants anything to do with keeping track of that many backslashes. It’s slightly better when using sed with the -E extended regex flag, but still.
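For comparison, here’s roughly what the -E version looks like, run against a hypothetical input line (the `_END` replacement text is just a placeholder):

```shell
# Extended regexes (-E) group with bare parentheses, so only the
# double-quotes inside the shell-quoted pattern still need escaping.
echo 'filter "My Filter"' | sed -E "s/([Ff]ilter[\t ]*\"[^\"]*)\"/\1_END/"
# prints: filter "My Filter_END
```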

In the ongoing pursuit of long-term legibility, I’m putting my shell script functions into individual files inside the Xcode project. More on how that works later.
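As a rough sketch of the idea (the file name and function here are hypothetical), each embedded script becomes a small standalone shell file that AppleScript can run with a one-line do shell script call:

```shell
#!/bin/bash
# version_header.sh (hypothetical name): print the compile-time header
# for revision $1 and a pre-formatted build line $2, with no
# AppleScript-level escaping anywhere in sight.
version_header() {
    printf '\t//\tVersion: %s\n%s\n\t//\t%s\n\n' "$1" "$2" "$(date '+%B %d, %Y')"
}
version_header "r145" "$(printf '\t//\tbuild200604181617')"
```

The AppleScript side then shrinks to something like do shell script "bash " & quoted form of scriptPath, and the shell code stays readable. (printf is used here instead of echo -e because its escape handling is more consistent across shells.)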

[I know the main page template doesn’t work right when long strings break the column width, it’s on my long-term to do list]


Syndication overkill

Just tried and abandoned FeedWordPress. It’s an impressive plugin, but it seemed like too much work just to cross-post news from the Joe’s Filters site here too. Mostly, though, I didn’t like the way my language would have had to float in between sites. I may add a JavaScript feed display at some point, but for now I’ll just post a note here when something updates over there.


link: Apr 11, 2006 2:45 pm
posted in: Joe's Filters

Joe’s Filters Documentation

I’ve finally posted the revised Joe’s Filters Documentation. Much of the content is the same, but the backend system has been completely reconstructed. It’s now running on WordPress, includes feedback and RSS, and will soon offer a printable version as well (via a print stylesheet). This is finally the write-once, publish-everywhere solution I’ve been thinking about since I first posted the docs in 2003.

There are a few things left to do, mostly just integrating the news RSS feed with this site and moving the feeds to Feedburner. Now I can get back to the filters and document them as I work. (And start benchmarking in FCP 5.1 on my MBP, more on that later.)

Take a look and let me know what you think, here or there.


link: Apr 07, 2006 9:54 am
posted in: Joe's Filters

FXScript Reference comments disabled due to spam

The FXScript Reference site is 100% scratch-built. I created it partly to teach myself more practical PHP and MySQL skills, and shunned any existing code snippets or libraries. Yet the site and my home-built commenting system are getting pounded with spam.

How is this happening? It’s the usual crap: laser this, cheap pills that, fake comments with links to casinos and porn, plus a bunch of links that were scuttled by my post-cleaning routines. The spam is coming from a different IP address each time, usually in India, Russia or Colombia. All of the spam comments were posted using Firefox.

The k30fps entry has been taking the brunt of the spam (72,000+ hits, versus a normal average of 2,500–3,000 hits for other items).

I set up a quick log to collect all $_SERVER and $_POST data for any comments posted, to see what was happening. I was hoping something would stick out, like curl or some unknown referrer page. No such luck; everything looked normal. The most troubling thing was the user agent:
[HTTP_USER_AGENT] => Mozilla/5.0 (Windows; U; Windows NT 5.1; rv:1.7.3) Gecko/20040913 Firefox/0.10

I suspect there is a hacked Greasemonkey script out there which exploits HTML form auto-entry to insert spam comments. There’s not much I can do about blocking that.

Traffic has been low recently, except for the spammers, so I’ve unfortunately decided to turn off comments for the time being. Maybe the lack of exploitable forms will get me off the spam list. When I have more time I will try to hook into Akismet. Since turning that on for my WordPress site I have not had to manually delete a single spam comment.

If anyone has something they’d like to contribute, send me an email and I’ll either post it for you or open a window on the site for posting.

**Update** Comments are on again.


My own personal Laffer Curve

Every year for the past decade I’ve felt that I managed to accomplish more than I did previously. Throughout most of those years I couldn’t see how I could possibly do more than I was doing. But every year I did more. Until now.

The Laffer Curve has been in the news after the recent surge in US tax revenues. It’s a simple economic theory showing that government revenue falls when tax rates are either too high or too low, pointing to a sweet spot in between where revenue is maximized.

The Laffer Curve of my life looks something like this:

Joe's Laffer Curve

Currently I’m on the right side of the hump and have Too Much Going On. Time’s not being wasted, but my productivity, as measured by accomplishments, has fallen off a cliff.

Getting back to the place of maximum accomplishment will take some time, because I’ve got a lot of stuff I want to finish. So, the slogging will continue for now.


link: Jul 18, 2005 1:57 am
posted in: misc. Projects

