Large video player in the YouTube Feather beta

The YouTube Feather beta lets you watch videos without a lot of useless features (no comments, Google+ crap, ...). The only downside is that it only offers the small video player. A simple bookmarklet solves this easily.

The bookmarklet has a customizable width for the player. For this example, I chose a width of 900 pixels. We'll set a new size for the container and the video player, along with the correct 16:9 height for the video player. An extra 30 pixels of height is added for the player controls at the bottom. Finally, the left margin is reset to 10 pixels.

If you want to adjust the width, change the w variable at the beginning.

var w = 900; // desired player width in pixels
document.getElementById("lc").style.width = w + "px";
document.getElementById("p").style.width = w + "px";
document.getElementById("p").style.height = (w * 9 / 16 + 30) + "px"; // 16:9 height plus 30px for the controls
document.getElementById("ct").style.marginLeft = "10px";

Simply drag this bookmarklet, YT900, containing the above code into your bookmarks bar.

Here's the bookmarklet for a 1280x720 resolution: YT720p
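
If the draggable links don't work where you're reading this, you can recreate the bookmarklet yourself: make a new bookmark and use the code above, wrapped in a javascript: URL, as its address. A minimal sketch for the 900-pixel version:

javascript:(function(){var w=900;document.getElementById("lc").style.width=w+"px";document.getElementById("p").style.width=w+"px";document.getElementById("p").style.height=(w*9/16+30)+"px";document.getElementById("ct").style.marginLeft="10px";})();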

Converting an Illustrator drawing to an HTML image map

A while ago, I had to incorporate a map of Flanders into a site, so I had my designer draw the map in Illustrator. The problem was that different regions needed to be clickable, so I'd have to make an HTML image map. Image maps support polygon regions, so it would work fine; I only had to get the polygon data out of the Illustrator file and into the HTML markup.

The map in Illustrator

The first attempt was using a site where you upload an image and draw your polygons manually. Obviously the polygons would be coarse and not match the outlines of the provinces very well, as you have to manually click each point of the polygon on a small image. After an accidental refresh of the page lost all my polygons for practically the entire map, this approach to creating the image map was swiftly abandoned.

Luckily, Illustrator can provide us with the polygon data. Simply export the drawing as an SVG file, which is just an XML file. This XML file contains <polygon> and <polyline> tags, which we'll need to process. Let's take a look at a snippet of the polygon data.


<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<svg version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
	 width="690.615px" height="302.64px" viewBox="0 0 690.615 302.64" enable-background="new 0 0 690.615 302.64"
	 xml:space="preserve">
<g id="Layer_1">
	<polygon fill="#F3E969" points="342.887,230.002 351.096,231.173 355.201,224.138 351.683,220.035 355.201,212.412 361.65,214.17 
		368.687,207.134 366.928,204.789 370.445,203.031 371.619,199.677 369.977,200.029 369.391,196.511 365.286,200.615 
		361.182,198.857 360.009,196.511 360.596,191.82 360.009,185.371 357.077,179.507 364.113,173.057 373.494,174.816 
		378.186,171.884 386.395,168.953 385.863,160.987 385.222,151.362 383.376,152.237 374.081,156.64 368.217,156.053 
		362.941,159.571 360.596,164.848 351.8,169.54 350.627,177.162 336.555,177.748 323.655,186.543 309.583,186.543 297.856,194.166 
		293.751,192.408 294.338,183.612 279.093,192.408 273.229,197.097 267.953,196.511 262.675,199.443 261.502,197.097 
		256.812,196.511 256.812,201.789 252.707,200.029 246.844,200.615 240.394,205.306 239.808,202.96 231.013,201.203 
		219.164,201.696 220.458,207.652 214.595,215.275 217.527,217.033 212.249,220.551 214.009,224.07 219.872,224.07 219.286,229.933 
		215.181,229.933 213.422,237.555 216.354,239.901 221.868,236.684 232.068,226.484 236.172,224.725 239.104,227.656 
		245.554,230.002 247.312,227.071 254.349,233.521 255.522,238.21 261.972,237.039 275.458,237.625 281.321,234.107 287.771,239.97 
		290.703,235.866 294.221,237.625 297.152,237.625 301.257,235.866 305.361,234.107 307.707,247.592 310.052,248.179 
		321.193,235.279 325.883,235.279 332.333,237.039 338.197,239.97 343.474,236.452 	"/>
	<polyline fill="#FFFFFF" points="413.248,262.25 412.076,255.215 405.627,250.524 407.385,242.902 403.281,239.384 
		400.936,237.039 395.072,239.97 388.037,238.797 381.586,242.902 378.654,247.005 377.615,246.709 378.068,251.697 378.654,257.56 
		372.205,259.32 372.205,262.838 368.687,264.597 376.895,268.115 382.76,266.356 383.345,273.392 389.209,275.737 398.004,280.428 
		402.107,278.082 410.904,274.565 417.354,271.046 408.559,267.529 413.248,262.25 	"/>
	...

There is a slight manipulation we have to perform. The points attributes above do indeed contain the polygon data, but the formatting differs a little from what is needed for an HTML <area> element. In the SVG the x,y coordinates are separated by a comma and the coordinate pairs by a space, whereas in an HTML <area> element it's just one string of comma-separated coordinates (x,y,x,y,x,y,…). So "10,20 30,40" has to become "10,20,30,40".

A small Python script can take care of that quite easily, as it would be too much work to manually convert all the formatting of the points attributes. And even if you'd entered the commas manually and extracted the points out of the SVG file, another problem would pop up: the coordinates seemed to be offset on both the x and y axes! This is particularly strange, since at the top of the SVG file you can see that the viewBox and the x and y offsets are zero. You would assume the zero origin would be the same in the HTML <area> element, but apparently there is a difference. Instead of spending hours figuring out where the difference comes from, we can simply add an offset in the Python script when parsing the points.

We'll start the Python script by importing the xml.dom.minidom parser and defining our offsets.


#!/usr/bin/env python
from xml.dom.minidom import parse

# empirically determined offsets (see below on how to find them)
offsetX = -48
offsetY = -54

Now we'll parse the SVG file. In this example I only look at the first <g> element, which is the first layer in Illustrator. Make sure you keep the Illustrator file as simple as possible, in a single layer. This will keep the SVG file much cleaner. Next, all the child nodes of the <g> element are processed, and the points attributes of the <polygon> and <polyline> elements are stored in the inputs list.


inputs = []
dom = parse("chart.svg")
group = dom.documentElement.getElementsByTagName('g')[0]
for child in group.childNodes:
	if child.nodeName == 'polygon' or child.nodeName == 'polyline':
		if child.hasAttribute('points'):
			inputs.append(child.getAttribute('points'))

Finally, we'll loop over the inputs list containing all the points data. To get the individual coordinate pairs we split the string on every space, and to get the x and y values of a pair we split on the comma. Now that we have the x and y values, we can add the offsets. All that is left is outputting the correct HTML <area> string, joining all the coordinates with a comma.


for input in inputs:
	outputarr = []
	components = input.split(" ")
	for component in components:
		xy = component.split(",")
		if len(xy) == 2:
			x = int(float(xy[0])) + offsetX
			y = int(float(xy[1])) + offsetY
			outputarr.append(str(x) + "," + str(y))
	print '<area shape="poly" coords="' + ",".join(outputarr) + '" href="#" alt="" title="" />'

What about the offset numbers? They seem quite arbitrary, and in fact they are. Determining the exact values can be somewhat difficult, because normally you can't see the polygon areas on top of your image while testing. There is, however, a nice jQuery plugin that outlines the polygons for you: jQuery MapHighlight. With the aid of this plugin you can let Python process the SVG file, load the resulting HTML, see how far the polygons are shifted, and adjust the offsets accordingly.
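
As a quick sketch of how that looks in practice: put the generated <area> elements inside a <map>, point the image's usemap attribute at it, and call the plugin. The file and map names below are made up, and I'm assuming the plugin exposes a .maphilight() method on a jQuery selection, as the commonly used maphilight plugin does.

<img src="flanders.png" usemap="#flanders" class="map" />
<map name="flanders">
	<!-- paste the <area> lines produced by the Python script here -->
</map>
<script>
	// requires jQuery and the maphilight plugin to be loaded on the page
	$(function() {
		$("img.map").maphilight();
	});
</script>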

A part of the processed map, with the maphighlight plugin

Safari Validator 0.3.1 released

The new Safari Validator release is out, bringing HTML5 validation!

The service used to validate HTML5 pages is validator.nu. This means that, unlike W3C validation, the contents of HTML5 pages are transferred to the remote validator.nu server and processed there. Be careful that you don't submit pages containing sensitive information for validation! To be safe, it's best to turn off HTML5 validation in the preferences when you're not developing an HTML5 website.

I've also added, upon request, a toolbar button. This quickly toggles the Safari Validator bar so you can reclaim the precious screen real estate when you're not interested in the validation results.

As most of you (being designers and all) will probably notice, the button image is just about the ugliest ever created, and I can only agree. I'd have put the circle with the v-checkmark in there in grayscale, but Apple forces you to use an (alpha) bitmap, and I didn't have the time to create a nice looking icon. If you're interested in making a decent logo for the toolbar button, let me know. Just make sure to only use black (and transparency).

Three other fixes are also included: the W3C validation results page is now shown again, and the border the extension added at the bottom of each website is gone. Also, the embed shouldn't be injected into iframes anymore, so it won't end up in CMS rich text editors.

Missing months in AWStats

A strange thing happened when importing a large number of logfiles into AWStats. Apparently only three out of twelve months were being imported correctly. Even over several years, only the same three months would be imported, the rest discarded.

I pipe the Apache logfile into cronolog to make sure the logs end up in the following folder structure: YYYY/MM/DD/access.log. This makes it easy to find something in the log files. Each night, the giant access.log file gets split up into several files, one for each vhost. This is easily accomplished by appending the vhost to each log line: a script then regexes for the vhost, puts the line in the log folder for that vhost and gzips the logfile. This way I only need two running cronolog processes during the day, instead of one for each of the vhosts.
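
For reference, the cronolog part is nothing more than a piped log in the Apache configuration, roughly like this (the binary and log paths are placeholders for your own setup):

CustomLog "|/usr/sbin/cronolog /var/log/apache/%Y/%m/%d/access.log" combined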

So, when migrating a lot of sites I had to reprocess the logs for AWStats as well, seeing as I didn't copy the AWStats databases. This is quite easy, as you can specify in the .conf file which files it needs to parse. I put in the following:

LogFile="find /path/to/site.com/logs/ | grep access.log.gz | xargs zcat |"

This finds all the files in the log folder, filters for the access logs, and zcats them as input for AWStats. This is where it went wrong, and most of the months went missing. The cause was actually really simple: AWStats expects logfiles to be fed in chronological order, and running the find by itself clearly showed that the results were not in chronological order (as indicated by the folder names).

This is apparently a difference in the find implementation. The old server, running CentOS, sorted the find output perfectly, so I had never encountered this problem before. The new server, running a different Linux flavor, mixes up the find results.

The solution was obvious:

LogFile="find /path/to/site.com/logs/ | grep access.log.gz | sort | xargs zcat |"

Migrating Qmail: smtproutes not working?

While migrating a Qmail installation to a new server, I ran into a peculiar problem. To avoid spreading mail over two different servers while the DNS change propagates to all the DNS servers, you can use the smtproutes file to instruct Qmail to deliver mail to a different server. Basically you tell it to accept mail for the domain you are moving, but to instantly deliver it remotely to the IP address or hostname of the new server.

The procedure is actually quite simple. Mirror all the accounts on the new server, so that all the mail gets accepted there. Then, on the old server, remove the domain from the /var/qmail/control/virtualdomains file (and from the locals file if it contains the domain as well). Finally, if it doesn't already exist, create /var/qmail/control/smtproutes and add example.com:1.2.3.4 in there.

1.2.3.4 is the IP address of the new mail server (you could also enter the domain name there), and example.com is the domain you are migrating.
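
To make it concrete, this is roughly what the relevant control files look like on the old server after the change (example.com and 1.2.3.4 being the placeholders from above):

/var/qmail/control/rcpthosts (the domain stays listed here, so mail is still accepted):
example.com

/var/qmail/control/virtualdomains (the example.com line has been removed)

/var/qmail/control/smtproutes (route the domain to the new server):
example.com:1.2.3.4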

By using this procedure, you ensure that no new mail gets delivered on the old server. Mail servers that haven't seen the updated DNS entry yet will deliver to the old Qmail server, which will forward the message immediately to the new one.

This is where I ran into a problem. When I was testing out this configuration, the mail wouldn't end up on the new server. I would receive a bounce containing the "554 too many hops, this message is looping" error, and I could see in the logs that the message was indeed looping: delivery was attempted around twenty times in fast succession, never leaving the old mail server. I looked with tcpdump on both the old and the new server to see which one was doing something wrong, but the new server wasn't even being contacted at all.

After researching a bit I found that several people had encountered the same 'too many hops' issue in combination with smtproutes. The offered solutions were always the same:

  1. Make sure the domain is still in /var/qmail/control/rcpthosts, otherwise Qmail won't accept the mail at all (it won't relay for unknown domains)
  2. Remove the domain from /var/qmail/control/virtualdomains and locals
  3. Add the domain to the /var/qmail/control/smtproutes file in the format domain:newmailserver

Everything on that list was as it was supposed to be, and yet mail wasn't being forwarded to the new server. The old server didn't even open a connection to the new one. Even worse, all incoming mail was being bounced.

It took me a while, but I finally understood what the problem was. On the old server, I have a /29 block of IP addresses available to put servers on, and all of them were assigned to the old server. The new mail server, on a physically different machine, was configured for one of those IP addresses! It didn't really matter that the old server still had that IP address active as one of its own, as the router knew about the change. The mail server, however, thought that the IP address I put in smtproutes was on the same old machine, because the network interface for that IP address was still up.

As soon as I ifconfig down'ed the network interface for the obsolete IP address on the old server, the mail forwarding worked just fine. So even if the three points to check above don't do the trick when smtproutes isn't working, make sure you're not forwarding to an IP that's still assigned to a network interface on the old machine!
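
In other words, something as simple as this on the old server (the interface name is hypothetical; use whichever interface or alias holds the migrated address):

ifconfig eth0:1 down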

SafariTidy becomes Safari Validator!

I've rewritten the plugin to make use of the new extension mechanism in Safari 5. This means that the reliance on SIMBL is gone and no private APIs are used; it is now just a combination of a web plugin and a Safari extension. The extension handles all the user interface, and the web plugin handles the actual validation.

Installation is pretty simple: copy the webplugin to ~/Library/Internet Plug-Ins and double-click the safariextz file. The latter requires that you have enabled extensions in Safari. To do this, open the preferences and go to the 'Advanced' pane. Check 'Show Develop menu in menu bar'. In the newly visible 'Develop' menu, select 'Enable Extensions'.

Please note that the W3C validation takes time! It will slow your browsing down, especially if the site contains a lot of (i)frames. If that bothers you, just go to the preferences and disable the W3C validation.

HTML5 support isn't there yet, as I'm still investigating the best solution. The validator.nu engine (which the W3C uses) assumes you will be running it as a web service, which is not ideal in a browser situation. Opening a port and running a server inside the browser is definitely a worst-case solution.