After weeks of teasing, this post is finally here. To coincide, I'm conducting a Google+ Hangout tonight (more info here).
In the first few years I was on the tech team at Dreggs, playing videos was complicated and required a ninja.
We had a dual-head computer (kindly supplied by +Richard Vicary) running nVidia's excellent nView driver. This provided many of the features present in Linux since the '90s that Windows was (and, to this day, still is) missing, such as multiple desktops. It also allowed hotkeys to be assigned to nearly everything it did. Media Player Classic (supplied with the K-Lite codec pack) did a similar thing for video.
During setup we'd cue each of the afternoon's many videos on a separate desktop, ready to roll on command.
So, when the host said "go", we'd hit space (play) on the pre-started-and-paused video, then shift-F2 (move window to second monitor) and alt-enter (fullscreen) in quick succession.
If we got it right, at sufficient speed, no-one noticed anything. If we screwed up, a window appeared on the projector, rightfully shaming us for the vidiots we were.
Four years in I discovered mplayer, a richly featured media player built on FFmpeg (which may be familiar to VLC users). mplayer plays pretty much anything, and has a list of audio/video filters too long to read in one sitting, including "bmovl", which reads in overlay data (like a watermark) from a FIFO (named pipe).
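For anyone curious, driving bmovl looks something like this. The paths and overlay size are made up, and a plain file stands in for the FIFO so the sketch runs without mplayer; in real use you'd `mkfifo /tmp/bmovl`, start `mplayer -vf bmovl=0:0:/tmp/bmovl film.avi`, and write into the pipe instead.

```shell
#!/bin/sh
# Sketch of feeding mplayer's bmovl filter. OUT stands in for the FIFO so
# this runs without mplayer; in real use these commands go down the pipe.
OUT=/tmp/bmovl_demo
W=64; H=16                                   # overlay dimensions (illustrative)

# bmovl takes a text header -- format, size, position, alpha adjustment,
# clear flag -- followed by W*H*4 bytes of raw RGBA pixel data.
printf 'RGBA32 %d %d 10 10 0 0\n' "$W" "$H" > "$OUT"
head -c $((W * H * 4)) /dev/zero >> "$OUT"   # a blank pixel block
printf 'SHOW\n' >> "$OUT"                    # make the overlay visible
```

The same pipe also understands HIDE, and resending RGBA32 frames with small differences each time is exactly what makes the animation trick below possible.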
It was this feature that started me thinking: being able to play videos with a watermark would be nice. But then I realised that, if the overlay was resent regularly with subtle differences, an animation could be made. Inspired by Channel 4's "Now, Next, Later" screens, I set to work constructing a prototype.
A simple overlay was fairly easy to make, both from the shell and in C++. A fading, moving animation was a little trickier; the part that took longest was the alpha-blending code, which is harder to write than it seems, as it involved a lot of int-float-int conversions whilst traversing a large array of pixel values. This ended up as a utility that accepted a range of values from the command line.
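The core of that blending is one line of arithmetic per channel. Here's a minimal sketch (in shell rather than the original C++, with made-up sample values) of the standard over-blend, kept entirely in integers to dodge the int-float-int round trips:

```shell
#!/bin/sh
# Per-channel alpha blend: out = (src*a + dst*(255 - a)) / 255.
# The real utility did this in C++ over a whole pixel array; staying in
# integer arithmetic avoids the int-float-int conversions entirely.
blend() {   # usage: blend SRC DST ALPHA  (all 0-255)
    echo $(( ($1 * $3 + $2 * (255 - $3)) / 255 ))
}

blend 200 100 255   # fully opaque source: 200
blend 200 100 0     # fully transparent source: 100
blend 200 100 128   # roughly half-and-half: 150
```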
Using GNU getopt and libconfig simplified parsing of the command-line options and config files, and I'd recommend them to anyone tinkering with Linux programming. The --nownextlater option actually takes a config-style file, which specifies the font, timings, images and text used, etc.
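The originals parsed options from C++, but GNU getopt's long-option handling is just as handy from the shell. A hypothetical sketch — the option names mirror the post, and the sample arguments are hard-coded so it runs standalone:

```shell
#!/bin/sh
# Sample input standing in for "$@"; drop this line in a real script.
set -- --nownextlater nnl.conf --verbose

# GNU getopt (util-linux) normalises the long options for the case statement.
ARGS=$(getopt -o '' --long nownextlater:,verbose -- "$@") || exit 1
eval set -- "$ARGS"

NNL_CONFIG=""; VERBOSE=0
while true; do
    case "$1" in
        --nownextlater) NNL_CONFIG="$2"; shift 2 ;;  # config-style file
        --verbose)      VERBOSE=1; shift ;;
        --)             shift; break ;;
    esac
done

echo "config=$NNL_CONFIG verbose=$VERBOSE"
```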
We found a slight problem during the first week: occasionally, when the nnl screen was due to appear, mplayer would crash, which, by its nature, happened halfway through a film, leading to a rushed restart-and-skip. The problem disappeared in the second week, for no discernible reason other than that we'd put a backup in place! I'm more than happy to explain the finer points of this bug in the hangout, so come along to hear the joys of capture cards and jury-rigged delay lines!
Mplayer can also take a playlist from a FIFO, so I put together a quick little utility to control such a playlist. It had a fairly useful set of options, and was used during the event to keep track of two separate mplayer instances: one played background music, whilst the other ran the main videos.
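Mechanically, that playlist control is just lines written into a named pipe. A minimal sketch, with `cat` standing in for mplayer (which would be started with `-playlist` pointed at the FIFO) and illustrative filenames:

```shell
#!/bin/sh
FIFO=/tmp/playlist_demo
rm -f "$FIFO"; mkfifo "$FIFO"

# Stand-in reader; in real use this is: mplayer -playlist "$FIFO"
cat "$FIFO" > /tmp/playlist_log &
READER=$!

# Each line pushed into the FIFO becomes the next playlist entry.
# One redirection keeps the pipe open for the whole batch.
{
    echo "intro.mp4"
    echo "main-film.mp4"
} > "$FIFO"

wait "$READER"
```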
Dreggs' Got Talent
For one of the evenings we ran "Dreggs' Got Talent" (where that apostrophe is/could/should be was the subject of long debates, but I think that's where we went in the end), where we had actual working (see below) buzzers for the judges.
In previous years we'd had a PS/2 keyboard with buttons soldered on and its cable extended via mic cable. This was accomplished by running GND, +CLK and +DATA into an XLR, with a local 5V adaptor providing power. At the far end was an XLR-PS/2 cable. My initial plan had been to use the multicore to extend the cable, but the signal degraded too much, so we had to use daisy-chained mic cables, which worked a treat.
However, my laptop doesn't have a PS/2 port, having been built in the last decade, so I had to convert the buzzers to USB. This presented a problem, as USB doesn't like travelling 15m down mic cables. Despite my best efforts, I ended up having to connect them to my netbook, hidden under the judges' table, and then use netcat to send the keystrokes over a network to my laptop. They then went through a convoluted system of named pipes and shell scripts before being picked up by the master dreggs utility, which displayed the Xs on screen and played the sound accordingly.
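The named-pipe end of that convoluted system worked roughly like this sketch. The cue names, and the idea that each judge sent a single digit, are assumptions for illustration; in real use the netcat listener (`nc -l <port> > fifo`) would feed the pipe:

```shell
#!/bin/sh
FIFO=/tmp/buzzer_demo
rm -f "$FIFO"; mkfifo "$FIFO"

# Turn each keystroke arriving in the FIFO into a cue for the master
# utility. Reading line-wise keeps this portable; 1/2/3 are the judges.
relay() {
    while IFS= read -r key; do
        case "$key" in
            1|2|3) echo "BUZZ judge $key" ;;  # show the X, play the sound
            q)     break ;;
        esac
    done < "$FIFO"
}

relay > /tmp/buzzer_log &
RELAY=$!
printf '1\n3\nq\n' > "$FIFO"   # pretend judges 1 and 3 buzzed
wait "$RELAY"
```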
The biggest issue with this setup was reliability. Sometimes it worked wonderfully, sometimes acts managed an extra thirty seconds whilst the judges repeatedly pressed their buttons, waiting for the tech desk to work. It did allow us to overlay the buzzers on a video feed, however, so was utterly worth it!
dreggs was the master utility: a neat little GUI written with ncurses, an excellent little library for building text-based user interfaces in a console, with remarkable speed.
It consisted of a list of cues (loaded from a config file given on the command line), each one able to run a list of shell scripts sequentially. These shell scripts ran the other utilities, with an option to run them asynchronously, as some included 20-minute waits.
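In miniature, the cue-running logic amounts to something like this (the scripts and cue contents here are invented for the sketch):

```shell
#!/bin/sh
# Two dummy cue scripts standing in for the real ones.
mkdir -p /tmp/cues
printf '#!/bin/sh\necho lights-down\n' > /tmp/cues/10-lights.sh
printf '#!/bin/sh\necho roll-video\n'  > /tmp/cues/20-video.sh
chmod +x /tmp/cues/10-lights.sh /tmp/cues/20-video.sh

run_step() {   # usage: run_step SCRIPT [async]
    if [ "$2" = "async" ]; then
        "$1" &      # long waits don't block the rest of the cue
    else
        "$1"
    fi
}

run_step /tmp/cues/10-lights.sh       > /tmp/cue_log
run_step /tmp/cues/20-video.sh async >> /tmp/cue_log
wait    # let any async steps finish
```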
That was quite clearly hacked together! I only had a few days before we left once I got to this part, so shell scripts were a nice, flexible way of doing things that didn't require reimplementing the code I'd already written. During my first year at uni I started work on updating it into a monolithic program with a slightly slicker UI and many more capabilities, but then work and a social life kicked in! Somewhere I probably still have a Subversion repository with all the code in it, but the drive to develop a media server is severely limited when you get to work with +dave green's Ai on a regular basis!
It worked fairly well, but did require a whole bunch of little shell scripts to be written, which I've been trying to reverse engineer (comments tend to be lacking when rushing) whilst getting ready for the hangout!
The final part of the system is a couple of scripts that run mplayer with the correct arguments: telling it to use an external playlist and the bmovl overlay, to go fullscreen on the second screen (surprisingly tricky), and not to do anything unless my other programs tell it to.
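As a dry run (the `echo` keeps it from needing mplayer installed), the launch wrapper boiled down to an invocation along these lines. The flags are genuine mplayer options, but the exact combination the original scripts used is a best-guess reconstruction, and the FIFO paths are illustrative:

```shell
#!/bin/sh
FIFO_PLAYLIST=/tmp/playlist   # filenames pushed in by the playlist utility
FIFO_OVERLAY=/tmp/bmovl       # overlay frames from the nnl utility
FIFO_CONTROL=/tmp/mp_cmd      # slave-mode commands from dreggs

CMD="mplayer
  -fs -xineramascreen 1
  -vf bmovl=0:0:$FIFO_OVERLAY
  -slave -idle -input file=$FIFO_CONTROL
  -playlist $FIFO_PLAYLIST"

# -fs plus -xineramascreen picks the second screen; -slave -idle makes
# mplayer sit and wait for commands. Drop the echo to actually launch.
echo $CMD
```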
There were also some variants I wrote after we encountered problems to get things going again in a rush, such as allowing skipping and clearing anything that might be causing a problem. Although, looking back now, they may have been a little optimistic!
That sums up what made my little media server what it was. Part of me regrets not being able to develop it further, although as that turned out to be the last year we hosted Dreggs Cafe, it may've saved me a lot of unused development time!
Make sure to join me in the hangout tonight (more info here), or watch it on YouTube at your convenience afterwards.