Alex Gyoshev

Downloading CSS Resources via Wget

Imagine that you need to borrow a hosted CSS file, along with its resources. You paste the CSS contents into a text editor and start searching for url() patterns within it. After seeing 100+ matches, you bless the name of the CSS-sprite-oblivious person who built it.

That could be a nightmare of a working day; hopefully it is not your reality. Don't worry! wget is here to save the day!

Downloading for offline use, then pruning the result

It turns out that wget has a very handy -p switch that downloads an offline copy of a page, while -k converts the links in it for local viewing. So, if a page references said CSS file,

wget -p -k http://example.com/

… will fetch it, and you can then pick out the CSS and its resources from the result.
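
If you only need the stylesheet and its assets, you can prune the mirrored copy afterwards. A minimal sketch, assuming the stylesheet lives under css/ on the mirrored host:

# keep just the CSS directory from the offline mirror
cp -r example.com/css ./borrowed-css
# discard the rest of the mirrored page
rm -rf example.com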

Download only what’s needed

However, downloading the HTML page and all of its cousins is not viable if the page is heavy, or if you don't know of a page that references the stylesheet.

Instead, you can automate the necessary steps to achieve the same result:

  1. Download the CSS file
  2. Get all url() references from it
  3. Download each of the above, relative to the CSS

Here's the command, with inline comments:

# get the CSS file
wget http://example.com/css/styles.css

# the script below assumes single quotes in the url() declarations
# find all URLs within the stylesheet
grep -oi "url('[^']*')" styles.css | \

    # strip the url(' prefix, then the closing quote and any query string
    sed "s/url('//" | sed "s/\(?\|'\).*//" | \

    # no need to download anything twice
    # (sort first, since uniq only removes adjacent duplicates)
    sort | uniq | \

    # pipe all URLs to wget, using the correct base URL
    wget -i - -B http://example.com/css/

Now you’ll have time for something else. For example, learn how to solve the Rubik’s cube. This one is really good.

Learning Vim Interactively

Quite a few people find the default vimtutor too static for their learning style, so here's a list of sites that let you learn Vim in a more interactive way.

OpenVim Interactive Tutorial

This is an interactive tutorial done right. The site runs a fairly complete Vim simulation in your browser and guides you while you edit and navigate text. The nice thing about a web app is that you can start learning shortcuts before you even install Vim! Plus, it gets bonus points for being open source and publishing its test suite online.

VimGolf

VimGolf is a list of challenges for modifying text in the shortest keystroke sequence. While not an interactive tutorial that guides you hand in hand, it makes you think about and research ways to edit a block of text. You get to see some of the solutions that beat yours, which lets you improve your style. You also learn to appreciate life-changing techniques, such as typing ZZ being two keystrokes shorter than :wq. While not all tasks are directly useful (looking at you, cow reversal), there are some real-life situations submitted by other users (downgrading jQuery code, for example).

Vim Adventures

I admit that I'm a big fan of learning new skills through games. I strongly believe that mixing fun with education leaves you with a more memorable experience, and this helps you retain the material longer. Vim Adventures is a well-crafted game that lets you move through its world using the Vim keystrokes, overcoming obstacles with various shortcuts. After playing the first level (which covers basic navigation), you get access to the rest of the levels (2-12, plus one planned final level) for the next 6 months for $25. I am curious whether mastering the game controls improves your Vim navigation skills; if you observe such a correlation, please share it in the comments.

Shortcut foo

Another freemium app; the basic mechanic is that it asks you to type the shortcuts that perform a described action. A particularly useful feature is that it gauges your accuracy, so you can determine where you need the most training. The free tier includes beginner text editing and navigation, a total of 42 shortcuts. To unlock all Vim training, you pay a one-time fee of $5. As in Vim Adventures, you learn the shortcuts outside of the editor, so your performance within the app might differ from your performance in the editor.

In conclusion

I hope that these tools improve your learning experience. Keep in mind that the best way to learn is to go through some of the pain and start using Vim on a daily basis.

Happy vimming!

Js13kgames 2013 Post-mortem

Have you ever wondered how it feels to write an HTML5 game? My wife and I did, and we created the game Life of Blobb for the js13kgames competition. Participating proved very educational, especially because of the constraints:

  • Limited size. 13 kilobytes means that you get to develop with limited raster graphics, relying more on code to create your game. Consider how much minified canvas code fits into a single kilobyte, compared to the image it could replace.

  • Limited time. The time range was August 13th to September 13th, giving participants a month to write, test, and polish their games. Since we started late and had to finish early, our development time was roughly cut in half, but it was still sufficient.

What went well

  • The object hierarchy and level structure turned out very well. Levels are scriptable (providing setup and tick events), which allowed for flexibility such as time limits and different setups.

  • The levels have been balanced to be challenging enough that the game is not easily beaten on the first try. Full credit for this goes to my wife; I had quite a bit of trouble beating some levels after she was done balancing and I pulled her changes.

  • I particularly enjoy the ending. This year’s theme was bad luck, and I think it is appropriate.

  • Since this is our first game, it was a good sign when @mozhacks published a game development tutorial and it matched the design that evolved on our side.

What went badly

My only regret is that the instructions were not well written. In particular, one of the main mechanics – keeping the player blob within a given size range – was described poorly and frustrated many players. A lot of interactions could have been polished further, but given the time frame, they turned out well.

During development, one of the biggest engineering hurdles was how one blob consumes another. At first, this was done by calculating the overlapping area of the two blobs, but that yielded very rapid blob growth and rendered the game unplayable. In the end, conserving the distance between the blobs' centers proved more playable and allowed mechanics like brushing off bigger blobs. Still, this problem consumed a lot of time that could have been spent on polishing.

In conclusion

I am happy with the end result. Who knows, it may get a bit more polish sometime soon…

Normalizing Line Endings in Git Repositories

Motivation

As noted on GitHub, if you develop your project (happily) in a cave where all the other tribe members use the same configuration, you may be oblivious to the raging line-endings war that has cursed computing for decades. If it so happens that another tribe member arrives with a different configuration, the line-ending genes may mash up and create stronger files (if you consider mixed line endings genetically strong). Unfortunately, this causes severe insanity in the tribe, so you would like to set out on a path to eradicate the genetic variation and establish the supreme rule of the LF. Or the CRLF. I won't judge your preferences.

Two methods, two end results

Choosing between the two methods depends on what you value most: preserving what has been pushed upstream, or preserving the commit history.

If you care most about upstream, you can follow GitHub's suggestion; its TL;DR version is to create a commit that fixes all the line endings. This approach is perfect for open-source projects, since you cannot change your project's history without mangling all of its forks. The downside is that every git blame from now on will point to the commit that fixed the line endings. For your convenience, the snippet is reposted here:

git rm --cached -r .
git reset --hard
git add .
git commit -m "Normalize line endings"
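
For the re-add step to actually normalize anything, Git must know what the canonical endings are, typically via core.autocrlf or a .gitattributes file. A minimal .gitattributes example (assuming you want Git to handle everything it detects as text):

# let Git normalize all files it detects as text
* text=auto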

If you want to preserve your commit history and have more control over your repo, you can instead rewrite the line endings in each offending commit with git filter-branch. This is usually useful for private repos, and you will need to contact all project members to let them know. I urge you to make a repository backup (hosted or local) prior to executing the steps below, since they rewrite history.

# filter all branches and run the ~/fix-eol.sh shell script
git filter-branch --tree-filter '~/fix-eol.sh' -- --all

The referenced fix-eol.sh script does the actual conversion:

#!/usr/bin/env bash
# ~/fix-eol.sh
# convert all js, css, html, and txt files from DOS to Unix line endings
# (-print0 and -0 keep file names with spaces intact)
find . -type f -regex ".*\.\(js\|css\|html\|txt\)" -print0 | xargs -0 fromdos
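
One small prerequisite: filter-branch hands the --tree-filter command to a shell, so the script must be executable:

# make the filter script runnable
chmod +x ~/fix-eol.sh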

Depending on the size of the repository, this command may take quite some time. The factors that determine the run time are:

  • the number of commits - since each commit may contain files with bad line endings, each one is checked out and processed. The more commits, the slower the process

  • the number of files that need to be converted - you can run the command over the complete repository, or only over specific folders

To speed up the process considerably, you can use an in-memory file system (the filter-branch docs even recommend this).

# create a temporary folder on the /dev/shm memory fs
mkdir /dev/shm/repo-temp
# run filter-branch with an in-memory temporary directory
git filter-branch --tree-filter '~/fix-eol.sh' \
    -d /dev/shm/repo-temp -- --all

That will process the repository significantly faster, depending on the size of the in-memory directory. The above factors still apply - in my experience, filtering 30,000 commits with about 50 files to convert takes about 1.5 hours on a 4 GB directory. Your results may vary.

Reflection

One nice thing about the second approach is that it can be run after the first, since the commit that normalized the line endings simply becomes obsolete once the filter runs. You can normalize the line endings through a commit that preserves the upstream history, and if the loss of blame history later hinders your development, you can still decide to filter the history.

Techniques for Better QUnit Tests

Since I use QUnit on a daily basis, here are some tips that might prove helpful in your daily test cycle.

Limit execution time of asyncTests

One common pitfall when writing asyncTests is that they hang when the operation under test never calls its callback. This can be resolved with a timeout, as this pattern shows:

// specify the count of asserts that need to run
asyncTest("foo", 1, function() {
    // limit the test execution time to 10s
    var timeout = setTimeout(start, 10000);

    performAsyncOperation(function onDone(status) {
         // suppress the test timeout
         clearTimeout(timeout);
         start();
         ok(status);
    });
});

Temporarily suppress test teardown

When debugging DOM-related tests, it is useful to see the state in which a test ended, but a proper teardown that restores the suite to a clean slate hinders this. A simple and efficient way to inspect the end state of a test is to call the stop method:

test("bar", function() {
    $("<div class='bar' />").appendTo("#qunit-fixture")
        .css("color", "red");

    // temporarily turn the test into an asynchronous one
    // this will hang the suite, but will allow inspection
    stop();
});

Maintaining a big suite of tests: Composite suites

One of QUnit's rarely advertised features is the official composite plug-in. It solves a problem that big suites have: it is not practical to keep a few thousand tests in a single web page. The composite plug-in enables you to split your tests into different files, each of which tests a different part of the code. For example, one page that tests the API of a component, and a separate page for its rendering:

composite test suite

This way you know exactly where the tests for a given API method or a specific piece of functionality live, which saves time; searching within 500 tests is tedious, not to mention navigating them.
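
For illustration, the master suite page boils down to a single call that lists the child suite pages. A minimal sketch (the file names are made up, and it assumes the composite plug-in scripts are loaded alongside QUnit):

// master suite: each child page runs in an iframe,
// and its results are aggregated into a single report
QUnit.testSuites([
    "api-tests.html",
    "rendering-tests.html"
]);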

Know thy framework

I am often surprised by how many people write unit tests but are unaware of the features of the underlying framework. The official QUnit cookbook is a 15-minute read, and it can save you hours of reinventing the wheel.

Now go red-green that shiny new feature, or that pesky old bug!

Maintaining an Up-to-date Vim Configuration With Git Submodules

[Update: The Vundle plug-in conveniently abstracts away the approach described here. I have been using it for quite a while and highly recommend it, just as Erik did]

Nowadays, people share even their DNA on GitHub. With such a healthy open-source ecosystem and new updates every day, here is a way to keep up with the latest versions of your vim plug-ins.

Git submodules

Let's start with an idea: consider your vim configuration a project, with plug-ins as its dependencies. Since you want the latest and greatest, you may want to link directly to their sources on GitHub. But how? This is where git submodules come in:

<code title="git submodule basic syntax">git submodule add <repository> [<path>] </code>

In the case of your vim configuration, the above command will take the following form:

<code title="cloning the NerdTree plug-in to your vim bundle">git submodule add https://github.com/scrooloose/nerdtree.git .vim/bundle/nerdtree</code>

The above command adds a reference to the plug-in's repository to your vim configuration (which is itself in git). However, there is a problem: since many projects have their own directory structure, you cannot simply drop them into the specific directories that vim expects. Luckily, vim superstar Tim Pope has developed a remedy called "pathogen".

Pathogen

This great plug-in loads other plug-ins from the ~/.vim/bundle directory. Since pathogen is yet another dependency of the project, it should be placed in the .vim/bundle folder as well:

<code title="cloning the NerdTree plug-in to your git bundle">git submodule add https://github.com/tpope/vim-pathogen.git .vim/bundle/pathogen</code>

Then, initialize pathogen from your vimrc:

<code title="pathogen initialization">source ~/.vim/bundle/pathogen/autoload/pathogen.vim
call pathogen#infect()</code>

Now that vim is infected with the pathogen, your plug-ins are fully loaded. Hack away!

Updating plug-ins

Here comes the really sweet part: if you want to update all of your plug-ins (bound as submodules), it is as sexy as running one line in bash:

<code title="updating all git submodules (i.e. vim plug-ins)">git submodule foreach git pull origin master </code>

If you ever need to update a single module, run git pull origin master from within its directory.
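
For example, here is a sketch that uses the nerdtree submodule from earlier; keep in mind that an update only sticks once the new submodule revision is committed in the configuration repository:

# update one plug-in
cd .vim/bundle/nerdtree
git pull origin master
cd ../../..

# record the new submodule revision
git add .vim/bundle/nerdtree
git commit -m "Update nerdtree"

Also, when you clone your configuration onto a new machine, run git submodule update --init to fetch all the plug-ins in one go.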

In conclusion

I hope that this will let you manage your vim configuration with ease - it works great for me! We are one step closer to our personal configuration heaven. If you found this article to be useful, leave a comment or hit me up on twitter!

qHint - Enforcing Coding Conventions With jsHint and qUnit

[Update: the project is now called qHint, as Leo Balter suggested]

Open-source and coding conventions

Adhering to coding conventions (other than your own) is hard. This is particularly true in the open-source world, where every project may have different style requirements. This makes code contributions harder, since you either:

  • have to remember the coding style for each project that you contribute to, or

  • submit patches that do not follow the style guide of the project (thus either scrambling the code or making the reviewers’ job harder)

Tools like jsLint address these problems by enforcing strict rules about coding style. However, requiring all developers to run jsLint manually is error-prone, as it is yet another step in the contribution process. This post provides a solution for projects that have unit tests written in qUnit – but the same approach can be applied with any other unit testing framework.

JsHint

jsHint is a code quality tool forked from jsLint. While the latter has strict rules and pretty much tries to enforce one general coding convention, the former allows flexible customization of the rule set so that it matches your coding standards. Its configuration is dead simple, and you get an array of all style violations when you call the JSHINT function.
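
For instance, a direct call looks roughly like this (the options shown are illustrative; use the ones that match your convention):

// JSHINT returns true when the source passes all configured checks
var clean = JSHINT(source, { eqeqeq: true, curly: true });

if (!clean) {
    // each entry carries line, character, and reason fields
    console.log(JSHINT.errors);
}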

Integrating jsHint into qUnit

Imagine that you could validate a JavaScript file against jsHint with the following line:

jsHintTest('Core', '../src/core.js');

It would be neat, right? Well, the following code introduces this new type of qUnit test. It fetches the specified JavaScript file and reports any validation errors as qUnit test failures.

function jsHintTest(name, sourceFile, options) {
    function validateFile(source) {
        var i, len, err,
            result = JSHINT(source, options),
            errors = JSHINT.errors;

        ok(result);

        if (result) {
            return;
        }

        for (i = 0, len = errors.length; i < len; i++) {
            err = errors[i];

            if (!err) {
                continue;
            }

            ok(false, err.reason + " on line " + err.line +
                                   ", character " + err.character);
        }
    }

    return asyncTest(name, function() {
        $.ajax({
            url: sourceFile,
            success: function(source) {
                start();
                validateFile(source);
            }
        });
    });
}


And the result is…

Screenshot of qUnit tests that validate coding style using jsHint

The latest qHint code is on GitHub.

Beginning Vim

Vim? What? Why?

Let me say this: there is a raging war out there, driven by the eternal question "What text editor do you use?". Personally, I was ignorant of the two camps (namely, the vi-like editors and Emacs). However, I got curious when I found out that Notepad-like programs are not even considered text editors. When people have such strong feelings about a program, it definitely deserves a few hours of play.

So what is Vim? Simply put: an extensible text editor, focused on the manipulation of text rather than pure editing. There are currently about 3500 extensions for it, ranging from source control integration to a personal wiki. It is said that mastering Vim yields great productivity gains in terms of coding. Well, that's pretty convincing!

Learning Vim

If you feel hooked, you should download and install it. There is only one practical way of learning Vim, and that is to use it. The first advice I got was to do all my text editing in it for a week before judging it. But let me give you a word of warning – the learning curve is steep, to say the least.

Personally, I started off with Vim's integrated tutorial, called vimtutor. It's a shell script that can be found in the installation directory; under 64-bit Windows, that's C:\Program Files (x86)\Vim\vim73\vimtutor.bat. You'll need to run it several times (while practicing "how to close vim"), so keeping a command prompt or explorer window open in that directory is quite handy. Here are also some interactive tutorials that I reviewed.

If you prefer video, and want to spare a few bucks to treat yourself, PeepCode has the Smash Into Vim series (2 x $12). The alternative is Derek Wyatt's craaaazy vim novice screencasts* - these are funny, too! Vimcasts has videos that cover interesting topics to supplement the other tutorials, but they aren't targeted at complete novices.

If you are an active Twitter user, you may like to follow @vimtips for daily vim tips. Past tips are archived on vimtweets.com, and there are real gems among them.

And if you want to set up your workspace for learning, you can use the Vim movement shortcuts wallpaper.

Making it beautiful

One definition of beautiful = vim + railscasts color scheme + consolas

This is absolutely necessary, since the default color scheme and font combination is really ugly. I think this repels many people, and it could be fixed for the sake of a better first-time user experience. Of course, it may be an intentional design decision, made either to please the die-hard fans or to create a tribal vim community. Or a bit of both.

My personal preference is the railscasts color scheme with the Consolas font. To apply these, create a _vimrc file (which stores the Vim configuration) in your user directory (for Windows 7, that's C:\Users\yourusername) and place the following two lines in it:

colorscheme railscasts
set guifont=Consolas:h12

To use the railscasts theme, download it from the link above and place it in the vimfiles/colors/ subdirectory of your user directory.
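
In a command prompt, that amounts to something like the following (a sketch; adjust the source path to wherever you saved railscasts.vim):

rem run from a Windows command prompt
cd %USERPROFILE%
mkdir vimfiles\colors
move Downloads\railscasts.vim vimfiles\colors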

Sharing your Vim configuration across multiple computers

If you want to learn Vim both at work and at home (like me!), it is convenient to share your configuration between computers, so that you get the same environment without much effort. What better way to share it than to put it on GitHub! However, there turned out to be two problems with this:

  1. I turned out to be too lazy to pull my configuration every time I made a change. Dropbox to the rescue! I simply cloned the configuration repository into my Dropbox folder, and it got synced across my work and home PCs.

  2. I didn't want the local repository to be the same as my Windows 7 user folder. This is easily fixed by creating symbolic links for _vimrc and vimfiles, like this:

    cd c:\users\gyoshev
    mklink _vimrc "F:\github\config\_vimrc"
    mklink /D vimfiles "F:\github\config\vimfiles"

In Windows Explorer, the newly created links should look like this:

symbolic links to _vimrc and vimfiles

Voila! Everything gets synced everywhere.

Conclusion

Well, I hope that these resources prove useful to you! I certainly feel more confident after having read and watched most of them. And if you need help, don't forget the :h command ;-)

P.S.: Many thanks to @korchev, for the initial help and the aggressive Vim marketing at work :)

* Don't worry, after the first 2-3 of them, he calms down. At first, I expected that he'd ritually consume his mouse, with candles and everything. You'll see what I mean.

Cutting the Google Analytics Script in Half

[Update: after a humbling pull request on GitHub, I was pointed to a blog post that shows a better process.]

I am a bit extreme when it comes to optimization (as I will prove in a bit). Both YSlow and Google PageSpeed are great pals in this regard, and I get concerned when they report inefficiencies. You see, I use Google Analytics to track the page views of this site. If you have ever used it, you have certainly copied the tracking script from the GA site – the script that is generated for you automatically, so you just need to paste it into the pages you want to track. I like this feature very much, as it saves time; it also saves Google the need to explain what code you need to write to get it done. However, the generated code is neither optimized nor minified, so the above tools find it offensive. So I took up the effort of optimizing it, and here is how.

Starting script

<script type="text/javascript">

  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXX']);
  _gaq.push(['_trackPageview']);

  (function() {
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www')
                 + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();

</script>

This is the default snippet generated by the GA site, without additional options.

Optimization process

  1. Array.push can accept multiple arguments. This is even mentioned in the online help, but the generated code does not take advantage of it:

    _gaq.push(['_setAccount', 'UA-20578581-1'], ['_trackPageview']);

  2. The HTTP and HTTPS GA scripts are identical. Furthermore, both URLs work fine without specifying a subdomain. Thus, using a protocol-relative URL (part of the URI RFC) to link the JavaScript shaves off a few more bytes:

    ga.src = '//google-analytics.com/ga.js';

  3. (Theoretically optional) The script preserves its own location in the document, which is nice for more advanced scenarios that require additional tracking. However, placing the script anywhere other than just before the closing body tag might degrade performance. If performance is more important to you than those scenarios, append the script to the body instead:

    document.body.appendChild(ga);

And after a few manual minification tricks, here comes the…

Optimized script

  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXX'],['_trackPageview']);

  (function(d) {
    var s = d.createElement('script');
    s.type = 'text/javascript';
    s.async = true;
    s.src = '//google-analytics.com/ga.js';
    d.body.appendChild(s);
  })(document);

Optimization statistics

After minification, the script shrinks by 45% (187 bytes), since the current GA code generator does not minify its output. Voila!

The optimized script is active on this site, and was tested in IE6 (‘nuff said).

I hope that you will find this useful.

Professional Staffing, WTF?

Having just read REWORK, I find the idea behind HR agencies really annoying. I can't see a worse way to completely miss the point of hiring, from both the candidates' and the employers' points of view. Template messages with no personality will never attract people who are not desperate. And a boatload of Skype messages tends to be too intrusive for a first encounter, no?

Employers!

If you want to hire mediocre people who do not care about what you work on, then great for you! But if you actually care, you want to hire somebody who does, too. Filtering the candidates is something you should do yourself, instead of blindly trusting outsiders. Doing remarkable work does not require being famous or having a great deal of experience. And if a person values her work, she will want to get in touch with other people who will honor it - like you.

Candidates!

Really? Would you let someone else determine where you spend the better part of your day? Your work should be more important to you than that. Instead, you could go for something you value and believe in. In fact, it does not speak well of an employer if a staffing agency is involved (as noted above). Going to an interview without knowing what the company stands for, and without looking at what it does, undermines your image. An honest cover letter can outline your appreciation for the work being done, and your research will certainly pay off once you meet your future employers. As the saying goes, failing to prepare is preparing to fail.

And…

Introducing more people into your hiring process just increases its complexity (and it is already complex enough). The moment both sides start to care, the middle man perishes. So… start caring.