This is more of a historical footnote than a new development.
A few years ago, in 2006, I worked with an intern from Virginia Tech who built a small cluster of tiled displays for me. It consisted of four Linux machines networked together: one served as the master, and the remaining three each drove two displays. I recall that it used Chromium, the cluster-rendering OpenGL implementation (no relation to the browser).
This was pretty scalable, so additional machines and monitors could be tiled together; with some more investment we could have built a huge visualization wall. I recall encouraging some folks to use it for large circuit designs as well as for visualizing combustion simulations.
The other cool thing I remember was that you could remote desktop into a Windows box from the cluster, which gave you a truly huge Windows desktop. Applications like Google Maps offered a significantly different experience: you could not take in the whole display at once, so your eyes and head moved around as you scanned and focused on different details. It is hard to describe what a qualitatively and quantitatively different experience this was for me.
Today, displays cost even less. There are high-resolution displays, like Apple's Retina displays, and graphics horsepower is significantly better. Studies have shown that more display space helps make users more effective, and I think many power users now opt for two or more large displays.
That tiled display was super cool and useful, and it is easier than ever to attach a lot of pixels to your computer. I don't think you even need Chromium so much anymore. So just do it!