The Guru College

Finally Moved

I’ve finally moved The Guru College blog over to A Small Orange, and away from 1and1. This post only exists on the new site, so if you are reading it, you are reading it from the ASO servers. It only took me 10 months longer than I expected. There are a couple of other sites still living on 1and1, but I have to coordinate with other people to move those.

I Wish iPhoto for iOS was iPhoto Match

I was really excited when I saw the iPhoto for iOS announcement, but quickly discovered that it’s not really what I was looking for. I was hoping it would be for photos what iTunes Match is for music – an extension of the iPhoto library on your Mac – allowing you to share and edit albums, photos and projects seamlessly on any device. That’s how iTunes Match already works: playlists and music are magically kept in sync between computers and iOS devices via iCloud.

Of course, if this were to be that magical photography nirvana, it would have been the announcement all by itself. For years I’ve been looking for a proper solution to the problem of having two or more devices and people trying to edit the same library of photos. The ways I can think of this working:

  1. Metadata, edits, albums and other groupings are kept in sync manually between separate libraries
  2. The library itself is shared between computers on shared storage
  3. The library lives in the cloud and everyone buffers images locally

The first option sucks, and it’s what most people wind up doing. Very quickly, edits and changes aren’t communicated, or photos are imported in one location and not the others. So the libraries are somewhat in sync, but not really enough to be usable. The second option is a lot better, with the caveat that you MUST NEVER open the library in two places at once. The library files really are lots of little databases, and you will corrupt them so fast your head will spin. So this works, but extreme caution must be taken or you’ll be going to backups very frequently.
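
To make the danger concrete, here’s a minimal, purely illustrative Python sketch of the kind of lock-file guard an application would need before opening a shared library. The path and the guard itself are hypothetical – nothing like this exists in iPhoto, as far as I know:

    import os
    import socket
    import sys

    # Hypothetical lock file inside a shared iPhoto library.
    LOCK_PATH = "/Volumes/Shared/iPhoto Library/.library.lock"

    def acquire_lock():
        # O_CREAT | O_EXCL fails if the file already exists, so only
        # one machine can hold the library open at a time.
        try:
            fd = os.open(LOCK_PATH, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        except FileExistsError:
            with open(LOCK_PATH) as f:
                sys.exit("Library already open on: " + f.read().strip())
        os.write(fd, socket.gethostname().encode())
        os.close(fd)

    def release_lock():
        # Called when the application closes the library cleanly.
        os.remove(LOCK_PATH)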

The last option doesn’t really exist anywhere at the moment. Parts of it are implemented in Photo Stream: the images themselves are uploaded to iCloud and shared between machines on the same account. This solves the very basic “we have all imported the same photos” part, but it doesn’t handle metadata or non-destructive edits at all.

What I would like to see is the database side of iPhoto (or Aperture) stored in iCloud, just like Apple does for the iTunes library with iTunes Match. And, like iTunes Match, there would be a local copy of the data, so if you’re in a tunnel or your computer is off the ’net, you can still make changes and see them show up on all the other devices when you rejoin the networked world. This applies to the images as well – they are uploaded to iCloud, though a local copy is kept on the device that uploaded the file for faster access. When another device needs an image, it’s pulled down on demand. As the library data is already in the cloud, all the edits come with the image. They could even be all fancy and save an edit history, to allow a different computer to undo and redo edits to an image.
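
Here’s a rough Python sketch of the model I have in mind – entirely hypothetical, with made-up class names, just to make the cloud-master-plus-local-cache idea concrete:

    # Hypothetical sketch of a cloud-mastered photo library with local caches.
    # None of these classes correspond to any real Apple API.

    class CloudLibrary:
        """The authoritative copy: originals, metadata, and edit history."""
        def __init__(self):
            self.records = {}    # photo_id -> {"edits": [...], "album": ...}
            self.originals = {}  # photo_id -> image bytes

    class Device:
        """A Mac or iOS device holding a partial local cache."""
        def __init__(self, cloud):
            self.cloud = cloud
            self.cache = {}      # photo_id -> image bytes, filled on demand

        def import_photo(self, photo_id, data, album):
            # Importing keeps a fast local copy and pushes the original up.
            self.cache[photo_id] = data
            self.cloud.originals[photo_id] = data
            self.cloud.records[photo_id] = {"edits": [], "album": album}

        def edit(self, photo_id, edit):
            # Edits are non-destructive: appended to a shared history that
            # any other device can replay, undo, or redo.
            self.cloud.records[photo_id]["edits"].append(edit)

        def open_photo(self, photo_id):
            # Pull the original down on demand if it isn't cached locally.
            if photo_id not in self.cache:
                self.cache[photo_id] = self.cloud.originals[photo_id]
            return self.cache[photo_id], self.cloud.records[photo_id]["edits"]

    cloud = CloudLibrary()
    mac, iphone = Device(cloud), Device(cloud)
    mac.import_photo("IMG_0001", b"...", album="Flowers")
    iphone.edit("IMG_0001", "crop 16:9")
    print(mac.open_photo("IMG_0001")[1])  # ['crop 16:9'] - edit made elsewhere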

The current iCloud storage pricing would have to be updated somewhat, as images from modern cameras are huge, and I would be paying about $3,000/year for data storage with my current library (1.5 TB of images at $100/year per 50 GB of iCloud space). If the cost of iCloud space isn’t flexible, have the images live on the device that initially imported them, and have that device share them over the LAN when possible when another device requests them, much like the way AirDrop works.
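
The back-of-the-envelope math, in Python:

    # Rough yearly cost: 1.5 TB of photos at iCloud's $100/year-per-50-GB tier.
    library_gb = 1.5 * 1024      # 1.5 TB expressed in GB
    tiers = library_gb / 50      # number of 50 GB blocks needed
    cost = tiers * 100           # $100/year per block
    print(round(cost))           # ~3072 dollars per year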

Personally, I could justify paying $20/month or so for iPhoto Match – which is probably not enough to really get Apple’s attention – but professional photo houses could pay a lot more, and if iPhoto Match works as well as iTunes Match has, it would quickly become a must-have for every serious photographer on the planet.

Macro Reversing Rings

What if you could get a macro lens for your DSLR for $8?

If you already have a 50mm prime lens, you may be a lot closer than you think. With a prime from any manufacturer, all it takes is grabbing a reversing ring from Amazon, learning where the aperture control is, and learning a little more than you ever really wanted to know about light. As an aside, older lenses work better than newer ones for this: a lot of newer lenses don’t have dedicated aperture rings, so you have to mess about with the aperture lug that the camera body would normally grab ahold of. Trust me, it’s easier to use an older lens. They are often cheaper, to boot.

The way this works is to mount the lens backwards on the camera body, so instead of the lens taking a large swath of light and focusing it down on a small sensor, the lens takes a small patch of light and expands it back into the camera body. A 50mm lens will give you just about a 1:1 reproduction ratio on a Nikon DX body, which is a handy size, with a working distance of about 4 inches. The downside to all of this is that you have to give up a lot of the automatic controls. The camera body will still do auto-ISO adjustments, sure, but you have to tell it what shutter speed to use, you have to adjust the aperture by hand, and focusing involves moving the camera back and forth rather than spinning the focus ring on the lens. The results, however, are pretty good.
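
To put the 1:1 ratio in perspective: at life-size magnification, the area of the subject you capture is exactly the size of the sensor. A quick Python check, assuming a Nikon DX sensor of roughly 23.6 x 15.8 mm (it varies slightly by body):

    # At a 1:1 reproduction ratio, the frame covers an area equal to the sensor.
    sensor_w_mm, sensor_h_mm = 23.6, 15.8    # approximate Nikon DX dimensions
    magnification = 1.0                      # 1:1, i.e. "life size"
    subject_w = sensor_w_mm / magnification
    subject_h = sensor_h_mm / magnification
    print(f"Subject area: {subject_w} x {subject_h} mm")  # about a postage stamp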

This picture was taken from about 4 inches away from the flower, with an SB-600 and an SB-800 speedlight fired via Nikon’s Creative Lighting System. Both lights were turned up pretty high – 1/4 power on the SB-600 and 1/2 power on the SB-800 – as I had set the lens to f/16, and my ISO was still pretty low. Another thing to keep in mind is that it’s really hard to focus when using reversed lenses, as minor adjustments to the camera’s position change focus significantly. It’s best to use a tripod and put your lights on stands, and then move the object you are photographing in a carefully controlled way. But more on that later, if I ever get my next project finished.
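
Part of why the flashes had to work so hard: at high magnification, the effective aperture is smaller than the number on the ring. This is the standard bellows-factor calculation, nothing specific to these lenses:

    # Effective aperture at macro distances: N_eff = N * (1 + magnification).
    # At 1:1 with the ring set to f/16, the lens passes light like f/32,
    # roughly two stops darker than the marked aperture suggests.
    import math

    marked_f = 16.0
    magnification = 1.0
    effective_f = marked_f * (1 + magnification)
    stops_lost = 2 * math.log2(effective_f / marked_f)
    print(f"Effective aperture: f/{effective_f:.0f}, ~{stops_lost:.0f} stops lost")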

This shot of my eye was taken by a friend, who was also shining a small LED flashlight into my eye to help her see when I was in focus. The SB-800 was mounted in a 24″x24″ softbox right next to my face. You can see the flashlight, the camera and the softbox in the reflection in my eye, which is kind of weird. After 60 shots, I still wasn’t quite where I wanted to be in terms of picking up fine details of the iris. Perhaps I can get someone to sit still for me over the weekend and let me burn their optic nerve out while I take countless pictures… of course, when I phrase it that way…

Finally, this is a detail shot of the mount on the back of my Sigma 70-200. Again I had the SB-800 in a softbox, I think at 1/4 power or better, taking shots while waiting for my eyes to stop burning. I’d never thought of this lens as being all that dirty, but looked at in the light of f/16 and a 1:1 reproduction ratio, it’s pretty damn nasty. I guess this weekend I’ll be cleaning all my lenses and looking for this kind of filth.

Focus, Damnit

OS X has a focus problem. It’s easier to explain with the following screen capture:

I’ve left out the menu bar and the Dock, just to emphasize the following point: which window is in focus? The Terminal window or the Finder dialog? If you hit the “enter” key, do you get a newline in the Terminal, or do you acknowledge the “unsafe device removal” message?

Of course, you enter a newline in the Terminal. This is but one example of the madness, and it seems to be worse in Lion than it was in previous releases of OS X.

AppleTV and Mountain Lion

Apple just announced something that has gotten my full attention, and may change my computer buying plans in the short and long term: the next release of OS X, dubbed Mountain Lion, allows a Mac to share its screen via an AppleTV at 720p, over the wireless network.

I know there are lots of easy ways to hook a Mac up to your television, or cross-convert your media library to work on an AppleTV, but this setup allows seamless access to any video content on your Mac from your living room. No more worries about using Handbrake to make a copy for your iPhone and then a second copy for your AppleTV – you just load the media into iTunes and it plays back over the AppleTV via screen sharing.

At the moment screen sharing appears limited to 720p, but I imagine that as 802.11ac is developed, and the next generation of the AppleTV is released, 1080p will be supported. It also means that I now want to keep my Mac Pro when I get the iMac, as I can easily hook the machine up to the TV in the living room without having to find space in the TV cabinet for the Mac Pro and all its bulk, heat and noise.

Keeps a man thinking, right?

Low Cost Wireless Mesh Networking

I’ve noticed that each of the three iPhones I’ve owned over the last three years has had a problem handing off to the 3G network when the WiFi signal is too weak to use. In other words, when the signal strength on the WiFi network is very low, the iPhone will still prefer WiFi to 3G, even if the WiFi connection is unusable. This happens every time I get to the far side of the house, but it also gets me when I’m on the bus and signal is coming and going, or when I’m walking around outside at work: I can still see a little of work’s network, but I can’t use my phone without disabling WiFi.

I can’t fix the problem at work – I’m not in control of that – but I can fix things at home. Or so I had thought, with Apple’s mesh networking implementation in their Airport line of products. In previous iterations of my home networking setup, I’ve tried using an Airport Express to extend the network of an Airport Extreme. While it works, it’s not the most reliable thing, and as I have older Airport hardware, I can’t run simultaneous 802.11g and 802.11n networks while keeping things running at a decent speed. Finally, the cost to upgrade to the newest line of Apple networking products is substantial – the current generation Airport Express is $99, and the current generation Extreme is $179. Poking about online, I think I have the answer I need: Open Mesh.

Specifically, the MR500 Mesh Router from Open Mesh. It’s $99 per unit, and cleverly runs two CPUs – one for user traffic, and one for mesh traffic. It also uses both the 2.4 GHz and 5 GHz bands – but the 5 GHz band is reserved for mesh traffic. This is all done in the name of enterprise-grade reliability – the packets must keep flowing. The system also supports multiple gateway devices from the wired network to the wireless network, even ones on different ISPs, so failover between networks should be seamless. There are also provisions for per-user rate limiting, public and private networks, and self-restarting hardware if the software stack crashes. All for the $99 per-unit cost. It also has a built-in 5-port 10/100 switch, so you can use it to bridge physical network segments.

The only downside to the system, as best I can tell, is that all the management and configuration of the devices is done on the Open Mesh website. They run a web app that you create an account on, and when a device powers on, it looks up its MAC address on the service and configures itself accordingly. My concern is for the times my link to the internet is down but I still want to stream movies and music inside the house. That is one example, and I’m sure others would have better ones, but it’s a concern. I need to do a little more digging and see if the device will just come back up under its old settings, on the assumption that nothing should change if it can’t talk to its master. I’d feel a little more comfortable if I could run a local version of the cloud management application, but nonetheless, this is still worth looking at.

Configuring a new OpenAFS cell on Ubuntu Server

After doing a lot of reading online, I’ve finally built a new OpenAFS cell on my home network. This involved reading piles of terribly out-of-date documentation and HOWTOs, and a lot of frustration on my part. I help administer OpenAFS at work, but I’d never set up a new cell before, and the process of working out the right Kerberos keys and setting up the servers with bos was throwing me for a loop.

Eventually, I found the guides hosted on spinlocksolutions.com, which have a very clear walkthrough, with up-to-date explanations of how to set up a Debian-based OpenAFS cell, as well as a Kerberos 5 realm and an OpenLDAP server. They can be used with Ubuntu Server almost without alteration. This is good, as I have a passing knowledge of Ubuntu from previous projects.

However, there were two gotchas I ran into when doing an install with Ubuntu Server:

  1. The Ubuntu installer had added my FQDN to /etc/hosts with an IP address of 127.0.2.1, which makes the afs-newcell script lose its mind when trying to create the CellServDB entry it needs (see the /etc/hosts fix sketched after this list).
  2. The copy of the afs-newcell Perl script delivered by apt-get has an error in it: it doesn’t add the -noauth flag when creating the dafs server, so you will see errors about not having permission to create the dafs server. Simply edit the script and add the ‘-noauth’ flag.
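
For the first gotcha, the fix is simply editing /etc/hosts so the machine’s FQDN resolves to its real LAN address instead of the loopback entry the installer wrote. Something like this – the hostname and addresses here are examples, not my real ones:

    # Before (as the Ubuntu installer left it):
    #   127.0.2.1   afs1.example.com afs1
    # After (so afs-newcell sees the machine's real address):
    127.0.0.1      localhost
    192.168.1.10   afs1.example.com afs1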

Once afs-newcell runs properly, everything else goes exactly according to the documentation – afs-rootvol creates things as it’s supposed to, and the various Debian/Ubuntu packages to configure PAM work as expected.
