A fun, well-made app for recording and rating what you’re drinking. I’ve been testing it out for a few weeks and I’d love to see something like this take off because the data could be extremely useful at scale (not just for individual drinks, but for bars overall).
We were so busy launching and tuning Elixr that I almost forgot to tell you guys about it! The Mobelux team has been designing and developing Elixr for the past few months and we couldn’t be happier with our 1.0.
I’m thrilled to see that people are not only using Elixr, but really understanding the reason behind it. There were a few decent ways to catalog the drinks you were having, but nothing comprehensive that let you share those experiences. And while sharing is core to Elixr, there’s much more than that under the surface. Every time you post and rate a drink you’re helping to build a worldwide database of the best places to enjoy a drink, whether it’s a hand-crafted Old Fashioned or the newest ale from Dogfish Head.
Well, that was fast.
Roughly one year ago Mobelux launched Carousel. It was a fairly simple idea: make a straightforward way to view Instagram on the desktop throughout the day. This post is a retrospective of the process involved in shipping that app.
Here’s a very early (and rough) look at the original concept I threw together on a lazy Sunday in March 2011.
While hideously simple, this first comp contained all the major elements of what Carousel would become. A single scroll view that held photos. Photos would present basic information and controls to take action on them. A toolbar for secondary actions and scope switching. It was very important that the design didn’t look and feel like an email client with Instagram photos in it. Packing the app with mediocre features and extraneous UI was unacceptable.

I showed the first comp to Eddie a few days later and we had a more serious conversation about taking the design to the next phase. At first we were a little hesitant. Deciding to make a new app is not a simple decision for a young company. Risk and cost are serious factors to consider. Not only that, Mobelux had never shipped a Mac app before. It took a little time to come to terms with the complexities of a menu bar, keyboard commands and a dynamically resizable window. Ultimately we decided that it was worth a shot.
As with all software ideas it quickly ballooned in scope. What about saving photos? Comment moderation? The biggest drawback we had to deal with was that you couldn’t post a photo outside of the official Instagram app. What can we do well on the desktop that can’t be done elegantly in the official iPhone app? We thought hard about the things that would make a desktop app successful and came up with a feature list. Fast scrolling. Commenting and moderation. Liking. Viewing larger. Searching. Saving. It didn’t take long until we had to scale back all those features to a solid 1.0 and tuck the other ideas away for future updates.
Once we’d nailed down what it would do, we started talking about how the app would look and feel. Something about Instagram filters and the photos that resulted from them had always conjured up images of a Wes Anderson movie. So we went with it. It was a bit risky settling on such whimsy as the inspiration for the visual design of a Mac app. We knew that purists might take issue with the direction. But this wasn’t Mail or iTunes. Why not have a little fun? Eddie worked up some comps of what it might look like if we brought the textures and palette of the Tenenbaums’ home from The Royal Tenenbaums to the main window. We settled on the name Instaview for the app and got to work.
Not quite there, but it seemed to be moving in the right direction. Looking back I can’t believe we even entertained that giant label at the top. After a few design sessions we honed it to something more focused and somewhat traditional.
One of our favorite touches was that the frames matched the filter that was used. This was very important early on, when photo frames weren’t optional on Instagram. It integrated each photo into the scroll view and made the feed feel coherent. Each photo frame includes a fair amount of distressing, adding to the aged aesthetic that the filters suggest.
With a design we were quickly becoming happy with, it was time to scope the project and get to work building it. Initial discussions revolved around building a framework mimicking iOS’s UINavigationController and UITableView. Shortly after we had that conversation, Iconfactory announced Chameleon, a clean-room implementation of UIKit. The timing was, needless to say, impeccable. Jeremy immediately got to work seeing how viable it was to use Chameleon as the core of Instaview. It wasn’t long before we had a working model with push-pop navigation, asynchronous fetching and popover support. It was time to wrap the Instagram API and start using real data.
In just under two months Instaview was almost ready to ship. As I started setting up all the accounts involved in branding, it became apparent that using “insta” or “gram” in the name was going to be an issue. While it instantly tied our app to the Instagram bandwagon, it didn’t feel unique. What if Instagram took issue with all the insta-infringers? What icon were we going to come up with that wasn’t a camera or a polaroid? Weren’t we getting sick of seeing all the tan and brown, rainbow-striped ripoffs out there? Furthermore, what if Instagram was acquired and shuttered? Wouldn’t we want to brand so that we could potentially switch services if there was a catastrophe? There had to be a better identity we could come up with.
I was driving around town with Emily and started explaining the issues with the name. I asked if she could think of anything that might work with the visual theme and the idea of artificially aged photos. She told me that in the pathology lab they routinely looked at slides on a dusty old Kodak projector from the late ’70s. It was called a Carousel. The name clicked. It was perfect. I immediately got to work on an icon and came up with a toyed-up version of a Carousel projector (we’ll save that process for a different post).
Carousel shipped on May 11th, 2011. It’s been a ton of fun to work on over the past year. A few months ago we shipped an update that brought us up to date on all the features we set out to ship last March, including support for five languages. We’re very happy with where the app is. But don’t worry, we’re not resting on our laurels.
Now it’s time to get to work on Carousel 2.
Well, that was fast.
The update should propagate through DNS and appear in extension updates soon, but if you’re impatient:
On a side note, it’s remarkable how convenient it is to deploy software without having to go through official app review first. It’s almost enough to make a fellow think about writing some non-mobile software.
Ok, maybe I should expound beyond sweet christbabyjesus no. I have a feeling this could turn into one of those posts that needs more explanation.
Bottom line: you can’t compile ActionScript 3 into an iPhone app. Adobe has written some type of selective AS3-to-ObjC translator. Reasons that you don’t want to use that:
All those things add up to an unreliable entity becoming your single point of failure. And lest we forget, Adobe can barely write Objective-C apps themselves. We’re still waiting for an update to CS4 that makes it not crash when you move the mouse too fast. You really want to trust them to manage your memory, translate your code and keep up with Apple’s SDK?
Let me know how that works out for you.
There she goes, off to the magical land of App Store review.
Calling this release just a 1.1 was hard because there’s so many ne–
Well, you’ll see.
Don’t worry if someone else is already working on your idea. I’m certain they are, but they are decidedly not you and it’s the you that makes your idea unique.
Whether you’re successful or not, it’s a terrific way to get in a lot of trouble. There’s a long list of established rules and regulations that you violate with your creative impertinence, but it feels great, right?
Trusting your gut and charging forward. It can be addictive.
Do yourself a favor and follow these simple instructions at least once in your life. It’s both incredibly liberating and intensely terrifying.
You’ll be sorry if you don’t.
There’s nothing more annoying than when Apple does something with their software that you can’t do. They legitimize it by telling you that they want to get the functionality near-perfect before releasing it for general use. I get that. But when they do something that I really need to be able to do and can’t, it’s very frustrating.
Case in point: video. The way you get video from the 3GS is really quite odd. You use a class called UIImagePickerController. You have to set a non-obvious property to get it to run in video mode. Then the OS brings up a modal view that allows you to take/get the video.
So what? They’re forcing video to fit into a class it wasn’t meant for. Big deal, right? If there’s anyone guilty of abusing classes, it’s me.
But there’s a real problem here. Photos and movies are inherently different. In image mode, the UIImagePickerController delegate gives you back UIImage objects (as the name of the class leads you to believe it would). UIImage objects contain all kinds of data about themselves that are critical to have when manipulating images. They contain orientation data. You can do things like save the image to the camera roll. Find out color space info. Etc. When UIImagePickerController is run in video mode, all you get back is an NSURL. A location on disk.
There’s no pertinent information I can glean from that. Generally, if you know what kind of file you’re going to be dealing with, you can pull that data into a Cocoa Touch object that has all these extra attribute accessors on it (like, say, UIImage). But these video files are different. There’s no UIMovie object in Cocoa Touch. No UIVideo, either. Apple doesn’t want you mucking about with video files on the iPhone, so all they give you is the location. And in most cases the location is fine. You can kick off an MPMoviePlayerController instance and pass in that NSURL. Boom! Full screen video. It even knows which way is up.
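The round trip described above looks roughly like this — a minimal Objective-C sketch against the iPhone OS 3.x APIs, with error handling and memory management omitted (assume the presenting view controller adopts the picker’s delegate protocols):

```objc
#import <MobileCoreServices/MobileCoreServices.h>
#import <MediaPlayer/MediaPlayer.h>

// Presenting the picker in video mode. The non-obvious property is
// mediaTypes: without kUTTypeMovie, the camera only takes stills.
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
picker.delegate = self;
[self presentModalViewController:picker animated:YES];

// In image mode the info dictionary hands you a full UIImage under
// UIImagePickerControllerOriginalImage. In video mode, all you get
// is a file URL on disk.
- (void)imagePickerController:(UIImagePickerController *)picker
    didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSURL *movieURL = [info objectForKey:UIImagePickerControllerMediaURL];
    MPMoviePlayerController *player =
        [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
    [player play]; // full-screen playback; it even knows which way is up
    [self dismissModalViewControllerAnimated:YES];
}
```

Once playback is done (or you want the raw bytes), that NSURL is all you have to work with — no orientation, no color space, no metadata accessors.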
But what if the video was taken upside down or sideways? The iPhone knows which way is up and should tag movies correctly, right? Look at the video uploading functionality in the Camera app. You can go straight to MobileMe or YouTube. You can even run compression on the videos before they’re uploaded (!). How is Apple doing all of that?
It’s because there’s a private class that Apple wraps around video objects, and you and I aren’t cool enough to use it. And it’s really starting to piss me off.