Meet The Developers: Emmanuel Crouvisier from Tagg

It’s funny how small this world is. Take, for example, Emmanuel Crouvisier, the owner of Emcro and developer of Tagg, a new application that lets you quickly and easily tag the pictures taken with your iPhone using facial recognition. I worked with him a few years back when we were both at a magazine in Arizona, and then I went my way and he went his, but we kept in touch. Flash forward to today: not only has he built quite a few apps for other companies, but he’s branched out on his own and developed Tagg for the iPhone, pushing iOS 5 to the max. With the launch of his new app, we figured it was a good time to sit down and talk about development, what he’s doing with Tagg and the future of app development. Enjoy.

What is Tagg?

Plain and simple, Tagg is an application that lets you take a picture, then associate the people in it with contacts in your address book. It works much like iPhoto: it automatically picks out the faces in the image, and you can also post those pictures to Facebook or Twitter. It gets your images organized quickly and easily, with no fuss.

Tagg

The Interview

How did you get into software development?

I started playing with computers way back when I was 7 or 8, right around the time the Mac was coming out. I was playing with some old IBM that taught me some really simple command-line work, and I started doing some basic programming back then. I was very heavily involved in the BBS scene as well, developing back in the early ’90s or so, and I wrote my own BBS software called Fusion, which was based on another piece of software called Shockwave. Things kind of just went from there. I’ve always dabbled in various computer things, development of sorts, and then I ended up moving into iOS development once Apple released the SDK.

What was the first app you developed for iOS?

For iOS, it was the 944 app. I was working for 944 Media, and they asked me to bring the magazine to the iOS platform. Originally it was going to be developed for the iPhone — this was before the iPad came out — and it was just a quick way to get around the city and get access to 944’s photo galleries, not necessarily to bring the whole magazine to the iPhone. We did a little bit of work on that, fell behind a little bit (it fell off the priority list), and then the iPad was announced, and that really changed a lot of things. All of a sudden the iPad was a great new platform that made a lot of sense for magazines. As soon as it was announced we picked development back up on the iPhone and iPad, and launched the initial version of 944 Magazine on both shortly after the iPad came out.

How did that work out?

Generally, the app was very well received. We went out to all 10 of 944’s markets, it was a totally free app with all free content, and we had something like 7,000 users regularly downloading the app every month when it updated.

Automatically detect faces and assign them to contacts.

The platform we built for 944 required full app updates for each new issue, rather than a server architecture that could deliver new issues over the air. When Newsstand was released, and 944 had ceased to exist [944 Media was bought out and then shut down by Sandow Media in 2011 -Ed], Newsstand changed the game for the publishing industry enormously.

Once 944 shut down, you moved on with the acquisition company to produce their iPad and iPhone content. How has Newsstand helped you with developing further applications for the iOS platform?

It’s been tremendous. Looking at the graphs, downloads yesterday were literally 100 times what they were on the same day two weeks ago. Basically, that’s all due to Newsstand. We had a huge, huge spike on the day iOS 5 was released, and that was across all of the applications. Right now we’ve got three apps in there that were all ready for Newsstand on launch day: New Beauty magazine, Worth magazine and Vegas/Rated magazine. One of them made it as a Top 5 Newsstand app for the iPhone, which just about doubled the already huge spike of traffic from the initial Newsstand launch, so that has helped enormously as well.

When you were at 944, you were doing some work with Facebook integration as well, right?

Absolutely. That’s where I got into a lot of the social networking aspect of it. I got the idea after Apple announced face tagging in iPhoto, but when they announced it they weren’t ready to ship the product. When I heard the announcement, it was kind of a “light bulb” moment, and I realized I could probably do the same thing using Facebook’s SDK. I dug through the documentation overnight, literally, and started working on it the next day. About 10 days later I had a really good working prototype, and had a lot of back and forth with the Facebook Connect guys. They were really excited about what we were building, because we were the first company using the API to do that. In about two weeks or so we went ahead and launched it, and to date we’ve had something like 300,000 photos on the 944 website.

Is that what led you down the path for developing Tagg?

That actually came about to solve a problem I ran into myself. At some point I went to restore my iPhone and the backup was taking forever. I looked to see what was going on, and it was because I had amassed 1,800 photos in my photo library. A lot of those were photos I had taken out and about with friends, which I kept meaning to sync to iPhoto on my computer one day, find the faces, share them with Facebook and go through that whole process. Then I realized that there had to be a better way: if I could find a good way to tag the photos, it would solve the problem right on the device.

Uploading and sharing images is simple.

I had that idea brewing for a while before iOS 5 was announced. One day I was driving across the country, watching some of the videos from WWDC, and one of them talked about the ability to extract a face from an image. Suddenly I had an idea: I could just take a picture, the app could automatically detect where the faces are, and the tagging process would become really fast and really efficient, a no-brainer. You basically take a picture, the app goes through and finds the faces, and then you can tap on a face and tag away from there.
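The capability Emmanuel describes shipped in iOS 5 as Core Image’s `CIDetector` API. As a rough illustration (not Tagg’s actual code), a minimal sketch of the take-a-picture-then-find-the-faces step might look like this; the `photo` parameter and `faceBounds` function are hypothetical names for the example:

```swift
import CoreImage
import UIKit

// A minimal sketch of face detection with Core Image's CIDetector,
// the API introduced in iOS 5. Returns the bounding box of each
// detected face, which an app could overlay with tap targets for tagging.
func faceBounds(in photo: UIImage) -> [CGRect] {
    guard let ciImage = CIImage(image: photo),
          let detector = CIDetector(ofType: CIDetectorTypeFace,
                                    context: nil,
                                    options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
    else { return [] }

    // Each CIFaceFeature describes one detected face in image coordinates.
    return detector.features(in: ciImage).compactMap { feature in
        (feature as? CIFaceFeature)?.bounds
    }
}
```

Note that `CIDetector` finds faces but does not identify whose face it is; matching a face to an address book contact is the part the app layers on top.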

What’s the response been like so far?

We made it into the App Store on iOS 5’s launch date, which was really important. We’re still going through some marketing efforts right now, but we’ve had a few hundred sales. I’m planning a big marketing push in the next few weeks, and we have a few celebrity tie-ins to work in, too. A writer friend of mine who works for Comedy Central wrote a lot of funny bits for the app, so whenever you tag a celebrity in a picture, a funny saying comes up. That should be a good draw. We’re just working on getting the word out for the app.

What kind of advice would you give to someone looking to get into iOS development?

The most important thing is to just get started. A big struggle at first is getting your head around some of the naming conventions and some of the class names that make a lot of sense when you read them but can be 40 characters long. Xcode makes it much less painful than it could be; Apple has done a great job of supporting developers in that way, and their documentation is really good. Definitely watch the WWDC sessions; they’re fantastic, with really good example code. Reading a couple of books on Objective-C development helps, too. Just start building apps — that’s the most important thing.
