Category Archives: Apple Inc.

AirPlay adapter in the news

If you’re not yet aware, the guys at Panic found something recently. Check that link and read up; it won’t take too long. What’s pretty cool is the comment someone left, reproduced below. That’s pretty awesome if true.

AirPlay is not involved in the operation of this adapter.

It is true that the kernel the adapter SoC boots is based on XNU, but that’s where the similarities between iOS and the adapter firmware end. The firmware environment doesn’t even run launchd. There’s no shell in the image, and there are no utilities (analogous to what we used to call the “BSD Subsystem” in Mac OS X). It boots straight into a daemon designed to accept incoming data from the host device, decode that data stream, and output it through the A/V connectors. There’s a set of kernel modules that handle the low-level data transfer and HDMI output, but that’s about it. I wish I could offer more details than this but I’m posting as AC for a damned good reason.

The reason this adapter exists is that Lightning is simply not capable of streaming a “raw” HDMI signal across the cable. Lightning is a serial bus. There is no clever wire multiplexing involved. Contrary to the opinions presented in this thread, we didn’t do this to screw the customer. We did this specifically to shift the complexity of the “adapter” bit into the adapter itself, leaving the host hardware free of any concerns about what was hanging off the other end of the Lightning cable. If you wanted to produce a Lightning adapter that offered something like a GPIB port (don’t laugh, I know some guys doing exactly this) on the other end, then the only support you need to implement on the iDevice is in software, not hardware. The GPIB adapter contains all the relevant Lightning -> GPIB circuitry.

It’s much the same thing with the HDMI adapter. Lightning doesn’t have anything to do with HDMI at all. Again, it’s just a high-speed serial interface. AirPlay uses a bunch of hardware H.264 encoding technology that we’ve already got access to, so what happens here is that we use the same hardware to encode an output stream on the fly and fire it down the Lightning cable straight into the ARM SoC the guys at Panic discovered. AirPlay itself (the network protocol) is NOT involved in this process. The encoded data is transferred as packetized data across the Lightning bus, where it is decoded by the ARM SoC and pushed out over HDMI.

This system essentially allows us to output to any device on the planet, regardless of the endpoint bus (HDMI, DisplayPort, and any future inventions), by simply producing the relevant adapter that plugs into the Lightning port. Since the iOS device doesn’t care about the hardware hanging off the other end, you don’t need a new iPad or iPhone when a new A/V connector hits the market.

Certain people are aware that the quality could be better, and others are working on it. For the time being, the quality was deemed suitably acceptable. Given the dynamic nature of the system (and the fact that the firmware is stored in RAM rather than ROM), updates **will** be made available as part of future iOS updates. When this will happen I can’t say for anonymous reasons, but these concerns haven’t gone unnoticed.

Eureka! Mac & iOS Communication

Getting a Mac OS X application and an iOS application to communicate with one another has been the bane of my developer existence for several years now.

I have made several attempts at getting it working, and I may have come extremely close in the past, only to give up and toss the projects.

I’ve been able to get multiple different iOS applications to communicate over Bluetooth and it’s served me well in many instances. But that elusive iOS to OS X thing hung over me like a cloud filled with human-processed refried beans. However, I am happy to report that the cloud now contains daisies (not filthy feces) and has released its pleasant bounty on my developer brow. Problem solved.

I finally have an OS X and iOS application that connect using Bonjour and communicate with one another over Wi-Fi. This is a big box that I can now check off my list of developer things to do outside normal projects.
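For the record (and for future me), the plumbing boils down to something like the sketch below. This isn’t the exact code from my apps – the service type, property names, and delegate wiring are placeholders – but it shows the shape of it: the iOS side publishes an NSNetService over Bonjour, and the OS X side browses for it, resolves it, and grabs the streams to talk over Wi-Fi.

static NSString * const kServiceType = @"_myapp._tcp.";   // placeholder service type

// iOS side: publish a Bonjour service and listen for incoming connections.
- (void)startAdvertising
{
    // Empty name = use the device name; port 0 = let the system pick one.
    self.publishedService = [[NSNetService alloc] initWithDomain:@"local."
                                                             type:kServiceType
                                                             name:@""
                                                             port:0];
    self.publishedService.delegate = self;   // NSNetServiceDelegate
    [self.publishedService publishWithOptions:NSNetServiceListenForConnections];
    // When the Mac connects, -netService:didAcceptConnectionWithInputStream:outputStream:
    // hands over the stream pair on this side.
}

// OS X side: browse for that service type on the local network.
- (void)startBrowsing
{
    self.browser = [[NSNetServiceBrowser alloc] init];
    self.browser.delegate = self;             // NSNetServiceBrowserDelegate
    [self.browser searchForServicesOfType:kServiceType inDomain:@"local."];
}

- (void)netServiceBrowser:(NSNetServiceBrowser *)browser
           didFindService:(NSNetService *)service
               moreComing:(BOOL)moreComing
{
    self.foundService = service;              // keep a strong reference while resolving
    self.foundService.delegate = self;
    [self.foundService resolveWithTimeout:5.0];
}

- (void)netServiceDidResolveAddress:(NSNetService *)service
{
    // Once resolved, grab the paired streams and start exchanging data over Wi-Fi.
    NSInputStream *input = nil;
    NSOutputStream *output = nil;
    if ([service getInputStream:&input outputStream:&output]) {
        [input scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
        [output scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
        [input open];
        [output open];
    }
}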

Many of you might be thinking to yourself, “What’s the big deal? That should have been pretty easy!” Well, I am and continue to be a bit of a networking apprentice. It’s never been my strong suit, and I’m sure it never will be. I love being able to get things to talk with one another by various means, but when it involves networking I panic a bit. I stumble. It’s the least fun part of what I might be doing.

I do, however, want to understand how it works, so I dig into Stack Overflow, Google, developer documentation, etc. looking for answers to certain things, then learn from what I read and attempt to implement it, trying to understand what’s going on under the hood. I don’t like to just copy and paste a bunch of code that works without wanting to learn why it works.

Anyway – I am really happy today.


iPad mini BT keyboards? Consider the one that might be on your desk already.

We’ve all been to meetings where people have iPad minis laid out on the conference table, covers rolled up to allow a typing angle. Lots of on-screen typing, but when an edit is required everything comes to a grinding halt because the process is quite slow.

When the use is casual, this isn’t a problem. But in a meeting you want to be able to enter and edit text in a timely manner, so you aren’t pulled out of the discussion while you struggle with a previous point.

Nice applications.

Kaleidoscope for OS X.
A diff application for text, images, and folders. Integrates with many tools. $34.99 for OS X. It would be nice if it integrated with Xcode, but I don’t really use version control within Xcode itself – I use Versions, and Kaleidoscope integrates perfectly with that. I might need to pick this up for our team.

Finish for iPhone.
Overcome the clutches of procrastination with Finish, a busy iPhone user’s best friend. Unlike other to-do apps that are “clever” for their own sake, only Finish takes advantage of how you naturally think. Finish gets in your face when you need it, stays out of the way when you don’t, and effortlessly keeps you focused on the only thing that matters.

Scanner Pro for iOS.
Turn your phone or new iPad into a portable scanner. Scan documents, receipts, and whiteboards. Who hasn’t needed to do this from time to time? Scans are saved as PDFs. What makes this interesting is really only the ability to combine “scans” into a single PDF.

It’s a little pricey… wondering if I should roll my own, using OCR technology or something. Probably impossible for converting really terrible scribbles and handwriting into anything useful.

UIGestureRecognizers – sometimes it’s okay to simply say no.

I’ve been working on a project that was using UIGestureRecognizers – mainly Pan but some others as well. As it turns out sometimes it’s perfectly fine to go old school and use touchesBegan, touchesMoved, and touchesEnded.

You won’t go to developer jail, and you won’t be looked at unfairly by other developers during a code review. Use the tools you need without introducing seeming simplicity that ends up complicating matters. Gesture recognizers are nice when needed, but don’t jump to the conclusion that you should always prefer them to the older approach.
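To make that concrete, here’s a tiny sketch of the old-school route: no recognizer at all, just the plain UIResponder touch methods. The draggableView property is a made-up stand-in for whatever you’d actually be moving.

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint current  = [touch locationInView:self.view];
    CGPoint previous = [touch previousLocationInView:self.view];

    // Nudge the view by the delta since the last touch event,
    // which is essentially what a pan recognizer would hand you anyway.
    CGPoint center = self.draggableView.center;
    center.x += current.x - previous.x;
    center.y += current.y - previous.y;
    self.draggableView.center = center;
}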

A little bit of a rant and also a revelation.

Development vs. Design

We have all been there. We envision a widget or a piece of UI and how it might work. It takes all of a few moments. So you jump into header and implementation files, coding and stubbing, refactoring and turning methods into classes. You leave comments to remind yourself to do some less-than-fun things. Then the music in your headphones starts to confuse you.

You drop the cans, push back from your monitors and mutter to yourself, “What the fuck am I doing?” You feel confused, wondering if what you wrote makes sense in the context of the overall problem-solution game. Have you painted yourself into a corner? Or are you on the right path and really close to the end game?

iOS landscape mode & positioning elements

This is a documented note to myself and to others who might be pulling their hair out a little when working with a landscape orientation in iOS.

I was working in landscape mode (only), positioning things based on frame origins, and I was finding my placements were off. To get into landscape, autoresizing masks are applied to the view(s), but viewDidLoad is too early to count on any frame values for your calculations, and viewWillAppear is also too soon. Use viewDidAppear instead. I’d forgotten about that method since I almost never seem to need it and it’s not part of the default Xcode template for a view controller. I spent nearly an hour wondering what the hell was going on – simple math, right? How can it be off like that?
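The fix looks something like the sketch below; controlPanel is a stand-in name for whatever view you’re placing, and the frame math is made up for illustration.

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    // By this point the autoresizing masks have been applied and
    // self.view.bounds reflects the real landscape size, so frame
    // origin math based on it comes out right.
    CGRect bounds = self.view.bounds;
    self.controlPanel.frame = CGRectMake(bounds.size.width - 200.0f,
                                         0.0f,
                                         200.0f,
                                         bounds.size.height);
}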

My doh moment for this Friday. I can’t stand when I miss things like this and just piddle coding time away chasing something so rudimentary.

UIPanGestureRecognizer starting touch point

The UIPanGestureRecognizer is a very handy gesture recognizer. It gives us velocity, it has various states, and it can form the basis for some interesting interactions in your iOS applications. One thing it’s not very good at is telling the developer the initial touch location for a recognized pan operation.

By the time UIGestureRecognizerStateBegan fires, the user’s finger has already moved a bit in order to actually trigger your handler. Depending on how fast the user moved at the start of the pan, that CGPoint could be fairly close to or pretty far away from the actual starting position.

I have been working on a system that connects iOS devices, where panning on a controlling application affects the others in certain ways, and I needed to know the exact starting position even before the pan was recognized. Since entering the convenient world of gesture recognizers in iOS, I had nearly forgotten about touchesBegan, touchesMoved, and touchesEnded.

Pair the UIPanGestureRecognizer with touchesBegan and you’ll have exactly what you need. In my case I had a specific area that should accept the touches, so instead of using hitTest I did something like this, which works very well:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Where the finger first landed, in the main view's coordinate space.
    CGPoint locationPoint = [[touches anyObject] locationInView:self.view];
    // Convert into the trackpad's coordinate space so we can test containment.
    CGPoint viewPoint = [trackPad convertPoint:locationPoint fromView:self.view];
    if ([trackPad pointInside:viewPoint withEvent:event]) {
        NSLog(@"--> %.2f", viewPoint.y);
    } else {
        NSLog(@"touched outside trackpad");
    }
}
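
To tie it back to the recognizer itself, you can stash viewPoint from the method above into a property (call it panStartPoint; that name is my own, not from the project) and read it back when the pan actually begins:

- (void)handlePan:(UIPanGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        // The recognizer's own location has already drifted by now,
        // so use the point captured in touchesBegan as the true origin.
        NSLog(@"pan started at %@", NSStringFromCGPoint(self.panStartPoint));
    }
}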

Wonderful.

Wireless charging for iPhone 5 comes when exactly?

I would enjoy a wireless charging system for my iPhone 5 (and potentially a few other devices – a mat large enough for an iPad mini would be extra nice) to make night-time charging at home more elegant. USB cables sprouting from an outlet definitely work, but they’re messy looking.

Duracell Powermat

The Duracell Powermat is close to what I am after – I’ve checked on their solutions and they have something coming soon™, whenever that is. Anyone have any insight into this? While Binging (I don’t do the Google thing as much anymore – an evil topic for another time), I discovered that Apple has applied for a patent in this arena.

Apple states that the patent covers methods, systems, and apparatus for interacting between a plurality of peripheral devices receiving power wirelessly from a wireless power supply. In one embodiment, a virtual charging area could be created, extending to about one meter (roughly three feet) from a central station that incorporates an NFMR (near-field magnetic resonance) power supply.


iPad mini review

I didn’t receive my 16 GB iPad mini until the Thursday after Christmas because it had to be shipped and there was a serious run on them. I had given my old iPad away, so I simply restored from an iCloud backup, and I have been using it since.

Siri on the iPad is pretty nice. I use it on my iPhone quite a lot, mostly for reminders, appointments, iMessage, etc. When you need it, it’s pretty great. Same on the iPad.

The size. I do have to admit to missing touch targets every now and then because of the decreased size of UI elements. Nothing major, but even when it happens once it’s a pain in the ass. The keyboard has never given me problems; it’s areas within applications. So that bit isn’t the most fun. I can put this into a back pocket, though with the fear of sitting down and forgetting it’s there. It’s quite portable and every bit an iPad.

The screen. It’s not quite Retina, but it’s something I haven’t noticed very much, to be honest. Every now and then I might see some white text on a black background that is small enough to look blurry. Quite rare. The screen is decent and utilitarian. I have no qualms.

I use the iPad even more now, and I plan on using it at work for taking notes and the like, even more than I do the iPad 3 I have kicking around there. Lugging that one was always the drawback.

More as I continue to use it, but I love it. Writing this on it in landscape right now.