A Jackrabbit Labs Experiment
Hacking a drone for the hell of it
We’re avid Apple Watch users, and more than a few of us are obsessed with drones, so we took the logical step and recreated The Force.
OK, not really. But we wanted to control a Parrot drone using the accelerometer built into the Apple Watch, so we built it.
The concept is really simple. You stream the x, y and z accelerometer readings from the watch to an app running on the paired phone, and whenever those values match one of the predefined control gestures, the phone sends a command to the drone.
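The loop described above can be sketched in plain Swift. `AccelSample`, `GestureStreamer`, and the closures below are our own illustrative names, not part of Apple's or Parrot's APIs; on the watch the `process` callback would be fed by Core Motion, and `send` would post a message to the phone.

```swift
import Foundation

// One raw accelerometer reading, in g units as the watch reports them.
struct AccelSample {
    let x: Double
    let y: Double
    let z: Double
}

// Forwards a command only when the recognized gesture changes, so the
// drone isn't flooded with duplicate commands at the sensor's sample rate.
final class GestureStreamer {
    private let match: (AccelSample) -> String?   // sample -> gesture name, or nil
    private let send: (String) -> Void            // e.g. a message to the phone
    private var lastGesture: String?

    init(match: @escaping (AccelSample) -> String?,
         send: @escaping (String) -> Void) {
        self.match = match
        self.send = send
    }

    func process(_ sample: AccelSample) {
        let gesture = match(sample)
        if let g = gesture, g != lastGesture {
            send(g)
        }
        lastGesture = gesture
    }
}
```

The debounce on `lastGesture` is a design choice: accelerometer updates arrive many times per second, but the drone only needs to hear about a gesture once per transition.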
Below is some insight into the process, which we hope is enjoyable and educational. If you’re looking for a straight tutorial, we put together a step-by-step GitHub tutorial for anyone interested in being one with the Force; the link is below, and you can also check out our Gesture Guide video for reference.
Exploring Parrot’s ARSDK3
Our initial purchase was a Parrot ARDrone 2.0, selected because it’s one of the most affordable drones on the market and comes with an iOS SDK. However, we quickly realized that ARDroneSDK2, which is designed to work with the drone we purchased, doesn’t actually compile against the latest version of iOS. We might have been able to go back to Xcode 6.2 to compile ARDroneSDK2, but we chose instead to get a drone that works with the newer ARSDK3. ARSDK3 requires a Parrot Rolling Spider, Cargo Minidrone, Jumping Sumo, Jumping Sumo Evo, Bebop Drone or Bebop 2. We opted for the Cargo Minidrone, $100 on Amazon!
Once the new drone arrived, we had drone control working within an hour using ARSDK3, which compiles against iOS 9.
Next we needed a way to recognize arm gestures from the Apple Watch accelerometer readings. Apple doesn’t include any gesture recognizers in WatchKit or watchOS 2, so we had an excuse to develop some ourselves!
The Apple Watch accelerometer outputs three values. The x value measures the left-right tilt of the screen: +1 when the digital crown is on the bottom side and -1 when it’s on the top side. The y value measures the up-down tilt of the screen: +1 when the face is rotated completely up and -1 when it’s rotated completely down. The z value measures which way the screen points: +1 when the screen is face up and -1 when it’s face down.
We started by writing down approximate x, y and z values for a set of gestures we knew we would need: holding the arm straight ahead, rolling the arm left or right, and raising or lowering the arm from the straight position. Examining all the ways to perform each gesture, we found that every gesture has characteristic regimes of x, y and z values. Each axis ranges from -1 to +1, but within that range there are three characteristic regimes: within ε of zero, greater than +ε, and less than -ε, where ε is a small number. Empirical testing revealed that we needed an ε about twice as large for y as for x and z in order to correctly identify the gestures we’re interested in.
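The regime idea above translates almost directly into code. This is a minimal sketch under our own assumptions: the ε value, the `Regime` and `gesture` names, and the regime-to-gesture table shown here are illustrative (the real table covers all eight gestures), but the structure, three regimes per axis with a doubled threshold for y, follows the description above.

```swift
import Foundation

// Each axis reading falls into one of three regimes relative to a
// small threshold ε: near zero, clearly positive, or clearly negative.
enum Regime {
    case nearZero, positive, negative

    init(_ value: Double, epsilon: Double) {
        if value > epsilon { self = .positive }
        else if value < -epsilon { self = .negative }
        else { self = .nearZero }
    }
}

// Classifies a reading into a gesture via its per-axis regimes.
// The y-axis uses a threshold twice as large, matching our testing.
func gesture(x: Double, y: Double, z: Double, epsilon: Double = 0.25) -> String? {
    let rx = Regime(x, epsilon: epsilon)
    let ry = Regime(y, epsilon: 2 * epsilon)
    let rz = Regime(z, epsilon: epsilon)
    switch (rx, ry, rz) {
    case (.nearZero, .nearZero, .positive): return "land"
    case (.positive, .nearZero, .nearZero): return "right"
    case (.negative, .nearZero, .nearZero): return "left"
    case (.nearZero, .positive, .nearZero): return "back"
    case (.nearZero, .negative, .nearZero): return "forward"
    default: return nil   // no recognizable gesture
    }
}
```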
The above drawing depicts, on the left, the eight gestures we can identify (land + launch share the same gesture, as does rotate). On the right are the characteristic x, y and z regimes for back, forward, left, right and land. At this point we had a functioning iOS SDK for sending control commands to the drone and a method for recognizing each of the eight control gestures.
The Parrot SDK for iOS provides a whole suite of control commands. These include autopilot commands such as take off, land and hover in place, as well as manual commands. There’s a manual command for gaz, the rate of ascent or descent that controls the drone’s altitude. There are commands for rolling left and right, commands for pitching forward and back, and a command for yaw (the direction the drone is facing), though we don’t use that one yet. We had no trouble quickly connecting our watch app gestures to these autopilot and manual commands.
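Conceptually, wiring gestures to manual commands is a lookup from gesture name to a gaz/roll/pitch setpoint. The sketch below is our own illustration, not Parrot API: `DroneCommand`, `command(for:)` and the magnitude 50 are assumptions, though the signed-percentage range mirrors the -100…100 convention ARSDK3 uses for its piloting setpoints.

```swift
import Foundation

// Manual setpoints as signed percentages (-100 ... 100):
// gaz = climb/descend rate, roll = left/right, pitch = fore/aft.
// This type and its values are illustrative, not part of ARSDK3.
struct DroneCommand: Equatable {
    var gaz: Int8 = 0
    var roll: Int8 = 0
    var pitch: Int8 = 0
}

func command(for gesture: String) -> DroneCommand? {
    switch gesture {
    case "up":      return DroneCommand(gaz: 50)
    case "down":    return DroneCommand(gaz: -50)
    case "left":    return DroneCommand(roll: -50)
    case "right":   return DroneCommand(roll: 50)
    case "forward": return DroneCommand(pitch: 50)
    case "back":    return DroneCommand(pitch: -50)
    default:        return nil   // land/launch go through autopilot commands instead
    }
}
```

Land and launch fall through to `nil` here because, as noted above, they are autopilot commands rather than manual setpoints.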
Wake Screen Solution
We did run into an issue: the Apple Watch turns off its display and pauses the app when the user moves out of the wrist raised position. This posed a real problem because most of our gestures happen outside the wrist raised regime. Our first attempt to keep the screen (and app) active relied on the fact that the screen stays on as long as a finger is placed on it. We tried performing gestures while keeping a finger from the opposite hand on the screen, but the second hand made it hard to perform each gesture precisely.
Next we tried taping the negative terminal of a battery, and then the tip of a touch stylus, to the screen, but in both cases this failed to keep the screen active due to a lack of pressure on the screen. Finally we were able to find a software solution by reading through the documentation for HealthKit workout sessions. By running a workout session and maintaining an active WCSession, we can keep sending live accelerometer data from the watch to the phone after the screen has turned off. This lets us continue recognizing gestures even when the watch isn’t in the wrist raised position.
We’ve continued improving our implementation of the arm gesture recognizers, and we’re now able to reliably control the drone using simple wrist movements. Implementing the gesture recognizers was an interesting puzzle, and it turns out controlling a drone hands-free is even more fun than you’d expect. We achieved the use case we set out to build, and we gained some knowledge about ARSDK3 for controlling drones and about how watchOS uses WCSession to process measurements in real time, even while the watch isn’t in the wrist raised position. An all-around success!
For more innovation, check out our Smart Toy Labs project, link below:
Building Smart Toys w/Bluetooth Beacons