A 25 Year Quest To Build The Perfect Drawing App

Russell Okamoto
23 min read · Aug 10, 2018

From inspiration in 1992 to App Store release in 2018

Spriteville’s logo is a red jellyfish based on the shape and color of upper atmospheric sprite lightning; it’s done with bitmap graphics and resembles an alien from Space Invaders as homage to sprite-based arcade games I played in the 80s.

Over 25 years ago, I began my quest to build The Perfect Drawing App.

In March this year, this long journey concluded with the launch of the iOS app Spriteville.

I call Spriteville a “mobile canvas for dynamic art”. The app was quietly released into the App Store sans marketing. Few people know about it. No tech blogs reviewed it. User traction is unremarkable.

Nevertheless, Spriteville is the culmination of decades of research and development, trial and error, struggle and passion.

Spriteville evolved from work on four previous projects spanning two decades.

The first project, developed in 1992, was a Pen Computing app combining digital ink and word processing. The second, from 1994, was a hypertext notetaking app inspired by Xerox PARC’s NoteCards project. The third was a brush stroke engine developed in 2013 for a hyperlocal network app. The fourth was a drawing-based iOS Custom Keyboard prototyped in 2014.

I’m not proud of a 25-year software delivery lifecycle. I don’t think anybody would be. On the contrary, it’s humbling and embarrassing to admit how long this project took to finish. And at what price and for what payoff?

My only solace is that throughout this Spriteville journey, I encountered numerous roadblocks. Each one felt immovable and insurmountable. Sisyphean. But persistent grinding eventually wore a hole through each obstruction allowing a tiny ray of hope to light the way forward.

If you are struggling with your own project, no matter what it is, I hope my odyssey gives you encouragement to see your project through.

There’s a fine line between passion and lunacy. If you cross it, know that you are not alone.

Each atom of that stone, each mineral flake of that night-filled mountain, in itself forms a world. The struggle itself toward the heights is enough to fill a man’s heart. One must imagine Sisyphus happy.

— Albert Camus, The Myth of Sisyphus

The saga of Spriteville traverses over 25 years. It’s a long (perhaps soporific?) tale beginning in an ancient time in the Bay Area before Google, Facebook, Twitter, and social media. I chopped it into parts, so skip to whatever interests you.

PART I – WHAT IS THE PERFECT APP? describes what I think makes The Perfect App. (1 minute)

PART II – HISTORY OF SPRITEVILLE chronicles my previous projects that inspired Spriteville. (5 minutes)

PART III – BUILDING SPRITEVILLE describes the design challenges (solutions) encountered (surmounted) when building the Spriteville iOS app. Dry, technical, and maybe boring unless you care about UX. (11 minutes)

PART IV — REFLECTIONS concludes with thoughts about why building Spriteville took so long and what I’ve learned. (3 minutes)

PART I — WHAT IS THE PERFECT APP?

I think The Perfect App is any app that empowers you to engage in a medium as both a writer and a reader.

This “symmetric authoring and consuming” is a capability called tailorability.

For Language, the Word Processor is an app that exhibits tailorability. The Word Processor lets people readily write documents that other people can consume as readers. For Math, the Spreadsheet embodies tailorability since anybody can create calculations that other people can easily apply and reformulate. From a Web perspective, Blogs create tailorability by enabling ordinary people to create interesting Web pages that others can read/browse. In Photography/Video, the Camera enables tailorability by letting anybody quickly snap photos/videos that others can browse and enjoy.

Beginning with Pen Computing in 1992 to today’s immersive AR and VR environments, I’ve been passionate about enhancing tailorability for Art and Self-Expression.

I hope Spriteville helps nurture the art within all of us — enabling anybody to create dynamic art that moves the world.

PART II — HISTORY OF SPRITEVILLE

Spriteville lets you paint with physics to create animated scenes

CONCEPTION (1992)

Spriteville was conceived on February 19, 1992.

Pen Computing was the rage in 1992. The year started out hot with new pen-based hardware and software from major vendors like Microsoft and aspiring startups like Go (maker of the PenPoint OS), Momenta, Slate, and GeoWorks (GEOS). (Apple was busy secretly creating the Newton Personal Digital Assistant — like a very early, heavy, bigger iPhone with pen input — that was released a year later.)

On the first day of the Pen Computer 1992 conference in San Jose, I tried Pen Computing and digital ink for the first time and immediately fell in love. The pen seemed like a magic wand with unlimited potential for tailorability.

IMHO, digital ink and pen-based apps — for drawing, notetaking, messaging, etc. — still remain untapped as a creative medium ripe for tailorability.

In particular, I was hooked on the idea of building an app that let you draw free-form, handwritten notes and drawings that could be manipulated like text in a Word Processor. I called this an Ink Editor.

I set out to build my own Ink Editor calling my product vision Tapestry, evocative of weaving together many handcrafted threads of information.

Notes from 4/28/1992 for an “Ink Editor” with spline-based curve smoothing and built-in “typesetting” functions like auto-scaling, auto-kerning, etc.
Pen user interface notes exploring an offset cursor “without obstruction by hand or pen” that enables “precise pointing in applications where pen tip is too large”
More notes about expanding highlight regions for pen input and object snapping

After the conference, that summer, I launched my first software startup, aptly named Mitochondria (from the Greek mitos, meaning “thread”), in Burlingame, California.

Still one of my favorite logos that I have designed

Unfortunately I discovered my app development ability did not match my ambition. Progress on Tapestry was slow. I was inexperienced with Microsoft Windows programming and was also working on contract projects.

By the end of 1993, I only had a proof-of-concept for Tapestry running on Microsoft Windows for Pen Computing with a Wacom tablet tethered to my old IBM PC Compatible. Brush strokes in Tapestry were primitive. No dynamic width based on pressure, no color-mixing or anti-aliasing, no ink diffusion, no style.

I never completed Tapestry.

Competitive products such as InkWriter from Aha Software and Microsoft OneNote (2002) eventually filled the digital ink void.

Tapestry, however, fueled my passion to build a drawing app where every brush stroke could be manipulated as movable ink. Decades later, all of Tapestry’s features — moving, editing, merging, and rearranging brush strokes (after they are drawn) — were incorporated into the Spriteville iOS app.

A NEW STATE (1994)

Thread let you create hyperlinked sticky notes

In 1994, I married my wonderful wife and moved from the California Bay Area back to my hometown in Oregon.

I joined GemStone Systems, a pioneering database company (acquired by VMware in 2010).

In the office during the day, I worked on enterprise database management systems and scalable distributed systems. At home, I hacked away on my next tailorability project I called Thread.

Unlike the pen input and digital ink of Tapestry, Thread was based on keyboard input and text. Thread was inspired by NoteCards, a pioneering Xerox PARC hypertext system I had read about years before at the Stanford Computer Science library.

Thread was my first development foray into building a Spatial Hypertext application. Thread let you create sticky notes that you could instantly hyperlink together to form a personal web of connected notes. You could embed text and images into a sticky, search for text, and hyperlink stickies together using an elegant, direct manipulation gesture I called “Drag-and-Link”.

My “Drag-and-Link” direct manipulation gesture to create hyperlinks

Furthermore, each sticky note in Thread expanded automagically as text impinged upon the screen border. So a sticky could grow infinitely as you typed — in either the horizontal or vertical direction. Each sticky acted like an infinitely expandable, zoomable virtual canvas where you could navigate related stickies by simply following hyperlinks from one to another.

I worked on Thread intermittently over the next few years and had a production-ready version for Microsoft Windows. My master plan was to first market Thread and then ultimately merge Thread’s notion of hypertext with Tapestry’s digital ink. My end goal was to build a tailorability app based on hyperlinked virtual whiteboards with freely placed text, handwritten notes, embedded images, documents, and drawings. I named this product vision Manifesto.

Way back in the 90s, software apps were delivered on floppy disks in shrink-wrapped boxes sold at stores like Fry’s Electronics. No “internet download” or App Store or Play Store. So software sales costs were significant. I didn’t acquire the resources to market, package, or distribute Thread. Thread never took off, and I never merged Thread and Tapestry to build Manifesto.

Thread, however, proved to me the magic of a “zoomable, infinite canvas with hyperlinking”. I now refer to this capability as Serengeti (Maasai for “endless plains”). Years later, I reimplemented these Serengeti features in Spriteville with the intent of providing an unbounded digital canvas for unbounded digital expression.

A NEW WAVE (2013)

The third project that inspired Spriteville was a brush stroke engine I developed for a virtual sticker app called Wave.

Wave was an iOS app created by my social network startup Celly. Wave let you cut out stickers by drawing a brush stroke on a photo. You then dropped these virtual stickers at locations for others to discover. Wave stickers were like graffiti.

I chose a vector-based approach for this brush stroke engine. Bezier curves formed the outlines of the brush strokes. My takeaway lesson from this project was how hard it is to create variable width “fat curves” or “offset curves” as the outline of brush strokes. Self-intersecting lines and areas of high curvature — e.g. when a stroke folds back on itself — are tricky. The brush stroke engine also utilized the SpriteKit game engine to emit particle feedback — like flying sparks — when you drew a sticker outline.
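To illustrate why variable-width outlines are tricky, here is a minimal Python sketch of the naive offset approach (illustrative only, not Wave’s actual engine): each stroke point is pushed out along the segment normal to form the outline, and this is exactly the construction that breaks down with self-intersections when the stroke folds back on itself.

```python
import math

def offset_polyline(points, width):
    """Naively offset a polyline by half the stroke width along
    per-segment normals. Works for gentle curves, but produces
    self-intersecting outlines where the stroke folds back on
    itself or curvature is high, which is why robust "fat curve"
    outlines are hard."""
    left, right = [], []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        length = math.hypot(dx, dy) or 1.0
        # Unit normal perpendicular to the segment direction.
        nx, ny = -dy / length, dx / length
        h = width / 2.0
        left.append((x0 + nx * h, y0 + ny * h))
        right.append((x0 - nx * h, y0 - ny * h))
    # Outline is the left edge followed by the reversed right edge.
    return left + right[::-1]

outline = offset_polyline([(0, 0), (10, 0), (20, 5)], 4.0)
```

A production engine must additionally detect and clip the overlapping regions this naive version produces, which is where most of the complexity lives.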

When the 1.0 version of Wave was finished, Celly regrettably did not have the resources to execute a viable go-to-market strategy. So Wave never made a splash as a product. It did, however, shape Spriteville in two profound ways.

First, when it came time for me to reimplement the brush stroke engine for Spriteville, I chose a stamp-based approach instead of the vector approach I used in Wave. Stamps instead of vectors for brush strokes enhanced creativity and expressiveness. In Spriteville, you can create brush stroke stamps from any texture, including built-in text and emoji, images and cutouts, or even another brush stroke. Brush strokes can also be augmented with particle emitters or shaders.
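A stamp-based engine boils down to resampling the stroke path at fixed arc-length intervals and rendering the brush-tip texture at each sample point. A rough Python sketch of that placement logic (illustrative only, not Spriteville’s code):

```python
import math

def stamp_positions(path, spacing):
    """Walk a stroke path and emit stamp centers every `spacing`
    points of arc length. Each stamp would render the brush-tip
    texture (an image, emoji, cutout, or even another stroke)."""
    stamps = [path[0]]
    carry = 0.0  # arc length already covered toward the next stamp
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        d = spacing - carry
        while d <= seg:
            t = d / seg  # interpolate along the current segment
            stamps.append((x0 + (x1 - x0) * t, y0 + (y1 - y0) * t))
            d += spacing
        carry = (carry + seg) % spacing
    return stamps

stamps = stamp_positions([(0, 0), (10, 0)], 2.5)
```

Because the stamp is just a texture, anything that can be rasterized can become a brush tip, which is the expressiveness win over vector outlines.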

Second, developing the brush stroke engine for Wave gave me an epiphany:

I saw that physics could be used as a powerful animation tool, allowing anybody to quickly create rich, fun scenes with movement based on simulated forces and effects.

Unlike animation software that requires you to tediously draw in-between frames, scenes in Spriteville can be quickly and easily animated with simulated physics. Any object, including each brush stroke, is a first-class sprite that can impart and respond to physics forces emitted by other sprites. Thus, you can rapidly tailor scenes where sprites move, bounce, collide, follow paths, orbit, fall with gravity, or break apart upon impact with other sprites. Spriteville lets you use your phone to quickly create fun scenes with animation, what I call “dynamic art”.

CUSTOM KEYBOARD (2014)

GIF made with custom drawing keyboard prototype

In Fall of 2014, iOS 8 Custom Keyboard Extensions were new. No drawing-based keyboards were out yet. So packaging the Wave brush stroke engine as an iOS Custom Keyboard seemed like a no-brainer. The goal was to let you pop up a Custom Keyboard, draw fun animations, and then share them as a GIF through any host app like iMessage, Twitter, Instagram, or Facebook.

Unfortunately, after a few months of prototyping, I discovered several (Apple) showstopper bugs with combining SpriteKit and Custom Keyboards. After a couple Apple TSIs (Technical Support Incidents), I learned system memory limitations also prevented SpriteKit integration with Custom Keyboard Extensions.

There was no choice: Spriteville couldn’t be implemented as a Custom Keyboard. It had to be a stand-alone app.

PART III — BUILDING SPRITEVILLE

Spriteville’s feature map and user interface components

Development of the Spriteville app commenced in Winter 2014. By the first quarter of 2015, the beta looked promising, but in June of 2015 Apple pulled the rug out from under all SpriteKit developers with the release of iOS 9 at WWDC.

iOS 9 instantly broke every SpriteKit app. Despite developer outcry, Apple did not fix SpriteKit for almost a year and a half until the official iOS 10 release in Fall 2016.

Core APIs like SKView.textureFromNode() simply did not work. Frames Per Second dropped to half compared to iOS 8. Sprites that animated fine in iOS 8 disappeared completely in iOS 9.

Aside from Apple’s fiat decision to sabotage all SpriteKit apps, building Spriteville was hard because of many design constraints.

To build the perfect app, I think it helps to first know the playing field — who’s already out there and where obstacles are hidden.

I keep my antennae up for mobile creativity apps — including painting, drawing, animation, handwriting, notetaking, and augmented reality apps from players like Adobe, Autodesk, Apple, Microsoft, Facebook, Instagram, Snap, and Twitter, as well as apps made by startups.

Throughout Spriteville’s development lifecycle, I tracked app updates and new apps that came along, noting slick features (like color-mixing, smudging, object segmentation) as well as design constraints and shortcomings.

What I discovered is artistic expression — in all these mobile creativity apps — is constrained by a combination of hardware limitations, performance issues, and missing user interface idioms.

So Spriteville is designed to overcome each of these constraints.

DESIGN CONSTRAINTS

1. Small Screens

Touch input area and travel distance for common tasks like drawing brush strokes, handwriting sentences with digital ink, or moving and sizing digital objects on a composition space (canvas) are limited by the physical size of device screens. For example, as your finger or stylus draws a brush stroke across a device screen, this touch input inevitably encounters the edge of your screen which limits stroke length and forces you to break up continuous strokes into multiple, smaller stroke segments. This is slow, cumbersome, and artistically undesirable. As screen size decreases — e.g., from desktop to laptop to tablet to phone — such constraints on expressiveness grow.

2. Fixed Composition Space

Free composition space — the undrawn area on a digital canvas — naturally diminishes as the canvas becomes increasingly populated with brush strokes and other canvas objects like images. To make additional room for composition, the canvas needs to be frequently panned or expanded. In existing competitive apps, such operations typically require you to tap a user interface element like a scrollbar to move the canvas in a discrete X or Y direction. You must estimate the amount of movement required to create space for the next drawing object, potentially tapping the scrollbar multiple times. Such canvas operations are cumbersome and time consuming. Moreover, in existing apps, canvas size is commonly preset, so the canvas cannot grow dynamically should you wish to draw beyond its borders. A more expressive drawing canvas would pan and grow automatically and effortlessly as needed to make space for you.

3. Lack Of Depth

Existing creativity apps lack a direct manipulation gesture for depth control so artists have no way to convey both downward/inward and upward/outward Z movements. Absence of such depth gesturing limits expressiveness in digital creativity apps across many fields including Art, social networking, web communication, games, drafting, VR, and AR.

4. Ink Is Immovable

With existing creativity apps, brush strokes are fixed in place once they are drawn on a virtual canvas, as if digital ink dries up, hardens, and thus cannot be lifted from the canvas and repositioned. To “move” an existing digital stroke, you must resort to a multi-step operation — first removing the stroke by erasing or undoing it and then redrawing the stroke in a new location. This process is time consuming and tedious as it may not be simple or quick or even possible to recreate the stroke’s original size and appearance. It would be convenient to be able to simply reposition a stroke after its initial placement.

5. Brush Strokes Are Piecemeal

Complex shapes are naturally composed of multiple strokes and objects. A drawing of a bird for example may be drawn as multiple connected strokes — strokes for the wings, head, feet, etc. Existing apps, however, treat each drawn stroke or digital object as physically isolated from other strokes and objects. To transform — e.g. translate, rotate, or scale — a multi-stroke shape, you must first select an area around all related strokes to form a “cut out” area that can then be transformed in a subsequent step. Cut out techniques may be aided by predefined starting shapes like a selection rectangle or oval cut out where edges of the cut out may then be manipulated with handles to more closely match stroke topology. Cutout tools like “Magic Wand” or “Magnetic Lasso” utilize image segmentation or ML-based object selection (e.g. Adobe “Select Subject”) to identify shapes that closely match the borders of objects. Nevertheless, all cut out techniques require multiple, iterative workflow steps (like repeatedly fine-tuning cut out selection handles or adding/subtracting cut areas) and require the artist to switch tools (e.g., swapping out the current brush tip for a cutout tool) commonly located on different user interface views (cut out tools are frequently placed on palette views separate from the main canvas). Such context switching takes unwanted time, interrupts visual focus, and most problematic of all, slows creative flow.

Beyond lack of support for combining multiple strokes into a connected, multi-stroke object, existing digital creativity apps do not readily combine arbitrary digital objects together into composite digital objects. It would be useful to be able to simply drag one object onto another object to form a composite digital object that in turn could itself be the source or target of another compositing operation. As part of such a compositing operation, akin to the need for depth gesturing mentioned previously, a method of specifying which object appears on top of (or occludes) another object would also need to exist. Such digital object composition modalities are absent in existing creativity apps.

6. Inanimate Drawing Objects

Real world objects have mass, charge, energy, and respond to forces like gravity. They can be launched into motion and follow Newton’s Laws of Motion. Existing creativity apps, however, treat digital strokes and objects as static and inanimate, devoid of real world physics properties and behaviors. Though you can manually scale, translate, and rotate digital objects, they are left motionless when the transformation completes. To impart the appearance of motion, some apps help you create zoetrope-like animations by drawing layered scenes frame by frame with onion-skinning, keyframe, and timeline sequencing tools. Using these tools, however, takes considerable time, introduces workflow latency, interrupts focus, and requires undesirable context switching. You must also grok complex user interfaces and scale steep learning curves. It would be useful, in contrast, if you could create animations quickly and effortlessly with digital strokes that were inherently “alive” — imbued as objects with simulated physics properties that responded to field forces and collisions with other objects. Furthermore, if digital strokes and objects could be augmented by shaders (image processing programs) that simulated movement effects, the need for tedious, intricate crafting of many, in-between animation frames could be reduced or obviated altogether.

7. Style Is Limited

Creativity apps typically provide a set of built-in brush styles. Apps like Adobe Photoshop and Sketch let you import and purchase brush templates and even design custom brushes, however, there is no instant way to convert an existing canvas object into a brush tip or to augment brushes with shaders, particle-based emitters, or hand drawn textures, text, and emoji.

8. User Interface Is Intimidating

Creativity apps typically support deep and wide feature sets, so they rely on many, diverse user interface components — including various operational modes, menu interfaces, and specialized palette views. Unfortunately, as the number and variety of user interface elements grows, an app’s user interface becomes weighty, confusing, and cluttered, ultimately intimidating first time users. It would be desirable, in contrast, if a digital creativity app provided expressive functionality but with minimal modes, menus, and palettes, making the app intuitive and welcoming to new users.

DESIGN SOLUTIONS

Spriteville overcomes the aforementioned constraints with novel gestures and methods:

  1. Edge Pan Gesture

Spriteville automatically expands and pans a digital canvas as brush strokes approach the edge of the canvas; this “Edge Pan” gesture allows brush strokes to continue indefinitely without impinging upon the edge of the canvas or the device screen; the amount and direction the canvas expands and pans varies according to the velocity (speed and direction) of the touch-based stroke as well as the proximity of the touch input points (e.g., from a stylus or finger) to the canvas edge; stroking faster or nearer to the canvas edge triggers greater expansion and panning than stroking slower or farther away; a minimum threshold region outlining the canvas can be preset to define where stroking begins to trigger expansion and panning; the amount and speed of expansion and panning can be regulated by common easing functions (e.g., ease-in, ease-out, ease-in-out), resulting in smooth movement.
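As a rough illustration of how such a response curve might be tuned, here is a Python sketch combining edge proximity, stroke speed, and an ease-out function; the threshold, maximum pan, and speed constants are hypothetical, not Spriteville’s actual values:

```python
def edge_pan_amount(dist_to_edge, speed, threshold=60.0,
                    max_pan=40.0, max_speed=2000.0):
    """Hypothetical edge-pan response: panning kicks in once the
    touch is within `threshold` points of the canvas edge, and
    grows with both proximity and stroke speed. A cubic ease-out
    keeps the motion smooth instead of snapping."""
    if dist_to_edge >= threshold:
        return 0.0
    proximity = 1.0 - dist_to_edge / threshold   # 0..1, 1 at the edge
    velocity = min(speed / max_speed, 1.0)       # clamp to 0..1
    t = proximity * (0.5 + 0.5 * velocity)       # speed boosts the pan
    ease_out = 1.0 - (1.0 - t) ** 3              # cubic ease-out curve
    return max_pan * ease_out

pan = edge_pan_amount(dist_to_edge=10.0, speed=1000.0)
```

The canvas would then be expanded and scrolled by `pan` points per frame in the stroke’s direction of travel.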

2. Tap Pan Gesture

Spriteville lets you expand and pan a digital canvas by pressing near the edge of the canvas; as the canvas expands, it makes space in a region near the pressed edge and pans the canvas to create room for drawing new brush strokes; the touchpoint size, force, pressure, location, and duration of the press control when this “Tap Pan” gesture occurs and how far and how fast the canvas expands and pans; e.g., pressing harder or nearer to the canvas edge triggers greater expansion and panning than tapping softer or farther away; a minimum threshold region outlining the canvas can be preset to define where pressing begins to trigger expansion and panning; the amount and speed of expansion and panning can be regulated by common easing functions (e.g., ease-in, ease-out, ease-in-out), resulting in smooth movement.
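A similar hypothetical response can be driven by press force and hold duration instead of stroke velocity, here with a smoothstep ease-in-out; the force and pan constants are illustrative assumptions, not the app’s tuning:

```python
def tap_pan_amount(force, duration, max_force=6.0, max_pan=120.0):
    """Hypothetical tap-pan response: a firm or sustained press near
    the canvas edge opens up more room than a light, brief tap."""
    pressure = min(force / max_force, 1.0)   # normalized press force
    hold = min(duration / 1.0, 1.0)          # saturates after 1 second
    t = max(pressure, hold)                  # either signal can drive it
    ease_in_out = t * t * (3.0 - 2.0 * t)    # smoothstep easing
    return max_pan * ease_in_out

pan = tap_pan_amount(force=3.0, duration=0.4)
```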

3. Tilt Pan Gesture

Spriteville lets you expand and pan a digital canvas by tilting the device; as the canvas expands, it makes space in a region opposite to the tilt direction and pans the canvas to create room for drawing new brush strokes; the degree, direction, and duration of tilt control when this “Tilt Pan” gesture occurs and how far and how fast the canvas expands and pans. For example, large tilt angles expand and pan the canvas more than small tilt angles; the Tilt Pan gesture can be configured to trigger repeatedly (while the device remains tilted beyond a threshold angle) or to trigger once whenever the threshold angle is crossed (and then reset to wait for the next tilt event); the amount and speed of expansion and panning can be regulated by common easing functions (e.g., ease-in, ease-out, ease-in-out), resulting in smooth movement.
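The one-shot versus repeating trigger logic might be sketched like this; the threshold angle and pan amounts are illustrative assumptions:

```python
class TiltPan:
    """Hypothetical tilt-pan trigger. In repeat mode the canvas keeps
    panning while the device stays tilted past the threshold angle;
    in one-shot mode it pans once per crossing, then rearms only
    after the device levels out again."""
    def __init__(self, threshold_deg=20.0, repeat=False):
        self.threshold = threshold_deg
        self.repeat = repeat
        self.armed = True

    def update(self, tilt_deg):
        """Return a pan step (points) for this tilt sample, or 0."""
        if abs(tilt_deg) < self.threshold:
            self.armed = True            # rearm once device levels out
            return 0.0
        if not self.repeat and not self.armed:
            return 0.0                   # one-shot already fired
        self.armed = False
        # Pan farther for larger tilt angles; the sign gives direction.
        excess = abs(tilt_deg) - self.threshold
        return (1 if tilt_deg > 0 else -1) * min(excess * 2.0, 80.0)

tp = TiltPan(repeat=False)
steps = [tp.update(a) for a in (5, 30, 30, 10, 30)]
```

On iOS the tilt samples would come from Core Motion’s device attitude updates.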

4. Corkscrew Gesture

Spriteville implements a novel Corkscrew gesture, a direct manipulation user interface method invented to solve Z-Axis depth management issues for screen-based apps.

5. Draw And Composite Gesture

Spriteville lets you combine two or more brush strokes by drawing a brush stroke so that it intersects previously drawn brush strokes or by dragging and dropping brush strokes onto each other.

6. Drag And Composite Gesture

Spriteville lets you combine two or more digital objects by dragging and dropping digital objects onto each other forming a composite object; digital objects include digital brush strokes, shapes, images, videos, and 3D models.

7. Every Object Is A Sprite

Spriteville automatically transforms digital brush strokes and objects into simulated physics objects known as “sprites”; sprites can have physics-based properties including mass, charge, restitution, friction; you can thus animate a sprite by simply selecting the sprite and then flinging it to impart a force to the sprite; the direction and momentum of the sprite is based on the sprite’s physics properties and the field forces from surrounding sprites; sprites can collide with (bounce off) other sprites and respond to field forces emanated by other sprites; sprites can be composited together to form composite sprites with combined physics properties, e.g., a sprite that responds to a gravity field can be combined with a sprite that emits light to create a composite sprite with both properties.
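The fling-to-animate idea reduces to the standard impulse relation (impulse = mass * velocity), so heavier sprites need a faster fling to reach the same speed. This toy integrator is only a sketch of that idea; the app itself would rely on SpriteKit’s physics engine:

```python
def fling_impulse(start, end, dt, mass):
    """Convert a drag gesture's release velocity (displacement over
    the final dt seconds) into a physics impulse."""
    vx = (end[0] - start[0]) / dt
    vy = (end[1] - start[1]) / dt
    return (mass * vx, mass * vy)

def step(position, velocity, impulse, mass, dt):
    """One integration step: the impulse changes velocity instantly,
    then the sprite coasts for dt seconds (no other forces here)."""
    vx = velocity[0] + impulse[0] / mass
    vy = velocity[1] + impulse[1] / mass
    return (position[0] + vx * dt, position[1] + vy * dt), (vx, vy)

imp = fling_impulse((0, 0), (120, 0), 0.125, mass=2.0)  # (2400 pt/s per kg-unit)
pos, vel = step((0, 0), (0, 0), imp, mass=2.0, dt=0.125)
```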

8. Draw With Physics

In Spriteville, sprites may be endowed with specific simulated physics behaviors and field forces that affect other sprites. For example:

  • A non-colliding particle (“Neutrino”) passes through other sprites but responds to force fields;
  • A colliding particle (“Fermion”) collides and composites with other sprites;
  • A gravity particle (“Graviton”) responds to gravity, device tilting and shaking, and collides and composites with other sprites;
  • A crumbling particle (“Asteroid”) breaks up on collision;
  • A path field (“Gluon”) defines a path that other sprites follow repeatedly;
  • An orbital field (“Planet”) creates a gravity field based on its size causing other sprites to orbit around it;
  • A wind field (“Acceleron”) accelerates sprites in a direction;
  • A light field (“Photon”) emits light coloring nearby sprites;
  • A repulsive force field (“Positron”) pushes other sprites away;
  • An attractive force field (“Electron”) pulls other sprites closer;
  • A speed field (“Tachyon”) speeds up other sprites;
  • A drag field (“Dragon”) slows down other sprites;
  • A boundary field (“Edge”) creates an open or closed region that can contain other sprites;
  • A disintegration field (“Black Hole”) destroys sprites that touch it;
  • A creation field (“White Hole”) replicates sprites that touch it;
  • A grow field (“Higgs”) grows the size and mass of other sprites;
  • A shrink field (“Pym”) shrinks the size and mass of other sprites;
  • An opacity field (“Pulsar”) fades/unfades sprites that touch it;
  • A hyperlink particle (“Worm Hole”) teleports a sprite across layers or scenes;
  • A rope field (“Constellation”) links sprites into a connected chain.
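Many of these fields reduce to a radial force applied to each affected sprite every simulation step. A toy Python version of an Electron/Positron-style field (the strength and falloff model are illustrative, not SpriteKit’s exact field math):

```python
import math

def field_force(source, target, strength, falloff=1.0):
    """Radial field in the spirit of the Electron/Positron sprites:
    positive strength pulls the target toward the source, negative
    strength pushes it away. Magnitude decays with distance raised
    to the falloff exponent."""
    dx = source[0] - target[0]
    dy = source[1] - target[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0)  # coincident points exert no defined force
    magnitude = strength / (dist ** falloff)
    return (magnitude * dx / dist, magnitude * dy / dist)

pull = field_force((10, 0), (0, 0), strength=100.0)   # toward the source
push = field_force((10, 0), (0, 0), strength=-100.0)  # away from it
```

Per frame, each sprite would sum the forces from all fields in range and feed the result into its physics body.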

9. Layering Physics

Sprite physics behavior can be applied (layered) onto other sprites by simply brushing one sprite onto one or more other sprites; for example, an Acceleron (wind field sprite) can be brushed over another sprite, causing that sprite to accelerate.

10. Instabrush

Spriteville lets you drag and drop a selected brush stroke or digital object onto a canvas region (e.g. represented by an icon) to create a brush tip that instantly replaces the current brush tip; this “Instabrush” gesture enables artists to quickly create new brushes that can in turn be recursively used to build other brushes; this Instabrush gesture improves expressiveness by enabling experimental brushes to be fashioned quickly.

11. Import Textures

A brush tip can be created from any digital asset including any combination of text, emoji, brush strokes, images, video, or 3D models.

12. Shaders

A brush tip can be augmented by shaders (or other image processing program), lighting effects, and normal mapping of textures.

13. Particle Emitters

A brush tip can be augmented by particle emitters.

14. Haptics

Brush strokes emit haptic feedback and sound effects based on physics properties, touchpoint size and pressure, stroke length, and brush tip, creating tactile and aural feedback that helps you control attributes of a stroke while it is being drawn or transformed.

15. Brush = Texture + Filters + Physics

A sprite can be created by a brush stroke configured with a brush tip texture, one or more image processing filters, and one or more physics properties.

16. Holistic User Interface

Spriteville defines an expressive yet intuitive user interface for digital creativity where you can:

  • Drag one finger to draw digital brush strokes;
  • Tap on an empty canvas area to stamp a digital object or tap an existing object to lock/unlock it;
  • Long press with one finger to select and drag an object (or layer); this operation brings the selected object’s layer to the top/front of the canvas;
  • Pan with two fingers to select and drag an object (or layer); this operation brings the selected object’s layer to the top/front of the canvas;
  • Tap with two fingers to pause all sprite physics behavior; subsequent two-finger tap resumes sprite physics;
  • Pinch an object (or layer) to zoom (scale) it;
  • Pinch and turn an object to rotate it;
  • Move one or two fingers in a circular motion (Corkscrew gesture) over a selected object (or layer) to move the Z-position of the object (or layer);
  • Utilize previously defined gestures — Edge Pan, Tap Pan, Tilt Pan, Instabrush, Draw And Composite, Drag And Composite.

In summary, Spriteville lets you draw brush strokes of any length. Canvas size grows automatically and can be virtually unlimited. Depth of digital objects and layers can be manipulated directly using a simple, seamless gesture. Individual brush strokes can be transformed at will and linked together into composite objects. All drawing objects are treated as digital sprites that can be animated using simulated physics. Custom brush tips can be created instantly and augmented with shaders and particle effects. First-time artists can use the app intuitively.

“Wheatfield with Crows” animation made in 45 seconds with Spriteville. Composing with a palette of physics-based sprites lets you quickly create dynamic art that imparts whimsy, character, and personalization beyond static photos, videos, or GIFs.

PART IV — REFLECTIONS

My quest to build The Perfect Drawing App began over 25 years ago.

Looking back, the main question I ask myself is “Why did it take so long?”

I’ve been programming in many languages since I was ten. I’ve done desktop GUI programming, concurrent programming, mobile iOS and Android programming, and built web sites, databases, and fault-tolerant distributed systems. I’ve studied algorithms, machine learning, SICP, and Paxos, and implemented compilers and realtime query processing engines. But I’ve never considered myself an extraordinary programmer. I eventually get the job done, but I refer to myself as a Debugger rather than a Developer since I spend far more time testing, inspecting breakpoints, and profiling performance issues than typing lines of error-free code. Interaction design (which I enjoy) simply takes me a long time, with many iterations, to “get right”. Depending on your experience, skills, and resources, maybe building this Spriteville app would have taken you (or your team) just a few months, or certainly less than a year?

This tale has been a cautionary one. I’ve learned I am susceptible to feature creep. I’m probably no better at estimating tasks or delivery dates than the many engineers I have managed over my career. I’m easily lured by the pursuit of Perfection.

An early 2015 iOS 8 version of Spriteville could have been released to the App Store before Apple ruined all SpriteKit apps with iOS 9. This version might have been an MVP; however, it lacked video recording, layer management, sprite composition, advanced physics like Gluons (pathways), and Serengeti — a zoomable, infinitely expandable, hyperlinkable canvas. When Spriteville was delayed by the iOS 9 fiasco, I took the opportunity to implement these next-level features, thinking it would be simple.

I was wrong.

Making a canvas zoom and expand to any size enhances creativity since it lets you grow your composition space and draw brush strokes of any length. However, this capability introduces a host of performance challenges, functional requirements, and testing burdens. Long brush strokes can easily exceed Metal GPU texture limits, so extra checks and downsizing are required. Zooming creates physics bodies and textures that need to be dynamically rescaled, e.g., when compositing sprites drawn at different zoom scales. Zooming also makes sprite wrapping behavior (when a sprite reaches the screen edge, it wraps to the opposite side) non-trivial, since scaling changes the logical size of each canvas layer. Allowing a virtually unlimited number of user-drawn sprites in a scene means physics bodies must be efficient, so convex polygonal shape smoothing is needed to create simpler outlines that retain just enough detail for realistic collisions. Physics body shapes also need to be dynamically updated based on Level-Of-Detail.

To calculate overlapping physics bodies for sprite composition, I struggled with three separate approaches, all of which failed, including a GPU-based Moore Neighborhood Trace algorithm using Metal and convolution filters. CGPathCreateCopyByStrokingPath was too slow and CPU-expensive. Shape intersection with the neat VectorBoolean library (an implementation of Bezier Clipping) was also too slow and CPU-intensive. Sprite composition highlighted the need for Z-depth management of sprites: when compositing multiple sprites, you want to freely adjust the Z-depth order of individual sprites. This need inspired the Corkscrew gesture, which gives you a way to directly move one sprite on top of another, but that gesture took years to invent.
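To give a flavor of the outline-simplification problem (this is an illustrative sketch, not Spriteville’s actual implementation), here is a minimal Ramer–Douglas–Peucker pass in Swift that drops vertices from a traced outline while keeping just enough detail for plausible collisions. The `Point` type and `tolerance` parameter are assumptions for this example:

```swift
import Foundation

struct Point { var x: Double; var y: Double }

// Perpendicular distance from point `p` to the line through `a` and `b`.
func distance(_ p: Point, toLine a: Point, _ b: Point) -> Double {
    let dx = b.x - a.x, dy = b.y - a.y
    let len = (dx * dx + dy * dy).squareRoot()
    guard len > 0 else {
        // `a` and `b` coincide: fall back to point-to-point distance.
        return ((p.x - a.x) * (p.x - a.x) + (p.y - a.y) * (p.y - a.y)).squareRoot()
    }
    return abs(dy * p.x - dx * p.y + b.x * a.y - b.y * a.x) / len
}

// Ramer–Douglas–Peucker: recursively discard vertices that lie within
// `tolerance` of the simplified outline.
func simplify(_ points: [Point], tolerance: Double) -> [Point] {
    guard points.count > 2 else { return points }
    var maxDist = 0.0
    var index = 0
    for i in 1..<(points.count - 1) {
        let d = distance(points[i], toLine: points[0], points[points.count - 1])
        if d > maxDist { maxDist = d; index = i }
    }
    if maxDist > tolerance {
        // The farthest vertex matters: keep it and simplify each half.
        let left = simplify(Array(points[...index]), tolerance: tolerance)
        let right = simplify(Array(points[index...]), tolerance: tolerance)
        return Array(left.dropLast()) + right
    }
    // Everything in between is close enough to the chord: keep only endpoints.
    return [points[0], points[points.count - 1]]
}
```

A larger tolerance yields cheaper physics bodies at the cost of collision fidelity, which is essentially the Level-Of-Detail trade-off described above.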

My Celly cofounder told me about Disney’s Multiplane Camera. When I researched this story, I figured it would be simple to add multiple planes/layering. Because I had already implemented a single canvas layer that could be infinitely stretched and zoomed, I assumed adding multiple such layers, like the Disney camera, would be cake. Just an array, right? I was wrong. Again. Multiple layers introduce the need for layer management chrome — adding, deleting, or editing different layers — and the need to seamlessly move sprites not only on top of sibling sprites within the same layer but also across layers. And that’s just the beginning.

After this quest, when I hear the word simple, I tremble.

I’ve found it hard to make even one feature simple. And of course it’s way harder to distill a palette of features into one simple solution. “Just make it simple” is perhaps the hardest challenge. It’s a management directive for “Figure out the essence of a feature, crack its DNA code, build the most elegant solution, and don’t spend too much time or money working on it.”

In my experience,

Simple is a destination
Arrived at through sweat, frustration, and innovation
Lying beyond the last step of product delivery
Where design delights users

Before this quest, I used to say,

“Simple tools that empower creativity are magical.”

Now I would say,

“Magical tools create simplicity.”

Spriteville’s goal is to nurture the Art within all of us.
To make animation with digital ink as simple as typing.

I hope Spriteville creates simplicity for you!

With Gassho!

Russell


Russell Okamoto

Co-creator of Spriteville, Dynamic Art, http://spriteville.com / Co-founder of Celly, Emergent Social Networks, http://cel.ly