Sketching Interfaces

Generating code from low fidelity wireframes

By Benjamin Wilkins

"The time required to test an idea should be zero." This was the very first sentence I wrote when considering a vision for Airbnb's design tools team. We believe that, within the next few years, emerging technology will allow teams to design new products in an expressive and intuitive way, while simultaneously eliminating hurdles from the product development process.

As it stands now, every step in the design process and every artifact produced is a dead end. Work stops whenever one discipline finishes a portion of the project and passes responsibility to another discipline. Projects progress from stakeholder meetings to design to engineering; requirements become explorations, explorations become mockups and prototypes, and these are handed off to developers to become final products. But each of these cumbersome steps is, at its core, a translation of shared meaning to a different medium in progression toward a common goal, with skilled experts in each domain acting as translators.

So how do we streamline this process to make our vision statement true? Our team has begun exploring methods to bring testing time to zero. As we learn and build, we will be sharing what we’re working on here.

Sketch to product

Sketching seemed like the natural place to start. For interface designers, sketching is an intuitive way to express a concept. We wanted to see what it would look like to skip a few steps in the product development lifecycle and translate our sketches directly into a finished product.

Live-demoing at a team meeting

Airbnb’s design system is well documented, and each component within the system has been named. Our working theory was that if machine learning algorithms can classify thousands of complex handwritten symbols, such as Chinese characters, with a high degree of accuracy, then we should be able to teach a machine to recognize the 150 components in our system.
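The post doesn't go into implementation details, but a minimal sketch of such a classifier, assuming PyTorch and 64×64 grayscale crops of sketched symbols, might look like the following. The architecture, layer sizes, and input format here are illustrative assumptions, not Airbnb's actual model:

```python
# Minimal sketch of a symbol classifier, assuming PyTorch and
# 64x64 grayscale crops of hand-drawn component symbols.
# The layer sizes are illustrative assumptions, not Airbnb's model.
import torch
import torch.nn as nn

NUM_COMPONENTS = 150  # named components in the design system

class SketchClassifier(nn.Module):
    def __init__(self, num_classes: int = NUM_COMPONENTS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),  # 1 channel: grayscale sketch
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(64 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = SketchClassifier()
crop = torch.rand(1, 1, 64, 64)    # one sketched symbol, normalized to [0, 1]
label = model(crop).argmax(dim=1)  # index of the most likely component
```

Conceptually this is the same setup used for handwritten-character recognition: a small convolutional network mapping a cropped drawing to one of a fixed set of symbol classes.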

We built an initial prototype using about a dozen hand-drawn components as training data, open source machine learning algorithms, and a small amount of intermediary code to render components from our design system into the browser. We were pleasantly surprised with the result.
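That intermediary code isn't shown in the post. A hypothetical version of the glue, mapping predicted component labels and their locations on the sketch to markup the browser can render, could be as simple as this sketch (the component names, props, and JSX-like output are assumptions for illustration):

```python
# Hypothetical glue code: turn classifier output into renderable markup.
# Component names and props are illustrative; Airbnb's actual
# design-system components and rendering pipeline are not public.
DETECTIONS = [
    # (predicted component, bounding box as x, y, width, height)
    ("NavBar",        (0,   0,   320, 48)),
    ("ImageCard",     (16,  64,  288, 180)),
    ("PrimaryButton", (16,  260, 288, 44)),
]

def to_markup(detections):
    """Emit JSX-like markup, positioning each component by its sketch location."""
    lines = ["<Screen>"]
    for name, (x, y, w, h) in detections:
        lines.append(f'  <{name} x={{{x}}} y={{{y}}} width={{{w}}} height={{{h}}} />')
    lines.append("</Screen>")
    return "\n".join(lines)

print(to_markup(DETECTIONS))
```

The appeal of this approach is that the hard rendering work is already done by the design system itself; the glue only needs to decide which component goes where.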

This system has already demonstrated massive potential. We’ve experimented using the same technology to live-code prototypes from whiteboard drawings, to translate high fidelity mocks into component specifications for our engineers, and to translate production code into design files for iteration by our designers.
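None of these pipelines are described in detail in the post. As one illustration, the mocks-to-specifications direction might reduce to serializing detected components into a machine-readable spec for engineers; the schema below is an assumption for illustration only:

```python
import json

# Hypothetical sketch of the mock-to-spec direction: detected components
# (type plus frame) serialized into a spec engineers can consume.
# The field names and schema are assumptions, not a documented format.
detections = [
    ("ImageCard",     (16, 64,  288, 180)),
    ("PrimaryButton", (16, 260, 288, 44)),
]

spec = {
    "components": [
        {"type": name, "frame": {"x": x, "y": y, "width": w, "height": h}}
        for name, (x, y, w, h) in detections
    ]
}
print(json.dumps(spec, indent=2))
```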

An ongoing exploration

As the design systems movement gains steam and interfaces become more standardized, we believe that artificial intelligence assisted design and development will be baked into the next generation of tooling. We’re excited to share our work with the broader community of designers and developers who are exploring this emerging field, and to see where this leads. Stay tuned for future updates as we continue to experiment and build. In the next post of the series, Design Tools Manager Lucas Smith will dive into some of the research and literature that informs our approach.

For more about our thinking around our vision, watch the video below of a talk I gave on symbolic thinking and design systems.

Many thanks to the other members of the Design Technology team for helping drive this project forward: Jon Gold, Gavin Owens, David Chen, and Lucas Smith.

Benjamin Wilkins is a Design Technologist at Airbnb. You can reach him on Twitter at @thatbenlifetho.
