I am a big fan of human-centered design, with its focus on customer empathy, ethnographic research, and rapid prototyping, which help companies zero in on what customers actually need. But I am also excited about “stagecraft,” a related approach that fakes a new product or service to get real-life reactions from users early in the development process. It uses all the tricks of theater and film to fool an audience into believing that what they are seeing, hearing, or even touching is the real thing.
Employed early in the development process, stagecraft can yield critical insights that you might not discover until much later on. In the traditional process, testing with live customers usually does not occur until a product has been fully specced and a prototype has been built. With human-centered design, live testing begins early and uses a minimum viable product that has just enough features to engage users and elicit feedback. In stagecraft, the early test model is much more elaborate and has all the bells and whistles of the real thing — or appears to. This is important for getting the design right when you are rapidly developing a complex product or service.
I have seen stagecraft applied successfully in recent projects at Continuum Innovation, a Boston-based firm with which I am affiliated. One was a recent project for the Science and Technology Directorate of the U.S. Department of Homeland Security (DHS). The agency used stagecraft to learn, very early, how new technologies could be used in equipment for first responders. This was a case where real-life testing was impossible; a poor prototype could endanger human lives. So, after riding around with Boston-area emergency medical technicians (EMTs) to get the basics, Continuum staged emergencies to test new ideas and gain critical insights into what would and would not work in the field.
Picture this: Through the windshield, two EMTs see cars pull over to make way for their self-driving ambulance. A portion of the windshield is obscured by a data display, showing a map of the crash site and indicating the locations of the most seriously injured passengers and their vital signs, which are transmitted from skin patches that have been applied by police officers or firefighters on the scene. At the site, the EMTs roll out. One crouches above a victim and glances at a display on a cuff wrapped around her own wrist, seeing that the patient’s blood pressure has dropped — a sign of hypovolemic shock. The EMT straps an oxygen mask on the victim, then shakes her wrist to activate the cuff’s built-in cellphone and calls her colleague to tell him that this victim needs to be transported immediately.
It’s all staged. The driverless van is a mockup, with a foam-core dashboard. The view through the windshield is a projection. The crash victim is a dummy. Behind a two-way mirror, a software engineer pounds furiously on a keyboard, entering blood pressure, pulse rate, and the other vital signs that are displayed on the EMT’s wrist cuff. The ambulance technicians are real EMTs who are doing their best to replicate how they would handle this situation on the streets of Boston.
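The rig behind the mirror is a classic “Wizard of Oz” setup: the interface the EMT sees is real, but the intelligence behind it is a human. The idea can be sketched in a few lines of Python. All of the names here (`VitalsWizard`, `CuffDisplay`, the shock threshold) are hypothetical illustrations, not Continuum’s actual software:

```python
# A minimal Wizard-of-Oz sketch of the staged vital-signs rig: a hidden
# operator types in readings, and the "smart" cuff display simply renders
# whatever the operator entered. Names and thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class Vitals:
    systolic: int   # blood pressure, mmHg
    diastolic: int
    pulse: int      # beats per minute

class CuffDisplay:
    """What the EMT sees: a readout, plus a warning on low blood pressure."""
    SHOCK_SYSTOLIC = 90  # illustrative cutoff for flagging possible shock

    def __init__(self):
        self.history = []

    def show(self, v: Vitals) -> str:
        text = f"BP {v.systolic}/{v.diastolic}  HR {v.pulse}"
        if v.systolic < self.SHOCK_SYSTOLIC:
            text += "  !! POSSIBLE HYPOVOLEMIC SHOCK"
        self.history.append(text)
        return text

class VitalsWizard:
    """The engineer behind the mirror: types values, the display updates."""
    def __init__(self, display: CuffDisplay):
        self.display = display

    def enter(self, systolic: int, diastolic: int, pulse: int) -> str:
        return self.display.show(Vitals(systolic, diastolic, pulse))

display = CuffDisplay()
wizard = VitalsWizard(display)
print(wizard.enter(120, 80, 72))   # a normal reading
print(wizard.enter(85, 55, 130))   # dropping BP triggers the warning
```

The point of such a rig is that nothing downstream of the display has to exist yet; the operator is the sensor network, and the test still measures how the EMT reacts to the information.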
By staging emergencies repeatedly, designers quickly discovered important things about proposed ideas — including wrong assumptions. This was pay dirt for DHS. In the past, the agency’s attempts to design gear around emerging technology had missed the mark. In those cases, it had hosted workshops with vendors, surveyed police, EMTs, and fire departments, and run focus groups to get requirements, which were then turned over to engineers and designers to work up prototypes. This took a lot of time and money and left DHS open to costly errors.
In Continuum’s work for DHS, errors were discovered early. Here’s one example. In interviews, leaders of first-responder organizations as well as police officers and firefighters were excited about wearable heads-up displays. But in the stagecraft exercises officers found that the displays limited their ability to focus on the scene — a deal breaker. After the staged exercises, DHS was confident that it had the insights needed to begin working on actual design ideas.
In an assignment for Audi, the stagecraft was even more elaborate. The idea was a car-sharing service called Audi on Demand, part of a broader strategy for the car maker to become a provider of mobility services. Typically, the idea would have been handed off to an R&D department, which would have spent months or years researching rental, car-sharing, and ride-sharing services, gathering customer data, and testing insights with pilot programs. Instead, Audi skipped ahead, creating a Potemkin village — an elaborate charade to test how the service might feel to actual customers.
Audi on Demand promised to deliver the model of the customer’s choice wherever the customer wanted, within the San Francisco Bay Area. Customers could also drop the car off wherever they wanted. Continuum employees set up shop in Silicon Valley and built a list of wealthy consumers who liked driving a high-end car but hated the hassles of ownership. Customers got a smartphone app to select a car, then unlock and start it when it arrived. Knowledgeable “Audi representatives” delivered the cars. An 800 number connected customers to a 24-hour help desk — in reality a Continuum employee with a burner phone.
It was critically important to create a “high-fidelity” front end to convince demanding consumers that this was a live service. The charade worked: Customers never knew that, behind the apps, the service was all duct tape and baling wire.
And Audi learned valuable lessons. First, it discovered that price was not a significant barrier for these customers; as they used the service, their perceptions of value became much more positive. But the company also discovered that time was a more critical factor in satisfaction — which it learned by deliberately making some late deliveries and tracking customer reactions. The message: customers would forgive Audi for being 15 minutes late, but in return for granting that leeway, they expected a 30-minute grace period for themselves. The team also learned early on that it could not deliver a good parking experience because it could not staff garages everywhere. So that idea died. Today, Audi on Demand is running as a real service (on a limited basis) in the Bay Area.
Stagecraft is not necessary for every product or service. It works best in situations where products are complex or will be used in uncontrolled environments. Online services and apps are very good candidates because they are relatively easy to fake and because even the best designers can’t anticipate what actual customers will do. A major financial services company that used stagecraft to test a new online service found out it was not going to fly. The stagecraft cost thousands of dollars. Building a full beta version would have cost millions.
Stagecraft allows managers and executives to think more ambitiously about major innovations. On a stage, sometimes covering substantial physical space, it’s possible to test the entire customer experience. By putting the “real thing” into the hands of real customers early on, you get to the right design faster. Equally important, stagecraft keeps you from going too far down a wrong path. At a time when innovation is more critical than ever — and when missing the mark could open the door to disruptors — stagecraft is a handy tool to have in your tool kit.