Human-centered interfaces from an interface-centered human:
Food through the Interface
Improving public health and promoting food justice by providing the right information at the right time.
CHALLENGE
Promote food justice and food security for the most vulnerable, across the food system.
PROJECT
Research into how the interface of food affects production and consumption led to Fare+Square, a mobile app and digital assistant that leverages AI to provide grocery pricing, recommendations and nutritional assistance program information through simple notifications and gamification interfaces.
DETAILS
Harvard Graduate School of Design (GSD) + School of Engineering and Applied Sciences (SEAS)
Collaborative Design Engineering Studio
Chuck Hoberman, Peter Stark and Jock Herron, instructors
Zeerak Ahmed, Terra Moran and Karen Su, team members
AFTER
Published in Food Systems.

Food through the Interface is part of a year-long research and design exploration of food systems, across scales ranging from the molecular to macroeconomic. Based on an understanding of food as oriented around the consumer interface, the project aims to challenge expectations about distribution and consumption — reshaping both eating habits and production models.

From system to shelf, and back again
As a team, we first set out to explore the food system. Our research explored industrial agriculture, food processing and production as well as local farms, markets and restaurants. We spoke to farmers, business owners, government officials and everyday consumers.
Beginning with a basic assumption about food as a closed system of production and consumption, we rapidly found that it gets far more complicated — the food system contains many elements and diverging paths.
And while we uncovered a very complicated system with many problems, we kept returning to the most basic food experience for most people in the US: the time right before food purchase, typically in front of a grocery store shelf.


Faced with staggering amounts of choice, how do consumers decide what to eat? Our research revealed that many people struggle to eat cheaply and healthily, particularly those with limited means or facing decision fatigue after long work days. Visits to stores like Dorchester’s Daily Table revealed that even in the best contexts, shopping well is hard.
Mapping the system in and around the moment of decision helped us to understand the many interactions involved. Even places like Boston public schools became key points for promoting food availability, through their school lunch and weekend take-home programs.


Finding the right place and way to intervene
Deciding that we wanted to work at the interface of food, we sketched a broad range of ideas. These ranged from healthy food vending machines and smarter refrigerators to augmented grocery store experiences, and even new value-added tax programs that would return benefits to consumers. We felt in the end, however, that what was needed most was a digital assistant for grocery shopping: a low-touch information service that would be specifically designed for low-income consumers on nutritional assistance programs, but open to use by all consumers. We wanted this service, Fare+Square, to provide the right information at the right time.





Designing for an augmented intelligence
Looking to a near-term future, we designed Fare+Square as an intelligent digital assistant, determining the types of data it might leverage to better understand a consumer’s preferences and needs.
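A sketch of the kind of scoring logic such an assistant might use is below. The item fields, weights, and scoring formula are illustrative assumptions for this write-up, not the team's actual model:

```python
# Hypothetical sketch of how a grocery assistant like Fare+Square
# might rank items for a shopper. All names and weights here are
# invented for illustration.
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    price: float          # shelf price in dollars
    nutrition: float      # 0..1 nutrition-quality score
    snap_eligible: bool   # covered by a nutritional assistance program

def score(item: Item, budget: float, on_snap: bool) -> float:
    """Higher is better: nutritious, affordable, and program-eligible."""
    affordability = max(0.0, 1.0 - item.price / budget)
    program_bonus = 0.2 if (on_snap and item.snap_eligible) else 0.0
    return 0.5 * item.nutrition + 0.3 * affordability + program_bonus

def recommend(items, budget, on_snap, k=3):
    """Return the top-k items for this shopper's context."""
    return sorted(items, key=lambda i: score(i, budget, on_snap),
                  reverse=True)[:k]
```

In practice, a notification could surface only the single top-scoring item near the shopper's current aisle, keeping the interaction low-touch.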





Seeking to better understand Fare+Square’s impact on food systems as a whole, we also built a simulation model. Exploring the effect of notifications and bonus programs, we believed that the app, applied at scale, could help address problems within urban food systems.
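A toy version of that kind of simulation might look like the following. The baseline purchase probability and per-notification lift are invented parameters for illustration, not figures from our model:

```python
import random

def simulate(shoppers=1000, weeks=10, base=0.3, nudge=0.05, seed=0):
    """Monte Carlo sketch: fraction of healthy purchases without and
    with app notifications. `base` is an assumed baseline probability
    that a shopper picks the healthier option; `nudge` is an assumed
    lift from a well-timed notification."""
    rng = random.Random(seed)
    trials = shoppers * weeks

    def run(p):
        healthy = sum(rng.random() < p for _ in range(trials))
        return healthy / trials

    return run(base), run(min(1.0, base + nudge))

baseline, with_app = simulate()
```

Even a model this simple makes the scale argument concrete: a small per-decision nudge, multiplied across thousands of weekly shopping trips, shifts aggregate purchasing measurably.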
Still curious? Check out some of my other work:
Busy Bee
Measuring digital productivity and human movement in a creative studio with a new tool for spatial analysis and data visualization.
CHALLENGE
Use new technologies and tools to measure and visualize the relationship between movement and productivity in a studio environment.
PROJECT
Busy Bee, a system that uses physical computation, environmental sensors, computer vision and scripts to track circulation and digital file creation over time — and provides an interface for interactive exploration of the data.
DETAILS
Columbia Graduate School of Architecture, Planning and Preservation (GSAPP)
Measure
Lorenzo Villaggi and Carlo Bailey, instructors
Alex Rosenthal, Tao Yu and Zhiwen Zhang, team members
Spring 2016
AFTER
Published in Columbia GSAPP’s Abstract 2016.

The system monitors circulation, digital productivity and ambient environmental factors in one studio of Avery Hall, and visualizes this data to enable new design interventions. A step toward the Internet of Things, Busy Bee represents one part of a broader feedback loop, where architecture and the space it defines share a dynamic and reciprocal relationship.
The physical computing process in Busy Bee involves real-world data collection and digital data processing. Within the studio, onboard cameras in Raspberry Pi units record human movement (OpenCV-enabled computer vision), while other sensors monitor sound, humidity and temperature (Python interpretation of GPIO input); studio workstations track file generation and associated metadata (Python scripting). Collected data is transmitted via digital cloud services (Google Sheets, Dropbox and Flux) to Grasshopper. The resulting visualization — presented in three spatial dimensions as well as over time — offers a dynamic and interactive interface for exploring the data.
Busy Bee was implemented over six weeks as part of a half-semester course. Within the team, my focus was on sensor data collection as well as data processing and visualization creation.



Event Detection Interface
Enabling smarter search of big data with better UI and UX.
CHALLENGE
Design an interface for internal data scientists and external clients to collect, manage and analyze building and infrastructure performance data.
PROJECT
Prototype search UI and UX that flexibly accommodate a range of users and tasks — from simple visual comparisons to advanced boolean or machine learning searches — while defining consistent graphic elements and design patterns for the rest of the software.
DETAILS
MKThink and RoundhouseOne
Summer Fellowship
Summer 2016


The Event Detection Interface was developed as one component of a broader suite of online web tools built to collect, manage, analyze and visualize built environment data for educational and institutional clients. The tool needed to accommodate interested but novice stakeholders as well as expert researchers, allowing all levels of users to analyze time-series data from environmental sensors and building systems — producing concrete insight in the form of possible efficiency and optimization measures.
Through an iterative process, and in collaboration with potential end users, I took an existing proof-of-concept and designed a complete workflow and interface. I also established a graphic language and design patterns that could be extended to the rest of the software suite.
The design of the Event Detection Interface prototype included first-time onboarding, data loading and multi-faceted search options — including an advanced search feature that allowed for boolean logic and graphical descriptions. The GUI adopted the concept of a central “canvas,” on top of which users could place rows of time-series data. Floating panels provided access to search parameters, results and a history of saved searches.
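The core of such an event search can be sketched as a predicate over time-series samples, with boolean combinators layered on top. This is an illustrative sketch of the search concept, not the production implementation:

```python
def find_events(series, predicate, min_len=1):
    """Return (start, end) index ranges where `predicate` holds for
    at least `min_len` consecutive samples. `end` is exclusive."""
    events, start = [], None
    for i, value in enumerate(series):
        if predicate(value):
            if start is None:
                start = i
        elif start is not None:
            if i - start >= min_len:
                events.append((start, i))
            start = None
    if start is not None and len(series) - start >= min_len:
        events.append((start, len(series)))
    return events

# Boolean composition of search criteria, mirroring the advanced
# search feature's AND/OR operators.
AND = lambda p, q: lambda v: p(v) and q(v)
OR = lambda p, q: lambda v: p(v) or q(v)
```

For example, "temperature above 74°F for at least three readings" becomes a single `find_events` call, and compound criteria are built by composing predicates rather than by adding special-case query code.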

