PlateClick: Bootstrapping food preferences through an adaptive visual interface

Publication: Contribution to journal › Conference article › Research › peer-reviewed

Standard

PlateClick : Bootstrapping food preferences through an adaptive visual interface. / Yang, Longqi; Cui, Yin; Zhang, Fan; Pollak, John P.; Belongie, Serge; Estrin, Deborah.

In: International Conference on Information and Knowledge Management, Proceedings, 17.10.2015, pp. 183-192.


Harvard

Yang, L, Cui, Y, Zhang, F, Pollak, JP, Belongie, S & Estrin, D 2015, 'PlateClick: Bootstrapping food preferences through an adaptive visual interface', International Conference on Information and Knowledge Management, Proceedings, pp. 183-192. https://doi.org/10.1145/2806416.2806544

APA

Yang, L., Cui, Y., Zhang, F., Pollak, J. P., Belongie, S., & Estrin, D. (2015). PlateClick: Bootstrapping food preferences through an adaptive visual interface. International Conference on Information and Knowledge Management, Proceedings, 183-192. https://doi.org/10.1145/2806416.2806544

Vancouver

Yang L, Cui Y, Zhang F, Pollak JP, Belongie S, Estrin D. PlateClick: Bootstrapping food preferences through an adaptive visual interface. International Conference on Information and Knowledge Management, Proceedings. 2015 Oct 17;183-192. https://doi.org/10.1145/2806416.2806544

Author

Yang, Longqi ; Cui, Yin ; Zhang, Fan ; Pollak, John P. ; Belongie, Serge ; Estrin, Deborah. / PlateClick : Bootstrapping food preferences through an adaptive visual interface. In: International Conference on Information and Knowledge Management, Proceedings. 2015 ; pp. 183-192.

BibTeX

@inproceedings{d2da86fc8b6b4e979f61225bf9d761a5,
title = "PlateClick: Bootstrapping food preferences through an adaptive visual interface",
abstract = "Food preference learning is an important component of wellness applications and restaurant recommender systems as it provides personalized information for effective food targeting and suggestions. However, existing systems require some form of food journaling to create a historical record of an individual's meal selections. In addition, current interfaces for food or restaurant preference elicitation rely extensively on text-based descriptions and rating methods, which can impose high cognitive load, thereby hampering wide adoption. In this paper, we propose PlateClick, a novel system that bootstraps food preference using a simple, visual quiz-based user interface. We leverage a pairwise comparison approach with only visual content. Using over 10,028 recipes collected from Yummly, we design a deep convolutional neural network (CNN) to learn the similarity distance metric between food images. Our model is shown to outperform state-of-the-art CNN by 4 times in terms of mean Average Precision. We explore a novel online learning framework that is suitable for learning users' preferences across a large scale dataset based on a small number of interactions (≤ 15). Our online learning approach balances exploitation-exploration and takes advantage of food similarities using preference-propagation in locally connected graphs. We evaluated our system in a field study of 227 anonymous users. The results demonstrate that our method outperforms other baselines by a significant margin, and the learning process can be completed in less than one minute. In summary, PlateClick provides a light-weight, immersive user experience for efficient food preference elicitation.",
keywords = "Food preference elicitation, Online learning, Visual interface",
author = "Longqi Yang and Yin Cui and Fan Zhang and Pollak, {John P.} and Serge Belongie and Deborah Estrin",
note = "Publisher Copyright: {\textcopyright} 2015 ACM.; 24th ACM International Conference on Information and Knowledge Management, CIKM 2015 ; Conference date: 19-10-2015 Through 23-10-2015",
year = "2015",
month = oct,
day = "17",
doi = "10.1145/2806416.2806544",
language = "English",
pages = "183--192",
journal = "International Conference on Information and Knowledge Management, Proceedings",

}
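The abstract above describes learning a similarity distance metric between food images with a deep convolutional neural network. Below is a minimal sketch of that general idea, assuming a tiny stand-in network, a standard triplet margin loss, and random placeholder tensors instead of the Yummly images; it is not the authors' implementation, only one common way to realize such a metric.

# Sketch: embedding-based food image similarity with a triplet margin loss.
# Network size, margin, and the random stand-in data are assumptions for the example.
import torch
import torch.nn as nn

class FoodEmbeddingNet(nn.Module):
    """Tiny CNN mapping a 3x64x64 image to a 32-d embedding."""
    def __init__(self, dim=32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, dim)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = FoodEmbeddingNet()
loss_fn = nn.TripletMarginLoss(margin=1.0)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Placeholder batch: anchor/positive would share a recipe, negative would not.
anchor, positive, negative = (torch.randn(8, 3, 64, 64) for _ in range(3))

for _ in range(5):  # a few illustrative optimization steps
    optimizer.zero_grad()
    loss = loss_fn(model(anchor), model(positive), model(negative))
    loss.backward()
    optimizer.step()

# At inference time, food similarity is the Euclidean distance between
# embeddings, e.g. torch.cdist(model(a), model(b)).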

RIS

TY - GEN

T1 - PlateClick

T2 - 24th ACM International Conference on Information and Knowledge Management, CIKM 2015

AU - Yang, Longqi

AU - Cui, Yin

AU - Zhang, Fan

AU - Pollak, John P.

AU - Belongie, Serge

AU - Estrin, Deborah

N1 - Publisher Copyright: © 2015 ACM.

PY - 2015/10/17

Y1 - 2015/10/17

N2 - Food preference learning is an important component of wellness applications and restaurant recommender systems as it provides personalized information for effective food targeting and suggestions. However, existing systems require some form of food journaling to create a historical record of an individual's meal selections. In addition, current interfaces for food or restaurant preference elicitation rely extensively on text-based descriptions and rating methods, which can impose high cognitive load, thereby hampering wide adoption. In this paper, we propose PlateClick, a novel system that bootstraps food preference using a simple, visual quiz-based user interface. We leverage a pairwise comparison approach with only visual content. Using over 10,028 recipes collected from Yummly, we design a deep convolutional neural network (CNN) to learn the similarity distance metric between food images. Our model is shown to outperform state-of-the-art CNN by 4 times in terms of mean Average Precision. We explore a novel online learning framework that is suitable for learning users' preferences across a large scale dataset based on a small number of interactions (≤ 15). Our online learning approach balances exploitation-exploration and takes advantage of food similarities using preference-propagation in locally connected graphs. We evaluated our system in a field study of 227 anonymous users. The results demonstrate that our method outperforms other baselines by a significant margin, and the learning process can be completed in less than one minute. In summary, PlateClick provides a light-weight, immersive user experience for efficient food preference elicitation.

AB - Food preference learning is an important component of wellness applications and restaurant recommender systems as it provides personalized information for effective food targeting and suggestions. However, existing systems require some form of food journaling to create a historical record of an individual's meal selections. In addition, current interfaces for food or restaurant preference elicitation rely extensively on text-based descriptions and rating methods, which can impose high cognitive load, thereby hampering wide adoption. In this paper, we propose PlateClick, a novel system that bootstraps food preference using a simple, visual quiz-based user interface. We leverage a pairwise comparison approach with only visual content. Using over 10,028 recipes collected from Yummly, we design a deep convolutional neural network (CNN) to learn the similarity distance metric between food images. Our model is shown to outperform state-of-the-art CNN by 4 times in terms of mean Average Precision. We explore a novel online learning framework that is suitable for learning users' preferences across a large scale dataset based on a small number of interactions (≤ 15). Our online learning approach balances exploitation-exploration and takes advantage of food similarities using preference-propagation in locally connected graphs. We evaluated our system in a field study of 227 anonymous users. The results demonstrate that our method outperforms other baselines by a significant margin, and the learning process can be completed in less than one minute. In summary, PlateClick provides a light-weight, immersive user experience for efficient food preference elicitation.

KW - Food preference elicitation

KW - Online learning

KW - Visual interface

UR - http://www.scopus.com/inward/record.url?scp=84958234089&partnerID=8YFLogxK

U2 - 10.1145/2806416.2806544

DO - 10.1145/2806416.2806544

M3 - Conference article

AN - SCOPUS:84958234089

SP - 183

EP - 192

JO - International Conference on Information and Knowledge Management, Proceedings

JF - International Conference on Information and Knowledge Management, Proceedings

Y2 - 19 October 2015 through 23 October 2015

ER -
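The abstract also outlines an online learning loop that balances exploitation and exploration and propagates preferences through a locally connected graph of similar foods, converging within roughly 15 pairwise interactions. The following is a minimal sketch of that kind of loop, assuming a k-nearest-neighbour graph over stand-in embeddings and simple hypothetical score-update rules; it is not the paper's algorithm.

# Sketch: online pairwise preference elicitation with preference propagation
# over a locally connected kNN graph. All constants and update rules are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_items, k, n_rounds = 200, 5, 15            # small numbers for the example
embeddings = rng.normal(size=(n_items, 16))  # stand-in for CNN embeddings

# Locally connected graph: each item linked to its k nearest neighbours.
dists = np.linalg.norm(embeddings[:, None] - embeddings[None, :], axis=-1)
neighbours = np.argsort(dists, axis=1)[:, 1:k + 1]

scores = np.zeros(n_items)   # current preference estimate per item
shown = np.zeros(n_items)    # how often each item has been displayed

def next_pair(epsilon=0.3):
    """Exploit a top-scoring item or explore an under-shown one; pair it with a neighbour."""
    if rng.random() < epsilon:
        a = int(np.argmin(shown + rng.random(n_items) * 1e-3))   # exploration
    else:
        a = int(np.argmax(scores + rng.random(n_items) * 1e-3))  # exploitation
    b = int(rng.choice(neighbours[a]))
    return a, b

def update(chosen, rejected, lr=1.0, decay=0.5):
    """Reward the chosen item and propagate (attenuated) to its graph neighbours."""
    scores[chosen] += lr
    scores[rejected] -= lr
    scores[neighbours[chosen]] += lr * decay
    scores[neighbours[rejected]] -= lr * decay

for _ in range(n_rounds):                    # <= 15 interactions, as in the abstract
    a, b = next_pair()
    shown[[a, b]] += 1
    chosen = a if rng.random() < 0.5 else b  # stand-in for the user's click
    update(chosen, a if chosen == b else b)

print("Top-5 inferred favourites:", np.argsort(scores)[::-1][:5])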
