
dc.contributor.advisor	Bonatti, Alessandro
dc.contributor.advisor	Hadfield-Menell, Dylan
dc.contributor.advisor	Maskin, Eric
dc.contributor.advisor	Parkes, David
dc.contributor.author	Haupt, Andreas A.
dc.date.accessioned	2025-04-14T14:05:54Z
dc.date.available	2025-04-14T14:05:54Z
dc.date.issued	2025-02
dc.date.submitted	2025-03-24T19:29:14.215Z
dc.identifier.uri	https://hdl.handle.net/1721.1/159105
dc.description.abstract	Consumer applications employ algorithms to deliver personalized experiences to users in domains such as search, e-commerce, online streaming, and social media, shaping how users spend their time and money. This dissertation studies the design of such personalization algorithms and the economic consequences of their deployment. The first chapter focuses on the impact of reward-signal precision on online learning algorithms frequently used for personalization. Reward signals are precise when individual measurement is accurate and heterogeneity is low. While some algorithms, which we call "risk-averse", favor experiences that yield more precise reward signals, and hence favor measurability and homogeneity, others, in the limit, choose experiences independently of the precision of their associated reward signals. The third chapter analyzes how preference measurement error differentially affects user groups under optimal personalization. If such measurement error is symmetric, welfare maximization requires delivering majority-preferred experiences at a rate exceeding their proportion in the user population, hence increasing concentration. However, asymmetric preference measurement errors may arise from users' costly actions to reduce measurement error. Participants in a survey of TikTok users state that they engage in such costly actions. The fifth chapter studies, by introducing a new desideratum for market design, how to achieve personalization without infringing on user privacy. Contextual privacy demands that all (preference) information elicited by an algorithm be necessary for computing an outcome of interest in all possible configurations of users' information. This property is demanding, as it requires that no two pieces of information can jointly, but not unilaterally, influence the outcome. Algorithms can protect the privacy of users who are queried late and whose information is not used to compute public statistics of the user population, thereby achieving the relaxed notion of maximal contextual privacy. Two brief chapters introduce new models of human-machine interaction: the first examines the design of generative models, while the second proposes stated regret over past consumption as a new data modality and presents a corresponding data-collection tool.
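
The abstract's two central quantitative claims can be made concrete with a small numerical sketch. The Python script below is not taken from the dissertation: the mean-variance bandit index standing in for a "risk-averse" learning rule, the epsilon-greedy exploration, the Gaussian model of symmetric preference measurement error, and all parameter values are illustrative assumptions.

# Illustrative sketch only; neither this code nor its parameter choices come from the thesis.
from math import erf, log, sqrt

import numpy as np

rng = np.random.default_rng(0)

def pull_shares(risk_weight, horizon=5000, eps=0.1):
    """Share of pulls each arm receives under a mean-variance index rule."""
    means = np.array([0.5, 0.5])  # both experiences have the same expected reward
    stds = np.array([0.05, 0.5])  # arm 0 yields a precise signal, arm 1 a noisy one
    counts = np.ones(2)
    sums = rng.normal(means, stds)  # one initial pull of each arm
    sumsq = sums ** 2
    for _ in range(horizon - 2):
        m = sums / counts
        v = np.maximum(sumsq / counts - m ** 2, 1e-12)
        # risk_weight = 0 is a risk-neutral greedy rule; risk_weight > 0
        # penalizes arms whose reward signal is imprecise
        idx = m - risk_weight * np.sqrt(v)
        a = rng.integers(2) if rng.random() < eps else int(np.argmax(idx))
        r = rng.normal(means[a], stds[a])
        counts[a] += 1
        sums[a] += r
        sumsq[a] += r ** 2
    return counts / counts.sum()

def majority_service_rate(p=0.7, sigma=0.5):
    """Rate at which the majority-preferred experience A is delivered when a
    welfare-maximizing platform observes a symmetric Gaussian signal of tastes."""
    def Phi(x):
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))
    # signal s ~ N(1, sigma^2) for A-lovers and N(0, sigma^2) for B-lovers;
    # the posterior favors A exactly when s exceeds the threshold t
    t = 0.5 - sigma ** 2 * log(p / (1 - p))
    return p * (1 - Phi((t - 1) / sigma)) + (1 - p) * (1 - Phi(t / sigma))

print("risk-neutral pull shares:", pull_shares(risk_weight=0.0))
print("risk-averse pull shares: ", pull_shares(risk_weight=1.0))
print("majority share 0.70 -> service rate", round(majority_service_rate(), 3))

Under these assumptions, the risk-averse rule selects the precise-signal arm except when exploring, since its index discounts the noisy arm by roughly its reward standard deviation, whereas the risk-neutral index ignores precision altogether; and the majority-preferred experience is served at a rate of about 0.73, above its 0.70 population share, consistent with the concentration effect described in the abstract.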
dc.publisher	Massachusetts Institute of Technology
dc.rights	Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
dc.rights	Copyright retained by author(s)
dc.rights.uri	https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.title	The Economic Engineering of Personalized Experiences
dc.type	Thesis
dc.description.degree	Ph.D.
dc.contributor.department	Massachusetts Institute of Technology. Institute for Data, Systems, and Society
dc.identifier.orcid	https://orcid.org/0000-0002-2952-4188
mit.thesis.degree	Doctoral
thesis.degree.name	Doctor of Philosophy

