Reconciling Personalization with Privacy

Personalization has become a household term in the online consumer market. E-tailers, search engines, and social networks all want to tailor their services to their users. Many surveys show that Internet users appreciate a personalized experience. At the same time, businesses that personalize their services see increased profits. Around 20-30% of Amazon purchases and 60% of Netflix views are a result of personalized recommendations.

The word “personalization,” however, points to its Achilles’ heel: to benefit from a personalized experience, users need to disclose personal information. This poses a problem, since consumers today are very concerned about their privacy and hesitant to disclose information about themselves. The Q1 2012 TRUSTe Privacy Index shows that 90% of U.S. adults worry about their privacy online. ISR professor Alfred Kobsa works on reconciling the twofold benefits of personalization (to users and to businesses alike) with Internet users’ privacy concerns. In this work he collaborates with industry, including Samsung US R&D Center, Ericsson Research, and Microsoft Research.

One line of his research is to study how users can be assisted in their disclosure decisions by means of informative justifications for revealing personal information. For instance, one can explain one or more of the following to users:

■ How is the information used to personalize the user’s experience? This strategy is in line with the “Consumer Privacy Bill of Rights” that was recently proposed by the White House. This guideline calls on industry to define a set of practices that will give consumers more information about, and control over, their privacy. Specifically, the framework suggests that “companies should provide clear descriptions of [...] why they need the data, how they will use it.”

■ What are the benefits of disclosure? People engage in a “privacy calculus,” trading off the perceived benefits of disclosure against its potential risks. Justifications describing the benefits may tip the scale in favor of disclosure or non-disclosure.

■ What did others do? Indicating how many other users have disclosed the requested information allows people to eschew their individual privacy calculus by conforming to the behavior of others.
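The privacy calculus described above can be sketched as a simple threshold decision. The function and its parameters below are illustrative assumptions for exposition, not a model taken from the research:

```python
# Toy sketch (not from the article) of the "privacy calculus":
# a user discloses an item of personal information when its perceived
# benefit outweighs its perceived risk. A justification message is
# modeled as a shift in perceived benefit; the findings discussed in
# this article suggest that shift can turn out to be negative.

def privacy_calculus(perceived_benefit, perceived_risk, justification_shift=0.0):
    """Return True if the user would disclose the item."""
    return perceived_benefit + justification_shift > perceived_risk

# A clear benefit tips the scale toward disclosure...
print(privacy_calculus(perceived_benefit=0.8, perceived_risk=0.3))  # True
# ...while a justification that implicitly signals risk can tip it back.
print(privacy_calculus(perceived_benefit=0.8, perceived_risk=0.3,
                       justification_shift=-0.6))  # False
```

The point of the sketch is only that the decision is a trade-off: anything that lowers perceived benefit or raises perceived risk, including the justification itself, can flip the outcome.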

Do these justifications indeed decrease perceived privacy threats and increase disclosure? Recent work by Ph.D. student Bart Knijnenburg (A. Kobsa, advisor) shows that the answer is not that simple. In an online user study with a mock-up of a mobile recommender system, Knijnenburg and Kobsa showed that users appreciate the disclosure help provided by the justifications. At the same time, though, these justifications decrease users’ trust, their satisfaction, and their level of disclosure. This research has been described in ISR technical report UCI-ISR-12-1, which is available on the ISR website.

The likely explanation for this result is that the justifications tested by Knijnenburg and Kobsa did not always indicate high disclosure benefits. Rather, the promised benefits were sometimes less than stellar. In these cases, users not only withheld the information, but also became dissatisfied with the recommender for not providing better benefits. In fact, in-depth interviews revealed that users were much more attuned to the warning signal of low benefits than to the positive signal of high benefits. Hence, before requesting any personal information, system designers need to ascertain that users’ benefits from providing this information are high, since otherwise users are likely to rebel.

However, even when they justified the disclosure with very high benefits or very high social compliance, Knijnenburg and Kobsa found that users would still be less willing to disclose their personal information than when no justification was given. It might very well be that any justification carries an implicit warning, regardless of how positive it is. A justification inadvertently signals that the act of disclosure is not trivial and may involve risks. Users cannot be easily persuaded; collecting their personal information is a long-term trust-building exercise.

To solve these problems, Kobsa and his team are now looking into more sophisticated ways of justifying disclosure. One direction is the interactive visualization of personalization benefits. Such visualization can explain the personalization process and give users immediate feedback about the usefulness of their disclosure, e.g., “since you told us that you are a student, we were able to recommend additional flights, which come with a student discount.” This research is carried out in collaboration with UC Santa Barbara.

Another direction is to tailor justifications to each user. For instance, preliminary results show that justifying disclosure by portraying the expected benefits actually works rather well for most users, except for males with low disclosure tendency. By paying close attention to people’s individual privacy preferences and decision strategies, system designers can give them a personalized experience without raising privacy concerns.

For more information, contact Prof. Alfred Kobsa.

This article appeared in ISR Connector issue: