Fixing Facebook's Privacy Mess: Charge the Wealthiest Users
I have a confession to make: I don't actually hate Facebook. A social network that connects people and makes it easier to communicate is a nice idea in principle. And Facebook has already done some very good things for humanity. Just look at drug trial companies, for instance. They have used Facebook to find potential participants, cutting significant costs out of this important endeavor.

Further, the #DeleteFacebook movement misses the point. Something else will take Facebook's place. Facebook is only the latest generation of online social networks. More interesting and challenging than killing Facebook is fixing Facebook and, by extension, fixing the entire attention economy across invasive online platforms.

The "attention economy," broadly speaking, is the market for our attention span throughout the day. The Internet, television, social media, and radio all fight for our attention. Legacy media have long packaged and sold our attention to advertisers. But the advent of social media and search has ushered in much richer two-way transfers of information, transfers that have put our privacy at risk and also created secondary and tertiary markets for the minute details of our online lives.

Here's one idea for how to fix Facebook: put a price on every single Facebook user and allow them to pay it to opt out of tracking, other data-gathering activities, and algorithmic interventions. That's been suggested before, and Facebook COO Sheryl Sandberg even addressed it recently. But here's the twist: the prices must vary by individual user and reflect each user's actual value to Facebook. Users in poor countries would be worth a lot less than users in rich countries.

Rich users would likely be worth more to Facebook than poor users, so they would have to pay more to use the service (we know that Facebook pays attention to user income, because it already lets advertisers and marketers target by inferred income). This would create a sliding scale that would somewhat mitigate the cost to users of lesser means. Your personal attention economy number would be unique to you. We know Facebook is already doing this internally, to some degree; the company calls it "average revenue per user." In the U.S. and Canada, for example, a Facebook user generates about $26 in revenue per quarter. Globally, users are worth $6.18 per quarter. (A rough sketch of how such a sliding scale could be computed appears below.)

This would turn the entire attention economy into an explicit exchange of our attention for the services we use, with an easy opt-out.

Those who won't pay but want to keep using the service would basically agree to give up their information and cede their rights to privacy. It should even be worded that way, in large type, in the terms of service agreement. This should happen explicitly and in one fell swoop, rather than drip, drip, drip. And we could finally get rid of the Kabuki theater of Facebook pretending to care about user privacy when its entire business model to date has been premised on selling as much information about users as it can get away with.

This transparency would start to build trust and create a real marketplace for our attention. It would also work in tandem with other legislative efforts to enforce privacy, like the proposal from Sen. Mark Warner (D-Va.) and Sen. Elizabeth Warren (D-Mass.) to force companies to pay $50 or $100 for each person whose data is stolen. Likewise, it would complement proposals for better regulation and definitions of privacy in the U.S., as laid out by journalist and privacy expert Julia Angwin.
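To make the arithmetic behind that sliding scale concrete, here is a minimal sketch in Python. It uses only the quarterly revenue-per-user figures cited above; the region keys and the pricing rule (charge roughly a year's worth of that revenue) are invented for illustration and are not anything Facebook actually offers.

```python
# Hypothetical sketch: pricing a privacy opt-out from regional ARPU figures.
# The quarterly revenue-per-user numbers are the ones cited in the article;
# the region keys and the annual pricing rule are assumptions made up here
# for illustration, not an actual Facebook mechanism.

QUARTERLY_ARPU_USD = {
    "us_canada": 26.00,   # revenue per user per quarter, U.S. and Canada
    "worldwide": 6.18,    # revenue per user per quarter, global average
}

def annual_opt_out_price(region: str) -> float:
    """Charge roughly what the user would have generated in ad revenue
    over a year, so opting out leaves the platform revenue-neutral."""
    quarterly = QUARTERLY_ARPU_USD[region]
    return round(quarterly * 4, 2)

if __name__ == "__main__":
    for region in QUARTERLY_ARPU_USD:
        print(f"{region}: ${annual_opt_out_price(region):.2f} per year to opt out")
    # us_canada: $104.00 per year to opt out
    # worldwide: $24.72 per year to opt out
```

Under that invented rule, a U.S. or Canadian user would pay roughly $104 a year to opt out while the global average would be under $25, which is the sliding scale in action: the price tracks what each user's attention is actually worth to the platform.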
The important point, however, is changing the entire attention economy by making it an explicit market that we can see and understand, instead of a murky exchange in which the users are the product. In this manner, what I propose could actually work for companies like Equifax: services that surreptitiously collect information about us would have to receive our informed consent and disclose our value whenever they sell our information. And it would build upon other ongoing Facebook efforts to make privacy controls easier to use.

Putting a transparent and personal price tag on Facebook privacy would also have the beautiful effect of making it clear whether people actually want Facebook (and other attention economy services). If people don't want to pay but still want to keep using Facebook, their intent becomes very obvious: they are voting with their wallets and saying that they don't value their privacy very much. If people don't want to pay and then stop using Facebook, that's also obvious: they have said that the service is not worth the price of their privacy. If people are willing to pay and keep using Facebook without tracking and ads, that sends yet another message: they value Facebook's service, but they also value their privacy. In one stroke, we can create far better alignment between user needs and company goals. To make this transaction more accountable, once per year Facebook (or any other service) would need to send us a report of how our information was sold, who bought it, and for how much, and request an annual opt-in with the same caveats and large-print type.

We can take it one step further and mandate that Facebook put a button on every ad and every sponsored post that tells us who bought the post, where the buyer is located, and how much they paid to put that post in front of our eyes. This might be shocking to some, mainly because of how little advertisers pay to put things in front of us. (A rough sketch of what such disclosures might contain appears below.)

What applies to Facebook could easily apply to Google, Twitter, Snapchat, Instagram, and other free social and search services. In every case, users should have the ability to know what their attention is worth.

To be clear, there are some gray areas that won't fit perfectly into what I'm proposing. In the case of Cambridge Analytica, for example, users were essentially tricked into willingly submitting their data to a third-party application. Internally, Facebook may want to ask paying users to turn on facial recognition features in order to enable auto-tagging, and some users may like that idea. In cases like these, government regulation should still kick in and fine Facebook (just as it would for other privacy violations that involve a failure to enforce privacy rules and terms of service).

As a general social principle, mandating transparency in transactions is healthy policy, and it's the right thing to do. Opacity hides unfairness and unethical business models. In sector after sector, this has been shown to be true. Unethical businesses seek to hide the value of a customer, a good, or a service from the end user. The situation is slightly flipped with social and search, where users get something for free, but the ethics remain the same.

The bottom line is this: it is fundamentally impossible to build a business that puts users' privacy first when the business model depends on selling private data about those users without making that practice obvious and explicit. Make everything explicit, obvious, and open, and keep it simple. This will pave the way for a healthier generation of attention economy businesses with far better alignment between their business models and their users' interests.
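For illustration only, here is a rough sketch of the kind of record the per-ad disclosure button and the annual data-sale report proposed above might carry. Every field name and structure here is an assumption invented for the example; nothing corresponds to a real Facebook API.

```python
# Purely illustrative sketch of the disclosures proposed in this piece.
# None of these field names or structures come from any real Facebook API;
# they only show the kind of information a per-ad button and an annual
# report could expose to the user.

from dataclasses import dataclass
from typing import List

@dataclass
class AdDisclosure:
    """What the proposed button on every ad or sponsored post would reveal."""
    buyer_name: str          # who paid for the post
    buyer_location: str      # where the buyer is located
    amount_paid_usd: float   # what they paid to put it in front of you

@dataclass
class DataSaleRecord:
    """One entry in the proposed annual report of how your data was sold."""
    buyer_name: str
    data_categories: List[str]   # e.g. interests, inferred income bracket
    price_usd: float

@dataclass
class AnnualPrivacyReport:
    """Sent once a year; the user must explicitly opt in again to continue."""
    user_id: str
    sales: List[DataSaleRecord]

    def total_value_usd(self) -> float:
        # What the user's data was worth to buyers over the year.
        return sum(sale.price_usd for sale in self.sales)
```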
Alex Salkever is an author, public speaker, and former vice president of marketing at Mozilla. He is the author of The Driver in the Driverless Car: How Our Technology Choices Will Create the Future.