Advertising Through Spyware — After Promising To Stop

On January 29, the New York Attorney General announced an important step in the fight against spyware: Holding advertisers accountable for their payments to spyware vendors. This is a principle I’ve long endorsed — beginning with my 2003 listing of Gator advertisers (then including Apple, Chrysler, and Orbitz), and continuing in my more recent articles about advertising intermediaries funding spyware and specific companies advertising through spyware.

I’m not the only one to applaud this approach. FTC Commissioner Leibowitz recently commended the NYAG’s settlement, explaining that “advertising dollars fuel the demand side of the nuisance adware problem by giving [adware vendors] the incentive to expand their installed base, with or without consumers’ consent.” In a pair of 2006 reports, the Center for Democracy and Technology also investigated spyware advertisers, attempting to expose the web of relationships that fund spyware vendors.

The NYAG’s settlement offers a major step forward in stopping spyware because it marks the first legally binding obligation that certain advertisers keep their ads (and their ad budgets) out of spyware. In Assurances of Discontinuance, Cingular (now part of AT&T), Priceline, and Travelocity each agreed to cease use of spyware. In particular, each company agreed either to stop using spyware advertising, or to use only “adware” that provides appropriate disclosures to users, prominently labels ads, and offers an easy procedure to uninstall. These requirements apply to ads purchased directly by Cingular, Priceline, and Travelocity, as well as to all marketing partners acting on their behalf.

These important promises are the first legally-binding obligations, from any Internet advertisers, to restrict use of spyware. (Compare, e.g., advertisers voluntarily announcing an intention to cease spyware advertising — admirable but not legally binding.) If followed, these promises would keep the Cingular, Priceline, and Travelocity ad budgets away from spyware vendors — reducing the economic incentive to make and distribute spyware.

But despite their duties to the NYAG, both Cingular and Travelocity have failed to sever their ties with spyware vendors. As shown in the six examples below, Cingular and Travelocity continue to receive spyware-originating traffic, including traffic from some of the web’s most notorious and most widespread spyware, in direct violation of their respective Assurances of Discontinuance. That said, Priceline seems to have succeeded in substantially reducing these relationships — suggesting that Cingular and Travelocity could do better if they put forth appropriate effort.

Example 1: Fullcontext, Yieldx (Admedian), Icon Media (Vizi) Injecting Travelocity Ad Into Google

A Travelocity Ad Injected into Google by Fullcontext

The Money Trail – How Travelocity Pays Fullcontext: money flows from Travelocity to Icon Media (Vizi Media) to Yieldx (Ad|Median) to Fullcontext, while ad views flow back up the chain.

On a PC with Fullcontext spyware installed (controlling server 64.40.99.166), I requested www.google.com. In testing of February 13, I received the image shown in the thumbnail at right — with a large 728×90 pixel banner ad appearing above the Google site. Google does not sell this advertising placement to any advertiser for any price. But Fullcontext spyware placed Travelocity’s ad there nonetheless — without permission from Google, and without payment to Google.

As shown in the video I preserved, clicking the ad takes users through to the Travelocity site. The full list of URLs associated with this ad placement:

http://64.40.99.166/adrotate.php
http://ad.yieldx.com/imp?z=6&Z=728×90&s=41637&u=http%3A%2F%2Fwww.google.com…
http://ad.yieldmanager.com/imp?z=6&Z=728×90&s=41637&u=http%3A%2F%2Fwww.goog…
http://ad.yieldx.com/iframe3?jwIAAKWiAABdAwIA5soAAAAAxAEAAAAACwADBAAABgMKxQ…
http://ad.yieldmanager.com/iframe3?jwIAAKWiAABdAwIA5soAAAAAxAEAAAAACwADBAAA…
http://network.realmedia.com/RealMedia/ads/adstream_sx.ads/iconmedianetwork…
http://network.realmedia.com/RealMedia/ads/click_lx.ads/iconmedianetworks/e…
http://clk.atdmt.com/AST/go/247mancr0020000002ast/direct;at.astncr00000121;…
http://leisure.travelocity.com/RealDeals/Details/0,2941,TRAVELOCITY_CRU_354…

As shown in the URL log and packet log, Fullcontext initiated the ad placement by sending traffic to the Yieldx ad network. (Yieldx’s Whois reports an address in Hong Kong. But Yieldx is hosted at an IP block registered to Ad|Median, an ad network with headquarters near Minneapolis.) Using the Right Media Exchange marketplace (yieldmanager.com), Yieldx/Ad|Median then sold the traffic to Icon Media Networks (now Vizi Media of LA and New York), which placed the Travelocity ad. The diagram at right depicts the chain of relationships.
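
For readers who want to replicate this kind of log analysis, the chain of relationships can be read directly off the hostnames in the request sequence. The sketch below is a simplified illustration of that idea, not my actual testing tools; the host-to-party mapping is hand-assembled from the entities named above, and the log entries are abbreviated stand-ins for the full URLs listed earlier.

```python
# Minimal sketch: read the ad-serving chain out of a browser URL log.
# The mapping and the abbreviated log below are illustrative assumptions.
from urllib.parse import urlparse

# Hand-assembled mapping from logged hostnames to the parties discussed above.
PARTIES = {
    "64.40.99.166": "Fullcontext (controlling server)",
    "ad.yieldx.com": "Yieldx (Ad|Median)",
    "ad.yieldmanager.com": "Right Media Exchange",
    "network.realmedia.com": "Icon Media Networks (via Real Media ad server)",
    "clk.atdmt.com": "Atlas ad server",
    "leisure.travelocity.com": "Travelocity",
}

def chain_from_log(urls):
    """Return the ordered list of parties seen in a URL log, collapsing repeats."""
    chain = []
    for url in urls:
        host = urlparse(url).hostname
        party = PARTIES.get(host, host)   # fall back to the raw hostname
        if not chain or chain[-1] != party:
            chain.append(party)
    return chain

# Abbreviated stand-ins for the URL log shown above (spyware first, advertiser last):
log = [
    "http://64.40.99.166/adrotate.php",
    "http://ad.yieldx.com/imp",
    "http://ad.yieldmanager.com/imp",
    "http://network.realmedia.com/RealMedia/ads/adstream_sx.ads/iconmedianetworks",
    "http://clk.atdmt.com/AST/go/",
    "http://leisure.travelocity.com/RealDeals/Details/",
]
print(" -> ".join(chain_from_log(log)))
```

Running it on the abbreviated log prints the same Fullcontext-to-Travelocity chain summarized above.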

This placement is typical of the Fullcontext injector. I have tracked numerous Fullcontext placements, through multiple controlling servers. I retain many dozens of examples on file. See also prior examples posted to my public site: 1, 2, 3.

The Fullcontext injector falls far short of the requirements of Travelocity’s Assurance of Discontinuance. For one, users often receive Fullcontext without agreeing to install it — through exploits and in undisclosed bundles (violating Travelocity Assurance page 4, provision 11.a; PDF page 11). Furthermore, Fullcontext’s ads lack any branding indicating what adware program delivered them — violating Assurance provision 11.b, which requires such branding to appear prominently on each adware advertisement. Fullcontext’s uninstall and legacy user functions also fail to meet the requirements set out in the Assurance.

Example 2: Fullcontext and Motive Interactive Injecting Cingular Ad Into Google

A Cingular Ad Injected into Google by Fullcontext

The Money Trail – How Cingular Pays Fullcontext: money flows from Cingular to Motive Interactive to Fullcontext, while ad views flow back up the chain.

Through the Motive Interactive ad network, Fullcontext also injects Cingular ads into Google. See the screenshot at right, taken on February 17. On a PC with Fullcontext spyware installed (controlling server 64.40.99.166), I requested www.google.com. I received the image shown in the thumbnail at right — with a prominent Cingular banner ad appearing above Google. As in the case of Travelocity, this ad appeared without permission from Google and without payment to Google. Rather, the ad was placed into Google’s site by Fullcontext spyware.

The full list of URLs associated with this ad placement:

http://64.40.99.166/adrotate.php
http://ad.motiveinteractive.com/imp?z=6&Z=728×90&s=161838&u=http%3A%2F%2Fwww.goo…
http://ad.yieldmanager.com/imp?z=6&Z=728×90&s=161838&u=http%3A%2F%2Fwww.google.c…
http://ad.motiveinteractive.com/iframe3?jwIAAC54AgD5QwMAtVQBAAIAZAAAAP8AAAAHEQAA…
http://ad.yieldmanager.com/iframe3?jwIAAC54AgD5QwMAtVQBAAIAZAAAAP8AAAAHEQAABgTud…
http://clk.atdmt.com/goiframe/21400598/rghtccin0470000088cnt/direct;wi.728;hi.90…
http://www.cingular.com/cell-phone-service/cell-phone-details/?q_list=true&q_pho…

As shown in the URL log and packet log, Fullcontext sent traffic to Motive Interactive, a Nevada ad network. Using the Right Media Exchange marketplace (yieldmanager.com), Motive Interactive sold the traffic to Cingular. The diagram at right depicts the chain of relationships. Notice that Cingular’s relationship with Fullcontext runs through one fewer intermediary than the Travelocity relationship in Example 1.

Cingular should have known that this traffic was coming from spyware, because detailed information about the ad placement was sent to Cingular’s web servers whenever a user clicked a FullContext-placed ad. The packet log shows the information sent to the Atlas servers operating on Cingular’s behalf:

http://view.atdmt.com/CNT/iview/rghtccin0470000088cnt/direct;wi.728;hi.90/01?click=http:// ad.motiveinteractive.com/click,jwIAAC54AgD5QwMAtVQBAAIAZAAAAP8AAAAHEQAABgTudAIAmUcCAPqaAAC
iJAIAAAAAAAAAAAAAAAAAAAAAAKdz10UAAAAA,,http%3A%2F%2Fwww%2Egoogle%2Ecom%2F,

The first portion of the URL specifies which ad is to be shown, while the portion following the question mark reports how traffic purportedly reached this ad. (This information structure is standard for Right Media placements.) Notice the final URL-encoded parameter — telling Atlas (and in turn Cingular) that this ad was purportedly shown at www.google.com. But Atlas and Cingular should know that the www.google.com page does not sell banner ads to any advertiser at any price. The purported placement is therefore impossible — unless the ad was actually injected into Google’s site using spyware. The presence of this Google URL in Cingular’s referer log should have raised alarms at Cingular and should have prompted further investigation.
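
Advertisers and their ad servers could automate exactly this referer check. The sketch below is a simplified illustration rather than any advertiser’s actual system: it pulls the URL-encoded placement page out of a Right Media-style click URL (the exact field layout here is inferred from the example above) and flags placements on pages known not to sell banner inventory.

```python
# Minimal sketch: flag clicks whose reported placement page is impossible,
# e.g. a banner purportedly shown on www.google.com. The field layout is
# inferred from the click URL shown above; the site list is an assumption.
from urllib.parse import unquote, urlparse

NO_BANNER_PAGES = {"www.google.com"}    # pages that sell no third-party banners

def reported_placement(click_url):
    """Extract the URL-encoded placement page from a Right Media-style click URL."""
    encoded_page = click_url.rsplit(",", 2)[-2]   # field before the trailing comma
    return urlparse(unquote(encoded_page)).hostname

def is_impossible(click_url):
    return reported_placement(click_url) in NO_BANNER_PAGES

# Abbreviated stand-in for the logged click URL quoted above:
click = ("http://ad.motiveinteractive.com/click,jwIAAC54AgD,"
         ",http%3A%2F%2Fwww%2Egoogle%2Ecom%2F,")
print(reported_placement(click))   # www.google.com
print(is_impossible(click))        # True
```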

Example 3: Deskwizz/Searchingbooth and Ad-Flow (Rydium) Injecting Travelocity Ad Into True.com

A Travelocity Ad Injected into True.com by Searchingbooth

The Money Trail – How Travelocity Pays Searchingbooth: money flows from Travelocity to Ad-Flow (Rydium) to Deskwizz/Searchingbooth, while ad views flow back up the chain.

Fullcontext is just one of several active ad injectors that place ads into other companies’ sites. The screenshot at right shows an injection performed by Deskwizz/Searchingbooth. In March 9 testing, I requested True.com. Deskwizz placed a large 720×300 pixel banner into the top of the page (not shown), and another into the bottom. This latter banner, shown in the thumbnail at right, promoted Travelocity. Just as the preceding examples occurred without payment to or permission from Google, this placement occurred without payment to or permission from True.com. Rather, the ad was placed into True.com’s site by Deskwizz/Searchingbooth spyware.

The full list of URLs associated with this ad placement:

http://servedby.headlinesandnews.com/media/servlet/view/banner/unique/url/strip?…
http://www.uzoogle.com/indexP.php?PID=811
http://www.uzoogle.com   [posted parameter: PID=811]
http://ad.ad-flow.com/imp?z=2&Z=300×250&s=118935&u=http%3A%2F%2Fwww.uzoogle.com%…
http://ad.yieldmanager.com/imp?z=2&Z=300×250&s=118935&u=http%3A%2F%2Fwww.uzoogle…
http://ad.doubleclick.net/adj/N447.rightmedia.com/B2130591.2;sz=300×250;click0=h…

As shown in the URL log and packet log, Deskwizz/Searchingbooth sent traffic to its Uzoogle ad loader, which forwarded the traffic onwards to Ad-Flow. (Ad-flow is the ad server of Rydium, a Toronto ad network.) The traffic then flowed through to the Right Media Exchange marketplace (yieldmanager.com), where it was sold to Travelocity. The diagram at right depicts the chain of relationships.

This placement is typical of Deskwizz/Searchingbooth. I have tracked a web of domain names operated by this group — including Calendaralerts, Droppedurl, Headlinesandnews, Z-Quest, and various others — that all receive traffic from and through similar banner injections. Z-quest.com describes itself as a “meta-search” site, while Uzoogle presents itself as offering Google-styled logos and branded search results. But in fact these sites all serve to route, frame, and redirect spyware-originating traffic, as shown above. I retain many dozens of examples on file. See also the multiple examples I have posted to my public site: 1, 2, 3, 4, 5.

Example 4: Deskwizz/Searchingbooth and Right Media Injecting Cingular Ad Into True.com

A Cingular Ad Injected into True.com by Searchingbooth

The Money Trail – How Cingular Pays Searchingbooth: money flows from Cingular to Yield Manager / Right Media Exchange to Deskwizz/Searchingbooth, while ad views flow back up the chain.

Deskwizz/Searchingbooth also injects Cingular ads into third parties’ sites, including into True.com. The screenshot at right shows the resulting on-screen display (as observed on March 9). The screenshot depicts a Cingular ad placed into True.com without True’s permission and without payment to True.

The full list of URLs associated with this ad placement:

http://servedby.headlinesandnews.com/media/servlet/view/banner/unique/url/strip?…
http://optimizedby.rmxads.com/st?ad_type=ad&ad_size=728×90&section=160636
http://ad.yieldmanager.com/imp?Z=728×90&s=160636&_salt=3434563176&u=http%3A%2F%2…
http://optimizedby.rmxads.com/iframe3?6B4AAHxzAgD5QwMAtVQBAAIAAAAAAP8AAAAGFAAABg…
http://ad.yieldmanager.com/iframe3?6B4AAHxzAgD5QwMAtVQBAAIAAAAAAP8AAAAGFAAABgJQF…
http://clk.atdmt.com/goiframe/22411278/rghtccin0470000088cnt/direct;wi.728;hi.90…

As shown in the URL log and packet log, Deskwizz/Searchingbooth sent traffic to Right Media’s Rmxads ad server. The traffic then flowed through to the Right Media Exchange marketplace (yieldmanager.com), where it was sold to Cingular. The diagram at right depicts the chain of relationships.

Cingular should have known that this ad was appearing through spyware injections for the same reason presented in Example 2. In particular, the packet log reveals that specific information about ad context was reported to Cingular’s server whenever a user clicked an injected ad. This context information put Cingular on notice as to where its ads were appearing — including sites on which Cingular had never sought to advertise, and even including sites that do not accept advertising.
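
An advertiser in Cingular’s position could operationalize this notice with a routine log review. The sketch below is hypothetical (the approved-site list, the no-ads list, and the log extract are all assumptions for illustration, not Cingular’s actual systems): it tallies the placement pages reported in click logs and flags any host outside the approved media plan.

```python
# Hypothetical sketch: tally reported placement hosts from click logs and flag
# anything outside the approved media plan. All lists here are illustrative.
from collections import Counter

APPROVED_PLACEMENTS = {"www.cnn.com", "www.weather.com"}    # example media plan
SELLS_NO_BANNERS = {"www.google.com"}                       # e.g. Google's home page

def audit(placement_hosts):
    for host, clicks in Counter(placement_hosts).most_common():
        if host in SELLS_NO_BANNERS:
            print(f"IMPOSSIBLE placement ({clicks} clicks): {host} -- likely spyware injection")
        elif host not in APPROVED_PLACEMENTS:
            print(f"Unapproved placement ({clicks} clicks): {host} -- investigate traffic source")

# Example log extract (hostnames only), standing in for real referer data:
audit(["www.google.com", "www.google.com", "www.cnn.com", "www.true.com"])
```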

Example 5: Web Nexus, Traffic Marketplace Promoting Travelocity in Full-Screen Pop-Up Ads

Web Nexus Promotes Travelocity Using a Full-Screen Pop-Up

The Money Trail – How Travelocity Pays Web Nexus: money flows from Travelocity to Traffic Marketplace to Web Nexus, while ad views flow back up the chain.

Although the four preceding examples all show banner ad injections, pop-up ads remain the most common form of spyware advertising. Spyware-delivered pop-ups continue to promote both Cingular and Travelocity. For example, Web Nexus is widely installed without consent (example) and in big bundles without the disclosures required by Travelocity’s Assurance of Discontinuance. Yet Web Nexus continues to promote Travelocity through intrusive full-screen pop-ups, like the one shown at right (taken on February 22). Indeed, this pop-up is so large and so intrusive that it even covers the Start button — preventing users from easily switching to another program or window.

The Travelocity ad at issue is also striking for its lack of branding or other attribution. A user who manages to move the pop-up upwards will find a small “Web Nexus” footer at the ad’s bottom edge. But this label initially appears substantially off-screen and hence unreadable. In contrast, Travelocity’s Assurance of Discontinuance (Travelocity section, page 4, provision 11.b; PDF page 11) requires that each adware-delivered advertisement be branded with a “prominent” name or icon. Because it appears off-screen, Web Nexus’s ad label cannot satisfy the NYAG’s prominence requirement. Furthermore, packet log analysis reveals that this placement is the foreseeable result of Web Nexus’s design decisions. Further discussion and analysis.

The full list of URLs associated with this ad placement:

http://stech.web-nexus.net/cp.php?loc=295&cid=9951709&u=ZWJheS5jb20v&en=&pt=3…
http://stech.web-nexus.net/sp.php/9157/715/295/9951709/527/
http://t.trafficmp.com/b.t/e48U/1172127347
http://cache.trafficmp.com/tmpad/content/clickhere/travelocity/0107/contextu…

As shown in the URL log and packet log, Web Nexus sent traffic to Traffic Marketplace (a New York ad network owned by California’s Vendare Media). The traffic then flowed through to Travelocity. The diagram at right depicts the relationships.

Example 6: Targetsaver, EasilyFound, LinkShare Promoting Cingular in Full-Screen Pop-Up Ads

TargetSaver Promotes Cingular Using a Full-Screen Pop-Up

The Money Trail – How Cingular Pays TargetSaver: money flows from Cingular to LinkShare to EasilyFound to TargetSaver, while ad views flow back up the chain.

In testing of March 8, I searched for “get ringtones” at Google. I received the full-screen pop-up shown at right. This pop-up was served to me by TargetSaver spyware, widely installed without consent (example) and with misleading and/or hidden disclosures (1, 2). These installation practices cannot meet Cingular’s duties under its Assurance of Discontinuance (Cingular section, page 4, provision 14.a; PDF page 18).

The full list of URLs associated with this ad placement:

http://a.targetsaver.com/adshow
http://www.targetsaver.com/redirect.php?…www.easilyfound.com%2Fa%2F2.php…
http://www.easilyfound.com/a/2.php?cid=1032
http://www.easilyfound.com/a/3.php?cid=1032
http://click.linksynergy.com/fs-bin/click?id=MCVDOmK0318&offerid=91613.100…
http://www.cingular.com/cell-phone-service/cell-phone-sales/free-phones.js…

As shown in the URL log and packet log, TargetSaver sent traffic to EasilyFound. EasilyFound then forwarded the traffic on to LinkShare, a New York affiliate network, which sent the traffic to Cingular.

Cingular should have known that a partnership with EasilyFound would entail Cingular ads being shown through spyware. EasilyFound describes itself as “a metacrawler search engine.” But my extended testing shows that EasilyFound widely buys spyware-originating traffic and sends it onwards to affiliate merchants (Cingular among others). I have previously described this general practice in multiple articles on my public web site. I have also publicly documented this very behavior by EasilyFound specifically. In May 2006 slides, I showed EasilyFound buying traffic from Targetsaver and sending that traffic onwards to LinkShare and Walmart. I even posted an annotated packet log and traffic flow diagram. My slides have been available on the web for approximately ten months. Yet, by all indications, this affiliate remains in good standing at LinkShare and continues the same practices I documented last year.

According to Whois data, EasilyFound is based in Santa Monica, California, although EasilyFound’s Contact page gives no street address.

Additional Examples on File

The preceding six examples are only a portion of my recent records of spyware-originating ads for Cingular and Travelocity. I retain additional examples on file, including further banner injections, further pop-ups, traffic flowing through Cingular’s affiliate program (LinkShare), and traffic flowing through Travelocity’s affiliate program (Commission Junction).

In my extended testing during the past two months, I have recorded only a single example of Priceline ads shown by spyware. That placement occurred through Priceline’s affiliate program, operated by Commission Junction.

The Scope of the Problem

The Assurances of Discontinuance reflect the remarkable size of the advertising expenditures that triggered the New York Attorney General’s intervention.

Cingular Wireless (AT&T)
  Amount spent with Direct Revenue: At least $592,172
  Duration of Direct Revenue relationship: April 1, 2004 through October 11, 2005
  Number of ads shown: At least 27,623,257
  Knowledge of Direct Revenue’s practices: “Even though Cingular was aware of controversy surrounding the use of adware and was aware, or should have been aware, of Direct Revenue’s deceptive practices, including surreptitious downloads, Cingular continued to use Direct Revenue.”
  Payment to New York: $35,000 of investigatory costs and penalties

Priceline
  Amount spent with Direct Revenue: At least $481,765.05
  Duration of Direct Revenue relationship: May 1, 2004 through February 24, 2006
  Number of ads shown: At least 6,142,395
  Knowledge of Direct Revenue’s practices: “Priceline knew that consumers had downloaded Direct Revenue adware without full notice and consent and continued to receive ads through that software.”
  Additional factors listed by NYAG: “Some of Priceline’s advertisements were delivered directly to consumers from web servers owned or controlled by Priceline.”
  Payment to New York: $35,000 of investigatory costs and penalties

Travelocity
  Amount spent with Direct Revenue: At least $767,955.93
  Duration of Direct Revenue relationship: July 1, 2004 through April 15, 2006
  Number of ads shown: At least 2,103,341
  Knowledge of Direct Revenue’s practices: “Travelocity was aware that Direct Revenue had … been the subject of consumer complaints that Direct Revenue had surreptitiously installed its software on consumers’ computers without adequate notice.”
  Payment to New York: $30,000 of investigatory costs and penalties

These three advertisers alone paid more than $1.8 million to Direct Revenue — approximately 2% of Direct Revenue’s 2004-2005 revenues. See detailed Direct Revenue financial records.
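
As a quick check on these figures (a back-of-the-envelope calculation, not new data; the combined Direct Revenue revenue shown is simply what the “approximately 2%” statement implies):

```python
# Sum the three advertisers' payments and back out the Direct Revenue revenue
# implied by the "approximately 2%" figure. No new data; arithmetic only.
payments = {
    "Cingular": 592_172.00,
    "Priceline": 481_765.05,
    "Travelocity": 767_955.93,
}
total = sum(payments.values())
print(f"Total paid to Direct Revenue: ${total:,.2f}")     # about $1.84 million

implied_dr_revenue = total / 0.02                         # if that total is ~2% of revenue
print(f"Implied 2004-2005 Direct Revenue revenue: ${implied_dr_revenue:,.0f}")
```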

Internet Advertising and the Generalized Second Price Auction: Selling Billions of Dollars Worth of Keywords

Edelman, Benjamin, Michael Ostrovsky, and Michael Schwarz. “Internet Advertising and the Generalized Second Price Auction: Selling Billions of Dollars Worth of Keywords.” American Economic Review 97, no. 1 (March 2007): 242-259.

Winner of the 2013 Prize in Game Theory and Computer Science from the Game Theory Society (for “the best paper at the interface of game theory and computer science in the last decade”).

Winner of the 2018 SIGecom Test of Time Award from the ACM Special Interest Group on E-Commerce (for “an influential paper or series of papers published between ten and twenty-five years ago that has significantly impacted research or applications exemplifying the interplay of economics and computation”).

We investigate the “generalized second-price” auction (GSP), a new mechanism used by search engines to sell online advertising. Although GSP looks similar to the Vickrey-Clarke-Groves (VCG) mechanism, its properties are very different. Unlike the VCG mechanism, GSP generally does not have an equilibrium in dominant strategies, and truth-telling is not an equilibrium of GSP. To analyze the properties of GSP, we describe the generalized English auction that corresponds to the GSP and show that it has a unique equilibrium. This is an ex post equilibrium, with the same payoffs to all players as the dominant strategy equilibrium of VCG.
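
To make the comparison concrete, the sketch below (my illustration, not code from the paper) computes per-click payments under the two rules in the standard position-auction model the abstract describes: slots have known click-through rates, bidders are ranked by bid, GSP charges each slot the next-highest bid per click, and VCG charges each bidder the externality it imposes on lower-ranked bidders.

```python
# Per-click payments in a position auction under GSP and VCG.
# Illustrative sketch of the standard model; not code from the paper.

def gsp_payments_per_click(bids, ctrs):
    """GSP: the bidder in slot i pays the (i+1)-th highest bid per click."""
    bids = sorted(bids, reverse=True)
    k = min(len(ctrs), len(bids))
    return [bids[i + 1] if i + 1 < len(bids) else 0.0 for i in range(k)]

def vcg_payments_per_click(bids, ctrs):
    """VCG: the bidder in slot i pays the externality it imposes, per click."""
    bids = sorted(bids, reverse=True)
    k = min(len(ctrs), len(bids))
    prices = []
    for i in range(k):
        total = 0.0
        for j in range(i, k):
            ctr_next = ctrs[j + 1] if j + 1 < len(ctrs) else 0.0
            bid_next = bids[j + 1] if j + 1 < len(bids) else 0.0
            total += (ctrs[j] - ctr_next) * bid_next
        prices.append(total / ctrs[i])
    return prices

ctrs = [200, 100]            # expected clicks for slots 1 and 2
bids = [10.0, 4.0, 2.0]      # per-click bids of three advertisers
print(gsp_payments_per_click(bids, ctrs))   # [4.0, 2.0]
print(vcg_payments_per_click(bids, ctrs))   # [3.0, 2.0]
```

With two slots receiving 200 and 100 clicks and bids of $10, $4, and $2, GSP charges $4 and $2 per click while VCG charges $3 and $2, a small example of how the two payment rules, and hence bidders’ incentives, diverge.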

Strategic Bidder Behavior in Sponsored Search Auctions

Edelman, Benjamin, and Michael Ostrovsky. “Strategic Bidder Behavior in Sponsored Search Auctions.” Decision Support Systems 43, no. 1 (February 2007): 192-198. (Winner of Emerald Citations of Excellence.)

We examine sponsored search auctions run by Overture (now part of Yahoo!) and Google and present evidence of strategic bidder behavior in these auctions. Between June 15, 2002, and June 14, 2003, we estimate that Overture’s revenue from sponsored search might have been higher if it had been able to prevent this strategic behavior. We present a specific alternative mechanism that could reduce the amount of strategizing by bidders, raise search engines’ revenue, and also increase the overall efficiency of the market. We conclude by showing that advertisers’ strategic behavior has not disappeared over time; rather, such behavior remains present on both search engines.

Greedy Bidding Strategies for Keyword Auctions

Cary, Matthew, Aparna Das, Benjamin Edelman, Ioannis Giotis, Kurtis Heimerl, Anna Karlin, Claire Mathieu, and Michael Schwarz. “Greedy Bidding Strategies for Keyword Auctions.” Proceedings of the International Conference on Electronic Commerce (2007): 262-271.

How should players bid in keyword auctions such as those used by Google, Yahoo! and MSN? We consider greedy bidding strategies for a repeated auction on a single keyword, where in each round, each player chooses some optimal bid for the next round, assuming that the other players merely repeat their previous bid. We study the revenue, convergence and robustness properties of such strategies. Most interesting among these is a strategy we call the balanced bidding strategy (bb): it is known that bb has a unique fixed point with payments identical to those of the VCG mechanism. We show that if all players use the bb strategy and update each round, bb converges when the number of slots is at most 2, but does not always converge for 3 or more slots. On the other hand, we present a simple variant which is guaranteed to converge to the same fixed point for any number of slots. In a model in which only one randomly chosen player updates each round according to the bb strategy, we prove that convergence occurs with probability 1. We complement our theoretical results with empirical studies.

Optimal Auction Design in a Multi-unit Environment: The Case of Sponsored Search Auctions

Edelman, Benjamin, and Michael Schwarz. “Optimal Auction Design in a Multi-unit Environment: The Case of Sponsored Search Auctions.” December 2006. Mimeo. (Revised and published as Optimal Auction Design and Equilibrium Selection in Sponsored Search Auctions, American Economic Review 100, no. 2 (May 2010): 597-602.)

We characterize the optimal (revenue maximizing) auction for sponsored search advertising. We show that a search engine’s optimal reserve price is independent of the number of bidders. Using simulations, we consider the changes that result from a search engine’s choice of reserve price and from changes in the number of participating advertisers.
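
For intuition on the independence result, the single-object analogue is the textbook reserve-price condition r - (1 - F(r))/f(r) = 0, which depends only on the bidders’ value distribution, not on how many bidders show up. The snippet below illustrates that standard condition (it is not the paper’s multi-slot derivation); the uniform distribution is just an example.

```python
# Illustration of the standard optimal-reserve condition r - (1 - F(r))/f(r) = 0,
# which involves the value distribution F but not the number of bidders.
# Assumes a regular distribution (increasing virtual value); solved by bisection.

def optimal_reserve(F, f, lo=1e-9, hi=1.0, iters=80):
    psi = lambda r: r - (1.0 - F(r)) / f(r)   # virtual value at r
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if psi(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Uniform [0, 1]: F(v) = v, f(v) = 1, so psi(r) = 2r - 1 and the reserve is 0.5,
# whether two bidders participate or two hundred.
print(round(optimal_reserve(F=lambda v: v, f=lambda v: 1.0), 4))   # 0.5
```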

Bad Practices Continue at Zango, Notwithstanding Proposed FTC Settlement and Zango’s Claims (with Eric Howes; updated December 8, 2006)

Earlier this month, the FTC announced the proposed settlement of its investigation into Zango, makers of advertising software widely installed onto users’ computers without their consent or without their informed consent (among other bad practices).

We commend the proposed settlement’s core terms. But despite these strong provisions, bad practices continue at Zango — practices that, in our judgment, put Zango in violation of the key terms and requirements of the FTC settlement. We begin by explaining the proposed settlement’s requirements. We then present eight types of violations of the proposed settlement, with specific examples of each. We conclude with recommendations and additional analysis.

Except where otherwise indicated, this document describes only downloads we tested during November 2006 — current, recent installations and behaviors.

Zango’s Burdens Under the Proposed FTC Settlement

The FTC’s proposed settlement with Zango imposes a number of important requirements and burdens on Zango’s installation and advertising practices. Specifically, the settlement:

  • Prohibits Zango from using “any legacy program to display any advertisement to, or otherwise communicate with, a consumer’s computer.” (settlement I)
  • Prohibits Zango from (directly or via third parties) “exploit[ing] a security vulnerability … to download or install onto any computer any software code, program, or content.” (II)
  • Prohibits Zango from installing software onto users’ computers without “express consent.” Obtaining “express consent” requires “clearly and prominently disclos[ing] the material terms of such software program or application prior to the display of, and separate from, any final End User License Agreement.” (III) Defines “prominent” disclosure to be, among other requirements, “unavoidable.” (definition 5)
  • Requires Zango to “provide a reasonable and effective means for consumers to uninstall the software or application,” e.g. through a computers’ Add/Remove utility. (VII)
  • Requires Zango to “clearly and prominently” label each advertisement it displays. (VI)

These are serious burdens and requirements that, were they zealously satisfied by Zango, would do much to protect consumers from the numerous nonconsensual and misleading Zango installations we have observed.

Zango Is Not In Compliance with the Proposed Settlement

Zango has claimed that it “has met or exceeded the key notice and consent standards detailed in the FTC consent order since at least January 1, 2006.”

Despite Zango’s claim, we continue to find ongoing installations of Zango’s software that fall far short of the proposed settlement’s burdens, requirements, and standards. The example installations that we present below establish that Zango’s current installation and advertising practices remain in violation of the terms and requirements of the proposed settlement.

  • “Material Terms” Disclosed Only in EULA
    Zango often announces “material terms” only in its End User License Agreement, not in the more prominent locations required by the proposed settlement. (Examples A, B)
  • “Material Terms” Omitted from Disclosure
    Zango often omits “material terms” from its prominent installation disclosures — failing to prominently disclose facts likely to affect consumers’ decisions to install Zango’s software. (Examples A, B, C)
  • Disclosures Not Clear & Prominent 
    Zango presents disclosures in a manner and format such that these disclosures fail to gain the required “express consent” of users because the disclosures are not “clearly and prominently” displayed. (Examples B, E, F)
  • Disclosures Presented Only After Software Download & Execution
    Zango presents disclosures only after the installation and execution of Zango’s software on the users’ computers has already occurred, contrary to the terms of the proposed settlement. (Examples C, F)
  • No Disclosure Provided Whatsoever
    Some Zango software continues to become installed with no disclosure whatsoever. (Example D)
  • Installation & Servicing of Legacy Programs
    Older versions of Zango’s software — versions with installation, uninstallation, and/or disclosure inconsistent with the proposed settlement — continue to become installed and to communicate with Zango servers. (Examples C, D, E, F)
  • Installations Promoted & Performed through Miscellaneous Other Deceptive Means & Circumstances
    Zango installs are still known to be promoted and performed in or through a variety of miscellaneous practices that can only be characterized as deceptive. (Multiple examples in section G)
  • Unlabeled Advertising
    Some Zango advertisements lack the labeling required by the proposed settlement. (Multiple examples in section H)

These improper practices remain remarkably easy to find, and we have numerous additional recent examples on file. Moreover, these problems are sufficiently serious that they cast doubt on the efficacy and viability of the FTC’s proposed settlement as well as Zango’s ability to meet the requirements of the settlement.

Example A: Zango’s Ongoing Misleading Installations On and From Its Own Servers

The proposed settlement requires “express consent” before software may be “install[ed]” or “download[ed]” onto users’ PCs (III). Obtaining “express consent” in turn requires “clear[] and prominent[]” disclosure of “the material terms” of the program to be installed. Most of Zango’s recent installation disclosures seem to meet the prominence standard, but we are concerned by what those disclosures say. In our view, the disclosures omit the material facts Zango is obliged to disclose.

Although the proposed settlement does not explain what constitutes “material” terms, other FTC authority provides a definition. The FTC’s Policy Statement on Deception holds that a material fact is one “likely to affect the consumer’s conduct or decision with regard to a product or service.”

From our analysis of Zango’s software, we think Zango has two material features — two features particularly likely to affect a reasonable user’s decision to install (or not install) Zango software. First, users must know that Zango will give them extra pop-up ads — not just “advertisements,” but pop-ups that appear in separate, freestanding windows. Second, users must know that Zango will transmit detailed information to its servers, including information about what web pages they view, and what they search for.

A Misleading Zango Installer Appearing Within Windows Media Player

Unfortunately, many of Zango’s installations fail to include these disclosures with the required prominence. Consider the screen shown at right. Here, Zango admits that it shows “advertisements,” but Zango fails to disclose that its ads appear in pop-ups. Zango’s use of the word “advertisements,” with nothing more, suggests that Zango’s ads appear in standard advertising formats — formats users are more inclined to tolerate, like ordinary banner ads within web pages (e.g. the ads at nytimes.com) or within other software programs (e.g. the ads in MSN Messenger). In fact Zango’s ads are quite different: they appear in freestanding pop-up windows, a format known to be particularly annoying and intrusive. But the word “advertisements” does nothing to alert users to this crucial fact.

Zango also fails to disclose that its servers receive detailed information about users’ online behavior. Zango tells users that ads are “based on” users’ browsing. But this disclosure is not enough, because it omits a material fact. In particular, the disclosure fails to explain that users’ behavior will be transmitted to Zango, a fact that would influence reasonable users’ decision to install Zango.

In addition, Zango’s description of its toolbar omits important, material effects of the toolbar — namely, that the toolbar will show distracting animated ads. Zango says only that the toolbar “lets [users] search the Internet from any webpage” — entirely failing to mention the toolbar’s advertising.

We’re also concerned about the format and circumstances of these installation screens. Zango’s installation request appears in a Windows Media “license acquisition” screen — a system Microsoft provides for bona fide license acquisition, not for the installation of spyware or adware. Zango’s installer appears within Windows Media Player — a context where few users will expect to be on the lookout for unwanted advertising software, particularly when users had merely sought to watch a video, not to install any software whatsoever. Furthermore, the button to proceed with installation is misleadingly labeled “Play Now” — not “I Accept,” “Install,” or any other caption that might alert users to the consequences of pressing the button. The screen’s small size further adds to user confusion: At just 485 by 295 pixels, the window doesn’t have room to explain the material effects of Zango’s software, even with Zango’s extra-small font. (In Zango’s main disclosure, capital letters are just seven pixels tall.) Furthermore, a user seeking to read Zango’s EULA (as embedded in these installation screens) faces a remarkable challenge: The 3,033-word document is shown in a box just five lines tall, therefore requiring fully 53 on-screen pages to view in full. Finally, if a user ultimately presses the “Play Now” button, then the “Open” button on the standard Open/Save box that follows, Zango installs immediately, without any further opportunity for users to learn more or to change their mind. Such a rapid installation is contrary to the standard Windows convention of further disclosures within an EXE installer, which provide further opportunities for users to learn more and to change their minds. Video capture of this installation sequence.

All in all, we think typical users would be confused by this screen — unable to figure out who it comes from, what it seeks to do, or what exactly will occur if they press the Play Now button. A more appropriate installation sequence would use a standard format users better understand (e.g. a web page requesting permission to install), would tell users far more about the software they’re receiving, and would label its buttons far more clearly.

These installations are under Zango’s direct control: They are loaded directly from Zango’s servers. Were Zango so inclined, it could immediately terminate this installation sequence, or it could rework these installations, without any cooperation with (or even requests to) its distributors.

Example B: Zango’s Ongoing Misleading Hotbar Installations On and From Its Own Servers

Hotbar’s Initial Installation Solicitation – Silent as to Hotbar’s Effects

Hotbar’s ActiveX Installer – Without Disclosure of Material Effects

Final Step in Hotbar Installation – No Cancel Button, No Disclosure of Material Effects

The “express consent” required under the proposed settlement applies not just to software branded as “Zango,” but also to all other software installed or downloaded by Zango. (See “any software” in section III.) The “express consent” requirement therefore applies to Hotbar-branded software owned by Zango as a result of Zango’s recent merger with Hotbar. But Hotbar installations fail to include unavoidable disclosures of material effects, despite the requirements in the proposed settlement.

Consider the Hotbar installation shown in this video and in the screenshots at right. The installation sequence begins with an ad offering “free new emotion icons” (first screenshot at right) — certainly no disclosure of the resulting advertising software, the kinds of ads to be shown, or the significant privacy effects. If a user clicks that ad, the user receives the second screenshot at right — a bare ActiveX screen, again lacking any substantive statement of the material effects of installation. If the user presses Yes in the ActiveX screen, the user receives the third screen at right — disclosing some features of Hotbar (e.g. weather, wallpapers, screensavers), and vaguely admitting that Hotbar is “ad supported,” but saying nothing whatsoever about the specific types of ads (e.g. intrusive in-browser toolbar animations) nor the privacy consequences. Furthermore, this third screen lacks any button by which users can decline or cancel installation. (Note the absence of any “cancel” button, or even an “x” in the upper-right corner.)

This installation sequence is substantially unchanged from what Edelman reported in May 2005.

This installation lacks the unavoidable material disclosures required under the proposed settlement. We see no way to reconcile this installation sequence with the requirements of the proposed settlement.

Example C: Incomplete, Nonsensical, and Inconsistent Disclosures Shown by Aaascreensavers Installing Zango Software

Aaascreensavers’ Initial Zango Prompt – Omitting Key Material Information

Zango's Subsequent Screen -- with deficiencies set out in the text at left

We also remain concerned about third parties installing Zango’s software without the required user consent. Zango’s past features a remarkable series of bad-actor distributors, from exploit-based installers to botnets to faked consent. Even today, some distributors continue to install Zango without providing the required “clear and prominent” notice of “material” effects.

Consider an installation of Zango from Aaascreensavers.com. Aaascreensavers provides a generic “n-Case” installation disclosure that says nothing about the specifics of Zango’s practices — omitting even the word “advertisements,” not to mention “pop-ups” or privacy consequences. (See first screenshot at right.) Furthermore, Aaascreensavers fails to show or even reference a EULA for Zango’s software. Nonetheless, Aaascreensavers continues to place Zango software onto users’ PCs through these installers.

Particularly striking is the nonsensical screen that appears shortly after Aaascreensavers installs Zango. (See second screenshot at right.) Beneath a caption labeled “Setup,” the screen states “the content on this site is free, thanks to 180search Assistant” — although the user has just installed a program (and is not browsing a site), and the program the user (arguably) just agreed to install was called “n-Case,” not “180search Assistant.” Equally paradoxically, the “Setup” screen asks users to choose between “Uninstall[ing] 180search Assistant” and “Keep[ing]” the software. Since “180search Assistant” is software reasonable users will not even know they have, this choice is particularly likely to puzzle typical users. After all, it is nonsense to speak of a user making an informed decision to “keep” software he didn’t know he had.

Crucially, both installation prompts omit the material information Zango must disclose under its settlement obligations: Neither prompt mentions that ads will be shown in pop-ups, nor do they mention the important privacy effects of installing Zango software.

Video capture of this installation sequence.

Example D: Msnemotions Installing Zango with No Disclosure At All

Msnemotions continues to install Zango software with no disclosure whatsoever. In particular, Msnemotions never shows any license agreement, nor does it mention or reference Zango in any other on-screen text, even if users fully scroll through all listings presented to them. Video proof.

This installation is a clear violation of section III of the proposed FTC settlement. That section prohibits Zango “directly, or through any person [from] install[ing] or download[ing] … any software program or application without express consent.” Here, no such consent was obtained, yet Zango software downloaded and installed anyway.

In our tests, this Zango installation did not show any ads (although it did contact a Zango server and download a 20MB file). Nonetheless, the violation of section III occurs as soon as the Zango software is downloaded onto the user’s computer, for lack of the requisite disclosure and consent.

Example E: Emomagic Installing Zango with an Off-Screen Disclosure

Emomagic First Mentions Zango Five Pages Down In Its EULA

Emomagic continues to install Zango software with a disclosure buried five pages within its lengthy (23 on-screen-page) license agreement. That is, unless a user happened to scroll to at least the fifth page of the Emomagic license, the user would not learn that installing Emomagic installs Zango too. Video proof.

This installation is a clear violation of the proposed FTC settlement, because the hidden disclosure of Zango software is not “unavoidable.” The proposed settlement’s provision III and definition 5 define “prominent” disclosures to be those that are, among other requirements, unavoidable.

We have additional examples on file where the first mention of Zango comes as far as 64 pages into a EULA presented in a scroll box. See also example F, below, where Zango appears 44 pages into a EULA, after the GPL.

Example F: Warez P2P Speedup Pro Installing Zango with an Off-Screen Disclosure

Warez P2P First Mentions Zango at Page 44 of its EULA, Below the GPL

Warez P2P Speedup Pro continues to install Zango software with a disclosure buried 44 pages within its lengthy license agreement. Video proof. Users are unlikely to see mention of Zango in part because Zango’s first mention comes so far down within the EULA.

Users are particularly unlikely to find Zango’s EULA because the first 43 pages of the EULA scroll box show the General Public License (GPL). (Screenshot of the first page, giving no suggestion that anything but the GPL appears within the scroll box.) Sophisticated users may already be familiar with this license, which is known for the many rights it grants to users and independent developers. Recognizing this pro-consumer license, even sophisticated users are discouraged from reviewing the scroll box’s contents in full — making it all the less likely that they will find the Zango license further down.

After installation, Warez P2P Speedup Pro proceeds to the second screen shown in Example C, above. The video confirms the special deceptiveness of this screen: If a user chooses the “uninstall” button — exercising his option (however deceptively mislabeled) to refuse Zango’s software — the user then receives a further screen attempting to get the user to change his mind and accept installation after all. The substance of this screen is especially deceptive — asking the user whether he wants to “cancel,” when in fact he had never elected even to start the Zango installation sequence in the first place. Finally, if the user presses the “Exit Setup” button on that final screen, the user is told he must restart his computer — a particularly galling and unnecessary interruption.

Section G: Zango Installations Predicated on Consumer Deception or on Use of Other Vendors’ Spyware

A Zango Ad Injected into Google by FullContext

We have also observed Zango installs predicated on consumer deception or on other vendors’ spyware sending traffic to Zango.

Fullcontext spyware promoting Zango. We have observed Fullcontext spyware (itself widely installed without consent) injecting Zango ads into third parties’ web sites. Through this process, Zango ads appear without the permission of the sites in which they are shown, and without payment to those sites. These ads even appear in places where banner ads are not available for purchase at any price. See e.g. the screenshot at right, showing a Zango banner ad injected to appear above Google’s search results.

Typosquatters promoting Zango. Separately, Websense and Chris Boyd recently documented Zango installs commencing at “Yootube”. “Yootube” is a clear typosquat on the well-known “Youtube” site — hoping to reach users who mistype the address of the more popular site. If users reach the misspelled site, they will be encouraged to install Zango. Such Zango installations are predicated on a typosquat, i.e. on users reaching a site other than the one they intended — a particularly clear example of deception serving a key role in the Zango installation process.

Spyware bundlers promoting Zango. In our testing of summer and fall 2006, we repeatedly observed Zango “S3” installer programs downloaded onto users’ computers by spyware bundlers themselves operating without user consent (e.g. DollarRevenue and TopInstalls). Users received these Zango installation prompts among an assault of literally dozens of other programs. Any consent obtained through this method is predicated on an improper, nonconsensual arrival onto users’ PCs — a circumstance in which we think users cannot grant informed consent. Furthermore, the proposed settlement requires “express consent” before “installing or downloading” (emphasis added) “any software” onto users’ PCs (section III). Zango’s S3 installer is a “software program” within the meaning of the proposed settlement, yet DollarRevenue and TopInstalls downloaded this program onto users’ computers without consent. So these downloads violate the plain language of the proposed settlement, even where users ultimately refuse to install Zango software.

Update (December 8): We have uncovered still other Zango installations predicated on deception, including on phishing at MySpace. We discuss these improper practices in our follow-up comment to the FTC. Our bottom line: These Zango installs are disturbing not because they put Zango in violation of the terms of the proposed settlement, but precisely because they do not — because these installations, disturbing though they may be, do not clearly violate any of the settlement’s requirements. These installations raise the alarming prospect that this settlement could allow Zango to continue to pay distributors to create malicious and/or deceptive software and web pages.

Section H: Unlabeled Ads

Today CDT filed a further comment about the FTC’s proposed settlement, focusing in part on Zango’s recent display of unlabeled ads, again specifically contrary to Zango’s obligations under the proposed settlement (section VI). CDT has proof of 39 unlabeled ads — 10% of their recent partially-automated tests — in which Zango’s pop-up ads lacked the labeling required under the proposed settlement. CDT explains that the ads “provide[d] absolutely no information that would allow consumers to correlate the advertisements’ origins to Zango’s software.”

We share CDT’s concern, because we too have repeatedly seen these problems. For example, this video shows a Zango ad served on November 19, 2006 — with labeling that disappears after less than four seconds on screen (from 0:02 to 0:06 in the video). Furthermore, Edelman first reported this same problem in July 2004: when ads include redirects (as many do), Zango’s labeling often disappears. Compliance with the proposed settlement requires that Zango’s labeling appear on each and every ad, not just on some of the ads or even on most of the ads. So, here too, Zango is in breach of the proposed settlement.

Furthermore, the proposed settlement’s labeling requirement applies to “any advertisement” Zango serves — not just to Zango’s pop-ups, but to other ads too. Zango’s toolbars show many ads, as depicted in the screenshots below. Yet these toolbars lack the labeling and hyperlinks required by the proposed settlement. These unlabeled toolbars therefore constitute an additional violation of Zango’s duties under the proposed settlement.


Zango and Zango/Hotbar Toolbars Without the Labeling Required under the Proposed Settlement

The Size of Zango’s Payment to the FTC

We are puzzled by the size of the cash payment to be made by Zango. We understand that the FTC’s authority is limited to reclaiming ill-gotten profits, not to extracting penalties. But we think Zango’s profits to date far exceed the $3 million payment specified in the proposed settlement.

Available evidence suggests Zango’s company-to-date profits are substantial, probably beyond $3 million. As a threshold matter, Zango’s business is large: Zango claims to have 20 million active users at present (albeit with some “churn” as users manage to uninstall Zango’s software). Furthermore, Zango’s revenues are large: Zango recently told a reporter of daily revenues of $100,000 (i.e. roughly $36 million per year), up from a 2003 report of $75,000 per day. With annual revenues on the order of $20 to $40 million, and with three years of operation to date, we find it inconceivable that Zango has made only $3 million of profit.

Zango’s prior statements and other companies’ records also both indicate that Zango’s profits exceed $3 million. A 2005 Forbes article confirms high profits at Zango, reporting “double-digit percentage growth in profits” — though without stating the baseline level of profits. But financial records from competing “adware” vendor Direct Revenue indicate a remarkable 75%+ profit margin: In 2004, DR earned $30 million of pre-tax profit on $38 million of revenue. Because Zango’s business is in many respects similar to DR’s, Zango’s profit margin is also likely to be substantial, albeit reduced from the 2004-era “adware” peak. Even if Zango’s profit margin were an order of magnitude lower, i.e. roughly 7%, Zango would still have earned far more than $3 million of profit over the past several years.
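
The arithmetic behind this conclusion is simple; the sketch below just makes the scenarios explicit, using only the figures quoted above (daily revenue of roughly $100,000, about three years of operation, and margins bracketed by Direct Revenue’s 2004 results):

```python
# Back-of-the-envelope profit scenarios using only the figures quoted above.
daily_revenue = 100_000
annual_revenue = daily_revenue * 365          # about $36.5 million per year
years = 3

dr_margin = 30 / 38                           # Direct Revenue 2004: $30M profit on $38M revenue
low_margin = dr_margin / 10                   # "an order of magnitude lower"

for margin in (dr_margin, low_margin):
    profit = annual_revenue * years * margin
    print(f"margin {margin:.0%}: about ${profit / 1e6:.1f} million of profit over {years} years")
# Even the pessimistic scenario comfortably exceeds the $3 million settlement payment.
```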

If Zango’s profits substantially exceed $3 million, as we think they do, the settlement’s payment is only a slap on the wrist. A tougher fine — such as full disgorgement of all company-to-date profits worldwide — would better send the message that Zango’s practices are and have been unacceptable.

Zango’s Statements and the Need for Enforcement

In its November 3 press release, Zango claims its reforms are already in place. “Every consumer downloading Zango’s desktop advertising software sees a fully and conspicuously disclosed, plain-language notice and consent process,” Zango’s press release proclaims. This claim is exactly contrary to the numerous examples we present above. Zango further claims that it “has met or exceeded the key notice and consent standards detailed in the FTC consent order since at least January 1, 2006” — again contrary to our findings that nonconsensual and deceptive installations remain ongoing.

From the FTC’s press release and from recent statements of FTC commissioners and staff, it appears the FTC intends to send a tough message to makers of advertising software. We commend the FTC’s goal. The proposed settlement, if appropriately enforced, might send such a message. But we worry the FTC will send exactly the opposite message if it allows Zango to claim compliance without actually doing what the proposed settlement requires.

As a first step, we endorse CDT’s suggestion that the FTC require Zango to retract its claim of compliance with the proposed settlement. Zango’s statement is false, and the FTC should not stand by while Zango mischaracterizes its behavior vis-a-vis the proposed settlement.

More broadly, we believe intensive ongoing monitoring will be required to assure that Zango actually complies with the settlement. We have spent 3+ years following Zango’s repeated promises of “reform,” and we have first-hand experience with the wide variety of techniques Zango and its partners have used to place software onto users’ PCs. Testing these methods requires more than black-letter contracts and agreements; it requires hands-on testing of actual infected PCs and the scores of diverse infection mechanisms Zango’s partners devise. To assure that Zango actually complies with the agreement, we think the FTC will need to allocate its investigatory resources accordingly. We’ve spent approximately 10 hours on the investigations leading to the results above, and we’ve uncovered these examples as well as various others. With dozens or hundreds of hours, we think we could find many more surviving Zango installations in violation of the proposed settlement’s requirements. We think the FTC ought to find these installations, or require that Zango do so, and then ought to see that the associated files are entirely removed from the web.

Update (December 8): Our follow-up comment to the FTC discusses additional concerns, further ongoing bad practices at Zango, and the special difficulty of enforcement in light of practices seemingly not prohibited by the proposed settlement.

Intermix Revisited

I recently had the honor of serving as an expert witness in The People of the State of California ex rel. Rockard J. Delgadillo, Los Angeles City Attorney v. Intermix Media, Inc., Case No. BC343196 (L.A. Superior Court), litigation brought by the City Attorney of Los Angeles (on behalf of the people of California) against Intermix. Though Intermix is better known for creating MySpace, Intermix also made spyware that, among other effects, can become installed on users’ computers without their consent.

On Monday the parties announced a settlement under which Intermix will pay total monetary relief of $300,000 (including $125,000 of penalties, $50,000 in costs of investigation, and $125,000 in a contribution of computers to local non-profits). Intermix will also assure that third parties cease continued distribution of its software, among other injunctive relief. These penalties are in addition to Intermix’s 2005 $7.5 million settlement with the New York Attorney General.

In the course of this matter, I had occasion to examine my records of past Intermix installations. For example, within my records of installations I personally observed nearly two years ago, I found video evidence of Intermix becoming installed by SecondThought. By all indications, SecondThought’s exploit-based installers placed Intermix onto users’ computers without notice or consent.

Using web pages and installer files found on Archive.org, I also demonstrated that installations on Intermix’s own web sites were remarkably deficient. For example, some Intermix installations disclosed only a portion of the Intermix programs that would become installed, systematically failing to tell users about other programs they would receive if they went forward with installation. Most Intermix installations failed to affirmatively show users their license agreements, instead requiring users to click through to access the licenses; and in some instances, even when a user did click, the license was presented without scroll bars, such that even a determined user couldn’t read the full license. Furthermore, some Intermix installations claimed a home page change would occur only if a user chose that option (“you can choose to have your default start page reset”), when in fact that change occurred no matter what, without giving users any choice.

Remarkably, I also found evidence of ongoing Intermix installations, despite Intermix’s 2005 promise to “permanently discontinue distribution of its adware, redirect and toolbar programs.” For example, in my testing of October 2006 and again just yesterday, the Battling Bones screensaver (among various others) was still available on Screensavershot.com (a third-party site). Installing Battling Bones gives users Intermix’s Incredifind too. Even worse, this installation proceeds without any disclosure to the user of the Intermix software that would be installed. (Video proof. The installer’s EULA mentions various other programs to be installed, but it never mentions Intermix or the specific Intermix programs that in fact were installed.) Furthermore, I found dozens of “.CAB” installation files still on Intermix’s own web servers — particularly hard to reconcile with Intermix’s claim of having abandoned this business nearly two years ago. Truly shutting down the business would have entailed deleting all such files from all servers controlled by Intermix.

I continue to think there’s substantial room for litigation against US-based spyware vendors. I continue to see nonconsensual and materially deceptive installations by numerous identifiable US spyware vendors. (For example, I posted a fresh Ask.com nonconsensual toolbar installation just last month. And I see more nonconsensual installations of other US-based vendors’ programs, day in and day out.) These vendors continue to cause substantial harm to the users who receive their unwanted software.


Technology news sites and forums have been abuzz over the FTC’s proposed settlement with Zango, whose advertising software has widely been installed without consent or without informed consent. I commend the FTC’s investigation, and the injunctive terms of the settlement (i.e. what Zango has to do) are appropriately tough. Oddly, Zango claims to have “met or exceeded the key notice and consent standards … since at least January 1, 2006.” I disagree. From what I’ve seen, Zango remains out of compliance to this day. I’m putting together appropriate screenshot and video proof.

Current Ask Toolbar Practices

Last year I documented Ask toolbars installing without consent as well as installing by targeting kids. Ask staff admitted both practices were unacceptable, and Ask promised to stop them. Unfortunately, Ask has not succeeded.

In today’s post, I report notable current Ask practices. I show Ask ads running on kids sites and in various noxious spyware, specifically contrary to Ask’s prior promises. I document yet another installation of Ask’s toolbar that occurs without user notice or consent. I point out why Ask’s toolbar is inherently objectionable — especially its rearrangement of users’ browsers and its excessive pay-per-click ads to the effective exclusion of ordinary organic links. I compare Ask’s practices with its staff’s promises and with governing law — especially “deceptive door opener” FTC precedent, prohibiting misleading initial statements even where clarified by subsequent statements.

Details:

Current Practices of IAC/Ask Toolbars

False and Deceptive Pay-Per-Click Ads

I present and critique pay-per-click ads that don’t deliver what they promise. I consider implications for search engine revenues, and I analyze legal and ethical duties of advertisers and search engines. I offer a system for others to report similar ads that they find.

Read Google’s voluminous Adwords Content Policy, and you’d think Google is awfully tough on bad ads. If your company sells illegal drugs, makes fake documents, or helps customers cheat drug tests, you can’t advertise at Google. Google also prohibits ads for fireworks, gambling, miracle cures, prostitution, radar detectors, and weapons. What kind of scam could get through rules like these?

As it turns out, lots of pay-per-click advertisers push and exceed the limits of ethical and legal advertising — like selling products that are actually free, or promising their services are “completely free” when they actually carry substantial recurring charges.

In the sections that follow, I flag more than 30 different advertisers’ ads, all bearing claims that seem to violate applicable FTC rules (e.g. on use of the word “free”), or that make claims that are simply false. (All ads were observed on September 15 or later.) I then explain why this problem is substantially Google’s responsibility, and I present evidence suggesting Google’s substantial profits from these scams. Finally, I offer a mechanism for interested users to submit other false or deceptive ads, and I remark on Google’s failure to take action.

Charging for software that’s actually free

One scam Google doesn’t prohibit — and as best I can tell, does nothing to stop — is charging for software that’s actually free. Search for “Skype” and you’ll find half a dozen advertisers offering to sell eBay’s free telephone software. Search for “Kazaa” or “Grokster” and those products are sold too. Even Firefox has been targeted.

Each and every one of these ads includes the claim that the specified product is “free.” (These claims are expressed in ad titles, bodies, and/or display URLs). However, to the best of my knowledge, that claim is false, as applied to each and every ad shown above: The specified products are available from the specified sites only if the user pays a subscription fee.

These ads are particularly galling because, in each example, the specified program is available for free elsewhere on the web, e.g. directly from its developer’s web site. Since these products are free elsewhere, yet cost money at these sites (despite promises to the contrary), these sites offer users a particularly poor value.

Often these sites claim to offer tech support, but that’s also a ruse: Tests confirm there’s no real support.

Although sophisticated users will realize that these sites are bad deals, novice or hurried users may not. These sites bid for top search engine placement — often appearing above search engines’ organic (main) results. Some proportion of users see these prominent ads, click through, and get tricked into paying for these otherwise-free programs. Claiming a refund takes longer than it’s worth to most users. So as a practical matter, a site need only trick each user for an instant in order to receive its fee.

The “completely free” ringtones that aren’t

Ringtone ads often claim to be “free,” “totally free,” “all free,” “100% complimentary,” and available with “no credit card” and “no obligation” required. These claims typically appear in pay-per-click ad bodies, but they also often appear in ad titles, in ad domain names, and of course on the corresponding landing pages.

Often, these claims are simply false: An ad does not offer a “totally free” product if it touts a limited free trial followed by an auto-renewing paid service (a negative option plan).

Other claims are materially misleading. For example, claiming “no credit card required” suggests that no charges will accrue. But that too is false, since ringtone sites generally charge users through cell phone billing systems, unbeknownst to many users, who assume a service has no way to impose a charge if they provide no credit card number.

Each and every one of these ads includes the claim that the specified product is “free” (or some other claim substantially similar, e.g. “complimentary”). In most cases, subsequent language attempts to disavow these “free” claims. But in each case, to the best of my knowledge, service is available only if a user enters into a paid relationship (e.g. a paid subscription) — the very opposite of “free.” (Indeed, the subscription requirement applies even to unlimitedringtones.com, despite that ad’s claim that “no subscription [is] required.” The site’s fine print later asserts that by requesting a ringtone registration, a user “acknowledge[s] that [he is] subscribing to our service billed at $9.99 per month” — specifically contrary to the site’s earlier “no subscription” promise.)

Vendors would likely defend their sites by claiming that (in general) their introductory offers are free, and by arguing that their fine print adequately discloses users’ subsequent obligations. This is interesting reasoning, but it’s ultimately unconvincing, thanks to clear regulatory duties to the contrary.

The FTC’s Guide Concerning the Use of the Word ‘Free’ is exactly on point. The guide instructs advertisers to use the word “free” (and all words similar in meaning) with “extreme care” so as “to avoid any possibility that consumers will be misled or deceived.” The guide sets out specific rules as to how and when the word “free” may be used, and it culminates with an incredible provision prohibiting the use of fine print to disclaim what “free” promises. In particular, the rule’s section (c) instructs (emphasis added):

All the terms, conditions and obligations upon which receipt and retention of the ‘Free’ item are contingent should be set forth clearly and conspicuously at the outset of the offer … in close conjunction with the offer of ‘Free’ merchandise or service.

In case that instruction left any doubt, the FTC’s rule continues:

For example, disclosure of the terms of the offer set forth in a footnote of an advertisement to which reference is made by an asterisk or other symbol placed next to the offer, is not regarded as making disclosure at the outset.

Advertisers may not like this rule, but it’s remarkably clear. Under the FTC’s policy, ads simply cannot use a footnote or disclaimer to escape a “free” promise made earlier. Nor can an advertiser promise a “free” offer at an early stage (e.g. a search engine ad), only to impose additional conditions later (such as in a landing page, confirmation page, or other addendum). The initial confusion or deception is too strong to be cured by the subsequent revision.

Advertisers might claim that the prohibited “free” ads at issue come from their affiliates or other partners — that they’re not the advertisers’ fault. But the FTC’s Guide specifically speaks to the special duty of supervising business partners’ promotion of “free” offers. In particular, section (d) requires:

[I]f the supplier knows, or should know, that a ‘Free’ offer he is promoting is not being passed on by a reseller, or otherwise is being used by a reseller as an instrumentality for deception, it is improper for the supplier to continue to offer the product as promoted to such reseller. He should take appropriate steps to bring an end to the deception, including the withdrawal of the ‘Free’ offer.

It therefore appears that the ads shown above systematically violate the FTC’s “free” rules. Such ads fail to disclose the applicable conditions at the outset of the offer, as FTC rules require. And even where intermediaries have placed such ads, their involvement offers advertisers no valid defense.

Ads impersonating famous and well-known sites

Some pay-per-click ads affirmatively mislead users about who is advertising and what products are available. Consider the ads below, for sites claiming to be (or to offer) Spybot. (Note the text in their respective display URLs, shown in green type.) Despite the “Spybot” promise, these sites actually primarily offer other software, not Spybot. (Spybot-home.com includes one small link to Spybot, at the far bottom of its landing page. I could not find any link to the true Spybot site from within www-spybot.net.)

In addition, search engine ads often include listings for sites with names confusingly similar to the sites and products users request. For example, a user searching for “Spybot” often receives ads for SpyWareBot and SpyBoot — entirely different companies with entirely different products. US courts tend to hold that competitive trademark targeting — one company bidding on another company’s marks — is legal, in general. (French courts tend to disagree.) But to date, these cases have never considered the heightened confusion likely when a site goes beyond trademark-targeting and also copies or imitates another company’s name. Representative examples follow. Notice that each ad purports to offer (and is triggered by searches for the name of) a well-known product — but in fact these ads take users to competing vendors.
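
As a rough illustration of how such imitation might be flagged (my own sketch, not a description of any search engine’s actual process), a simple string-similarity check can surface advertiser names that closely resemble the brand a user searched for. The function name and threshold below are hypothetical choices.

    from difflib import SequenceMatcher

    # Hypothetical check (my sketch): flag advertiser names that closely imitate
    # a well-known product the user searched for, e.g. "SpyWareBot" vs. "Spybot".
    def imitates(query_brand, advertiser_name, threshold=0.75):
        ratio = SequenceMatcher(None, query_brand.lower(), advertiser_name.lower()).ratio()
        return ratio >= threshold and query_brand.lower() != advertiser_name.lower()

    for name in ("SpyWareBot", "SpyBoot", "Ad-Aware"):
        print(name, imitates("Spybot", name))   # the first two are flagged; Ad-Aware is not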

Google’s responsibility – law, ethics, and incentives

Google would likely blame its advertisers for these dubious ads. But Google’s other advertising policies demonstrate that Google has both the right and the ability to limit the ads shown on its site. Google certainly profits from the ads it is paid to show. Profits plus the right and ability to control yield exactly the requirements for vicarious liability in other areas of the law (e.g. copyright infringement). The FTC’s special “free” rules indicate little tolerance for finger-pointing — even specifically adding liability when “resellers” advertise a product improperly. These general rules provide an initial basis to seek greater efforts from Google.

Crucially, the Lanham Act specifically contemplates injunctive relief against a publisher for distributing false advertising. 15 USC § 1125(a)(1) prohibits false or misleading descriptions of material product characteristics. § 1114 (2) offers injunctive relief (albeit without money damages) where a publisher establishes it is an “innocent infringer.” If facing claims on such a theory, Google would surely attempt to invoke the “innocent infringer” doctrine — but that attempt might well fail, given the scope of the problem, given Google’s failure to stop even flagrant and longstanding violations, and given Google’s failure even to block improper ads specifically brought to its attention. (See e.g. World Wrestling Federation v. Posters, Inc., 2000 WL 1409831, holding that a publisher is not an innocent infringer if it “recklessly disregard[s] a high probability” of infringing others’ marks.)

Nonetheless, the Communications Decency Act’s 47 USC § 230(c)(1) potentially offers Google a remarkable protection: CDA § 230 instructs that Google, as a provider of an interactive computer service, may not be treated as the publisher of content others provide through that service. Even if a printed publication would face liability for printing the same ads Google shows, CDA § 230 may let Google distribute such ads online with impunity. From my perspective, that would be an improper result — bad policy in CDA § 230’s overbroad grant of immunity. A 2000 DOJ study seems to share my view, specifically concluding that “substantive regulation … should, as a rule, apply in the same way to conduct in the cyberworld as it does to conduct in the physical world.” But in CDA § 230, Congress seems to have chosen a different approach.

That said, CDA § 230’s reach is limited by its exception for intellectual property laws. § 230(e)(2) provides that intellectual property laws are not affected by § 230(c)(1)’s protection. False advertising prohibitions are codified within the Lanham Act (an intellectual property statute), offering a potential argument that CDA § 230 does not block false advertising claims. This argument is worth pursuing, and it might well prevail. But § 230 cases indicate repeated successes for defendants attempting to escape liability on a variety of fact patterns and legal theories. On balance, I cannot confidently predict the result of litigation attempting to hold Google responsible for the ads it shows. As a practical matter, it’s unclear whether or when this question will be answered in court. Certainly no one has attempted such a suit to date.

Notwithstanding Google’s possible legal defenses, I think Google ought to do more to make ads safe as a matter of ethics. Google created this mess — by making it so easy for all companies, even scammers, to buy Internet advertising. So Google faces a special duty to help clean up the resulting problems. Google already takes steps to avoid sending users to web sites with security exploits, and Google already refuses ads in various substantive categories deemed off-limits. These scams are equally noxious — directly taking users’ money under false pretenses. And Google’s relationship with these sites is particularly unsavory since Google directly and substantially profits from their practices, as detailed in the next section.

Even self-interest ought to push Google to do more here. Google may make an easy profit now by selling ads to scammers. But in the long run, rip-off ads discourage users from clicking on Google’s sponsored links — potentially undermining Google’s primary revenue source.

Who really profits from rip-off ads?

When users suffer from scams like those described above, users’ money goes to scammers, in the first instance. But each scammer must pay Google whenever a user clicks its ad. So Google profits from scammers’ activities. If the scammers ceased operations — voluntarily, or because Google cut off their traffic — Google’s short-run revenues would decrease.

Users
service fees
   Scammers   
advertising fees
Google
How Google Profits from Scammers

Consider the business model of rogue web sites “selling” software like Skype. They have one source of revenue — users buying these programs. Their expenses tend to be low: they provide no substantial customer service, and often they link to downloads hosted elsewhere to avoid even incurring bandwidth costs. It seems the main expense of such sites is advertising — with pay-per-click ads from Google by all indications a primary component. The diagram at right shows the basic money trail: From users to scam advertisers to Google. When users are ripped off by scammers, at least some of the payment flows through to Google.

How much of users’ payments goes to Google, rather than being retained by scammers? My academic economics research offers some insight. Recall that search engine ads are sold through a complicated multi-unit second-price auction: Each advertiser’s payment is determined by the bid of the advertiser below him. Many equilibria are possible, but my recent paper with Michael Ostrovsky and Michael Schwarz offers one outcome we think is reasonable — an explicit formula for each advertiser’s equilibrium bid as a function of its value (per click) and of others’ bids. In subsequent simulations (article forthcoming), Schwarz and I will demonstrate the useful properties of this bidding rule — that it dominates most other strategies under very general conditions. So there’s good reason to think markets might actually end up in this equilibrium, or one close to it. If so, we need only know advertisers’ valuations (which we can simulate from an appropriate distribution) to compute market outcomes (like advertiser profits and search engine revenues).

One clear result of my recent bidding simulations: When advertisers have similar valuations (as these advertisers do), they tend to “bid away” their surpluses. That is, they bid almost as much as a click is worth to them — so they earn low profits, while search engines reap high revenues. When a user pays such an advertiser, it wouldn’t be surprising if the majority of that advertiser’s gross profit flowed through to Google.
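
To make the mechanics concrete, here is a minimal Python sketch of this style of analysis, under assumptions of my own: per-click valuations and position click-through rates are given, and bids follow the standard recursive construction b_i = v_i - (ctr_i / ctr_{i-1}) * (v_i - b_{i+1}), under which generalized second-price payments coincide with VCG payments. The function names equilibrium_bids and revenue_share are mine; this is an illustration of the approach, not a restatement of the paper’s exact formula.

    # A minimal sketch (assumptions mine): bidders have per-click valuations,
    # positions have click-through rates, and the auction is generalized
    # second-price: each winner pays the bid of the advertiser ranked below it.

    def equilibrium_bids(values, ctrs):
        """values: per-click valuations, sorted descending (one per bidder).
        ctrs: position click-through rates, sorted descending (one per slot)."""
        n, k = len(values), len(ctrs)
        bids = [0.0] * n
        for i in range(k, n):          # bidders who win no slot bid their values
            bids[i] = values[i]
        for i in range(min(k, n) - 1, 0, -1):
            next_bid = bids[i + 1] if i + 1 < n else 0.0
            # Recursive rule: b_i = v_i - (ctr_i / ctr_{i-1}) * (v_i - b_{i+1})
            bids[i] = values[i] - (ctrs[i] / ctrs[i - 1]) * (values[i] - next_bid)
        bids[0] = values[0]            # the top bid does not affect the top bidder's own payment
        return bids

    def revenue_share(values, ctrs):
        """Fraction of winners' expected gross profit that flows to the search engine."""
        bids = equilibrium_bids(values, ctrs)
        k = min(len(ctrs), len(values))
        revenue = sum(ctrs[i] * (bids[i + 1] if i + 1 < len(values) else 0.0)
                      for i in range(k))
        gross = sum(ctrs[i] * values[i] for i in range(k))
        return revenue / gross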

A specific example helps clarify my result. Consider a user who pays $38 to Freedownloadhq.com for a “free” copy of Skype. But Freedownloadhq also received, say, 37 other clicks from 37 other users who left the site without making a purchase. Freedownloadhq therefore computes its valuation per click (its expected gross profit per incoming visitor) to be $1. The other 10 advertisers for “Skype” use a similar business model, yielding similar valuations. They bid against each other, rationally comparing the benefits of high traffic volume (if they bid high to get top placement at Google) against the resulting higher costs (hence lower profits). With 10 bidders and a 20% standard deviation in valuations (relative to valuation levels), my simulations report that in equilibrium Google gets 71% of advertisers’ expected gross profit. So of the user’s $38, fully $27 flows to Google. Even if Freedownloadhq’s business includes some marginal costs (e.g. credit card processing fees), Google will still get the same proportion of gross profit.
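
For illustration only, the snippet below reuses the revenue_share helper from the sketch above in a small Monte Carlo with 10 bidders and a 20% standard deviation in valuations. The click-through profile is an assumption of mine, so the computed share demonstrates the mechanism rather than reproducing the 71% figure exactly.

    import random

    # Illustrative Monte Carlo (assumptions mine): 10 bidders, valuations drawn
    # around $1 per click with a 20% standard deviation, and an assumed geometric
    # decay in clicks by position. Reuses revenue_share() from the sketch above.
    def simulate_share(n_bidders=10, n_slots=8, mean_value=1.0, rel_sd=0.20, trials=2000):
        ctrs = [0.7 ** i for i in range(n_slots)]
        total = 0.0
        for _ in range(trials):
            values = sorted((max(0.01, random.gauss(mean_value, rel_sd * mean_value))
                             for _ in range(n_bidders)), reverse=True)
            total += revenue_share(values, ctrs)
        return total / trials

    print(round(simulate_share(), 2))   # average share of winners' gross profit paid to the engine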

One need not believe my simulation results, and all the economic reasoning behind them, in order to credit the underlying result: That when an auctioneer sells to bidders with similar valuations, the bidders tend to bid close together — giving the auctioneer high revenues, but leaving bidders with low profits. And the implications are striking: For every user who pays Freedownloadhq, much of the user’s money actually goes to Google.

In January I estimated that Google and Yahoo make $2 million per year on ads for “screensavers” that ultimately give users spyware. Add in all the other terms with dubious ads — all the ringtone ads, the for-free software downloads, ads making false statements of product origin, and various other scams — and I wouldn’t be surprised if the payments at issue total one to two orders of magnitude higher.

Towards a solution

Some of these practices have been improving. For example, six months ago almost all ringtone ads claimed to be “free,” but today some ringtone ads omit such claims (even while other ads still include these false statements).

Recent changes in Google pricing rules seem to discourage some of the advertisers who place ads of the sort set out above. Google has increased its pricing to certain advertisers, based on Google’s assessment of their “low quality user experience.” But the specific details of Google’s rules remain unknown. And plenty of scam ads — including all those set out above — have remained on Google’s site well after the most recent round of rule changes. (All ads shown above were received on September 15, 2006, or later.)

Google already has systems in place to enforce its Adwords Content Policy. My core suggestion for Google: Expand that policy to prevent these scams — for example, explicitly prohibiting ads that claim a product is “free” when it isn’t, and explicitly prohibiting charging users for software that’s actually free. Then monitor ads for words like “free” and “complimentary” that are particularly likely to be associated with violations. When a bad ad is found, disable it, and investigate other ads from that advertiser.
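
As a sketch of what such monitoring might look like (hypothetical on my part, not a description of any existing Google system), a first screening pass could flag ad copy containing “free”-type claims for human review; the field names and keyword list below are my assumptions.

    import re

    # Hypothetical screening pass (my sketch): flag ad copy whose "free"-type
    # claims warrant human review against the FTC's "free" guide. A real system
    # would also check landing pages, pricing, and negative-option terms.
    FREE_CLAIMS = re.compile(r"\b(free|complimentary|no charge|no cost|no obligation)\b",
                             re.IGNORECASE)

    def needs_review(ad):
        """ad: dict with 'title', 'body', and 'display_url' strings."""
        text = " ".join(ad.get(field, "") for field in ("title", "body", "display_url"))
        return bool(FREE_CLAIMS.search(text))

    example = {"title": "Totally Free Ringtones",
               "body": "No credit card required. Get yours now!",
               "display_url": "example-ringtones.com"}
    print(needs_review(example))        # True: route this ad to a human reviewer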

To track and present more dubious ads, I have developed a system whereby interested users can submit ads they consider misleading for the general reasons set out above. Submit an ad or view others’ submissions.

These problems generally affect other search engines too — Yahoo, MSN, and Ask.com, among others. But because Google is the largest search engine, and a self-proclaimed leader on ethics issues, I look to Google first and foremost for leadership and improvement.

Google’s (Non-)Response

When Information Week requested a comment from Google as to the ads I reported, Google responded as follows:

When we become aware of deceptive ads, we take them down. … We will review the ads referenced in this report, and remove them if they do not adhere to our guidelines.

A week later, these ads remain available. So Google must have concluded that these ads are not deceptive (or else Google would have “take[n] them down” as its first sentence promised). And Google must have concluded that these ads do adhere to applicable Google policies, or else Google would have “remove[d] them” (per its second sentence).

Google’s inaction exactly confirms my allegation: That Google’s ad policies are inadequate to protect users from outright scams, even when these scams are specifically brought to Google’s attention.

All identifications and characterizations have been made to the best of my ability. Any errors or alleged errors may be brought to my attention by email.

I thank Rebecca Tushnet for helpful discussions on the legal duties of advertisers and search engines.


Originally posted October 9, 2006. Last Updated: October 16, 2006.