Autodesk and the First Sale Doctrine

Autodesk, Inc. and Timothy Vernor are in a dispute over Mr. Vernor’s resale of Autodesk’s AutoCAD software on eBay.  Autodesk filed a DMCA takedown notice against each eBay auction of AutoCAD software that Mr. Vernor started.  After this happened a few times, Mr. Vernor hired a lawyer and sued Autodesk under the Declaratory Judgment Act, seeking a declaration from a federal court within the 9th Circuit that he had the right to resell Autodesk’s software.

Mr. Vernor won at the trial level.  A copy of the opinion is found at Vernor v. Autodesk, Inc., 555 F. Supp. 2d 1164 (W.D. Wash. 2008).  At the heart of Mr. Vernor’s argument are the protections afforded by section 109 of the Copyright Act, known as the “first sale doctrine.”  That section states: “Notwithstanding the provisions of section 106(3), the owner of a particular copy or phonorecord lawfully made under this title, or any person authorized by such owner, is entitled, without the authority of the copyright owner, to sell or otherwise dispose of the possession of that copy or phonorecord.”  17 U.S.C. § 109(a).  Mr. Vernor argued that his purchases of copies of the AutoCAD software at yard sales could only have occurred if Autodesk had already sold those copies to another party before Mr. Vernor bought them.  Therefore, the first sale doctrine would immunize Mr. Vernor from liability under the Copyright Act.

Autodesk, on the other hand, argued that it had effectively never sold a copy of its software to anyone, because every sale of its software is subject to a licensing agreement that specifically forbids transfer of the software, and the software in Mr. Vernor’s possession was not sold but was transferred to the prior holder via a settlement agreement between that entity and Autodesk.  Furthermore, the software itself is offered only under a restrictive license, making any subsequent holder of a copy a licensee.  As a result, Autodesk argued, section 109 does not provide a person such as Mr. Vernor any defense.

After the trial court entered judgment for Mr. Vernor, Autodesk appealed.  The Ninth Circuit reversed the trial court.  A copy of its opinion is found at Vernor v. Autodesk, Inc., No. 09-35969 (9th Cir. Sept. 10, 2010).  The Ninth Circuit established a three-part test for determining whether the subsequent holder of a copy of software owns that copy or is merely a licensee: “We hold today that a software user is a licensee rather than an owner of a copy where the copyright owner (1) specifies that the user is granted a license; (2) significantly restricts the user’s ability to transfer the software; and (3) imposes notable use restrictions.”

For fun, I downloaded a copy of the End User License Agreement under which Microsoft licenses its Office suite, which you can read here: clientallup_eula_english.  I know you will be surprised to discover that Microsoft licenses, but does not sell, its software to end users.  Section 7 of the agreement provides a whole host of restrictions on use and resale of the software.  So I checked eBay to see if anyone would sell me a copy of Microsoft Office, and this morning I found 9,623 offers.  Searching for AutoCAD turned up over 2,400 copies for sale.  Apparently many people who possess copies of software don’t pay much attention to the license agreement that makes them licensees rather than owners – and that, under this ruling, makes them copyright infringers when they offer those software packages for sale on sites like eBay.

The Microsoft EULA does provide that “use of the software” constitutes acceptance of the agreement.  Mr. Vernor indicated that he never used the copies of AutoCAD and therefore was not bound by any agreement with Autodesk, but this was not dispositive for the Ninth Circuit, because he bought the software from a prior holder, CTA, that could not be called an “owner” under the agreement between CTA and Autodesk.  I’d expect this ruling from the Ninth Circuit to cause some trouble for licensees, many of whom probably never imagined, when they bought that shrink-wrapped CD, that they could not resell it later – limited licensing agreements are that common in the world of proprietary software today.  Open Source, here we come!

New Maryland Power of Attorney Statute Goes into Effect Oct. 1

Maryland’s General Assembly passed a new statute governing powers of attorney under Maryland law, including two new statutory forms intended to standardize the forms attorneys use to help their clients plan their estates.  Powers of attorney executed after October 1 will need to comply with the requirements of the statute, including the requirement that the principal’s signature be notarized and accompanied by the signatures of two witnesses.  In addition, the new statutory forms provide a space for the Agent to sign, accepting the appointment, and for the Agent’s signature to be notarized.

Hopefully the changes to Maryland law will help reduce the fraud and confusion with regard to what constitutes a valid power of attorney.  The Maryland Law Blog provides further comments and links to the legislation.

Bad Comcast – Bad!

According to the Tennis Channel, Comcast discriminates against its programming in favor of sports channels that Comcast owns directly, with the result that the Tennis Channel is distributed to about one-fifth as many homes.  (See article here)

If true, Comcast, as a carrier of cable data signals, may be engaging in the kind of content discrimination that harms the marketplace and should be regulated.  This article caught my eye this morning in light of the previous posts on net neutrality on this blog.  The obvious question is whether Comcast’s proposed acquisition of NBC would result in the same kind of preferential treatment for NBC shows carried on Comcast’s cable network, and whether such discrimination will begin to occur for internet users looking for particular internet content.  Perhaps we will see the same rise in network discrimination that we saw when banks started charging non-customers a fee to use their ATMs.

Crackberry, DNS Disasters & Other Horrors

BlackBerry users around the world were cross with RIM over two email outages in as many weeks.  (See article here)  In addition, some grinches were busy trying to steal Christmas from a number of last-minute Amazon and Walmart shoppers.  (See article here)  And Google had an outage earlier this year when Michael Jackson died: the search engine got so many queries for the singer that it thought it was under attack and stopped responding.  (See article here)  These outages reflect one of the great technology design challenges: single points of failure.  In the BlackBerry’s case, the basic method for getting email from a desktop to the handheld requires that email messages be copied from the local computer and transferred through a RIM-controlled relay server to the user’s BlackBerry.  That relay server becomes a single point of failure for the RIM network.

With the Amazon outage this holiday season, the cause was a distributed denial of service (DDoS) attack aimed at the domain name system (DNS) hosting company that is responsible for telling users looking for http://www.amazon.com that the domain is located at the IP address 72.21.207.65.  By design, there can be only one “authoritative” group of DNS servers for a domain – the servers that answer, for the entire internet, queries asking for the number behind the name.
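
For the curious, here is a minimal Python sketch of that name-to-number lookup, using the standard library’s resolver.  The address returned depends on Amazon’s DNS records at the moment you run it, so it will almost certainly differ from the 72.21.207.65 cited above.

```python
import socket

# Ask the system resolver for the IP addresses behind www.amazon.com --
# the same name-to-number translation a browser performs before it can
# connect. The answer ultimately traces back to the authoritative DNS
# servers for amazon.com, the single point of failure discussed above.
for info in socket.getaddrinfo("www.amazon.com", 443, proto=socket.IPPROTO_TCP):
    family, _, _, _, sockaddr = info
    print(sockaddr[0])
```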

These single points of failure are targeted by Murphy’s Law and malicious hackers alike, and network engineers and security experts have made careers designing better mousetraps to mitigate these fundamental weaknesses in their computer systems.  When you consider the amount of money and talent that some of these very large companies command, it underscores for me how fragile our existing information infrastructure really is.  Tremendous resources have been focused on making the amazon.com web site highly available and highly accurate, but in spite of that extraordinary effort, there are still outages around Amazon’s busiest time of year.
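
To make the design point concrete, here is a toy Python sketch of one common mitigation: instead of depending on a single relay, a client tries each of several redundant endpoints in turn.  The hostnames are hypothetical, stand-ins for whatever a real deployment would use.

```python
import socket

# Hypothetical redundant relay endpoints; a real deployment would use
# its own hosts (and, ideally, spread them across separate networks).
RELAYS = ["relay1.example.com", "relay2.example.com", "relay3.example.com"]

def connect_with_failover(port=443, timeout=5):
    """Return a connection to the first reachable relay."""
    for host in RELAYS:
        try:
            return socket.create_connection((host, port), timeout=timeout)
        except OSError:
            continue  # this relay is down or unreachable; try the next
    raise ConnectionError("all relays unreachable")
```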

A challenge for the new decade will be fundamentally improving the reliability of our computer networks, making “High Availability As A Service” one of the new ‘net offerings for computer systems of all sizes.  Maybe you all should put that on your list for Santa for next Christmas!

Net Neutrality: An Analysis

Earlier in December, I wrote a blog entry entitled “Net Neutrality and other Myths,” in which I discussed some of the issues with net neutrality.  I have since changed the title of that article to “Net Neutrality and other Misnomers,” because net neutrality is not so much a fiction as an unclear name for the issue (though alliterative and catchy).

Net neutrality is not a “myth” so much as a misnomer, because from a technical perspective all networks must discriminate (e.g., set priority levels) to some extent in order to function when there is insufficient bandwidth for every request to be handled at the same time.  Therefore, no network can be completely “neutral” about the traffic it carries and still function for its users.  This is the same reason the postal service offers different rates for different speeds of delivery.  You pay more for priority shipping than for standard shipping; otherwise, delivery would take longer for everyone (and the system might not work as well as it does today for the billions of packages it handles each year).  Just like the USPS, computer networks have to prioritize some traffic in order to work properly.
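
For readers who like to see the machinery, here is a toy Python model of that prioritization, assuming a simple “lower number ships first” scheme; real routers use far more elaborate queueing disciplines, but the principle is the same.

```python
import heapq

# Toy model of a congested outbound link: packets wait in a priority
# queue, and as capacity frees up, the highest-priority (lowest number)
# packet is sent first -- the "priority mail" of the network.
queue = []
heapq.heappush(queue, (0, "VoIP frame"))         # latency-sensitive
heapq.heappush(queue, (1, "email message"))      # tolerant of delay
heapq.heappush(queue, (2, "bulk video chunk"))   # best effort

while queue:
    priority, packet = heapq.heappop(queue)
    print(f"sending (priority {priority}): {packet}")
```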

The net neutrality debate is not so much about this technical problem as it is about the concern that service providers and content providers may align to prioritize the delivery of specific content at the expense of other content providers, driving some off a network such as Verizon’s.  This, I agree, is a bad thing.  I don’t think Comcast should have the right to contract with a content provider like NBC and then de-prioritize content from TNT or CBS.  That would be like the USPS setting the priority of your package based on whether it contains a legal opinion letter to your client or junk mail urging you to switch back to Comcast.  I don’t like the idea of the postal service opening my letter to you to figure out how fast it should deliver it (or whether to deliver it at all, for that matter).  Instead, I prefer to pay more to make the letter, irrespective of its content, reach its intended recipient faster.

This is, in fact, the state of internet access today.  Service providers charge more for faster (or symmetrical) internet access as compared to their “standard” service offering.  If you want symmetrical gigabit internet access, you can get it, but it will cost far more than the $55 per month I pay for my FiOS internet service at home.  Net neutrality would effectively require that I not have to subscribe to one particular provider’s service in order to get faster access to a certain kind of content.

A concern expressed by one reader of my blog was that without net neutrality, the major service providers will have no incentive to increase the bandwidth available to subscribers.  This concern, however, flies in the face of the actual increase in bandwidth available to residential internet subscribers since the 1990s, absent any clear “net neutrality” principles enforced by the FCC.

Back in 1993, my 486 came equipped with a 2400 baud modem, which allowed me to send and receive data at the whopping speed of 2.4 kilobits per second.  When I first signed on with America Online for dial-up service in 2000, the best modem speeds were 56 kilobits per second.  So in seven years, internet access speeds had increased by a factor of 23.  In 2003, I was among the first subscribers in my neighborhood for ADSL, with a top speed of 768 kilobits per second (though upload speeds were slower – 128 kilobits).  In that period of three years, my internet access speed had increased by a factor of 13.  In 2009, I became a FiOS customer, which provides me with about 15,000 kilobits per second of download speed (and 5,000 kilobits upload).  In that period of six years, my available bandwidth increased by a factor of 19.  From 1993 to 2009, the internet speed available to me in Maryland increased at a steady rate.  However, the cost per kilobit of bandwidth is where the real story of improvement lies:

Year   Bandwidth (kbps)   Annual Cost (in 2008 dollars)   Unit Cost per kbps
1993   2.4                $336                            $140.00
2000   56                 $326                            $5.82
2003   768                $420                            $0.55
2009   15,000             $660                            $0.04

Back in 1993, bandwidth was expensive per kilobit.  Today, the bandwidth available to me costs almost nothing per kilobit.  (I grant that my upload speeds are slower with DSL and FiOS – FiOS upload is a mere 5,000 kilobits per second, which works out to about $0.13 per kbps – still a dramatic drop in cost from 1993.)  And all of this occurred without a firm “net neutrality” policy in place.
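
The unit-cost column is just the annual cost divided by the bandwidth; a few lines of Python reproduce the table’s figures:

```python
# Unit cost = annual cost (2008 dollars) / bandwidth (kbps),
# using the figures from the table above.
history = [
    (1993, 2.4, 336),
    (2000, 56, 326),
    (2003, 768, 420),
    (2009, 15000, 660),
]
for year, kbps, annual_cost in history:
    print(f"{year}: ${annual_cost / kbps:.2f} per kbps")
```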

Now, the reader pointed out that the phone companies that control most consumer internet access have had no incentive to help out VOIP, which is essentially a competing phone service that uses one’s internet connection to send and receive phone calls.  Ironically, a purist net neutrality policy would actually hinder improvements to VOIP.  For VOIP service to improve, VOIP packets themselves would need to be given priority across the internet so that a greater percentage of those packets reach their destination on time.  Strict net neutrality insists on no differentiation among content for purposes of priority.
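
As a concrete illustration, the conventional way an application asks a network for that priority is to mark its packets with a DSCP value.  Here is a minimal Python sketch, assuming a platform that exposes the IP_TOS socket option, that tags outbound UDP datagrams with “Expedited Forwarding” (DSCP 46), the marking commonly used for voice; the destination address is a placeholder from the documentation range.  Whether any carrier along the path honors the mark is entirely up to that carrier – which is exactly the policy question.

```python
import socket

# Mark outbound packets with DSCP 46 ("Expedited Forwarding"), the
# class commonly used for voice traffic. The DSCP value occupies the
# top six bits of the old TOS byte, hence the shift.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, 46 << 2)

# Placeholder destination (RFC 5737 documentation address, RTP port).
sock.sendto(b"voice payload", ("192.0.2.10", 5004))
```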

Of course, Verizon and the other major phone carriers have not wanted to encourage VOIP, at least when it first came out, because VOIP was direct competition for phone service, especially long distance service (where the bulk of VOIP users’ cost savings lay).  Today, Vonage and Skype both offer competing local and long distance services that ride on a user’s internet access.  In fact, an article (click here) indicates that both providers have expanded their flat-rate international calling plans to between forty and sixty countries.  In addition, cell phone plans with flat rates for local and long distance calling are becoming increasingly popular.

Would net neutrality have improved VOIP service sooner?  Perhaps.  But if you wanted a VOIP call to be as reliable as your POTS line at home in 2003, the FCC would instead have needed to require internet service providers to prioritize VOIP packets over other kinds of traffic on their networks.  That would not really be “neutrality.”  The real point, I think, is that the large telecommunications companies did not appreciate the competition for one of their core products, and were not going to do anything to support the technology or the vendors that might sell it.

I don’t think it is clear, however, that a government mandate to support a technology like VOIP would have led to more reliable VOIP services for consumers.  Back in the 1990s, the government did intervene to try to increase competition in the local phone service market, by requiring that major carriers like Verizon give smaller competitive local exchange carriers (CLECs) access to their equipment in order to deliver competitive local phone service.  Initially, a lot of new vendors came into the market and started providing service, such as WinStar, WorldCom, and others.  At the end of the dot-com bubble in 2001, many of these CLECs closed their doors because, even with government intervention, the market could not support that many local exchange carriers.  And CLECs got a bad rap in the market because many turned out to be vapor-vendors that sold more air than actual services to customers.  In some cases, these vendors delivered poorer or less reliable local phone service than Verizon.  And at least one of them, WorldCom, saw its CEO indicted for cooking the books and fraudulently overbilling customers.

Today, Verizon and Comcast have expanded their reach into producing content available to internet users.  Comcast recently moved to acquire NBC.  Verizon’s FiOS service includes on-demand offerings that provide, in some cases, exclusive access to content.  The principles of net neutrality would require that Comcast not make access to NBC content (like Law & Order) faster on its network than content from competing providers.  And I fully agree with this principle.

But absent such a policy, I’m not convinced that internet access speeds will drastically slow down just because the major providers of internet access also control some of the major content producers for the internet.  To the contrary, I agree with Ray Kurzweil that we are really in the knee of the curve with regard to the amount of bandwidth available to consumers.  Even more dramatic increases in CPU speed, memory, computing power and bandwidth are coming, and will likely arrive well within my lifetime.  See Ray Kurzweil, The Singularity is Near (Viking, 2005).  And I do not believe that any one entity has sufficient control over this growth to halt it.  The increasing complexity and sophistication of technology is driven by more than any single telecommunications carrier, no matter its size or assets.  Stopping this progression at this point would require a dinosaur-killer comet hitting the earth – or, more likely, every teenager on the planet giving up their cell phones, Twitter and Facebook.

Furthermore, the internet itself, without net neutrality, has fostered an explosion of user-generated content.  The growth here is so extreme that there has been a significant decentralization of the entertainment industry.  See The Long Tail by Chris Anderson.  By allowing more people to self-publish, the internet has moved us further away from the centralized publishing of content by the few.  And the fact is that those of us who want to self-publish are willing to pay for internet access sufficient to do so (whether by sitting at Starbucks each day on its Wi-Fi connection, or by paying $55 a month for FiOS at home).

In sum, net neutrality is a misnomer, telecommunications companies should not be able to discriminate against access to content they do not own, and even if they are doing this, the world and the technology in it is moving on without them.

iPhone Security

Here is an article on the ABA site regarding iPhone security problems.  (click here)  The authors, Sharon Nelson and John Simek, point out three basic problems with the iPhone: (a) the iPhone’s encryption features can be defeated via a series of steps involving SSH, (b) remote wipe can be circumvented by keeping the iPhone off the 3G network, and (c) the PIN that locks the iPhone can be bypassed by placing the phone in recovery mode.

For most iPhone users, the phone is used to synchronize email, calendars, and contacts, any of which may contain confidential information.  As for email, most users don’t encrypt their email in the first place, so messages sent between lawyer and client are susceptible to interception once they leave the walls of the law practice.  This is a problem that pre-dates the iPhone.  As for stealing an iPhone to access email – frankly, it might be easier to attack the user’s Outlook Web Access account published through the firm’s web site, or to attack the Exchange server directly if it is reachable via SMTP.

Calendars are another matter.  Most attorneys do put some information into calendar events: that they are meeting with a client, the client’s name, and the purpose of the meeting.  But lawyers generally do not append 10-page client summaries to calendar items, so the calendar itself, while containing some arguably confidential information, would not be fatal if lost.  Contacts on the iPhone would likewise give a hacker some idea of who the lawyer’s clients are and how to reach them, but other than private cell phone numbers, there usually is not much more information about clients in the address book.  (Firms should probably stop and think about whether the above generalizations are true for them.  If these items hold a lot more confidential information, you might need to consider more substantial mitigations for the risk of losing the phone.)

To me, the bigger risk would be lawyers using their iPhones to store confidential documents received from a client – to read them on the way home, for example.  There are also other sensitive items, like usernames and passwords for client systems, that might be stored on the iPhone for the attorney’s convenience.  I would argue that these kinds of files should not be on any smartphone.  Instead, such items should not leave the firm at all, but should be accessed via some kind of secure web site controlled by the firm as a matter of policy.

The authors also point out that the iPhone’s remote wipe feature, which mitigates the lost-iPhone scenario, is insufficient if the iPhone is taken off the wireless 3G network, because the device cannot then be reached to be wiped.  Of course, this is also how remote wipe works for BlackBerry Enterprise Server users, so while this is a problem, the iPhone is in good company (the ABA conducted a survey of smartphone usage in law firms in 2009, and the majority used BlackBerrys).  The authors cite Windows Mobile as having figured this out, but I’m pretty sure Windows Mobile is on the way out of the market.

In conclusion, the authors are right that there are some security problems with the iPhone, and attorneys should think about those issues to protect the confidentiality of their work product.  But all technology presents risks for users, and those risks must be balanced by mitigations that are reasonable for the circumstances.  To this humble author, some of those mitigations should be implemented regardless of the smartphone being used.

I was just kidding about “beach law”

But apparently there is a growing law practice for some who litigate in the panhandle of Florida (see article here).  The case is Stop the Beach Renourishment, Inc. v. Florida Department of Environmental Protection, and it went to the U.S. Supreme Court in early December for oral argument.  At the heart of the case is whether the Florida beach restoration program amounts to a taking of private property without due process of law – a violation of the 5th and 14th Amendments, according to the plaintiffs.  For the plaintiffs, the problem is that the restoration program causes the restored shore to become public property.  That makes sense, in that the public paid to repair the beach from erosion.  It’s bad for the homeowners who want their privacy on their own little section of the beach.

You can find some of the documents filed in this matter on JD Supra.  Additional analysis and documents filed in the matter are available here.

Software Licensing for Businesses

Here is a very good article on software license auditing for businesses: (click here for story).  The issues for businesses are twofold: (a) keeping track of the licenses the business has purchased, and (b) keeping track of, and understanding, the licensing agreements that govern the software.  The former can be handled by software.  For example, Microsoft publishes Systems Management Server (SMS), which includes a software audit and metering tool.  I understand that Altiris also offers a solution, and undoubtedly there are other packages out there that can tell you what’s running on your network.
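
As a taste of what the inventory half involves, here is a bare-bones Python sketch that lists the software registered under the Windows “Uninstall” registry key – roughly what Add/Remove Programs displays.  Real tools like SMS or Altiris do far more (metering, rollups across machines), but the raw data starts in places like this.

```python
import winreg

# Enumerate programs registered under the Windows "Uninstall" key.
# Run on the machine being inventoried; requires Windows, and a 32-bit
# process may see a different registry view than a 64-bit one.
UNINSTALL = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL) as key:
    subkey_count = winreg.QueryInfoKey(key)[0]
    for i in range(subkey_count):
        with winreg.OpenKey(key, winreg.EnumKey(key, i)) as sub:
            try:
                name, _ = winreg.QueryValueEx(sub, "DisplayName")
                print(name)
            except FileNotFoundError:
                pass  # entry has no display name; skip it
```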

The latter, however, requires a human being to review the license agreement terms and then analyze purchase history against the usage reported by the audit/metering tool.  And the license agreements themselves are often as clear as mud, especially if you have multiple, overlapping agreements for a variety of software packages.  Even small businesses may have a substantial number of software packages and licenses acquired over time, so keeping up with it all – and staying out of trouble in an audit – can take real effort and concentration.

Net Neutrality & Other Misnomers

I happened across an article on net neutrality today in which AT&T was lobbying the Federal Communications Commission (FCC) to moderate its expectations for nondiscrimination rules for internet service providers.  (See article here)

I suspect a fair number of internet users couldn’t care less about this debate.  In some parts of our nation, internet access is nearly ubiquitous, with the advent of iPhones, free Wi-Fi hot spots, and the expansion of broadband networks by big companies like Verizon and Comcast.  With all of this technology available, most users probably take internet access for granted.  And I doubt many users think much about bandwidth limits or network priority when accessing a web site.  For most, typing search terms into Google or a URL into the browser’s address bar almost always gets them where they were trying to go.  And so net neutrality sounds like some faraway monster that fictional knights are trying to slay.

At its heart, net neutrality is a regulatory command to U.S. internet service providers not to discriminate against certain content that may appear on internet sites.  So, for example, net neutrality rules might prohibit a service provider like Verizon from blocking or reducing the priority of traffic from a site like YouTube or hulu.com.  Priority is typically the way network managers ensure that critical traffic always gets to its destination while less important traffic waits.  In terms of the overall internet, service providers might decide that email traffic is more important than video downloads; your email would then always reach its destination, but your YouTube videos might stutter or fail to load at all.

Corporate network managers deal with this problem regularly, because businesses generally cannot afford to buy all the bandwidth their users could consume.  The principle is similar to the growth in hard drive storage: in spite of the near-exponential growth of storage space over the last twenty years, computer users keep finding more stuff to fill their hard drives with.  Bandwidth is the same way: the more that is available, the more users will use.

The larger internet service providers will ultimately face the same kinds of challenges on the consumer end of the market.  Most consumers today pay some kind of flat rate for internet access.  FiOS, for example, gives you a 10-30 megabit circuit for a flat rate of less than $100 a month.  Residential DSL and cable internet access are similarly priced.  For that matter, most cell phone data plans are also flat rate.  I suppose one answer is to go back to a pay-for-usage model, where consumers are billed per megabyte or gigabyte of internet usage each month.  Those who use more, pay more.

I can’t say I’m a huge fan of this approach, but the truth is that, perhaps unfairly, a minority of consumers likely use the bulk of the available bandwidth.  So the rest of us are probably paying in part for the usage of others, and there is a kind of unfairness in that.  On the other hand, no individual consumer pays an incredible amount, because the overall cost of operating the internet’s infrastructure is distributed across many, many millions of users in the U.S. and globally.  So you might view internet access as another entitlement, like basic health care or shelter.

The rumblings about net neutrality take another form, however, and that is the fear that internet service providers may make deals with certain content providers (like Hulu), where the service provider allocates more bandwidth (or a higher priority) to Hulu than to competing video sites.  So, if you want to view a video that is available from several content providers, but your service provider has a deal with Hulu, the video will work properly when you go to Hulu and maybe not when you go to Hulu’s competitors.  The service provider will thus tend to steer its users toward the site that works, and the truth is that you likely wouldn’t even know why.  And then the service providers would control the world, just like the underpants gnomes from South Park.

Well, maybe not, but exclusivity arrangements between service providers and content providers would likely run smaller operations out of business.  That would probably be bad for competition.  It might even raise antitrust concerns if the providers on both sides are large enough to corner a market.  Do the service providers want to be further regulated?  Of course not.  But should our internet refrain from discriminating against certain content?  Probably.  In fact, if government actors cannot discriminate based on viewpoint under the First Amendment, the FCC can only choose between implementing a non-discrimination regulation or taking no action at all.  If the FCC were to sanction service provider discrimination against otherwise lawful content solely because of its viewpoint, the injured content authors might very well be able to state a constitutional claim against the FCC’s regulation.

As a practical matter, much of this remains a theoretical problem today.  But the principle the FCC is trying to foster – that we should not leave censorship to internet service providers – is a sound one.  In practice, some discrimination is required to ensure that large networks work for everyone who connects to them.  And discrimination against illegal content that endangers your credit card account or results in malicious destruction of property is likely necessary.  But I’m not convinced that these issues should simply be left to the free market to figure out, any more than I want Microsoft to tell me what I can or cannot say on a public street corner.