Earlier this month, the U.S. government surprised the Internet community by announcing that it plans to back away from its longstanding oversight of the Internet domain name system. The move comes more than 15 years after it first announced plans to transfer management of the so-called IANA function, which includes the power to add new domain name extensions (such as dot-xxx) and to alter administrative control over an existing domain name extension (for example, approving the transfer of the dot-ca domain in 2000 from the University of British Columbia to the Canadian Internet Registration Authority).
My weekly technology law column (Toronto Star version, homepage version) notes the change is rightly viewed as a major development in the ongoing battle over Internet governance. Yet a closer look at why the U.S. is embarking on the change, and what the system might look like once the transition is complete, suggests that it is not relinquishing much power anytime soon. Rather, the U.S. has ensured that it will dictate the terms of any transfer and retain a "super-jurisdiction" for the foreseeable future.
Day-to-day administration of the domain name system is currently managed by the Internet Corporation for Assigned Names and Numbers (ICANN), a U.S.-based non-profit company that operates under a contract with the U.S. government. Critics argue that this means that the U.S. retains final authority over key Internet governance decisions.
The United Nations and supporting governments have attempted to loosen U.S. control on several prior occasions without success. Despite those failures, the U.S. now voluntarily says it will walk away from its oversight power, tasking ICANN with developing a transition plan that must "support and enhance the multistakeholder model." The U.S. adds that it will not accept a proposal based on a government-led or an inter-governmental organization solution, short-circuiting any hopes the U.N. might have had for assuming control.
Why is the U.S. proposing to walk away now? In recent months, there has been growing momentum to revisit the issue, triggered by the Edward Snowden revelations of widespread Internet surveillance. Although NSA surveillance has no real connection to Internet governance - the management of the domain name system is not typically a surveillance target - the issue has galvanized many countries and groups who sense an opportunity for change. By forcing the issue, the U.S. has successfully seized the agenda and set the conditions for a transfer of power.
While a transfer would be perceived by many to represent a change in control, the reality is that the U.S. will not be relinquishing much power even when (or if) the transition occurs. In the years since the U.S. first indicated that it would shift away from Internet governance, it has steadily erected jurisdictional authority over a considerable portion of the Internet infrastructure.
For example, in 2009 the U.S. and ICANN entered into an agreement that institutionalized "the technical coordination of the Internet's domain name and addressing system." That document included a commitment for the U.S. to remain involved in the Governmental Advisory Committee (GAC), the powerful body within ICANN that allows governments to provide their views on governance matters. It also contained an ICANN commitment to remain headquartered in the U.S., effectively ensuring ongoing U.S. jurisdiction over it.
Not only is the U.S. able to assert jurisdiction over ICANN, but it has also asserted jurisdiction over all dot-com, dot-net, and dot-org domain names. In 2012, a U.S. court ordered the seizure of a dot-com domain that was registered in Canada with no U.S. connection other than the location of the domain name registry. This effectively means the U.S. retains jurisdiction over half of all domain name registrations worldwide regardless of where they are registered or who manages the system.
The U.S. might transition away from the current model (though the initial 2015 date seems ambitious), but much of its jurisdictional power will remain largely unchanged. The latest announcement has the potential to fulfill a promise made nearly two decades ago, but skeptics can be forgiven for suspecting that power over Internet governance will remain firmly rooted in the U.S. no matter how the issue is resolved.
Last week marked the 25th anniversary of the drafting of Tim Berners-Lee's proposal to combine hypertext with the Internet that would later become the World Wide Web. Berners-Lee used the occasion to call for the creation of a global online "Magna Carta" to protect the rights of Internet users around the world.
The desire for enforceable global digital rights stands in sharp contrast to the early days of the Web when advocates were more inclined to tell governments to stay away from the burgeoning medium. For example, John Perry Barlow's widely circulated 1996 Declaration of the Independence of Cyberspace, asked governments to "leave us alone", claiming that conventional legal concepts did not apply online.
While the notion of a separate "cyberspace" would today strike many as inconsistent with how the Internet has developed into an integral part of everyday life, the prospect of a law-free online environment without government is even more at-odds with current realities. Rather than opposing government, there is a growing recognition of the need for governments to ensure that fundamental digital rights are respected.
My weekly technology law column (Toronto Star version, homepage version) notes that building on Berners-Lee's vision of global online protections, the World Wide Web Foundation, supported by leading non-governmental organizations from around the world, has launched a "Web We Want" campaign that aims to foster increased awareness of online digital rights. The campaign focuses on five principles: affordable access, the protection of personal user information, freedom of expression, open infrastructure, and neutral networks that do not discriminate against content or users.
Supporters recognize that global protections are more likely to develop on a country-by-country basis, with potential domestic support for national digital bills of rights. In the United Kingdom, the opposition Liberal Democrats have already thrown their support behind a digital bill of rights, while the United Nations Human Rights Council has backed a resolution declaring Internet access and online freedom of expression a human right.
With Industry Minister James Moore set to unveil the long-awaited national digital strategy (reportedly to be dubbed Digital Canada 150), these issues have the potential to play a starring role.
The government has identified universal access as a key issue, allocating $305 million in the most recent budget for broadband initiatives in rural and remote communities. While there is some disagreement on a target date for universal Canadian broadband - the CRTC has set its goal at 2015, while the federal government is content with 2019 - there is a consensus that all Canadians should have affordable broadband access and that there is a role for the government to make that a reality in communities that the leading Internet providers have largely ignored.
The protection of personal information raises questions about the adequacy of current privacy rules and the concerns associated with widespread surveillance. Industry Canada's Report on Plans and Priorities for 2014-15 quietly referenced "modernizing the privacy regime to better protect consumer privacy online" as a legislative priority for the coming year, the clearest signal yet that the government plans to re-introduce privacy reform.
The surveillance concerns will undoubtedly prove even more challenging, with the government saying little about the steady stream of revelations of government-backed surveillance. The Canadian role in global surveillance activities and the government's decision to revive lawful access legislation represent the most disturbing aspects of online policies that must be addressed for digital rights leadership.
As the government finally embarks on its digital strategy, it has an opportunity to do more than just tout recent policy initiatives. Instead, it should consider linking its goals with the broader global initiatives to help create the Web we want.
Yesterday, I was contacted by a Toronto radio station wanting to discuss the wireless pricing increases that have occurred over the past few months (including increases over the weekend at both Rogers and Bell). Their key question was what lay behind the increased prices. While some might point to reduced roaming revenues or costs associated with the spectrum auction, I believe the answer is far simpler.
The carriers increased prices because they can.
Indeed, this is precisely what the Competition Bureau of Canada concluded could and would happen in its analysis of the wireless environment in Canada. In its January 29, 2014 submission to the CRTC, it stated:
In the Bureau's view, mobile wireless markets in Canada are characterized by high concentration and very high barriers to entry and expansion. Furthermore, Canadian mobile wireless markets are characterized by other factors that, when combined with high concentration and very high barriers to entry and expansion, create a risk of coordinated interaction in these markets. Given these factors, the Bureau's view is that incumbent service providers have market power in Canadian retail mobile wireless markets.
And what is market power? As the Bureau notes, "market power is the ability of a firm or firms to profitably maintain prices above competitive levels (or similarly restrict non-price dimensions of competition) for a significant period of time."
The risk of coordinated action and the ability to profitably maintain prices above competitive levels? Sounds familiar.
The Canadian Copyright Institute, an association of authors and publishers, has released a new paper that calls on the Canadian education community to stop relying on its current interpretation of fair dealing and instead negotiate a collective licence with Access Copyright. The paper was apparently published in the fall but is being released publicly now since Canadian education groups have refused to cave to Access Copyright's demands.
The CCI document, which raises some of the same themes found in an Association of Canadian Publishers' paper that distorts Canadian copyright law (thoroughly debunked by Howard Knopf), features at least three notable takeaways: the shift to threats of government lobbying, long-overdue admissions that the value of the Access Copyright licence has declined, and an emphasis on arguments that have been rejected by the courts and the government. There are also three notable omissions: the fact that the overwhelming majority of copying in schools is conducted with publisher permission, the role of technological neutrality, and the relevance of other copyright exceptions. By the end of the document, the CCI and Access Copyright work to fabricate a new fair dealing test that is inconsistent with Supreme Court of Canada rulings, even as they call for dialogue so long as it leads to a new collective licence.
The Notable Takeaways
First, the CCI threatens the education community that it will lobby the government to change the law unless it resumes paying Access Copyright:
Without an acceptable solution - in other words, the resumption of licensing for schools, colleges and universities - writers and publishers will have to pursue political as well as legal solutions. This is not their preference. There exists a long and valued relationship (symbiotic, even) among writers, publishers, educators and students. We believe that there is a better way forward.
The threat of political solutions is particularly laughable given that the same groups lobbied extensively for two years during the Bill C-32/C-11 process to urge the government to scale back fair dealing. Despite numerous appearances before parliamentary committees, star witnesses, social media campaigns, and public opinion pieces, the government completely rejected their demands. With no appetite for more copyright reform in Ottawa, the threat of a renewed lobby campaign is no threat at all.
Second, Access Copyright and the CCI finally admit that the recent legal changes have reduced the value of their collective licence. After the Supreme Court decisions, Access Copyright stated:
This decision, however, has no impact on the requirement that royalties continue to be paid on the hundreds of millions of pages of student texts that are copied for use in K-12 classrooms.
It even argued after the decision that the Supreme Court had not ruled that the copies at issue were fair dealing. Now the groups acknowledge:
Copyright owners may not like but they do accept the Alberta (Education) decision, and that means accepting a lower value for Access Copyright licensing.
In fact, the decreasing value of an Access Copyright license stems from more than just changes to Canadian copyright law. The collective has also admitted that works older than 20 years are unlikely to be copied under its licences. In its 2012 Payback FAQ to authors, the collective noted:
Q. Why are you only asking for works published within the last 20 years?
A. Our statistical analysis of copying data shows that works published more than 20 years ago are unlikely to be copied under our licences.
This admission from Access Copyright shows how its repertoire is declining in value: a growing percentage of newer materials are available through alternative means, while older materials, though perhaps not covered by an alternative licence, are unlikely to be copied at all. Over the coming years, the Access Copyright squeeze is only going to grow, as the entire repertoire of materials likely to be copied - the materials published within the last 20 years - are all published in the digital/Internet era, with many available through alternative means such as open access or site licences.
Third, the document's emphasis on the Supreme Court's dissenting opinion and its attempts to downplay the law provide a sure sign of a weak argument. The law of the land is reflected by the majority, not the minority view. The references to a "very powerful dissent" or the "bare majority" suggest doubt that simply does not exist. As I pointed out in this post, each of Access Copyright's key arguments (user rights, copier perspective, private study, and aggregate copying) was rejected by the court. The majority view is unlikely to be revisited in the short term. In fact, should the issue return to the court, it is worth noting that the majority judges all remain on the bench, whereas the dissent has already had one retirement with another on the way.
The document also tries to downplay the effect of the Court's decision on numerous occasions. For example, it states:
with the recent addition of "education" as a fair dealing purpose, we accept that some copying for classroom distribution now meets the first test for what can be fair dealing - subject to the very important second test of fairness.
Yet the first test only requires an appropriate purpose. With the inclusion of education in the law as one of the purposes, all copying for classroom distribution undoubtedly meets that part of the test.
What the CCI and Access Copyright Do Not Say
The document is also notable for what it does not say. The CCI and Access Copyright emphasize the 250 million copies made annually, rather than the 16.9 million copies addressed by the court. Yet the evidence in the case before the Copyright Board actually found far more copying. The Access Copyright-sponsored study that lies at the heart of the K-12 case that ended up in Canada's highest court found that schools already had permission to reproduce 88% of all books, periodicals, and newspapers without even conducting a copyright analysis or turning to the Access Copyright licence.
That study, conducted by Circum Network Inc., tracked the photocopying practices at hundreds of schools across the country with full logging of all copying over two-week periods. The study found a huge amount of photocopying - the Canada-wide estimate was 14 billion copies - but the overwhelming majority have nothing to do with Access Copyright. In fact, once personal copies, unpublished copies, administrative documents, and self-produced documents were accounted for, the number of copies dropped to 4.5 billion. Most of those 4.5 billion copies were taken from books, but there was permission to reproduce nearly 4 billion of the copies without Access Copyright.
In other words, Access Copyright's own evidence is that schools obtained permission (typically through direct licences or permission from the publishers from whom they purchased hundreds of millions in books) to cover 88% of their book, periodical, and newspaper copying. Access Copyright is simply irrelevant for the overwhelming majority of copying even before anyone conducts a fair dealing analysis.
The document also conveniently omits the Supreme Court's emphasis on technological neutrality. For example, it states:
The Court looked only at photocopying of "short excerpts". It said nothing about digital delivery. And in CCH, the Court questioned whether it would have come to the same conclusions with other methods of copying and if longer excerpts were involved.
Yet the court's discussion of alternative digital delivery models does not help Access Copyright given the principle of technological neutrality articulated in the ESAC case:
The principle of technological neutrality requires that, absent evidence of Parliamentary intent to the contrary, we interpret the Copyright Act in a way that avoids imposing an additional layer of protections and fees based solely on the method of delivery of the work to the end user. To do otherwise would effectively impose a gratuitous cost for the use of more efficient, Internet-based technologies.
The singular focus on fair dealing also omits the many additional exceptions available to education. The fact that much of the copying of short excerpts may simply be de minimis and not even require a fair dealing analysis (much less an Access Copyright licence) is not discussed, though the Copyright Board wants the collective to address the issue. Moreover, the education Internet exception, the non-commercial user generated content exception, the distance education exception, and others may all be used by education to cover some copyright uses. Indeed, these same groups warned during the C-11 process that those provisions would have the effect of granting education expansive new rights.
What is the End Game?
Leaving aside empty threats about lobbying, what is the Access Copyright end game? The document makes it clear that for all the references to "dialogue," from its perspective the only satisfactory outcome is an Access Copyright licence. Indeed, the document states:
Canada's copyright owners will support whatever action is needed to reinstate collective licensing in schools, colleges and universities.
Copyright law changes, the millions spent on site licences, a diminishing repertoire, and the growth of open access publishing? All irrelevant in the eyes of Access Copyright, which only wants to talk about reinstating a collective licence. If that weren't enough to reject calls for dialogue, there is also an effort to fabricate a fair dealing test far different from the one articulated by the Supreme Court of Canada. In place of user rights, the document raises a series of new considerations such as "whether the copying is spontaneous and non-systematic" (irrelevant from a fair dealing perspective), "whether the copying is directed by the teacher or is mandated by a board or ministry of education" (having lost the argument on whether teacher-directed copying is fair dealing - it is - Access Copyright is now shifting to the claim that board-directed copying is not), and "whether the copies are retained/reused" (another non-fair dealing factor).
The reality is that the Supreme Court and the government were both clear with respect to the emphasis on user rights, fair dealing, and new user exceptions. The CCI, Access Copyright and its allies argued these issues before the court and Parliamentary committees. They lost. The new fair dealing guidelines adopted by the Canadian education community are a modest implementation of those rules. There is no need for threats or disingenuous calls for more dialogue, but rather acceptance of the law and efforts to adapt to the new legal environment. The CCI document suggests that is still not part of the collective's strategy for moving forward.
Looking out for commercial Linux distributors, Greg Kroah-Hartman has announced that the 3.10 Linux Kernel will be supported for two years.
Android 4.3 added significant new security features, and Google has also added two other new security features to older versions of Android. Now, if only the carriers and OEMs would patch the Bluebox security hole every Android user would be happier.
Verizon and T-Mobile have announced that they'll be supporting the Ubuntu phone in the United States.
IBM continues to bet on Linux and open-source databases with its new PowerLinux 7R4 server.
Today, Linux rules supercomputing. It wasn't always that way. Here's how Linux moved from being Linus Torvalds' hobby operating system to being the OS of choice for high-performance computing.
Canonical believes that Ubuntu can be one operating system and Unity the one interface you need for your PC, your smartphone, and your tablet. Here's how they'll do it.
This latest version of Android Jelly Bean has many good, new features for both developers and users.
It looks like Ubuntu Edge will reach the $32 million goal that Mark Shuttleworth set to begin building the hybrid smartphone PC. But will it have a market? Could it replace the traditional PC?
It may be trailing LibreOffice, but OpenOffice is still alive and kicking -- now with better Microsoft Office Open XML support.
Who says you need millions for a supercomputer? Not Adapteva, which has started shipping its $99 Parallella single-board parallel processing board.
In three years OpenStack has come out of nowhere to be one of the most popular cloud programs around. How did that happen? Jim Curry, one of OpenStack's founders, explains.
GitHub, the popular open-source development community site, is finally getting its licensing act together. It's high time, since Black Duck has found that 77 percent of GitHub projects have no declared open-source license.
With OEMs still not releasing Google's fix for the security hole discovered by Bluebox Security, researchers have released a mobile application that fixes the vulnerability.
The open-source Eucalyptus cloud project has just released a new version that's improved its Amazon Web Service cloud interoperability.