Access and use "technological measures" - a legal distinction without a technological difference?

Last Friday I had the opportunity to speak with a lawyer who was trying to understand the differences between "access controls" and "use controls" in the context of technological measures used by copyright holders. Bill C-61 makes this differentiation in its definition of "technological measure". In our discussion she observed that nearly every technological measure that controls the "use" of a copyrighted work restricts "access" to the work first. She then asked whether it was appropriate to differentiate between the two at all.

I'm a technical person, and while I love to talk to lawyers about technology law, I can't answer why lawyers want to make this distinction. All I can do is share my technical knowledge, and hope that the legal community will author and interpret laws that make sense.

Here is what Bill C-61 says:

"“technological measure” means any effective technology, device or component that, in the ordinary course of its operation,

(a) controls access to a work, to a performer’s performance fixed in a sound recording or to a sound recording and whose use is authorized by the copyright owner; or

(b) restricts the doing -- with respect to a work, to a performer’s performance fixed in a sound recording or to a sound recording -- of any act referred to in section 3, 15 or 18 and any act for which remuneration is payable under section 19."

Sections 3 (copyright in works), 15 (copyright in performer's performance) and 18 (copyright in sound recordings) list activities which require permission from the copyright holder, and section 19 (right to remuneration) covers activities which are under a compulsory license (See: Copyright: locks, levies, licensing or lawsuits? Part 2: levies) for performers and sound recording makers.

Technological measures that "(a) controls access" are called "access controls", and measures which "(b) restricts the doing" are called "use controls".

From a technological point of view, we need to translate this legal speak to real-world technology which obeys the laws of physics. Content, whether digitally encoded or in an analog format, cannot itself make decisions or do things to itself (copy itself, "self destruct", read itself out loud, etc). For this you either need a human being (for human observable content) or some sort of hardware and software combination that is able to "observe" the content and then make it observable to a human.

As I included in a handout for the meeting (OpenDocument, PDF), there are some things that can be done to content. You can use cryptography to convert ordinary content (plaintext) to gibberish (ciphertext) in a way that requires a decryption key to convert the gibberish back to the content. Cryptography can also be used to digitally sign the content, and watermarking can be used to identify content or embed hidden messages within it.
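The plaintext/ciphertext relationship can be sketched in a few lines of Python. This is a deliberately insecure toy (a repeating-key XOR standing in for real encryption, and an HMAC standing in for a true digital signature); the names and values are mine, purely for illustration:

```python
import hashlib
import hmac
import secrets

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy XOR "cipher": NOT secure. It only illustrates that the
    # ciphertext is gibberish without the key, and that applying the
    # same key again recovers the original content.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = secrets.token_bytes(16)              # the decryption key
plaintext = b"Ordinary, human-readable content."

ciphertext = xor_stream(plaintext, key)    # gibberish without the key
recovered = xor_stream(ciphertext, key)    # the key converts it back
assert recovered == plaintext

# A keyed hash (HMAC), a symmetric stand-in for a digital signature:
# anyone holding the key can verify the content was not altered.
signature = hmac.new(key, plaintext, hashlib.sha256).digest()
assert hmac.compare_digest(
    signature, hmac.new(key, plaintext, hashlib.sha256).digest())
```

Real systems use vetted ciphers and asymmetric signatures, but the shape is the same: without the key, the content is inaccessible gibberish.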

These are the most robust types of technological measures used with modern digitally encoded content. In the past an additional technique was used: creating deliberate defects in the media. The marketing claim was that it was possible to introduce defects which would prevent content from being accessed by some types of devices (for example, VCRs) while working perfectly fine for other types of devices (for example, televisions). Anyone who remembers Macrovision on VHS cassettes will know that this never quite worked well: the defects would mess up many televisions and would go unnoticed by some VCRs. This technique was tried on a variety of content, and even included putting laser holes in floppy disks for game software. All of these copy control methods were trivial to defeat for those who wanted to infringe copyright, and often made the content fail to work for legitimate customers.

Side-Note: Many people in the movie industry don't think fondly of Macrovision. What they really created was a type of 'tax' that Macrovision could collect for every commercially produced VHS movie distributed and every 'legal' VCR. Fairly inexpensive off-the-shelf time-base correction technology eliminated the Macrovision defects, bypassing this alleged copy control. It was expensive for the industry, annoying for legitimate customers, and easily circumvented -- the only winner was Macrovision itself.

Beyond these techniques (cryptography, watermarks, deliberate media defects), everything else that is done with technological measures is done in software running on some computer hardware.

If you look at the second handout I used (OpenDocument, PDF) I offered some details on 11 different scenarios involving technological measures. The most common real-world situation is this:

a) content is encrypted and only distributed/communicated in encrypted form, accessible only with the right decryption key;
b) the decryption key is embedded within specific devices or software, forcing customers of the content to use one of the "authorized" devices or software;
c) these devices and/or software are locked down to disallow their owners from controlling the device/software.
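The three steps above can be sketched in a small Python toy. Everything here is hypothetical (the player class, the key handling, the XOR "cipher" standing in for real encryption); the point is only to show where each control actually lives:

```python
import secrets

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy cipher standing in for real encryption; NOT secure.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# (b) the decryption key is embedded in the "authorized" software
DEVICE_KEY = secrets.token_bytes(16)

def publish(plaintext: bytes) -> bytes:
    # (a) content is only ever distributed in encrypted form
    return xor_stream(plaintext, DEVICE_KEY)

class AuthorizedPlayer:
    # (c) the only code holding the key; its author, not the copyright
    # holder or the device owner, decides what the user may do with it
    def play(self, encrypted: bytes) -> bytes:
        return xor_stream(encrypted, DEVICE_KEY)   # the "access control"

    def copy(self, encrypted: bytes) -> bytes:
        # the "use control": just a rule the software author chose to write
        raise PermissionError("copying disabled by the software author")

release = publish(b"A song.")
player = AuthorizedPlayer()
assert player.play(release) == b"A song."   # access works via embedded key

copy_blocked = False
try:
    player.copy(release)
except PermissionError:
    copy_blocked = True
assert copy_blocked
```

Notice that the "use control" is nothing but an ordinary branch in the player's code: it exists only as long as the owner of the device cannot replace that code.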

The first thing to notice is that from the perspective of the content, it is encrypted to only allow it to be accessed by authorized devices. In my mind as a technical person, that is an "access control" technological measure as it controls access to the work.

There is also a technological measure applied to the hardware and/or software by the hardware manufacturer or software author. If there are restrictions on what people can do with content accessed by this hardware/software, they are encoded in rules authored by the software authors. This means that "use controls", when they exist, are authored in software by software authors and executed on computer hardware. They are not things which can be applied to content alone.

Internationally renowned security technologist and author Bruce Schneier has said a few times that, "trying to make digital files uncopyable is like trying to make water not wet". This is common knowledge to everyone in the computer security field.

What can be done to content is to make it inaccessible without the right technology to access it. This means that any technology that claims to "(b) restricts the doing" starts with somehow forcing people to use "authorized" hardware and/or software, and then implements any use restrictions in that software.

Is it possible to have a 'use control' without an 'access control'?

Use controls are accomplished in software running on hardware. If people are free to make their own hardware and software choices, then they will make choices of combinations that meet their own demands, not the demands of someone else. The user of software under their own control may choose to use the software in a lawful way and never infringe copyright. The software may even help them by making the intentions of the copyright holder clear to the owner of the technology. In this case we are talking about copyright being enforced by the law and courts rather than by technology disobeying the instructions of its owner.

Encrypting the content to try to revoke hardware and software choices from audiences is one common technique used by copyright holders. There is, however, an even worse situation: government regulation of technology.

In the USA, under the title of the "Broadcast Flag", they have been discussing a regulatory regime where any hardware and software involved in the reception of broadcast signals is legally not allowed to be under the control of average citizens. When we consider that TV tuner cards can be plugged into generic computer equipment, this essentially disallows hardware and software choice for a massive range of consumer electronics.

In both cases the goal is to disallow average citizens from owning and controlling (through software choice) their own communications technology, and in both cases it is radical changes in the law against the interests of technology owners that are the root of the problem.

Why this distinction with a difference matters

Close observers of the digital copyright debate will notice something important.

With few exceptions, the proponents of anti-circumvention legislation are thinking entirely about "digital locks" being applied to content that in theory will protect the interests of the copyright holders of that content.

With few exceptions, the opponents of anti-circumvention legislation are focused on "digital locks" being applied to hardware and software which oppose the legitimate interests of the owners of the hardware and users of the software.

This debate is hard to understand until you realize that these technological measures can be applied to many things (not just content!), and are being applied by someone other than the owner of whatever the technological measure is applied to. Different participants in the debate are focused on the consequences (unintended or intended) of applying different types of technological measures to different things.

The fact that there are policy makers wanting to make a legal distinction between "access controls" and "use controls" in the law suggests that they may not be aware that in practice nearly every conceivable "use control" starts with an "access control", unless we are talking about further government regulation against technology owners. They may also not be aware that the party who controls the "use control" technology is not the copyright holder but the software author -- and that this software author will have their own interests in mind when authoring the software, not the interests of their customers or copyright holders.

Related: Even in the “DRM” debate, Content is not King.

(article also published on IT World Canada's blog »)


C-61 gives a new right to control access to a work

One of the really scary things in C-61 is that if it passes then, by applying a TPM, rightsholders can legally control who can access a work.

Public Libraries were created so that people without enough money to buy books could still partake of our shared culture.

I just hope that TPMs don't get to the point where they are able to say "only women (or caucasians, or Muslims, or heterosexuals) are allowed to read this book", because C-61 would give that restriction the force of law.

TPM discrimination basis

TPMs may not yet restrict based on gender, race, religion or sexual orientation, but they already restrict based on economic and political beliefs.

I am someone who subscribes to the Lessig "Code is law" philosophy. I think of computer science as similar to political science (process of creating rules computers obey, process of creating rules which humans obey), and not a branch of engineering or other natural sciences.

I believe that peer production of software is far better for the economy as a whole (economic beliefs), and I believe that software rules should have the same level (or better) of accountability and transparency as the political rules which govern us.

The vast majority of TPMs first restrict what brands of hardware/software people are allowed to use if they access DRM infected content, with that hardware/software then being locked down to disallow the accountability and transparency I believe all software rules should have.

Some people say that I have enough choices of devices and software to access DRM encoded multimedia content. In my mind this is no different than telling me in politics that I'm allowed to vote for anyone I want, as long as they are members of the Communist Party.

The general public doesn't yet spend the time to think about the role of software in our lives, and is even more apathetic about it than about politics, where a decreasing number of people are engaged. We haven't outlawed political parties other than "The Party" in Canada yet, but given the rhetoric I hear from the proponents of C-61 I don't know how far off that will be.


Free/Libre and Open Source Software (FLOSS) consultant.