Augmented Legality
Blogs | April 4, 2014

Can I Augment That? 5 Targets to Be Wary of When Making AR

With the steady growth of new tools for user-generated augmented reality, I've been fielding a ton of questions lately on whether it's legally okay to augment particular content. In other words--for those not familiar with how AR works--the question is whether it's permissible to associate certain digital content with a particular physical object (the "target"), such that when a user views the target through an AR app on a video-enabled mobile device, the digital content appears on the screen to be physically superimposed on the target.
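For readers who think in code, here is a minimal, purely hypothetical sketch of the association an AR creation tool maintains: a recognizable target image mapped to the digital content the app overlays when its camera detects that target. The class and function names are illustrative only and do not correspond to any real AR SDK.

```python
from dataclasses import dataclass


@dataclass
class Augmentation:
    """One target-to-content association in a hypothetical AR authoring tool."""
    target_image: str     # reference image the app must recognize (e.g., a photo of the object)
    overlay_content: str  # digital content to superimpose when the target is detected


# A hypothetical "campaign": when the app's camera recognizes the movie poster,
# it draws the associated overlay on top of it on the user's screen.
campaign = [
    Augmentation(target_image="movie_poster.jpg", overlay_content="trailer_scene.obj"),
    Augmentation(target_image="storefront_logo.jpg", overlay_content="review_banner.png"),
]


def render(detected_target: str, augmentations: list[Augmentation]) -> str | None:
    """Return the overlay to display for a recognized target, if any."""
    for aug in augmentations:
        if aug.target_image == detected_target:
            return aug.overlay_content
    return None


print(render("movie_poster.jpg", campaign))  # -> "trailer_scene.obj"
```

Note that the target image is typically an uploaded copy of the physical object's appearance, which is exactly where several of the issues below come into play.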

Readers of this blog know that I've tackled these issues many times before, and that in my view, most decisions of what to augment and how will be considered free speech protected by the First Amendment.  As in every other context, though, there are boundaries to First Amendment rights where other laws can take precedence.

Here's a quick list of five types of targets you should at least think twice about before augmenting with digital content:

  1. Copyrighted images. In most cases, it's unlikely that merely associating digital content with a copyrighted target will infringe the copyright owner's rights, because that act alone does not reproduce the physical target or create a derivative work of it. In order for the AR app to recognize the target, however, you may need to upload a copy of the target into your AR creation tool--and that reproduction could be found infringing. Moreover, depending on what you do with that target image once you've uploaded it, you might end up making a derivative work of the digital copy. It also remains undetermined whether digital augmentation could be considered a "distortion, mutilation, or other modification of the work" under the Visual Artists Rights Act of 1990, or under other species of "moral rights."
  2. Trademarks. Trademarks, service marks and the like exist for one purpose: to indicate to consumers the source of a product or service. When I see the Golden Arches at the next exit, for example, I know I can pull off there to get a Big Mac. And if someone hands me a sandwich called a "Whopper," I'll know that I ended up at Burger King instead. Content infringes trademark rights if it creates a "likelihood of confusion" in the minds of relevant consumers as to where a product comes from, or as to whether there is some form of association or sponsorship between the trademark owner and the author of the content. By associating your digital content with someone else's trademark--and depending on the circumstances and how you advertise your augmentation--you run the risk of creating that confusion, and therefore of infringing the trademark. Moreover, if your content tarnishes the trademark or otherwise lessens its ability to function as an indicator of source, you could be on the hook for trademark dilution as well.
  3. Employer's Materials. Just as mixing your content with someone else's trademark can create confusion as to whether the content comes from, or is approved by, the trademark owner, so too can an employee's content create confusion if it is associated with their employer's materials. We already see quite a bit of litigation over what employees post in social media, but the potential for legal headaches is even bigger in the augmented space. While an employee may (sometimes) think twice before posting offensive or objectionable statements on an employer's Facebook page, for example, they may not think twice about adding their own digital commentary to any random physical object associated with the employer, because they assume the content is so obscure that the employer will never find it.
  4. Faces. So much of the current privacy debate around augmented reality and wearable devices centers on facial recognition. Many of the most obvious and helpful uses of digital eyewear require the device to recognize the people it sees. In light of how vitriolic the public reaction to this concept has been, however, none of the major companies in the space wants to be the first to roll out the technology. Facebook and Google are sitting on troves of facial information and doing very little with it. The terms of use for Google Glass and many AR software development kits--and perhaps even the AR creation tool you're using--specifically ban any facial recognition functionality. The risk is not only an invasion of privacy, but also the untested potential for the right of publicity to apply to this technology.
  5. Anything Else Related to Specific Individuals. Hundreds, if not thousands, of people have already been criminally prosecuted or otherwise punished for threatening others online, often in social media. Even something as benign as a Facebook "poke" has been found to be enough of an unwanted contact to violate a personal protection order. Associating AR content with a particular person, their home, or anything else intimately associated with them could likewise be viewed as a threat, stalking, harassment, defamation, infringement of publicity rights, or another offense, depending on the circumstances.