Ford Foundation to require CC BY for all grant-funded projects

Timothy Vollmer, February 3rd, 2015

Today the Ford Foundation announced an open licensing policy for its grant-funded projects and research. The new policy took effect on February 1, 2015, and covers most grant-funded work as well as the outputs of consultants. The Ford Foundation has chosen the CC BY 4.0 license as the default for these materials. Grant agreements will now include a paragraph requiring the grant recipient to broadly share all copyrightable products funded by the grant (such as research reports, photographs, and videos) under CC BY. And the Ford Foundation is leading by example, adopting CC BY for all materials on its own website that are not subject to third-party ownership.

Darren Walker, president of the Ford Foundation, said, “This policy change will help grantees and the public more easily connect with us and build upon our work, ensure our grant dollars go further and are more impactful, and – most importantly – increase our ability to advance social justice worldwide.”

“We’re incredibly pleased to see the Ford Foundation adopting a Creative Commons licensing policy for a wide range of grant-funded works, promoting openness and re-use of content produced through its philanthropic grantmaking,” said Ryan Merkley, CEO of Creative Commons. “The Ford Foundation joins a growing movement of foundations and governments adopting policies that increase access to and re-use of digital education materials, research articles, and data using Creative Commons.”

The Ford Foundation is an independent, nonprofit grant-making organization created in 1936. Its mission is “to strengthen democratic values, reduce poverty and injustice, promote international cooperation, and advance human achievement.” In 2013 the Ford Foundation granted almost $570,000,000 to projects and organizations around the world.

The Ford Foundation joins several other philanthropic grantmaking organizations that have adopted Creative Commons licensing policies for the outputs of their charitable giving. We’ve highlighted several over the last few months, including the William and Flora Hewlett Foundation (which now also requires CC BY for all of its project-based grantmaking) and the Bill & Melinda Gates Foundation (which adopted a CC BY open access policy for published grant-funded research and data). Releasing grant-funded content under permissive open licenses like CC BY means that these materials can be more easily shared and re-used by the public, and combined with other resources that are also published under an open license.

Congratulations to the Ford Foundation on adopting an open licensing policy that will encourage the sharing of rich content and data in the digital global commons. Creative Commons continues to urge other foundations and funding bodies to emulate the ongoing leadership of the Ford Foundation by making open licensing an essential component of their grantmaking strategy.

The Limits of Copyright: Text and Data Mining

Timothy Vollmer, January 21st, 2015

We’re taking part in Copyright Week, a series of actions and discussions supporting key principles that should guide copyright policy. Every day this week, various groups are taking on different elements of the law, addressing what’s at stake and what we need to do to make sure that copyright promotes creativity and innovation.

Today’s topic is about supporting fair use, a legal doctrine in the United States and a few other countries that permits some uses of copyrighted works without the author’s permission for purposes such as parody, criticism, teaching, and news reporting. Fair use is an important check on the exclusive bundle of rights granted to authors under copyright law. Fair use is considered a “limitation and exception” to copyright.

One area of particular importance within limitations and exceptions to copyright is the practice of text and data mining. Text and data mining typically involves computers analyzing huge amounts of text or data, and it has the potential to uncover vast numbers of interesting connections between textual and other types of content. Understanding these new connections can enable new research capabilities that lead to novel scholarly discoveries and critical scientific breakthroughs. Because of this, text and data mining is increasingly important for scholarly research.
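
To make the idea concrete, here is a minimal, illustrative sketch of one kind of text-mining pass (the toy corpus, tokenizer, and co-occurrence measure are assumptions chosen for the example, not any particular project’s pipeline): it counts how often pairs of terms appear in the same sentence across a small set of documents, the sort of signal that, computed over millions of articles, can surface unexpected connections.

```python
import re
from collections import Counter
from itertools import combinations

# A toy corpus standing in for a large collection of research articles.
documents = [
    "Open access to research accelerates scientific discovery.",
    "Text and data mining finds connections across research articles.",
    "Mining open access articles at scale supports new discovery.",
]

def tokenize(text):
    """Lowercase a sentence and split it into word tokens."""
    return re.findall(r"[a-z]+", text.lower())

# Count how often each pair of distinct terms appears in the same sentence.
cooccurrence = Counter()
for doc in documents:
    for sentence in re.split(r"[.!?]", doc):
        terms = sorted(set(tokenize(sentence)))
        cooccurrence.update(combinations(terms, 2))

# The most frequent pairs hint at relationships a researcher might investigate
# further across a much larger corpus.
for pair, count in cooccurrence.most_common(5):
    print(pair, count)
```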

Recently the United Kingdom enacted legislation specifically excepting noncommercial text and data mining from copyright. And as the European Commission conducts their review of EU copyright rules, some groups have called for the addition of a specific text and data mining exception. Copyright for Creativity’s manifesto, released Monday, urges the European Commission to add a new exception for text and data mining, in order to support new uses of technology and user needs.

Another view holds that text and data mining activities should be considered outside the purview of copyright altogether. The response from the Communia Association to the EU copyright consultation takes this approach, saying “if text and data mining would be authorized by a copyright exception, it would constitute a de facto recognition that text and data mining are not legitimate usages. We believe that mining texts and data for facts is an activity that is not and should not be protected by copyright and therefore introducing a legislative solution that takes the form of an exception should be avoided.” Similarly, there have been several actions advocating that “The right to read should be the right to mine.”

Whether text and data mining falls under a copyright exception or outside the scope of copyright, it is clearly an activity that copyright owners should not be able to control. But unfortunately, that is exactly what some incumbent publishing gatekeepers are trying to do by setting up restrictive contractual agreements. One example we’ve seen of this practice is the deployment of a set of “open access” licenses from the International Association of Scientific, Technical & Medical Publishers (STM), many of which attempt to restrict text and data mining of the licensed publications. In jurisdictions such as the United States, users do not need to ask permission (or be granted permission through a license) to conduct text and data mining, because the activity either falls outside the scope of copyright or is squarely covered by fair use.

Ensuring that licenses give copyright owners no more control over their content than they have under copyright law is a fundamental principle of CC licensing. That’s why the licenses explicitly state that they in no way restrict uses that are under a limitation or exception to copyright. This means that users do not have to comply with the license for uses of the material permitted by an applicable limitation or exception (such as fair use) or uses that are otherwise unrestricted by copyright law, such as text and data mining in many jurisdictions.

Today’s topic of fair use rights reminds us that “for copyright to achieve its purpose of encouraging creativity and innovation, it must preserve and promote ample breathing space for unexpected and innovative uses.” To liberate the massive potential for innovation made possible by existing and future types of text and data mining, we need user-focused copyright policy that enables these new activities.

Public access to research language retained in U.S. spending bill

Timothy Vollmer, December 22nd, 2014

Last year, the U.S. Congress included a provision in its appropriations legislation to ensure that some research conducted with federal funds would be made accessible online, for free. It mandated that a subset of federal agencies with research budgets of at least $100 million per year would be required to provide the public with free online access to scholarly articles generated with federal funds no later than 12 months after publication in a peer-reviewed journal. The agencies affected by the public access provision of the appropriations bill included the Department of Labor, Department of Education, and Department of Health and Human Services. Of particular note is the Department of Health and Human Services, which encompasses research-intensive agencies such as the National Institutes of Health, Food and Drug Administration, and Centers for Disease Control and Prevention.

SPARC reports that the public access language has been included in the fiscal year 2015 spending bill (PDF), which appears on pp. 961–962:

SEC. 525. Each Federal agency, or in the case of an agency with multiple bureaus, each bureau (or operating division) funded under this Act that has research and development expenditures in excess of $100,000,000 per year shall develop a Federal research public access policy that provides for— (1) the submission to the agency, agency bureau, or designated entity acting on behalf of the agency, a machine-readable version of the author’s final peer-reviewed manuscripts that have been accepted for publication in peer-reviewed journals describing research supported, in whole or in part, from funding by the Federal Government; (2) free online public access to such final peer reviewed manuscripts or published versions not later than 12 months after the official date of publication.

Alongside the federal spending legislation, there were references included in accompanying reports (see Departments of Commerce, Justice, Science report at p. 30 and Department of Interior report at p. 32) that point to President Obama’s Directive requiring agencies to increase access to the results of federally funded scientific research. The appropriations language passed for 2014 and 2015 echoes the language of the White House Directive, issued in February 2013. It directs “Federal agencies with more than $100M in R&D expenditures to develop plans to make the published results of federally funded research freely available to the public within one year of publication and requiring researchers to better account for and manage the digital data resulting from federally funded scientific research.” The agency plans were due in August 2013, and according to the Office of Science and Technology Policy (OSTP), all agencies have submitted at least a draft plan (PDF). Those plans are now being reviewed by OSTP.

Progress has been slow, but public access to publicly funded research remains on the table in the United States.

Open Definition 2.0 released

Timothy Vollmer, October 7th, 2014

Today Open Knowledge and the Open Definition Advisory Council announced the release of version 2.0 of the Open Definition. The Definition “sets out principles that define openness in relation to data and content,” and is the baseline against which various public licenses are measured. Releasing content under an Open Definition-conformant license means that anyone can “freely access, use, modify, and share that content, for any purpose, subject, at most, to requirements that preserve provenance and openness.” The CC BY and CC BY-SA 4.0 licenses are conformant with the Open Definition, as are all previous versions of these licenses (1.0–3.0, including jurisdiction ports). The CC0 Public Domain Dedication is also aligned with the Open Definition.
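
As a rough sketch of how a data publisher’s tooling might flag license choices against a list of conformant licenses (the abbreviated set below is hand-copied and illustrative only; the Open Definition’s approved-license registry is the authoritative source), consider:

```python
# Abbreviated, hand-copied set of Open Definition-conformant license IDs
# (illustrative only; consult the Open Definition's approved-license list
# for the authoritative, complete set).
OD_CONFORMANT = {
    "CC-BY-4.0",
    "CC-BY-SA-4.0",
    "CC0-1.0",
    "ODC-BY-1.0",
    "ODBL-1.0",
}

def is_open(license_id: str) -> bool:
    """Return True if the license identifier appears on the conformant list."""
    return license_id.strip().upper() in OD_CONFORMANT

print(is_open("cc-by-4.0"))     # True
print(is_open("CC-BY-NC-4.0"))  # False: NonCommercial licenses do not conform
```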

The Open Definition is an important standard that communicates the fundamental legal conditions that make content and data open. One of the most notable updates to version 2.0 is that it separates and clarifies the requirements under which an individual work will be considered open from the conditions under which a license will be considered conformant with the Definition.

Public sector bodies, GLAM institutions, and open data initiatives around the world are looking for recommendations and advice on the best licenses for their policies and projects. It’s helpful to be able to point policymakers and data publishers to a neutral, community-supported definition with a list of approved licenses for sharing content and data (and of course, we think that CC BY, CC BY-SA, and CC0 are some of the best, especially for publicly funded materials). And while some governments and other institutions are still attempting to create their own custom licenses, hopefully the Open Definition 2.0 will help guide these groups toward understanding the benefits of using an existing OD-conformant license. The more that content and data providers use one of these licenses, the more they’ll add to a huge pool of legally reusable and interoperable content for anyone to use and repurpose.

To the extent that new licenses continue to be developed, the Open Definition Advisory Council has been honing a process to assist in evaluating whether licenses meet the Open Definition. Version 2.0 continues to urge potential license stewards to think carefully before attempting to develop their own license, and requires that they understand the common conditions and restrictions that should (or should not) be contained in a new license in order to promote interoperability with existing licenses.

Open Definition version 2.0 was collaboratively and transparently developed with input from experts involved in open access, open culture, open data, open education, open government, open source and wiki communities. Congratulations to Open Knowledge and the Open Definition Advisory Council on this important improvement.

Hewlett Foundation extends CC BY policy to all grantees

Timothy Vollmer, September 23rd, 2014

Last week the William and Flora Hewlett Foundation announced that it is extending its open licensing policy to require that all content (such as reports, videos, white papers) resulting from project grant funds be licensed under the most recent Creative Commons Attribution (CC BY) license. From the Foundation’s blog post: “We’re making this change because we believe that this kind of broad, open, and free sharing of ideas benefits not just the Hewlett Foundation, but also our grantees, and most important, the people their work is intended to help.” The change is explained in more detail on the foundation’s website.

The foundation had a long-standing policy requiring that recipients of its Open Educational Resources grants license the outputs of those grants; this was instrumental in the creation and growth of the OER field, which continues to flourish and spread. Earlier this year, the license requirement was extended to all Education Program grants, and as restated, the policy will now be rolled out to all project-based grants under any foundation program. The policy is straightforward: it requires that content produced pursuant to a grant be made easily available to the public, on the grantee’s website or otherwise, under the CC BY 4.0 license — unless there is some good reason to use a different license.

“When we began thinking about extending the policy from OER grants to the foundation as a whole, we wanted to be sure we would not be creating unforeseen problems,” said Elizabeth Peters, the general counsel of the Hewlett Foundation. “So we first broadened it to cover education grants that were not for OER — and have been pleased to find that there were very few issues, and those few easily resolved. CC BY for all grant-funded works will now be the default, but we are willing to accommodate grantees who have a persuasive reason to take a different path. The ultimate goal of this policy is to make the content we fund more openly available to everyone. We’re only just beginning to implement this change, and will continue to monitor how it’s working, but so far we have found most grantees are ready and willing to apply the license that makes their works fully open for re-use of all kinds.”

In practice, the new policy means that nearly all of the extensive content produced with Hewlett project-based grant funds – not only works specifically commissioned as Open Educational Resources, but also scholarly research, multimedia materials, videos, white papers, and more, created by grantees on subjects of critical importance – will be widely available for downstream re-use, with the sole condition that the creator is attributed. Text will be openly available for translation into other languages, and high-quality photographs and videos can be re-used on platforms such as Wikipedia. Releasing grant-funded content under permissive open licenses like CC BY means that these materials can be more easily shared and re-used by the public, and combined with other resources that are also published under an open license; this collection grows larger every day as governments and other public-facing institutions adopt open policies. Promoting this type of sharing can benefit both the original creator and the foundation, as it enables novel uses in situations not anticipated by the original grant funding.

For a long time Creative Commons has been interested in promoting open licensing policies within philanthropic grantmaking. We received a grant from the Hewlett Foundation to survey the licensing policies of private foundations, and to work toward increasing the free availability of foundation-supported works. We wrote about the progress of the project in March, and we’ve been maintaining a list of foundation IP policies, and a model IP policy.

We urge other foundations and funding bodies to emulate the outstanding leadership demonstrated by the William and Flora Hewlett Foundation and commit to making open licensing an essential component of their grantmaking strategy.

Announcing the Institute for Open Leadership Fellows

Timothy Vollmer, August 21st, 2014

Creative Commons and the Open Policy Network are pleased to announce the first round of fellows for the Institute for Open Leadership. The Institute is a training program to develop new leaders in education, science, public policy, and other fields on the values and implementation of openness in licensing, policies, and practices. We received over 90 applications from around the world, representing a broad diversity of fields. Here are this year’s fellows.

  • Dairo Alexander Escobar Ardila; Instituto Humboldt – SiB Colombia; Bogotá, Colombia
  • David Ernst; University of Minnesota; St. Paul, Minnesota, USA
  • Eric Phetteplace; California College of the Arts; Oakland, California, USA
  • Fátima Silva São Simão; UPTEC – Science and Technology Park of the University of Porto; Porto, Portugal
  • Georgia Angelaki; National Documentation Center/Hellenic Research Institute; Athens, Greece
  • Jagadish Chandra Aryal; Social Science Baha; Kathmandu, Nepal
  • Jane Gilvin; National Public Radio; Washington, D.C., USA
  • Julian Carver; Land Information New Zealand; Christchurch, New Zealand
  • Klaudia Grabowska; Polish History Museum; Warsaw, Poland
  • Mohamud Ahmed Rage; Ministry of Higher Education & Culture, Somalia; Mogadishu, Somalia
  • Nasir Khan; Management Information Services, Directorate General of Health Services, Bangladesh; Dhaka, Bangladesh
  • Paul UE Blackman; Barbados Community College; St. Michael, Barbados
  • Vincent Kizza; Open Learning Exchange Uganda; Kampala, Uganda
  • Werner Westermann Juarez; Instituto Profesional Providencia; Santiago, Chile

The in-person portion of the Institute will take place in San Francisco, California, in January 2015. The fellows will develop, refine, and work to implement a capstone open policy project. The point of this project is for each fellow to transform the concepts learned at the Institute into a practical, actionable, and sustainable initiative within her/his institution.

Congratulations to the fellows, and thank you to all the applicants.

Dozens of organizations tell STM publishers: No new licenses

Timothy Vollmer, August 7th, 2014

The keys to an elegant set of open licenses are simplicity and interoperability. CC licenses are widely recognized as the standard in the open access publishing community, but a major trade association recently published a new set of licenses and is urging its members to adopt them. We believe that the new licenses could introduce unnecessary complexity and friction, ultimately hurting the open access community far more than helping it.

Today, Creative Commons and 57 organizations from around the world released a joint letter asking the International Association of Scientific, Technical & Medical Publishers to withdraw its model “open access” licenses. The association ostensibly created the licenses to promote the sharing of research in the scientific, technical, and medical communities. But these licenses are confusing, redundant, and incompatible with open access content published under other public licenses. Instead of developing another set of licenses, the signatories urge the STM Association to recommend to its authors existing solutions that will truly promote STM’s stated mission to “ensure that the benefits of scholarly research are reliably and broadly available.” From the letter:

We share a positive vision of enabling the flow of knowledge for the good of all. A vision that encompasses a world in which downstream communicators and curators can use research content in new ways, including creating translations, visualizations, and adaptations for diverse audiences. There is much work to do but the Creative Commons licenses already provide legal tools that are easy to understand, fit for the digital age, machine readable and consistently applied across content platforms.

So, what’s really wrong with the STM licenses? First, and most fundamentally, it is difficult to determine what each license and supplementary license is intended to do and how STM expects each of them to be used. The Twelve Points to Make Open Access Licensing Work document attempts to explain its goals, but it is not at all clear how the various legal tools work to meet those objectives.

Second, none of the STM licenses comply with the Open Definition, as they all restrict commercial uses and derivatives to a significant extent. And they ignore the long-running benchmark for Open Access publishing: CC BY. CC BY is used by a majority of Open Access publishers, and is recommended as the optimal license for the publication, distribution, and reuse of scholarly work by the Budapest Open Access Initiative.

Third, the license terms and conditions introduce confusion and uncertainty into the world of open access publishing, a community in which the terminology and concepts utilized in CC’s standardized licenses are fairly well accepted and understood.

Fourth, the STM licenses claim to grant permission to do many things that re-users do not need permission to do, such as describing or linking to the licensed work. In addition, it’s questionable for STM to assume that text and data mining can be regulated by their licenses. Under the Creative Commons 4.0 licenses, a licensor grants the public permission to exercise rights under copyright, neighboring rights, and similar rights closely related to copyright (such as sui generis database rights). And the CC license only applies when at least one of these rights held by the licensor applies to the use made by the licensee. This is important because in some countries, text and data mining are activities covered by an exception or limitation to copyright (such as fair use in the United States), so no permission is needed. Most recently the United Kingdom enacted legislation specifically excepting noncommercial text and data mining from the reach of copyright.

Finally, STM’s “supplementary” licenses, which are intended for use with existing licenses, would only work with CC’s most restrictive license, Attribution-NonCommercial-NoDerivatives (BY-NC-ND). Even then they would have very limited legal effect, since much of what they claim to cover is already permitted by all CC licenses. As a practical matter, these license terms are likely to be very confusing to re-users when used in conjunction with a CC license.

The Creative Commons licenses are the demonstrated global standard for open access publishing. They’re used reliably by open access publishers around the world for sharing hundreds of thousands of research articles. Scholarly publishing holds massive potential to increase our understanding of science. And creativity always builds on the past, whether it be a musician incorporating samples into a new composition or a cancer researcher re-using data from past experiments in their current work.

But to fully realize innovations in science, technology, and medicine, we need clear, universal legal terms so that a researcher can incorporate information from a variety of sources easily and effectively. The research community can enable these flows of information and promote discoveries by sharing writings, data, and analyses in the public commons. We’ve already built the legal tools to support content sharing. Let’s use them and not reinvent the wheel.

Questions should be directed to press@creativecommons.org.

European Commission endorses CC licenses as best practice for public sector content and data

Timothy Vollmer, July 17th, 2014

Today the European Commission released licensing recommendations to support the reuse of public sector information in Europe. In addition to providing guidance on baseline license principles for public sector content and data, the guidelines suggest that Member States should adopt standardized open licenses – such as Creative Commons licenses:

Several licences that comply with the principles of ‘openness’ described by the Open Knowledge Foundation to promote unrestricted re-use of online content, are available on the web. They have been translated into many languages, centrally updated and already used extensively worldwide. Open standard licences, for example the most recent Creative Commons (CC) licences (version 4.0), could allow the re-use of PSI without the need to develop and update custom-made licences at national or sub-national level. Of these, the CC0 public domain dedication is of particular interest. As a legal tool that allows waiving copyright and database rights on PSI, it ensures full flexibility for re-users and reduces the complications associated with handling numerous licences, with possibly conflicting provisions.

The Commission’s recommendations warn against the development of customized licenses, which could break the interoperability of public sector information across the EU. The guidelines clearly state that license conditions should be standardized and contain minimal requirements (such as attribution only).

In order to proactively promote the re-use of the licenced material, it is advisable that the licensor grants worldwide (to the extent allowed under national law), perpetual, royalty-free, irrevocable (to the extent allowed under national law) and non-exclusive rights to use the information covered by the licence… it is advisable that [licenses] cover attribution requirements only, as any other obligations may limit licensees’ creativity or economic activity, thereby affecting the re-use potential of the documents in question.

This is a welcome outcome that will hopefully provide a clear path for data providers and re-users. It’s great to see this endorsement after our efforts, alongside our affiliate network, to advocate for clear best practices in sharing content and data. By pointing to CC’s free, international 4.0 licenses, the recommendation saves governments time and money and maximizes compatibility and reuse.

Kudos to the Commission, and thanks to LAPSI, Open Knowledge, and others for their assistance.

Apply now to participate in the Institute for Open Leadership

Timothy Vollmer, May 23rd, 2014

Earlier this week, we kicked off the Open Policy Network. We announced that the first project within the Network is the Institute for Open Leadership, a training program to develop new leaders in education, science, public policy, and other fields on the values and implementation of openness in licensing, policies, and practices. The Institute is looking for passionate public- and private-sector professionals who are interested in learning more about openness and wish to develop and implement an open policy in their field.

Interested applicants should review the application information and submit an application by June 30, 2014. We plan to invite about 15 fellows to participate in the first round of the Institute for Open Leadership. The in-person portion of the Institute will be held in the San Francisco Bay Area in January 2015 (dates TBD: either January 12-16 or January 19-23). Applications are open to individuals anywhere in the world.

A central part of the Institute will require fellows to develop and implement a capstone open policy project. The point of this project is for the fellow to transform the concepts learned at the Institute into a practical, actionable, and sustainable initiative within her/his institution. Open policy projects can take a variety of forms depending on the interests of the fellow and the field where the project will be implemented.

Questions about the Institute for Open Leadership should be directed to opn@creativecommons.org. Our thanks to the William and Flora Hewlett Foundation and the Open Society Foundations for funds to kickstart the Institute for Open Leadership.

The beginning of the Authors Alliance

Timothy Vollmer, May 22nd, 2014

Yesterday marked the launch of the Authors Alliance, a nonprofit organization that supports authors who want “to harness the potential of digital networks to share their creations more broadly in order to serve the public good.”

In an interview with Publishers Weekly, Authors Alliance founder Pamela Samuelson explained that the Authors Alliance will have a few different roles. Inwardly, the group will “provide authors with information about copyrights, licensing agreements, alternative contract terms,” and other practical legal information so that they can make their works widely and openly available. And externally, the Alliance will “represent the interests of authors who want to make their works more widely available in public policy debates,” and advocate for these reforms alongside like-minded public interest organizations.

The Authors Alliance was developed by Samuelson and several of her colleagues at the University of California, Berkeley, including Molly Van Houweling, Carla Hesse, and Thomas Leonard. The Alliance also has an advisory board made up of pre-eminent scholars, writers, and public interest advocates, including several members of the Creative Commons board of directors. The Authors Alliance is now accepting new members.

The Alliance has already developed a set of copyright reform principles, outlining its vision for changes to copyright law to support authors who write to be read.

We have formed an Authors Alliance to represent authors who create to be read, to be seen, and to be heard. We believe that these authors have not been well served by misguided efforts to strengthen copyright. These efforts have failed to provide meaningful financial returns to most authors, while instead unacceptably compromising the preservation of our own intellectual legacies and our ability to tap our collective cultural heritage. We want to harness the potential of global digital networks to share knowledge and products of the imagination as broadly as possible. We aim to amplify the voices of authors and creators in all media who write and create not only for pay, but above all to make their discoveries, ideas, and creations accessible to the broadest possible audience.

The principles include:

  1. Further empower authors to disseminate their works.
  2. Improve information flows about copyright ownership.
  3. Affirm the vitality of limits on copyright that enable us to do our work and reach our audiences.
  4. Ensure that copyright’s remedies and enforcement mechanisms protect our interests.

At the core, the Authors Alliance and Creative Commons share a similar goal: to provide useful resources and tools for creators who aren’t being served well by the existing copyright system. We’re excited to work with the Alliance on issues that support authors who write to be read, and the public for whom these authors create.
