Last week the Association of American Universities (AAU), Association of Public and Land-grant Universities (APLU), and the Association of Research Libraries (ARL) released a draft plan on how they’d support public access to federally funded research aligned with the February 22 White House public access directive. The SHared Access Research Ecosystem, or SHARE, is a plan that would draw upon existing university infrastructure in order to ensure public access to publicly funded research. SHARE works through a federated system of university repositories. Participating universities would adopt a common set of metadata fields for publicly funded research articles. The metadata will communicate specific information so the article may be easily discovered through common search engines. Minimum metadata will include author name, title, journal, abstract, and award number. The university-focused SHARE plan was announced in the same week as CHORUS, an effort championed by a coalition of commercial publishers.
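The minimum metadata set described above can be pictured as a simple record with a completeness check. This is only an illustrative sketch, not the actual SHARE schema; the field names and the `is_minimally_complete` helper are assumptions for the sake of example.

```python
# A minimal SHARE-style metadata record. Field names are illustrative
# guesses based on the plan's list of minimum metadata; the real SHARE
# specification would define the authoritative schema.
record = {
    "author": "Jane Doe",
    "title": "Example Study of Publicly Funded Research",
    "journal": "Journal of Examples",
    "abstract": "A short summary of the findings.",
    "award_number": "NSF-0000000",
}

REQUIRED_FIELDS = {"author", "title", "journal", "abstract", "award_number"}

def is_minimally_complete(rec):
    # A record supports discovery only if every minimum field
    # is present and non-empty.
    return all(rec.get(field) for field in REQUIRED_FIELDS)
```

A participating repository could run a check like this before exposing a record to search engines.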
In order to promote broad access and reuse of publicly funded research outputs, the SHARE proposal says that federal agencies need to be granted permissions that enable them to make the deposit system work. Therefore, universities and principal investigators need to retain sufficient rights to in turn grant those permissions (access, reuse, archiving) to the federal agencies. From the plan:
Copyright licenses to allow public access uses of publications resulting from federal awards need to be awarded on a non-exclusive basis to the funding agency responsible for deposit in order for that system of public deposit to work [...] Federal funding agencies need to receive sufficient copyright licenses to peer-reviewed scholarly publications (either final accepted manuscripts or preferably final published articles) resulting from their grants to enable them to carry out their roles in the national public access scheme. Such licenses would enable the placement of peer-reviewed content in publicly accessible repositories capable of preservation, discovery, sharing, and machine-based services such as text mining, once an embargo has expired.
The need for universities and researchers to maintain rights to make their research available under open licenses is aligned with the recommendations that Creative Commons made to the federal government in our testimony during the public hearings at the National Academies. In our comments, we urged agencies to allow authors to deposit articles immediately in a repository under a worldwide, royalty-free copyright license that allows the research to be used for any purpose as long as attribution is given to the authors. By making it possible for authors to make their research articles available immediately as open access, federal agencies will be clarifying reuse rights so that downstream users know the legal rights and responsibilities in using that research. This would include the important reuse permissions noted in the SHARE proposal.
We also suggested that federal agencies require authors to deposit their manuscripts into a public repository immediately upon publication in a peer-reviewed journal. This is also in line with the SHARE plan. If an embargo is in place, the SHARE repository will link to the commercial publisher’s website. And once the embargo period expires, the repository would be able to “flip on” access to the article, which would then be made available under the open license.
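The "flip on" behavior described above can be sketched in a few lines. This is a hypothetical illustration, not part of the SHARE plan itself; the function name, parameters, and the 30-day approximation of a month are all assumptions.

```python
from datetime import date, timedelta

def repository_view(publication_date, embargo_months, publisher_url,
                    manuscript_url, today=None):
    """Return the URL a repository would expose for an article.

    While the embargo is active, link out to the publisher's site;
    once it expires, "flip on" access to the open manuscript.
    Illustrative only -- names and logic are not from the SHARE spec.
    """
    today = today or date.today()
    # Approximate months as 30-day periods for this sketch.
    embargo_ends = publication_date + timedelta(days=30 * embargo_months)
    if today < embargo_ends:
        return publisher_url    # embargo active: point to the publisher
    return manuscript_url       # embargo expired: open access copy
```

In practice a repository would also switch the record's license metadata at the same moment, so that machine readers see the open license only once the article is actually available.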
The SHARE proposal also notes, “licensing arrangements should ensure that no single entity or group secures exclusive rights to publications resulting from federally funded research.” It is important that universities and scholarly authors properly manage copyrights from the get-go in order to make sure that the final manuscript can be made publicly available under the requirements set out by the White House public access directive. This consideration was widely discussed at the federal level when the NIH Public Access Policy went into effect. In addition, universities have passed open access policies that reserve the legal rights to archive research conducted by their faculty. And author-level copyright tools have proved useful for faculty who wish to retain some rights to the articles they submit to commercial publishers.
Two weeks ago we wrote about the U.S. Executive Order and announcement of Project Open Data, an open source project (managed on Github) that lays out the implementation details behind the President’s Executive Order and memo. The project offers more information on open licenses, and gives examples of acceptable licenses for U.S. federal data. Some of this information is clear, while other pieces require more clarification. Below we’ve provided some commentary and notes on the licensing parts of Project Open Data.
The Open Licenses page on Project Open Data says that a license will be considered “open” if the following conditions are met:
Reuse. The license must allow for reproductions, modifications and derivative works and permit their distribution under the terms of the original work.
Users can copy and make adaptations of the data. The government may use a copyleft license, thus requiring that adapted works be shared under the same license as the original. In our view, the reference to the government using a license is confusing. Works created by federal government employees are in the public domain, and a license is not appropriate–at least as a matter of U.S. copyright law. More on this below.
The rights attached to the work must not depend on the work being part of a particular package. If the work is extracted from that package and used or distributed within the terms of the work’s license, all parties to whom the work is redistributed should have the same rights as those that are granted in conjunction with the original package.
Everyone is offered the work under the same public license.
Redistribution. The license shall not restrict any party from selling or giving away the work either on its own or as part of a package made from works from many different sources.
Third parties can sell the data verbatim or produce adaptations of the data and sell those.
The license shall not require a royalty or other fee for such sale or distribution.
Users don’t have to pay to use the licensed data.
The license may require as a condition for the work being distributed in modified form that the resulting work carry a different name or version number from the original work.
When the data gets remixed the licensor can require that the remixer note that their remixed version is different from the original.
The rights attached to the work must apply to all to whom it is redistributed without the need for execution of an additional license by those parties.
Public licenses must be used, which means that everyone gets offered the data under the same terms, without the need to negotiate individual licenses.
The license must not place restrictions on other works that are distributed along with the licensed work. For example, the license must not insist that all other works distributed on the same medium are open.
The license doesn’t infect other data or content that is distributed alongside the openly licensed data. It’s important that the open data is marked as such; the same goes for marking the non-open data.
If adaptations of the work are made publicly available, these must be under the same license terms as the original work.
This is a confusing statement, because it seems to require that all data be licensed under a copyleft license. This does not align with the licensing options listed in the Open License Examples page.
No Discrimination against Persons, Groups, or Fields of Endeavor. The license must not discriminate against any person or group of persons. The license must not restrict anyone from making use of the work in a specific field of endeavor. For example, it may not restrict the work from being used in a business, or from being used for research.
Anyone may use the licensed data for any reason.
Open License Examples
The Open License Examples page offers a helpful guide as to which open licenses will be accepted for government data released by federal agencies. As we noted in our earlier post, there is some confusion in that the Open Data Policy Memo says, “open data are made available under an open license that places no restrictions on their use.” Saying that data should be placed under a license with no restrictions doesn’t make sense, since even a very “open” license (such as CC BY) requires attribution to the author as a condition of using the license. If the United States truly wishes to make federal government data available without restriction, it could consider mandating only those tools that accomplish this, for example the CC0 Public Domain Dedication or the Open Data Commons Public Domain Dedication and License.
Data and content created by government employees within the scope of their employment are not subject to domestic copyright protection under 17 U.S.C. § 105.
The fact that data and content created by federal government employees are not subject to copyright protection in the United States is a longstanding positive feature of the U.S. Code. But as noted here, this copyright-free zone only applies to domestic protection, i.e. inside the United States. Outside its borders, the United States government could assert that, for example, one of its works is protected under French copyright law, and then enforce its copyright in France. It’s unclear how often this legal nuance is leveraged outside of the United States. But it does seem to create a challenge for U.S. federal agencies in utilizing public domain dedication tools like CC0. This is because CC0 places content into the worldwide public domain, whereas under Section 105 works created by federal government employees are only in the public domain in the United States. So, while it’s useful that works created by U.S. federal government employees are in the public domain in the United States, it’s a shame that this seems to preclude federal agencies from utilizing public domain tools like CC0, which would help communicate broad reuse rights easily and in machine-readable form. This raises a larger question: if information created by federal government employees is in the public domain in the United States, then is it inappropriate to license this data and content under one of the licenses noted below? And, if that is true, then what content will be licensed under the conformant licenses? Third-party content?
When purchasing data or content from third-party vendors, however, care must be taken to ensure the information is not encumbered by a restrictive, non-open license. In general, such licenses should comply with the Open Knowledge Definition of an open license. Several examples of common open licenses are listed below:
- Creative Commons BY, BY-SA, or CC0
- GNU Free Documentation License
- Open Data Commons Public Domain Dedication and Licence (PDDL)
- Open Data Commons Attribution License
- Open Data Commons Open Database License (ODbL)
- Creative Commons CC0
Notwithstanding the questions above about licensing options for work produced by federal government employees, the Administration is taking a great step in recommending that licenses align with the Open Definition. In addition, the Administration might include information about appropriate software licenses, should those come into play when agencies release data.
Seal Of The Executive Office Of The President / Public Domain
Yesterday President Barack Obama issued an Executive Order requiring federal government information to be open and machine-readable by default. This Order is the latest in a series of actions going back to 2009 in support of increasing access to and transparency of government information.
In addition to the Executive Order, the White House released a Memorandum (PDF) explaining how federal government agencies will comply with the new open data policy.
This Memorandum requires agencies to collect or create information in a way that supports downstream information processing and dissemination activities. This includes using machine readable and open formats, data standards, and common core and extensible metadata for all new information creation and collection efforts. It also includes agencies ensuring information stewardship through the use of open licenses and review of information for privacy, confidentiality, security, or other restrictions to release.
It provides a forward-thinking set of guidelines for open data to be released by U.S. federal agencies:
Open data: For the purposes of this Memorandum, the term “open data” refers to publicly available data structured in a way that enables the data to be fully discoverable and usable by end users. In general, open data will be consistent with the following principles:
- Public. Consistent with OMB’s Open Government Directive, agencies must adopt a presumption in favor of openness to the extent permitted by law and subject to privacy, confidentiality, security, or other valid restrictions.
- Accessible. Open data are made available in convenient, modifiable, and open formats that can be retrieved, downloaded, indexed, and searched. Formats should be machine-readable (i.e., data are reasonably structured to allow automated processing). Open data structures do not discriminate against any person or group of persons and should be made available to the widest range of users for the widest range of purposes, often by providing the data in multiple formats for consumption. To the extent permitted by law, these formats should be non-proprietary, publicly available, and no restrictions should be placed upon their use.
- Described. Open data are described fully so that consumers of the data have sufficient information to understand their strengths, weaknesses, analytical limitations, security requirements, as well as how to process them. This involves the use of robust, granular metadata (i.e., fields or elements that describe data), thorough documentation of data elements, data dictionaries, and, if applicable, additional descriptions of the purpose of the collection, the population of interest, the characteristics of the sample, and the method of data collection.
- Reusable. Open data are made available under an open license that places no restrictions on their use.
- Complete. Open data are published in primary forms (i.e., as collected at the source), with the finest possible level of granularity that is practicable and permitted by law and other requirements. Derived or aggregate open data should also be published but must reference the primary data.
- Timely. Open data are made available as quickly as necessary to preserve the value of the data. Frequency of release should account for key audiences and downstream needs.
- Managed Post-Release. A point of contact must be designated to assist with data use and to respond to complaints about adherence to these open data requirements.
The Memorandum provides some more information about how U.S. government information will be made reusable:
Ensure information stewardship through the use of open licenses – Agencies must apply open licenses, in consultation with the best practices found in Project Open Data, to information as it is collected or created so that if data are made public there are no restrictions on copying, publishing, distributing, transmitting, adapting, or otherwise using the information for non-commercial or for commercial purposes.
Depending on the exact implementation details, this could be a fantastic move that would remove any legal confusion about using federal government data. By leveraging open licenses, the U.S. federal government would be doing a great service to reusers by communicating those rights available in advance. And, if the U.S. truly wishes to make federal government information available without restriction, it could consider using a tool such as the CC0 Public Domain Dedication. CC0 is used by many data providers to place open data directly in the public domain. We’ve already suggested this (PDF) as an option for sharing federally funded research data.
The White House should be commended for taking another positive step forward to ensure that U.S. government data is made legally and technically accessible and usable.
Today, U.S. Register of Copyright Maria Pallante stood before Congress to say: we need a new copyright law. Pallante’s prepared remarks (127 KB PDF) to the U.S. House of Representatives, Subcommittee on Courts, Intellectual Property, and the Internet called for “bold adjustments” to U.S. copyright law.
This is a most welcome aspiration. A strong push for copyright reform is currently occurring around the world through domestic reviews and in international fora like WIPO — coming both from those wanting increased recognition of user rights and those calling for tighter author controls. With the United States one of the leading nations advocating for stronger copyright protection through treaties such as ACTA and the TPP, the international community will be closely observing any movement in U.S. domestic law.
Seal of the United States Copyright Office / Public Domain
In addition to several meaningful reform ideas — including shortening the copyright term itself, alterations to the Digital Millennium Copyright Act, and making revisions to exceptions and limitations for libraries and archives — we’re happy to see that the Register is highlighting the crucial need to expand and protect the public domain. Some of the most compelling work undertaken by Creative Commons and others in the open community has to do with increasing the accessibility and value of the public domain. We hope a more positive public domain agenda can become ingrained into the foundations of U.S. copyright policy. The central question: Can the United States devise a better system for both authors and the public interest in an environment where technology and social norms are increasingly disconnected from an aging copyright law?
Pallante said, “[A]uthors do not have effective protections, good faith businesses do not have clear roadmaps, courts do not have sufficient direction, and consumers and other private citizens are increasingly frustrated.” However, there is no doubt that public copyright licenses are offering a substantial and effective counter to some of these pains — even noted by Ms. Pallante in her longer lecture at Columbia University titled The Next Great Copyright Act (337 KB PDF), “[S]ome [authors] embrace the philosophy and methodology of Creative Commons, where authors may provide advance permission to users or even divest themselves of rights.” CC licenses and public domain instruments are right now helping alleviate frustration with copyright for all — individuals, businesses, institutions, governments — who opt in to using public licenses and licensed works.
Indeed, public licenses are easy-to-use tools for communities that wish to share their creativity on more flexible terms. And when millions of motivated creators share under public copyright licenses like CC, they create great and lasting things (hello Wikipedia). Public copyright licenses shine brightly in the light of Pallante’s telling reflection: “If one needs an army of lawyers to understand the precepts of the law, then it is time for a new law.”
At the same time, the existence of open copyright licenses shouldn’t be interpreted as a substitute for robust copyright reform. Quite the contrary. The decrease in transaction costs, increase in collaboration, and massive growth of the commons of legally reusable content spurred on by existence of public licenses should drastically reinforce the need for fundamental change, and not serve as a bandage for a broken copyright system. If anything, the increase in adoption of public licenses is a bellwether for legislative reform — a signal pointing toward a larger problem in need of a durable solution.
We and the rest of the international community are looking forward to seeing what Pallante and Congress have in mind when they continue the discussion after today. In her oral testimony, Ms. Pallante said, “Copyright is about the public interest.” We hope that the public interest has a seat at the table, with room both for open content licensing and positive legislative reform. The existence of CC licenses does not limit the need for reform. Open licenses help forward-thinking people and institutions to live and thrive in the digital age now, and illuminate the roadmap for beneficial reform to come. Let us begin.
Today, the White House issued a Directive supporting public access to publicly-funded research.
John Holdren, Director of the Office of Science and Technology Policy, “has directed Federal agencies with more than $100M in R&D expenditures to develop plans to make the published results of federally funded research freely available to the public within one year of publication and requiring researchers to better account for and manage the digital data resulting from federally funded scientific research.”
Each agency covered by the Directive (54 KB PDF) must “Ensure that the public can read, download, and analyze in digital form final peer reviewed manuscripts or final published documents within a timeframe that is appropriate for each type of research conducted or sponsored by the agency.”
The Directive comes after a multi-year campaign organized by Open Access advocates, and reflects a groundswell of grassroots support for public access to the scientific research that the public pays for. It also arrives on the heels of the introduction of the Fair Access to Science and Technology Research Act (FASTR). The Directive and FASTR are complementary approaches to ensuring that the public can access and use the scientific research it pays for.
We applaud this important policy Directive. While the Directive and FASTR do not specifically require the application of open licenses to the scientific research outputs funded with federal tax dollars, both actions represent crucial steps toward increasing public access to research.
Today marks an historic step forward for public access to publicly funded research in the United States. The Fair Access to Science and Technology Research Act (FASTR) was introduced in both the House of Representatives and the Senate. FASTR requires federal agencies with annual extramural research budgets of $100 million or more to provide the public with online access to the research articles stemming from that funded research no later than six months after publication in a peer-reviewed journal.
If passed, the legislation would extend the current NIH Public Access Policy (with a shorter embargo) to other US federal agencies, such as the Department of Agriculture, Department of Energy, NASA, the National Science Foundation, and others.
The bill text is available here. The legislation was introduced with bi-partisan support in both the House and Senate. Sponsors include Sens. Cornyn (R-TX) and Wyden (D-OR), and Reps. Doyle (D-PA), Yoder (R-KS), and Lofgren (D-CA).
Creative Commons has supported policies aligned with the practice of making taxpayer funded research available free online and ideally under an open license that communicates broad downstream use rights, such as CC BY. While FASTR – like the NIH Public Access Policy before it – does not directly require the application of open licenses to the scientific research outputs funded with federal tax dollars, it represents a key next step toward increasing the usefulness of public access to research.
Specifically, FASTR includes provisions that move the ball down the field toward better communicating reuse rights. Peter Suber notes,
- FASTR includes a new “finding” in its preamble (2.3): “the United States has a substantial interest in maximizing the impact and utility of the research it funds by enabling a wide range of reuses of the peer-reviewed literature that reports the results of such research, including by enabling computational analysis by state-of-the-art technologies.”
- FASTR includes a formatting and licensing provision (4.b.5): the versions deposited in repositories and made OA shall be distributed “in formats and under terms that enable productive reuse, including computational analysis by state-of-the-art technologies.”
In addition to making articles free to access and read after a six-month publishing embargo, these new provisions make a significant impact in pushing federal agencies to ensure that the research they fund is available and useful for new research techniques like text/data mining.
SPARC has issued an action alert, and there are several specific things you can do in support of FASTR. Today marks the 11th anniversary of the Budapest Open Access Initiative, and you can voice your support that the public needs and deserves access to the research it paid for and upon which scientific advancement and education depend.
Last week the Federal Research Public Access Act (FRPAA) was reintroduced with bipartisan support in both the U.S. House of Representatives and the Senate. According to SPARC, the bill would “require federal agencies to provide the public with online access to articles reporting on the results of the United States’ $60 billion in publicly funded research no later than six months after publication in a peer-reviewed journal.” If passed, the legislation would extend the current NIH Public Access Policy (with a shorter embargo) to other US government-funded research, including agencies such as the Department of Agriculture, Department of Energy, NASA, the National Science Foundation, and others. FRPAA was first introduced in 2006.
Unlike the Research Works Act, FRPAA would ensure that the public has access to the important scientific and scholarly research that it pays for. Creative Commons recently wrote to the White House asking that taxpayer funded research be made available online to the public immediately, free-of-cost, and ideally under an open license that communicates broad downstream use rights, such as CC BY. While FRPAA–like the NIH Public Access Policy before it–does not require the application of open licenses to the scientific research outputs funded with federal tax dollars, it is a crucial step toward increasing public access to research.
SPARC has issued an action alert, and there are several specific actions you can take in support of FRPAA. On this 10th anniversary of the Budapest Open Access Initiative, please voice your support that the public needs and deserves access to the research it paid for and upon which its education depends.
In November we wrote that the White House Office of Science and Technology Policy (OSTP) was soliciting comments on two related Requests for Information (RFI). One asked for feedback on how the federal government should manage public access to scholarly publications resulting from federal investments, and the other wanted input on public access to the digital data funded by federal tax dollars.
Creative Commons submitted a response to both RFIs. Below is a brief summary of the main points. Several other groups and individuals have submitted responses to OSTP, and all the comments will eventually be made available on the OSTP website.
- The public funds tens of billions of dollars in research each year. The federal government can support scientific innovation, productivity, and economic efficiency of the taxpayer dollars they expend by instituting an open licensing policy.
- Scholarly articles created as a result of federally funded research should be released under full open access. Full open access policies will provide to the public immediate, free-of-cost online availability to federally funded research without restriction except that attribution be given to the source.
- The standard means for granting permission to the public aligned with full open access is through a Creative Commons Attribution (CC BY) license.
- If the federal government wants to maximize the impact of digital data resulting from federally funded scientific research, it should provide explicit, easy-to-understand information about the rights available to the public.
- The federal government should establish policies that ensure the public has cost-free, unimpeded access to the digital data resulting from federally funded scientific research. Access to this data should be made available as soon as possible, with due consideration to confidentiality and privacy issues, as well as the researchers’ need to receive credit and benefit from the work.
- The federal government can grant these permissions to the public by supporting policies whereby 1) data is made available by dedicating it to the public domain or 2) data is made available through a liberal license where at most downstream data users must give credit to the source of the data. CC offers tools such as the CC0 waiver and CC BY license in support of these goals.
The hearings are still going on; please keep calling, emailing, and otherwise spreading the word!
Tomorrow the House Judiciary Committee will debate and potentially vote on SOPA, the Internet Blacklist bill that would break the Internet.
Our friends at the Electronic Frontier Foundation have compiled a list of 12 actions you can take now to stop SOPA.
Soon you’ll find a huge banner at the top of every page on the CC site protesting SOPA. The Wikimedia community is considering a blackout to bring massive attention to the danger posed by SOPA. Many others are taking action. What are you doing?
For background on the bill, why it would be especially bad for the commons, and links for news, check out our previous post calling for action against SOPA and a detailed post from Wikimedia’s General Counsel.
Finally, remember that CC is crucial to keeping the Internet non-broken in the long term. The more free culture is, the less culture has an allergy to and deathwish for the Internet. We need your help too. Thanks!
November 16 the U.S. Congress will hold hearings on a bill that would unfairly, recklessly and capriciously enable and encourage broad censorship of the Internet in the name of suppressing distribution of works not authorized by copyright holders. As Public Knowledge aptly summarizes, the “Stop Online Piracy Act” would seriously “threaten the functioning, freedom, and economic potential of the Internet” by:
- short-circuiting the legal system, giving rightsholders a fast-track to shutting down whole websites;
- creating conflicts between Domain Name System (DNS) servers, making you more vulnerable to hackers, identity theft, and cyberattacks;
- sanctioning government interference with the Internet, making it more censored globally.
SOPA threatens every site on the Internet, but would especially harm the commons, as the Electronic Frontier Foundation explains, focusing on free software. The same applies to free and open projects beyond software, which often use CC licenses. While standard public licenses have lowered the costs and risks of legal sharing and collaboration, SOPA would drastically increase both the costs and risks of providing platforms for sharing and collaboration (think sites ranging from individual blogs to massive community projects such as Wikipedia, from open education repositories to Flickr and YouTube), and vaporize accessibility to huge swathes of free culture, whether because running a platform becomes too costly, or because a single possibly infringing item causes an entire domain to be taken down.
The trend one can plot from the DMCA (1998) to SOPA, along with continued extensions and expansions of copyright and related restrictions around the world, also demonstrates the incredible importance of the commons for healthy information policy and a healthy Internet — almost all other “IP” policy developments have been negative for society at large. The DMCA was decried by advocates of free speech and the Internet, and over the past 13 years has had many harmful effects. Now, in 2011, some think that the U.S. Congress ‘struck the right balance’ in 1998, while big content is dissatisfied and, with SOPA, wants to ratchet the ‘balance’ (watch out, 2024!) much further to its short-term advantage.
Techdirt has excellent coverage of the gritty details of SOPA, its ill effects, and the many constituencies alarmed (such as librarians and sports fans).
Please take action! If you aren’t already sharing works under a CC license and supporting our work, now is a good time. Bad legislation needs to be stopped now, but over the long term, we won’t stop getting new bad legislation until policymakers see broad support and amazing results from culture and other forms of knowledge that work with the Internet, rather than against it. Each work or project released under a CC license signals such support, and is an input for such results.