public policy

Trans-Pacific Partnership Would Harm User Rights and the Commons

Timothy Vollmer, November 16th, 2015

The final text of the Trans-Pacific Partnership (TPP) was released earlier this month. The gigantic agreement contains far-reaching provisions on environmental regulation, pharmaceutical procurement, intellectual property, labor standards, food safety, and many other subjects. If adopted, it would be the most sweeping expansion of international restrictions on copyright in over two decades. Over the last five years, the TPP has been developed and negotiated in secret. With the text now locked down, participating governments will decide whether to ratify it.

The TPP is a direct threat to the public interest and the commons. It downplays the importance of the public domain and exceptions and limitations, increases the term of copyright protection, and demands harsh infringement penalties.

The TPP must be rejected.

In our initial analysis, we examine several issues that would be detrimental to the public domain, creativity and sharing, and user rights in the digital age.

  • 20-year copyright term extension is unnecessary and unwarranted: The agreement requires member nations to increase their term of copyright protection to the life of the author plus 70 years. Six of the twelve participating countries will have to extend their copyright terms 20 years past the baseline required by existing international treaties.
  • The mention of the public domain is lip service, at best: Text that more actively supported the public domain as a key policy objective has been removed.
  • Enforcement provisions are mandatory, while exceptions and limitations are optional: Instead of securing mandatory limitations and exceptions for uses of copyrighted works under TPP, all of the provisions that recognize the rights of the public are voluntary, whereas almost everything that benefits rightsholders is binding.
  • Potentially drastic infringement penalties, even for non-commercial sharing: The agreement allows for infringement penalties that are disproportionate to harm, providing for the possibility of imprisonment and excessive monetary fines for lesser infringements.
  • Criminal penalties for circumventing digital rights management on works: The agreement adopts a mechanism that would prohibit the circumvention of technological protection measures (DRM) on works, and treats this type of violation as a separate offense regardless of any copyright infringing activity on the underlying content.
  • Investor-state dispute settlement mechanism may be leveraged for intellectual property claims: Copyrighted materials can be subject to the investor-state dispute settlement (ISDS) mechanism, meaning that a private company could bring a lawsuit against a TPP country if that country adopts a law that the company claims would harm its right to exploit its copyright interest.

Statement from Creative Commons on copyright-related aspects of the TPP


Supporting user rights for mass digitization of culture

Timothy Vollmer, October 12th, 2015

Assignments of copyrights photostat copies by mollyali, CC BY-NC 2.0

A few months ago the United States Copyright Office issued a request for comments on an extended collective licensing (ECL) pilot program it is considering for mass digitization projects. The Office thinks that such a program would permit greater access to cultural works by allowing institutions to engage in mass digitization and then license those digital collections for a fee. Creative Commons and Creative Commons USA submitted comments to the Copyright Office in coordination with Wikimedia and the Internet Archive.

We urged the Office to reconsider the pilot because the fair use doctrine has actually been strengthened in the U.S. due to recent court cases. This has increased the certainty with which a number of entities can engage in mass digitization. And even though the Office points toward similar pilots in Europe, their reliance on ECL is a response to the inflexibility of the current EU copyright framework. Some European cultural heritage institutions are willing to accept the ECL framework because they have no other option. U.S. institutions—such as university libraries—can rely on fair use.

The ECL system as proposed by the Office would not work well to support mass digitization projects. Many authors are not primarily interested in financial rewards—for example, those who write scholarly books. And if a creator has no expectation of revenue, having a collective rights organization collect fees for the use of such works is inefficient and runs counter to the intentions of these authors.

The proposed ECL scheme would be more compelling if it covered a broader range of uses, but the Office has chosen to favor a pilot program that would “facilitate the work of those who wish to digitize and provide full access to certain collections of books, photographs, or other materials for nonprofit educational or research purposes.” By limiting the proposed ECL scope to noncommercial uses, the Office inadvertently strengthens the case that the activities of digitizers and users will be considered fair use and that ECL is not needed in the first place.

We explained that if the Office ultimately pursues an ECL pilot, it should affirmatively exclude works that are publicly licensed and allow other authors who wish to be excluded to apply a Creative Commons license to their work.

In the end, we agreed with many of the libraries that if the Copyright Office is serious about helping to increase legal mass digitization of our shared cultural heritage, it should instead focus on:

  • Encouraging the application of fair use to digitization projects;
  • Promoting the development of better copyright ownership and status information through enhanced registries, rethinking recordation, and asking copyright owners to identify themselves and their works through an internationally-compliant formalities system; and
  • Providing better access to existing copyright ownership and status information by digitizing, or encouraging others to digitize and provide free access to, all of the Copyright Office’s records.

Comments of Creative Commons and CC USA (PDF)


Don’t mess with the right to link

Timothy Vollmer, May 6th, 2015


(Hyper)links are the fundamental building blocks of the web, but the practice of linking has come under attack over the last few years. If copyright holders are able to censor or control links to legitimate content, it could disrupt the free flow of information online and hurt access to crucial news and resources on the web.

In the U.S. and Canada we may take for granted that no one needs permission or has to pay a fee to link to another place online. But this isn’t the case everywhere. Holders of copyrighted content (including news organizations, media, and entertainment sites) around the world are working to undermine the right to free and open linking, and the threat is more present than you may think.

Today a coalition of over 50 organizations (including Creative Commons) from 21 countries is launching a new campaign to defend the right to link. The campaign aims to raise awareness about the issue and urge decision makers to protect the practice of free and open linking online.

Ryan Merkley, CEO of Creative Commons, said, “At its core, the Internet is a network of links — connectivity is at the heart of the Web we love. Breaking that structure by giving some the ability to decide what links should work and what links should not undermines free expression, access to information, and the public commons.”

One example of link restrictions is already in place in Spain, where the government passed a law that “requires services which post links and excerpts of news articles to pay a fee to the organisation representing Spanish newspapers.” This type of pseudo-copyright law was intended to protect the revenue flows of Spanish media publishers. However, the practice appears to have backfired for the publishers who hoped to use the new rule as a means to monetize access to their content. It’s quite telling that Google News–which funnels significant traffic to media websites–shut down in Spain shortly after the law was passed, citing concerns that allowing rights holders to charge for access to links would be unworkable for the service.

Last year’s public consultation on the review of European copyright rules also contained a question on the right to link:

Should the provision of a hyperlink leading to a work or other subject matter protected under copyright, either in general or under specific circumstances, be subject to the authorisation of the rightholder?

Many groups, including Creative Commons, responded that allowing rights holders to control access to links would be a terrible idea.

Under no circumstance should hyperlinks be subject to protection under copyright. Sharing links without needing permission from the rightsholder is core to the operation of the internet. Changing this fundamental structural aspect of how the internet works would be detrimental to the free flow of information and commerce online.

You can check out the campaign website for examples from around the world where the right to link is in danger. Read the press release here.

If links can be censored by rights holders, it would be detrimental to access to information, free expression, and economic activity. It could fracture the longstanding mechanism underlying the sharing of information on the web. Let’s not let that happen.

You can sign the petition on the campaign website. Organizations wishing to join the coalition can do so there as well.


Hague Declaration calls for IP reform to support access to knowledge in the digital age

Timothy Vollmer, May 6th, 2015

Hague Declaration logo

Today Creative Commons joins over 50 organizations in releasing the Hague Declaration on Knowledge Discovery in the Digital Age. The declaration is a collaboratively-created set of principles that outlines core legal and technical freedoms that are necessary for researchers to be able to take advantage of new technologies and practices in the pursuit of scholarly research, including activities such as text and data mining. The drafting of the declaration was led by LIBER, the Association of European Research Libraries. It was developed through contributions from dozens of organizations and individuals, including several experts from the CC community. Creative Commons is an original signatory to the declaration.

One of the key principles recognized in the declaration is that intellectual property law does not regulate the flow of facts, data, and ideas–and that licenses and contract terms should not regulate or restrict how an individual may analyze or use data. It supports the notion that “the right to read is the right to mine”, and that facts, data, and ideas should never be considered to be under the protection of copyright. To realize the massive, positive potential for data and content analysis to help solve major scientific, medical, and environmental challenges, it’s important that intellectual property laws and private contracts do not restrict practices such as text and data mining.
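
For readers unfamiliar with the practice being defended here, content mining is, at its simplest, automated analysis of material a researcher already has lawful access to read. The sketch below is a minimal, illustrative example only (the folder name and search term are assumptions, not anything specified in the declaration) of the kind of routine text analysis that the "right to read is the right to mine" principle says should not require additional permission.

    # A toy example of text and data mining: count how often a term appears
    # across a local folder of plain-text, openly licensed articles.
    # The folder name and search term below are illustrative assumptions.
    import os
    import re
    from collections import Counter

    ARTICLES_DIR = "articles"   # hypothetical directory of full-text articles
    TERM = "malaria"            # hypothetical research term of interest

    counts = Counter()
    for filename in os.listdir(ARTICLES_DIR):
        if not filename.endswith(".txt"):
            continue
        with open(os.path.join(ARTICLES_DIR, filename), encoding="utf-8") as f:
            words = re.findall(r"[a-z']+", f.read().lower())
        counts[filename] = words.count(TERM)

    # Rank the articles by how often they mention the term of interest.
    for filename, n in counts.most_common(10):
        print(f"{filename}: {n} occurrences of '{TERM}'")

Nothing in this kind of analysis redistributes the expressive content of the articles; it simply extracts facts from works the researcher can already read, which is the activity the declaration argues should not require extra permission or licensing.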



The Hague Declaration also lays out a roadmap for action in support of these principles. The roadmap suggests developing policies that provide legal clarity that content mining is not an infringement of copyright or related rights. It’s important for advocates to champion this notion, especially as some rights holders are attempting to develop new legal arrangements and licenses that would require users to ask permission to engage in practices such as text and data mining.

In addition to supporting the notion that the right to read is the right to mine–free from additional copyright-like rights, license, or contractual arrangements–the declaration also suggests that if funding bodies are considering adopting open licensing mandates as a component of receiving grant funds, they should aim to adopt policies that champion a liberal licensing approach. Specifically the declaration states that research articles created with grant funds should be published in the global commons under a liberal license such as CC BY, and that research data should be shared in the worldwide public domain via the CC0 Public Domain Dedication.

The Hague Declaration is an important set of principles and recommended actions that can aid the speed and effectiveness of scholarly research and knowledge discovery today. You can read the LIBER press release here. To show your support, you can sign the declaration.


Report back: Institute for Open Leadership meeting

Timothy Vollmer, February 10th, 2015

Creative Commons and the Open Policy Network hosted the first Institute for Open Leadership meeting in San Francisco on 12-16 January 2015. The Institute for Open Leadership (IOL for short) is a training program to identify and cultivate new leaders in open education, science, public policy, research, data, and other fields, grounded in the values and implementation of openness in licensing, policies, and practices. The rationale for the IOL is to educate and empower potential open advocates within existing institutional structures, in order to expand and promote the principle that publicly funded resources should be openly licensed.

IOL group shot by Cable Green under CC BY

There was significant interest in the first iteration of the IOL program: we received over 95 applications and selected 14 fellows for the first Institute. The fellows came from around the world (Bangladesh, Barbados, Chile, Colombia, Greece, Nepal, New Zealand, Poland, Portugal, Somalia, United States), and reflect a wide range of institutions–from community colleges to the government sector to public radio.

The central component of the IOL program requires fellows to develop, refine, and implement a capstone open policy project within their home institution. Creative Commons staff and other selected mentors provided guidance throughout this process.

Week’s activities
The week was deliberately structured with the fellows at the center of the conversation, with a specific focus on providing them with the information and tools to develop and successfully implement their open policy project in their institution. We constructed the week’s activities to cover a wide range of topics, including:

  • Overview of Creative Commons and open licensing, as this is a key aspect of all open policies.
  • Deep dive into open policy, including identifying existing real-world examples, sharing lessons learned, discussing the value proposition, and reviewing typical opposition arguments.
  • Discussion of the practical development of policy roadmaps and roll-out strategies across different sectors and institutions.
  • Campaign planning, plus advice and best practices on communicating with decision makers about open policy.
  • Identification of resources in support of open policy development and implementation, including presentations, reports, videos, and informational and promotional materials.
  • Sharing of best methods for educating and informing decision makers about open policy, including workshops, courses, and hackathons.
  • Testing fellows’ open policy knowledge and expected challenges through an open policy “shark tank.”
  • An interview by the Hewlett Foundation communications team with multiple IOL fellows for a Hewlett story on the power of CC licensing.

Mentors included Cable Green, Paul Stacey, Timothy Vollmer, and Puneet Kishor from Creative Commons, and Nicole Allen and Nick Shockey from SPARC. Each mentor brought specific subject-area expertise and worked with two or more of the fellows. We grouped the fellows with a mentor based on their project ideas in the following categories: Open Educational Resources, Open Access, Open Data, Open GLAM (galleries, libraries, archives, museums), and Open Business Models. During the week, we provided time for fellows to work individually, with other fellows, and with their mentors.

IOL session by txtbks under CC BY

On the final day of the in-person Institute we asked each fellow to report back on their progress during the week and to answer a common set of questions covering their open policy project objectives, the planned activities to meet those objectives, the challenges they expect to face, the partners they plan to work with, and their metrics for success.

In addition to the whole group discussions, mentor breakouts, and individual work, we included informational and motivational speakers to talk with the fellows over our lunch breaks. These talks were given by individuals with experience working in open policy across a variety of sectors, including Hal Plotkin (former Senior Policy Advisor within the U.S. Department of Education), Abel Caine (OER Program Specialist at UNESCO), Heather Joseph (Executive Director at SPARC), Laura Manley (Project Manager at Open Data 500) and Romain Lacombe (Plume).

Next steps
With the successful completion of the in-person portion of the IOL, the fellows have now returned to their home countries and will begin implementing their open policies. The mentors are committed to continuing to work with their respective fellows, including providing advice and assistance. Fellows and mentors will meet to discuss progress in webinars planned over the following months. The goal is for each fellow to have implemented their open policy at their institution within a year. The fellows will be able to share more information about the implementation of their capstone policy projects in the coming months.

We’ve already solicited feedback from fellows and are currently evaluating the activities and structure of the just-completed IOL. There are already several improvements we’d like to make as we begin to develop the second round of the IOL, to be held outside of North America in January 2016. We plan to open the application process for round two in mid-2015. Demand for the IOL is strong, and we are seeking additional funding to support further Institutes beyond the first two.

Yoda Fountain by Nasir Khan under CC BY-SA
Note: Lucasfilm has offices inside The Presidio, where the IOL took place. Thus, Yoda.

One of the aims of the Institute for Open Leadership is to link participants together into a global network. Participants from this inaugural Institute, and all future ones, become part of a peer-to-peer network that provides mutual support, asks and answers questions, and offers ongoing help with open policy development and implementation. This network helps participants overcome barriers and ensures that open policy opportunities come to fruition.


Institute for Open Leadership kicks off next week

Timothy Vollmer, January 5th, 2015

The Presidio by Mindus under CC BY-NC-SA

It’s a new year, and Creative Commons and the Open Policy Network are excited to work with the inaugural group of fellows at the Institute for Open Leadership. The Institute for Open Leadership–or IOL–is an effort to cultivate new leaders in open education, science, public policy, and other fields, grounded in the values and implementation of openness in licensing, policies, and practices. The rationale for the Institute is to educate and empower potential open advocates within existing institutional structures, in order to expand and promote the principle that publicly funded resources should be openly licensed.

We received nearly 100 high-quality applications and selected 14 fellows for the first Institute. The fellows come from around the world (12 countries), and reflect a wide range of institutions–from community colleges to government ministries to public radio.

We’re hosting the in-person portion of the Institute in California next week. It’s important that the Institute help fellows move from theory to practice: a major component of the program requires fellows to develop, refine, and implement a capstone open policy project within their home institution. Creative Commons and the open community will provide mentorship and guidance throughout this process. As the fellows build and eventually implement their policy projects, we’ll ask them to share their progress, challenges, and successes. We also plan on running a second Institute for Open Leadership outside of North America in late 2015.


Open Definition 2.0 released

Timothy Vollmer, October 7th, 2014

Today Open Knowledge and the Open Definition Advisory Council announced the release of version 2.0 of the Open Definition. The Definition “sets out principles that define openness in relation to data and content,” and is the baseline from which various public licenses are measured. Any content released under an Open Definition-conformant license means that anyone can “freely access, use, modify, and share that content, for any purpose, subject, at most, to requirements that preserve provenance and openness.” The CC BY and CC BY-SA 4.0 licenses are conformant with the Open Definition, as are all previous versions of these licenses (1.0 – 3.0, including jurisdiction ports). The CC0 Public Domain Dedication is also aligned with the Open Definition.

The Open Definition is an important standard that communicates the fundamental legal conditions that make content and data open. One of the most notable updates to version 2.0 is that it separates and clarifies the requirements under which an individual work will be considered open from the conditions under which a license will be considered conformant with the Definition.

Public sector bodies, GLAM institutions, and open data initiatives around the world are looking for recommendations and advice on the best licenses for their policies and projects. It’s helpful to be able to point policymakers and data publishers to a neutral, community-supported definition with a list of approved licenses for sharing content and data (and of course, we think that CC BY, CC BY-SA, and CC0 are some of the best, especially for publicly funded materials). And while some governments and other institutions are still attempting to create their own custom licenses, hopefully the Open Definition 2.0 will help these groups understand the benefits of using an existing OD-conformant license. The more that content and data providers use one of these licenses, the more they’ll add to a huge pool of legally reusable and interoperable content for anyone to use and repurpose.

To the extent that new licenses continue to be developed, the Open Definition Advisory Council has been honing a process to assist in evaluating whether licenses meet the Open Definition. Version 2.0 continues to urge potential license stewards to think carefully before attempting to develop their own license, and requires that they understand the common conditions and restrictions that should (or should not) be contained in a new license in order to promote interoperability with existing licenses.

Open Definition version 2.0 was collaboratively and transparently developed with input from experts involved in open access, open culture, open data, open education, open government, open source and wiki communities. Congratulations to Open Knowledge and the Open Definition Advisory Council on this important improvement.


Dozens of organizations tell STM publishers: No new licenses

Timothy Vollmer, August 7th, 2014

The keys to an elegant set of open licenses are simplicity and interoperability. CC licenses are widely recognized as the standard in the open access publishing community, but a major trade association recently published a new set of licenses and is urging its members to adopt them. We believe that the new licenses would introduce unnecessary complexity and friction, ultimately hurting the open access community far more than they’d help.

Today, Creative Commons and 57 organizations from around the world released a joint letter asking the International Association of Scientific, Technical & Medical Publishers to withdraw its model “open access” licenses. The association ostensibly created the licenses to promote the sharing of research in the scientific, technical, and medical communities. But these licenses are confusing, redundant, and incompatible with open access content published under other public licenses. Instead of developing another set of licenses, the signatories urge the STM Association to recommend to its authors existing solutions that will truly promote STM’s stated mission to “ensure that the benefits of scholarly research are reliably and broadly available.” From the letter:

We share a positive vision of enabling the flow of knowledge for the good of all. A vision that encompasses a world in which downstream communicators and curators can use research content in new ways, including creating translations, visualizations, and adaptations for diverse audiences. There is much work to do but the Creative Commons licenses already provide legal tools that are easy to understand, fit for the digital age, machine readable and consistently applied across content platforms.

So, what’s really wrong with the STM licenses? First, and most fundamentally, it is difficult to determine what each license and supplementary license is intended to do and how STM expects them each to be used. The Twelve Points to Make Open Access Licensing Work document attempts to explain its goals, but it is not at all clear how the various legal tools work to meet those objectives.

Second, none of the STM licenses comply with the Open Definition, as they all restrict commercial uses and derivatives to a significant extent. And they ignore the long-running benchmark for Open Access publishing: CC BY. CC BY is used by a majority of Open Access publishers, and is recommended as the optimal license for the publication, distribution, and reuse of scholarly work by the Budapest Open Access Initiative.

Third, the license terms and conditions introduce confusion and uncertainty into the world of open access publishing, a community in which the terminology and concepts utilized in CC’s standardized licenses are fairly well accepted and understood.

Fourth, the STM licenses claim to grant permission to do many things that re-users do not need permission to do, such as describing or linking to the licensed work. In addition, it’s questionable for STM to assume that text and data mining can be regulated by their licenses. Under the Creative Commons 4.0 licenses, a licensor grants the public permission to exercise rights under copyright, neighboring rights, and similar rights closely related to copyright (such as sui generis database rights). And the CC license only applies when at least one of these rights held by the licensor applies to the use made by the licensee. This is important because in some countries, text and data mining are activities covered by an exception or limitation to copyright (such as fair use in the United States), so no permission is needed. Most recently the United Kingdom enacted legislation specifically excepting noncommercial text and data mining from the reach of copyright.

Finally, STM’s “supplementary” licenses, which are intended for use with existing licenses, would only work with CC’s most restrictive license, Attribution-NonCommercial-NoDerivatives (BY-NC-ND). Even then they would have very limited legal effect, since much of what they claim to cover is already permitted by all CC licenses. As a practical matter, these license terms are likely to be very confusing to re-users when used in conjunction with a CC license.

The Creative Commons licenses are the demonstrated global standard for open access publishing. They’re used reliably by open access publishers around the world for sharing hundreds of thousands of research articles. Scholarly publishing holds massive potential to increase our understanding of science. And creativity always builds on the past, whether it be a musician incorporating samples into a new composition or a cancer researcher re-using data from past experiments in their current work.

But to fully realize innovations in science, technology, and medicine, we need clear, universal legal terms so that a researcher can incorporate information from a variety of sources easily and effectively. The research community can enable these flows of information and promote discoveries by sharing writings, data, and analyses in the public commons. We’ve already built the legal tools to support content sharing. Let’s use them and not reinvent the wheel.

Questions should be directed to


European Commission endorses CC licenses as best practice for public sector content and data

Timothy Vollmer, July 17th, 2014

Today the European Commission released licensing recommendations to support the reuse of public sector information in Europe. In addition to providing guidance on baseline license principles for public sector content and data, the guidelines suggest that Member States should adopt standardized open licenses – such as Creative Commons licenses:

Several licences that comply with the principles of ‘openness’ described by the Open Knowledge Foundation to promote unrestricted re-use of online content, are available on the web. They have been translated into many languages, centrally updated and already used extensively worldwide. Open standard licences, for example the most recent Creative Commons (CC) licences (version 4.0), could allow the re-use of PSI without the need to develop and update custom-made licences at national or sub-national level. Of these, the CC0 public domain dedication is of particular interest. As a legal tool that allows waiving copyright and database rights on PSI, it ensures full flexibility for re-users and reduces the complications associated with handling numerous licences, with possibly conflicting provisions.

The Commission’s recommendations warn against the development of customized licenses, which could break the interoperability of public sector information across the EU. The guidelines clearly state that license conditions should be standardized and contain minimal requirements (such as attribution-only).

In order to proactively promote the re-use of the licenced material, it is advisable that the licensor grants worldwide (to the extent allowed under national law), perpetual, royalty-free, irrevocable (to the extent allowed under national law) and non-exclusive rights to use the information covered by the licence… it is advisable that [licenses] cover attribution requirements only, as any other obligations may limit licensees’ creativity or economic activity, thereby affecting the re-use potential of the documents in question.

This is a welcome outcome that will hopefully provide a clear path for data providers and re-users. It’s great to see this endorsement after our efforts, alongside our affiliate network, to advocate for clear best practices in sharing content and data. The recommendation leverages CC’s free international 4.0 licenses, saving governments time and money and maximizing compatibility and reuse.

Kudos to the Commission and the assistance provided by LAPSI, Open Knowledge, and others.


Proposed U.S. law would weaken and postpone public access to publicly funded research

Timothy Vollmer, March 12th, 2014

This week the U.S. House of Representatives introduced H.R. 4186, the Frontiers in Innovation, Research, Science and Technology Act of 2014 (FIRST Act). The stated goal of the proposed law — “to provide for investment in innovation through scientific research and development, [and] to improve the competitiveness of the United States” — is worthy and well received. But part of the bill (Section 303) is detrimental to both existing and proposed public access policies in the United States.

According to SPARC:

Section 303 of the bill would undercut the ability of federal agencies to effectively implement the widely supported White House Directive on Public Access to the Results of Federally Funded Research and undermine the successful public access program pioneered by the National Institutes of Health (NIH) – recently expanded through the FY14 Omnibus Appropriations Act to include the Departments of Labor, Education, and Health and Human Services. Adoption of Section 303 would be a step backward from existing federal policy in the directive, and put the U.S. at a severe disadvantage among our global competitors.

The White House Directive, NIH Public Access Policy, Omnibus Appropriations Act, and the proposed Fair Access to Science and Technology Research Act (FASTR) all contain similar provisions to ensure public access to publicly funded research after a relatively short embargo (6-12 months). These policies make sure that articles created and published as a result of federal funding are deposited in a repository for access and preservation purposes. In addition, the policies provide a reasonable process and timeline for agencies to develop a plan to comply with the public access requirements.

The FIRST Act would conflict with each of these practices. Instead, if enacted it would permit agencies that must comply with the law to:

  • Extend embargoes on federally funded research articles to up to 3 years after initial publication, thus drastically increasing the time before the public has free access to this research. We’ve said before that the public should be granted immediate access to the content of peer-reviewed scholarly publications resulting from federally funded research. Immediate access is the ideal way to optimize the scientific and commercial utility of the information contained in the articles.
  • Fulfill access requirements by providing a link to a publisher’s site. However, this jeopardizes long-term access and preservation of publicly-funded research in the absence of a requirement that those links be permanently preserved. A better outcome would be to ensure that a copy is deposited in a federally-controlled repository.
  • Spend up to 18 additional months to develop plans to comply with the conditions of the law, thus further delaying the plans that are already being organized by federal agencies under the White House Directive and Omnibus Appropriations Act.

This bill is scheduled to be marked up in the House Committee on Science, Space, and Technology tomorrow, March 13.

But there are better alternatives, both in existing policy (e.g. White House Directive), and in potential legislation (e.g. FASTR). Here’s what you can do right now:

  • Send a letter to members of the House Science, Space and Technology Committee opposing Section 303 of the FIRST Act.
  • Use the SPARC action center to customize and send letters directly to your legislators. Tweet your opposition to Section 303 of the FIRST Act, or post about the bill on Facebook.
  • Write a letter to the editor or an op-ed for your local or campus newspaper. You can write to them directly or use the SPARC legislative action center.
  • Share this post with your colleagues, labs, friends and family.
