The Playground

  • It seems to me LENR is the slow release of energy particles. 'Reactor' is the wrong word; it is the collection of the particles and attaching them to open orbits.

    moving voltage to keep them from bursting

    Feels like one step forward and three steps back today...

  • It is a pity that the patent 'Pseudo-Capacitor Structure for Direct Energy Conversion' now has 'abandoned' status.

    The first PineSci patent has also been 'abandoned', yet it is continued in the recent iteration. I discovered the second PineSci patent while reviewing the patent by Liviu. Although 'abandoned', his patent is cited by 23 other patents, which is a measure of significant influence; the PineSci patent is one of them, and that is where I came across it, in Liviu's patent citations. This is how I find many of the patents I 'scoop' and present in my articles.


    Has his patent influenced recent advances in S-SAFE and nano-engineered lattice structures and thin-film layers? I think it has, and it might continue to have influence.

  • Interesting interview about Nuclear Weapons.


    (Embedded video: youtu.be)

  • Ah yes, all this crap coming from the person who calls cold fusion pseudoscience. It seems to me you don't follow the science; you only seem to want to protect mainstream opinion, not the science.

  • Preprint from late June, not yet peer reviewed: https://www.medrxiv.org/conten…101/2022.06.28.22276926v1

    My underlining

    ABSTRACT

    BACKGROUND There were increased SARS-CoV2 hospitalizations and deaths noted during Omicron (B.1.1.529) variant surge in UK despite decreased cases, and the reasons are unclear.

    METHODS In this observational study, we analyzed reported SARS-CoV2 cases, hospitalizations and deaths during the COVID-19 pandemic in the UK. We also analyzed variables that can affect the outcomes. The vaccine effectiveness among those ≥18 years of age (August 16, 2021 to March 27, 2022) was analyzed.


    RESULTS Of the total cases (n= 22,072,550), hospitalizations (n=848,911) and deaths (n=175,070) due to COVID-19 in UK; 51.3% of cases (n=11,315,793), 28.8% of hospitalizations (n=244,708) and 16.4% of deaths (n=28,659) occurred during Omicron variant surge. When comparing the period of February 28-May 1, 2022 with the prior 12-weeks, we observed a significant increase in the case fatality rate (0.19% vs 0.41%; RR 2.11 [2.06-2.16], p<0.001) and odds of hospitalization (1.58% vs 3.72%; RR 2.36[2.34-2.38]; p<0.001). During the same period a significant increase in cases (23.7% vs 40.3%; RR1.70 [1.70-1.71]; p<0.001) among ≥50 years of age and hospitalizations (39.3% vs 50.3%;RR1.28 [1.27-1.30]; p<0.001) and deaths (67.89% vs 80.07%;RR1.18 [1.16-1.20]; p<0.001) among ≥75 years of age was observed. The vaccine effectiveness (VE) for the third dose was in negative since December 20, 2021, with a significantly increased proportion of SARS-CoV2 cases hospitalizations and deaths among the vaccinated; and a decreased proportion of cases, hospitalizations, and deaths among the unvaccinated. The pre-existing conditions were present in 95.6% of all COVID-19 deaths, various ethnic, deprivation score and vaccination rate disparities noted that can adversely affect hospitalization and deaths among compared groups.


    CONCLUSIONS There is no discernable vaccine effectiveness among ≥18 years of age, vaccinated third dose population since the beginning of the Omicron variant surge. Pre-existing conditions, ethnicity, deprivation score, and vaccination rate disparities data need to be adjusted for evaluating VE for hospitalizations and deaths. The increased cases with significantly increased hospitalizations and deaths among the elderly population during the Omicron variant surge underscores the need to prevent infections in the elderly irrespective of vaccination status with uniform screening protocols and protective measures.
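
    (For readers unfamiliar with the risk ratios quoted in the abstract above: here is a minimal sketch, in Python, of how an RR and its 95% confidence interval are typically computed from two proportions. The counts below are illustrative placeholders, not the preprint's underlying data.)

    import math

    def risk_ratio(events_a, total_a, events_b, total_b, z=1.96):
        """Risk ratio of group A vs group B with a log-scale Wald 95% CI."""
        rr = (events_a / total_a) / (events_b / total_b)
        se_log_rr = math.sqrt(1/events_a - 1/total_a + 1/events_b - 1/total_b)
        lo = math.exp(math.log(rr) - z * se_log_rr)
        hi = math.exp(math.log(rr) + z * se_log_rr)
        return rr, lo, hi

    # Made-up counts giving fatality proportions of roughly 0.41% vs 0.19%,
    # in the spirit of the abstract's comparison of the two 12-week periods:
    print(risk_ratio(events_a=4100, total_a=1_000_000,
                     events_b=1900, total_b=1_000_000))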

  • Not sure whether this is actually written by antivaxxers, but as a UK resident well aware of the data, the headline here shows great ignorance.


    CFRs are never reliable, because cases are not the same as infections.


    In the UK we have well-documented case rates, but also, via the wonderful ONS survey, very accurate whole-population infection rates.


    Over the period covered here, government policy moved from mandating PCR and LFT tests, which made case rates track infection rates much better, towards discouraging tests.


    Guidelines for businesses similarly changed, making it much less likely that infections would be detected as cases.


    In addition, Omicron resulted in stealth infections indistinguishable from a common cold, and therefore not recorded as cases unless LFTs were mandated.


    Any comparison of CASE rates with hospitalisation, rather than comparing the much more reliable ONS survey infection rates with hospitalisation, is just plain silly.


    Or maybe not so silly if you have an antivaxxer agenda?


    Anyway, let us be generous and note that the thing which is a mystery to the researchers here is, for any UK resident, quite obvious: case rates have gone down over this period due to deliberate government policy that discourages testing as part of a "live with COVID" agenda; infection rates have not tracked case rates.
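
    To make the ascertainment point concrete, here is a minimal illustrative sketch (Python, with invented numbers, not UK data): if the infection fatality rate stays constant but a smaller share of infections is detected as cases, the CFR rises mechanically.

    infections = 1_000_000
    ifr = 0.001                       # assumed constant infection fatality rate
    deaths = infections * ifr         # deaths do not depend on how many cases are recorded

    for ascertainment in (0.8, 0.4):  # share of infections recorded as "cases"
        cases = infections * ascertainment
        cfr = deaths / cases
        print(f"ascertainment {ascertainment:.0%}: CFR = {cfr:.3%}")
    # The CFR roughly doubles when case ascertainment halves, with no change in risk.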

  • If your (politically or tribally informed) worldview is that vaccines are uniquely bad for people, or that COVID vaccines are a worldwide conspiracy and the scientific establishment is covering it up, you need a whole load of other ideas to make this consistent, including widespread censorship of science.

    Actually, that is precisely what is shown in the Grand Jury process, which is still ongoing. Just saying. So yeah, those shots are pure poison and evil!
    https://covid-crime.org/grand-jury/

  • 2022-07-03 08:10 Raphael 

    Dr Rossi:

    Did I correctly understand that the plan B is hypothesized as a full-time, 1-year-long broadcasting of an ECAT SKLep in operation at the site of a Leonardo Corporation's client?

    Thank you if you can answer,

    Raphael


    2022-07-03 08:12 Andrea Rossi 

    Raphael:

    Yes,

    Warm Regards,

    A.R.



  • Pierre Kory is, as we all know, a founder of FLCC and a strong advocate of ivermectin for the treatment of COVID. As the big RCTs now come in, we find for sure that it has no (or at most very little) benefit. Kory was arguing that RCTs were not needed because the benefits were so obvious they needed no proof.

    You are distorting his message... again. He was/is an advocate of early treatment. That includes many therapeutics, along with Ivermectin/HCQ. Doctors who have treated early and aggressively have seen much better results preventing progression to hospitalization/death than those who followed the "do nothing until the patient is sick enough to check into the hospital" advice given by the public health hierarchy.


    I can't think of one thing that more exemplifies how terribly misled we were than the fact that public health leaders never established some type of preemptive health care protocol for those testing positive. Making it even worse, they actually threw up obstacles for those, like Kory, who did. Many who took the initiative to do something have been, and are being, prosecuted.


    Anyway, I doubt you care, but the American Board of Internal Medicine is taking action against him (Kory), and he could lose his license for successfully treating patients and preventing many deaths and hospitalizations. That is so wrong, and it is what he is basically writing about in his article.


    And you are wrong about what the "big RCTs now coming in" are showing us. The TOGETHER trial has been ripped to shreds by just about everyone who internet peer-reviewed it. And the ACTIV trial actually showed some small benefit, even though the initial dose was too small. They are following up with more study using higher doses on moderate/severe cases. Then of course there were the Brazil mega observational study, India, and many more on the list.

  • The FDA will address this problem later this year? Previous research has shown problems. More insanity from another alphabet agency; let's wait to talk about it!


    Flawed oxygen readings may be behind Covid-19’s toll on people of color




    Doctors have sometimes failed to diagnose serious cases of Covid-19 among people of color — and the Food and Drug Administration acknowledges one reason may be flaws in devices it approved to measure blood oxygen levels.


    Pulse oximeters can overestimate blood oxygen in people with dark skin, causing doctors to miss patients’ distress signals.


    Researchers identified problems with pulse oximeters years ago, with small studies pointing to misreadings in people of color in 1990, 2005 and 2007.

  • CMNS Energy Technologies

    Commercialisation Considerations


    Along many fronts... This year!


    Meanwhile

    Here is a well-balanced discussion about start-ups in Silicon Valley: a relevant window into history and insight into the perspectives of those influencing the field as various CMNS Energy Technologies enter the world marketplace. I am fairly certain others get a sense of this as well: ICCF-24 heralds market entry.



    Still learning... Good to seek understanding.


    This is from Hacker News

    *Larry Page's brother Carl Page had experience with venture capitalists, having ... | Hacker News


    Quote

    Larry Page's brother Carl Page had experience with venture capitalists, having sold eGroups to Yahoo for $432 million.

    I did not know that.


    I did know Bill Gates' mom was on the United Way executive committee with IBM's CEO.


    I think these details are often omitted because we like to promote the idea that all you need is hard work and talent and perhaps a bit of luck to succeed.


    We tend not to like stories about success as a function of familial ties. This may have something to do with our distaste for monarchies and aristocracies.


    It's also a bit depressing when you're working on a startup but no one in your family is well off, no one is on a board with the CEO of a huge potential partner, and your brother did not sell his startup for hundreds of millions.


    nostrademons on March 17, 2011 | next [–]

    Think of it from Carl Page's perspective, then: his brother had not yet sold his startup for hundreds of millions.


    lux on March 17, 2011 | parent | next [–]

    Very nicely put!

    A have-vs-have-not mindset can creep in easily and get you down on yourself before you've realized it. But so what if someone else has more advantages? I might not have a rich anyone or well-connected anyone to help me, but I have countless other advantages others don't have as well, that I should be thankful for, and work hard to make good on deserving them. Thanks for the reminder :)


    zubazuba on March 17, 2011 | parent | prev | next [–]

    You both make good points.

    It is still amazing what they did with what they had, despite the fact that they had more than the average person.


    However, there is no denying that they were lucky to have that money in the first place so that they could wait until their company was valued more and they had to give less away to VCs. I wonder how many startups had to give away their autonomy for badly needed cash? A large number no doubt.


    From Carl Page's perspective: his brother had not yet sold his startup for hundreds of millions. It is quite a feat that he was able to do it by himself. But if Carl Page did have those hundreds of millions as a possible source of funding, could he have kept his startup from VCs longer and had more autonomy in it, and perhaps sold it for billions?


    nostrademons on March 17, 2011 | root | parent | next [–]

    eGroups wasn't sold to Yahoo until August 2000 (after it IPO'd, actually), 2 years after Google was founded. Larry Page didn't have any more money available than any other Stanford grad student with a professor dad. He did have the experience of having a family member who'd gone through the VC process, but lots of people know people who've taken VC.


    kj12345 on March 17, 2011 | prev | next [–]

    So true. I've often read interviews with founders and things get very vague around the "how did you pay bills before you had revenue" question. It becomes pretty clear that there was some family money, but it's not talked about since it takes away from the pure bootstrapped ideal. I wish there would be more honesty about it though, because it's still an amazing accomplishment to start a company, and it might help founders to get a more accurate picture of what's involved.


    mkramlich on March 17, 2011 | prev [–]

    Agreed. The fact that Page and Brin were both smart and made many good decisions early (well, decisions that later seemed to pay off for them, which is approximately equivalent, but not exactly the same) was a major contributor. But I find it hard to believe that the fact that one of them had a brother who sold a business for $432 million did not help them significantly as well. In two ways: yes the connections, but also the fact that you could potentially go to a brother rather than a total stranger and get millions of dollars to invest. Unlike a lot of current Lean/Agile/Web2.0 startups, Google couldn't exactly just rent computing nodes from EC2 at the time; they had to buy or build them, which costs lots of cash upfront. "Carl, remember that time I promised not to tell mom about who really broke her vase? I'm calling that in now."

    Yeah it does remind me of Bill Gates a lot too. Smart guy, smart decisions, but also a family with wealth and powerful connections

    -end quotes


  • Shane - do you mean he was not a founder of FLCC? Or am I distorting FLCC? Confused.


    More generally - just as he was convinced early treatment with ivermectin saved lives, so he and others may be convinced early treatment with other stuff saves lives.


    The reason for caution is pretty obvious - the things that have been tried in RCTs do not deliver the "it's obvious - let's do it" results that guys like Kory expect.


    But no-one is against early treatment options - if we can find them. It is only in the US that this could become the stuff of talk shows and politics. The reason for special caution with early treatment is that you have to medicate everyone - not just those with severe disease. So a medicine that does even a tiny bit of harm, and no good, kills people. The same ultra-caution over safety applies to vaccines - the earliest type of early treatment. In past epidemics we have rushed to early treatment options that proved (eventually) to have killed more people than they saved.


    Anyway - have you heard of early treatment for the common cold? Or Flu? (I think we have an antiviral for Flu now, it took a long time).


    The TOGETHER trial has been ripped to shreds by just about everyone who internet peer-reviewed it. And the ACTIV trial actually showed some small benefit, even though the initial dose was too small.


    I was exactly correct about trials:

    TOGETHER - negative, though stopped early (clearly they did not like the idea of getting death threats via social media from FLCC promoters if their results were negative), so the tolerance of error +/- allows a small positive.

    ACTIV - negative. The results were within what you expect statistically for something that has no effect. Scientists do not say "that shows a small positive effect"; they say "that shows no effect".

    PRINCIPLE - fairly neutral, or it would have reported by now, but we will know exactly how neutral in a few months.


    I said "at most a small positive effect".


    Advocates of drugs (like FLCC) can always claim higher doses are more effective. The reason for limiting doses is side effects.


    THH


    PS - PRINCIPLE has found budesonide reduces recovery time in the population as a whole. This paper just shows how tough it is to know whether early treatment options do good or harm.


    https://www.thelancet.com/article/S0140-6736(21)01744-X/fulltext


    And it is currently withdrawn as a UK early treatment option - not because UK doctors want to kill people, but because they know that useless treatments still have unwanted side effects and occasional interactions with other medicines:


    GPs told to no longer prescribe budesonide to treat Covid
    The chief medical officers (CMOs) have withdrawn a recommendation for inhaled budesonide as a treatment for Covid.
    www.pulsetoday.co.uk


    So what early treatment options do Shane's internet doctors know about that would improve our non-political treatment in the UK? I am sure the doctors here will listen to any real evidence?

  • LENR is the slow release of energy particles. 'Reactor' is the wrong word; it is the collection of the particles and attaching them to open orbits.

    moving voltage to keep them from bursting

    Yes

    Reactor is not the best word.


    A series of energy sequences in the formation of atomic energy harvesting congregates. Like the dendritic growth through the thin films and obsidian dust on your boat paint/art using a DC battery... experiencing an unexpected ball lightning event...

  • (Embedded video: youtu.be)

  • This time, knowing it is almost the same as the model from the first time but with CCs - honestly, it looks like the instruction anyway...

    If I use the CCs for prior art, no one can patent it, etc.

    Frequencies: Tesla drivers etc. can be picked up cheap but are not right for this... no small Teslas powering a larger pulse... it's more fun just to build it the hard way. "Let time pass."
    Still a bit to do before it looks like anything, which is why I don't see a point in posting the progress; it will make little sense...


    Yes

    Reactor is not the best word.


    A series of energy sequences in the formation of atomic energy harvesting congregates. Like the dendritic growth through the thin films and obsidian dust on your boat paint/art using a DC battery... experiencing an unexpected ball lightning event...

  • 2013


    LENR: The Debutante at the Ball


    The Cold Fusion research of Fleischmann and Pons was an anomaly in and of itself. Two electrochemists, while having a bit of fun with the maximum loading of hydrogen into palladium in an electrolytic cell, ventured into a realm of subatomic phenomena. No one had been there before in quite this way.


    They hazarded to say it was nuclear, and got blasted.


    These two electrochemists had no assistance from other branches of science in trying to figure it out. Nobody came to their assistance. In fact, those who should have joined in this scientific quest ridiculed the pair as charlatans. Instead of helping out these two lone electrochemists with a scientific dilemma, leaders in the nuclear scientific community of the U.S. government-funded Department of Energy (DOE) labs ridiculed them to no end. This left the two fellows to fend for themselves while being kicked out of the tribe, so to speak.


    Luckily, the few scientists who had found positive results during the DOE-sponsored race to replicate the Fleischmann-Pons Effect (FPE) persisted, mostly in obscurity and without funding, in this query of the unknown.


    Cutting-edge experimental science requires patience, honest sharing of data, and evaluation for a continued improvement of the experimenters' ability to enter into an unknown realm, which is to actually observe and record aspects of a difficult-to-create phenomenon and thereby test theory. In this manner, our understanding within the unknown realm grows.


    I publish. You review after working it a bit. Always improving experiments. Together with theorists, we collect data, analyze and implement sound suggestions, always moving forward, advancing the science. Open collaboration quickens this difficult quest into the unknown. Open and enthusiastic collaboration by all branches of the scientific community into the query of the unknown is the basis of good science and is essential for the birth of a new science.


    These early cold fusioneers formed an association of the shunned and published in a few "unrecognized" trade journals which they had to create in order to continue the scientific process in this controversial field. The Internet had appeared before the observed Fleischmann-Pons Effect (FPE), freeing these researchers from the limitations of the printing press.


    The printing press had advanced science simply by causing more researchers to read more researchers' work, which caused a quickening of the scientific process.


    I publish. You review it after working it a bit (through meticulous experimentation and collection of data). Together with theorists, we improve our ability to observe and record phenomenon, improve analysis of data, always moving forward.


    Today’s scientists no longer face the hurdle of a publisher’s peer review to get work printed. If you have fallen into an unknown realm who is your peer? Obviously only those who you find there with you. The Internet allowed the peers of cold fusion research to publish, which is the first step in involving the larger community in your scientific endeavor. Only after publishing can true scientific review begin.


    Many of the established branches of science could have assisted Fleischmann and Pons with a few of their questions. These two were wondering what was actually going on. They were also trying to figure out why, during different runs of their experiments, some cells produced nuclear levels of energy while others did not. None of those in mainstream science helped them to answer any of the questions concerning the new realm they were entrusted with.


    The people who are experts in atomic theory had nothing to add. The people doing high-energy subatomic research at CERN or Lawrence Livermore had nothing to add. Thermoelectric devices are almost like LENR devices, without the hydrogen. Yet the mainstream thermoelectric crowd offered no assistance even though their grandfather, Harold Aspden, had become a godfather to new cold fusion research. Even the emergent semiconductor field could have assisted this new science with their knowledge of dopants and understanding of the adolescent quantum field branch of science.


    None of these folks showed even a bit of healthy scientific interest in this work. Almost all their curiosity evaporated into thin air.


    After the announcement of the birth of cold fusion research, people were thrilled. Then to have virtually all curiosity evaporate within the whole scientific community is an anomaly of such a magnitude that it is hard to comprehend. These lone researchers from a single branch of science, with their Internet printing, were left to care for this newly born area of research by themselves, held separate from the larger scientific community. They were left without communal guidance or assistance in their care of this new unknown scientific field, the infant known as cold fusion research.


    Fleischmann and Pons were just trying to figure it out. Who knows how dirty their electrical currents were? Might there have been harmonic frequencies created upstream of their current supply, caused by any number of other electrical equipment being turned on, or turned off, at the same time? (My TV used to go fuzzy when the neighbor turned on his table saw.)


    These electrical eddy currents could cause one cell to go positive, with nuclear-dense energy being produced, while another, without this added focusing of energetics, would be a dud. Would there be pulsations created simply by a proportional electrical on/off factor, thereby creating superwaves or standing-wave formations? Are influential magnetic moments created within such electron-dense environments? Are harmonic frequencies within the lattice the key?


    What surface topography or nano engineering is required? Are the proper fractal geometries essential for equilateral fusion firing and control throughout the system? Do we need some dopants thrown in? Do we need to get the advanced materials folks engaged in doing some Edisonian style research with every known metal and alloy? Is an unknown source of energetics thrown into the mix, such as dark energy or gravity?


    How might one capitalize on these many components within the atomic and the subatomic realm of the cold fusion nuclear reactive environment? Are angular eddy currents within the electron shell a key? Or specific angular thermal currents? Do subatomic transmutations within the molecular liquid crystal plasma create atomic transmutations, on an atom by atom basis?


    So many questions faced Fleischmann and Pons in their efforts to sustain this child that, unassisted by the larger community, the new science of cold fusion barely survived. Luckily the science did survive, and she is growing up, as we shall see.


    Science has been progressing nicely since the birth announcement of cold fusion research in 1989. Quantum physics and engineering has matured since then. After a battle for acceptance, it is now seen as a branch of science that will advance us beyond our present understanding of known Einsteinian physics. Nano-science has emerged fairly well developed, with exciting possibilities, being fully realized quite quickly.


    Both of these branches of science have been openly courting cold fusion research and standing within the low energy nuclear reaction environment for some time. Once an ugly duckling, now a beautiful swan, LENR Energy is now considered to be exciting and full of potential. Highly energetic with no known faults, LENR Energy is attractive and much sought after.


    LENR Energy Science and Engineering is finding herself best able to thrive as a multi-disciplinary field. LENR is the debutante at the ball. With some really great features: Clean inexpensive energy. Both LENR Electrical and LENR Thermal are embodiments of her grace.


    We would certainly be remiss if we failed to mention the most attractive features. LENR energy transmutes radioactive waste while driving the turbines. My kinda gal. And when she steps onto the dance floor she actually flies, with the grace of a modern spaceplane and the beauty of a Boeing 747.


    My hope is she will capture the attention of the semiconductor and thermoelectric crowd soon. Now that I stop and think on this, they are probably dancing together already. We will soon see.


    Laboratoire de Physique Théorique – Toulouse – UMR 5152

    A gauge theory picture of an exotic transition in a dimer model



    We study a phase transition in a 3D lattice gauge theory, a coarse-grained version of a classical dimer model. The dimer model on a cubic lattice, first studied by F. Alet and collaborators, displays a continuous transition between an ordered...

  • Commercialisation Considerations


    Patent citations as a metric of influence... NASA


    National Bureau of Economic Research NBER

    WORKING PAPER 6044

    DOI 10.3386/w6044

    ISSUE DATE May 1997


    "Evidence from Patents and Patent Citations on the Impact of NASA and Other Federal Labs on Commercial Innovation"

    Source


    Adam B. Jaffe

    Brandeis University and NBER

    Michael S. Fogarty

    Case-Western Reserve University

    Bruce A. Banks

    NASA Lewis Research Center

    May 1997


    This project could not have been completed without generous contributions of time for discussions by individuals at NASA and a number of private firms. We thank, without implicating, Daniel J. Adams, Tim Gurin, Richard Hullihen, Sylvia Kraemer, Norman Smith, Allan Vogele, and Warren W. Wolf. We also benefited from comments at the NBER Productivity Lunch, and from the able research assistance of Margaret Lister Fernando and Chu Chi-Leung. We are grateful for financial support from the Alfred P. Sloan Foundation, via the National Bureau of Economic Research Project on Industrial Technology and Productivity. Bruce A. Banks's participation in this research project was as an individual and not an employee of NASA-Lewis.

    ABSTRACT

    We explore the commercialization of government-generated technology by analyzing patents awarded to the U.S. government and the citations to those patents from subsequent patents. We use information on citations to federal patents in two ways: (1) to compare the average technological impact of NASA patents, "other Federal" patents, and a random sample of all patents using measures of "importance" and "generality"; and (2) to trace the geographic location of commercial development by focusing on the location of inventors who cite NASA and other federal patents. We find, first, that the evidence is consistent with increased effort to commercialize federal lab technology generally and NASA specifically. The data reveal a striking NASA "golden age" during the second half of the 1970s which remains a puzzle. Second, spillovers are concentrated within a federal lab complex of states representing agglomerations of labs and companies. The technology complex links five NASA states through patent citations: California, Texas, Ohio, DC/ Virginia-Maryland, and Alabama. Third, qualitative evidence provides some support for the use of patent citations as proxies for both technological impact and knowledge spillovers.


    Introduction

    Federal research institutions comprise a significant component of the U.S. research infrastructure. The approximately 700 Federal labs are extremely heterogeneous, varying from the large "National" laboratories of the Department of Energy such as Los Alamos and Oak Ridge, to small highly specialized facilities. They include "intramural" facilities that are owned and operated by the Federal government, and also Federally Funded Research and Development Centers ("FFRDCs"), which are operated by a university, a private firm or a non-profit organization but receive all or most of their funding from the Federal Government. Examples of intramural labs include the National Institutes of Health, the National Institute of Standards and Technology (NIST) within the Department of Commerce, and the research centers of the National Aeronautics and Space Administration (NASA). Examples of FFRDCs include the DOE National Labs and the NASA-funded Jet Propulsion Laboratory operated by Cal Tech. In 1995, approximately $25 billion of R&D was performed at federal research institutions, which is about 14 percent of the aggregate U.S. research effort, and about 41 percent of Federal spending on R&D. For comparison, universities, the other major locus of public research, performed about $22 billion of R&D, of which about $13 billion was funded by the Federal government, only 21% of Federal research expenditures (National Science Board, 1996).


    The last 15 years have seen increasing policy and academic interest in the role of Federal labs in commercial innovation. Beginning with the Stevenson-Wydler Technology Innovation Act in 1980, and continuing with the Federal Technology Transfer Act of 1986, the National Competitiveness Technology Transfer Act of 1989, and the Defense Conversion, Reinvestment, and Transition Assistance Act of 1992, Congress has implemented statutory changes explicitly designed to foster the transfer of technology from the public sector into the private sector. The biannual Science and Engineering Indicators now publishes a wealth of statistics documenting technology transfer activities at the labs. There have been several studies by the Congressional General Accounting Office of such activities.


    This paper analyzes patents awarded to the U.S. government, and the citations received by those patents from subsequent patents taken out primarily by private firms. The citation analysis allows us to examine two aspects of the production of commercially relevant technology by the federal government. First, we use the frequency and diversity of citations received to measure the "importance" and "generality" of the inventions that are emanating from federal laboratories. We examine how these have changed over time. Second, we examine the geographic locus of citing patents to infer where the subsequent commercial development of technologies that originate at the labs takes place. We focus particularly on patents awarded to NASA, but also examine patents awarded to other federal agencies.


    There is a large and growing literature on the use of patents and patent citations to infer the technological output and impact of different institutions. As discussed further below, there is evidence confirming the validity of patent citations as a measure of technological impact. There is not, however, a clear understanding of the exact nature of the relationship between citations and technology flows. For this reason, in addition to the quantitative analysis, we undertake a detailed qualitative analysis of the citations to a small number of NASA patents in order to gain a richer understanding of the inferences that can be drawn about technological impact and its geographic distribution by examining citations data.
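
    As a side note on the citation metrics mentioned above: one common citation-based formulation (in the spirit of Trajtenberg/Jaffe-style patent analysis, and only an assumed reading of how "importance" and "generality" might be computed) proxies importance by citation counts and generality by one minus the Herfindahl index of the citing patents' technology classes. A minimal sketch in Python, with hypothetical class codes:

    from collections import Counter

    def generality(citing_patent_classes):
        """1 - Herfindahl index of the technology classes of citing patents."""
        counts = Counter(citing_patent_classes)
        total = sum(counts.values())
        herfindahl = sum((n / total) ** 2 for n in counts.values())
        return 1 - herfindahl   # near 0 = cited by one field; near 1 = cited broadly

    # Hypothetical citing-patent classes for a single NASA patent:
    citing = ["H01M", "H01M", "C25B", "G21B", "B82Y"]
    print("importance proxy (citation count):", len(citing))
    print("generality:", round(generality(citing), 2))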

  • Commercialisation Considerations


    "There is no single, obvious way to measure the success of tech transfer that everyone has somehow been missing. Metrics themselves should be seen as experimental, and their impact needs to be monitored. At the same time, metrics should not be altered lightly because stability is needed to make comparisons over time.”

    RFI Response, Massachusetts Institute of Technology


    CMNS Energy technology transfer and IP policy study is of interest. This recent publication provides insight. - gbgoble


    Iman Hemmatian, Todd A. Ponzio, Amol M. Joshi

    Published: May 24, 2022


    Exploring the role of R&D collaborations and non-patent IP policies in government technology transfer performance: Evidence from U.S. federal agencies (1999–2016)


    Abstract

    Around the world, governments make substantial investments in public sector research and development (R&D) entities and activities to generate major scientific and technical advances that may catalyze long-term economic growth. Institutions ranging from the Chinese Academy of Sciences to the French National Centre for Scientific Research to the Helmholtz Association of German Research Centers conduct basic and applied R&D to create commercially valuable knowledge that supports the innovation goals of their respective government sponsors. Globally, the single largest public sector R&D sponsor is the U.S. federal government. In 2019 alone, the U.S. government allocated over $14.9 billion to federally funded research and development centers (FFRDCs), also known as national labs. However, little is known about how federal agencies’ utilization of FFRDCs, their modes of R&D collaboration, and their adoption of non-patent intellectual property (IP) policies (copyright protection and materials transfer agreements) affect agency-level performance in technology transfer. In particular, the lack of standardized metrics for quantitatively evaluating government entities’ effectiveness in managing innovation is a critical unresolved issue. We address this issue by conducting exploratory empirical analyses of federal agencies’ innovation management activities using both supply-side (filing ratio, transfer rate, and licensing success rate) and demand-side (licensing income and portfolio exclusivity) outcome metrics. We find economically significant effects of external R&D collaborations and non-patent IP policies on the technology transfer performance of 10 major federal executive branch agencies (fiscal years 1999–2016). We discuss the scholarly, managerial, and policy implications for ongoing and future evaluations of technology transfer at federal labs. We offer new insights and guidance on how critical differences in federal agencies’ interpretation and implementation of their R&D management practices in pursuit of their respective missions affect their technology transfer performance outcomes. We generalize key findings to address the broader innovation processes of public sector R&D entities worldwide.


    Citation: Hemmatian I, Ponzio TA, Joshi AM (2022) Exploring the role of R&D collaborations and non-patent IP policies in government technology transfer performance: Evidence from U.S. federal agencies (1999–2016). PLoS ONE 17(5): e0268828. https://doi.org/10.1371/journal.pone.0268828

    Editor: Antonio Rodriguez Andres, German University in Cairo, CZECH REPUBLIC

    Received: October 1, 2021; Accepted: May 10, 2022; Published: May 24, 2022

    Copyright: © 2022 Hemmatian et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

    Data Availability: All data underlying the findings in our manuscript are available at Harvard Dataverse: https://doi.org/10.7910/DVN/DNUFWR.

    Funding: A.J. This research was supported by the Ewing Marion Kauffman Foundation under a Kauffman Junior Faculty Fellowship grant. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

    Competing interests: The authors have declared that no competing interests exist.

    Introduction

    As highlighted in a recent study by the World Intellectual Property Organization (WIPO), governments in many countries significantly expanded and accelerated their investments in research and development (R&D) activities as part of their policy responses to the 2009 financial crisis and the 2020 coronavirus outbreak [1]. For example, China is focused on building critical nationwide infrastructure via the construction of advanced data centers, 5G wireless networks, and new energy vehicles [1, 2]. These initiatives are driven primarily by greater public sector investment in the Chinese Academy of Sciences, which in 2018 announced plans to rapidly grow its number of national labs from 200 to 700 by 2020, with a fully operational key lab system expected to be completed by 2025. In another example, specifically for combating the coronavirus, France pledged 5 billion euros in R&D spending, which represents a 25% increase over its original R&D budget for 2020. This effort is led by the 10 research institutes that make up the French National Centre for Scientific Research (CNRS) and receive 80% of all public R&D funds allocated by the French government [3, 4]. In a similar effort within Europe, Germany’s second stimulus package, which targets COVID-19 recovery, features 50 billion euros of R&D investments in a wide array of future-focused technologies. These funds are directed towards R&D projects conducted by three distinct networks of federal- and state-sponsored labs, which include the Helmholtz Association of German Research Centers, the Max Planck Institutes, and the Fraunhofer Institutes [5, 6]. In other countries such as Turkey [7], India [8], and Israel [9], governments initiated similar programs to promote technology commercialization and spark growth. Despite the considerable differences in political systems and economic priorities across China, France, Germany, Israel, Turkey, and India, what their respective public sector R&D entities all have in common are clearly defined government mandates to pursue scientific and technical breakthroughs that may fuel long-term growth, prosperity, and security.


    In line with its counterparts in the aforementioned countries, but on an even broader scale, the single largest public sector R&D sponsor in the world is the U.S. federal government, which has a similar pro-growth mandate to drive scientific discovery, develop new knowledge, promote technical standards, and generate useful innovations. For instance, in 2019 alone, the U.S. government funded an estimated total of $141.5 billion in R&D expenditures [10], which “plays an irreplaceable role in directing technology toward more general and active domains” [11, 12]. Approximately 27% or $39.6 billion of this total is intramural R&D conducted internally by federal agencies, while the bulk of this funding, 73% or $101.9 billion, is allocated to R&D conducted externally by for-profit corporations and nonprofit organizations. Within the extramural R&D allocation, industry represents $43.6 billion, universities receive $33.4 billion, and contractor-operated federally funded research and development centers (FFRDCs, many of which are called ‘national labs’ or ‘federal labs’) account for $14.9 billion. Although there is an established stream of prior research on university-industry technology transfer [13–20], far less is known about technology transfer at national/federal labs and non-university research institutes. Despite governments’ consistently large and increasing budget allocations to public sector R&D entities and activities within their respective countries, there appears to be inconsistent and limited use of quantitative metrics for measuring performance outcomes related to creating and commercializing new technologies.


    The dearth of research in this area is surprising because national labs are an essential component of the core systems of innovation in the U.S. and around the world [21, 22]. We aim to extend prior research in a new direction by exploring how government-industry technology transfer at national/federal facilities may differ from university-industry technology transfer along critical dimensions, especially in terms of the identification, adoption, and usage of appropriate performance metrics.


    Indeed, an evaluation of institutional policies and practices at these R&D facilities may yield new managerial and theoretical insights for improving the effectiveness of existing government-supported technology transfer processes.


    Our study investigates the following research question: How do differences in R&D policy implementation across federal agencies affect their technology transfer activities and performance?


    We believe that obtaining empirical evidence to answer this question is timely, relevant, and strategically important as governments around the world continue to expand the scale and scope of their funding for public sector R&D. The central premise of our study is that two main elements of federal agencies’ varying approaches to innovation management directly influence technology transfer performance:


    (1) their engagement in external R&D collaborations, through formal or informal partnership agreements; and


    (2) their adoption of policies for sharing non-patented intellectual property (IP).


    Although the importance of participating in external R&D collaborations and adopting policies for handling IP are routinely incorporated into existing research on university-industry transfer [23, 24], these factors are not yet systematically integrated into the emerging stream of research on government-industry technology transfer [25]. Our study aims to contribute to the nascent literature on federal technology transfer by providing a conceptual framework, expanded metrics to measure successful technology transfer, and fresh empirical evidence to guide scholars, managers, and policymakers in their evaluation of R&D commercialization processes.


    Our study differs from previous studies in two critical ways. First, unlike prior research on government and academic technology transfer that focuses primarily on protecting proprietary technologies through patenting activities [26, 27], we examine the importance of external R&D collaborations and sharing proprietary technologies through non-patent IP policies such as copyrights and materials transfer agreements (MTAs). Second, in contrast to the emerging set of studies on federal technology transfer that use only agency-driven supply-side metrics to evaluate agency performance [28], we introduce customer-driven demand-side metrics and integrate both types of measures into our empirical analyses.


    For example, beyond the traditional supply-side metrics of filing ratio, transfer rate, and licensing success rate [29, 30] that capture a producer’s ability to push technologies into the commercial marketplace, we use the demand-side metrics of licensing income and portfolio exclusivity that capture a customer’s willingness to pull technologies out of government labs [31].


    By incorporating external R&D collaborations and non-patent IP policies as predictors and demand-side metrics as outcomes in our models, we seek to provide a more holistic picture of federal technology transfer performance at the agency level.


    We organize our study by first explaining the historical context of key legislative acts and proposing a conceptual framework. We then conduct a set of exploratory analyses that offer initial empirical evidence for the effects of external R&D collaborations and non-patent IP policies on the technology transfer performance of 10 major federal executive branch agencies (fiscal years 1999–2016). Overall, when we specifically examine agencies’ use of formal agreements for partnerships, we find a positive and significant relationship between this type of external R&D collaboration and all of our supply-side metrics (filing ratio, transfer rate, and licensing success rate), as well as portfolio exclusivity on the demand-side. In contrast, agencies’ use of other types of customized and informal external R&D collaborations appears to be associated with two main effects on the demand-side: (1) a significant decrease in licensing income; and (2) a simultaneous and somewhat surprising corresponding increase in portfolio exclusivity. We find evidence that agencies with a greater utilization of FFRDCs have lower agency-driven technology transfer performance in terms of supply-side metrics. On the demand-side, we find that greater FFRDC utilization is associated with greater licensing income and lower portfolio exclusivity.


    We also find the adoption of non-patent IP policies to be associated with substantial and economically significant shifts in the supply-side and demand-side metrics. While the supply-side effects are similar for copyright agreements and MTAs, the demand-side effects are different. External R&D collaborations appear to further amplify these observed effects. In sum, our findings indicate that federal technology managers must carefully consider the combined effects of external R&D collaboration and non-patent IP policies when formulating their respective agencies’ technology transfer plans and programs. Based on these findings, we discuss the scholarly, managerial, and policy implications for ongoing and future evaluations of federal agencies’ technology transfer performance. Beyond U.S. federal agencies, we also consider how our key findings may inform possible innovation process improvements and policy reforms for public sector R&D entities and activities in other countries and contexts.

    Historical context and conceptual framework

    Historical context

    The Bayh-Dole Act of 1980 (Bayh-Dole), the Stevenson-Wydler Technology Innovation Act of 1980 (Stevenson-Wydler), and the Federal Technology Transfer Act of 1986 (FTTA of 1986) are the cornerstones of the legal foundations for federal agencies’ interpretation and implementation of their R&D management practices and technology transfer activities in pursuit of their respective missions [32]. Bayh-Dole allows non-profits and small businesses to keep title to inventions made using federal government funding. Enacted in December of 1980, Bayh-Dole—which made changes to U.S. Code (USC) title 35, i.e., the “Patents” chapter—also explicitly authorized federal agencies to “grant exclusive or partially-exclusive licenses to patents, patent applications, or other forms of protection obtained” (Codified as amended at 35 USC 200 et seq.). The importance of Bayh-Dole (PL 96–517) on innovation policy is well-documented in the existing literature and is prominently featured in numerous studies of federal technology transfer, academic entrepreneurship, and the commercialization of university-owned patents [33–36].


    Prior to Bayh-Dole, the granting of an exclusive license was a lengthy and cumbersome endeavor, with specific requirements differing by the agency. In practice, exclusive licenses were almost never granted [37]. The elaborate process for the U.S. Navy, considered among the more forward-thinking agencies at the time [38], involved advertising the patent in three different publications (the U.S. Patent Office, the Federal Register, and at least one other publication of choice) for a period of at least six months, followed by another 60 day public notice of a prospective exclusive license [39]. Given the lengthy processes and bureaucratic hurdles, agencies would typically grant only non-exclusive licenses, which made a prospective licensee’s business decision of investing in federally-owned patented technologies far riskier. As a result, the government appeared to be hoarding around 30,000 unlicensed patents, and potentially useful and valuable new technologies were not being commercialized. Hence, prior to the enactment of Bayh-Dole, if one looked at patent licensing activity, federal technology transfer appeared to be at a complete standstill. For example, in 1976 alone, only about 150 patents were licensed by all federal agencies from over 2,000 issued patents [39]. With such a small fraction of patents actually being licensed for commercial use, proposed reforms recommended offering a degree of exclusivity in licensing through a unified and more streamlined process to accelerate the commercialization of unlicensed inventions [40, 41]. By explicitly authorizing federal agencies to exclusively license inventions, Bayh-Dole aimed to address this issue.

    Although it is less well-known than Bayh-Dole, an equally important piece of legislation is Stevenson-Wydler (PL 96–480), which was enacted fifty-two days earlier and signed into law, also by President Carter. This law made changes to title 15, the “Commerce and Trade” chapter, and was focused on using the federal labs’ R&D capabilities and resulting technologies to more directly benefit citizens of the U.S. by moving those technologies to the private sector (Codified as amended at 15 USC 3701 et seq.). Of the two laws, Stevenson-Wydler is the only one to explicitly mention “technology transfer” and make technology transfer a codified mission of the federal labs by requiring each agency “strive where appropriate to transfer federally owned or originated technology to State and local governments and to the private sector [42].”


    Agencies were legally bound to establish an Office of Research and Technology Applications (ORTA) at each lab, staff the office with at least one full-time professional, and devote not less than 0.5% of the agency’s R&D budget to support the technology transfer function. A waiver for the 0.5% budgetary requirement was built into the law, and essentially all of the agencies requested waivers; the requirement was dropped when the law was amended in 1986 [43–45]. From the beginning, the implementation of technology transfer legislation at federal agencies and national labs had its supporters [46, 47] and skeptics [48]. Part of the ongoing debate about the effectiveness of these policies and laws arose from the lack of a uniform, standardized set of metrics for consistently monitoring and measuring technology transfer performance across agencies, labs, and teams:

    Quote
    “There is no single, obvious way to measure the success of tech transfer that everyone has somehow been missing. Metrics themselves should be seen as experimental, and their impact needs to be monitored. At the same time, metrics should not be altered lightly because stability is needed to make comparisons over time.
         RFI Response, Massachusetts Institute of Technology [49]

    The enactment of Stevenson-Wydler required federal agencies to incorporate technology transfer activities directly into their respective missions and to allocate dedicated resources but did not adequately define appropriate metrics for measuring these activities across agencies. In the 1980s, several academic studies indicating that the U.S. faced a risky competitive decline in innovation, along with the growing recognition that many national labs had specialized facilities and equipment that could be leveraged to support innovation more broadly, prompted renewed attention on federal technology transfer [50–52]. In response, Congress passed the FTTA of 1986. Signed into law by President Reagan, the legislation amended Stevenson-Wydler in a number of noteworthy ways, including moving licensing from being handled centrally by agency managers to being handled by the inventing lab, establishing a Federal Laboratory Consortium, and most importantly, authorizing the use of Cooperative Research and Development Agreements (CRADAs) by the labs.

    A CRADA is a contractual agreement between one or more federal laboratories and one or more non-federal entities “under which the Government, through its laboratories, provides personnel, services, facilities, equipment, intellectual property, or other resources with or without reimbursement (but not funds to non-Federal parties) and the non-Federal parties provide funds, personnel, services, facilities, equipment, intellectual property, or other resources toward the conduct of specified research or development efforts which are consistent with the missions of the laboratory” (15 USC 3710a). Shortly after the law came into effect, CRADAs experienced explosive growth and rapidly came to dominate the formal channels of federal technology transfer [22, 53–55].

    In sum, the enactment of Bayh-Dole, Stevenson-Wydler, and the FTTA of 1986 established some of the guiding principles that remain influential today in shaping how federal agencies’ employees manage R&D activities. All three pieces of legislation address key issues regarding the ownership, transfer, and sharing of knowledge generated by federal labs. Bayh-Dole emphasized the importance of licensing out government-owned inventions. Stevenson-Wydler established the requirement that federal agencies explicitly incorporate technology transfer into their missions. The FTTA of 1986 broadly authorized resource-leveraging agreements in the form of CRADAs to facilitate collaboration with private sector partners. These laws collectively provide broad authorization and guidance to agencies conducting R&D on how to execute their mission with a focus on technology transfer.
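
    As a rough illustration of the kind of agency-level metrics named in this study (filing ratio, transfer rate, licensing success rate, licensing income, portfolio exclusivity), here is a minimal Python sketch. The exact definitions are the paper's; the formulas and numbers below are assumptions for illustration only.

    agency = {  # invented figures for one hypothetical agency-year
        "invention_disclosures":   400,
        "patent_applications":     220,
        "patents_issued":          150,
        "licenses_executed":        60,
        "income_bearing_licenses":  35,
        "exclusive_licenses":       12,
        "licensing_income_usd": 4_500_000,
    }

    # Supply-side metrics (producer's ability to push technology out):
    filing_ratio      = agency["patent_applications"] / agency["invention_disclosures"]
    transfer_rate     = agency["licenses_executed"] / agency["patents_issued"]
    licensing_success = agency["income_bearing_licenses"] / agency["licenses_executed"]

    # Demand-side metrics (customers' willingness to pull technology out):
    portfolio_exclusivity = agency["exclusive_licenses"] / agency["licenses_executed"]
    licensing_income      = agency["licensing_income_usd"]

    print(f"filing ratio {filing_ratio:.2f}, transfer rate {transfer_rate:.2f}, "
          f"licensing success {licensing_success:.2f}")
    print(f"portfolio exclusivity {portfolio_exclusivity:.2f}, income ${licensing_income:,}")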

  • Studying this and all its references...

    So much to learn.


    The importance of open innovation

    CRADAs were authorized under the FTTA of 1986. The importance of CRADAs has endured and grown, and there continue to be far more CRADAs (or similar partnership agreements) than licenses executed by the individual laboratories and across all agencies [29]. However, even as their importance (by the numbers) far outweighs that of licenses, the emphasis on licensing lingers, and a recent report by the Government Accountability Office attempted to prescribe actions to increase licensing activity [108].

    Several years prior to the enactment of the TTCA of 2000 and the required annual agency reports, a survey of industry leaders was conducted to ascertain what, in fact, industry wanted from the federal labs. The thinking was that rather than asking policymakers and academics for answers, it would make sense to seek real-world perspectives from the very people responsible for conducting the practical commercial application side of technology transfer: industry partners. Industry was found to view the very outputs Congress focuses on as actually providing minimal value, with more value stemming from contract and cooperative R&D activities, as well as idea transfer vs. technology transfer per se [31]. In this regard, industry leaders saw the federal labs’ contribution to collaborative, multidisciplinary R&D as potentially helpful, articulating federal labs’ promise within the context of an ‘open innovation’ system several years before the term took root [109]. While industry prized the cooperative activities above licensing or IP, the absence of a CRADA metric in the reporting statute is an early example of how empirically derived information may not appreciably affect policy in this realm [54, 110]. Our model suggests that industry input has been proven correct, in that there is a strong association between CRADAs and classical measures of success.

    For example, if industry does indeed prize open innovation arrangements with federal labs, it would follow that those agencies that practice more open innovation should have more opportunity to be involved in the development of commercially-relevant licensable IP. Specifically, those agencies that engage in more CRADAs would be expected to develop more licensable IP, including patents, resulting in a higher transfer rate and licensing success rate [29, 30], not to mention licensing income. Broadly speaking, our empirical results suggest that this is indeed the case. If we normalize traditional CRADAs to the R&D budget, agencies can be ranked by normalized cost/CRADA (Table 6). Agencies with a lower cost/CRADA have a higher open innovation or ‘R&D Collaboration’ rank. Three of the top agencies (EPA, USDA, and DOE) are also among the top agencies receiving licensing income normalized to R&D budget, and three (VA, DOC, and EPA) are top performers for transfer rate.

    It might be expected that agencies engaged in open innovation would be exposed to more market demand for those same technologies. For example, the VA has engaged in the largest number of CRADAs as normalized to its R&D budget over the 18-year period, yet it had low-mid range normalized license royalties. One explanation can be found in the CRADA statute itself, which provides a notable carrot for industry partners:

    Quote
    “The laboratory shall ensure, through such agreement, that the collaborating party has the option to choose an exclusive license for a pre-negotiated field of use for any such invention under the agreement or, if there is more than one collaborating party, that the collaborating parties are offered the option to hold licensing rights that collectively encompass the rights that would be held under such an exclusive license by one party. (See 15 USC 3710a(b)(1))

    This required provision essentially removes the risk of market competition for inventions developed under a CRADA. While those agencies that execute more CRADAs may have higher performance metrics such as higher transfer rates and higher licensing success rates, the actual licensing income is not significantly affected. The association between a higher number of traditional CRADAs and a higher portfolio exclusivity rate corroborates this explanation. Specifically, for every 10-fold increase in traditional CRADAs, there appears to be an 8% increase in exclusivity (see Table 4). Those are likely to be inventions developed under a CRADA and subject to the above provision.
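
    A small sketch of the two calculations described above, with invented figures: ranking agencies by R&D budget per traditional CRADA ("cost/CRADA", where a lower figure suggests more open collaboration), and reading a coefficient on log10(CRADAs) as roughly an 8% rise in exclusivity per 10-fold increase in CRADAs. Agency names and numbers are placeholders, not Table 6.

    agencies = {            # R&D budget in $M, traditional CRADAs
        "Agency A": (1_200, 300),
        "Agency B": (9_000, 450),
        "Agency C": (2_500,  50),
    }

    # Lower cost per CRADA = higher "R&D collaboration" (open innovation) rank.
    for name, (budget, cradas) in sorted(agencies.items(),
                                         key=lambda kv: kv[1][0] / kv[1][1]):
        print(f"{name}: ${budget / cradas:,.1f}M of R&D budget per CRADA")

    beta = 0.08   # assumed coefficient on log10(CRADAs) in an exclusivity model
    print(f"A 10-fold increase in CRADAs -> roughly {beta:+.0%} exclusivity")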


    Also

    Hmmmmmm 🤔 of interest.

    ....last paragraph in the chapter

    Generalizing Key Findings


    For example, proposals based on expectations that blockchain technology may facilitate a practical approach to knowledge management of these forms of IP, and thereby enhance their commercialization potential, are being developed and disseminated [138, 139]. In another example that is especially pertinent to the outbreak and aftermath of the coronavirus, there is renewed interest among national governments and transnational actors such as the World Health Organization to facilitate greater pooling of cohort data from clinical trials, foster more open sharing of biological materials, and encourage the voluntary waiving of certain IP rights to accelerate medical R&D for combating COVID-19 [140–142]. Hence, new forms of CRADAs and MTAs may emerge in the near future, which only underscores the importance of identifying valid and meaningful ways to measure technology transfer activities and performance over time.

    Limitations and future directions
