Some Recent Pubs

April 14, 2020:  I just published a review in BioScience of a good collection of papers on scientific communication:  The Duty to Communicate.  The book reviewed is Susanna Priest, Jean Goodwin, and Michael F. Dahlstrom (eds.), Ethics and Practice in Science Communication (U Chicago Press, 2018). 

Last summer (2019), Michael Byers and I posted an op-ed on the Fermi Paradox in the Bulletin of the Atomic Scientists:  "Did climate change destroy the aliens?"

Here is a new collection of my papers on quantum mechanics, philosophy of time, and (a bit of) quantum logic:  Quantum Heresies (College Publications, 2018).  This volume comprises most of my previously published papers in foundations of physics (some corrected or updated), together with a new Introduction, one new paper, and an Envoi ("The Work to be Done"). 

And here are some recent papers, presented as pre- or postprints since the published versions are mostly behind paywalls:

"As Much as Possible, As Soon As Possible:  Getting Negative About Emissions"

A study of the problem of so-called "negative emissions": the process of (it is hoped) extracting large amounts of carbon dioxide from the atmosphere.  I present the prima facie scientific case that seems to demand the use of negative emissions, and explore a possible solution to "mitigation deferral", the worry that the presumptive promise of negative emissions in the future could be used as yet another excuse to defer emission reductions now.  

"A Different Kind of Rigor:  What Climate Scientists Can Learn From Emergency Room Doctors". 

I explain the relevance of professional ethics to the problem of "scientific reticence" identified by James Hansen.

"Fermi and Lotka:  The Long Odds of Survival in a Dangerous Universe".

I explore the notion that the number of advanced species in the universe may be controlled by a power law (such as Lotka's Law), which would imply far fewer such species than optimistic estimates would indicate. 
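(For readers who haven't met it, Lotka's Law is a simple power law.  A rough statement, mine rather than the paper's, is

    f(n) = C / n^a,  with a ≈ 2,

where f(n) counts the number of "producers" operating at productivity level n.  On such a distribution the count falls off steeply: doubling n roughly quarters f(n), so the most advanced cases are correspondingly rare.)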

"Energy, Complexity, and the Singularity."

This paper explores the relevance of ecological limitations such as climate change and resource exhaustion to the possibility of a technologically mediated “intelligence explosion” in the near future.  The imminent risks of global carbonization and loss of biodiversity, as well as the dependency of technological development on a healthy biosphere, are greatly underestimated by singularity theorists such as Ray Kurzweil.  While development of information technology should continue, we cannot rely on hypothetical advances in AI to get us out of our present ecological bottleneck.  Rather, we should do everything we can to foster human ingenuity, the one factor with a record of generating the game-changing innovations our species has relied upon to overcome past survival challenges. 

I'll post more as soon as I have time to massage the preprints!

May 28, 2021