
NNadir

Profile Information

Gender: Male
Current location: New Jersey
Member since: 2002
Number of posts: 25,931

Journal Archives

Nature: "Current models of climate economics assume that lives in the future are less important...

than lives today, a value judgement that is rarely scrutinized and difficult to defend..."

This language comes from a news feature "focus" article from Nature, featured on the issue's cover: Nature, Vol. 530, Iss. 7591, pg. 397 (2016)

The issue, at least in its news and viewpoint sections, is devoted to how scientists should reflect on the ways their work will affect future generations.

One "news" article asks the question, "Should parents edit their children's genes." Nature 530, 402–405 (25 February 2016) It now seems perfectly technologically feasible to do so, owing to the invention of CRISPR-Cas, a technique using complementary genetic material to carry a protein which is a nuclease, designed to clip sections of DNA enabling the insertion of other genes.

This has very high potential to edit the genome in a facile and efficient way, not only in humans, but in practically every other higher species on the planet. Ultimately it is a technology by which humanity could, were it so inclined, design its own ecosystem and all of the creatures in it.

Had this technology been fully developed when the embryo that ultimately became me was conceived, my parents might have considered snipping out and replacing the gene for type II diabetes, which I apparently carry. Would I be me? Would I know that I wasn't me? Would I care?

My son, who was just admitted to a fairly prestigious art school, is dyslexic, a condition generally associated with chromosome 18. Would I have been wise or foolish to edit it?

Of course, the implications go way beyond any particular individual, myself included. These are not easy questions to answer.

(One of the two independent discoverers of CRISPR-Cas, Jennifer Doudna, wrote a wonderful rumination a few issues back, also in Nature, on how ill-equipped she was to deal with the ethical implications of her work, the emergence of which surprised her and got her thinking in new ways: Genome-editing revolution: My whirlwind year with CRISPR (Nature 528, 469–471 (24 December 2015)).)

One of the articles in the current issue also features a rumination on the environmental issue before us, climate change. An economist, Nicholas Stern, authored an article titled "Current climate models are grossly misleading." The point here is that the 2[sup]o[/sup]C increase that climate models discuss is a global average; the local economic effects can hardly be expected to be the same everywhere. The author writes:

Current economic models tend to underestimate seriously both the potential impacts of dangerous climate change and the wider benefits of a transition to low-carbon growth. There is an urgent need for a new generation of models that give a more accurate picture.

Dark impacts

...The Fifth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC), published in 2013 and 2014, provided a comprehensive overview of the literature on the costs of action and inaction. But the assessment understated the limitations of the research done so far. Essentially, it reported on a body of literature that had systematically and grossly underestimated the risks of unmanaged climate change. Furthermore, that literature had failed to capture the learning processes and economies of scale involved in radical structural and technical change, and the benefits of reducing fossil-fuel pollution, protecting biodiversity and forests, and so on...


An article with a larger physical science focus was published a few weeks ago:

Allowable CO2 emissions based on regional and impact-related climate targets (Nature 529, 477–483 (28 January 2016))

The authors show that a 2[sup]o[/sup]C "average" temperature increase in the climate is dominated by the relatively mild changes over the oceans; elsewhere the impacts will be most extreme.

The following graphic demonstrates this:

[Figure: maps of regional temperature changes associated with a 2[sup]o[/sup]C global average increase, from the paper.]
Here's another plot from their paper:

[Figure: cumulative CO2 emissions versus the global temperature response, from the paper.]
The authors write:

This figure is compelling because it shows a clear linear relationship between cumulative CO2 emissions and a measure of the global climate response. The obvious consequences are (1) that every tonne of CO2 contributes about the same amount of global warming no matter when it is emitted, (2) that any target for the stabilization of ΔT[sub]glob[/sub] implies a finite CO2 budget or quota that can be emitted, and (3) that global net emissions at some point need to be zero[sup]2–6[/sup].
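
Consequence (2) is easy to make concrete with back-of-envelope arithmetic. Here is a minimal sketch; the TCRE value and the cumulative-emissions figure in it are round numbers I am assuming for illustration, not values taken from the paper:

```python
# Back-of-envelope budget arithmetic for consequence (2) above.
# ASSUMPTIONS (mine, not the paper's): a transient climate response to
# cumulative emissions (TCRE) of ~0.45 degC per 1000 Gt CO2, and roughly
# 2000 Gt CO2 already emitted since pre-industrial times.

TCRE = 0.45 / 1000.0       # degC of warming per Gt of cumulative CO2
TARGET = 2.0               # stabilization target, degC
ALREADY_EMITTED = 2000.0   # Gt CO2, rough historical total (assumed)

total_budget = TARGET / TCRE                 # total allowable Gt CO2
remaining = total_budget - ALREADY_EMITTED   # what is left of the quota

print(f"Total budget for {TARGET} degC : {total_budget:,.0f} Gt CO2")
print(f"Remaining budget              : {remaining:,.0f} Gt CO2")
print(f"Years left at ~36 Gt CO2/year : {remaining / 36.0:.0f}")
```

The linearity is the whole point: the budget depends only on the cumulative total, not on when the emissions occur.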


"Every tonne contributes the same amount of global warming no matter when its emitted."

This includes tonnes emitted when the wind isn't blowing and the sun isn't shining. We may think we're doing something by mouthing mindless platitudes about how great wind and solar and other forms of so-called "renewable energy" are, but we are lying to ourselves.

What we are doing isn't working; it isn't working at all.

2016 has been an unprecedented year, with the weekly data routinely coming in more than 3 ppm higher than the same week the year before. The week of February 21, 2016 was 3.33 ppm higher than the corresponding week of 2015.

I don't think we'll find the wherewithal to stop at 2[sup]o[/sup]C. It's going to be much worse.

Have a nice week.

I'm so embarrassed. My governor endorsed a raging racist to be President of the United States.

The State of New Jersey is, I think, one of the best places in the world to live, but somehow we have a problem electing decent Governors.

Now we have the height of obscenity. Our useless Governor, Chris Christie, a blubbering incompetent buffoon, announced that he supports a freak racist.

I'm so embarrassed.

It's looking very bad these last few weeks at the Mauna Loa carbon dioxide observatory.

At the Mauna Loa carbon dioxide observatory website, they have a data page which compares the averages for each week of the year with the same week of the previous year.

The data goes back to 1974, and comprises 2,090 data points.

I import this data into a spreadsheet I maintain each week, and calculate the weekly increases over the previous year. I rank these increases from worst to best. The worst data point, 4.67 ppm over the previous year, was recorded during the week ending September 6, 1998, when much of the rain forest of Southeast Asia was burning after fires set to clear the forests for palm oil plantations got out of control during unusually dry weather. Six of the worst data points ever recorded occurred in 1998 during this event; another was recorded in the January following that event.

Of the twenty worst data points out of those 2,090, two have occurred in the last four weeks. The week ending January 31, 2016 produced an increase of 4.35 ppm. The week just past, ending February 14, 2016, produced an increase of 3.79 ppm, tying it with the aforementioned week in January 1999, that ending on January 24, 1999, and with the week ending January 2, 2011.

Of the twenty highest points recorded, 9 have occurred in the last 5 years, 10 in the last 10 years.

The week ending February 7, 2016 was, until today's data was published, the 20th of the top 20; it has now been pushed out and ranks 21st worst.
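
For anyone who wants to check this sort of ranking without building a spreadsheet, here is a minimal Python sketch. It assumes the layout of NOAA's weekly Mauna Loa text file (comment lines beginning with '#', whitespace-separated columns including a decimal date and the weekly mean in ppm, with missing weeks flagged by negative values); verify the column order against the header of the file you actually download.

```python
# A sketch of the spreadsheet calculation described above: year-over-year
# increases for each weekly Mauna Loa CO2 mean, ranked worst-first.
# ASSUMED file layout (check the header of your download): '#' comment
# lines, then columns with year, month, day, decimal date, CO2 ppm;
# missing weeks are flagged with negative values.

rows = []  # (decimal_date, ppm)
with open("co2_weekly_mlo.txt") as f:
    for line in f:
        if line.startswith("#"):
            continue
        parts = line.split()
        dec_date, ppm = float(parts[3]), float(parts[4])
        if ppm > 0:                      # skip missing-data flags
            rows.append((dec_date, ppm))

increases = []
for i in range(52, len(rows)):
    dec_date, ppm = rows[i]
    prev_date, prev_ppm = rows[i - 52]   # the entry ~52 weeks earlier
    if abs((dec_date - prev_date) - 1.0) < 0.02:   # guard against gaps
        increases.append((ppm - prev_ppm, dec_date))

increases.sort(reverse=True)             # largest increases first
for delta, dec_date in increases[:20]:
    print(f"week at {dec_date:.3f}: +{delta:.2f} ppm over the previous year")
```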

I also keep a record of the monthly data that is similar to that for the weekly data. This data, unlike the weekly data, goes back to 1958.

November of 2015 was the second worst November ever recorded, at 3.08 ppm over the previous November; December of 2015 was the worst December ever recorded, at 3.07 ppm over the previous December; and January of 2016 was the 4th worst January ever observed, at 2.56 ppm over the previous January.

The observatory is still evaluating the final results for 2015; the evaluation involves a running average from November through February compared with the data of the previous year. A few weeks ago the preliminary data suggested that 2015 was the worst year ever observed; the data today indicates that it is actually a few hundredths of a ppm (a few hundred-millionths of a part) behind 1998.
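
As I read the observatory's description, the comparison works something like the following sketch; the monthly numbers in it are made-up placeholders, not the observatory's data, and the exact averaging window is my paraphrase:

```python
# Sketch of the running-average comparison described above (my paraphrase
# of the method, not NOAA's code): average the monthly means from November
# through the following February, then subtract the same average taken one
# year earlier.

def nov_through_feb_mean(monthly, year):
    """Mean of Nov and Dec of `year` plus Jan and Feb of `year` + 1.
    `monthly` maps (year, month) -> monthly mean CO2 in ppm."""
    window = [(year, 11), (year, 12), (year + 1, 1), (year + 1, 2)]
    return sum(monthly[ym] for ym in window) / len(window)

# Hypothetical monthly means (ppm), for illustration only:
monthly = {
    (2014, 11): 397.2, (2014, 12): 398.9, (2015, 1): 399.9, (2015, 2): 400.3,
    (2015, 11): 400.2, (2015, 12): 402.0, (2016, 1): 402.5, (2016, 2): 403.3,
}

increase = nov_through_feb_mean(monthly, 2015) - nov_through_feb_mean(monthly, 2014)
print(f"2015 annual increase: {increase:.2f} ppm")
```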

There is no event of which I'm aware comparable to the 1998 fires, and that makes this doubly disturbing, to me at least, since it suggests what may be an out-of-control event such as temperature-driven outgassing of sequestered carbon dioxide from permafrost or from oceanic hydrates.

But there's no reason that you should be disturbed as I am. Don't worry, be happy: They're building a solar roadway in France, and even if it ends up covered with grease, skid marks, tire wear marks, sand and salt, it's the thought that counts.

My worry that we are kidding ourselves to the point of suicide by thinking we're actually doing something is pure "Chicken Little," I'm sure.

I now return you to the Hillary vs. Bernie cartoon show.

Enjoy what's left of the weekend.

Nature: Historical Nectar Resources of the British Isles Reflect Their Rise and Fall.

This paper really caught my eye when I was leafing through the current issue of Nature:

Historical nectar assessment reveals the fall and rise of floral resources in Britain (Nature 530, 85–88 (04 February 2016))

An excerpt of the opening lines from the text:

There is considerable concern over declines in insect pollinator communities and potential impacts on the pollination of crops and wildflowers[sup]1–4[/sup]. Among the multiple pressures facing pollinators[sup]2–4[/sup], decreasing floral resources due to habitat loss and degradation has been suggested as a key contributing factor[sup]2–8[/sup]. However, a lack of quantitative data has hampered testing for historical changes in floral resources. Here we show that overall floral rewards can be estimated at a national scale by combining vegetation surveys and direct nectar measurements. We find evidence for substantial losses in nectar resources in England and Wales between the 1930s and 1970s; however, total nectar provision in Great Britain as a whole had stabilized by 1978, and increased from 1998 to 2007. These findings concur with trends in pollinator diversity, which declined in the mid-twentieth century[sup]9[/sup] but stabilized more recently[sup]10[/sup]. The diversity of nectar sources declined from 1978 to 1990 and thereafter in some habitats, with four plant species accounting for over 50% of national nectar provision in 2007. Calcareous grassland, broadleaved woodland and neutral grassland are the habitats that produce the greatest amount of nectar per unit area from the most diverse sources, whereas arable land is the poorest with respect to amount of nectar per unit area and diversity of nectar sources...


A graphic included therein:

[Figure from the paper.]
Another graphic showing the mass of sugars available to pollinators throughout the British Isles:

[Map: mass of sugars available to pollinators across the British Isles, from the paper.]
The closing text:

Our findings provide new evidence based on floral resources to support habitat conservation and restoration. First, we provide evidence of the high nectar value of calcareous grassland for pollinating insects. Calcareous grassland area has declined drastically in Great Britain, and only a small fraction of the historical national cover remained by 2007 (refs 13, 14). Second, the low availability and diversity of nectar sources in arable habitats highlights the need to provide supplementary resources to support pollination services in farmlands, especially as the use of insect-pollinated crops has increased nationally[sup]24[/sup] and globally[sup]25[/sup]. The conservation and restoration of broadleaf woodland and neutral grassland as components of the farmland matrix could help to support diverse flower-visiting insect communities in arable land. The contrast in nectar productivity between linear features and the surrounding vegetation is particularly high in arable land, suggesting that linear features, especially hedgerows, provide an efficient means to enhance floral resources in farmlands if they are managed appropriately to allow flowering[sup]26[/sup]. While agri-environment options such as nectar flower mixtures can also enhance the supply of floral resources locally, their contribution to nectar provision nationally remains low. The higher profile given to floral resource provision in the revised Countryside Stewardship guidelines for England[sup]16[/sup] may substantially enhance resources in future. Finally, our results indicate that improved grassland has the potential to contribute massively to the nectar available nationally. Small adjustments to the management cycle in improved grasslands, allowing white clover, the dominant resource species, to flower, would help realize this potential, although its utility might be restricted to a limited number of pollinator species (Extended Data Table 2). Together, our results on the nectar values of the commonest British plants and the historical changes in plant communities provide the evidence base needed to understand recent national changes in nectar provision and identify the management options needed to restore national nectar supplies.


This was quite an interesting perspective, one about which we don't think much, or at least about which I haven't thought. It demonstrates the importance of diversity in both species and habitats, and the important interdependency of our commercial agricultural land on what surrounds it.

In New Jersey we often see bumper stickers (issued by our State agricultural department) that read "No farms, no food."

One may extend this to: "No pollinators, no food."

This speaks to efforts in some midwestern states in the US to make grassland parks, and points, one thinks, to the economic as well as the aesthetic value of doing so.

Enjoy the weekend.

December 2015 is recorded as the worst ever for carbon dioxide increases over the previous...

...December at the Mauna Loa carbon dioxide observatory.

A text file of the monthly mean data, recorded since 1958 at Mauna Loa, is found here: Mauna Loa Data Page (Monthly Data).

As each month is posted, I load it into an Excel file I've built for calculating and ranking the data. The increase of 3.07 ppm recorded for December 2015 over December 2014 is the largest in 55 years of such observations.

November of 2015 was the second worst ever observed, with a 3.03 ppm increase in CO[sub]2[/sub] as compared to November of 2014.

We did better in January. January of 2016 was "only" the fourth worst January ever observed.

Whatever we think we are doing to address this situation is clearly not working, and that includes the majority of the feeble attempts we make: trillions of dollars "invested" in so-called "renewable energy" over the last ten years - so-called "renewable energy," by the way, is not sustainable in any way because of its extremely low energy-to-mass density (there aren't enough materials on the planet to dig up to manufacture meaningful amounts of that rickety stuff) - switching from coal to gas, and imagining that we are "conserving" energy and "becoming more efficient".

Experiment trumps theory, 100% of the time.

One may wish to kill the messenger, but the messenger is not really me, nor the scientists at Mauna Loa and elsewhere; it's the clear chemical signature registered in the composition of the atmosphere. And let's be clear: that messenger is being killed.

Enjoy the weekend.

Statistical methods for the eternal monitoring of carbon dioxide waste dumps.

Right now the world's largest carbon dioxide dump - pretty much to the exclusion of all others - is the planetary atmosphere. Humanity's failure to address climate change is made obvious by the fact that the increase in planetary carbon dioxide concentrations in this dump, according to preliminary figures at the Mauna Loa carbon dioxide observatory, has for the first time exceeded 3 ppm in a single year, setting an all-time record.

Obviously all strategies to address the issue have failed miserably. We are now drilling more gas, more oil, and mining more coal than ever before; the politically popular approaches to the problem have all come to nothing.

One often discussed approach to dealing with the dangerous fossil fuel waste carbon dioxide is to "sequester" it in abandoned oil and gas fields after all of the dangerous fossil fuels in them have been mined and burned. Each year the amount of carbon dioxide dumped into the atmosphere is more than 30 billion tons; in 2012, according to the table on page 93 of the 2014 World Energy Outlook report, the world emissions were 31.6 billion tons. Undoubtedly the figures for 2013, 2014, and 2015 were significantly worse.

Twelve years ago, an overly optimistic and much discussed paper about "stabilization wedges" was published by two Princeton University faculty members: the famous Pacala and Socolow paper (Science 13 Aug 2004: Vol. 305, Issue 5686, pp. 968–972).

One may note that many of the "suggestions" in this paper describing technologies that were allegedly "already available" in 2004 are extremely dubious, for two obvious examples being substituting wind energy and solar energy for coal; coal plants have capacity utilization factors of approximately 70 to 80 percent, whereas wind plants are lucky to approach 40% and solar facilities 20%, and I'm probably being overly generous with both of these figures. If one shuts a coal plant down for the four hours when lots of solar energy is available near the summer equinox for example, one will be required to waste huge amounts of energy since coal boilers are not perfectly thermally isolated, and extra energy will be required to return the boilers to operational levels, much as a kettle on a stove requires significant heat before the water boils.

(Solar and wind energy are therefore useless as alternatives to coal; and in fact, they are completely dependent on dangerous natural gas to exist at all, with all the fracking and other risks gas dependency requires.)

One of the "stabilization wedges" discussed was carbon dioxide capture and sequestration sites designed to collect and store dangerous fossil fuel waste when, um, the wind wasn't blowing and the sun wasn't shining. In Pacala and Socolow's paper, in table 1 on page 970 this is described as "building 3500 Sleipners."

A "Sleipner" in case one doesn't know, refers to a program proposed by the Norwegian dangerous fossil fuel company Statoil to put lipstick on its offshore oil and gas drilling pig by injecting carbon dioxide into the Sleipner oil field for "sequestration" which Statoil liked to imply was "eternal sequestration." After much hulaboo, the Sleipner program was abandoned on the grounds that it was, um, "too expensive" compared to dumping carbon dioxide waste directly into the existing and "economic" dump, the planetary atmosphere.

The number of "Sleipners" built since 2004 is uncomfortably close to zero; nearly one hundred percent of all carbon dioxide injected into oil and gas fields today is designed for "EOR," the euphemistically named "enhanced oil recovery" scheme, where the plan is to drive even more dangerous fossil fuels out of the ground so the waste can be dumped in the atmosphere.

But one may ask: Suppose that there really were significant carbon dioxide waste dumps built on the scale that Socolow and Pacala suggested were "already available" in 2004, what then?

A recent paper in the scientific journal Environmental Science and Technology discusses some of the issues that are grotesquely ignored in what I regard as this "sweep it under the rug and let future generations worry about it" scheme: the possibility that these dumps containing a dangerous gas might, um, leak.

The paper is here: Environ. Sci. Technol., 2015, 49 (2), pp 1215–1224

The title is: Quantifying the Benefit of Wellbore Leakage Potential Estimates for Prioritizing Long-Term MVA Well Sampling at a CO[sub]2[/sub] Storage Site.

Here's some of the introductory text from the paper:

In an effort to mitigate concentrations of carbon dioxide (CO2) in the atmosphere that are caused by stationary anthropogenic inputs, the United States Department of Energy (DOE) is pursuing carbon capture and sequestration (CCS) as one approach in a portfolio of greenhouse gas (GHG) reduction strategies. CCS involves (1) separating CO2 from an industrial process, (2) transporting the CO2 to a storage location, and (3) injecting and sequestering the CO2 in a geologic reservoir for long-term isolation from the atmosphere.[sup]1[/sup] Through the Carbon Sequestration Program, the DOE is working with seven Regional Carbon Sequestration Partnerships (RCSPs) to identify feasible sites within the U.S. and portions of Canada for large-scale (i.e., one million tons of CO2 or greater) CO2 geologic sequestration.[sup]2[/sup] The DOE is pursuing three primary types of geologic systems for long-term CO2 storage: (1) depleted oil and gas fields; (2) unconventional formations such as gas shales, coal seams, and basalts; and (3) saline formations.[sup]3[/sup]

One of the potential risks associated with the injection and long-term storage of CO2 into geologic reservoirs is leakage of stored CO2 from geologic containment and into the near-surface or surface environment. A potential leakage pathway in depleted oil and gas fields is associated with legacy exploration and production wells.[sup]4−6[/sup] These legacy wells provide a potential conduit through low-permeability cap rock formations that would otherwise act as a seal to retain CO2 in the storage reservoir. Extensive work has been conducted in Alberta, Canada over the past decade to assess the potential CO2 leakage risk of legacy wells by drawing inferences from well completion and abandonment information. This work has, in part, been performed as part of the DOE Regional Partnership Plains CO2 Reduction (PCOR) Partnership...


The paper then explores the "statistical power" of sampling a subset of drilled wells to determine the probability that more are leaking.

...A well leakage potential scoring approach like the one developed by Watson and Bachu[sup]8[/sup] provides a quantitative means for ranking the increased probability of CO2 leakage at a specific well because of SCVF and/or GM. Applying this scoring methodology to the legacy wells that are located within a particular region provides a screening-level risk assessment approach for identifying potential geologic CO2 storage sites - areas with a high incidence of high-ranking wells would represent locations that are not favorable to long-term geologic storage of CO2, while areas with a low incidence of high-ranking wells may be suitable for future CO2 injection and storage. In addition, once a geologic CO2 storage site has been identified, then such a well ranking approach also informs the monitoring, verification, and accounting (MVA) sampling plan for the site, as higher-ranking wells would take priority over lower-ranking wells...
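
To illustrate what "statistical power" means in this setting, here is a toy sketch (mine, not the authors' scoring method): if k of N legacy wells at a site actually leak and n of them are sampled at random, the chance of catching at least one leaker is hypergeometric.

```python
# Toy illustration of the sampling-power question: the probability of
# detecting at least one leaking well when n of N wells are sampled at
# random and k of them actually leak. A hypergeometric sketch of the
# general idea, not the paper's scoring/prioritization method.

from math import comb

def p_detect_at_least_one(N, k, n):
    """P(a random sample of n wells contains >= 1 of the k leaking wells)."""
    if n > N - k:                  # the sample is too large to miss them all
        return 1.0
    p_miss_all = comb(N - k, n) / comb(N, n)
    return 1.0 - p_miss_all

# Hypothetical site: 200 legacy wells, 5 of which leak.
for n in (10, 25, 50, 100):
    p = p_detect_at_least_one(200, 5, n)
    print(f"sample {n:3d} of 200 wells: P(detect) = {p:.2f}")
```

The paper's point is that scoring wells by leakage potential and sampling the high-ranking ones first does better than this blind random draw.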


There's no mention at all of what it might cost future generations to monitor these dumps for...um, um, um...eternity, but if that bothers you, don't worry, be happy: You can be reasonably assured that these dumps, not twenty "Sleipners" never mind 3500 of them, will not be built. It's far more convenient and, um, "economic" to use the "traditional" dump, the planetary atmosphere.

Enjoy the remainder of your Sunday.

Nature Editorial Comment: India needs home-grown GM food to stop starvation.

The following text is excerpted from a "World View" comment in Nature, one of the world's highest impact scientific journals:

At the beginning of this month, Prime Minister Narendra Modi announced a road map to guide India’s science and technology over the next two decades. Launched during the Indian Science Congress at the University of Mysore, the plan signalled a cautious approach to techniques such as genetically modified (GM) crops, noting that “some aspects of biotechnology have posed serious legal and ethical problems in recent years”. That is true, but a different and much larger problem looms for India. According to the 2015 United Nations World Population Prospects report, India will surpass China by early next decade as the most populous country on Earth, with the most mouths to feed. India is already classed as having a ‘serious’ hunger problem, according to the 2015 Global Hunger Index of the International Food Policy Research Institute. There is a danger that many of these new Indians will not have sufficient food.

Where can additional food come from? Grain production is stagnant, and rapid urbanization is reducing available land. To increase food production, India needs to invest in modern agricultural methods, including GM crops.

Indian researchers have shown that they have the expertise to generate GM plants, most obviously the pest-resistant cotton that is now widely grown in India. But almost all of this work has relied on molecular-biology research done elsewhere...

...India should stop trying to build the Taj Mahal with borrowed bricks. We need a concerted effort at home to discover and manipulate relevant genes in indigenous organisms and crops (such as chickpea and rice). Indian microbial institutes should take up projects in this direction, because most of the currently used genes for transgenic generation are of microbial origin. That requires a change in direction from an Indian GM-food strategy that has traditionally aimed at quick product development instead of careful assessment of the underlying science.

Such home-grown GM crops would also reduce reliance on transgenic technology produced by multinational companies, which is expensive and rarely optimized for the conditions of specific regions. Some GM crops designed abroad need more water than is usually available in some parts of India, for example, putting great stress on farmers....


Full text (which may or may not be behind a paywall) is here: Nature 529, 439 (28 January 2016)

Enjoy the weekend!

The "Extreme Learning Machine."

I'm most definitely snowed in today, leafing through some issues of one of my favorite journals, Industrial & Engineering Chemistry Research, and I came across a cool paper about one of my favorite topics, ionic liquids, that discusses the "Extreme Learning Machine."

Ionic liquids are generally salts of cationic and anionic organic molecules which are liquid at or near room temperature. Because they are generally not volatile, they can eliminate some of the problems associated with other process solvents, specifically air pollution. Although the term "green solvent" is probably overused with respect to ionic liquids, their very interesting potential uses have led to a vast explosion of papers in the scientific literature concerning them. There are, to be sure, an almost infinite number of possible ionic liquids (and related liquids called "deep eutectics").

My own interest in these compounds is connected with my interest in the separation of fission products and actinides in the reprocessing of used nuclear fuels; with their potential for the treatment of certain biological products, including lignins, a constituent of biomass quite different from cellulose that represents a sustainable route of access to aromatic molecules; and with their possible use as radiation-resistant (in some cases) high-temperature heat transfer fluids.

Anyway, about the "deep learning machine:" The paper in question, written by scientists at Beijing Key Laboratory of Ionic Liquids Clean Process, State Key Laboratory of Multiphase Complex Systems, Institute of Process Engineering, Chinese Academy of Sciences, Beijing 100190, China, that I've been reading is this one: Ind. Eng. Chem. Res., 2015, 54 (51), pp 12987–12992

The S[sub]σ‑profile[/sub] is a quantum mechanical descriptor of the charge distribution on the surfaces of molecules and organic ions.

Here's the fascinating text:

As compared to the ANN algorithm, the extreme learning machine (ELM) is a relatively new algorithm which was first developed by Huang et al.[sup]23,24[/sup] It can effectively tend to reach a global optimum and only needs to learn a few parameters between the hidden layer and the output layer as compared with the traditional ANN, and thus can be used to predict properties because of its excellent efficiency and generalization performance.[sup]25[/sup] However, to the best of our knowledge, the ELM has not yet been used for predicting the properties of ILs until now. Thus, we employed this relatively new ELM algorithm to predict the heat capacity of ILs in this work.


Reference 24 is" Huang, G.-B.; Zhu, Q.-Y.; Siew, C.-K. Extreme learning machine: Theory and applications. Neurocomputing 2006, 70, 489−501.

Hmm...the program needs to "learn" only a few parameters...
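
Indeed, the idea is simple enough to sketch in a few lines of NumPy. What follows is a generic minimal ELM for regression after Huang et al., not the authors' fitted model; the hidden-layer weights are random and fixed, and only the output weights are "learned," by a single least-squares solve:

```python
# A generic minimal extreme learning machine (ELM) for regression, after
# Huang et al. (2006): hidden-layer weights are random and fixed; only the
# hidden-to-output weights are fitted, via one least-squares solve.
# This illustrates the algorithm; it is not the authors' trained model.

import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=50):
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # the only fitted parameters
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Demo on a toy nonlinear function standing in for, say, heat capacity
# as a function of molecular descriptors (made-up data, illustration only).
X = rng.uniform(-1, 1, size=(200, 3))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=200)

W, b, beta = elm_fit(X, y)
pred = elm_predict(X, W, b, beta)
print("RMSE on training data:", np.sqrt(np.mean((pred - y) ** 2)))
```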

I always keep in the back of my mind Penrose's criticism of the concept of "artificial intelligence" (maybe because, being a human being, I still want my species to be relevant), but I'm intrigued. Neurocomputing is a journal I've never accessed before, but when I can get out of here after this blizzard, I'm going to take a look at that paper, which is apparently available at Princeton University's library.

I guess I'm a dork, but I find it all kind of cool...

2015 comes in as the worst year ever observed at the Mauna Loa CO2 observatory.

Data trumps theory, 100% of the time, always.

For decades, we have heard all kinds of stuff about how we would address climate change. We are not addressing it.

The preliminary data for 2015 is now in at Mauna Loa and it's telling. We have many people here who can only understand things when presented as a graphic, and here it is:

[Figure: annual mean carbon dioxide growth rates at Mauna Loa.]
The preliminary data shows the increase in 2015 to be the first to exceed 3.00 ppm in a single year: 3.17 ppm.

Before 2015, the worst year ever observed was 1998, at 2.93 ppm, a year marked by an unusual event in which much of the Southeast Asian forest burned when fires set to clear land for "renewable energy" palm oil plantations (for German biodiesel) went out of control.

As for so called "renewable energy" which has been hyped to a point nearing insanity for roughly half a century, nothing, absolutely nothing draws out its grotesque failure than this data. I repeat my long standing statement that it is not actually renewable, inasmuch as it requires, owing to its low energy to mass ratio, the massive mining and refining of metals and other materials, many of which are highly toxic.

The last, best hope for humanity was one that has traditionally been the subject of much malign fear and ignorance from some of us on the left: nuclear energy. (It remains the only source of primary energy to have avoided 60 billion tons of dumping of dangerous fossil fuel waste into the planetary atmosphere, equivalent to about two years' worth of said dumping.) It remains the world's largest source, by far, of climate-change-gas-free energy, but it is only expanding at a trivial rate, with eight reactors having been shut in the worst CO[sub]2[/sub] year ever observed, and only 10 new reactors having come on line in that same year.

World Starts Up 10, Shuts Down 8 nuclear reactors in 2015

We deserve what we are getting. Fear and ignorance, so dire throughout human history, have triumphed again. I would like to congratulate all of the anti-nukes here and elsewhere on their grand victory, even as I am prone to weep at what their "victory" means for the future of humanity and the world.

Enjoy the rest of the weekend.



Proserpine.



1874, Dante Gabriel Rossetti (1828-1882), English.

At the Tate Museum, London.
