Tuesday, December 16, 2014

Global warming, peak oil, collapse and the art of forecasting

Predicting the future usually means providing an accurate analysis of the present and extrapolating some trends. Forecasts that try to look decades ahead therefore need to be well informed about the current situation. However, we have various options; the decisions we make will influence the outcome. For climate change, the amount of greenhouse gases emitted will be a major factor in determining the amount of warming, and this is obviously the topic of negotiations.

The Intergovernmental Panel on Climate Change (IPCC)

The IPCC thus operates with four "representative concentration pathways" (RCPs) for greenhouse gas concentrations in the atmosphere, and projects the effects each will have on the climate. The pathways stand for different strategies of coping with the challenges, from turning off greenhouse gas emissions immediately (not likely to happen) to sticking one's head in the tar sands and increasing emission rates as much as possible. Part of the debate now revolves around whether or not the most restrictive pathway can be combined with continued economic growth, but more about that below.

The IPCC's fifth report, from November 2014, comes with its usual summary for policymakers. However, having a policy is not restricted to the assumed readers of the report; we are all policymakers. The report sketches the facts (the melting ice, the acidification of the oceans, etc.) and suggests mitigation strategies as well as adaptation to the inevitably worsening climate in many parts of the world. Perhaps the physics behind global warming is assumed to be known, but the possible positive feedback mechanisms are worth mentioning. This is how the summary alludes to them:
It is virtually certain that near-surface permafrost extent at high northern latitudes will be reduced as global mean surface temperature increases, with the area of permafrost near the surface (upper 3.5 m) projected to decrease by 37% (RCP2.6) to 81% (RCP8.5) for the multi-model average (medium confidence).
Melting ice means that the albedo will decrease, so that a darker sea surface absorbs more energy. Methane may also be released, either gradually or in a sudden burp, as permafrost thaws. For the record, methane has a global warming potential many times that of CO2, roughly 30 times over a 100-year horizon and higher over shorter ones; the factor depends on the time horizon, which is why different figures are quoted.

As usual, the so-called policymakers are reluctant to take concerted action. From the debate it may appear as though the goal of limiting global warming to 2°C were compatible with business as usual. Oil companies and their allied business partners either invest in campaigns casting doubt on climate science, or argue that their production is so much cleaner than in the rest of the world, and that if they didn't drill for oil, someone else would. Yet we know that most of the known hydrocarbon reserves will have to remain in the ground to minimize the impact of climate change. Even under the most optimistic scenarios, we would have to prepare for extreme weather: droughts, hurricanes, forest fires, flooding, and mass extinction of species. Perhaps it is an awareness of this unwillingness to take prompt measures to reduce greenhouse gas emissions that has tipped the IPCC's focus somewhat towards adaptation to shifting climate conditions, rather than narrowly towards mitigation strategies.

Global Strategic Trends: a broader perspective

With this gloomy outlook in mind, one should not forget that there is fast-paced development in science and technology, in areas from solar panels to medicine and artificial intelligence. The British Ministry of Defence's think tank, the Development, Concepts and Doctrine Centre, has compiled a report that tries to summarize trends in several areas over a thirty-year perspective, up to 2045. Their Global Strategic Trends (GST) report does not make predictions with assertions about the likelihood of various outcomes, but rather illustrates where some current trends may lead, as well as providing alternative scenarios. The strength of the report is its multifaceted view. Politics, climate, population dynamics, scientific and technological development, security issues, religion and economics are all at some level interrelated, so that significant events in one domain have implications in other domains. Such cross-disciplinary interactions may be difficult to speculate about, but the GST report at least brings all these perspectives into a single PDF, with some emphasis on military strategy and defence. A weakness of the report is that it is perhaps too fragmented, so that the most important challenges are almost lost in all the detail.

Peak oil, the debt economy and climate change

A better and more succinct summary of these great challenges might be one of Richard Heinberg's lectures. As Heinberg neatly explains, the exceptional economic growth witnessed over the last 150 years is the result of the oil industry. In the early years, oil was relatively easy to find and to produce. Gradually, the best oil fields have been depleted and only resources of lesser quality remain, such as the recently discovered shale oil in the UK. The production of unconventional oil is much less efficient than that of conventional crude oil. The question is how much energy goes into producing energy, or what is the Energy Return On Energy Invested (EROEI). As this ratio falls to 1, the energy produced merely equals that invested in producing it, and nothing is left over. Although exact figures are hard to estimate, it is clear that the EROEI of unconventional oil is significantly lower than that of conventional oil; its EROEI is so low, in fact, that even in strictly economic terms it is scarcely worth producing, leaving aside its potentially devastating effects on the environment.
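The arithmetic behind this is simple enough to sketch in a few lines of Python. The EROEI values below are round, illustrative numbers, not estimates for any actual oil field:

```python
# Net energy as a function of EROEI (illustrative figures only).
def net_energy(gross_output, eroei):
    """Energy left for society after subtracting the energy invested."""
    invested = gross_output / eroei
    return gross_output - invested

for eroei in (50, 10, 3, 1.5, 1):
    print(f"EROEI {eroei:>4}: {net_energy(100.0, eroei):5.1f} of 100 units left over")
```

At an EROEI of 50, almost all the gross output is surplus; at 1, production consumes everything it yields, whatever the market price.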

Heinberg discusses three issues which, combined, seem to have very dramatic consequences for the civilised world as we know it. First, there is the energy issue and peak oil. It appears unlikely that renewable energy sources and nuclear power will be able to replace fossil fuels at the pace that would be needed for continued economic growth. Second, there is debt as the driver of economic growth. As an illustration, Heinberg mentions the early automobile industry with its efficient, oil-powered assembly-line production, so efficient in fact that it led to over-production. The next problem became how to sell the cars, because they were more expensive than people could generally afford. Hence came the invention of advertising, of "talking people into wanting things they didn't need", and subsidiary strategies such as planned obsolescence: making products that broke down and had to be replaced, or constantly redesigning them and changing fashion so that consumers would want the latest model. The solution to the cars being too expensive was to offer consumers credit. Money is created every time someone gets a loan from a bank, on the trust that it will be paid back in the future, which in turn necessitates economic growth. This system has sometimes been likened to a pyramid scheme, and it is not a very radical idea to believe that it could collapse at some point. The third issue that Heinberg brings up is climate change, which will lead to global disruption as large parts of the world become uninhabitable.

Two of Heinberg's recent essays deal with the energy situation and the coming transition to a society with a very different pattern of energy consumption. He criticizes the excessive optimism on behalf of renewable energy on the one hand, and the "collapsitarians" on the other, some of whom even think we are bound for extinction. The conclusion is that energy use, and consumption in general, must be reduced, either voluntarily or as a matter of necessity when there are no longer any alternatives.

An even more vivid account of more or less the same story is given by Michael C. Ruppert in the documentary Collapse. And after the realisation that this is probably where we are heading, it may be best to take the advice of Dmitry Orlov who has some experience of living in a collapsed Soviet Union: just stay calm, be prepared to fix anything that breaks down yourself and don't expect any more holiday trips to the other side of the planet!

Monday, August 25, 2014

Reptilian Revolution


Almost unquantized music.

Q: Is this a concept album?
A: Yes. Its subject matter is not only derived from those entertaining kooks who see shapeshifting reptilians in YouTube videos with their very own eyes (hence they must exist); there are also references to more serious topics, such as the unhealthy state of affairs alluded to in this previous post.

Monday, May 12, 2014

Programming for algorithmic composition

Computer programming is an essential prerequisite for musical composition. Imagine if that were the case. Of course it is not, any more than composition is necessary for programming. However, in algorithmic composition you do not get very far without recourse to computer programming, and for composers to learn programming, algorithmic composition is the best way to get started. 

Writing a program that outputs a specific composition is very different from programming for the general needs of some other user, which is what professional programmers do. Since no assumptions about the interaction with some unknown user have to be made, it should be much easier. Programming in algorithmic composition is nothing other than a particular way of composing, a method that inserts a layer of formalisation between the composer and the music. Algorithmic composition is a wonderful opportunity to investigate ideas, to learn about models and simulations, to map data to sound.
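To make this concrete, here is a minimal sketch of what "inserting a layer of formalisation" can look like in practice: a constrained random walk over a pentatonic scale, written in Python. Every choice in it (the scale, the step sizes, the length) is an arbitrary illustration, not a recipe taken from any particular piece.

```python
import random

SCALE = [0, 2, 4, 7, 9]  # pentatonic pitch classes

def random_walk_melody(length=16, start=60, seed=None):
    """Generate a list of MIDI note numbers by stepping through a scale."""
    rng = random.Random(seed)
    octave, degree = start // 12, 0
    melody = []
    for _ in range(length):
        degree += rng.choice([-1, 0, 1])   # prefer small melodic steps
        octave += degree // len(SCALE)     # carry over into adjacent octaves
        degree %= len(SCALE)
        melody.append(12 * octave + SCALE[degree])
    return melody

print(random_walk_melody(seed=1))
```

The algorithm is the composition in embryo: change the scale or the step distribution, and a different family of pieces falls out.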

Generative music, which is not necessarily synonymous with algorithmic composition, is often concerned with making long pieces or a set of pieces that may be created on demand. The forthcoming album Signals & Systems is filled to the brim with algorithmic compositions, and the source code for a program that outputs variants on one of the pieces is now available.

Wednesday, February 26, 2014

Manifesto for self-generating patches

Ideas for the implementation of autonomous instruments in analog modular synths (v. 0.2)

The following guidelines are not meant as aesthetic value judgements or prescriptions as to what people should do with their modulars: as always, do what you want! The purpose is to propose some principles for the exploration of a limited class of patches and a particular mode of using the modular as an instrument.

Self-generating patches are those which, when left running without manual interference, produce complex and varied musical patterns. Usually, the results will be more or less unpredictable. In this class of patches, there are no limitations as to what modules to use and how to connect them, except that one should not change the patch or touch any knobs after the patch has been set up to run. An initial phase of testing and tweaking is of course allowed, but if preparing a recording as documentation of the self-generating patch, it should just run uninterrupted on its own.

A stricter version of the same concept is to try to make a deterministic autonomous system in which there is no source of modulation (such as LFOs or sequencers) that is not itself modulated by other sources. In consequence, the patch has to be a feedback system.

The patch may be regarded as a network with modules as the nodes and patch cords as the links. Specifically, it is a directed graph, because each patch cord runs from one module's output to another module's input. (The requirement that there be no source of modulation which is not itself modulated by other modules implies that, e.g., noise modules or LFOs without any input are not allowed.) Thus, in the graph corresponding to the patch, each node must have at least one incoming link and at least one outgoing link. The entire patch must also be interconnected, in the sense that one can follow the patch cords from any module, through intervening modules, to any other module that belongs to the patch.
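For those who like to automate their bookkeeping, these two conditions are easy to check mechanically. Here is a hedged sketch in Python; the module names in the examples are made up:

```python
from collections import defaultdict

def is_strict_autonomous(cords):
    """cords: list of (source_module, destination_module) patch cords."""
    outs, ins = defaultdict(set), defaultdict(set)
    nodes = set()
    for src, dst in cords:
        outs[src].add(dst)
        ins[dst].add(src)
        nodes |= {src, dst}
    if not nodes:
        return False
    # Every module must be modulated and must modulate something.
    if any(not ins[n] or not outs[n] for n in nodes):
        return False
    # Interconnection: ignore cord direction and flood-fill from one module.
    links = defaultdict(set)
    for src, dst in cords:
        links[src].add(dst)
        links[dst].add(src)
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(links[n] - seen)
    return seen == nodes

# A two-module feedback loop passes; a chain with an unmodulated LFO fails.
print(is_strict_autonomous([("VCO", "VCF"), ("VCF", "VCO")]))  # True
print(is_strict_autonomous([("LFO", "VCO"), ("VCO", "VCF")]))  # False
```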


Criterion of elegance:
The smaller the number of modules and patch cords used, the more elegant the patch is. (Caveat: modules are not straightforwardly comparable. There are small and simple modules with restricted possibilities, and modules with lots of features that may correspond to using several simpler modules.)

Aesthetic judgement:
Why not organize competitions where the audience may vote for their favourite patches, or perhaps let a panel of experts decide?

Standards of documentation:
Make a high-quality audio recording with no post-processing other than possibly volume adjustment. Video recordings and/or photos of the patch are welcome, but in any case a detailed diagram explaining the patch and the settings of all knobs and switches involved should be submitted. The diagram should provide all the information necessary to reconstruct the patch.

Criterion of robustness:
Try to reconstruct the patch with some modules replaced by equivalent ones. Swap one oscillator for another one, use a different filter or VCA and try to get a similar sound. Also try small adjustments of knobs and see whether it affects the sound in a radical way. The more robust a patch is, the easier it should be for other modular enthusiasts to recreate a similar patch on their system.

Criteria of objective complexity:
The patch is supposed to generate complex, evolving sounds, not just a static drone or a steady noise. Define your own musical complexity signal descriptor and apply it to the signal, or use one of the existing complexity measures (one possible descriptor is sketched after this list).

Dissemination:
Spread your results and let us know about your amazing patch!
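
As a footnote to the complexity criterion above, here is one possible descriptor, sketched in Python with NumPy: mean spectral flux, which tends to be low for static drones and higher for evolving sounds. It is offered as an assumption about what such a descriptor could be, not as a measure prescribed by the manifesto.

```python
import numpy as np

def mean_spectral_flux(signal, frame=2048, hop=512):
    """Average change between successive magnitude spectra."""
    window = np.hanning(frame)
    frames = [signal[i:i + frame] * window
              for i in range(0, len(signal) - frame, hop)]
    spectra = [np.abs(np.fft.rfft(f)) for f in frames]
    flux = [np.sum((b - a).clip(min=0)) for a, b in zip(spectra, spectra[1:])]
    return float(np.mean(flux))

# A steady sine (drone) should score much lower than noise.
t = np.arange(48000) / 48000.0
print(mean_spectral_flux(np.sin(2 * np.pi * 440 * t)))
print(mean_spectral_flux(np.random.randn(48000)))
```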


Tuesday, February 11, 2014

The geometry of drifting apart

Why do point particles drift apart when they are randomly shuffled around? Of course the particles may be restricted by walls that they keep bumping into, or there may be some attractive force that makes them stick together, but let us assume that there are no such restrictions. The points move freely in a plane, only subject to the unpredictable force of a push in a random direction.

Suppose the point xn (at discrete time n) is perturbed by some stochastic vector ξ, defined in polar coordinates (r, α) with uniform density functions f, such that 

fr(ξr) = 1/R,  0 ≤ ξr ≤ R
fα(ξα) = 1/2π,  0 ≤ ξα < 2π.

Thus, xn+1 = xn + ξ, and the point may move to any other point within a circle centered around it and with radius R.


Now, suppose there is a point p which can move to any point inside a circle P in one step of time, and a point q that can move to any point within a circle Q.
First, suppose the point p remains at its position and the point q moves according to the probability density function. For the distance ||p-q|| to remain unchanged, q has to move to some point on the blue arc that marks the points equidistant from p. As is easily seen, the blue arc divides the circle Q into two unequal parts, with the smaller part being the one closest to p. Therefore, the probability of q moving away from p is greater than the probability of it approaching p. As the distance ||p-q|| increases, the arc through q becomes flatter, thereby dividing Q more equally. In consequence, when p and q are close, they are likely to move away from each other at a faster average rate than when they are farther apart, but on average they will always continue to drift apart.

After q has moved, the same reasoning can be applied to p. Furthermore, the same geometric argument works with several other probability density functions as well.
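The argument is easy to verify numerically. Here is a small Monte Carlo sketch in Python, using the displacement distribution defined above; the initial gap, the radius and the step counts are arbitrary choices:

```python
import math, random

def kick(point, R=1.0, rng=random):
    """Displace a point by the stochastic vector ξ defined above."""
    r = rng.uniform(0, R)             # uniform radial density f_r
    a = rng.uniform(0, 2 * math.pi)   # uniform angular density f_alpha
    return (point[0] + r * math.cos(a), point[1] + r * math.sin(a))

def mean_distance(start_gap, steps, trials=2000, seed=1):
    """Average distance ||p-q|| after both points take random kicks."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        p, q = (0.0, 0.0), (start_gap, 0.0)
        for _ in range(steps):
            p, q = kick(p, rng=rng), kick(q, rng=rng)
        total += math.dist(p, q)
    return total / trials

for steps in (1, 10, 100):
    print(steps, round(mean_distance(start_gap=0.5, steps=steps), 2))
```

The mean distance grows with the number of steps, as the geometric argument predicts.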

When a single point is repeatedly pushed around, it traces out a path; the result is a Brownian motion that tangles into skeins such as this one.


Different probability density functions may produce tangles with other visual characteristics. The stochastic displacement vector itself may be Brownian noise, in which case the path is more likely to travel in more or less the same direction for several steps of time. Then two nearby points will separate even faster.

Monday, January 13, 2014

Who cares if they listen?


No, it wasn't Milton Babbitt who coined the title of that notorious essay, but it stuck. And, by the way, it was "Who cares if you listen?". Serialism, as Babbitt describes it, employs a tonal vocabulary "more efficient" than that of past tonal music. Each note-event is precisely determined by its pitch class, register, dynamic, duration and timbre. In that sense the music has a higher information density, and the slightest deviation from the prescribed values is structurally different, not just an expressive coloration. Such music inevitably places high demands on the performer, as well as on the listener. (Critical remarks could be inserted here, but I'm not going to.) Just as recent advances in mathematics or physics can be understood only by a handful of specialists, there is an advanced music that we should not expect to be immediately accessible to everyone. For such an elitist endeavor to have a chance of survival at all, Babbitt suggests that research in musical composition should take refuge in universities. Indeed, in 20th-century America that was where the serialist composers were to be found.
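For readers who have never poked at the machinery: the serial determination of pitch starts from a twelve-tone row and its classical transformations. A minimal sketch in Python follows; the row is arbitrary, chosen only for illustration.

```python
ROW = [0, 11, 3, 4, 8, 7, 9, 5, 6, 1, 2, 10]  # an arbitrary twelve-tone row

def transpose(row, n):
    """Shift every pitch class up by n semitones (mod 12)."""
    return [(p + n) % 12 for p in row]

def invert(row):
    """Mirror each interval around the row's first pitch class."""
    return [(2 * row[0] - p) % 12 for p in row]

def retrograde(row):
    """Play the row backwards."""
    return row[::-1]

print(transpose(ROW, 5))
print(invert(ROW))
print(retrograde(ROW))
```

In total serialism, analogous series govern dynamics, durations and timbre as well, which is the sense in which every parameter of a note-event is "precisely determined".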

The recent emergence of artistic research at universities and academies is really not much different from what Babbitt was pleading for. Artistic research sits awkwardly between theorizing and plain artistic practice as it used to be, before everyone had to write long manifestos. But more about that on another occasion, perhaps.

In a more dadaist spirit, there is this new release out now on bandcamp, also titled Who cares if they listen.


These days there are reasons enough to care if they listen (yes, they do, but not wittingly) and to worry about the consequences, as discussed in a previous post.