Archive for the ‘Electra’ Category

Recurrence

March 30th, 2017 No comments


Most oklo.org readers know the story line of Fred Hoyle’s celebrated 1957 science fiction novel, The Black Cloud. An opaque, self-gravitating mass of gas and dust settles into the solar system, blots out the sun, and wreaks havoc on the biosphere. It gradually becomes clear that the cloud itself is sentient. Scientists mount an attempt to communicate. A corpus of basic scientific and mathematical principles is read out loud in English, voice-recorded, and transmitted by radio to the cloud.

The policy was successful, too successful. Within two days the first intelligible reply was received. It read:

“Message received. Information slight. Send more.”

For the next week almost everyone was kept busy reading from suitably chosen books. The readings were recorded and then transmitted. But always, there came short replies demanding more information, and still more information…

Sixty years later, communicating interstellar clouds are still in the realm of fiction, but virtualized machines networked in the cloud are increasingly dictating the course of actions in the real world.

In Hoyle’s novel, the initial interactions with the Black Cloud are quite reminiscent of a machine learning task. The cloud acts as a neural network. Employing the information uploaded in the training set, it learns to respond to an input vector — a query posed as a sequence of symbols — with a sensible output vector. Throughout the story, however, there’s an implicit assumption that the Cloud is self-conscious and aware; nowhere is it intimated that the processes within the Cloud might simply be an algorithm managing to pass an extension of the Turing Test. On the basis of the clear quality of its output vectors, the Cloud’s intelligence is taken as self-evident.

The statistics-based regimes of machine learning are on a seemingly unstoppable roll. A few years ago, I noticed that Flickr became oddly proficient at captioning photographs. Under the hood, an ImageNet classification with convolutional neural networks (or the like) was suddenly focused, with untiring intent, on scenes blanketing the globe. Human mastery of the ancient game of Go has been relinquished. Last week, I was startled to read Andrej Karpathy’s exposition of the unreasonable effectiveness of recurrent neural networks.

By drawing from a large mass of example text, a recurrent neural network (RNN) character-level language model learns to generate new text one character at a time. Each new letter, space, or punctuation mark draws its appearance from everything that has come before it in the sequence, intimately informed by what the algorithm has absorbed from its fund of information. As to how it really works, I’ll admit (as well) to feeling overwhelmed, to not quite knowing where to begin. This mind-numbingly literal tutorial on backpropagation is of some help. And taking a quantum leap forward, Justin Johnson has written a character-level language model, torch-rnn, which is well-documented and available on github.
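A full LSTM is beyond the scope of a blog post, but the character-at-a-time flavor of the approach can be seen in a drastically simplified stand-in: an order-k Markov model over characters, in which each new character is drawn conditioned on the few that precede it. (This is emphatically not an RNN, which learns a distributed representation rather than a lookup table; the corpus string below is a toy placeholder, not the actual training text.)

```python
import random
from collections import defaultdict

def train_char_model(text, k=4):
    """Record which characters follow each k-character context in the text."""
    model = defaultdict(list)
    for i in range(len(text) - k):
        model[text[i:i + k]].append(text[i + k])
    return model

def generate(model, seed, length, k=4, rng=None):
    """Emit text one character at a time, conditioned on the last k characters."""
    rng = rng or random.Random(0)
    out = list(seed)
    for _ in range(length):
        context = "".join(out[-k:])
        choices = model.get(context)
        if not choices:  # unseen context: fall back to a random known context
            context = rng.choice(list(model))
            choices = model[context]
        out.append(rng.choice(choices))
    return "".join(out)

corpus = "All art is quite useless. " * 40  # stand-in for the Wilde/Huysmans corpus
model = train_char_model(corpus)
print(generate(model, "All ", 60))
```

Where the Markov table can only parrot contexts it has literally seen, the RNN's hidden state lets it interpolate, which is why its output drifts toward style rather than quotation.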

In Karpathy’s post, RNNs are set to work generating text that amuses but which nonetheless seems reassuringly removed from any real utility. A Paul Graham generator willingly dispenses Silicon Valley “thought leader” style bon mots concerning startups and entrepreneurship. All of Shakespeare is fed into the network, and dialogue emerges in an unending stream that’s — at least at the phrase-to-phrase level — uncannily indistinguishable from the real thing.

I’m very confident that it would be a whole lot more enjoyable to talk to Oscar Wilde than to William Shakespeare. As true A.I. emerges, it may do so in a cloud of aphorisms, of which Wilde was the undisputed master: “I can resist everything except temptation…”

Wilde employed a technique for writing The Picture of Dorian Gray in which he first generated piquant observations, witty remarks and descriptive passages, and then assembled the plot around them. This ground-up compositional style seems somehow confluent with the processes — the magic — that occur in an RNN.

The uncompressed plain-text UTF-8 version of Dorian Gray is a 433,701-character sequence. This comprises a fairly small training set, and it needs a supplement. The obvious choice to append to the corpus is À rebours — Against Nature, Joris-Karl Huysmans’ 1884 classic of decadent literature.

Even more than Wilde’s text, À rebours is written as a series of almost disconnected thumbnail sketches, containing extensive, minutely inlaid descriptive passages. The overall plot fades largely into the background, and is described, fittingly, in one of the most memorable passages from Dorian Gray.

It was a novel without a plot and with only one character, being, indeed, simply a psychological study of a certain young Parisian who spent his life trying to realize in the nineteenth century all the passions and modes of thought that belonged to every century except his own, and to sum up, as it were, in himself the various moods through which the world-spirit had ever passed, loving for their mere artificiality those renunciations that men have unwisely called virtue, as much as those natural rebellions that wise men still call sin. The style in which it was written was that curious jewelled style, vivid and obscure at once, full of argot and of archaisms, of technical expressions and of elaborate paraphrases, that characterizes the work of some of the finest artists of the French school of Symbolistes. There were in it metaphors as monstrous as orchids and as subtle in colour.

À rebours attached to Dorian Gray constitutes a 793,587-character sequence, and after some experimentation with torch-rnn, I settled on the following invocation to train a multilayer LSTM:

MacBook-Pro:torch-rnn Greg$ th train.lua -gpu -1 -max_epochs 100 -batch_size 1 -seq_length 50 -rnn_size 256 -input_h5 data/dorianGray.h5 -input_json data/dorianGray.json

My laptop lacks an Nvidia graphics card, so the task fell to its 2.2 GHz Intel Core i7. The code ran for many hours. Lying in bed at night in the quiet, dark house, I could hear the fan straining to dissipate the heat from the processor. What would it write?

This morning, I sat down and sampled the results. The neural network that emerged from the laptop’s all-nighter generates Wilde-Huysmans-like text assembled one character at a time:

MacBook-Pro-2:torch-rnn Greg$ th sample.lua -gpu -1 -temperature 0.5 -checkpoint cv/checkpoint_1206000.t7 -length 5000 > output.txt
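The -temperature argument controls how adventurous the sampling is: the network’s predicted character probabilities are rescaled in log space before each draw, so a low temperature concentrates the draws on the likeliest character, while a high temperature flattens the distribution. A sketch of the rescaling (the three-character distribution here is invented for illustration):

```python
import math
import random

def sample_with_temperature(probs, temperature, rng=random):
    """Rescale a probability distribution by a temperature, then sample an index."""
    scaled = [math.log(p) / temperature for p in probs]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]  # subtract the max for stability
    total = sum(exps)
    pmf = [e / total for e in exps]
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(pmf):
        cum += p
        if r < cum:
            return i
    return len(pmf) - 1

# toy next-character distribution over three candidate characters
probs = [0.7, 0.25, 0.05]
rng = random.Random(0)
cold = [sample_with_temperature(probs, 0.1, rng) for _ in range(1000)]
hot = [sample_with_temperature(probs, 5.0, rng) for _ in range(1000)]
print(cold.count(0) / 1000, hot.count(0) / 1000)
```

At temperature 0.1 virtually every draw lands on the most probable character; at 5.0 the three come out nearly even, which is the regime where the sampled prose turns adventurous and strange.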

I opened the output, and looked over the first lines. It is immediately clear that a 2015-era laptop staying on all night running downloaded github code can offer no competition — in any sense — to either Mr. Wilde or Mr. Huysmans. An abject failure of the Turing Test, a veritable litany of nonsense:

After the charm of the thread of colors, the nineteenth close to the man and passions and cold with the lad's heart in a moment, whose scandal had been left by the park, or a sea commonplace plates of the blood of affectable through the club when her presence and the painter, and the certain sensation of the capital and whose pure was a beasts of his own body, the screen was gradually closed up the titles of the black cassion of the theatre, as though the conservatory of the past and carry, and showing to me the half-clide of which it was so as the whole thing that he would not help herself. I don't know what will never talk about some absorb at his hands.

But we are not more than about the vice. He was the cover of his hands. "You were in his brain."

"I was true," said the painter was strangled over to us. It is not been blue chapter dreadfully confesses in spite of the table, with the desert of his hands in her vinations, and he mean about the screen enthralled the lamp and red books and causes that he was afraid that he could see the odious experience. It was a perfect streating top of pain.

"What is that, I am sorry I shall have something to me that you are not the morning, Mr. Gray," answered the lad, and that the possession of colorings, which were the centre of the great secrets of an elaborate curtain.

You cannot believe that I was thinking of the moon.

He was to be said that the world is the restive of the book to the charm of a matter of an approvingian through a thousand serviced it again. The personality of the senses by the servants were into the shadow of the next work to enter, and he had revealed to the conservatory for the morning with his wife had been an extraordinary rooms that was always from the studio in his study with a strange full of jars, and stood between them, or thought who had endured to know what it is.

"Ah, Mr. Gray?"

"I am a consolation to be able to give me back to the threat me."

But such demands are excessive. The text is readable English, convened in a headlong rush by a program that could just as easily have been synthesizing grant proposals or algebraic topology. Torch-rnn contains no grammar rules, no dictionaries, no guides to syntax. And it really does learn over time. Looking at the early checkpoint snapshots of the network, during epochs when words and spaces are forming, before any sense of context has emerged, one finds only vaguely English-like streams of gibberish:

pasticite his it him. "It him to his was paintered the cingring the spure, and then the sticice him come and had to him for of a was to stating to and mome am him himsed at he some his him, and dist him him in on of his lime in stainting staint of his listed."

Perhaps the best comparison for torch-rnn’s current laptop-powered overnight-effort capabilities is to William S. Burroughs’ cut-up novels — The Soft Machine, The Ticket That Exploded — where one sees disjoint masses of text full of randomized allusions, but where an occasional phrase sparkles like a diamond in matrix, “…a vast mineral consciousness near absolute zero thinking in slow formations of crystal…”

In looking over a few thousand characters of text, generated from checkpoint 1,206,000 at temperature T=0.61, one finds glimmers of recurrent, half-emerged truths,

You are sure to be a fragrant friend, a soul for the emotions of silver men.

Categories: Electra Tags:

A signal amplified

February 25th, 2017 1 comment

There was something a little disorienting about TRAPPIST-1 vaulting into the public consciousness to fleetingly become one of the largest news events in the world. The small-telescope detection of temperate Earth-sized planets orbiting stars at the bottom of the main sequence was a frequent topic during oklo.org’s first ten years. In looking back over the early articles, one of the very first posts (from 11/29/2005) looks quaint, naive and prescient all at once:

We know that planets aren’t rare, and by now, with the tally over at the extrasolar planet encyclopedia poised to blast past 200, the announcement of a newly discovered run-of-the-mill Jupiter-sized planet barely raises the collective eyebrow.

The headline that everyone is anticipating is the discovery, or better yet, the characterization of a truly habitable world — a wet, Earth-sized terrestrial planet orbiting in the habitable zone of a nearby star. Who is going to get to this news first, and when?

299 million dollars of smart money says that Kepler, a NASA-funded Discovery mission currently scheduled for launch in June 2008, will take the honors. The Kepler spacecraft will fly in an Earth-trailing 377.5 day orbit, and will employ a 1-meter telescope to stare continuously (for at least four years straight) at a patchwork of 21 five-square-degree fields of the Milky Way in the direction of the constellation Cygnus. Every 15 minutes, the spacecraft will produce integrated photometric brightness measurements for ~100,000 stars, and for most of these stars, the photometric accuracy will be better than one part in 10,000. These specs should allow Kepler to detect transits of Earth-sized planets in front of Solar-type stars.

Kepler has a dedicated team, a solid strategy, and more than a decade of development work completed. It’s definitely going to be tough to cut ahead of Bill Borucki in line. Does anyone else stand a chance?

Practitioners of the microlensing technique have a reasonably good shot at detecting an Earth-mass planet before Kepler, but microlensing-detected planets are maddeningly ephemeral. There are no satisfying possibilities for follow-up and characterization. Doppler RV has been making tremendous progress in detecting ever-lower mass planets, but it seems a stretch that (even with sub-1 meter per second precision) the RV teams will uncover a truly habitable world prior to Kepler, although they may well detect a hot Earth-mass planet.

There is one possibility, however, whereby just about anyone could detect a habitable planet (1) from the ground, (2) within a year, and (3) on the cheap. Stay tuned…

In marveling at the avalanche of media attention during the last week, from the front pages of the New York Times and the New York Post, to NPR push notifications, to NASAwatch sleuthing out the story, to a co-opt of the front page of Google, I was struck by the fact that viewed externally, this is really just the massive amplification, complete with distortion — see the NASA/JPL go-to image — of an exceedingly faint signal. TRAPPIST-1 continually bathes the Earth with 14 Joules per second of energy. Over the course of the few weeks it took to detect the seven planets, its transits cumulatively decreased this share of the light by the energy equivalent of a single tic tac.
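The 14 watt figure is easy to check. Adopting a luminosity of roughly 5.2e-4 L_sun and a distance of roughly 12.1 parsecs for TRAPPIST-1 (both values are my assumptions here, drawn from published parameters rather than from the post), the power intercepted by Earth’s cross-section comes out to about 14 watts:

```python
import math

# Assumed TRAPPIST-1 parameters (not stated in the post)
L_SUN = 3.828e26      # solar luminosity, W
PC = 3.0857e16        # parsec, m
R_EARTH = 6.371e6     # Earth radius, m

L = 5.2e-4 * L_SUN                    # TRAPPIST-1 luminosity, W
d = 12.1 * PC                         # distance to TRAPPIST-1, m
flux = L / (4 * math.pi * d ** 2)     # W per m^2 arriving at Earth
power = flux * math.pi * R_EARTH ** 2  # intercepted by Earth's cross-section
print(f"{power:.1f} W")
```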

Categories: Electra, worlds Tags:

6/5/4/3

December 22nd, 2016 Comments off


It was like the opening pages of a thriller. In the gathering dusk of an early winter evening last year, the postman handed me a package with a Belgian postmark and a cryptic symbol.

Inside, wrapped in layers of translucent paper, were two books, both in French. Nothing else. Needless to say, I was intrigued…

Dialectique du Monstre by Sylvain Piron revealed itself (with the use of Google Translate) to be a close study of the life and work of Opicinus de Canistris (1296-c.1353), a mysterious, psychologically tormented clerical official attached to the Avignon Papacy. The book is filled with reproductions of Opicinus’ elaborate parchment diagrams, which are like figments of the fever dreams of Archimedes or Leonardo; bizarre maps and masses of symbols harboring intimations just out of reach, a taproot into unseen connections between individuals, cities, whole worlds.

A while back, I wrote of the Electra Hypothesis, the idea that as the planet binds itself and its bit processes into an ever more interconnected web of radio links and optical fiber, its computational edges and nodes will develop into something of a sequel to Lovelock’s Gaia. Although layered in ambiguity, and separated by a gulf of time and mindset, Canistris seemed to have been drawn toward a similar notion.

The second book, opaquely titled 6/5, vaults the web of interconnection squarely into the modern world. Written by Alexandre Laumonier, the Sniper in Mahwah, it is a history of modern electronic markets and the rise of machines. In contrast to Dialectique du Monstre, it connects not to the past but to the future. The changes that have swept over the financial markets (computerization, machine learning, algorithms) are now spreading ever more thoroughly into an ever-wider range of endeavor.

The title 6/5 is a compressed code for a set of developments that have unfolded mostly out of view. The first part of the book, 6, refers to the floored number of milliseconds for a signal to travel from Chicago to New York on the fastest optical fiber. The second section, 5, alludes to the faster-than-glass signaling over the same route by microwave, which has now dropped two notches below that round number, to 3.982, within a sliver of the vacuum latency on the great circle connecting the endpoints.

A node of Electra’s graph. Hundreds of billions of dollars in coded trades rush daily through the towers of this Appalachian ridgeline.

For nearly a year, I’ve left a latin phrase at the top of the site… Pythagoreorum quaestionum gravitationalium de tribus corporibus nulla sit recurrens solutio, cuius rei demonstrationem mirabilem inveniri posset. Hanc blogis exiguitas non caperet.

The translation of the phrase is connected to the Pythagorean three-body problem, another obliquely related topic involving descending integers that has seen regular rotation on oklo.org. A remarkable feature of Burrau’s original version of the problem (masses of 3, 4, and 5 started from rest under Newtonian gravity at the vertices opposite the sides of a 3-4-5 right triangle) is that the solution is almost, but not quite, periodic. At time T~15.830, bodies 4 and 5 almost collide, while body 3 nearly comes to rest. In a paper from 1967, Szebehely and Peters show that a slight adjustment of the initial positions is sufficient to transform the situation into one that repeats itself endlessly.
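Burrau’s initial condition is simple enough to reproduce directly. A pure-Python leapfrog sketch (G = 1; fixed steps, far too crude for the near-collision at T~15.8, but adequate for the early evolution and for checking energy conservation):

```python
def accelerations(pos, masses):
    """Pairwise Newtonian gravitational accelerations in the plane, G = 1."""
    n = len(masses)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r3 = (dx * dx + dy * dy) ** 1.5
            acc[i][0] += masses[j] * dx / r3
            acc[i][1] += masses[j] * dy / r3
    return acc

def energy(pos, vel, masses):
    """Total energy: kinetic plus pairwise gravitational potential."""
    ke = sum(0.5 * m * (v[0] ** 2 + v[1] ** 2) for m, v in zip(masses, vel))
    pe = 0.0
    for i in range(len(masses)):
        for j in range(i + 1, len(masses)):
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            pe -= masses[i] * masses[j] / (dx * dx + dy * dy) ** 0.5
    return ke + pe

# Burrau's problem: masses 3, 4, 5 at rest at the vertices opposite
# the sides of length 3, 4, 5 of a right triangle.
masses = [3.0, 4.0, 5.0]
pos = [[1.0, 3.0], [-2.0, -1.0], [1.0, -1.0]]
vel = [[0.0, 0.0] for _ in masses]

e0 = energy(pos, vel, masses)
dt, steps = 1e-4, 10000  # integrate to t = 1
for _ in range(steps):
    # leapfrog (kick-drift-kick): symplectic, so energy drift stays bounded
    a = accelerations(pos, masses)
    for i in range(3):
        vel[i][0] += 0.5 * dt * a[i][0]; vel[i][1] += 0.5 * dt * a[i][1]
        pos[i][0] += dt * vel[i][0];     pos[i][1] += dt * vel[i][1]
    a = accelerations(pos, masses)
    for i in range(3):
        vel[i][0] += 0.5 * dt * a[i][0]; vel[i][1] += 0.5 * dt * a[i][1]

drift = abs((energy(pos, vel, masses) - e0) / e0)
print(f"relative energy drift after t=1: {drift:.2e}")
```

Serious work on the problem (Szebehely and Peters used regularization to survive the close encounters) requires far more care than this sketch takes.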

The integers 3, 4, and 5 are a single example drawn from the infinite set of Pythagorean triples, combinations of integers that correspond to the lengths of the sides of right triangles. Each triple defines a variation on the original Pythagorean three-body problem, and I believe it’s the case that not a single member of this infinity of initial conditions will generate a periodic solution.
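Every primitive triple can be enumerated with Euclid’s classical formula: for coprime integers m > n of opposite parity, (m^2 - n^2, 2mn, m^2 + n^2) is a primitive Pythagorean triple, and all primitive triples arise this way:

```python
from math import gcd

def primitive_triples(limit):
    """Primitive Pythagorean triples (a, b, c) with c <= limit, via Euclid's formula."""
    triples = []
    m = 2
    while m * m + 1 <= limit:
        for n in range(1, m):
            if (m - n) % 2 == 1 and gcd(m, n) == 1:
                a, b, c = m * m - n * n, 2 * m * n, m * m + n * n
                if c <= limit:
                    triples.append((min(a, b), max(a, b), c))
        m += 1
    return sorted(triples)

# each triple defines its own variant of the Pythagorean three-body problem
print(primitive_triples(50))
```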

Scatter plot of the legs (a,b) of the first Pythagorean triples with a and b less than 6000. Negative values are included to illustrate the parabolic patterns. (Source: Wikipedia)

With a nod to Fermat, this assertion can be recast as a conjecture:

There exist no periodic solutions to any of the Pythagorean gravitational three-body problems. There may exist a truly marvelous demonstration of this proposition that this weblog has no space to contain.

Or at least it is true for every spot check that I’ve computed. For example, the tortured path of 20-21-29:

To place a tiny obstacle in the crush of progress, a translation into Latin beyond what Google can yet achieve seemed in order. I contacted Alexandre, who forwarded the request to Sylvain, who transmitted the following:

Pythagoreorum quaestionum gravitationalium de tribus corporibus nulla sit recurrens solutio, cuius rei demonstrationem mirabilem inveniri posset (could be found) /esse posset (could be). [Le verbe exstare (exister, être présent avec force) conviendrait mal à cette modalité.] Hanc blogis exiguitas non caperet.

Translation in English of “[Le verbe exstare (exister, être présent avec force) conviendrait mal à cette modalité]”: the verb “exist” would not be good here. Inveniri posset seems to be the best solution.

Categories: Electra, worlds Tags:

Electra

February 22nd, 2015 3 comments

FullSizeRender2222015
Have you noticed that the Internet can seem slow? Sometimes it takes a long time for web pages to load. It would really be better if they would just snap up instantly on the screen.

In practice, “instant” response occurs if the latency is less than ~1/30th of a second, or ~30 msec. Animation at thirty frames per second looks smooth. Only a small minority of the population has the retinal read-out frequency required to see that the Crab pulsar is flashing at 33.5 msec intervals.

Coincidentally, the speed-of-light travel time along the (almost entirely overland) great circle route between Tokyo and New York is (to within a millisecond) the same as the Crab Pulsar’s current spin period. In theory, it should be possible to load Japanese-sourced web pages with barely perceptible latency, as the service of a request involves a round trip.

Screen-Shot-2015-02-22-at-5.17.14-PM

The fastest communication between Japan and the West Coast of the United States is via NTT’s PC-1 cable, which runs between cable landings at Ajigaura (near Tokyo) and Harbour Pointe (near Seattle). Round-trip communication on the cable takes 80 msec, which, given that the speed of light in optical fiber is ~1.44x slower than the speed of light in vacuum, indicates that the cable must adhere fairly closely to the great circle route beneath the Pacific.
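That consistency check is straightforward. Using rough landing-point coordinates (my own estimates, not figures from NTT), the great-circle distance and the corresponding in-fiber round trip land close to the quoted 80 msec:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points on a spherical Earth."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Approximate cable-landing coordinates (assumed for illustration)
ajigaura = (36.41, 140.60)         # Ajigaura, Japan
harbour_pointe = (47.88, -122.31)  # Harbour Pointe, WA

d = haversine_km(*ajigaura, *harbour_pointe)
c_km_s = 299792.458
rt_fiber_ms = 2 * d * 1.44 / c_km_s * 1000  # round trip, light 1.44x slower in glass
print(f"{d:.0f} km great circle, {rt_fiber_ms:.0f} ms round trip in fiber")
```

The great-circle fiber minimum comes out in the low 70s of milliseconds, so an 80 msec measured round trip leaves only a modest allowance for slack and routing.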

Here’s an interesting paper by Ankit Singla and his collaborators which explores the various drag terms that keep the Internet from actually running at the speed of light. As part of their research, they report on 20+ million measurements of 28,000 web URLs served from 120+ countries. The cumulative distribution function of all that pinging points to a median latency for loading HTML that is ~40x slower than if the message were covering the inferred great circle distance at the speed of light in vacuum.

Screen-Shot-2015-02-22-at-5.52.43-PM

Singla et al. argue that the speed doesn’t have to be so slow:

A parallel low-latency infrastructure: Most flows on the Internet are small in size, with most of the bytes being carried in a small fraction of flows. Thus, it is conceivable that we could improve latency for the large fraction of small-sized flows by building a separate low-latency low-bandwidth infrastructure to support them. Such a network could connect major cities along the shortest paths on the Earth’s surface (at least within the continents) using a c-speed medium, such as either microwave or potentially hollow fiber. Such a vision may not be far-fetched on the time horizon of a decade or two.

Even a decade might be an overestimate. As oklo.org readers know, during the past several years, a secretive fleet of microwave networks has sprung up to transfer information between the Chicago and New York metro areas at as close to the speed of light as possible. The fastest of these networks now transmit within ~2% of the physical minimum. Tremendous efforts have gone into squeezing out every last source of delay.

It’s thus interesting to look at what a national low-latency microwave backbone might look like. To optimize on costs, and to minimize connection times, one wishes to connect a number of nodes (metropolitan areas) with the minimal complement of route segments. This task, known as the Steiner tree problem, has an interesting history and is computationally NP-hard. One can get analog solutions by placing a board with pegs representing the nodes into soapy water. The connective soap bubble films are physical representations of the Steiner trees:

Screen-Shot-2015-02-22-at-10.03.46-PM
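For just three terminals the soap-film answer is classical: the tree acquires at most one extra junction, the Fermat point, which Weiszfeld’s iteration finds quickly. A planar sketch (the coordinates are arbitrary; a real map calculation would use geographic distances):

```python
import math

def fermat_point(terminals, iters=200):
    """Weiszfeld iteration for the geometric median; for three terminals with
    all angles below 120 degrees, this is the single Steiner point."""
    x = sum(p[0] for p in terminals) / len(terminals)
    y = sum(p[1] for p in terminals) / len(terminals)
    for _ in range(iters):
        wsum = wx = wy = 0.0
        for px, py in terminals:
            d = math.hypot(x - px, y - py)
            if d < 1e-12:        # landed on a terminal: it is the answer
                return (px, py)
            w = 1.0 / d
            wsum += w; wx += w * px; wy += w * py
        x, y = wx / wsum, wy / wsum
    return (x, y)

pts = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]
sx, sy = fermat_point(pts)
tree_len = sum(math.hypot(sx - px, sy - py) for px, py in pts)
print((round(sx, 3), round(sy, 3)), round(tree_len, 3))
```

The star through the Fermat point is shorter than any tree that connects the three terminals directly, which is exactly the economy the soap film (and a microwave-backbone planner) is after.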

I coded up a Steiner tree finder using an incremental optimization algorithm, and ran it on the top 20 metro areas in the US by population, which (ranked according to distance from their centroid) are:

1 DFW
2 MSP
3 ORD
4 IAH
5 DIA
6 ATL
7 COL
8 DTW
9 DCA
10 PHX
11 TPA
12 PHL
13 NYC
14 MIA
15 SAN
16 LAX
17 BOS
18 SFO
19 PDX
20 SEA

The algorithm, which employs the Vincenty distance formula between points on the Earth’s surface, and which is not guaranteed to find the absolute shortest route, links the 20 cities with a total path length of 9,814 km, about 10x the length of a NYC-CHI route:

Screen-Shot-2015-02-16-at-8.13.20-PM

The added interconnecting nodes on the tree are the Steiner points. A prominent example on the map above connects Dallas and Denver with the Minneapolis-Chicago interconnect point, and lies in an obscure field a few miles south of Haven, Kansas.
Screen-Shot-2015-02-22-at-10.22.12-PM
Remarkably, when one zooms in on the exact spot, and settles into street view, there’s a red and white microwave tower a hundred meters or so from the actual Steiner point.
Screen-Shot-2015-02-22-at-10.24.05-PM
Rather fittingly, the tower has three dishes, indeed, pre-aligned and pointing in what appear to be the requisite directions…
Screen-Shot-2015-02-22-at-10.21.49-PM
The Gaia hypothesis was introduced by James Lovelock in the 1970s and “proposes that organisms interact with their inorganic surroundings on Earth to form a self-regulating, complex system that contributes to maintaining the conditions for life on the planet.”

As the planet wires itself and its computers ever more tightly together in an ever-lower latency web of radio links and optical fiber, it no longer seems like a particular stretch to float an Electra hypothesis, in which computational nodes and their interconnections assume a global role comparable to that now filled by biological organisms.

Categories: Electra, worlds Tags:

The Machine Epoch

February 14th, 2015 Comments off

Screen-Shot-2015-02-14-at-6.23.47-PM

In looking through oklo’s activity logs, it is evident that many of the visitors are not from the audience that I have in mind as I write the posts. The site is continually accessed from every corner of the planet by robots, harvesters, spamdexing scripts, and viral entities that attempt to lodge links into the blog.

A common strategy is ingratiation via generically vague comments of praise:

Screen-Shot-2015-02-13-at-11.51.48-AM

The Turing test was envisioned as a text-only conversation with a machine. The machine passes the test if it can’t be distinguished from a real person. In Computing Machinery and Intelligence, Alan Turing asks, “Are there imaginable digital computers which could do well in the imitation game?”

For now, the general consensus seems to be no. Machines can’t consistently pass the test (and the test itself seems increasingly dated), but their moment is approaching fast. Judith Newman’s recent NYT article about interaction with the iPhone’s Siri telegraphs the stirrings of the current zeitgeist.

The economics of comment spam must be relatively minor. Were serious money at stake, a Nice Post! robot armed with state-of-the-art-2015 natural language processing skills and tuned to the universe of text strings and facts could almost certainly pull the wool over my eyes.

Screen-Shot-2015-02-14-at-6.29.23-PM

In early 2001, I was working at NASA Ames Research Center. The first Internet Bubble hadn’t quite ended. Highway 101 was a near-continual traffic jam. Narrow billboard trucks advertising this or that dot com were still cycling aimlessly up and down the Peninsula. We had just published our plan to move the Earth in response to the gradually brightening Sun.

I got an e-mail with a stanford.edu address from someone named John McCarthy, who asked if he could come to NASA Ames to talk with us about astronomical engineering. This was before Wikipedia, and for me, at least, before the ingrained reflex to turn to the web for information about someone one doesn’t know. I just wrote back, “Sure!”

I recall McCarthy in person as a rather singular character, with a bushy white beard surrounding thick black glasses. He had a rattletrap car with a bulky computer-like device somehow attached next to the steering wheel. My co-author, Don Korycansky, was there. I remember that the conversation was completely focused on the details of the orbits and the energy budgets that would be required. We didn’t engage in any of the far-out speculations or wide-eyed ramifications that thrust us, as a result of my ill-advised conversation with a reporter a few weeks later, into a terrifying worldwide media farce.

Only later did I realize that John McCarthy was one of the founding giants of computer science. He coined the term Artificial Intelligence, invented Lisp, and was famous for his Usenet .sig, “He who refuses to do arithmetic is doomed to talk nonsense.”

McCarthy’s Progress and Sustainability web pages (online at http://www-formal.stanford.edu/jmc/progress/index.html) are dedicated to the thesis of optimism — that human progress is desirable and sustainable. He wrote, “There are no apparent obstacles even to billion year sustainability.” In essence, the argument is that the Anthropocene epoch, which began at 05:29:21 MWT on July 16, 1945, will stretch to become an eon on par in duration with the Archean or the Proterozoic.

Optimistic is definitely the operative word. It’s also possible that the computational innovations that McCarthy had a hand in ushering in will consign the Anthropocene epoch to be the shortest — rather than one of the longest — periods in Earth’s geological history. Hazarding a guess, the Anthropocene might end not with the bang with which it began, but rather with the seemingly far more mundane moment when it is no longer possible to draw a distinction between the real visitors and the machine visitors to a web site.

Categories: Electra, worlds Tags:

lightspeed

January 1st, 2015 Comments off

Screen-Shot-2015-01-01-at-1.21.48-PM

Aon Tower, as seen from Lurie Garden in Millennium Park

Millennium Park in Chicago is a remarkable place. Skyscrapers shoulder together and soar up steeply to the north and to the west. The vertiginous effect of their cliff faces is reminiscent of Yosemite Valley.

Lurie Garden is at the center of the park, and is given over largely to native plants that carpeted the Illinois landscape in the interval between the retreat of the glaciers and the advance of the corn fields. In the silence of a photograph with a narrow field of view, it is as if the city never existed.

Screen-Shot-2015-01-01-at-4.47.48-PM

Lurie Garden

Restore the sound, and the buzz and hum of insects are superimposed on the wash of urban noise. A swarm of bees, algorithmic in their efficiency, and attuned to the flowers’ black-light glow, collect the nectar: 55% sucrose, 24% glucose, and 21% fructose.

When viewed in microwaves and millimeter waves, say from 1 to 100 GHz, the Millennium Park scene displays a similarly jarring juxtaposition. The sky glows with the ancient three degree background radiation — the cosmic static of the Big Bang explosion — subtly brightest in the direction of the Virgo Supercluster. All around, the buildings, the roads and the sidewalks are lit up with manically pulsating wireless transmitters: routers, cell phones, myriad sensors. In highly focused 6 GHz and 11 GHz beams, billions of dollars in coded securities orders streak above the urban canyons on line-of-sight paths linking the data centers of Chicago, Aurora, and suburban New Jersey. The fastest path of all runs through the top of the monolithic Aon Tower, where the signal is amplified and launched onward across the Lake and far into Michigan.

The microwave beams are a new development. In mid-2010, price movements at the Chicago Mercantile Exchange generated reactions in New Jersey nine milliseconds later. The signals traveled on fiber optic cables that meandered along railroad rights-of-way.

stockResponse2010

Now, the messages arrive within a few microseconds of the time it would take light to travel in vacuum, galvanizing the swarm of algorithms that are continually jostling and buzzing in the vicinity of the match.

stockResponse2013

Categories: Electra, worlds Tags:

50 oklo

December 7th, 2014 Comments off

Screen-Shot-2014-12-07-at-2.28.58-PM

In writing about the rise of the data centers earlier this year, I suggested the “oklo” as the cgs unit for one artificial bit operation per gram per second. That post caught the eye of the editor at Nautilus Magazine, who commissioned a longer-form article and a series of short interviews, which are online here.

In writing the Nautilus article, it occurred to me that the qualifier “artificial” is just that: artificial. A bit operation in the service of computation should stand on its own, without precondition, and indeed, the very word oklo serves to reinforce the lack of any need to draw a distinction. The Oklo fossil reactors operated autonomously, without engineering or direction more than two billion years ago. In so doing, they blurred snap-judgment distinctions between the natural and the artificial.

Several years ago, Geoff Manaugh wrote thoughtfully about the Oklo reactors, drawing a startling connection to a passage in the second of William S. Burroughs’s cut-up novels:

I’m reminded again here of William Burroughs’s extraordinary and haunting suggestion, from his novel The Ticket That Exploded, that, beneath the surface of the earth, there is “a vast mineral consciousness near absolute zero thinking in slow formations of crystal.” Here, though, it is a mineral seam, or ribbon of heavy metal—a riff of uranium—that stirs itself awake in a regularized cycle of radiative insomnia that disguises itself as a planet. Brainrock.

Revising the definition,

1 oklo = 1 bit operation per gram of system mass per second,

brings the information processing done by life into consideration. Our planet has been heavily devoted to computation not just for the past few years, but for the past few billion years. Earth’s biosphere, when considered as a whole, constitutes a global, self-contained infrastructure for copying the digital information encoded in strands of DNA. Every time a cell divides, roughly a billion base pairs are copied, with each molecular transcription entailing the equivalent of ~10 bit operations. Using the rule of thumb that the mass of a cell is a nanogram, and an estimate that the Earth’s yearly wet biomass production is 10^18 grams, this implies a biological computation of 3×10^29 bit operations per second. Earth, then, runs at 50 oklo.
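The arithmetic fits in a few lines of Python; every input is one of the order-of-magnitude estimates quoted above:

```python
# Whole-Earth biological computation, from the estimates quoted above.
SECONDS_PER_YEAR = 3.15e7
biomass_g_per_year = 1e18        # yearly wet biomass production
cell_mass_g = 1e-9               # rule-of-thumb cell mass
base_pairs_per_cell = 1e9        # copied per division
bit_ops_per_base_pair = 10
earth_mass_g = 6.0e27

cells_per_s = biomass_g_per_year / cell_mass_g / SECONDS_PER_YEAR
bit_ops_per_s = cells_per_s * base_pairs_per_cell * bit_ops_per_base_pair
oklos = bit_ops_per_s / earth_mass_g
print(f"{bit_ops_per_s:.1e} bit ops/s -> {oklos:.0f} oklo")
```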

Using the Landauer limit, E_min = kT ln 2, for the minimum energy required to carry out a bit operation, the smallest amount of power required to produce 50 oklo at T=300K is ~1 GW. From an efficiency standpoint, DNA replication by the whole-Earth computer runs at about a hundred-millionth of the theoretical efficiency, given the flux of energy from the Sun. The Earth and its film of cells does lots of stuff in order to support the copying of base pairs, with the net result being ~200,000 bit operations per erg of sunlight globally received.
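A minimal sketch of the Landauer-limit power, in cgs units, taking the 3×10^29 bit operations per second from the biosphere estimate above:

```python
import math

k_B_cgs = 1.380649e-16     # Boltzmann constant, erg/K
T = 300.0                  # kelvin
bit_ops_per_s = 3e29       # whole-Earth DNA-copying rate (from above)

E_min = k_B_cgs * T * math.log(2)        # Landauer minimum, erg per bit op
power_W = E_min * bit_ops_per_s * 1e-7   # 1 erg/s = 1e-7 W
print(f"minimum power: {power_W:.1e} W")  # ~1 GW
```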

Viewed in this somewhat autistic light, Earth is about 10x more efficient than the Tianhe-2 supercomputer, which draws 17,808 kW to run at 33.8 petaflops.
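The factor of ten can be checked against the numbers already on the table; the ~100 bit operations per floating-point operation is the same rule of thumb used in the earlier data-center post:

```python
# Earth's DNA-copying efficiency vs. Tianhe-2, in bit operations per joule.
earth_bit_ops_per_erg = 2e5                          # from the estimate above
earth_bit_ops_per_J = earth_bit_ops_per_erg * 1e7    # 1 J = 1e7 erg
tianhe2_bit_ops_per_J = 33.8e15 * 100 / 17808e3      # flops * bit-ops/flop / watts
ratio = earth_bit_ops_per_J / tianhe2_bit_ops_per_J
print(f"Earth is ~{ratio:.0f}x more efficient")
```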


Categories: Electra, worlds Tags:

optical data transmissions

November 28th, 2014 2 comments

Screen-Shot-2014-11-28-at-6.24.11-PM

The amount of information that can be carried on a laser diode-driven fiber optic cable is staggering. The current state-of-the-art is of order a petabit per second over 50 km, with a direct power consumption of order 100 milliwatts, as described in this press release from NTT, and in primers on optical communication.

When data is transmitted via optical fiber, no signal leaks into space at all (other than a trivial quantity of waste heat). From the standpoint of eavesdropping civilizations, Earth is going dark, presenting a fashionable and much-remarked potential solution to the Fermi Paradox.

To order of magnitude, fiber optic cables currently employ 10^-16 ergs to transmit one bit of information over a distance of one centimeter. It’s interesting to compare this with the energy throughput and transmission efficiency of the first recorded description of an optical information transmission network.
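That figure follows directly from the NTT numbers quoted above:

```python
# Energy cost of fiber transmission, per bit and per centimeter of path.
power_erg_s = 0.1 * 1e7      # 100 mW in erg/s
bit_rate = 1e15              # one petabit per second
distance_cm = 50e3 * 100     # 50 km in centimeters
erg_per_bit = power_erg_s / bit_rate
erg_per_bit_cm = erg_per_bit / distance_cm
print(f"{erg_per_bit_cm:.0e} erg per bit per cm")
```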

In The Information: A History, a Theory, a Flood, James Gleick draws attention to a passage that appears in Aeschylus’ Agamemnon describing how a chain of eight signal bonfires transmitted the news of the Trojan defeat over the course of a single night to Clytemnestra, scheming, four hundred miles away in Mycenae.

Aeschylus’ full passage is worth tracking down and is thrilling to read; a satisfyingly direct antecedent to NTT’s press release describing their record-setting petabit-per-second optical data transmissions.

LEADER:

Yet who so swift could speed the message here?

CLYTEMNESTRA:

From Ida’s top Hephaestus, lord of fire,
Sent forth his sign; and on, and ever on,
Beacon to beacon sped the courier-flame.
From Ida to the crag, that Hermes loves,
Of Lemnos; thence unto the steep sublime
Of Athos, throne of Zeus, the broad blaze flared.
Thence, raised aloft to shoot across the sea,
The moving light, rejoicing in its strength,
Sped from the pyre of pine, and urged its way,
In golden glory, like some strange new sun,
Onward, and reached Macistus’ watching heights.
There, with no dull delay nor heedless sleep,
The watcher sped the tidings on in turn,
Until the guard upon Messapius’ peak
Saw the far flame gleam on Euripus’ tide,
And from the high-piled heap of withered furze
Lit the new sign and bade the message on.
Then the strong light, far-flown and yet undimmed,
Shot thro’ the sky above Asopus’ plain,
Bright as the moon, and on Cithaeron’s crag
Aroused another watch of flying fire.
And there the sentinels no whit disowned,
But sent redoubled on, the hest of flame
Swift shot the light, above Gorgopis’ bay,
To Aegiplanctus’ mount, and bade the peak
Fail not the onward ordinance of fire.
And like a long beard streaming in the wind,
Full-fed with fuel, roared and rose the blaze,
And onward flaring, gleamed above the cape,
Beneath which shimmers the Saronic bay,
And thence leapt light unto Arachne’s peak,
The mountain watch that looks upon our town.
Thence to th’ Atreides’ roof-in lineage fair,
A bright posterity of Ida’s fire.
So sped from stage to stage, fulfilled in turn,
Flame after flame, along the course ordained,
And lo! the last to speed upon its way
Sights the end first, and glows unto the goal.
And Troy is ta’en, and by this sign my lord
Tells me the tale, and ye have learned my word.

Given that the message was one bit, the signal coding was at the Shannon Limit. The route can be correlated with current-day geographic features,

Screen-Shot-2014-11-28-at-6.18.03-PM

and then traced out in Google Earth:

Screen-Shot-2014-11-28-at-6.17.44-PM

The bonfire on Mt. Ida that signaled the end of the Trojan War probably consumed about a cord (3.62 cubic meters) of wood and emitted about 5×10^12 ergs/sec over a span of an hour, for a transmission efficiency of order 10^9 ergs per centimeter per bit. A mere three thousand years has brought twenty-five orders of magnitude of improvement.
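The comparison, sketched in Python; the ~200 km length of the Ida-to-Lemnos hop is my assumption, and everything else is an order-of-magnitude figure quoted above:

```python
import math

# Bonfire link budget: one bit, delivered over one beacon hop.
bonfire_erg = 5e12 * 3600     # total energy radiated over an hour
hop_cm = 2e7                  # ~200 km hop, in cm (assumed)
bonfire_eff = bonfire_erg / hop_cm    # erg per cm, for a 1-bit message

fiber_eff = 2e-16             # erg per bit per cm (from the NTT figures)
orders = math.log10(bonfire_eff / fiber_eff)
print(f"bonfire: {bonfire_eff:.0e} erg/cm/bit; "
      f"improvement: {orders:.0f} orders of magnitude")
```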

The take-away: the quality of a message likely matters more than its quantity.

Categories: Electra Tags:

1215095

January 25th, 2014 2 comments

Screen-Shot-2014-01-25-at-4.15.37-PM

Above: A Google Data Center Image source.

A few weeks ago, there was an interesting article in the New York Times.

On the flat lava plain of Reykjanesbaer, Iceland, near the Arctic Circle, you can find the mines of Bitcoin.

To get there, you pass through a fortified gate and enter a featureless yellow building. After checking in with a guard behind bulletproof glass, you face four more security checkpoints, including a so-called man trap that allows passage only after the door behind you has shut. This brings you to the center of the operation, a fluorescent-lit room with more than 100 whirring silver computers, each in a locked cabinet and each cooled by blasts of Arctic air shot up from vents in the floor.

The large-scale Bitcoin mining operation described in the article gravitated to Iceland in part because of the cheap hydroelectric power (along with natural air conditioning, the exotic-location marketing style points, and a favorable regulatory environment). Bitcoin mining is part of an emergent global trend in which the physical features and the resource distribution of the planet are being altered by infrastructure devoted to the computation that occurs in data centers. As an example, here is a map showing new 6, 11, and 18 GHz site-based FCC microwave-link license applications during the past three years.

Screen-Shot-2014-01-25-at-4.02.35-PM

The Western terminus of the triangle is a mysterious building (read data center) just a mile or so south of Fermilab (for more information see this soon-to-be-published paper of mine co-authored with Anthony Aguirre and Joe Grundfest).

Data centers are currently responsible for about 2% of the world’s 20,000 TWh yearly electricity consumption, which amounts to roughly 1.4×10^25 ergs per year. If we use the Tianhe-2 computer (currently top of the list at top500.org, with a computational throughput of 33.8 petaflops, and a power usage of 17,808 kW) as a forward-looking benchmark, and if we assume that a floating-point operation consists of ~100 bit operations, the data centers of the world are carrying out 3×10^29 bit operations per year (about a mole of bit operations every 70 seconds).
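The estimate can be reproduced step by step:

```python
# World data-center computation, benchmarked against Tianhe-2.
world_electricity_J = 20000e12 * 3600          # 20,000 TWh in joules
data_center_J = 0.02 * world_electricity_J     # ~2% share

tianhe2_flops = 33.8e15
tianhe2_watts = 17808e3
bit_ops_per_flop = 100                         # rule-of-thumb assumption
bit_ops_per_J = tianhe2_flops * bit_ops_per_flop / tianhe2_watts

bit_ops_per_year = data_center_J * bit_ops_per_J
moles_per_second = bit_ops_per_year / 3.15e7 / 6.022e23
print(f"{bit_ops_per_year:.1e} bit ops/yr, "
      f"one mole every {1/moles_per_second:.0f} s")
```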

I’ll define a new cgs unit:

1 oklo = 1 artificial bit operation per gram of system mass per second

Earth, as a result of its data centers, is currently generating somewhat more than a microoklo, and if we take into account all of the personal devices and computers, the planetary figure is likely at least several times that.

I think it’s likely that for a given system, astronomically observable consequences might begin to manifest themselves at ~1 oklo. The solar system as a whole is currently running at ~10 picooklos. From Alpha Centauri, the Sun is currently just the nearest G2V star, but if one strains one’s radio ears, one can almost hear the microwave transmissions.

Landauer’s principle posits the minimum possible energy, E = kT ln 2, required to carry out a bit operation. The Tianhe-2 computer is a factor of a billion less efficient than the Landauer limit, so it’s clear that the current energy efficiency of data centers can be improved. Nevertheless, even running near the Landauer limit, the energy currently devoted to computation on Earth would need to increase more than a hundredfold for the Solar System to run at one oklo.
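The billionfold gap is straightforward to verify:

```python
import math

# Tianhe-2's energy per bit operation vs. the Landauer minimum at 300 K.
k_B = 1.380649e-23                      # Boltzmann constant, J/K
landauer_J = k_B * 300.0 * math.log(2)
tianhe2_J = 17808e3 / (33.8e15 * 100)   # watts / (flops * ~100 bit ops per flop)
factor = tianhe2_J / landauer_J
print(f"Tianhe-2 runs {factor:.1e}x above the Landauer limit")
```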

So where to look? Three ideas come to mind in increasingly far-out order.

(1) Dyson spheres are the perennial favorite. Several years ago, when the WISE data came out, I worked with two high-school students from UCSC’s Summer Internship Program to search the then newly-released WISE database for room-temperature blackbodies. To our surprise, it turns out that the galactic disk is teeming with objects that answer to this description:

Screen-Shot-2014-01-25-at-6.56.12-PM

(Some further work revealed that they are heavily dust-obscured AGB stars.)

(2) Wait long enough, and your data center will suffer an impact by a comet or an asteroid, and computational hardware debris will begin to diffuse through the galaxy. In the event that this happened regularly, then it might be possible to find some interesting microscopic things in carbonaceous chondrites.

(3) The T in Landauer’s principle suggests that cold locations are better suited for large-scale computation. Given that here on Earth a lot of cycles are devoted to financial computation, it might also be relevant to note that you get a higher rate of return on your money if your bank is in flat spacetime and you are living in a region of highly curved spacetime.

Categories: Electra Tags: ,

Malbolge

September 21st, 2013 Comments off

Screen-Shot-2013-09-21-at-1.13.29-PM

Fall quarter at UCSC has arrived, and with it, the latest iteration of my astrophysical fluid dynamics course.

This class covers the workings of bodies that are composed of gas, ranging from molecular clouds and accretion disks to stars and giant planets. These objects are complicated enough so that numerical calculations can often help to generate insight, so I’ve traditionally distributed some simple numerical routines for use in the class problem sets.

My first exposure to computers was in the mid-1970s, when several PLATO IV terminals were set up in my grade school in Urbana. My mid-1980s programming class was taught in standard Fortran 77. Somehow, these formative exposures, combined with an ever-present miasma of intellectual laziness, have ensured that Fortran has stubbornly remained the language I use whenever nobody is watching.

Old-style Fortran is now well into its sixth decade. It’s fine for things like one-dimensional fluid dynamics. Formula translation, the procedural barking of orders at the processor, has an archaic yet visceral appeal.

Screen-Shot-2013-09-21-at-2.12.23-PM

Student evaluations, however, tend to suggest otherwise, so this year, everything will be presented in Python. In the course of making the sincere attempt to switch to the new language, I’ve been spending a lot of time looking at threads on Stack Overflow, and in the process, somehow landed on the Wikipedia page for Malbolge.

Malbolge is a public domain esoteric programming language invented by Ben Olmstead in 1998, named after the eighth circle of hell in Dante’s Inferno, the Malebolge.

The peculiarity of Malbolge is that it was specifically designed to be impossible to write useful programs in. However, weaknesses in this design have been found that make it possible (though still very difficult) to write Malbolge programs in an organized fashion.

Malbolge was so difficult to understand when it arrived that it took two years for the first Malbolge program to appear. That first program was not written by a human being; it was generated by a beam search algorithm designed by Andrew Cooke and implemented in Lisp.

That 134-character first program — which outputs “Hello World” — makes q/kdb+ look like QuickBasic:

('&%:9]!~}|z2Vxwv-,POqponl$Hjig%eB@@>}=m:9wv6wsu2t |nm-,jcL(I&%$#"`CB]V?Txuvtt `Rpo3NlF.Jh++FdbCBA@?]!~|4XzyTT43Qsqq(Lnmkj"Fhg${z@>

At first glance, it’s easy to dismiss Malbolge, as well as other esoteric programming languages, as a mere in-joke, or more precisely, a waste of time. Yet at times, invariably when I’m supposed to be working on something else, I find my thoughts drifting to a hunch that there’s something deeper, more profound, something tied, perhaps, to the still apparently complete lack of success of the SETI enterprise.

I’ve always had an odd stylistic quibble with the Arecibo Message, which was sent to M13 in 1974:

Screen-Shot-2013-09-21-at-12.56.58-PM

It might have to do with the Bigfoot-like caricature about 1/3rd of the way from the bottom of the message.

Screen-Shot-2013-09-21-at-2.47.33-PM

Is this how we present to the Galaxy what we’re all about? “You’ll never get a date if you go out looking like that.”

Fortunately, I discovered this afternoon that there is a way to rectify the situation. The Lone Signal organization is a crowdfunded active SETI project designed to send messages from Earth to an extraterrestrial civilization. According to their website, they are currently transmitting messages in the direction of Gliese 526, and by signing up as a user, you get one free 144-character cosmic tweet. I took advantage of the offer to broadcast “Hello World!” in Malbolge to the stars.

Screen-Shot-2013-09-21-at-2.50.46-PM

Categories: Electra Tags: