encoded systems, emergence, and cascading effects

I’ve spent a lot of time this week thinking about abiogenesis. Biological systems are very interesting and unusual in the context of the universe and the physical world. Beyond Earth (and perhaps other life-supporting planets), biology does not really exist. It is a system built upon another system: Biology is completely dependent upon Chemistry to work, and Chemistry is completely dependent on Physics. Our world is built on at least these three cascading systems.

Human life depends upon many complex processes that occur within us but even more so outside of us. The frequency distribution of these processes seems to follow some inverted log or exponential scale, with the bulk of them happening at the nano scale. And because our DNA is so large, it takes about one hour to replicate. It would take over a month if only one worker were working on our DNA, but thankfully replication runs in parallel, a little like PostgreSQL’s Parallel Seq Scan, so that speeds things up a lot.

A human mother would have to carry a fetus for 540 years before giving birth if she had to depend on a single origin of replication on each chromosome.
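The back-of-envelope arithmetic here is fun to run. Every figure below is a rough assumption (fork speed, genome size, origin count), so treat this as an illustration of the parallelism, not precise biology:

```python
# Rough model: replication forks copy ~50 bases/second; each origin of
# replication fires two forks that move in opposite directions.
FORK_SPEED = 50          # bases per second per fork (assumed)
GENOME = 6.4e9           # bases in a diploid human genome (assumed)
ORIGINS = 50_000         # active origins firing in parallel (assumed)

def replication_hours(origins):
    forks = 2 * origins                      # two forks per origin
    return GENOME / (forks * FORK_SPEED) / 3600

print(f"single origin: {replication_hours(1) / 24:.0f} days")
print(f"{ORIGINS:,} origins: {replication_hours(ORIGINS) * 60:.0f} minutes")
```

With one origin the full copy would take on the order of years; tens of thousands of origins firing in parallel bring it down to under an hour, which is the gap the gestation comparison above is pointing at.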


Stop reading for a moment and give your eukaryotic chromosome structure a little pat on the back. Well done.

When we use a systems lens we aren’t looking at the circumstances of the individual person. We look at the big picture to find root causes (probabilities, not instances). Just as the “average American” might not exist in real life, “root causes” might not seem very relevant to any one person at the individual level.

Both perspectives are valid, but I think it’s better to look at people personally and at laws systematically. Laws should be generalized, and the symptoms they treat should be identified via root cause analysis, at least 80% of the time. Individual court cases should be generalized less than 20% of the time; they should be understood within the context of the environment, the ecology, the community, relationships, and the individual at least 80% of the time.

Laws should be written objectively to make justice complete, but they should be interpreted subjectively to make mercy complete. History textbooks favor the objective, systematic view, but the subjective past also exists and we shouldn’t ignore it. The objective past might not actually be real, while subjective history is definitely real because it is lived experience. Still, the systematic lens is useful for understanding history as a whole because it favors no one specifically.

Crime, for example, always has both systematic and personal causes: systematic because laws that can be broken are part of a system, and personal because every crime is an instance that occurs at the personal level. Policy should strive to be systematic, but people must be understood at the personal level. Yet policy is usually written or informed by case law, that is, by personal instances; and people are not always given the chance to be understood at the personal level, especially when they have no money for a lawyer or a trial.

To understand this perspective, I think it’s useful to look at phreaking and the early computer hacking community. With new technology we rarely need new laws, but we always need new interpretations of old laws, and the wisdom to make the right judgments.

The phenomenon of nerds, outcasts who lack social skills yet are somehow really good with computers, is just a natural consequence of spending a lot of time alone patiently learning the quirks of the embedded system that is the personal computer. It requires a specific personality type, as all jobs do. Some percentage of people will have the skills necessary to become phreakers, and some percentage will be willing to do something illegal or in a morally grey area (regardless of whether they have the skill). It’s not very surprising that the overlap of those skills and personalities turns some phreakers into hackers. Not because these people are hardened criminals, but because they already meet the barrier to entry.
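A toy calculation makes the point. The population size and both percentages are numbers I am inventing purely for illustration, and the independence assumption is doing all the work:

```python
# If skill and willingness were independent traits, the expected overlap is
# just the product of the two rates times the population size.
population = 1_000_000
p_skill = 0.001      # assumed share with the technical skill to phreak
p_willing = 0.05     # assumed share comfortable in a legal grey area

expected_overlap = population * p_skill * p_willing
print(f"expected skilled-and-willing people: {expected_overlap:.0f}")
```

In practice the two traits are probably not independent, since the same obsessive curiosity tends to drive both, which would make the real overlap larger than the naive product suggests.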

We begin to see that, with perfect knowledge of this system, we could effectively calculate how likely someone is to be caught. From this perspective the system behaves a lot like a quantum system, and that’s no surprise: every system built on top of the quantum world should share similar emergent effects. But which emergent effects we see also depends on our observation (alas, we can’t escape the quantum world), so our perspective matters a lot.

Phreaking exists within a system. The system has a reward and a punishment for every behavior, but the relationship between any two variables is localized and not at all obvious. Someone might make free long-distance calls for years, even decades, and never get caught, thanks to a specific combination of many factors. Two functions matter most: the competence of the perpetrator and the competence of the investigator. But other important factors include how busy both people are, how frequently they investigate or perpetrate, what hours they work, their temporal-spatial properties, and how anxious they are about being caught, or how eager to catch criminals.
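The "function of two competences" can be sketched as a toy simulation. Every parameter here is invented for illustration; the only claim is the shape of the model, a per-period detection probability driven by both parties' skill and activity:

```python
import random

def monthly_catch_prob(perp_skill, inv_skill, perp_activity, inv_activity):
    """Chance of detection in a given month (toy model)."""
    exposure = perp_activity * inv_activity   # how often their worlds overlap
    edge = inv_skill * (1 - perp_skill)       # investigator's net advantage
    return min(1.0, exposure * edge)

def months_until_caught(p, rng, horizon=1200):
    for month in range(1, horizon + 1):       # horizon = 100 years
        if rng.random() < p:
            return month
    return horizon                            # never caught in a lifetime

rng = random.Random(42)
p = monthly_catch_prob(perp_skill=0.9, inv_skill=0.6,
                       perp_activity=0.3, inv_activity=0.2)
runs = [months_until_caught(p, rng) for _ in range(1000)]
print(f"monthly catch probability: {p:.4f}")
print(f"median months free: {sorted(runs)[len(runs) // 2]}")
```

With a skilled perpetrator and a moderately attentive investigator, the median time to capture comes out well over a decade, consistent with the observation that some phreakers were never caught at all.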

This is our world now... the world of the electron and the switch, the beauty of the baud. We make use of a service already existing without paying for what could be dirt cheap if it wasn't run by profiteering gluttons, and you call us criminals. We explore... and you call us criminals. We exist without skin color, without nationality, without religious bias... and you call us criminals. You build atomic bombs, wage wars, murder, cheat, and lie to us and try to make us believe it is for our own good, yet we're the criminals. Yes, I am a criminal.

My crime is that of curiosity. My crime is that of judging people by what they say and think, not what they look like. My crime is that of outsmarting you, something that you will never forgive me for. I am a hacker and this is my manifesto. You may stop this individual, but you can't stop us all... after all, we're all alike.

For the most part, though, the people who had the expertise to catch phreakers were of the same personality type as the phreakers themselves. For some, it could have been phreaking that got them interested in telecom systems in the first place. You might look at this and see criminals covering the tracks of other criminals, but these nerds might be the people best suited to interpreting the law, because they understand the perspective of the perpetrator and have a full appreciation for the nuances of the technology. They know the measurable impact that phreaking can have on telecom systems, and they aren’t incentivized to make an inflated guesstimate of how much financial damage one person actually did. Whatever other crimes the person committed can then be interpreted separately from the misunderstood technology.

Biological systems

Biology, as far as we know, is unique to Earth. Among the hundreds of billions of stars we have observed, we are the only ones with this particular encoded system. Biology is encoded on top of the layers of chemistry, which is itself an encoded system that must obey all the laws of physics. Our ecology, our everyday life, is equal parts unique and bizarre. Biology is the most complex of these three systems because it depends on multiple underlying systems.

Artificial life

The silicon beings we have created are not biological, so they are not as dependent on the natural encoded systems for everyday life. For production, or replication, however, they still require precise chemical processes to be born. We’re not used to thinking this way, but from a very basic process perspective, creating an encoded system is not very different from creating life.

In the same way, the development of AGI is both itself alive and an effort at the creation of life. In the human body a cell may die, yet the impact can be invisible. In the same way a programmer may die, yet from the perspective of the field as a whole the impact could be invisible.

We have created life through the development of groups, shared research, Knowledge Bases like StackOverflow, and scientific “fields”. Knowledge is replicated via access, curated with algorithms usually made at Google, and varied through human error or business requirements.

When the internet first started, infrastructure requirements were set by humans to support a very specific application environment. Now the infrastructure requirements of large internet companies are fit to support the environment as a whole rather than any specific application.

On the concept of time travel

I used to think of time as a constant thing, but time doesn’t actually exist in any concrete way. Time is only a measurement of observation. I’ve realized that time is nothing more than the relative position and velocity vector of space. Time travel is both simple and obvious within this world, but it’s not possible without the introduction of an energy that is external to our universe.

Particles that exit wormholes act as if they come from an external universe. Even if they are only displaced in space, their displacement in space is also a displacement in time, because their transition between origin and destination is not what would have been predicted by observing the origin atom’s classical position and velocity vector.

There is no “paradox” in killing a younger version of yourself before you enter a time machine. Time does not exist in any concrete sense. If such a machine is possible, it can only be created via the introduction of an energy that would allow matter to overcome thermodynamics. That can only come from an external universe, or something that acts just like one, so there is no paradox: these macro-objects would exist completely independently of each other.

Time does not “loop” in complex ways; it only loops in unsophisticated, simple ways. Order does not necessarily have to matter, because there is only one object at any given time interval. An easy way to think about this is to put radioactive particles into a time machine. If you want to end up with a specific quantity of radioactivity at a specific time from your perspective, you will need to either synchronize the half-life state with your target time or run the time machine through many loops. The particle itself cannot “skip” any loops; there are no shortcuts. It cannot arrive at a target state on its own. It requires an external observer/influencer. You cannot travel “back in time” in the theatrical sense unless you manipulate the physical matter into the microstate that you are targeting.
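The radioactive-sample thought experiment can be written out. The half-life and per-loop interval are arbitrary assumed units; the point is only that the sample advances one fixed step per pass and cannot skip ahead:

```python
HALF_LIFE = 10.0   # arbitrary time units (assumed)
LOOP_SPAN = 5.0    # time the sample advances per pass through the machine

def remaining_fraction(loops):
    """Fraction of the sample still radioactive after n loops."""
    elapsed = loops * LOOP_SPAN
    return 0.5 ** (elapsed / HALF_LIFE)

def loops_to_reach(target_fraction):
    loops = 0
    while remaining_fraction(loops) > target_fraction:
        loops += 1                  # no shortcuts: one loop at a time
    return loops

# Reaching a quarter of the original activity takes two half-lives,
# i.e. four loops at half a half-life per loop.
print(loops_to_reach(0.25))
```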

The distinction between classical and quantum, big and small, is rather blurry. When I refer to relative position and vector, I am referring to a single, observable state of an object: classical mechanics applied to all objects. This may not be an accurate view, as such objectivity might not actually exist in the physical world; in that case I refer to the mean of multiple observations. It may not accurately replicate the real world, in the same way that words don’t replicate any actual thing except the words themselves…

Any time travel can only be done through direct manipulation of the relative position and vector of atoms (and perhaps antimatter). A “perfect” quantum computer would also be a small time machine: it would reset the physical states of matter to behave in exactly the same way and end up with exactly the same result. This “perfect” quantum computer might not be useful in the same way our leaky, imperfect quantum computers are. In this imaginary “perfect” quantum computer, the matter behaves like the encoded data of classical computers. With our classical computers we can re-run the same Excel formula and arrive at the same result: a time machine for our encoded data, but not for the matter that flows through the CPU and RAM inside the classical computer.
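A seeded pseudo-random generator is a humble everyday version of this: resetting the seed is a "time machine" for the encoded state, even though the physical electrons involved are never the same twice. A minimal sketch:

```python
import random

def run_trajectory(seed, steps=5):
    """Replay a 'random' trajectory from a saved starting state."""
    rng = random.Random(seed)            # encode the starting microstate
    return [rng.random() for _ in range(steps)]

first_run = run_trajectory(seed=2024)
rewound = run_trajectory(seed=2024)      # reset the encoded state and replay
assert first_run == rewound              # identical trajectory every time
```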

The good news is that this means you can infinitely reset the system to a specific state. You won’t ever need to worry about encountering another copy of yourself as long as you can control the external access to a universe. If you mess up, you just leave the universe, reset it, and try again. The only new “variables” are the experiences that you had messing up the universe on your nth try.

It might seem far-fetched right now, but I could see applications where the data storage we need exceeds the number of atoms in our visible universe. That is an incomprehensible number (10^80 at least: think 20,800,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 copies of all the digital data humans have ever stored (~600 exabytes), or the information entropy of 1000 black holes). But it is still a pretty small number considering the vastness and emptiness of space. Fortunately, it may be possible to store a compressed, encoded copy of our universe within our universe. It all depends on how much information we can figure out how to store and retrieve inside a single quantum system. It would be exciting if we could store multiple copies of the state of our universe within a single atom, but currently our limit seems to be 1 bit per atom, or 2 bits with superdense coding. However, we have also figured out how to encode exponentially compressed information within entangled states, and that gives me some hope for the future.
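The comparison checks out arithmetically, taking ~1e80 atoms at one bit each against an assumed ~600 exabytes of total human-stored data:

```python
ATOMS = 1e80                    # rough atom count of the visible universe
HUMAN_DATA_BITS = 600e18 * 8    # ~600 exabytes, in bits (assumed figure)

copies = ATOMS / HUMAN_DATA_BITS
print(f"{copies:.3g} copies of everything humans have ever stored")
```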


The transition between origin and observed destination is not important in the classical sense. Objects can never be measured without observation, so it doesn’t matter to anything where they actually are. In a “perfect” wormhole scenario a particle will only be displaced: a modification of relative position without any change of velocity vectors. Things that happen early on can significantly shape where this particle ends up. By the end it could be at the top of the universe or the bottom, the left edge or the right, spin up or spin down. Internal momentum and relative location are equally important, but amplified by context.

For the human observer, the transition between origin and destination matters in a much more specific way. Humans could only safely transition through a wormhole, or between past and present, if they could be displaced by the same amount. To travel to the future we would have to compute those future microstates via accelerated simulation. If we haven’t recorded the previous microstate we would like to travel to, we would need to calculate it in the same way. It might seem impossible, but it is theoretically feasible.

But more related to the work I’m currently doing: in many ways our experience of a place is predetermined by the path we used to get there. This is most applicable when we are aware of our surroundings, like when we walk. Where we walk determines how we feel and what we think when we arrive at our destination. It can affect our whole interaction with that environment. It shouldn’t be thought of as “setting a precedent” or anything like that. It is more like making a small change in the environment, or within an event model, to end up at a different destination.

Time is not a recurrence of multiple things. Time is only the single continuous process of all particles moving. Time is meaningless without velocity. Thoughts cannot be processed without particles moving. Nothing can be measured, understood, or changed without velocity and the relative positioning of particles. The “bootstrap paradox” does not exist, because information can only be sent back in time by direct manipulation of the environment.

It’s no more possible to “rewrite the past” than it is to create matter out of thin air. If we could create atoms willy-nilly, then we could definitely break both of these rules, because they are actually the same rule. Time only exists within space. Time does not actually exist by itself, and if it did, it wouldn’t make a difference to us at all, because we are bound within space. Bound within space, but not time, because time is merely the relative position and velocity of space. Atoms don’t appear out of nothing; they ‘know’ their origin because it is simply the velocity vector of their relative position. If a perfect wormhole modifies the relative position, then the object has no change in its absolute information; the only change is a change in relative information.