## The Premise

Real systems aren’t linear: they have characteristic scales below which costs stay fixed and above which they become astronomical.

## The Goal

Extend/Restrict the Minion Machine to capture what it means to operate at “optimal scale”.

## The Concept

Define a Multi-Minion Machine as a Minion Machine with the following changes:

1. There is one minion for each bin (and thus each object) (M = N)
2. Minions never move; they just shoot objects to other minions.
3. The N objects are arranged in a ring of radius R, so “1” is next to “N”.
4. The objects travel on independent tracks of size r << R, so they don’t collide, but travel effectively the same distance to a given bin.

## The Model

Assume the minions are smart enough to figure out the optimal route from one bin to another. Instead of specifying a distance, we can thus just specify a destination (and not have to worry about ‘overflow’ or ‘underflow’).

Our primitive commands need only specify the initial (b_i) and final (b_f) bins, giving a size of:

``S1 = 2 log(N) := 2 k ``

All other quantities are the same, except that the average distance d will be less (half?) due to the ring topology.
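That reduction can be checked numerically. Averaging over uniformly random bin pairs (a proxy for an actual command stream), the ring’s mean distance comes out to about three quarters of the line’s:

```python
from itertools import combinations

def avg_line_distance(n):
    """Mean |i - j| over all distinct bin pairs on a line of n bins."""
    pairs = list(combinations(range(n), 2))
    return sum(abs(i - j) for i, j in pairs) / len(pairs)

def avg_ring_distance(n):
    """Mean shorter-arc distance over all distinct bin pairs on a ring."""
    pairs = list(combinations(range(n), 2))
    return sum(min(abs(i - j), n - abs(i - j)) for i, j in pairs) / len(pairs)

n = 256
print(avg_ring_distance(n) / avg_line_distance(n))   # ~0.75 for large n
```

(For large n the line average approaches n/3 and the ring average n/4, so the ratio tends to 3/4.)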

## Metrics

Let us use bold characters to represent an action tuple (E, t) whose norm is E times t. For example, operation L has the action A_L = (E_L, t_L). The action of our system can be decomposed into C for the communicator and M for movement.

If solving the puzzle requires n commands of size S1 and average distance d, we can write our action as:

``A0 = n S1 C + n M(d) ``
`[Errata: parallel operations could complete in a time proportional to max(d), independent of n. There is a complex dependency on the relative values of C_t and M_t which I overlooked].`
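The tuple algebra can be sketched directly. In the snippet below the values of C, M, n, and d are illustrative (not fixed by the model), and addition and scaling are componentwise, following the decomposition above:

```python
from dataclasses import dataclass

@dataclass
class Action:
    """An action tuple (E, t) whose norm is E * t."""
    E: float
    t: float

    def __add__(self, other):
        return Action(self.E + other.E, self.t + other.t)

    def __rmul__(self, k):          # scalar multiple k * (E, t)
        return Action(k * self.E, k * self.t)

    @property
    def norm(self):
        return self.E * self.t

# Illustrative constants (assumed, not from the model):
C = Action(E=1.0, t=0.5)            # per-bit cost of the communicator

def M(d):                           # movement: fixed energy, time ~ distance
    return Action(E=4.0, t=d)

n, S1, d = 16, 6, 10.0              # S1 = 2 log2(8) = 6 bits for N = 8
A0 = (n * S1) * C + n * M(d)        # serial action of the command stream
print(A0, A0.norm)
```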

Now we can ask: would higher order commands reduce the action?

To start, let us introduce a program with per-command cost T that interprets a command as a transposition instead of a move. For example, if N = 16, the command 0x1f is split into 0x1f and 0xf1 and executed in parallel.

For a set of disjoint transpositions that would normally take n moves to solve, the action is now:

``A1 = n/2 S1 C + n/2 M(d) + n/2 T ``

For this case, it is a net win when (substituting k = log(N) = S1 / 2):

``T < 2 k C + M(d) ``

which holds for sufficiently large k, since the right-hand side grows with k.
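The win condition can be checked mechanically. The sketch below compares the tuples componentwise, which is one possible reading (and a sufficient one, since shrinking both E and t shrinks the norm); the numbers are illustrative:

```python
import math

def transpositions_win(T, C, M_d, N):
    """True when T < 2k C + M(d) with k = log2(N), compared componentwise.

    T, C, M_d are (energy, time) pairs. Componentwise comparison is an
    assumption: if both components shrink, the norm E * t shrinks too.
    """
    k = math.log2(N)
    bound_E = 2 * k * C[0] + M_d[0]
    bound_t = 2 * k * C[1] + M_d[1]
    return T[0] < bound_E and T[1] < bound_t

# Illustrative numbers: a cheap interpreter wins, an expensive one does not
print(transpositions_win(T=(2.0, 1.0), C=(1.0, 0.5), M_d=(4.0, 10.0), N=8))  # True
```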

However, that advantage only holds for disjoint permutations. Conjoined permutations (e.g., cycles) take the same number of steps as before, but must now pay the penalty T.

To solve that, we could replace T with a program L that describes loops (cycles) rather than mere transpositions. This gives us, for all (?) permutations:

``A2 = n/2 S1 C + n/2 M(d) + n/2 L ``

with a similar constraint:

``L < 2 k C + M(d) ``

## Interpretation

A particular command/program specification can be interpreted as a “strategy”.

For example [as Christy suggested], imagine two players Satan and God.

1. Each of them is given a Multi-Minion box for which they devise a fixed strategy behind closed doors.
2. When the curtain comes up, Satan & God get to see each other’s strategies.
3. Satan secretly feeds commands into his box to entangle a set of balls.
4. Those balls are teleported into God’s box, where he must dis-entangle them.

Every command costs some number of “action points” (great name, Christy :-). The winner is the player who spends the fewest action points.

This leads to a number of interesting questions:

## Questions

• Are there optimal strategies for God and Satan? Is the optimal strategy the same for both players? Is there a meta-strategy for which commands Satan should use, after finding out God’s strategy?
• Does one player have an intrinsic advantage in this case? What about the case where the entanglement isn’t simple permutations, but some NP-complete problem?
• How should we calculate the per-command cost P for the program used to implement the strategy? Naively, L ought to be bigger than T, but by how much? Can we break all possible strategies down into a “basis” of simpler components, allowing cost comparisons between them?
• Do any of these results change in interesting ways if we add baseline costs for any of the elements?

## Conclusion

I’m not sure if we learned anything about scale, but we did develop a useful concept of strategy. It also implies that the action (which is perhaps closer to “difficulty” rather than mere “complexity”) depends on interactions between the instruction set chosen and details of the input vectors.

Then again, maybe that is why we have different scales: to allow optimal instruction sets for different levels of representing a problem…

## The Minion Machine

August 9, 2013

# The Action of Complexity

## The Premise

Using concepts derived from physics such as Energy and Time, we can gain insight into the nature of computational complexity.

## The Goal

Devise the simplest possible physical system that captures the aspects of computation relevant to complexity theory.

## The Concept

We are given a box containing:
1. A set of N distinguishable objects each of which occupies one bin in a linear array, also of size N.
2. An army of M “minions” that can move those objects (with some cost in energy and time)
3. Some mechanism for communicating with the minions (which also costs energy and time)
4. Some reliable way to measure both energy and time

The goal is to move the objects from an initial ordering I to final ordering F while consuming the least amount of time and energy. Importantly, the only way to accomplish this task is by giving commands to the minions. Minions only understand commands of the form “Minion – Starting Bin – Direction – Distance”.
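Such a command stream is easy to simulate. A minimal sketch, assuming each move targets an empty bin (stricter than necessary, since the model below ignores collisions) and ignoring the minion id, which any minion could fulfill:

```python
def execute(bins, commands):
    """Apply primitive Minion Machine commands to a linear array.

    bins[i] holds the object in bin i (None if empty). Each command is
    (minion, start_bin, direction, distance); the minion id is accepted
    but unused in this sketch. Returns (number of commands, total
    distance traveled) for use in the metrics.
    """
    total_distance = 0
    for _minion, start, direction, distance in commands:
        dest = start + direction * distance          # direction is +1 or -1
        assert 0 <= dest < len(bins), "move would leave the array"
        assert bins[start] is not None, "no object in the starting bin"
        assert bins[dest] is None, "destination bin must be empty"
        bins[dest], bins[start] = bins[start], None
        total_distance += distance
    return len(commands), total_distance

# Swapping two objects through a spare empty bin takes three moves:
state = ["A", "B", None]
n_steps, dist = execute(state, [(0, 0, +1, 2), (0, 1, -1, 1), (0, 2, -1, 1)])
print(state, n_steps, dist)      # ['B', 'A', None] 3 4
```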

## The Model

We start by making a number of simplifying assumptions. These can be revisited later as needed.

1. The energy required for the minions to live and move themselves is either negligible or from an external source. The only energy we care about is that required to i) move the objects and ii) communicate with the minions.
2. All objects have significant mass (so it takes energy to move them) but negligible size (so we don’t need to worry about collisions).
3. All objects have the same mass m0 and top speed v0. The array has negligible friction, and the distance between bins is very large compared to the distance required to accelerate to top speed. This allows us to assume that moving any object from one bin to another takes the same amount of energy (to accelerate & decelerate):

E0 = m0 v0^2

but a varying amount of time, proportional to the distance x:

t = x / v0

4. The communicator uses something like FM modulation, which requires energy E_c and time t_c, both proportional to the dimensionless size S of the command, e.g. in bits:

E_c = a S

t_c = b S

The sizes N and M are fixed, so we can specify that all primitive commands use a fixed-width bitfield of size S0:

S0 = log(M) + log(N) + 1 + log(N)
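That bitfield is straightforward to construct. A sketch, rounding each field up to a whole number of bits; the `pack` encoder is a hypothetical layout, not part of the model:

```python
import math

def command_size(M, N):
    """S0 in bits: minion id + starting bin + direction bit + distance."""
    return math.ceil(math.log2(M)) + 2 * math.ceil(math.log2(N)) + 1

def pack(minion, start, direction, distance, M, N):
    """Pack one command into an integer bitfield of width command_size(M, N)."""
    bits = math.ceil(math.log2(N))
    assert minion < M and start < N and distance < N
    word = minion
    word = (word << bits) | start                       # starting bin
    word = (word << 1) | (1 if direction > 0 else 0)    # direction bit
    word = (word << bits) | distance                    # distance
    return word

print(command_size(M=8, N=8))     # 3 + 3 + 1 + 3 = 10 bits
```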

## Metrics

Say that it takes n steps to obtain the desired order. The distance traversed by each step is given by x_i, which can be summed and divided by n to get the average distance d.

Assuming serialized movements with no latency between them gives:

Energy = n a S0 + n E0 = n (a S0 + E0)

Time = n b S0 + n d / v0 = n (b S0 + d/v0)

We can multiply these to get the action:

Action = Energy * Time = n^2 (a S0 + E0)(b S0 + d/v0)

Since S0 is dimensionless, we can pull all the dimensions into a new constant h, which is the per-bit action of the communicator:

h = a b

giving us new dimensionless constants:

e = E0 / a

f = d / (b v0)

allowing us to write:

Action = n^2 h (S0 + e)(S0 + f) = n^2 h (S0^2 + (e + f)S0 + e f)
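The factorization can be sanity-checked numerically. A sketch with illustrative constants (the values of a, b, E0, v0, d, n, and S0 are arbitrary; only the algebra is being verified):

```python
# Illustrative constants (assumed, not from the model):
a, b = 2.0, 0.5                  # energy and time per bit of communication
E0, v0, d = 8.0, 4.0, 40.0       # move energy, top speed, average distance
n, S0 = 10, 10.0                 # number of steps and command size in bits

energy = n * (a * S0 + E0)       # Energy = n (a S0 + E0)
time = n * (b * S0 + d / v0)     # Time   = n (b S0 + d/v0)
action = energy * time

h = a * b                        # per-bit action of the communicator
e = E0 / a                       # dimensionless energy ratio
f = d / (b * v0)                 # dimensionless time ratio

factored = n**2 * h * (S0 + e) * (S0 + f)
expanded = n**2 * h * (S0**2 + (e + f) * S0 + e * f)
print(action, factored, expanded)   # all three agree
```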

## Interpretation

The action can be interpreted as a measure of the effort required to ‘disentangle’ a system from an initial ordering I to final ordering F.

The constant e is the ratio between the energy required for each step of movement (E0) and that for each bit of control (a).

The constant f is the ratio between the average time required for each step of movement (d/v0) and that to send each bit of control (b).

This tells us that the effort is primarily determined by:

• Movement, when e + f >> 1
• Control, when e + f << 1
• Energy, when e >> f
• Time, when f >> e

While those are perhaps obvious, this model also provides a precise way to measure the effort (action) in intermediate cases where e and f are comparable to 1 and each other. It also gives us a mathematical formalism that can be used to minimize the action when varying some of the constants or extending the action.

## Questions

• Is the action the right way to combine E and t? What are the alternatives, and their advantages and disadvantages?
• Right now having more minions doesn’t help (or hurt). What happens if we include their energy cost, but allow them to perform actions in parallel? What if we are allowed (at some cost) to send the same command to multiple minions at once?
• What if the energy cost is dependent on the distance between bins, rather than constant?
• What is the physical interpretation of h e f, the “pure movement” action d E0 / v0?

## Conclusion

This model leads to a natural and interesting definition of action for computational systems that bears some interesting similarities to the idea of ‘complexity’. To flesh this out, however, would require a mechanism for encoding (and costing) higher-order algorithms such as “sort the array”, rather than merely “move these objects between bins”.

## Summary: Retrospective Thoughts on BitC

In my opinion, BitC is the most innovative take on systems programming we’ve seen since the invention of C. Though I am sad that it failed, I am deeply impressed by the thoughtful post-mortem by Jonathan S. Shapiro. Here are links to the various threads of his analysis (and the equally thoughtful responses):

## The Celebration-Driven Church

[A follow-on to Spreading Effective Vision and The Agile Church, addressed specifically to the Church Spread of Kingsway Community Church.]

In less than twelve months, together with the Holy Spirit, we have completely reinvented Kingsway Church.  While our overall numbers may be the same, we have spread to two new neighborhoods, dramatically expanded our pastoral staff, and filled much of our congregation with renewed vision for reaching our communities.

What if that was just the beginning?

## Four Myths of Software Evolution

As software developers, we know that our systems will evolve with time. We must understand the forces that drive this evolution, in order to design systems that are easily evolvable. Unfortunately, many programmers have misconceptions about the real drivers of software evolution and about their own ability to predict how a system will evolve.

In my many years of experience as a professional software developer I had the luck to work on many projects from the start.

Great summary of what "Designing for the Future" really involves.

October 11, 2012

While discussing The Agile Church and Metrics versus Goals, I realized that our organization’s primary motivation for adopting Agile practices is to spread the ownership of effective vision.

That is, we start with a shared belief that vision ought to be:

1. Effective: timely, clear, actionable & aligned with the organization’s overall purpose
2. Owned: each person takes responsibility for how they implement that common vision

Working from there, we can adapt techniques from, e.g., Scrum, that will help our organization achieve that goal.

Here are what I consider the most powerful suggestions:

1. Adopt a mindset of continuously developing and implementing new visions
2. Care about improving how we do things, not just what we do
3. Maintain a written backlog of “things worth doing/changing”
4. Innovate in seasons of 4-8 weeks, tied to, e.g., a sermon series
5. The leader (pastor) prioritizes 1-3 items from the backlog to focus on each season
6. The team owns the vision (together) and its implementation (individually)
7. Define the conceptual goal and practical metrics in terms of the value delivered to the customer (e.g., God)
8. At the end of each season, celebrate what was accomplished (“thanksgiving”) and reflect on what did or did not go well (“confession”)

To me, the key is moving from strategic once-a-year vision-and-budgeting meetings for leaders towards tactical “sprints” that mobilize the entire organization (congregation).

This pace may sound a bit exhausting, but that very awareness forces us to alternate “productive” and “relaxing” sprints to keep the whole community healthy. It is already too easy to fall into ruts where some people never do much while others are continually burning themselves out.  A good process should make explicit important issues that were previously implicit, so we are forced to consciously manage them.

## Metrics versus Goals

October 5, 2012

As a followup to my post on the Agile Church, our elder’s board is having a spirited discussion of the appropriate role of metrics and goals when leading a church. My perspective is that the main purpose of SMART Goals is to inspire operational metrics that enable continuous innovation.

In other words, knowing where we want to go is essential for prioritizing what we want to do; but, in an Agile world, we won’t know where we should be going until we get there.

To that end, I am deeply indebted to KISS Metrics for their article on How to Use a Single Metric to Run Your Startup.

“The One Metric That Matters (or OMTM) is a single number that you care the most about at the current stage of your startup (the OMTM will change).”

1. It answers the most important question you have.
2. It forces you to draw a line in the sand and have clear goals (defining success).
3. It focuses the entire company.
4. It inspires a culture of experimentation.

In particular, I love the point that “A rate or ratio is better than an absolute or cumulative value.”  Agile is all about improving velocity — doing better work with the resources we have — not contorting ourselves to reach arbitrary goals.

To be sure, finding the right metric for a non-profit is a perilous endeavor, in that the wrong choice can be devastating to individuals and the organization; of course, that is also true of for-profit metrics! However, there is a growing body of research from impact investing that demonstrates the enormous value that is created when you do find the right metric.

The reality is that churches already measure crude metrics such as attendance and giving. Either we ignore them as irrelevant (at our peril), or we focus only on those (which could be worse).  I believe we owe it to God as our customer to wisely discern how  He wants our churches to grow in each “season”, and identify metrics to keep us accountable. Some possible metrics include:

• First-time guests
• New baptisms
• Number of regular, tithing attenders