So far, storage for the system matrix seems to be the limiting factor. In my experiments I assume that there are 160 non-zeroes per column. This means 160 inputs+outputs per production unit on average.
The latest idea I've had for maximizing the size of the matrix is to use multiple M.2 drives, because motherboards currently max out at 12 TiB of RAM. With something like the X11QPL motherboard you can fit 22 NVMe drives of 8 TiB each, which together with the RAM comes to 188 TiB in total. This translates to around 156,000,000,000 variables.
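A rough version of the arithmetic behind that estimate (the 8 bytes per stored non-zero is an assumption for illustration, e.g. a 4-byte value plus a 4-byte row index; the actual storage format may differ):

```python
# Back-of-the-envelope estimate: how many columns (production units / variables)
# fit in a given amount of storage, assuming a fixed number of non-zeroes per
# column and a fixed byte cost per stored non-zero. These constants are
# assumptions for illustration, not the exact format used in the experiments.
TIB = 2**40

ram_bytes   = 12 * TIB           # motherboard RAM ceiling mentioned above
nvme_bytes  = 22 * 8 * TIB       # 22 NVMe drives of 8 TiB each
total_bytes = ram_bytes + nvme_bytes   # = 188 TiB

nnz_per_col   = 160              # non-zeroes per column (inputs + outputs)
bytes_per_nnz = 8                # e.g. 4-byte value + 4-byte row index

columns = total_bytes // (nnz_per_col * bytes_per_nnz)
print(f"{total_bytes / TIB:.0f} TiB -> ~{columns:,} columns")
# prints roughly 161 billion, the same ballpark as the 156e9 figure above
```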
is 160 a realistic estimate?
my first thought (which may not be possible) is to split up commodities as follows:
if there are more than some number X of inputs needed (say 20), you could give the commodity Y intermediate steps, even if these intermediate steps don't actually reflect anything real.
so if it takes 110 inputs to make a commodity (call it C), you could have inputs 1-20 go into making C_part_a, then C_part_a + inputs 21-39 go into making C_part_b . . . until you arrive at the true final commodity, C.
This would increase the size of the IO matrix by at least 3 to 5 times, but it would be much sparser.
I got the number 160 from Paul's work.
The splitting you describe doesn't help with the number of non-zeroes, which is what matters. In fact it makes things worse, since you have to add a bunch of ones and minus ones to chain those parts together. You can, however, aggregate multiple workplaces of the same type together, which works as a kind of preconditioner; Kantorovich writes about this. Fully disaggregated planning is best, however.
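For illustration, aggregating same-type workplaces could look something like this (a minimal sketch with made-up numbers and a simple capacity-weighted average; not necessarily how Kantorovich or the experiments above do it):

```python
import numpy as np

# Toy illustration of aggregating workplaces of the same type: several plants
# producing the same good, each with its own input/output column, get merged
# into one capacity-weighted "average" column. This shrinks the number of
# columns (variables) without touching the number of goods (rows).
# Numbers and the weighting scheme are invented for illustration.

# columns = individual plants of the same type, rows = goods (net flows per unit activity)
plants = np.array([
    [ 95.0, 100.0, 90.0],   # output of the good they all produce
    [-10.0, -12.0, -9.0],   # input 1 consumed
    [ -3.0,  -2.0, -4.0],   # input 2 consumed
])
capacity = np.array([50.0, 80.0, 20.0])  # relative sizes of the plants

# one aggregated column, weighted by capacity
aggregated = plants @ capacity / capacity.sum()
print(aggregated)   # roughly [97.0, -10.9, -2.6]: one column instead of three
```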
Also how do I quote multiple people on here?
I wanna thank you for your info seeing as I am really ignorant of a lot of the challenges and particulars of this.
I'm a little confused as to what you mean by it making the problem worse?
I'm a visual person so i made a fake IO table
each column is an output and each cell tells you how many units of the input it takes to make 100 units of the output.
        corn  iron  coal  labor
corn       1     0     0      0
iron      10     5     2      0
coal       1     2     1      0
labor     10    10    10      0
so to represent the vector for corn you would have 4 non-zero inputs
but if we split it into parts so that we have an intermediate step called "precorn" which takes the coal, corn, and labor inputs and then redefine "corn" so it takes only the iron and precorn inputs, we get a vector for corn with only 2 non-zero inputs.
          corn  precorn  iron  coal  labor
corn         0        1     0     0      0
precorn    100        0     0     0      0
iron        10        0     5     2      0
coal         0        1     2     1      0
labor        0       10    10    10      0
Q: is the issue that now we have even more vectors to consider, even though the percentage of zeroes in the matrix has gone up and the raw number of non-zero cells per column has gone down?
As a side thought, is there a reason to stick with IO formatting?
I don't know how trees are stored in memory, but they are quite the popular way of visualizing production in the game Factorio (link to a very popular tool)
I assumed (though I may be mistaken) that the data is stored similarly to this Sankey diagram tool:
// Enter Flows between Nodes, like this:
//   Source [AMOUNT] Target
Wages [1500] Budget
Other [250] Budget
Budget [450] Taxes
Budget [420] Housing
Budget [400] Food
Budget [295] Transportation
Budget [25] Savings
Budget [160] Other Necessities
I guess that the entire process for the New Harmony Algo depends on it being matrices of the IO format so this idea might just be dead before it even gets going.
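For what it's worth, that kind of source/target/amount edge list is essentially a sparse matrix in coordinate (COO) form, so the two views hold the same data. A minimal sketch assuming scipy (names and numbers taken from the flow example above):

```python
from scipy.sparse import coo_matrix

# The Sankey-style edge list (source, target, amount) is the same data as a
# sparse matrix in coordinate (COO) form: one (row, col, value) triple per flow.
flows = [
    ("Wages",  "Budget", 1500),
    ("Other",  "Budget",  250),
    ("Budget", "Taxes",   450),
    ("Budget", "Housing", 420),
]

# index the node names
nodes = sorted({n for s, t, _ in flows for n in (s, t)})
idx = {name: i for i, name in enumerate(nodes)}

rows = [idx[s] for s, _, _ in flows]
cols = [idx[t] for _, t, _ in flows]
vals = [amount for _, _, amount in flows]

A = coo_matrix((vals, (rows, cols)), shape=(len(nodes), len(nodes)))
print(A.toarray())   # the same flows, now as a (mostly zero) matrix
```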
If you look closely you'll see that the number of non-zeroes has gone from 10 to 11, in addition to the number of variables increasing from 4 to 5. Sure, the density is lower, but that doesn't really help. In fact, operations on a dense matrix (no zeroes at all) with the same number of non-zeroes as a sparse one will always be faster.
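A quick way to check those counts, assuming scipy (this just rebuilds the two toy tables above and asks for the number of stored non-zeroes):

```python
import numpy as np
from scipy.sparse import csc_matrix

# the original toy table: columns = corn, iron, coal, labor processes
original = csc_matrix(np.array([
    [ 1,  0,  0, 0],   # corn
    [10,  5,  2, 0],   # iron
    [ 1,  2,  1, 0],   # coal
    [10, 10, 10, 0],   # labor
]))

# the split version with the artificial "precorn" step added
split = csc_matrix(np.array([
    [  0,  1,  0,  0, 0],   # corn
    [100,  0,  0,  0, 0],   # precorn
    [ 10,  0,  5,  2, 0],   # iron
    [  0,  1,  2,  1, 0],   # coal
    [  0, 10, 10, 10, 0],   # labor
]))

print(original.nnz, split.nnz)   # 10 11 -- the splitting added a non-zero
```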
Both Dave Zachariah and I are of the opinion that newcomers to the field of planning should not be taught IO tables at all, because it leads to confusion when you sit down and actually try to write linear programs for this stuff; real systems are rectangular, not square. It's easier to just think in terms of linear programs in general: constraints and flows.
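To make the "constraints and flows" view concrete, here is a minimal sketch of the same toy numbers written directly as a linear program with scipy; the labor budget, the 50-unit corn target, and the choice to minimize labor are made-up illustration, not anyone's actual model:

```python
import numpy as np
from scipy.optimize import linprog

# Thinking in constraints and flows: x[j] = how many batches process j runs,
# where each batch produces 100 units of the process's own good and consumes
# the inputs from the toy table above. In general there can be more processes
# than goods (several recipes for the same good), so the constraint matrix is
# rectangular rather than square.
use = np.array([           # rows = goods consumed, columns = processes
    [ 1,  0, 0],           # corn consumed by the corn/iron/coal processes
    [10,  5, 2],           # iron consumed
    [ 1,  2, 1],           # coal consumed
])
make = 100 * np.eye(3)     # each process outputs 100 units of its own good
net = make - use           # net flow of each good per batch

labor  = np.array([10, 10, 10])   # labor per batch of each process
target = np.array([50, 0, 0])     # want >= 50 net corn, no net deficit of iron or coal

# minimize total labor subject to net output >= target (written as -net x <= -target)
res = linprog(c=labor, A_ub=-net, b_ub=-target, bounds=(0, None))
print(res.x, res.fun)      # batches to run of each process, total labor used
```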
I guess I'm one more data point to add in support of that hypothesis.
Could you recommend me some other literature on different ways of planning I could sink my teeth into?
Well, you have Kantorovich's book for one (The Best Use of Economic Resources), but I haven't read it in full yet. cibcom.org's latest text is perhaps a good way to start: Mathematics to plan an economy
never heard of cibcom before! thank you!
@joe They put out a lot of great content related to cyber communism, though the majority of their work is in Spanish.
Also how do I quote multiple people on here?
Ah, I almost forgot to answer this question. The quoted text is simply a blockquote format with someone's @ and the text they wrote. So, if you want to quote someone else, hit the blockquote button, include someone's @ and then their text. Like this.
There have to be at least semi-straightforward ways to do this. Walmart and Amazon centrally plan much of their enterprises and companies have been vertically integrating since capitalism first developed.
Hi, APICS-certified supply chain management person here (albeit I got the cert first and still need to get work experience). The short answer to this question is that you're a hundred years out of date. Vertical integration as the sole path par excellence to greater efficiency reached a peak in the middle of the twentieth century, after which it hit a dead end. You should instead read more about how horizontally integrated global value chains are coordinated via various systems of information-sharing, regulation, negotiation, and coercion by their nuclear firm.
Now, here's the longer answer.
The archetype of vertical integration is the midcentury vertically integrated Fordist oligopolistic industrial conglomerate (or its equivalents in other countries), which at least aspirationally tried to contain all the functions of its supply chain within a single firm or umbrella firm and then manage these through a top-down centralized command and control system where the departments all respond directly to the central corporate office. The problem with this is one of information-flow and specialization; it is extremely difficult, at the scale and levels of industrial complexity we're talking about, to really maintain quality or even output when, say, your massive department -- basically a megacorp in itself -- is being micromanaged constantly by a distant bureaucratic center that barely understands any of your product lines and sets impossible targets you can't reach based on goals you don't share or even fully comprehend from your position. Now, when everybody was drinking this Kool Aid, it wasn't that big of an issue because there was no external competition that was substantially better -- and all of this worked well enough that you could run a system of heavy industry adequately well with it, though many consumer products (even high-end ones) remained pretty shoddy by our standards.
This paradigm began to break down in the latter part of the twentieth century, much of it spurred by innovations implemented first by Japanese engineers like Shigeo Shingo and eventually consolidated into the famous Toyota Production System, which proceeded to be copied by capitalist firms around the world. There are many aspects of this transformation, but as far as the question of integrating operations goes, it is mostly characterized by the overshadowing of vertical integration by horizontal integration in capitalist industrial planning.

That is, rather than seeking to create a single industrial conglomerate that contained the entire supply chain within itself, capitalists in control of the dominant oligopolies began more and more starting in around the 1980s to shed whole departments and companies-within-the-company, outsource the functions they formerly did internally, and engage in long-term business alliances with the companies that fulfilled those functions instead. (Note that this outsourcing is only of necessity "out" relative to the company itself, not necessarily outside the nation the company operates within; though in practice, outsourcing often did involve finding cheaper suppliers in the periphery of the capitalist world-system.) The criterion by which they decide whether to keep something in-house vs. outsource it is whether the function is (as post-1990s theory has called it) a core competency, i.e. something that the firm wants to be the best in the world at and compete on the basis of. So if you're Walmart, you can ask -- is our accounting department a core competency, or is it something that we don't care about as long as it gets done? If not, you outsource it to an accounting company that does literally nothing else and so is the best in the world at it, avoiding many of the problems that the midcentury vertically integrated Fordist oligopolies had trying to be the jacks of all trades and the masters of none.

This does not, however, mean the supply chain is unregulated and unplanned. Indeed, supply chains are actually ranked in the management literature by their "maturity," i.e. their degree of integration and coordination, with the highest-ranked being those that function as one ruthlessly efficient machine. However, these supply chains achieve their coordination not through vertical integration (legal incorporation as one entity) but rather horizontal integration, where one dominant firm called the nuclear firm sets terms for everyone up and down from them in the supply chain and takes a leading role in promoting the practices of information-sharing, rules-setting, and discipline which allow the supply chain to reach new levels of efficiency, product quality, output, and ultimately profit for everybody involved (though ofc the nuclear firm tends to take the biggest cut in the end). These horizontally integrated supply chains are not individual firms but alliances of firms (indeed, the business people often call them alliances!), where vassal firms are allied to an imperial power that controls the chain. Think Foxconn vis-a-vis Apple, or dairy and poultry farmers vis-a-vis McDonald's.
The dominant capitalist firms at the heart of the great imperial supply chains tend to enforce their will upon their vassals through incentive structures of great sophistication and inventiveness -- it's shocking how much of my training has involved how to set up things like certification systems, which one might suspect are bullshit but in fact seem to be central to how supplier relations are managed -- but ultimately the buck stops with market power, and the ability to put the squeeze on people (whether suppliers up the chain or competitor chains) via the old methods of oligopolistic competition. An interesting upshot of this is that all capitalist competition nowadays, at least at the commanding heights of the economy, is not really between individual firms but between supply chain alliances of firms coordinated by the great powers of the various industries, so to speak.
There's more to it than this, but I don't have time to get into it here. Knowledge of this stuff can be found in the supply chain literature (academic journals in management science/industrial engineering, APICS exam materials, etc), business history, and the "global value chains" literature in international political economy.
To follow up on this (sorry, ran out of time to write), you'll want to look up the following tools used by supply chain managers and industrial engineers to plan production:
- General supply chain management tools commonly used by big firms, such as:
  - ERP systems (these are especially important, as they basically integrate input-output principles and have evolved iteratively over decades to manage very large individual supply chains -- don't reinvent the wheel)
  - As well as older MRP systems from which they're descended (see the sketch after this list)
  - Value stream mapping
  - Warehouse management systems
  - Decision-support systems, especially the new AI-powered ones
    - Interestingly, the current best practices explicitly advise you not to automate decision-making processes, but rather to use AI to whittle down the menu of options and make suggestions. They want the buck to stop with a human being or committee of them.
  - Common supply chain KPIs
- The basic frameworks for how to organize supply chains based on your end demand goals and competitive strategy. These have different names depending on the level of abstraction you're talking about, so I'll try to split them by layer (usually one item in each layer corresponds to an item in each of the other layers)
  - Competitive strategy
    - Lowest cost competition
    - Differentiation competition
    - Niche/focus competition
  - Supply chain strategy
    - Lean/efficient supply chain
    - Agile/responsive supply chain
  - Operational / production process design
- Cybernetic and systems theory concepts with no clear classification, like the all-important theory of constraints (TOC) for managing and preventing bottlenecks, or the dreaded bullwhip effect, which will get your ass good if you don't dedicate your whole life to preventing it (see the small simulation sketch at the end of this post)
- Project management frameworks and tools like:
- Quality management and process improvement frameworks like:
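To give a flavor of the ERP/MRP item above, here is a minimal sketch of the core calculation those systems grew out of: a bill-of-materials "explosion" from end-item demand down to component requirements (the product structure and quantities are invented for illustration):

```python
# Minimal MRP-style bill-of-materials explosion: given demand for a finished
# good, walk down the product structure and total up gross requirements for
# every component. Real MRP also handles lead times, lot sizing and on-hand
# inventory; this only shows the requirements roll-up. Data is made up.
from collections import defaultdict

# bill of materials: parent -> list of (component, quantity per parent)
bom = {
    "bicycle": [("frame", 1), ("wheel", 2), ("chain", 1)],
    "wheel":   [("rim", 1), ("spoke", 36), ("tire", 1)],
}

def explode(item, qty, requirements):
    """Recursively accumulate gross requirements for an item and its components."""
    requirements[item] += qty
    for component, per_parent in bom.get(item, []):
        explode(component, qty * per_parent, requirements)

requirements = defaultdict(float)
explode("bicycle", 100, requirements)   # plan for 100 bicycles
for item, qty in sorted(requirements.items()):
    print(f"{item:>8}: {qty:g}")
# e.g. spokes come out to 100 * 2 * 36 = 7200
```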
Hope you find this helpful. Mods can feel free to make this a wiki if they think it would be useful.
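And as an aside on the bullwhip effect mentioned in the list: a minimal simulation sketch (made-up numbers and a deliberately simple order-up-to rule with exponential-smoothing forecasts; real models differ) showing how order variability can grow at each echelon away from the end customer:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200
demand = 100 + rng.normal(0, 5, size=T)   # fairly steady end-customer demand

def echelon_orders(incoming, lead_time=2, alpha=0.3):
    """One supply chain echelon using an order-up-to policy with exponential
    smoothing of demand; its forecast corrections amplify the variability of
    the orders it passes upstream."""
    forecast = incoming[0]
    inventory = lead_time * incoming[0]        # start at the target position
    orders = []
    for d in incoming:
        forecast = alpha * d + (1 - alpha) * forecast
        target = lead_time * forecast          # cover expected demand over the lead time
        order = max(0.0, d + target - inventory)
        inventory += order - d                 # shipments assumed instantaneous
        orders.append(order)
    return np.array(orders)

retailer   = echelon_orders(demand)
wholesaler = echelon_orders(retailer)
factory    = echelon_orders(wholesaler)
for name, series in [("demand", demand), ("retailer", retailer),
                     ("wholesaler", wholesaler), ("factory", factory)]:
    print(f"{name:>10}: std {series.std():5.1f}")
# order variability grows at every step up the chain
```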
@thardin Yes! There are direct parallels between recent decades' trends in supply chain management and the old viable systems model stuff.
@jmc This is partly why I avoid talking too concretely on organizational methods. Seems to me it's historically and materially contingent.