Monday, December 23, 2019

All in the Numbers: Popular Economic Numbers as Used by Politicians


     There was "good news" in the U.S. in October 2019 (see article). The U.S. added 128,000 jobs to the economy. Unemployment is down. But, wait, what ARE those 128,000 jobs? According to CNBC, "The Labor Department explained in a press release that much of the upswing in leisure and hospitality came from a surge in hiring at food and beverage establishments, which alone added more than 45,000 jobs for the sector’s best month since January" The article further indicated that 60,000 were from the leisure and hospitality sector while health care and social assistance added 34,200 jobs. Is there a pattern here? Ah -- 22,000 jobs were created for professional and business services which is defined as management consultant positions, computer system design, and architectural and engineering services.
     So, we have 105,000 jobs added in fairly low-wage areas, 22,000 added in relatively high-wage areas, 36,000 lost in medium-wage manufacturing, and a few dozen thousand added in low-to-medium-wage areas. Is this a success? Or is it a failure?
     The point is that the number of jobs added is not very relevant unless one also knows the median wage paid for those jobs. In the U.S., and in most of the more industrialized areas of the world, "income inequality" continues to accelerate. Numbers such as the "number of jobs created" are meant to encourage -- to say that things are great. Maybe they are -- but the analysis of the October 2019 job results doesn't lend itself to that conclusion. Lots of low-paying jobs and many fewer higher-paying jobs do not start to spread the economy out to more and more people.
     "The average wage for people in the U.S. has gone up!"  Sounds great. Wait a minute. If I have 10 people working and five earn $10, 3 earn $15, 1 earns $20, and 1 earns $100 -- we have an average wage of $21.50. What happens if that person earning $100 now earns $200? Wow. We now have an average wage of $31.50. Doesn't that sound great! probably not for the lower-earning 9 people.
     So average doesn't tell us much. How about median or mode? The median is the value for which, when the values are sorted in order, the number of instances below that point is the same as the number above it. In our example above, with an even number of instances (10), the median is the average of the values at position 5 and position 6 -- so, for the above example, it would be $12.50. Note that, unlike the average, if the person in position 10 (highest earner) doubles their income, the median does not change.
     How about mode? Mode is the "most popular" instance. Since, in the above example, five people earn $10, $10 is the mode. Once again, when the most highly paid person gets paid double, the mode remains the same.
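     For anyone who wants to check the arithmetic, here is a minimal sketch in Python using the wage list from the example above; the standard statistics module does the work.

        # Wages from the example: five earn $10, three earn $15, one earns $20, one earns $100.
        import statistics

        wages = [10] * 5 + [15] * 3 + [20, 100]
        print(statistics.mean(wages))     # 21.5 -- the "average wage"
        print(statistics.median(wages))   # 12.5 -- average of the 5th and 6th sorted values
        print(statistics.mode(wages))     # 10   -- the most common wage

        # The top earner's pay doubles from $100 to $200.
        wages_after = [10] * 5 + [15] * 3 + [20, 200]
        print(statistics.mean(wages_after))     # 31.5 -- the average jumps
        print(statistics.median(wages_after))   # 12.5 -- unchanged
        print(statistics.mode(wages_after))     # 10   -- unchanged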
     There is a quote popularized by Mark Twain which I have always liked: "There are three kinds of lies: lies, damned lies, and statistics". Although this does call out the need to both understand and correctly interpret statistics -- it is not really fair to statisticians. But statistics are a politician's best friend, largely because most people do NOT know what statistics mean or how they are calculated.
     "The average wage has doubled!" Sounds good, right? Not necessarily. "500,000 jobs have been produced in the past month!" Wow, things are completely rosy. Once again, not necessarily. If 400,000 jobs were lost then it is actually terrible. Usually, what is reported is NET gain so if 500,000 jobs were added and 400,000 were lost it would be reported as 100,000 jobs added. Sound good? Well, with that great of a turnover of labor, the chances are not wonderful that those 400,000 (actually, probably not the same people) who lost and got a job improved their wages and benefits.
     "$70 million dollars of welfare fraud found". Horrible, right? Well, yes, it should certainly be reduced if possible but total welfare funding is about $67 billion. So, this atrocious fraud is actually part of 1/1000 of all of the welfare funding -- or 999/1000 are NOT committing fraud. In most businesses, this would be a resounding success. In Florida, politicians stamped the ground and screamed about all of the welfare fraud -- and spent MORE money investigating the fraud than the net amount of the fraud. A net taxpayer loss.
      Let's look at that $67 billion in welfare funding once again. A lot of money, isn't it? It certainly is, and I would not mind getting my $200 share of that money. But, within the context of overall spending, the $67 billion is insignificant in comparison to the overall federal budget -- about 0.5%, or 1 part out of 200 [note that it is about 21% of the federal social support budget, but that is also not a high percentage of the overall federal budget]. The $67 billion is not significant overall, but politicians blow it up to seem significant so that people will ignore the other 99.5%.
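     The division behind those figures is quick to do; here is a small sketch with the amounts mentioned above (the population figure of roughly 330 million U.S. residents is my own assumption for the per-person share).

        # Fraud as a fraction of welfare funding, and the per-person share of that funding.
        fraud = 70e6            # $70 million of fraud found
        welfare = 67e9          # about $67 billion of total welfare funding
        population = 330e6      # rough U.S. population (assumption)

        print(fraud / welfare)       # ~0.001 -- roughly 1 part in 1000
        print(welfare / population)  # ~$203 per person -- the "$200 share" mentioned above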
     Another side track of statistics from the politician's mouth is rankings. We are number one! OK. We probably are -- but number one in ranking for what characteristic? Hopefully, the people of most countries think that their country is the best -- a subjective ranking that has multiple truths based on who is saying it and where they are saying it. But, when we get to rankings based on quantitative (countable) characteristics, the record isn't so great. Yes, if a specific formula is applied to the countable numbers, there will be a most highly ranked (#1) and a most lowly ranked (#xxx). But not many countries have multiple "#1 rankings". Given a large enough population and land mass, most countries will have at least one "#1 ranking". But, even in this case, is being first in number of prisoners equivalent to being first in number of books per household? So, relative rankings are fairly useless -- dependent on many different factors.

     Numbers do mean something -- but not always what they are implied to mean.

Thursday, November 28, 2019

Holiday Epistles: If and Why?


     Each year, for the past 40 years, I have composed something to send off to friends and family for the holiday season. They started as letters, moved to cards and letters, then to cards including a letter and a sheet of photos, and finally just a letter and sheet of photos.
     Why do I do it? It certainly takes a lot of time and effort (and some money -- less since I stopped sending cards). And, of course, I could probably send them out electronically to 80% of the recipients -- but the catch is that there are those 20% to whom I cannot send them electronically. So, I support the postal service and have them delivered to their mailboxes. Sometimes, I have a vision of them progressing around the globe -- following the distribution pattern of mail sorting centers (yes, I am a geek and a nerd).
     Hopefully, there are very few who say to themselves "why is this person bothering me again this year?" But it is highly unlikely that every person is fascinated with all of the events recorded within the letter -- or is even aware of all of the people, places, and events mentioned. Most probably fall into the middle range -- glad to know I am still alive and somewhat interested in what has happened (but not always glad of the length of the letter).
     I don't keep a journal (it would be great if I did but my focus gets redirected on an irregular basis). So, these annual summaries of the life of my family become an archive. Trips, milestones, tragedies, triumphs follow through in the stream of my past 40 years as reflected in my epistles. This is a wonderful side-effect of my sharing my life via the annual epistles.
     The epistles also act as a prompt. They say to others "I am glad I am in contact with you". Their responses (if any) vary tremendously. A card with a printed message and signature indicates a desire to stay in touch. A message (of whatever length) indicates a desire to share their year with you.
     LACK of a response does not give a clear indication. Just as in the Paul Simon song, "Something So Right" -- "Some people never say the words 'I love you'. It's not their style to be so bold". What is the sound of one hand clapping? There is no set formula as to whether to continue to send to a non-responsive recipient. Do you hear from them, or about them, at some other point in the year? Are they significant people in your life such that you want to make sure they have the option of knowing what is going on in your life?
     Sometimes, it is time to allow someone to slip away. No more letters. No keeping up with changes of address. This doesn't mean that person was not important within your life -- only that it is not continuing into the present and future.
     A final aspect is the actual list of recipients. These are the people you want to keep informed. Not all are your best friends. Some might be work associates. Some used to be close but are not anymore. The list summarizes a good portion of the relationships you have established in your life that continue to have some type of impact for you. Looking over the list, you may discover that "hey, I would like to talk to person A once again. Do I have their phone number?" In other cases, it may trigger sadness. A person has died within the past year. Or they have entered into a non-responsive limbo and it is time to make a decision as to whether you should keep sending to them.
     It is a record but not a static item. I am working on this year's epistle -- but wondering what will be in next year's.

Saturday, November 2, 2019

Euphemisms: Language Shadows and Thieves


     I have loved learning languages for as long as I can remember. Of course, first I learned the "mother tongue" -- in my case, English. But, with each language that I endeavored to learn, I gained that little bit additional understanding of a people and a culture (and of myself). Russian loves indirection and passive sentence structure. German goes for precision and building block structuring. Farsi brings a bit of poetry into a people who have often had to struggle. And so forth.
     There are many different stages of proficiency with a language. To become truly fluent in a language, it is necessary to fully live with those for whom the language reflects their thoughts. Three final tests for fluency exist -- music, humor, and euphemisms. Music is the test to make sure that you can interpolate sounds even when they are not standard and, indeed, possibly quite distorted. Humor tests the ability to think about matters in similar ways as "native" speakers. And euphemisms -- which are a reflection on the language's associated social interactions, taboos, and insecurities.
     I have not had the (or, perhaps, created the) opportunities to immerse myself into any language to the fully fluent stage (except for English). But, even in English, I find myself fighting against the world of euphemisms.
     From my point of view, euphemisms can fall into two areas -- which I call language shadows and language thieves. A language shadow substitutes one word or phrase, which still has the same meaning, for another. This is done to "soften" the phrase -- to not be "blunt". "Passed away" or "Meet their maker" instead of "died". You would never use the phrase "passed away" for anything other than "died". Although I may still prefer to be more direct, I don't consider language shadowing to be a bad thing since it does not take away from the language but overlays the meaning with social, spiritual, or religious overtones.
     Euphemisms as language thieves substitute a word or phrase, which has its own distinct meaning, for another word or phrase which is "uncomfortable" to be said. In this manner, the original meaning of the euphemistic phrase is either lost (it can no longer easily be used to say what the words would normally mean) or requires mental substitution which depends on sentence, or conversational, context. Euphemisms as language thieves could also be called a form of "doublespeak" or even "newspeak" -- as defined, and used, by George Orwell in his book 1984.
     Language thieves are used when a subject makes us uncomfortable -- "sleeping with" rather than "having sex with". They are used when we want to avoid moral consequences -- "collateral damage" instead of "murdering civilians and bystanders" or "friendly fire" instead of "killing our allies instead of our declared enemies". They are used when we want to avoid responsibility -- "I'm washing my hair" rather than "I am not interested in you, please find someone else".
    Once again, the distinction between a language shadow and a language thief is whether the substituted word or phrase has, or should have, its own meaning. It is extremely difficult (a person has to double the number of words to explain what it is NOT) to use the phrase "sleeping with" to mean actually sleeping in the same bed without any sexual contact. It could be argued that "friendly fire" is a language shadow since it is an absurd phrase that has no inherent use or meaning. Perhaps we should refine the definition of a language thief to include substituting words or phrases that have no real use or meaning?

Monday, October 21, 2019

It's Not the Change, It's the Rapidity of Change


     Humans can be looked at as the "adapting animal". We have a history of adapting to different environments as well as altering those environments to make them more compatible. We do this more than any other animal. Not that animals (and plants, to a lesser degree) have not adapted to a wide variety of conditions -- they have. But they have little ability to live well within an environment different from the one to which they have adapted.
     Humans largely adapt with use of technology. Clothing is a huge advantage to allow continued survival and activity within an annually changing climate. Other animals can do this to a limited extent with added fur in the winter or hibernation but it affects their daily activity. Fire! There are good reasons why the achievement of (usually) controlled fire is considered perhaps the most important discovery/ability of humans. Good old Prometheus! Stealing fire from the gods and giving it to us poor mortals. Fire (or controlled heat with advancing technology) kept predators at bay, broke down food elements to allow better nutrition, allowed us to roast marshmallows around the campfire, and (in conjunction with shelters) changed the seasons for us.
     On the other end of the temperature scale, air conditioning has allowed greater comfort within warm seasons or climates. During this period of climate change, air conditioning has been shifting from a comfortable luxury to a life-sustaining requirement. On July 25, 2019 a temperature of 108.6 degrees Fahrenheit (42.6 degrees Celsius) was reached in Paris. In Kuwait, on July 21, 2016, a temperature of 128.7 degrees Fahrenheit (53.7 degrees Celsius) was recorded. In India, these types of temperatures, combined with droughts partially caused by the receding of the Himalayan glaciers, have proven quite deadly in the countryside.
     Adapting via the use of technology inherently requires energy. However, by combining older architectural approaches with new materials, there is the possibility of reducing the energy needed for long-term buildings and structures to better adapt to changing climates. Thick insulating walls can bring in cooler night-time air or retain heat (or cooling) with less ongoing energy use. Digging deep into the earth (caves, both natural and artificial, have been used this way throughout the ages) can achieve better thermal stability.
     As ocean levels rise, coastlines move backward toward higher ground. In the case of islands, that may cause much of the area to disappear. Water, being necessary for life in different ways, has always been a local attraction for living space. Location, location, location -- as the real estate agents declare. By rivers, lakes, and oceans, humans have built up cities and other concentrations of population. Because of this, even though the percentage of land lost by means of the rise of ocean level is rather small -- the direct effect on human populations is much greater.
     Compensation for climate change is possible via adapted architecture or technology/energy use but it is not economically feasible for most locations -- mass migrations are starting to happen and will continue to accelerate.
     Humans can continue to adapt with the use of technology/energy and ingenuity. When change occurs, as is typical over recorded history (not counting cataclysms such as the dinosaur-extinguishing asteroid or global cold waves caused by sulfur and particulate emissions of huge volcanic eruptions), on a slow extended basis over centuries, humans can (and will) adapt. But changes that occur over decades -- within a human lifespan! These are changes that require global coordination, planning, and cooperation. Even with all of that, the best we can do is facilitate the mass migrations, the changing of local food crops, and the abandonment of no-longer-livable areas.
     Within this blog, I have concentrated on the effects of climate change directly on humans. But humans are not the only life upon the planet (though sometimes such is ignored). The great coral reefs are dying and polar bears are losing their habitats. Some areas are turning into deserts while other areas undergo periodic flooding which affects not only crops but plants and animals in general. Extinctions rise in number. How will the food chains adapt -- and can they? Humans can, and will, adapt but the landscape of the future will likely be quite different.

Sunday, September 29, 2019

Energy, Trade-offs, and Automation


     It takes energy to do things. The verb "do" pretty much says that effort is needed. To "think" may not require direct physical work -- but, in order to be able to think, it is necessary that the body and brain (and the "mind" -- whatever that truly is) function -- which requires a lot of physical work by the parts of the body.
     The basic source of energy is, of course, the sun. Energy from the sun allows both physical and biological processes to continue. Plants grow and the energy is stored. Animals eat plants and other animals. As something is consumed, it is used as energy for further activity. From a physics point of view, matter is exchangeable with energy but, within biological systems such as a stalk of wheat, a butterfly, or a human, it is a release of stored energy via biochemical processes that keep one foot going after another.
     So, we "do" things. We walk. We run. We hunt. We eat. The first level was that all the energy was supplied directly from our bodies. The next level has us working with tools -- tools that allow use of other types of stored energy (kinetic, gravitational, wind, tidal, ...) Our use of energy allows us to make use of other forms of energy. This is called "leveraging" (probably not coincidentally similar to one of the basic tools -- the lever).
     The next level is to build other devices that can make use of stored energy autonomously (without active control and manipulation by humans or other animals). So, we make use of a wind-up or an electrically-powered clock. We use appliances that can be started and then will continue to work on their own. A train uses stores of fuel to move lots of other things. On a track, it can continue on its own (although normally a human is required to start it, stop it, or change tracks).
     There was once a statement (I cannot remember, or find, the source) that it was wise to consider the time needed to walk to a destination versus the amount of time and work needed to be able to buy a ticket on a train to a destination. Of course, such a comparison requires the need to value your time. If a ticket cost $100 and it takes you 20 hours at $5/hour to earn the money but it only takes you 15 hours to walk the distance then walking is less expensive. Of course, if you earn $10/hour then it is advisable to take the train.
     This type of value/cost comparison is relevant in the case of automation. If a human (or group of humans) with tools can create a shoe in 1 hour at $10/hour, then the base cost (plus materials and tools and so forth) is $10. If a machine can make 1 shoe a minute and it can operate 24 hours a day and it costs $300,000 to design and purchase it, with $500/month maintenance, then it would pay for itself in less than six months. Obviously, if the humans can create a shoe in 1 hour at $1/hour, it would take a little less than five years.
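     A rough sketch of that break-even arithmetic is below. The daily output figure is my own assumption (the post does not fix one); the payback period scales directly with how many shoes are actually needed per day, and 200 per day happens to land near the figures above.

        # Break-even sketch for the shoe machine example.
        # ASSUMPTION: daily_output (shoes actually needed per day) is hypothetical.

        def payback_days(machine_cost, monthly_maintenance, human_cost_per_shoe, daily_output):
            """Days until the machine cost is recovered from labor savings."""
            daily_savings = daily_output * human_cost_per_shoe
            daily_maintenance = monthly_maintenance / 30.0
            return machine_cost / (daily_savings - daily_maintenance)

        # Labor at $10/hour, one shoe per labor-hour:
        print(payback_days(300_000, 500, 10.0, daily_output=200))   # ~151 days (about five months)

        # Labor at $1/hour: the per-shoe saving drops tenfold, so payback takes roughly ten times as long.
        print(payback_days(300_000, 500, 1.0, daily_output=200))    # ~1636 days (about four and a half years)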
     But, let us return to energy. An automated shoe maker will need energy. If that is easily available and cheap then the formula just mentioned will be a guiding force for what to do in a company. What if it isn't? Then that human who converts their own energy has a clear claim to the work. So, we now have three factors: cost of tools/machinery, cost of labor, and availability and cost of energy to run the machinery.
     As discussed in previous blogs, capital/money is a form of stored labor -- or stored energy. A final factor in the profitability calculation is the locking up of capital. A company that has 6 shoe making machines would have $1.8 million tied up in the machinery. What else could they be doing with that money? Perhaps they could make more profit using that money elsewhere?
     We now have four factors to determine the attractiveness of outsourcing or automation: cost of labor, cost of tools/machinery, cost of energy, and loss of flexibility via the locking up of capital into fixed equipment. Automation requires a consistent source of energy. Without that, it is best to stay with manual labor (because "cost of energy" becomes impractical to overcome).
     The economic practicality of automation can be defined in a straightforward manner; accountants can determine best practices rather easily. The social implications are not so clear-cut. Automation shifts, and reduces, the needs of the labor force. More education is needed for the design and maintenance of the automation, and fewer people are needed in those positions. This implies retraining of those who will take those positions as well as further education and training of those displaced from the industry. In addition, concentration of capital (via the owners of the automation and automated factories) continues the concentration of wealth and income, for which there need to be methods of distribution.
   

Saturday, September 21, 2019

Leveraging chaos: Maslow, fear, and empathy



     Alvin Toffler created the term Future Shock, in his book of the same name, in 1970. A short definition is that future shock is a condition where too much is changing, in too brief a period of time, for people to handle.
     People who are used to, or make a living from, the use of fossil fuels have fears concerning the future, as they are aware that the ability to use them is finite -- they WILL run out. People who are used to living, and behaving, in a particular way will often be very resistant to change even if it is well known that change is necessary -- as evidenced by the reaction to the situation of rapid climate change. Changes in the acceptance of people, or groups, who were not previously accepted make many people afraid, and religious interpretations can be manufactured to justify mistreatment.
     Change is usually unsettling and change that occurs quickly is even more upsetting. Thus, there can be a desire to "turn back the clock": restore slavery, keep the homosexuals and other gender variations in the closets, have codified behaviors for women and lower income people to be subordinate to those who have power. Re-concentrate power into the groups that previously held the power.
     This seems to be a period of time when politicians all around the world are trying to encourage fear (and anger) in order to be elected, stay elected, or promote their own agenda (often based on personal profit or ideological reasons). When many people are afraid, having difficulty finding enough upon which to live, or filled with shame, it is an ideal time for politicians to take advantage of that situation. Scapegoats are easy to manufacture, and use, as distractions from political actions and as the supposed foundational reasons for problems.
     Post World War I was an ideal period of time within Germany and Italy to produce the conditions under which a Hitler or Mussolini could be attractive. Authoritarian control, rigidly enforced central rule, and a tightly focused, simplistic, program can be attractive and comforting for someone whose life appears chaotic and unreliable. A fascist philosophy, in particular, can be attractive to those who want to be "saved" from chaos or rapid change. They are willing to throw away those "higher" ideals of freedom, justice, or even love if they choose to give in to their fears.
     Abraham Maslow, an American psychologist, wrote a paper in 1943 describing what he called "a hierarchy of needs". There were three groupings: basic needs, psychological needs, and self-fulfillment needs as seen in the following diagram:


     Each level builds upon the lower one. In order to be able to address safety needs, it is necessary to first satisfy biological and physiological needs. Or one can say -- you have to be able to stay alive to stay safe. You have to personally feel safe before you are able to start caring for others. And so forth. Note that, once a person has "climbed" the pyramid, it is possible to choose to override the lower needs. A parent may be willing to sacrifice their own life to save a child -- "love needs" become more important than anything lower on the pyramid. In a similar fashion, a person may have "climbed" the pyramid to the point where they are a dedicated follower of a belief system or avid proponent for others such that they are willing to forego all personal benefits in order to do their best for others. A person can make a choice to let the higher needs override the lower needs.
     According to Maslow, you can only start taking care of upper layer needs after the lower levels have been satisfied. But (and this is not according to Maslow) when the lower needs are threatened (in reality or in fantasy), the higher layers of needs are forsaken -- people have not been able to make a choice but, instead, have had their foundations shaken and broken.
     Tell people that their security is threatened by some other person, or group, (even if the accusation is completely false) and the religious or spiritual beliefs you hold can be suppressed, ignored, or twisted. Torture a person who has not gotten to the "self-actualization" level and they may toss away esteem, love, and safety to maintain their basic physiological needs (that is -- to live).
     When fear of loss of basic needs becomes sufficiently great, empathy disappears -- taking care of oneself becomes the overriding priority over taking care of others. It is something that politicians, people proclaiming themselves to be leaders of a religion, or other figures of authority have always recognized and have used to promote war or the mistreatment of individuals or groups.


Saturday, August 17, 2019

Work life balance: A Matter of Energy


     Jeff Bezos, of Amazon fame, in an interview, indicated that he doesn't like the phrase "work-life balance" because he considers the phrase "debilitating" -- implying a trade-off between work and non-work time. His general idea is that your time at work will provide the energy for you to take home and your time at home (or off work) will provide the energy for you to jump-start yourself at work.
     There are a couple of fairly large assumptions in that. The first is that your work is something that you love, with people you enjoy working with, and has just the right amount of stress to keep you at optimum performance. The second is that the same is true for your time at home (or off-work). I applaud all of the people in the world in this situation and wish them the very best. It also can be considered as an ideal goal for businesses and individuals.
     This principle can be generalized in such a manner that it can be applied to everyone's life. Although the terms are not exactly applicable, each part of life can be considered endothermic (absorbing energy) or exothermic (producing energy). Jeff Bezos' situation describes something where both worktime and non-worktime are exothermic and contributing to a large net positive amount of energy to use for many purposes.
     While I hope that no one reading this is in such a situation, a work area where the same thing is done over and over, with little (or no) interpersonal interaction, could be considered a highly endothermic situation. The person just cannot wait to get off work, and all the energy they produce off-work is used to get them through their workday.
     It can be argued (and I cannot say that I am certain of the best answer) that a work situation where you hate the work, or have negative interactions with people at work, would be an even more energy-draining situation. Personally, I don't think that is true. It may be much more stressful and less healthy, but you probably head home and talk about all the terrible things that have gone on during the day. That may be negative (and I would not encourage you to continue in such a position and situation), but I would not call it low energy.
     So, what is meant by "work-life balance"? I propose that it is a net positive amount of energy. As said earlier, an ideal goal is a very strong net positive result from a very energy-producing work-life AND a non-work-life. However, if you have a net positive amount of energy then you are probably able to live a life that you can enjoy. This definition removes the idea of a "trade-off" but it does imply that everything that a business can do to improve work life AND the support they can give to non-work life will be beneficial to both the business and the individual.

Wednesday, July 3, 2019

The spectrum of ethics: morality, amorality, and immorality


     All thoughts of morality are based within the social contexts and agreements of a society. In other words, in spite of our social upbringing and lessons, there are few (if any) actions that are universally "good" or "bad".
     However, we do not live within a potential, alternate, parallel universe. Within current Earth conditions, laws, and priorities, there ARE items of "good", "bad", and "mixed". It is within this context that people, and actions, are evaluated according to the morally approved/disapproved conditions of the society.
     In this context, the killing of another that is not legally sanctioned is considered murder (there are many times at which such killing IS legally sanctioned). In this context, there is the notion of personal ownership of property, and the unwilling (and not legally sanctioned) transfer of such property is considered theft.
     The term ethics is generally used in the same manner as the matter of luck. If you wish someone luck, it is assumed that you are wishing GOOD luck for them. If you ask whether someone is meeting ethics standards, it is assumed that you are asking about the achievement of GOOD ethics codes.
     Whether we speak of ethics or morality, it is possible to be positive, neutral, or negative. Someone who is actively trying to achieve the "good" standards and actively avoid the "bad" standards is guiding their life/actions according to morality.
     Someone who does not care about meeting "good"/"bad" standards evaluates their actions according to some other criterion. It could be financial/money, ego-reinforcement/narcissism, or something else. This is called amorality. A psychopath is, in fact, an amoral person. They don't actively go against the societal norms unless it meets some other desire -- for "fun", for profit, for self-aggrandizement, for curiosity, ... If it meets their criteria for desirability, they are perfectly agreeable to do what society wants them to do. It does not matter to them.
     In the corporate world, a business is more likely to be amoral than immoral -- as being moral or amoral is most likely to increase profits. Within recent (30 year) history, there is the story of the car manufacturer who determined that the gas tank was likely to explode in a certain percentage of accidents. Accountants determined that the number of likely accidents, and subsequent lawsuits and damages, was less expensive than changing the design. Another worldwide consumer corporation has run cost analysis figures on deaths from improper (but likely) use of their products -- determining the likely costs of lawsuits and damages versus the loss of profits in not heavily marketing the product. These are both instances of amorality.
     So, what is immorality? That is the active pursuit of "bad" actions and the avoidance of "good" actions. Sometimes, it is difficult to distinguish between amoral actions and immoral actions as the objective, observable, results are often the same. The above corporate actions might be considered by many to be immoral but they are not done primarily to violate societal standards -- violation of those standards is secondary to their primary goals (in the above cases -- to increase profits).
     It is, of course, completely possible for an individual, or group, to deliberately pursue immoral activities. These are the "villains" of history or within books or movies or even popular media. Unlike within amorality, it is the active pursuit of socially disapproved behavior that is the goal. Doing "bad" things is the purpose.
     Another factor in the morality spectrum is that, based upon societal approved/disapproved conditions, it is not constant and, thus, can change. What was moral in one century may be considered immoral (or amoral) in a subsequent century. For example, slavery was widely considered acceptable in the 1700s and before. While slavery still exists within the current period, it is no longer considered acceptable. It has moved from moral to immoral. And, during the transition period in the 1800s, people who fought to eliminate slavery were often considered to be behaving immorally while, from the viewpoint of our current time, it is now considered very moral and their opposition considered immoral.
     While it is quite natural to view things of the past according to the current standards of morality -- it is dangerous to alter historical records to impose current morality and lose the reality of the past.

Saturday, June 8, 2019

It's dinnertime : logistics working from the end date


     A full-time homemaker requires a multitude of skills:
  • chauffeur
  • chef
  • house cleaner
  • laundry services
  • facilities manager
  • chief-of-staff
  • child care
  • nutrition and menu planning
  • buying goods
  • finance manager 
  • versatile errand runner
  • ...
     Logistics is defined as "the detailed coordination of a complex operation involving many people, facilities, or supplies." If you need an experienced logistics manager, just find a successful homemaker.
     One of the everyday tasks of a homemaker is that of getting meals on the table and ready to eat. Let's decompose the task into parts.
  1. Get the ingredients and preparation materials
  2. Prepare the ingredients for cooking/assembly
  3. Cook and/or assemble
  4. Serve
     So, when do we do what -- what is our timeline? We must work backward, assuming that all food is expected to be ready at the same time. We must also treat the meal by components -- the dishes to be served -- such that we can coordinate all of them.
     Let's agree on a menu. How about a quiche, warm rolls, salad, and drinks? Also, agree on a time to serve everything -- say 6:00pm.
  • A quiche requires preparation of the crust, preparation of the filling, baking, cooling, and serving.
  • Warm rolls (assume pre-made rolls) must be warmed up (so an oven must be available).
  • A salad must have the components (greenery, other vegetables or fruit, dressing if desired) assembled and then put into serving bowls (or a communal bowl from which to be served).
  • Drinks must have glasses, the contents sorted and poured, and then served.
     Of these four items, two of them are time-independent. Salad and drinks can be prepared in advance. The quiche and the rolls both need to be served soon after completion -- and there is a competition for resources because both need an oven (assume only one oven is available).
     In order to arrange proper scheduling for these two items, let us list the times needed in reverse order. Ten minutes cooling, 40 minutes at 375 degrees Fahrenheit, 10 minutes filling preparation (could be done in 5 if needed), 8 minutes baking the crust at 375, 15 minutes crust preparation, 10 minutes gathering materials.
     For the rolls, it is fairly simple. 4 minutes at 400 degrees.
     Add up the times for the quiche (we could overlap the filling preparation time and the crust baking time but we will keep it simple) and we have 93 minutes. To have it ready at 6pm, we would thus have to start at 4:27pm. After we have pulled the quiche from the oven at 5:50 to cool, we can turn up the heat of the oven and put the rolls in at 5:56.
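     Here is a minimal scheduling sketch of that backward calculation, using the step durations listed above and working back from the 6:00pm serving time.

        # Work backward from the serving time through each quiche step.
        from datetime import datetime, timedelta

        serve_at = datetime(2019, 6, 8, 18, 0)   # 6:00pm; the date itself is arbitrary

        # Steps in reverse order, with durations in minutes (from the list above).
        steps = [
            ("cooling", 10),
            ("baking at 375F", 40),
            ("filling preparation", 10),
            ("baking the crust", 8),
            ("crust preparation", 15),
            ("gathering materials", 10),
        ]

        t = serve_at
        for name, minutes in steps:
            t -= timedelta(minutes=minutes)
            print(f"start {name} at {t:%I:%M %p}")
        # The last line printed is the overall start time: 04:27 PM.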
     The time-independent items do not really enter into the planning except to make sure that they are done in time. I would recommend doing them during the time that the quiche is baking.
     That is a simple meal. What other duties are required from our homemaker/logistics manager? Here we may enter into the "shortest distance algorithms" area. If I have to pick up a child from school, go to the bank for a deposit, and drop off the dry cleaning then I have to figure out the best order and then the best route.
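     With only three stops, the "best order" part can be checked by brute force. Here is a tiny sketch; the driving times between the stops are made-up numbers, purely for illustration.

        # Try every ordering of the three errands and pick the cheapest round trip.
        from itertools import permutations

        # Hypothetical driving times in minutes between locations.
        minutes = {
            ("home", "school"): 10, ("home", "bank"): 7, ("home", "cleaners"): 5,
            ("school", "bank"): 6, ("school", "cleaners"): 9, ("bank", "cleaners"): 4,
        }

        def travel(a, b):
            return minutes.get((a, b)) or minutes[(b, a)]

        stops = ["school", "bank", "cleaners"]

        def round_trip(order):
            legs = [("home", order[0])] + list(zip(order, order[1:])) + [(order[-1], "home")]
            return sum(travel(a, b) for a, b in legs)

        best = min(permutations(stops), key=round_trip)
        print(best, round_trip(best))   # the cheapest ordering and its total driving time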
     All very small parts of a normal day.




Friday, May 10, 2019

The Thanos scenario: why it doesn't work


     If anyone out there has missed all of the Avengers' movie cycle -- in particular "Infinity War" -- then I apologize for what may be a spoiler and warn you not to continue, though this pertains only to a little bit of the movies.
     Thanos, the antagonist (villain/lunatic/self-described savior/your label here), has a mission -- to improve the lives of sapients in the galaxy by reducing overpopulation. Most people would probably agree that overpopulation can be very stressful to a society and its available resources. However, eliminating half of the human population would be, at best, a very short term solution. As seen from the following graph of human population, population doubling -- while variable -- is typically not an infinite process (though it could be):




As seen above, the human population doubling time (it would likely be different for other sapients) has varied throughout history -- with the most rapid doubling taking place around 1987. The expected doubling time lengthens to 95 years by 2088 -- about 2 2/3 times as long as in 1987.
     The population growth rate is a combination of birth rates, death rates, and life expectancy. If as many people die within a year as are born, the growth rate is 0 (zero). If more people die than are born, there is a negative growth rate. If people never die (the goal of universal immortality is achieved), then you would need a zero birth rate to achieve stability. All of these scenarios have been examined in literature (and speculated upon in "non-fiction"). Many of the variants, as considered, have their own challenges and advantages.
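     The relationship between a steady growth rate and doubling time is a simple one: doubling time is ln(2) divided by the annual growth rate. The rates below are illustrative round numbers, not taken from the post.

        # Doubling time from a constant annual growth rate: T = ln(2) / r.
        from math import log

        def doubling_time_years(annual_growth_rate):
            return log(2) / annual_growth_rate

        print(doubling_time_years(0.019))    # ~36 years -- near the fastest historical pace
        print(doubling_time_years(0.0073))   # ~95 years -- the slower pace mentioned above for 2088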

Some factors which affect birth rates, death rates, and lifespan:
  • Medical care -- the ability to prevent, or fix, "natural" causes of death
  • Peace/war  -- how many people's lives are preserved or eliminated by direct action.
  • Procreative drive -- the desire for more, or fewer, children
  • Fertility -- the ability to create more, or fewer, children
  • Resource access -- having enough food, environmental support, etc.
  • Environmental health -- local and global contaminants and poisons
  • Societal policies -- agreed upon rules that decrease, or increase, birth rates, death rates, and lifespan
  • More -- ???
     Most of the above factors are self-explanatory. Of course, the above factors sometimes interact -- in surprising ways, at times. Increases in resource access tend to reduce the procreative drive (the desire for more children). It appears (though it is unproven) that environmental contaminants are decreasing fertility -- especially in the more resource-accessible areas of the world. Thanos' wars, or his use of the Infinity Stones, fall into the "peace/war" category, while Dan Brown's "Inferno" works by doing an "end around" of medical care by creating a pandemic -- or perhaps it is a decrease in environmental health? Certainly, the conditions that foster cholera, or that accelerated the bubonic plague, could be directly attributed to environmental health. The book, and movie, "Logan's Run" had all adults over 30 killed -- a definite societal policy. Categorization can be a problem -- is access to birth control a "resource access" issue or a "societal policies" issue?
     However it is categorized, we can easily see that a single event will not produce a long-term result. If we look at general human history, wars and pandemics made only a very temporary dip in population. Long-term problems require long-term analysis, agreement, and planning.

Saturday, April 27, 2019

True consensus and the power of the micro-minority


     Many people use the word "consensus" as meaning a general agreement -- but that is not the original, or primary, definition. The primary definition includes the UNANIMITY of opinion. In other words, everyone must agree. Within the Religious Society of Friends (Quaker), consensus does still mean that -- everyone must agree -- but there is also the concept of "standing aside" such that a person who is not convinced that the majority is correct but is also not convinced that they are NOT correct can allow movement. They "stand aside" so that some decision, or action, can be made.
     But, if they feel strongly that the decision, or action, is the wrong one, then they can stop the action -- true consensus must happen. Throughout history, this situation has stopped Quakers from taking action on some items for a period of years or even decades. Sometimes, the final action is the opposite of what was originally expected -- the minority disappears as it is absorbed within the majority.
     Although the basis for this practice within Quakers is religious, the principle of the majority not always being correct applies throughout society. This is parallel to the stereotypical parent asking their child "if everyone decided to jump off a bridge, would that mean it was right for you to do it?"
     In fact, looking through history and science, this is the "normal" process. One person decides that slavery is wrong but the rest do not. Then a group shifts their viewpoint (and starts becoming vocal about their view) and more and more people change their view and -- at the end of the process -- almost all think that slavery is wrong. One person examines the solar system and skies, applies mathematics to the movements, and decides that the Earth really isn't the center of everything -- starting with a minority of one and ending with almost everyone (rarely truly everyone) understanding that view.
     It isn't easy to be that beginning minority of one. Even in situations where such is explicitly allowed and encouraged, it takes a firm grasp on an individual position to stay there. This applies to scientific, business, and social situations. Failure may sometimes be considered a path to learning how to succeed, but most people would rather be part of a supporting group than be the dissenting opinion.
     Those ultra-minority opinions are often suppressed -- sometimes with legal mechanisms -- more often with disparagement and attacks against the person and ideas. This can be done with the best of motives -- and it may be that the "majority opinion" proves to be the "correct" one such that the ultra-minority opinion SHOULD be removed. Various tricks and maneuvers may be made to suppress the minority, or ultra-minority, opinion. Of course, it can also be done with malicious intent -- such as, within the Harry Potter series, the time and location of Harry Potter's trial being changed while Dumbledore is "accidentally" overlooked and not informed.
     We often think of people being of majority/minority voices. If 5% believe X, then we think that 95% believe non-X. It seems to make sense, but it is more likely that 5% believe X, 15% believe non-X and 80% follow along with the perceived greater voice. This situation can be looked at as a silver lining or as a forecast of doom. Is it a matter of 80% being "sheep" and unable to make their own decision or is it a situation where 5% only have to convince another 6% in order to move the fulcrum to change the balance? Perhaps both are true. It is even possible that the 5% can shift part of the 80% and change the balance in that manner.
     Within a large group, a single individual can always have the potential to see things the most clearly.

Tuesday, April 16, 2019

Responsibility and Fault: Just where does the buck stop?


     "The Buck Stops Here". That is probably President Harry Truman's most famous quote. Obviously, he was not talking about a dollar bill. He was talking about responsibility. He recognized that he, as commander-in-chief and chief executive of the United States, was responsible for the words, actions, lack of actions, morality, and so forth for all of the people to whom he had delegated work.
     But, even though there is recognition that the final responsibility lies at the top of the management structure -- those that are delegated may likely delegate further -- and those people delegate even further down. Is the CEO of a company with 30,000 employees responsible for the actions of every one of these people?
     In the 1800s and earlier (pre-circa 1970) 1900s, it was felt that the head of the company DID have responsibility for all the people that worked for her or him (in those periods of time, usually a "him"). In exchange for taking that responsibility, there was also a lot of control -- dress codes, behaviors out-of-office, fixed and stringent company manuals, and so forth. The responsibility was also connected to employee "loyalty". The company, and head of the company, guaranteed certain things and the employee, in return, agreed to act in certain ways (including productivity within work).
     Starting in the later 1900s, work and non-work time began to decouple. Company pensions became rarer and rarer. Length of time of employment shortened. A lifetime of work for the same company became very unusual rather than very common. Employees decided on what they could, or could not, do outside-of-work hours. Within work hours, there were still various expectations of dress and behavior but, outside of work, it was up to the employee as to what they did and how they behaved. In this situation, certainly, a CEO would not be responsible for what an employee did outside-of-work.
     From the other direction, the head of a company would certainly be responsible for the words and actions of all those to whom she, or he, directly delegated. And, as Harry Truman indicated, the head has some responsibility for all. But if they are truly unaware of what the grand-delegates are doing/saying, then it is hard to say they are directly responsible. In the post-1960s, the phrase "plausible deniability" came into use -- basically a way of saying "you can't prove that I knew about what they were doing, so don't try to hang the responsibility on me".
     But this blog is about responsibility and fault. Why "fault"? Because when something goes wrong, the break occurs someplace. Similar to a fault line where earthquakes occur, fault occurs at a location and the responsibility lies with a person. But it is not always the person that seems most obvious. If a person fails in their duty because they do not have the knowledge, training, or access needed to do the job correctly, then the fault lies with the one doing the delegating. If the person DOES have (or claims to have) the knowledge, training, and ability, then it is that person's fault.
     Fault is not blame. Blame is a movement of responsibility. And it is unproductive. If the fault lies with the manager, then they need to correct what is lacking. If the fault lies with the delegate, then there is the option of learning, correcting, and improving so as not to create the fault again. The action of those responsible may depend on the delegate's history. Does the delegate learn from mistakes? Do they correct past behavior or mistakes?
     During merit review of someone to whom there have been tasks delegated, errors or faults need to be looked at from the point of who/where/what. Not doing something when they could not do such is not their fault. Once again, it is the manager's responsibility AND fault. Not being willing, or able, to learn from (and correct) errors is the delegate's fault -- and should be considered in the review.

Saturday, March 30, 2019

Public versus Private -- Corporate choices


     There is almost always great excitement when a brand recognized company initiates an Initial Public Offering (IPO). We look at the public stock indices and it is sometimes difficult to remember that Google (Alphabet) was once a private company or Facebook or Amazon or ... It is possible that there exists a company that began as a public corporation but I am unaware of it.
     There are still a lot of companies that are privately owned. Occasionally, a public company will move back to private ownership (they do this by buying back all stock that has been publicly issued). This movement in both directions indicates that each has its pluses and minuses.
     I am not an economist or a lawyer and cannot tell you all of the ins and outs of what is applicable. There are two general sets of regulations. One set is applicable to both private and public corporations. This set is primarily concerned with safety, health, and wellbeing (financial, social, and others) of employees. OSHA (the Occupational Safety and Health Administration) oversees much of this in the U.S. There are also general accounting, environmental, and other laws which apply to all corporations.
     The other set of regulations covers the security of stockholders -- those people within the general population who have invested their money in the fortunes of the public corporation. Naturally, since private corporate stock has no public owners (and is not available for "trading"), these regulations do not apply to private corporations. This means that private companies have a lot more flexibility in how they use their internal money -- but, depending on size and other factors, they may have to treat their employees similarly to a public corporation.
     What is the attraction of "going public"? As a former small company owner, I can only tell you my views. The first is "exit strategy" -- what you have once you have left the company. Within a private company, whatever my share of ownership may be, my share has no formal valuation. (If the company is the target of being acquired by another company, then an informal, estimated valuation will indeed be made.) I have 40% of Company ABC. If it is private, then that is 40% of ??? If an offer is made to acquire the company for $10 million, then my 40% is effectively worth $4 million -- but only if someone pays that. In a similar fashion, an IPO will indicate -- by selling X% of the company divided into Y initial shares priced at $Z -- how much my share of the company is worth (once again, assuming someone wants to buy it -- it is not a "liquid" asset).
     The second attraction is to bring additional money (capital) into the corporation for future desires. I have a store that has an estimated (it is still private) value of $2 million based on $200,000 in net yearly profits. I want to open a second store but I do not have enough actual money in the bank to do so. Loans for private corporations are largely based on the personal assets of the primary owners -- so that may not be attractive. But, if I sell 49% of the store (it is common for the initial offering not to be a majority of the ownership), in stock, to people for $1 million, then I have money available to purchase/build a second store and, if the faith that the stockholders have placed in me is valid, I can hope to soon have two stores, each worth $2 million, for $4 million total. And those wise and brave investors have stock now worth $2 million -- a 100% increase in their investment.
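     The stake-valuation arithmetic in those two examples is straightforward; here is a minimal sketch using the hypothetical figures above (and ignoring pre-money/post-money distinctions and underwriting costs).

        # If selling a fraction of the company raises a given amount,
        # the whole company is implicitly valued at amount / fraction.
        def implied_company_value(fraction_sold, amount_raised):
            return amount_raised / fraction_sold

        # Selling 49% of the store for $1 million implies a total value of about $2 million.
        print(implied_company_value(0.49, 1_000_000))   # ~2,040,816

        # A 40% stake in a company acquired for $10 million is worth $4 million.
        print(0.40 * 10_000_000)                        # 4,000,000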
     An in-between of private and public is venture investment. Venture investors often will insist on getting a majority of the company (almost definitely so if a second round of investment is needed), but the investment is similar to that of public stockholders except that it remains within the private regulations and restrictions of access. The venture company hopes to double their money -- or triple or quadruple it. They expect to do this within a finite period of time (say two or three years) and the easiest way for them to realize their profits is to then take the private company public -- or to ready it for an acquisition event. So, venture capital investment is often a route to an IPO. The primary difference is the possibility of rapidly building up value before public regulations take hold.
     It should not need to be said -- but I'll say it anyway. In the case of investors -- public or private -- not all investments go well. Investors have a bit more protection within public corporations. A venture capitalist will spread their risk -- $10 million spread between 4 companies. One goes bankrupt, two increase their value a little bit (say 10%), and one doubles their investment. This means (depending on the division of the investment) that they make a good, but not great, return on their investment. If that successful investment triples instead, then they have made a much larger profit (once again, depending on the division of the investment). They don't expect every investment to work out but, to stay in business and make the profits they want, the average return needs to be attractive.
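     For the curious, here is what that spread looks like with an even split of the $10 million across the four companies -- the point above that the answer depends on the actual division still stands.

        # Blended return for the venture example, assuming an even $2.5 million in each company.
        investments = [2_500_000] * 4
        multiples = [0.0, 1.1, 1.1, 2.0]   # one bankrupt, two up 10%, one doubles

        final_value = sum(inv * m for inv, m in zip(investments, multiples))
        print(final_value)                             # 10,500,000
        print(final_value / sum(investments) - 1)      # 0.05 -- a 5% overall gain

        # If the winner triples instead of doubling:
        multiples[-1] = 3.0
        final_value = sum(inv * m for inv, m in zip(investments, multiples))
        print(final_value / sum(investments) - 1)      # 0.30 -- a 30% overall gain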
     So why would anyone want to stay private? Well, besides avoiding public regulations, there are also stockholder expectations. Some stockholders can be very patient (such as for Amazon, which took many years before it showed a profit -- but its stock value kept rising anyway). Most want some type of increase of value (dividend, rising stock value) every quarter. If it stays level (or goes down) for a couple of quarters, "short term investors" are likely to start looking around for a "better" investment. Companies hope for primarily "long term investors", but being publicly traded means that almost anyone can invest.
     This short-term requirement leads into "next quarter development plans". Long-term development plans, and investment, must be kept limited as the short-term development plans must succeed to keep up the investor interest and stock value. If the company returns to private then they still are expected to make a profit but they can put much more effort, and resources, into longer range plans.
     If you plan to privately run a company for the rest of your life and then pass it along to your children, then there are few reasons to go public. In-N-Out is a good example of such a private company. If you want to leave and go on to your next great venture, then public is the direction to head -- with, perhaps, the assistance of a venture capital company to increase your value first. Every founder, or set of founders, has their own dream and priorities. Best of luck in following your particular dreams!

Saturday, March 16, 2019

Does Anybody Really Know What Time It Is?


     The United States (most of it) just went through the Spring ritual of changing to Daylight Savings Time. Or, as a cartoon published during my childhood indicated, cutting off one end of the blanket and sewing it on the other end. Along with the changing to DST comes the biannual articles and discussions on whether we should stay with Daylight Savings Time (I assume that it would then become the new "Standard" time and the old "Standard" time would disappear) year round.
     Personally, I don't know the "best" answer. But it does call out vividly that time, as displayed on a clock, is arbitrary. What we call time is a human invention. Of course, time as a reflection of entropy (physical movement towards disorder) exists without humans -- but it is unknown whether the manner in which we perceive time (past -> present -> future) is fixed or a matter of perspective.
     In most human situations, the perception of time matters more than the numbers we attach to it. It takes "forever" to receive an anticipated message or event. Children grow up "so quickly". In my experience (I cannot speak for yours), looking back at time seems much shorter than looking ahead.
     I have a personal theory that the perceived length of a span of time is proportional to the fraction of one's life it represents -- and thus inversely proportional to one's chronological age. For a five-year-old, a year is an enormous amount of time because it is 1/5 of their life so far. But, for a 60-year-old, a year is only 1/60 of a lifetime and seems much shorter.
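     As a minimal sketch of that theory (the formula is just a restatement of the proportionality claim above, not anything measured):

        def perceived_year(age_years: float) -> float:
            """Toy model: a year 'feels' as long as the fraction of life it represents."""
            return 1.0 / age_years

        for age in (5, 20, 60):
            print(f"At age {age}, one year is {perceived_year(age):.1%} of a lifetime so far")
        # At age 5, one year is 20.0% of a lifetime so far
        # At age 20, one year is 5.0% of a lifetime so far
        # At age 60, one year is 1.7% of a lifetime so far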
     Perception of time is also cultural. Some cultures (the Swiss are the stereotype) are "ruled by the clock" -- everything must be done exactly "on time" and the schedule rules. Other cultures (traditionally said of many First Nations communities) treat it as a general framework; "tomorrow morning" may vary by plus or minus a day. And yet other cultures assume inherent delays in most planned events -- so a plane that takes off at "7pm" might actually take off sometime between 6:30pm and 9pm (or later, if mechanical or staffing problems intervene). And that is OK, because punctuality is not expected.
     Another variable of perceived time is focus. If I am focused on doing something -- because of a deadline or because I really love doing it -- then I am concentrating on the task and not on the time and time will go "fast". If I am thinking of things other than the current task, "waiting" for something "better", or trying to keep in mind multiple things that should happen within the same period, time goes "slowly".
     One more parameter to the perception of time is emotion. If you are dreading something, time usually seems to go faster. If you are looking forward to something, then it "just never arrives". Perception of time seems to be the inverse of how much you want the event to arrive. I don't want it, it happens "faster". I do want it, it happens "slower".
     So, to answer the original question -- no, probably no one really knows what time it is. Does anyone really care?
    

Sunday, March 3, 2019

The KISS philosophy: Forgotten but still needed.


     As an engineering executive/manager, I had a developer once come to me and say "here it is. I have fixed the last bug". I smiled and said "well, I can't wait for the next-to-last bug" recognizing that, if we counted that way, we would never reach the first bug.
     All software has problems (not isolated only to software -- but it is particularly prevalent in software). Even if somehow the program was simple enough and used for a long enough period that all of the problems within the program were fixed, a program does not exist in isolation. It will interact with other programs and the hardware. Change those and problems may easily surface.
     The primary factor behind the fact that software always has problems (or "bugs") is complexity. A five-line program may eventually get all possible interactions tested and all known problems fixed. A 200,000-line set of programs (or processes) has little chance of even having all of its problems known.
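     One hand-wavy way to see why (a toy model, not a measurement): if every part of a program can potentially interact with every other part, the number of interactions to test grows roughly with the square of the number of parts.

        def pairwise_interactions(n_parts: int) -> int:
            """Toy model: every pair of parts is a potential interaction to test."""
            return n_parts * (n_parts - 1) // 2

        for n in (10, 100, 1000):
            print(f"{n} parts -> {pairwise_interactions(n)} pairwise interactions")
        # 10 parts -> 45 pairwise interactions
        # 100 parts -> 4950 pairwise interactions
        # 1000 parts -> 499500 pairwise interactions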
     There is an anecdote about Bill Gates talking to a developer at Microsoft. He reportedly said "Don't worry about the size or the amount of memory needed. By the time it is finished, we will have faster processors with greater amounts of memory." A side effect of this attitude (loved by sales, disliked by consumers) is that new programs usually require new hardware, and updated supporting software, to perform at their best.
     On the flip side of this issue, Soviet developers used to have a reputation as very good programmers. They were required to use older computers with much less memory, and they had to share equipment such that they had only certain time slots in which to compile and test their programs. These equipment constraints forced them to be much more careful in their programming and to keep their programs small and efficient.
     In the world of programming and marketing, complexity is also sometimes referred to as "Creeping Featurism". Marketing and Sales demand new features that can be used to distinguish a program from that of the competition. However, each new feature increases the complexity of the program and the system -- and the complexity does NOT go up linearly -- an increase of 5% in the number of lines of code may double the number of initial problems to debug.
     All of these extra features might eventually be of general benefit if it weren't for the fact that many go unused by most people. Users may not even know a feature exists -- or, if they do know it exists, they don't know how to use it. If this is so, why add the extra features? The answer lies in that word "most".
     Assume that a program has 200 features. Twenty of those features are used by almost everyone. One user makes use of 40 features. Another user makes use of 50 features. But only one of the first user's 20 "extra" features appears among the second user's 30 "extra" features. The rest are used only by that particular user.
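     A small sketch of that overlap -- the specific feature IDs below are invented purely to match the counts in the example:

        # Hypothetical feature IDs chosen only to match the counts above.
        core  = set(range(1, 21))                 # 20 features almost everyone uses
        user1 = core | set(range(21, 41))         # 20 core + 20 extra = 40 features
        user2 = core | {21} | set(range(41, 70))  # 20 core + 30 extra = 50 features; feature 21 is the shared extra

        print(len(user1), len(user2))             # 40 50
        print(len((user1 & user2) - core))        # 1  -> only one "extra" feature is shared
        print(len(user1 | user2))                 # 69 -> features needed to satisfy just these two users

     Multiply that across thousands of users and nearly every one of the 200 features has someone who depends on it -- which is how the complexity stays in the product.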
     When I was a beginning programmer in the 1970s, all of our courses emphasized the KISS philosophy. The acronym KISS stood for "Keep It Simple S_____" (substitute your own favorite S-word). It was a reminder of the discipline that was expected (and that the Soviet programmers had as a requirement). 100 lines of code that performed a function were much better than 300 lines of code that performed the same function: easier to debug (and usually with fewer bugs in the first place), faster, and requiring fewer system resources. KISS fell victim to the expectation that Moore's Law would always hold -- that processor power WOULD continue to increase and that memory would get cheaper and cheaper. But being ABLE to function does not imply that something was written as well as it could have been.
     Although the KISS principle was created in connection with software development, the same holds true for other complicated, interconnected things -- such as laws and regulations or industrial factory processes.
     Do you work with the KISS principle? If so, why? If not, why not?

Saturday, February 9, 2019

Going to the robots: a shift of workforce


     People sometimes say that we are "going to the dogs" -- well, I would say that we are really "going to the robots". Robots were named in 1920 by the Czech playwright Karel Čapek in his hit play "R.U.R." -- Rossum's Universal Robots. The word robota initially was used to indicate servitude or forced labor. So, in accordance with the original usage, there are quite a few humans who would qualify. Within the play, the manufactured robots were described as soulless humans -- manufactured biological creatures without access to feelings or independent thoughts.
     Current usage applies to non-living mechanisms (with Cyborgs and Androids as in-betweens). In the past, the word was used primarily for non-living mechanisms that retain the general shape and capabilities of living humans. It has now expanded to mechanical reproduction of actions previously possible only for humans -- "robot arms", ATMs (which replace bank tellers), self-checkout counters (which replace cashiers), "humanoid" robots (adjuncts to healthcare and services -- huge future potential), and so forth. Robots are classified in various ways -- method of movement, category of use, versatility (programmed for one use, capable of multiple uses, or adaptive (AI)), and others.
     Leaving out definitions of Artificial Intelligence (AI), and the potential challenges therein, there are many consequences of a shift of labor to robots. By definition, a robot capable of performing a human duty, or action, displaces the human -- the human is no longer needed for this duty. However, the robot needs to be designed, built, programmed, and maintained. One can put together formulas of sorts: (number of robots * useful lifetime) gives the work displaced (a net loss of workers); (number of people needed for design, building, programming, and maintenance * time needed) gives the work the robots require (a net gain of workers). Design, building, and programming take a finite (limited -- it stops at some point) amount of time, and the effort during that time may create a large number of robots. Maintenance is ongoing, but one person might take care of dozens, or even hundreds, of robots.
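     A back-of-the-envelope sketch of that trade-off. Every number below is invented purely for illustration; the point is only the shape of the comparison.

        # Toy labor-displacement model; all figures are made up, not data.
        robots_deployed       = 1000
        robot_lifetime_years  = 10
        design_team           = 50        # designers/builders/programmers, needed only during development
        development_years     = 2
        maintainers_per_robot = 1 / 100   # one maintainer can look after ~100 robots

        worker_years_displaced = robots_deployed * robot_lifetime_years
        worker_years_created   = (design_team * development_years
                                  + robots_deployed * maintainers_per_robot * robot_lifetime_years)

        print(f"Worker-years displaced: {worker_years_displaced:,.0f}")   # 10,000
        print(f"Worker-years created:   {worker_years_created:,.0f}")     # 200

     Under those made-up numbers, the displaced labor outweighs the new, more highly skilled labor by roughly fifty to one -- which is the "fewer but more highly educated" shift described next.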
     The final effect is that robots replace workers while requiring a smaller number of more highly skilled people for a smaller amount of time. This means that, as robotization of society occurs, people will need more education and more technical, focused training. And, for any given number of robots put into the workplace, far fewer people are needed for support activities than were displaced. The more robots, the fewer (but more highly educated and trained) people needed.
     This type of shift of workers occurred in the "Industrial Revolution" (mid 1700s to mid 1800s). Very early robots such as automated looms displaced traditional weavers from their professions. In response, there were riots which were stopped with considerable violence. Eventually, workers learned new trades and shifted up in education to take new roles which developed.
     This same shift will be needed for the new "robotic revolution". Greater amounts of education and training for people but, since fewer people will be needed to attain the same results, fewer hours of work per person. This could conceivably iterate (the process continues with additional, more highly educated, workers displaced) until one has a similar situation as posed by Isaac Asimov in The Naked Sun, where there are plantations of robots with isolated humans having few required tasks.
     I am not ready to anticipate robot plantations just yet. But we may very well be entering a period where active labor is done by fewer and fewer people with higher levels of education and training. If so, there will be a strong need for greater emphasis on (and availability and affordability of) continuing education, for more deliberate labor policy oriented toward reducing the number of work hours per worker, and for methods of distributing the savings and benefits across the entire labor pool.

Saturday, February 2, 2019

Going to Waste or Going to Waist: the dilemma of food distribution


     For many in my generation, our parents (usually mothers) implored us to "clean our plates"; people were starving (at that time, "in China") and would love to have the food on our plates. Although not wasting food is not inherently a bad thing, such requirements often caused problems by teaching us to ignore our bodies' signals as to whether or not we were still hungry. And thus, by trying not to waste our food, it often accumulated around our waists.
     Another aspect of this (which occurred to me and, I am sure, to many other children) is: how did my finishing my food help those in other places who did not have enough? Portion control (especially countering the economics of supersizing) is an excellent goal -- eating the amount that is best for our health, with the right composition and nutrition. But portion control only makes us more likely to have healthy bodies (exercise and general lifestyle still factor in). It does not allocate more food to those who do not have enough.
     Assume that we each eat only what we healthily should. In the U.S., that would mean a net reduction in the average amount of food eaten. Less food eaten means less food purchased and a surplus of food produced. That surplus can be addressed by reducing the amount of food produced or by finding other markets for the food. Reduction of food production hurts the farmers (though many have already been shoved aside by the mass food producers) -- much better to find other markets.
     After correcting our portion sizes, we now have additional (the U.S. is already a net food exporter) food to send out to those who do not have enough. Raw food items, which are globally produced and imported and exported, are considered to be commodities. The price of commodities goes up and down but is about the same all over the world. However, the price of prepared food sold to people varies tremendously around the world.
     On December 31, 2018, the price of wheat in Kansas (in the U.S.) was about $5 per bushel. One bushel of wheat produces about 60 pounds of whole-grain flour or 42 pounds of "white" flour. Each pound of whole-grain flour is about 3 1/2 cups which is about the amount needed to make one loaf. Thus, each bushel of wheat can make about 60 whole-grain loaves and each loaf would have about 8 cents ($0.0833) of flour in it. If you insist on white bread -- it will have about 12 cents ($0.12) of flour in it.
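     Spelled out in a few lines (using the figures above; treating one pound of flour as one loaf is the approximation from this paragraph):

        price_per_bushel   = 5.00   # USD, Kansas wheat, end of 2018 (figure from the text)
        whole_grain_lbs    = 60     # pounds of whole-grain flour per bushel
        white_lbs          = 42     # pounds of "white" flour per bushel
        flour_per_loaf_lbs = 1      # about 3 1/2 cups -- roughly one loaf

        print(f"Whole-grain: ${price_per_bushel / (whole_grain_lbs / flour_per_loaf_lbs):.4f} of flour per loaf")
        print(f"White:       ${price_per_bushel / (white_lbs / flour_per_loaf_lbs):.4f} of flour per loaf")
        # Whole-grain: $0.0833 of flour per loaf
        # White:       $0.1190 of flour per loaf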
     Eight cents of flour in a whole-grain loaf! Do you pay eight cents for a loaf of bread? Probably not. There are a number of factors that increase that price to what you pay. First, of course, a loaf of bread is not JUST flour. Depending on the recipe, there may be oil (or butter), yeast, salt, sugar, milk solids, and whatever. In addition, there are also equipment, labor, fuel/energy, and time needed to convert the ingredients to the loaf of bread. Second, the price of the raw material is not what you will pay at the market (either used within a product or by itself). There are transportation costs added, profit margins for each person/company which handles it, and storage costs.
     Of these costs, labor is the most variable between countries. Also, the general cost of housing, fuel, and taxes will vary. So, a loaf of white bread in Nigeria will cost about 1/3 the price in the U.S. In France, that loaf of white bread will cost about 40% of that of a loaf in the U.S. In Sweden, the price is about the same as in the U.S.
     Okay -- we have (in possibly overly verbose detail) shown that bread costs different prices around the world, ranging from about 1/3 of the U.S. price to the same as the U.S. price. We now have to compare world incomes. Bread costs 1/3 as much in Nigeria as in the U.S., but average household income in Nigeria is about 1/28 that of the U.S. This means that the loaf of bread has an effective cost (as a share of household income) of 28/3 -- about 9 1/3 times -- that of the U.S. In other words, buying a loaf of bread in Nigeria takes the same share of average household income as paying roughly $10 to $40 for a loaf in the U.S. (the lowest-cost white bread there is around $1, and fresh-baked bread may cost $4 a loaf). On the other hand, average income in Sweden is about 90% of that in the U.S., so the difficulty of buying a loaf of bread in Sweden is fairly close to that of buying one in the U.S.
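     The same comparison as a sketch, using only the rough ratios quoted above (France is left out because the paragraph does not give an income figure for it):

        # Relative figures from the text above (U.S. = 1.0 on both scales).
        countries = {
            #          (bread price vs U.S., household income vs U.S.)
            "Nigeria": (1 / 3, 1 / 28),
            "Sweden":  (1.0,   0.90),
        }

        for name, (price_ratio, income_ratio) in countries.items():
            effective = price_ratio / income_ratio   # income share per loaf, relative to the U.S.
            print(f"{name}: a loaf takes about {effective:.1f}x the share of income it takes in the U.S.")
        # Nigeria: about 9.3x; Sweden: about 1.1x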
     We have now achieved a general knowledge of both cost and affordability of food within the world. How do we transfer that surplus of food from the U.S. to other countries (in particular, to those with low average household incomes)?
     In brief (finally, you may say!), the food must be either sold or given to the people. Selling to people in a higher-income country is not a big deal. But the people our parents referred to when urging us to "clean our plates" are much less able to purchase it. Many worldwide charitable organizations donate food where needed in such cases, but whether the food reaches the people in need usually depends on political stability and honesty.
     So, it is a significant problem. The people most in need have the least capability to purchase food and, often, face political obstacles to receiving it even when it is given freely. Cleaning our plates does not help them. It also does not help us if the portions are not appropriate. The problems with, and solutions for, getting food to those who need it lie primarily at the receiving end.

Sunday, January 6, 2019

Economic reevaluation: From GDP to the donut


     The Gross Domestic Product (GDP) has been maintained as the holy grail of economic evaluation for some 75 years. It was given its modern definition by Simon Kuznets in 1934 and then adopted as the primary method for measuring a country's economy at the Bretton Woods conference in 1944. Even as Kuznets introduced the measure, he warned against overusing it and making it more important than it really was.
     Alas, humans often prefer to take the easy route rather than more troublesome, but more accurate, methods. Thus the GDP -- relatively straightforward to calculate, although requiring huge masses of data -- became the primary indication of a country's economic health. An increasing GDP was "good", a rapidly increasing GDP was "better", and a stagnant, or decreasing, GDP was "bad". An example of such a graph follows ("real" GDP compensates for inflation by expressing values in the currency of a fixed reference period):
[Graph of real GDP over time -- not reproduced here.]
     Many criticisms have been made about the GDP but it was simple, came with an apparently exact number, and there was no alternate proposal to take its place. Complaining about something that is bad is useless unless you have something better that people can agree upon to take its place.

Some of the main criticisms of the GDP as a primary economic index are:
  1. It leaves out a lot of the economic activity of a country -- probably the majority of activity. It only counts activity where "money" (or economic credit) is transferred from one entity (person, corporation, country, ...) to another. This leaves out all of the work done by "non-paid" workers -- including parenting, "housewives" and "househusbands", inter-generational childcare and other family work (such as within a business or on a farm), and so forth. Think that shouldn't count? Think about how many minutes a country would survive without it.
  2. The model relies on continued growth. Growth of population, growth of the number of consumers, growth of production, growth of the money supply according to GDP status (a bit circular there), and on and on. This emphasis on growth also pushes the economy toward consumerism and the wasteful use of nonrenewable resources. In a finite world, with finite resources and the need to protect the environment and economy for future generations, the idea is counter-productive and destructive.
  3. GDP aggregates the economic transfers within a country. Thus, if one company (or individual) controlled all official economic activity, the GDP could be the same as for a country where economic transfers were spread out equally among all the people within a country. Accurate numbers but largely meaningless in terms of economic health.
  4. Economic credit transfers are, by themselves, a poor indicator of a country's health. There are many other "soft" factors -- "happiness", income distribution, access to food and water and clean air, and so forth. Thus, you can easily have strongly positive GDP growth in a country in which no one wants to live.
     As mentioned above, one objection to discarding the GDP as a primary economic indicator has been the lack of anything "better" to take its place. In this case, "better" means something that, at the least, takes into account the above criticisms of the GDP model.
     Kate Raworth came up with a model that has limits: an "outer" limit beyond which human activity uses up resources faster than they can be renewed, and an "inner" limit below which human activity cannot provide the minimum needed to live. These upper and lower limits are expressed as two concentric circles -- in the shape of a donut (doughnut for some).
[Diagram of the doughnut -- the two concentric circles described above -- not reproduced here.]
     So, what does it mean to have a new model? Will this new model solve all of the world's problems? No. However, in order to THINK about a situation, or an idea, it is very useful to have a visualization of the concept. A person must have words, or some other representation (such as images), to properly manipulate an idea.
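     As a toy representation of the idea -- the indicator names, values, and limits below are invented purely for illustration, not taken from Raworth's model:

        # Toy doughnut check: each indicator has an inner floor (minimum needed to live)
        # and an outer ceiling (rate at which resources can be renewed). All values invented.
        indicators = {
            #                  (value, inner floor, outer ceiling)
            "food supply":     (1.2,   1.0,         2.0),   # inside the doughnut
            "resource use":    (2.6,   0.5,         2.0),   # beyond the outer limit
            "access to water": (0.7,   1.0,         2.0),   # below the inner limit
        }

        for name, (value, floor, ceiling) in indicators.items():
            if value < floor:
                status = "shortfall (below the inner limit)"
            elif value > ceiling:
                status = "overshoot (beyond the outer limit)"
            else:
                status = "within the doughnut"
            print(f"{name}: {status}")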
     One real-life example that has come out of this model is that of reorienting sales from products to services (which is compatible with many business strategies). There is an airport (I believe in Germany) that now pays a company for light -- a certain number of lumens distributed across certain living areas in the airport -- instead of paying for light fixtures, light bulbs, and electricity. Since the provider wants to maximize its profits, it is to its advantage to make light production as long-lasting and energy-efficient as possible AND to recycle older materials as they are replaced (rather than throw them out). Profit on services makes the provider want to deliver them as efficiently as possible -- and that tends to fit the donut model better than the continuous growth/consumption GDP model.

User Interfaces: When and Who should be designing them and why?

     I am striving to move over from blogs to subscription Substack newsletters. If you have interest in my meanderings please feel free to ...