Friday, April 26, 2024

To Waste or to Waist: That is the question

 

     As is true of many people growing up in the US, I was encouraged to always clean my plate ("encouraged" is putting it mildly -- I remember still being required to sit at the dining table at 3am, and even longer if "necessary", because I hadn't cleaned my plate). The general principle of not wasting food is a good one. The dangling of "there are starving people in xxx" is NOT, as the reasons people starve in parts of the world are political, not economic (cleaning my plate, or not, would make no difference).

     If I had taken food onto my plate myself and been told "take what you want, eat what you take," that would also have been reasonable. I would have learned to take only what I was hungry for and would eat, and there would not have been any immediate wastage. (Eventual wastage was/is dependent on how much was left at the end of the meal and whether there were unplanned meals where leftovers were a strong source of food.)

     Having others put food on my plate and then being required to eat it was NOT a good principle. It primarily taught me to ignore my feelings of satiation. There was no such thing as being "full" or "content" as long as there was still food left. This system of superfluous portions continues into the present day with "supersizing" and eating-out portions in general. There is always the possibility of bringing food home but, honestly, how many times does it go home to be later thrown out?

     Being able to overeat has been a source of status in many societies in the past. Actual overeating, however, has not been, and is not, healthy for anyone.

     This principle also comes true in business areas. It's always tempting to get a lot of something because you can get it for less per unit. Buy 50 for $100, buy 100 for $150. But there are hidden costs to buying more than you currently need -- storage, and logistics in general. Yes, you may save money (even allowing for hidden costs), but might the capital tied up in stored items be more useful for something "now"? This is the primary driving principle for "just in time" deliveries of materials. When this is combined with primary storage serving multiple consumers, then both types of savings can take place -- quantity savings for the primary storage location, with "just in time" savings of space and capital for the consumers.

     As personal consumers of food, we also tend to put the surplus into storage. That storage affects our health and our waistlines. It is difficult to average out food consumption and purchasing to create an optimum "no extra, no waste" situation. It is more difficult with one person than with two, and more difficult for two people than for five. There is also the need to balance need against cost. If buying four cucumbers costs only 25 cents more than buying two, then buying four is the obvious bargain, correct? But what if I said that you can only reasonably use two cucumbers before the others go bad? You end up saving 25 cents (and storage space) by NOT buying the larger quantity.
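
     A minimal sketch of that arithmetic, in Python, with one hypothetical price filled in (assume the two-pack costs $1.00, so the four-pack costs $1.25):

def cost_per_usable_unit(total_price, usable_units):
    # What each item you actually eat really costs.
    return total_price / usable_units

# Hypothetical numbers: two cucumbers for $1.00, four for $1.25,
# but only two get eaten before the rest go bad.
print(f"two-pack:  ${cost_per_usable_unit(1.00, 2):.3f} per cucumber eaten")   # $0.500
print(f"four-pack: ${cost_per_usable_unit(1.25, 2):.3f} per cucumber eaten")   # $0.625

     Measured per cucumber eaten rather than per cucumber bought, the "bargain" quietly becomes the more expensive choice.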

     "To waste or to waist". My childhood training taught me one set of values. My knowledge, as an adult, tells me otherwise. But that doesn't make it easy to change. It can also be difficult, in business, to pass up bargains where the real value received does not match up to the price. As always, all aspects of a transaction -- or transaction's journey -- must be considered.

     

Wednesday, April 17, 2024

Interrupt Driven: Design and Alternatives

 

     It should not be surprising that there are many aspects of computer architecture which mirror how humans think and behave. Humans designed computers. Humans developed programming. Humans devise algorithms. Is there any way that the patterns of thinking and behavior for people could NOT have influenced the design of computer systems?

     Sometimes, you want to do multiple things "at the same time" or you want multiple programs to run "at the same time". As discussed in my blog on multitasking, this isn't really possible but, on a computer, it is possible to make it LOOK like it is doing multiple things at the same time. The computer can produce this illusion because of how quickly it can follow instructions. The speed at which it changes back and forth between programs is quick enough that humans cannot perceive the difference. It is different with humans -- the swapping of tasks is very noticeable and humans are not as good at it as computers.

     Sometimes you want to do one thing most of the time but, occasionally, need to do something else. Sometimes that "something else" is only ready to be done at unknown times. Sometimes that "something else" is just of lower priority than your main task, but you still want to have it done when possible.

     This latter situation is usually handled on computers by means of the operating system (Windows, iOS, Linux, ...) using the priorities of tasks as a criterion for when programs, or tasks, are scheduled to run. In the world of the human, it is a matter of organization. It is still most efficient to continue one task to completion and then change to a different task, but it is possible to schedule the tasks to take different parts of the day. You work on cleaning the house for two hours, then you watch your favorite streaming serial for an hour, then you resume cleaning the house for another two hours, and so forth.

     The first situation -- where a secondary task is not always ready to be done -- has a couple of primary ways to be handled. One -- you can check on the readiness of the secondary task every once in a while (either at regular intervals or just "whenever you think about it"). This is called "polling". The other is to have some type of indication that interrupts you and tells you that the secondary task is now ready to be done. That is what doorbells are for. Rather than checking the front door every few minutes to see if someone is present, you expect them to ring the doorbell when they arrive. (If someone arrives and you are expecting to hear the doorbell, there may be a long wait if the doorbell is broken or they don't feel comfortable pushing the button.)
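
     A minimal sketch of polling, in Python, assuming hypothetical is_ready() and handle() callables for the secondary task:

import time

def poll_loop(is_ready, handle, interval_secs=300):
    # Check the "front door" on a fixed schedule; is_ready and handle
    # are hypothetical callables standing in for the secondary task.
    while True:
        if is_ready():
            handle()                  # do the secondary task only when ready
        time.sleep(interval_secs)     # back to the primary task between checks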

     So, when do you use one method and when the other? Polling has the advantage that it is done when you are ready for it (the secondary task may not be ready, but you are ready to check). Being ready, you have no additional things to do before you go check the front door. Interrupts (as long as they reliably happen) have the advantage that you go to the secondary task ONLY when it is ready, so there is no added inefficiency of checking and finding the secondary task not ready.

     Sounds as if interrupts are best? Ah, but there are a couple of problems. First, since an interrupt may arrive when you are NOT ready, you have to quickly do whatever is needed to become ready before you attend to the secondary task (turn off the heat on your cooktop, put the ice cream back into the freezer, put a bookmark in the book you are reading (or mark your place in an ebook), and so forth). This is "save and restore" (when you leave for the secondary task and when you return from it) and it can be significant overhead for being able to handle the interrupt. Second, sometimes it just is NOT a good time to be interrupted. Ever take a good soaking bath and have the doorbell ring? You can disconnect the doorbell before you enter the bathtub, but then you risk missing someone arriving. (In the case of computer systems, you may miss some critical thing that MUST be dealt with in a certain amount of time -- thus, you never turn off interrupts within a nuclear power console -- you just become ready to go to the control panel in your bathrobe.)
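
     A minimal sketch of the interrupt style, using a Unix signal as a stand-in doorbell (this assumes a Unix-like system; the "kitchen state" is a hypothetical stand-in for whatever the primary task is in the middle of):

import signal
import time

state = {"burner_on": True}              # what the primary task was doing

def on_doorbell(signum, frame):
    saved = state["burner_on"]           # save: make it safe to walk away
    state["burner_on"] = False
    print("answering the door")          # the secondary task itself
    state["burner_on"] = saved           # restore: resume as if never interrupted

signal.signal(signal.SIGUSR1, on_doorbell)   # wire up the doorbell

while True:                              # the primary task; from another shell:
    time.sleep(1)                        #   kill -USR1 <this process's pid>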

     The decision as to whether you poll or allow an interrupt to trigger a response depends on many factors. One important one is: how time critical is the secondary event? Can it wait five or ten minutes? Perhaps even a half hour? It is important to reduce the heat under a sauce after it is ready, but that may allow five or so minutes of leeway. If a smoke alarm goes off, however, it is very important to check what is wrong very quickly -- if only to silence the alarm. A secondary task that is not time critical is a good candidate for polling (checking every once in a while). The smoke alarm generates its own interrupt in the form of an audible sound. Another factor is: how often is this going to occur? Remember that unexpected interrupts have an overhead, an extra amount of work needed to prepare to handle the interrupt. If the event happens (and is ready) frequently, then polling is a good way to monitor it (because a high percentage of your checks will find it ready).
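
     Those two factors can be put into a back-of-the-envelope comparison. A sketch with hypothetical costs (one unit per check, ten units per interrupt's save/restore):

def polling_cost_per_event(check_cost, hit_rate):
    # If only 1 check in N finds the task ready, you pay N checks per event.
    return check_cost / hit_rate

SAVE_RESTORE_COST = 10.0                 # what every interrupt pays (hypothetical)

for hit_rate in (0.9, 0.1, 0.01):
    polling = polling_cost_per_event(1.0, hit_rate)
    winner = "poll" if polling < SAVE_RESTORE_COST else "interrupt"
    print(f"hit rate {hit_rate}: polling costs {polling:.1f} per event -> {winner}")

     Frequent, patient events favor polling; rare, urgent ones favor the interrupt -- the sauce pan and the smoke alarm, respectively.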

     Life will carry on. You will continue to get unexpected interrupts. You will occasionally get back to doing something too late and a pan may burn. But if you understand what is going on and how you can handle it you may be able to organize matters to optimize how you work with those events.

Friday, April 12, 2024

Transitions: A part of life

 

     Transitions occur as part of many aspects of life and business. On a personal level, transitions occur as we grow older and as we accumulate experience and start applying that experience in different ways. For a business, key personnel will come and go, products will be retired and new ones come into being, and events will bring about unforeseen changes -- perhaps very rapid changes.

     We all grow older. We make transitions from newborn to toddler to young child to older child, adolescent, early adulthood, etc. It is certainly possible to separate the phases differently. In some ways, the transitions are continuous. We each age a day per day. But, usually, there is a defining event to indicate a transition. In some cultures, there are specific rituals and customs to mark the occasion of moving from adolescent to young adult, but many have no such thing. The transition may be obvious -- first time walking unaided, for example.

     Although our specific futures are not available for us to know, we can sometimes prepare somewhat. A baby practices the movements, and exercises the muscles, needed to walk. But I am not convinced that we can really prepare for adolescence (when those hormones kick in and change our moods, interests, and reactions). In the business world, we can prepare for known changes and, perhaps, improve the skills and methods that will allow easier adaptation to unforeseen future changes.

     Similar to transition lenses versus bifocals, transitions may be gradual or abrupt. A gradual transition may seem like an easier one but it may lead to a situation when you pass from safety to danger without noticing the change -- it is easier to notice danger when the change is abrupt. Note the old tale of the frog who was put into a pot of cold water which was gradually heated up to a boil. Much situational training concerns being able to make plans quickly, and act upon them, when faced with a difficult (possibly dangerous) situation. This can apply to being charged by a tiger or a necessary supplier going out of business.

     The one thing you cannot do safely during a transition is to ignore it. A change in circumstances (which affects you) means there will be a need for changes in reactions, processes, and attitudes. (Of course, if the change affects only someone else, you can still be there to listen or offer assistance.) Although it is not true that an ostrich will hide its head in the sand upon encountering approaching danger, the metaphor is valid. Perhaps the monkey triad of "see no evil", "hear no evil", "speak no evil" is better. It probably depends on the circumstances and the direction from which the transition occurs.

     Internal transitions are, by definition, not visible to others. This makes it hard for others to help. There is the additional difficulty that the person involved in the internal transition may not, themselves, be able to properly describe the changes. These are likely to be among the more difficult of transitions.

     How can a person prepare for transitions (whether in personal lives or business)? The foundation element is that of recognizing that circumstances, needs, and goals change -- and can change in a moment. This is much easier for Myers-Briggs "P" (Perceiving) types, who are inherently not as fixed on a goal. The "J" (Judging) types must learn to be flexible and ready for change. Although Myers-Briggs applies directly to humans, some of the same behaviors are true of corporate cultures.

     Other preparations involve knowing how to determine the components of a change. Much easier said than done, especially with internal transitions. Even external transitions may have important parts of which no one is aware. You cannot know all of the components of a change before the change (though you may know many of them) -- what you want is to learn how to identify those components.

     Once the components are recognized, it will be time to decide what the responses need to be. It is very difficult to do this in advance, as there are so many unknowns leading up to most transitions.

     We do know, however, that there will be transitions.

Friday, April 5, 2024

ROI: Methods and Pitfalls

 

     When I was at the company I co-founded (TeleSoft International), we had a lot of the components of success but missed a number of crucial ones -- which eventually led to a dwindling and a slow death of the company. One area that we did recognize (but likely did not do as well as needed) was that we needed to focus our limited resources where we had the best potential for income. This is the area called "Return on Investment" (which, I am sure, most of you already know) or ROI. The investment includes labor, capital, goodwill, time, existing equipment, and (I'm sure) other aspects. Return on investment is often listed as a number -- the amount of income generated by use of all of the resources invested. But, in reality, income is not the only return that can be generated. "Goodwill" -- a positive image and, perhaps, some later assistance -- is a completely valid return worth an investment.
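
     The numeric core of ROI is simple -- it is the inputs that are the art. A minimal sketch with hypothetical dollar figures (and note that goodwill resists being reduced to a number at all):

def roi(total_return, total_investment):
    # Net gain relative to everything that was put in.
    return (total_return - total_investment) / total_investment

investment = 80_000 + 20_000                    # hypothetical: labor plus capital
print(f"ROI: {roi(130_000, investment):.0%}")   # 30%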

     The first aspect of attempting to determine ROI (which is an art rather than a science, no matter what people may want to believe) is to adequately determine resources. How many people are available? What are their experience levels? What are their particular areas of expertise? What can they do better than the average person at another company? How much money is easily available, and how long could it last before replenishment of capital is needed? Are there any areas (sales, marketing, distribution, quality control, testing, engineering, ...) that are not sufficiently staffed? If so, are there plans to get them staffed?

     Are processes in place to allow for growth? Can you handle a tripling of orders in a week or two? Can you survive success? If your growth curve started to become geometric rather than mildly linear, would you be screaming with success or with panic?

     Once resources are well known, you have the capability of comparing potential products by weighing the potential gain against the resources needed. It is not sufficient to just read product notices and other industry press. Much of that is information trying to convince you that the products and services are important and needed. You need to know just what is being USED and what is in real upcoming budgets to be purchased. If you have access to huge unused resources, then you can afford to take risks approaching potential market niches as they open up. Most companies do not have such resources.

     So, you must focus on what is and what is being planned for. You need to be able to analyze other companies' sales status and to actively network with people in other companies to exchange non-proprietary information that indicates the direction of spending and growth. This was the primary area in which my company failed -- those in charge of gathering information did not like to go to conferences and other networking activities. The news from the ivory tower of publications and announcements was the primary indicator -- and it was often only wishful.

     You have your known resources and you have targets of development and expansion that can bring back revenue. Now you have to match them. Why should your company be working in this area? Does your expertise give you a learning curve advantage to allow you to take the lead in the market? Will people obviously turn to you with the expectation that you can provide what they need or do they need to be convinced? The former is definitely the preferred path. You must establish priorities. What resources need to be committed to what, and for how long? How do you determine a dead path -- when you should stop and devote your resources to other projects that still seem viable?

     You know your resources. You know what you would like to produce. You know what best matches between your resources and desired products. Now go to it.

     

Thursday, March 28, 2024

Biases and Prejudices: There is a difference

 

     It is always difficult to choose people on a jury. Every potential juror has a history, education, and daily life which influences their attitude towards everything. And this set of experiences varies from person to person. Show a person a red rubber ball and one person will smile, thinking about enjoyable times in the playground. Another person might grimace and say "get that thing out of here" because, for them, that ball is a reminder of being pelted in dodgeball in gym class and going home with circular welts on the body. Same red rubber ball, very different reactions.

     Considering that each person likely has a different perspective and history, what are they to do in a court? First, of course, is whether the person has any direct knowledge of the people or events in the case. In such cases, they might be prospective witnesses, but they should not be members of a jury. The prosecutor is going to want to find people who will come back with a verdict in their favor, and the defense attorney is going to want to find people who will come back in theirs.

     This is a problem because everyone has biases. A bias is a feeling towards something, a first impression and reaction. But it is only the first feeling -- it may change as things happen or as they learn more. A child is initially scared of a vacuum cleaner because it makes such a loud noise. As they get used to the noise and watch what is done with it, they may start to appreciate it when the floor becomes cleaner and the air fresher. It might go the other way, and they may dislike it even more as they get older and are given the task of being the person to push the vacuum cleaner from room to room. But show them a vacuum cleaner, and make it clear that it is not there for them to use, and they will be okay with its presence.

     A prejudice is what the word breaks down into -- a "pre" "judgement". You have formed a firm reaction, or decision, with no direct interaction with the current situation or information around the event. Prejudices could conceivably be about a thing or a cause, but the word is predominantly used about groups of people. A lot of people have prejudices about athletes. Some think that, because athletes have better-than-average physical abilities, they must not have intellectual abilities. Even given test scores, awards, and well-testified examples, the best outcome with a person holding such a prejudice is "that person must be an exception".

     So, in the case of a potential juror, everyone has biases. If they favor short people then, if possible, they will be glad to support short people. But, if they are given evidence that this particular short person has done something bad, they will not support them. Or vice versa, they don't like people who dress as if they are wealthy, but when the evidence indicates that this particular suited individual has not done any harm, then they will support them and their position.

     If a potential juror has prejudices, then they will support, or oppose, no matter what the current circumstance or evidence. With biases, though, they are open (perhaps not eagerly open) to being presented with evidence that will lead to a verdict in either direction. The prosecution would love to have people prejudiced in favor of a guilty verdict, and the defense would love to have people prejudiced in favor of not guilty. Each will try to eliminate jurors prejudiced in the other side's favor. Biases are less important, but each would like people biased in their direction -- it would mean less effort to convince.

     The "fairest" trial would have all the jurors without prejudices or biases -- working to make a judgement solely on the evidence presented. But all people have a history so awareness is our best hope.

Friday, March 22, 2024

We Are All Influencers

 

     A couple of years ago, I wrote a blog on the effect of influencers within our society. All that is still actively happening but I started "taking a step back" to recognize that we are ALL influencers.

     To influence is to be noticed. You might be seen, heard, sensed, or noticed through the effects of your actions. Influence can be positive or negative -- though for remembering a name on a ballot, it seems that "name recognition" alone is what matters most, and whether that person is known for behaving well or poorly is of secondary importance. Positive influence generally encourages positive behavior on the part of the observer. We want to be like them and we want to be acknowledged as such. Negative influence is not as clear. Observation of someone in a position of influence doing negative things may be interpreted as permission for the observer to do similar negative things.

     A great teacher can help a student become self-motivated for the rest of their climb up the educational ladder. A rotten teacher can knock them out of their groove -- but a good one, or a great one, can put them back on their path. In all cases, the idea is to help the person internalize the feelings that occur when someone external acts as an inspiration or cheerleader. That person may be family, a friend of the family, a teacher, or a respected friend, but the process is to lay the foundations to become our OWN inspiration and cheerleader.

     Most of the time, we do not know when we are being influencers. There are some, of course, who are deliberately trying to influence. But much of the most important influences upon our lives come from those who are just living their lives -- doing the best (or worst) that they see to do.

     A "foundation" of mores and behaviors is created in our very early years. A strong foundation is internalized quite early and can support the person throughout their lives. This foundation gives them strength to choose among later influences that will help mold their growth. But, in many different circumstances, the foundation may not get a chance to develop -- or for the person to internalize a set of values that can sustain them through the challenges of life.

     To the best of my knowledge and awareness, every person (unless raised in complete isolation) -- in the process of growing up -- is surrounded by a peer group and is, themselves, a peer to others. Depending on the strength of their early foundation, the peer group can have a greater, or lesser, ability to impress upon someone growing up. Joining a gang can be choosing a specific peer group, based on perceived values, when a person's foundation is weak or absent.

     The purpose of a leader is to guide, inspire, and support. If they work hard, and we feel acknowledged and supported, then we work hard. If they are not seen/heard/known to do significant work, then their influence can become unimportant or even negative. If they are extremely highly paid and work only moderately then it does not inspire, and motivate, as it does when recompense is aligned with amount of work.

     Inspiration by leaders lets us have a reachable goal for our efforts -- having a poor example does not inspire our best efforts. It gives us a feeling that our efforts are not worthwhile. Working more than others and earning less than those people is not encouraging.

     The effect of influencing is not always such a "weighty" matter. A smile may be passed from one person to another -- lightening the heart of each that encounters it. A cheerful "hello" may mean much to someone -- even if their immediate reaction is to growl back at you.

     We are all influencers. We often do not know when we are influencing others. It is thus important to continue to strive to do our best and be the best example of which we are currently able to be. (And forgive ourselves when that "best for the moment" does not meet our standards.)

     

Wednesday, March 13, 2024

Herstory: part of the rest of history

 

     "Her story" is a supplement of the traditional "his story" or history. It is a matter of restoring what is missing.

     "History is written by the winners". There are a lot of excellent articles, and blogs, available about how history is there to be learned from -- or ignoring history means to repeat it -- or other related topics about the importance of history in making decisions for the present to prepare for the future. An excellent topic and I may write about it some day. But this is not that blog.

     In order to chronicle every single event that occurs in all of our lifetimes, it would be necessary to double the amount of time available. We would spend one half of the time documenting the other half. Although many of our various global national security agencies monitor (legally or illegally) and document our conversations, messages, program watching habits, books checked out, and so forth, they have neither the time nor the resources to examine all of them. They use algorithms to narrow down what they look at (accurately or not, relevant or not). One gains privacy only by being totally uninteresting.

     The point of the previous paragraph is that any history that is written will be incomplete. "History is written by the winners" is an accurate description of accounts of war. Winners write the primary histories, and their "side" will be uplifted and the opposition downgraded. Commanding officers will be given the largest portion of credit (or blame) while the many ordinary fighting (and dying) soldiers may not even get a name on their grave.

     If an extraterrestrial alien obtained one of our history books, they would get a very sad view, as much of the history in textbooks is concerned with battles and war. Alas, there is rarely a period of time when a war is NOT going on, so there is plenty of material. But what about all of those people who are NOT at war, who are living on a regular, day-to-day basis? Very hard to find in the history textbooks. They just weren't "important enough" to keep track of.

     Emphasis on wars and battles from the viewpoint of the "winners". Concentration on the political, economic, and military elite. There is so much of life that is left unwritten in the typical history textbook.

     And this is while we make the (rarely true) assumption that the history that has succeeded in being written down accurately reflects the small slice of life being recorded. History is in danger of being rewritten at all times. Sometimes deliberately, to hide what people would prefer not be remembered. Sometimes accidentally, through destruction of records and witnesses. But in all cases, due to the incompleteness of the histories, so much is not recorded that one can only see the rest of life in the shadows of those upon whom the light is shone.

     So, what gets eliminated the most often? The poor, the non-dominant ethnic and religious groups, and the very existent but denigrated segments of population. Archaeologists are thrilled to come up with exciting old sarcophagi with jewels or the burial chambers of ancient rulers because those are the finds that generate sufficient excitement to obtain funding for further excavations. But historians are even more excited when evidence of, and material pertaining to, the excluded people are found. These masses -- composing the overwhelming majority of the population -- are the ones who really make life continue and, yet, so little is known about them from different ages and societies.

     The fact that history books, and classes, are incomplete (of necessity and of deliberation) is the reason why excluded segments work hard to get supplemental materials added to school and college curricula and included as a part of our general knowledge of our history. In the case of the often excluded roles, actions, and events concerning women, such additions are sometimes called "herstory". These concentrate on "her story" rather than "his story". There are books, and classes, that concentrate on other denigrated segments such as the First Nation people within the US, the Ainu in Japan, the aborigines in Australia (and so forth all over the world), and the many waves of immigrants into the US.

     If one looks at these supplemental works, and courses, it is easy to say "these are not accurate" -- because they are, like mainstream history texts, incomplete. Because their focus is different from the mainstream histories which are widely available throughout the population, these supplemental materials can make people uncomfortable when they are presented with information of which they had been uninformed. Often there is a reaction to suppress this information which has been neglected over the years -- but neglect does not make the information false.

     History is written by the winners about those people, and events, that those in power deem important. But there is so very much more to life than just that.

Thursday, March 7, 2024

Apathy: The Greatest Ally to Entropy

 

     Once upon a time (about 50 years ago), I was allowed to give a speech in front of my high school graduating class at Commencement. Searching through my experiences (all of 17 whole years), I wanted to come up with a topic that, in my opinion, affected life the most. I decided upon the issue of apathy. I had been a letter writer to the local newspaper about wildlife issues, and that and other issues made me believe that proactive behavior was the only way to make changes.

     I think I bored 90%, or more, of the audience. They probably don't even remember the talk. Kind of a peculiar situation where the reaction was evidence for the theme. The saddest thing, however, is that (assuming I could find the speech) I could give the same speech today without varying much of anything.

     So, what is apathy and why is it so important? It can take on a couple of different faces but each one is a matter of watching events continue on their course without trying to affect the direction, goals, or results. The stereotypical profile of apathy is sitting back on a couch eating a bowl of popcorn, and drinking a soda, and watching a television game show while your neighbors float down the street riding a mattress that has been washed out from their house during a flood that is going on. Just cannot be bothered or -- as our daughter might say -- "whatever".

     Apathy can exist within the business world as well as the personal world. If all you do is what you are told then it is possible you really are a "replaceable cog" in the machinery. To fight apathy, and entropy, you must be heard and make a difference. "Yes people" need not apply. Of course, especially within a business environment, it takes multiple people to be able to change. If your manager is not open to input then the only thing your words can do is to bore them or anger them. I had a manager who claimed to be fully open to input but had a "stream of consciousness" form of telling the staff what was going on and there was no opportunity, no break in the stream of words, to give input. In order to give input, one would need to take notes so they could be responded to -- out of context -- once the speech was completed.

     Note that apathy can also exist while being a participant. It is possible to be an active part of a group of people, a "movement", a political party, or an event and still be apathetic, because your existence is just a matter of being another head. You nod "yes" to everything that is said or done without any reflection, research, or input of your own. Acceptance, and following, without knowledge and awareness is yet another form of apathy. It is an apathy "in disguise": you are swirling around in the midst of a series of rapids -- within the heart of a group or movement -- but not working an oar or making any difference in the fate of the trip. Even on a roller coaster, if you are not contributing, then you are still just part of the environment.

    Entropy has a number of different definitions but one is that of a "general decline into disorder". This certainly happens in my son's bedroom. It gets cleaned up and then, over a period of days or weeks, one more sock hits the floor and, at the end, the floor is findable only with a shovel. Most of life works that way -- the leaves fall from the trees and cover the ground, dust coats the furniture and eventually is able to provide soil for new plants to grow.

     Apathy just leaves entropy alone. Only by challenging apathy does change happen and, for at least the time you are active, disorder gets moved back to some type of order. Challenging apathy? That's hard, as the first paragraph of this blog indicated. But there are ways...

Wednesday, February 28, 2024

Frankenstein's Monster: AI's shadow

 

     Alan Turing, in 1950, released a paper called "Computing Machinery and Intelligence" while at the University of Manchester. The main gist of the paper was that if the output of a computer could not be distinguished from that of a human being, then it must be considered intelligent. This has been a goal of those working on AI for a long time, and it is the primary point behind the general acclamation that we "have achieved AI". Various AI applications can present artwork, prose, dialogs, and other such things that cannot easily be distinguished from those created by a human. In a "blind test" (where the people judging have no pre-knowledge of which output was done by a computer), such output could pass as human work.

     No one is saying that AI has reached the goal of achieving human thought. AI is trained (as is true of humans) on lots of input -- lots of data from lots of sources -- and, iteratively, told what is right or wrong, and so "learns" how to do things, respond to things, and so forth. Which leads us to two of the current problems with AI: intellectual property conflicts and a lack of discrimination among sources.

     Intellectual property conflicts are the most straightforward. In order to train, lots and lots of information must be fed in. While some of that information is data obtained from various public sources, other information comes from private, proprietary sources, and still other information has been created by human minds and may be part of their livelihood as well as their career growth and reputation. That "brushstroke" done by a generative AI may be copied from an artist's work. That "narrative work of imagination" may be borrowing from the insights of dozens of writers who, justifiably, do not want their copyrighted material used without their permission. AI-generated code may (and almost definitely will) make use of code that was created as freeware as well as company proprietary code (which probably was not authorized to even be accessible via the Internet).

     Currently, an AI information-gathering product can present bad information. As an AI model is being taught, it is given access to a lot of information. Hopefully, most of this information will be factual and truthful. But some of it will be misinformation (honest errors), disinformation (deliberate lies), or declared works of fiction. Some of these can be eliminated by human monitors as the model is trained, but this just leaves it in the hands of those humans.

     Humans do not have a good track record of doing the research needed to determine the truthfulness of information -- they cannot be relied upon to do such for an AI model. It is conceivable that an AI model might be programmed with algorithms that allow it to compare information, the reliability of sources, and logical inconsistencies. It would even have an advantage, since it does not (currently) have emotions. But AI models are NOT at that point now -- and success will depend on the accuracy of any such algorithms.

     Isaac Asimov, considered one of the best of the "classic" science fiction authors, used as part of his robot-focused novels an idea called "The Three Laws of Robotics". I'll let people investigate the evolution of the laws, direct quoting of the laws, and the interpretations as discussed within the various stories produced. Let's just say these "laws" are meant to be a "leash" to prevent robots (or AI) from hurting people. And, though the laws work pretty well in the stories, there are almost always loopholes in any "law" and the three laws are no exception.

     But current AI has no such thing as the three laws -- and no practical way in sight to create such laws and to enforce that they will be part of any, and all, AI products. While AI is monitored, and final decisions are made by humans, the moral (and legal) responsibility for any problems will be in the hands of humans. If unmonitored and autonomous, it is unknown where the responsibility would lie. There are presently court cases struggling to delineate such matters in the relatively straightforward arena of self-driving cars.

     Computers, and programs, are NOT "smart". Compared to most humans, computers can be considered rather poor in intellect. They can only do very, very simple things. Arithmetic, comparisons, actions based on values, and such are the very limited set of actions that they can perform. Computer programs are eventually split apart into these very simple "machine" instructions. BUT, computers can do these simple instructions very, very fast (and they get faster every year). This gives the illusion of great ability on the part of the computer.
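
     Python will happily show you that split. A minimal sketch using the standard dis module (this displays interpreter bytecode rather than true machine instructions, but the principle -- one line becoming several tiny steps -- is the same):

import dis

def f(a, b):
    return a * 2 + b

dis.dis(f)   # prints the simple load/multiply/add steps that one line becomes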

     Humans make mistakes. Computers (or their hardware and software) can make mistakes. But computers can make many more mistakes, much faster, than humans because their basic strength is speed. Plus, the more humans rely upon computers (and their hardware/software) the greater the difficulty of correcting errors because many people erroneously think that computers "cannot make mistakes". Even worse, humans are often not allowed the ability to override the decisions that come out of computer programs.

     This is particularly relevant in many of the areas of AI. People do a poor job of facial recognition. Computers can do a better job but not a perfect job. If drones are given a locate (and, perhaps, violent action) instruction autonomously then they will likely sometimes pick the wrong person. The situation is, of course, even worse if computers get to make the decisions about all-out war (such as in "WarGames", "Terminator", or many other apocalyptic films). Politicians are quite at ease in having "collateral damage" but if you, or someone you love, is part of that collateral damage then you would probably not feel as calm about it. If facial recognition is done as part of security then "doppelgangers" or others who look similar to those on no-fly lists (or other lists) will forever be fighting for their right to exist.

     Computers, their programs, and AI can be of great benefit to humans. But they should always be subordinate to humans because humans are the ones that suffer from any mistakes made. Recognition of the fallibility of computers is very important and self-regulation within AI programs is essential to reduce the escalation of errors.

Thursday, February 22, 2024

The Shark Must Move: Investment for the Future


     A shark breathes by having the water flow past its gills so that it can extract oxygen. If it stops moving, fresh oxygenated water does not continue to flow past its gills and it dies. Unless outside factors are supporting them, companies, groups, and individuals must keep moving or they will not prosper. Perhaps they will linger but they will not thrive. 

     Once a company goes public, there are strong expectations. They come from analysts, from stockholders, from a board of directors. Very few of them have the benefit of a long-range view. Tell them that you'll have something amazing in three or four years and they may say "that's nice, what do you have for me now?" Thus, the pressure is steady to accomplish within the short term -- a quarterly viewpoint.

     Within a family, the range of returns stretches out a bit. As long as there is a known goal being approached, and some type of recognizable progress towards that goal, all stays smooth on the home front.

    We have three aspects of preparing for the future. First is to understand the goal we wish to approach. Note that a goal must include the definition of a journey but it does not always include a requirement to reach the destination. Perhaps it is an iterative goal and process. Second, we need to commit to doing what is needed to move towards that goal. Third, we need planned activity and movement to go towards that goal. Definition, commitment, movement.

     There is a dream. It may not be reached but it is a direction. My company died because we had no direction of growth beyond providing the best current product and support that we could. That is NOT a bad "mission statement" but it doesn't keep the shark moving. As our market shrank so did the company.

     Doing what you currently do is great for the day to day -- until it isn't. Sometimes a company, or individual, will succeed in being able to just continue to provide small improvements and, if the market remains strong and competition weak, that can keep the shark moving enough to stay alive. But, if the market changes or the competition becomes stronger and better funded, it is not enough and you can only look forward to dwindling results.

     Assume you have decided upon that goal. First living person on Mars perhaps? Are you willing to commit? Will you allocate enough resources to realistically move towards the goal? Resources may include time, money, labor, thought, negotiations, etc. A company may succeed in deciding upon a goal but be so restricted in outlook towards the next quarter's profit and production that there is no true commitment toward any long-term goal. 

     Sometimes people find themselves devoting all their time and energy to staying alive. Just like providing a great product and support, staying alive is devotion to the present and it is NOT a bad thing. But it won't move you towards your goal. "Success stories" are often about people who commit more than they thought they had in order to move towards a goal. Night classes? Writing that great novel in the hour before sunrise each day? Finding a job that allows study during the time one is watching over an incoming queue?

     Commitment is recognition that the future cannot be ignored.

     Having decided upon the direction (recognizing that you might not achieve the final goal) and committing to do what is needed to move towards that goal, you now have to move. Further education? Active networking? Setting aside income for savings? Reducing quarterly dividends to allow for more research and development? Possibly (but not as a first step) your commitment and belief is strong enough that you decide to hasten the accumulation of money by taking out a loan. Many businesses have begun with an extra mortgage on a house (or of the founder's parents' house).

     And if you achieve that goal? Start the process over again. Keep the shark moving.

Wednesday, February 14, 2024

Buy One Get One: bulk orders and changes in size

 

     I don't know how in the world the US government gets its inflation numbers, but our grocery bills have gone up about 35% since the pre-pandemic period. No changes in menu, number of people eating, or any such thing. This is all anecdotal -- it is a reflection of MY experience and it doesn't necessarily apply to anyone else (but it probably does).

     It does seem that the increases have been erratic and different between items. Russet potatoes went from $0.99/pound to $1.29/pound and have recently come back down (hurrah) to $1.19/pound. The cost of most oils went up quite a bit. I don't restock cooking oils that often so my memory is more likely to be fuzzy but I believe they went up more than 50% and possibly as much as 75%. One might say -- who cares about oil -- but I think that the increase in the cost of cooking oils has influenced processed food prices quite a lot as processing foods often requires cooking oil.

     On to processed foods, including the ultraprocessed ("junk") foods. We try to keep those limited, but they certainly are still a part of our diet. In one way, the increased prices of those are "good", as they give an incentive to not eat them or, at least, to reduce the amount they are in our diets. One famous brand of stackable potato chips went from $1.69 pre-pandemic to $2.65 post-pandemic (a 57% increase). A carton of 12 cans of soda went from $5.25 to $9.25 (a 76% increase). One of my sons, who goes grocery shopping with me, is currently attempting to wean himself off of canned sodas -- in part due to the prices.
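
     For the record, the arithmetic behind those percentages (prices as quoted above):

def percent_increase(old_price, new_price):
    return (new_price - old_price) / old_price

print(f"chips: {percent_increase(1.69, 2.65):.0%}")   # 57%
print(f"soda:  {percent_increase(5.25, 9.25):.0%}")   # 76%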

     I do not have the background knowledge on the reasons for the various price increases. I can read media articles, but I haven't read anything that goes back to source material. So, I can only go by the reports of "supply chain" failures and the lack of people for various manual tasks in food production. Certainly, a small amount of it has been due to reducing the underpayment of the people doing that manual work in order to entice them to come back (before it was safe to do so, in some cases). But the source of the majority of the increases is a mystery to me. Still, let's say they were real and caused the "laws" of supply and demand to move those prices up.

     Besides that ten cents per pound decrease in the price of potatoes, I cannot think of many items that have gone back down. Can you? Whatever causes there were to increase prices are likely gone, correct? There is at least one case in Europe right now in which a food producer is being confronted to justify its continued price increases. At least the prices seem to have stopped their rampant growth, though I noticed, yesterday, an additional ten cents per pound for ground beef (about a 2% increase -- insignificant, though still raising questions).

     One method that grocery stores and food producers use to avoid reductions in prices is SALES. Rotate the sale prices among the various items. Some sale prices are as much as 60% off. A local grocery store often has a 'buy two, get two "free"' sale on cartons of soda. Lately, it has been 'buy two, get three "free"'. Snack chips have 40%-off sales and 'buy one, get one "free"'. These heavy discounts do two major things. They give the illusion that prices have gone back down (without officially causing "deflation"), and they keep demand moving -- demand which would normally decrease as prices increase.
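
     The arithmetic behind those sales is worth doing at the shelf. A minimal sketch, using the soda carton price from above:

def effective_unit_price(shelf_price, paid_units, free_units):
    # Price per carton under a "buy N, get M free" promotion.
    return (shelf_price * paid_units) / (paid_units + free_units)

carton = 9.25   # post-pandemic shelf price from above
print(f"buy 2 get 2: ${effective_unit_price(carton, 2, 2):.2f} per carton")   # $4.63
print(f"buy 2 get 3: ${effective_unit_price(carton, 2, 3):.2f} per carton")   # $3.70

     The "free" cartons quietly bring the effective price back near (or below) the old $5.25 -- without the shelf price ever officially coming down.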

     Another thing that food producers can do to influence perception is to change the amount per item or the general size. Prices I have a reasonable chance of remembering and comparing (recognizing that memory is one of the most precarious things). But do you remember how many ounces of potato chips were in that bag three years ago? I sure don't. I am pretty sure the sizes of the sack containers have decreased, though that alone does not prove the amount inside the sacks has been reduced.
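
     The defense against shrinking packages is the same arithmetic: compare price per ounce, not price per bag. A sketch with hypothetical bag sizes:

def price_per_ounce(price, ounces):
    return price / ounces

# Hypothetical: the same $2.65 bag, quietly shrunk from 10.5 oz to 9 oz.
print(f"${price_per_ounce(2.65, 10.5):.3f}/oz")   # $0.252/oz
print(f"${price_per_ounce(2.65, 9.0):.3f}/oz")    # $0.294/oz -- a hidden 17% increase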

     In our economic, and political, situation in the US, there isn't much that can be done about increased prices. We live with them and hope that prices aren't raised again for several years -- until general inflation has moved up enough to justify current prices. In the meantime, bulk purchases, sales discounts, and bundling are our best bets to stretch the food budget.


Wednesday, February 7, 2024

Gifted: The double-edged sword

 

     Most school districts in the United States have something called a "gifted" program. My wife (a lifelong teacher) was the head of the gifted program for one school district in California once upon a time. I suspect that other countries have programs that are similar but I am ignorant of them. (I would be happy to hear about them.)

     In the case of the school districts, "gifted" meant the ability to excel on various academic tests and in school performance. The word "gifted" can certainly be used for other attributes -- and should be (perhaps more often than it is). Gifted as an athlete. Gifted as a musician whose instrument is played as if it were a part of their body. Gifted as a generous, giving person. Gifted in external beauty. Name any attribute that is celebrated by a society, or some subset of society, and there will be people who are considered "gifted" within that group.

     The advantage in being identified as gifted (in whatever area you are being classified) is to be able to get, or qualify for, additional incentives and training within your focal strength. In the case of academic giftedness, it means classes that challenge a person more and prepare them for more advanced courses more quickly. In the case of athletic giftedness, it may mean additional coaching, trophies, scholarships, and being placed higher on recruiters' lists.

     Most high schools (in the US) have yearbooks. There is often a section called "Most likely to..." whose entries apply to the various "gifted" categories as well as to ones considered humorous (or insulting). When the yearbook is passed around to classmates for signatures and notes, any additional mention often focuses on the area of giftedness -- acknowledging what continues to be acknowledged.

     There is nothing wrong about appreciating, and encouraging, strong "gifted" foci. But it can be a burden to be acknowledged ONLY for that "gifted" quality. Everyone has gifts. Everyone has areas in which they most need to improve. Sometimes a person will be gifted in more than one area, but there is a hierarchy of recognition of gifts. A person considered beautiful according to societal norms may ALSO be very intelligent, very caring, and very empathetic. They may struggle throughout their life to have those non-praised aspects acknowledged. "Blonde jokes" are not only hurtful but may also be self-fulfilling.

     This focus on the area of "gifted" qualities can become a burden if it goes out of balance. In fact, it is possible to push a person to the "burnout" stage if perfection becomes the goal and assumption. A talented child may lose all interest in sports after having been pushed to never fail. In academics, there can never be a mistake -- one missed question is a catastrophe (and has been known to even lead to suicide). And, if that focus doesn't have any balance, what happens when a talented young athlete -- who has focused on their sport all of their life -- has a severe compound fracture which cannot be set correctly?

     If a particular "gifted" quality is acknowledged and praised, then the person with that characteristic may find it difficult to be treated as a whole person. "Brainy" children may be ostracized by the other children and have difficulty learning to socialize due to lack of opportunity. While a person judged beautiful by societal norms may also have other, internal attributes that are more useful for a full, active life, the societal pressures to accentuate that "gift" may push them away from becoming better integrated into society. Likewise, the high school football star may be fantastic in business classes and practical application and have a great ability to do well academically -- but the pressure to perform within their gift can make it harder for them to achieve, or be acknowledged for, balance in their lives.

     These "shadow aspects" of the "gifted" have a strong need to be acknowledged and nurtured in order to have a good, balanced life -- especially for those whose "gift" may decrease over the years (for example, physical prowess or external beauty).

Friday, February 2, 2024

Imposter Syndrome: Preemptive sabotage

 

     I did a blog on Imposter Syndrome last year. But that blog was more about what it is and how you can work with it and overcome it. There are other aspects of that feeling. One, in particular, is what I would call "preemptive sabotage". If you aren't comfortable feeling like you are suited for a role, then why not demonstrate that lack of ability? You aren't an imposter if you really aren't able to do it and shouldn't do it, right?

     This aspect is closely related to "Fear of Success". The rationales are slightly different but the methods of achievement are quite parallel.

     Not up to it physically? Oversleep those important meetings. When I was growing up, I didn't enjoy being home very much and school was my escape (I know that, for most people, it works the other way). Somehow, I was never able to convince the schools or the teachers to open up on holidays and weekends but I did succeed in having my major childhood illnesses during school holidays quite a bit more than statistically likely. I didn't want to be home, didn't want to be physically available to do things (other than reading and watching cartoons), and so I was sick. Mumps? Measles? Chicken Pox? Stomach Flus? Usually, they occurred during winter/Christmas break so that I had time to get sick and get better before school resumed though I certainly succeeded in phasing out of spring breaks also. Single day holidays were usually safe as there was a danger that I wouldn't be well enough to return to school if I got sick.

     Of course, there are also those self-inflicted aspects which help one not to show up. One can get into all kinds of reasons behind them but stage fright is certainly one way to do it. You've prepared for the possibilities, memorized your lines, perhaps memorized all of the lines such that you could play any role and, ready to step onto the stage, you can't make that first step. You've walked up and down that stairway hundreds of times but, on the way to a presentation, you miss a step, pull a tendon, and limp into that meeting 20 minutes late.

     Just completed a complex assignment? Worked extra hard on it? Had to skip meals and your twentieth anniversary? Ah, just tell people around you -- especially any manager or supervisor -- "it was nothing". And they'll believe you. That doesn't mean you need to go overboard the other direction and puff yourself up until you reach true blowhard state. But, if you don't claim credit for your work there will likely be someone else who will step up and claim it for themself.

     Don't allow enough time for potential problems with traffic and you WILL sometimes miss those important meetings. In some cultures, it is worse to be early than to be late -- but you can always sit in the car, take an extra trip to the restroom, get a drink of water, or otherwise use any time that turns out to be unneeded.

     In many activities, meditative practice can be useful. Sit, imagine you are walking up to the plate and hitting the ball. Be in that meeting room showing those slides and not finding the ones you want. Practice recovering from mistakes and problems as much as you prepare for presenting it correctly. There may be a point at which you are tired of rehearsing but it is unlikely you shall ever reach a point where you have over-rehearsed.

     There are many ways to prepare. Failing to do such is also preparation -- preparation to not succeed. Follow through on what you have as your goal and dream.

Friday, January 26, 2024

Mission Impossible: concurrent multitasking for individuals

 

    There is always a temptation to try to work on more than one thing at a time. Back in the long ago, it might take a half hour or more for a compilation of a program to complete. During that time, one did something else -- possibly even something useful (isn't playing games useful?). But that was not truly multitasking. I was either working on the program, submitting it, playing games, or testing the program after it completed compilation. Compiling was a background task with which there was no active effort.

     It is certainly possible to work on more than one thing at a time if you have a group of people. One plus one does not quite equal two -- but it certainly allows more to be done. Reducing the need to interact helps efficiency (but not necessarily quality). As is seen in "The Mythical Man-Month", projects can easily reach a size and complexity such that adding additional people is counter-productive. Teams can still add productivity if each task is delineated sufficiently. Consider a building crew for a house: two people working together on framing, one person doing insulation, one person doing materials preparation. When tasks can be cleanly split, productivity from multitasking reaches its best -- but each person (or "processor") is working as an individual unit.

     As implied, a computer can make use of multiple processors -- or cores -- to allow simultaneous task performance. Four cores, six cores, eight cores (usually in multiples of two -- does anyone know why?). Once again, the scheduling of the software being run must be coordinated across the cores.
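
     As a toy illustration, here is a minimal Python sketch of cleanly split tasks running simultaneously, one per core. The frame_wall function and its inputs are made-up placeholders -- a stand-in for any independent chunk of work, not anything from a real system:

# A minimal sketch of true simultaneous work across cores, using
# Python's standard concurrent.futures. The tasks are independent,
# so no coordination is needed beyond collecting the results.
from concurrent.futures import ProcessPoolExecutor

def frame_wall(section):
    # a cleanly split, independent chunk of "work" (made up)
    return sum(i * i for i in range(section * 100_000))

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:   # one worker per core by default
        results = list(pool.map(frame_wall, [1, 2, 3, 4]))
    print(results)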

     This blog, however, is primarily aimed at multitasking for individuals (or, more properly, the inability of a person to multitask). It may be easier to explain this by going outside of ourselves and using a single-processor system as an example. A processor is moving along, performing the actions required by a particular program. Then, for whatever reason (operating systems, timers, and scheduling algorithms are not current topics), another program needs to be run. The processor (actually part of yet another program called the operating system -- or done explicitly by each program) needs to "write down" the current "context". This context is an image of the situation at the moment of moving to the other program. What is the next line of code to be executed, what are all the current (temporary and permanent) results from the program -- all these need to be written down so that, when processing resumes on the original program, it can continue as if nothing had ever interrupted it.
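
     To make the "write it down" step concrete, here is a toy sketch of what a saved context might contain. The fields are illustrative stand-ins of my own; a real operating system saves registers, stack pointers, memory-management state, and much more:

from dataclasses import dataclass, field

@dataclass
class Context:
    # "what line of code runs next"
    next_instruction: int = 0
    # in-flight temporary and permanent results
    registers: dict = field(default_factory=dict)

saved = {}  # one saved context per suspended task

def switch(current_task, ctx, next_task):
    saved[current_task] = ctx                # "write down" the situation
    return saved.pop(next_task, Context())   # resume where it left off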

     This process -- context switching -- takes a certain fixed amount of time. So, if you are running two programs in "multitasking" (true but not concurrent) fashion, then there will be time in program one, time to store context, time in program two, store context, restore program one's context, time in program one, ... The more often that programs need to swap with each other, the greater the percentage overhead -- which implies that the more programs being swapped between, the less active time available per task and the greater the percentage overhead. For example, dividing up 150 secs of activity among the tasks present:

2 tasks present: 75 secs task 1, 25 secs save/swap, 75 secs task 2, 25 secs save/swap -- 50/200 = 25% overhead

10 tasks present: 15 secs task 1, 25 secs save/swap, 15 secs task 2, 25 secs save/swap, 15 secs task 3, 25 secs save/swap, ... -- 250/400 = 62.5% overhead (25 secs of overhead per 40-sec cycle)

These numbers are greatly simplified (real swap times would be measured in micro- or nanoseconds, for example) but the principle holds -- the more tasks, the greater the overhead. Note that storage and retrieval of context requires space in addition to time. Too many tasks and too few resources, and you have a system unable to do useful work (what operating-systems people call "thrashing").
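
Here is the same arithmetic as a small Python sketch; the 150-second workload and 25-second swap cost are the illustrative numbers from above, not real measurements:

def overhead_fraction(total_work_secs, num_tasks, swap_cost_secs):
    # total wall-clock time = the work itself plus one save/swap
    # after each task's time slice
    total_time = total_work_secs + num_tasks * swap_cost_secs
    return (num_tasks * swap_cost_secs) / total_time

for n in (2, 10):
    print(n, "tasks:", f"{overhead_fraction(150, n, 25):.1%} overhead")
# prints: 2 tasks: 25.0% overhead, then 10 tasks: 62.5% overhead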

     Although there appear to be some similarities between computers and the way humans process information (after all, we did design them) -- we are not the same. We probably process information much differently than a computer does. But the effects can be the same.

     Note that humans are able to walk and chew gum. We can listen to music while writing a letter. This is because different activities use different parts of the brain. In this manner, we have the equivalent of multiple processors -- however, these are not separate general processors -- they are very task-specific processors.

     We can only do one thing of a given category at a time. We can have laundry washing in the background, or a loaf of bread in the oven, but those are not tasks in which we are currently active (once we reach the point of taking the laundry out, we are active again). When we change tasks, we need to keep track of "just what we were doing" at the point in time we changed, in order to resume later. The more tasks we switch between, the less time we have for each task because of the overhead.

     How do we save context when changing tasks? Our ability follows a statistical curve (maybe not quite a standard bell curve, but still ...). When we are young and we get distracted, we may never get back to the original task (which might be the point of the distraction). As we get older, we learn to store context in medium-term memory (maybe jot a short note in addition) and get back to the original. We get "better" at handling more and more tasks -- but we are still decreasing efficiency with each additional task. At some point, we lose context. We cannot remember well enough to resume a task or a set of tasks. We can start recording context more fully on paper, but then we have to file and retrieve that piece of paper.
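
     A toy model of that losing-of-context might look like the following sketch (the capacity of three "notes" is, of course, made up):

from collections import deque

memory = deque(maxlen=3)   # we can only hold so many "jotted notes"
for task in ["letter", "laundry", "bread", "phone call"]:
    memory.append(f"was doing: {task}")
print(list(memory))
# the "letter" context has silently fallen out -- that task
# may never be resumed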

     From my 66-year-old point of view, that is a problem that gets worse with age. Short- and medium-term memory get more clogged with past events/contexts/swaps, and we become less efficient at storing and retrieving those contexts. "Why did I go into the kitchen? Where did I put my keys when my son called asking what was for dinner?" At first it is irritating and then, with active understanding, it becomes humorous. I have learned that not everything MUST be remembered -- and that has reduced stress. But it will probably get worse.

     But I am in good company: the ability to task-swap is a matter of degree, and the challenges exist for all.

Wednesday, January 17, 2024

Microbiomes: The middle road to understanding


     "Western" medicine is excellent on the mechanics of the body -- replacing a hip, changing out a lens for the eye, insert a replacement cochlea, add an artificial replacement limb, and so forth. When it comes to working with the biochemistry of the body, it is more hit-and-miss and many discoveries have been serendipitous findings rather than the tail end of a focused search. Some aspects of interactions have been investigated and understood but, still, more from the point of view of mechanics -- receptacle points, chemical reactions, and enzymal subsystems.

     "Eastern" medicine has a longer history of treating the body, mind, and functioning as a holistic system -- recognizing that the mind (whatever that is), soul (whatever that is), and body all interact to make us live, react, and process life the way that we do. There is a tradition of building up longtime knowledge of the effects of various herbs, foods, and other substances with the way the body works. The manner in which such substances interact with the body are subsumed into spiritual, and traditional, teachings which often use words without specific "western" definitions.

     All approaches, and knowledge, can be of benefit. Divisions between areas of knowledge are only useful for classification. "Western" medicine is becoming more interested in energy aspects, such as chi. "Eastern" medicine is becoming more accepting of engineering approaches to treating bodily ailments. Perhaps, at some point, no divisions or classifications will be needed at all.

     One area of development which sits more in the "middle" is research into the microbiomes of the body. As a recent "Gates Notes" blog indicated, a large percentage of our bodies consists of cells not directly part of the body. Most of these are beneficial and many are symbiotic. His blog emphasizes uses of pro- and prebiotics to help the microbiome in its tasks and provide a better symbiosis with the body. There are also books on microbiomes, a major tome of which is "I Contain Multitudes" by Ed Yong.

     Gates' blog emphasizes the role of the microbiome as applied to treatment of malnutrition. My own older blog on microbiomes (perhaps of interest to read, from May 25, 2015) covers some of the various important microbiomes -- including the one on the skin, which can provide a first line of defense against intruders into the body -- as well as the role the microbiomes may play in systemic diseases such as diabetes and asthma.

     I point out that a large problem with the transformation of our microbiomes is that manipulation can be very difficult to achieve. The upper digestive tract -- which should be the most direct route to the environment of the lower digestive tract, home to a major microbiome -- destroys most of any pre- or probiotics that try to enter via ingestion. Substitution from the other direction (not necessarily for the squeamish) works much better but has the problem of being more of a "mallet" approach: it is not a manipulation but, rather, an attempt to conquer the old microbiome with a newer, hopefully healthier, one.

     One side-effect of being aware of the existence of the microbiomes on, and inside, our bodies is that it has become much more obvious that it is wrong to say "kill all the bacteria" or "stop all the viruses". In fact, although antibiotics have been literal lifesavers in preventing major harm from bad bacteria, antibiotics may also contribute to more general illnesses (diabetes, arthritis, cancer, etc.) as a result of killing off helpful bacteria.

     Note that there will never be ONE general population of microbiomes, because each environment supports different situations. Someone who has rice as their principal food will support different creatures in their microbiome than someone who has wheat bread as their dominant food (and different again from those who have cola soft drinks as THEIR dominant calorie intake). A person living in near-constant heat will have a different skin microbiome than someone who lives (or, with climate change, lived) in sub-freezing temperatures year-round.

     One of our challenges in this investigation is inventories. Just what viruses and bacteria are present in, and on, our bodies? What are the side-effects of their living there? What helps the good ones (which are typically dominant)? What hinders the specific bad ones (the methods we have do not discriminate well between beneficial inhabitants and harmful ones)? We are aware that traditional uses of antibiotics are losing their usefulness as bacteria adapt to resist the medicines. Knowledge of our microbiomes will have to become deep enough to let us tailor medicines against specific creatures rather than against "all" creatures.

     After inventories -- knowing what is there, and what it does -- we need methods of manipulation. Pre-, pro-, and regular biotics might be encapsulated such that they only become vulnerable after the digestive process has completed its task of breaking down food into components. "Good" cultures could be maintained outside of the body and used to supplement existing microbiomes.

     Whatever emphasis is taken, our bodies ARE indeed houses for many. Some religious scriptures refer to our bodies as temples and occupational spaces for the gods. We need good neighbors within our bodies as well as in the outside world between people. Finding methods of understanding these interior worlds of interacting cells may be as difficult as understanding the outer world, but both can be of great benefit and worth the effort.

Friday, January 12, 2024

First Do No Harm: ethics of morality, amorality, and immorality

 

     "First Do No Harm" (FDNH) is an adage that applies to the medical profession as well as variants of which as used by a high-tech company as their stated guideline. While this seems quite straight-forward, it rarely is. However, it is an ethics question that should be "top of mind" with every analysis of action, decision, or behavior.

     In the case of medicine, most treatments have possible, even likely, side-effects. It is always a judgement call as to whether the medicine, or treatment, is likely to do more good than harm. Both the effects of the treatment and the potential side-effects sit on sliding scales. Effects range from very unlikely to help to almost guaranteed to help. Potential side-effects range from unnoticeable to greatly affecting the quality of life. The side-effects have an additional sliding scale, ranging from extremely unlikely to occur to almost guaranteed to happen.

     Even if these sliding scales can be adequately defined, it is impossible to put any situation at a specific point on all of the scales. Effects, side-effects, and likelihoods are going to vary from person to person. Additionally, the importance of those effects, side-effects, and likelihoods is going to vary as well. Because this is true, the FDNH evaluation often should be delegated to the patient rather than remaining solely with the doctor. For cases where (in the weighted opinion of the doctor) the positive effects are likely to be small and the potential negative side-effects have a large chance of being significant and quite harmful, it makes no sense for the patient not to have the last word.
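
     As a deliberately oversimplified sketch of such a weighing -- all numbers invented, and the point above is precisely that real cases resist this kind of reduction:

# chance the treatment helps, and how much (arbitrary units)
p_help, benefit = 0.60, 8
# chance of the bad side-effect, and how costly it is
p_harm, harm = 0.10, -20

expected_value = p_help * benefit + p_harm * harm
print(expected_value)   # 2.8 -- "net positive", yet a patient may
                        # weight the potential harm very differently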

     Moral behavior for an individual or business must include doing the best that one can -- with the least negative and the most positive. Actions should also be equivalent for all people, groups, and circumstances. For instance, if a salary freeze is in effect for a company, then it should apply to all -- from the CEO or President to the maintenance staff. I remember one year, in a large research company, when a salary freeze was put into effect -- but executive staff was given a 25% increase. Immoral, and highly destructive to company morale and productivity.

     Evaluation of the morality of actions and behavior always falls into a subjective arena. There will always be trade-offs, and a variety of possible actions that affect different people in various ways. Sometimes companies will allow the employees to choose between possibilities. One company presented the options of compensating for a lowered income stream by laying off a large number of people or by having all employees take a pay cut. I am proud to relay that, when explicit choices are given, the decision which supports the most people is most often taken.

     What about immoral, or "bad", behavior and actions? Usually immoral behaviors end up counter-productive and do not survive -- as long as such behavior is visible and subjected to external response and criticism. It does not make business sense to do "bad" things deliberately if the result is negative to the bottom line. In addition, doing "bad" things deliberately reduces corporate competitiveness. This is one of the major reasons why corporations are typically supportive of inclusive policies. Inclusive policies increase the customer base, increase the intellectual and physical pool of employees, and (not quite universally) add to the corporate image. Non-inclusive, or discriminatory, policies do the opposite.

     One multinational megacorporation formalizes the trade-offs between positive effects and negative effects. When they set up their spreadsheets for production and marketing, there are spots, in the rows and columns, for potential deaths and for how much such deaths (or significant health effects) will cost in terms of legal costs, settlements, and PR hits that might affect the bottom line. The company does not particularly WANT bad effects, but they are very tolerant of living with them if profits are maintained -- they follow the definition of an amoral corporation. They are often at the target end of boycotts, but the percentage market hit from those participating in the boycotts is also weighed into the spreadsheet. Alas, not surprisingly, the spreadsheets often indicate that they can make more profit by allowing avoidable side-effects. And the boycotts have only moral value; they do not significantly affect profits.

     Moral -- trying to do the most good and the least harm. Amoral -- trying to personally benefit the most while taking into account the repercussions of negative behavior; the negative behavior can be evaluated as excessive by outside forces. Immoral -- doing the most harm in spite of negative consequences.

     While these aspects of ethics can certainly be viewed in other manners, these are one set of usable criteria.



Wednesday, January 3, 2024

On a Pedestal: Setting them up to fail

 

     It is good to thank people and to appreciate what they do. It is also good to admire people based upon their past words and actions. But taking that admiration and using it to require a person to achieve, and maintain, perfection is very bad for everyone.

     Everyone has their own definition of perfection. The first definition of "perfect" in some dictionaries is "complete". Something that is perfect is complete. While I can certainly see that definition as possible, it is my opinion that most people define perfect as being without a flaw. Perhaps it is possible to have a flawless diamond, or a flawless speech but, once again, we are back to individual definitions of "flaw". Maybe that diamond is flawless according to the evaluation lists of a gemologist, but some person expects to be able to see a "Pink Panther" within the gem and, otherwise, it is not perfect.

     When people set a goal for themselves of perfection -- and do not give themselves credit for what they achieve short of that goal -- then it is an endless cycle of labors. Fear of mistakes and imperfection can cause additional hurdles to overcome even when doing what would normally be considered an easy job.

     I remember one time in Boulder, Colorado, when I was driving behind a car going about 10 mph under the speed limit (which is actually illegal in most states). We reached a corner where the separated right-turn lane had a "Yield" warning sign. But there was almost no traffic -- certainly nothing to yield for. As I examined the situation, noticing the lack of traffic, I bumped into the car ahead of me, which had stopped at the "Yield" sign. There were no injuries to either person or automobile, as I was moving at about 3 or 4 mph. But it certainly shocked me as well as the driver ahead of me. He came out of the car, frantic to see what damage had been done (luckily, none). It turned out that he had borrowed the car from his girlfriend and was driving "extra carefully" to make sure there were no problems. That "extra careful" driving meant he was exceeding safety margins and creating a more dangerous situation rather than a safer one.

     People are human. They make mistakes. They are meant to make mistakes from which they can learn. In the technical arena, learning within ML or AI is a process of recognizing mistakes and refining knowledge until the mistakes become fewer and fewer.
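
     A minimal sketch of that recognize-and-refine loop, fitting y = w * x by shrinking the error a little on each pass (the data, learning rate, and step count are all made up for illustration):

# (x, y) observations, roughly following y = 2x
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]

w = 0.0      # initial (wrong) guess
lr = 0.05    # how strongly each mistake corrects the guess
for step in range(100):
    # the "mistake": gradient of the mean squared error
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad                 # refine the knowledge
print(round(w, 2))                 # ends near 2.04 -- fewer mistakes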

     As in the movie "White Christmas", putting someone on a pedestal (whether they are a "white knight" or not) is something that often happens -- especially in the early stages of a relationship. And that can easily end the relationship even earlier than otherwise (perhaps, otherwise, it might not have ended at all). Some day, perhaps, they will invent a Super Glue that allows humans to stay on top of those pedestals.

     But don't hold your breath for it.
