Wednesday, February 28, 2024

Frankenstein's Monster: AI's shadow

 

     Alan Turing, while at the University of Manchester in 1950, published a paper called "Computing Machinery and Intelligence". Its main gist was that if the output of a computer could not be distinguished from that of a human being, then the computer must be considered intelligent. This has long been a goal of those working on AI, and it is the primary basis for the general acclamation that we "have achieved AI". Various AI applications can present artwork, prose, dialogs, and other such things that cannot easily be distinguished from those created by a human. In a "blind test" (where the people judging have no prior knowledge of which output was produced by a computer), the output would pass.

     No one is saying that AI has reached the goal of achieving human thought. AI is trained (as is true of humans) on lots of input -- lots of data from lots of sources -- and is iteratively told what is right or wrong, "learning" how to do things, respond to things, and so forth. This leads us to two of the current problems with AI: intellectual property conflicts and a lack of discrimination.

     Intellectual property conflicts are the most straightforward. In order to train, lots and lots of information must be fed in. While some of that information comes from various public sources, some comes from private, proprietary sources, and some was created by human minds and may be part of their livelihood as well as their career growth and reputation. That "brushstroke" done by a generative AI may be copied from an artist's work. That "narrative work of imagination" may be borrowing from the insights of dozens of writers who, justifiably, do not want their copyrighted material used without their permission. AI-generated code may (and almost certainly will) make use of code that was created as freeware as well as company proprietary code (which probably was not even authorized to be accessible via the Internet).

     Currently, an AI information-gathering product can present bad information. As an AI model is being taught, it is given access to a lot of information. Hopefully, most of this information will be factual and truthful. But some of it will be misinformation (false information passed along unwittingly), disinformation (deliberate lies), or declared works of fiction. Some of these can be eliminated by human monitors as the model is trained, but this just leaves accuracy in the hands of those humans.

     Humans do not have a good track record of doing the research needed to determine the truthfulness of information -- they cannot be relied upon to do such for an AI model. It is conceivable that an AI model might be programmed with algorithms that allow it to compare information, the reliability of sources, and logical inconsistencies. Such a model would even have an advantage, since it does not (currently) have emotions. But AI models are NOT at that point now -- and everything will depend on the accuracy of any such algorithms.
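     As a rough illustration of what such an algorithm might look like, here is a minimal sketch in Python. Everything in it -- the sources, the reliability weights, the scoring scheme -- is hypothetical, a toy version of the idea rather than how any real system works:

# Toy sketch: score a claim by the reliability of the sources asserting it.
# The sources and weights below are invented for illustration only.
SOURCE_RELIABILITY = {
    "peer_reviewed_journal": 0.9,   # hypothetical 0.0-1.0 ratings
    "newspaper_of_record": 0.7,
    "anonymous_forum_post": 0.2,
}

def claim_score(assertions):
    """assertions: list of (source, agrees_with_claim) pairs.
    Returns weighted agreement between -1 (contradicted) and +1 (supported)."""
    total = sum(SOURCE_RELIABILITY[src] for src, _ in assertions)
    signed = sum(SOURCE_RELIABILITY[src] * (1 if agrees else -1)
                 for src, agrees in assertions)
    return signed / total if total else 0.0

# A claim backed by a journal but disputed on a forum still scores well:
print(claim_score([("peer_reviewed_journal", True),
                   ("anonymous_forum_post", False)]))   # about 0.64

The hard part, of course, is not the arithmetic but assigning those reliability weights honestly in the first place.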

     Isaac Asimov, considered one of the best of the "classic" science fiction authors, used as part of his robot-focused novels an idea called "The Three Laws of Robotics". I'll let people investigate the evolution of the laws, direct quoting of the laws, and the interpretations as discussed within the various stories produced. Let's just say these "laws" are meant to be a "leash" to prevent robots (or AI) from hurting people. And, though the laws work pretty well in the stories, there are almost always loopholes in any "law" and the three laws are no exception.

     But current AI has no such thing as the three laws -- and no practical way in sight to create such laws and to enforce that they will be part of any, and all, AI products. As long as AI is monitored, and final decisions are made by humans, the moral (and legal) responsibility for any problems will be in the hands of humans. If AI is unmonitored and autonomous, it is unclear where the responsibility would lie. There are presently court cases struggling to delineate such matters in the relatively straightforward arena of self-driving cars.

     Computers, and programs, are NOT "smart". Compared to most humans, computers can be considered rather poor in intellect. They can only do very, very simple things. Arithmetic, comparisons, actions based on values, and such are the very limited set of actions they can perform. Computer programs are eventually broken down into these very simple "machine" instructions. BUT, computers can do these simple instructions very, very fast (and they get faster every year). This gives the illusion of great ability on the part of the computer.
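     To make that concrete, here is a small sketch in Python of one innocuous-looking operation reduced to those primitive steps. The "machine step" comments are illustrative, not any real instruction set:

def add_if_positive(total, price):
    # CMP price, 0 -- a comparison, then an action based on the value
    if price > 0:
        # ADD total, price -- simple arithmetic
        total = total + price
    # STORE/return the result
    return total

print(add_if_positive(10, 5))   # 15
print(add_if_positive(10, -3))  # 10 (nothing added)

Everything a program does, however clever it looks, bottoms out in steps like these -- just billions of them per second.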

     Humans make mistakes. Computers (or their hardware and software) can make mistakes. But computers can make many more mistakes, much faster, than humans because their basic strength is speed. Plus, the more humans rely upon computers (and their hardware/software) the greater the difficulty of correcting errors because many people erroneously think that computers "cannot make mistakes". Even worse, humans are often not allowed the ability to override the decisions that come out of computer programs.

     This is particularly relevant in many areas of AI. People do a poor job of facial recognition. Computers can do a better job but not a perfect job. If drones are autonomously given a locate (and, perhaps, violent action) instruction, then they will sometimes pick the wrong person. The situation is, of course, even worse if computers get to make the decisions about all-out war (such as in "WarGames", "Terminator", or many other apocalyptic films). Politicians are quite at ease with "collateral damage", but if you, or someone you love, were part of that collateral damage then you would probably not feel as calm about it. If facial recognition is done as part of security, then "doppelgangers" and others who look similar to those on no-fly lists (or other lists) will forever be fighting for their right to exist.

     Computers, their programs, and AI can be of great benefit to humans. But they should always be subordinate to humans because humans are the ones that suffer from any mistakes made. Recognition of the fallibility of computers is very important and self-regulation within AI programs is essential to reduce the escalation of errors.

Thursday, February 22, 2024

The Shark Must Move: Investment for the Future


     A shark breathes by having the water flow past its gills so that it can extract oxygen. If it stops moving, fresh oxygenated water does not continue to flow past its gills and it dies. Unless outside factors are supporting them, companies, groups, and individuals must keep moving or they will not prosper. Perhaps they will linger but they will not thrive. 

     Once a company goes public, there are strong expectations. They come from analysts, from stockholders, from a board of directors. Very few of them have the benefit of a long-range view. Tell them that you'll have something amazing in three or four years and they may say "that's nice, what do you have for me now?" Thus, the pressure is steady to accomplish within the short term -- a quarterly viewpoint.

     Within a family, the range of returns stretches out a bit. As long as there is a known goal being approached, and some recognizable progress towards that goal, all is smooth on the home front.

    We have three aspects of preparing for the future. First is to understand the goal we wish to approach. Note that a goal must include the definition of a journey but it does not always include a requirement to reach the destination. Perhaps it is an iterative goal and process. Second, we need to commit to doing what is needed to move towards that goal. Third, we need planned activity and movement to go towards that goal. Definition, commitment, movement.

     There is a dream. It may not be reached but it is a direction. My company died because we had no direction of growth beyond providing the best current product and support that we could. That is NOT a bad "mission statement" but it doesn't keep the shark moving. As our market shrank so did the company.

     Doing what you currently do is great for the day to day -- until it isn't. Sometimes a company, or individual, will succeed in being able to just continue to provide small improvements and, if the market remains strong and competition weak, that can keep the shark moving enough to stay alive. But, if the market changes or the competition becomes stronger and better funded, it is not enough and you can only look forward to dwindling results.

     Assume you have decided upon that goal. First living person on Mars perhaps? Are you willing to commit? Will you allocate enough resources to realistically move towards the goal? Resources may include time, money, labor, thought, negotiations, etc. A company may succeed in deciding upon a goal but be so restricted in outlook towards the next quarter's profit and production that there is no true commitment toward any long-term goal. 

     Sometimes people find themselves devoting all their time and energy to staying alive. Just like providing a great product and support, staying alive is devotion to the present and it is NOT a bad thing. But it won't move you towards your goal. "Success stories" are often about people who commit more than they thought they had in order to move towards a goal. Night classes? Writing that great novel in the hour before sunrise each day? Finding a job that allows study during the time one is watching over an incoming queue?

     Commitment is recognition that the future cannot be ignored.

     Having decided upon the direction (recognizing that you might not achieve the final goal) and committing to do what is needed to move towards that goal, you now have to move. Further education? Active networking? Setting aside income for savings? Reducing quarterly dividends to allow for more research and development? Possibly (but not as a first step) your commitment and belief is strong enough that you decide to hasten the accumulation of money by taking out a loan. Many businesses have begun with an extra mortgage on a house (or of the founder's parents' house).

     And if you achieve that goal? Start the process over again. Keep the shark moving.

Wednesday, February 14, 2024

Buy One Get One: bulk orders and changes in size

 

     I don't know how in the world the US government gets its inflation numbers, but our grocery bills have gone up about 35% since the pre-pandemic period. No changes in menu, number of people eating, or any such thing. This is all anecdotal -- it is a reflection of MY experience and it doesn't necessarily apply to anyone else (but it probably does).

     It does seem that the increases have been erratic and different between items. Russet potatoes went from $0.99/pound to $1.29/pound and have recently come back down (hurrah) to $1.19/pound. The cost of most oils went up quite a bit. I don't restock cooking oils that often so my memory is more likely to be fuzzy but I believe they went up more than 50% and possibly as much as 75%. One might say -- who cares about oil -- but I think that the increase in the cost of cooking oils has influenced processed food prices quite a lot as processing foods often requires cooking oil.

     On to processed foods, including the ultraprocessed ("junk") foods. We try to keep those limited but they certainly are still a part of our diet. In one way, the increased prices of those are "good" as they give an incentive to not eat them or, at least, to reduce their place in our diets. One famous brand of stackable potato chips went from $1.69 pre-pandemic to $2.65 post-pandemic (a 57% increase). A carton of 12 cans of soda went from $5.25 to $9.25 (a 76% increase). One of my sons, who goes grocery shopping with me, is currently attempting to wean himself off of canned sodas -- in part due to the prices.
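     For anyone who wants to check such numbers, the arithmetic is just (new - old) / old. A quick sketch in Python, using the prices above:

def percent_increase(old, new):
    """Percentage increase from an old price to a new one."""
    return (new - old) / old * 100

print(round(percent_increase(1.69, 2.65)))  # 57 -- stackable chips
print(round(percent_increase(5.25, 9.25)))  # 76 -- 12-can soda carton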

     I do not have the background knowledge on the reasons for the various price increases. I can read media articles but I haven't read anything that goes back to source material. So, I can only go according to the reports of "supply chain" failures and a lack of people for various manual tasks in food production. Certainly, a small amount of it has been due to reducing the underpayment of the people doing those manual tasks in order to entice them to come back to work (before it was safe to do so, in some cases). But the source of the majority of the increases is a mystery to me. Still, let's say the causes were real and moved those prices up according to the "laws" of supply and demand.

     Besides that ten-cents-per-pound decrease in the price of potatoes, I cannot think of many items that have gone back down. Can you? Whatever causes there were to increase prices are likely gone, correct? There is at least one case in Europe right now in which a food producer is being pressed to justify its continued price increases. At least the prices seem to have stopped their rampant growth, though I noticed, yesterday, an additional ten cents per pound for ground beef (about a 2% increase -- insignificant though still raising questions).

     One method that grocery stores and food producers use to avoid reductions in prices is SALES. Rotate the sale prices among the various items. Some sale prices are as much as 60% off. A local grocery store often has a 'buy two, get two "free"' sale on cartons of soda. Lately, it has been 'buy two, get three "free"'. Snack chips have 40% sales and 'buy one, get one "free"'. These heavy discounts do two major things. They give the illusion that prices have gone back down (without officially causing "deflation") and they keep demand moving when it would otherwise decrease as prices increase.
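     The real unit price under those deals is easy to compute. A small sketch in Python, using the soda-carton shelf price from above:

def effective_unit_price(shelf_price, paid_units, free_units):
    """Per-unit price when some units are paid for and others are "free"."""
    return shelf_price * paid_units / (paid_units + free_units)

# At $9.25 per carton:
print(effective_unit_price(9.25, 2, 2))  # 4.625 -- buy two, get two free
print(effective_unit_price(9.25, 2, 3))  # 3.70  -- buy two, get three free

Note that both sale prices land below the pre-pandemic $5.25 per carton -- which is exactly the illusion.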

     Another thing that food producers can do to influence perception is to change the amount per item or the general size. Prices I have a reasonable chance of remembering and comparing (recognizing that memory is one of the most precarious things). But do you remember how many ounces of potato chips were in that bag three years ago? I sure don't. I am pretty sure the sizes of the sacks have decreased, but that does not necessarily mean the amount inside the sacks has been reduced.

     In our economic, and political, situation in the US there isn't much that can be done about increased prices. We live with them and hope that prices aren't raised again for several years -- when general inflation has moved up enough to justify current prices. In the meantime, bulk purchases and sales discounts and bundling are our best bets to stretch the food budget.


Wednesday, February 7, 2024

Gifted: The double-edged sword

 

     Most school districts in the United States have something called a "gifted" program. My wife (a lifelong teacher) was the head of the gifted program for one school district in California once upon a time. I suspect that other countries have programs that are similar but I am ignorant of them. (I would be happy to hear about them.)

     In the case of the school districts, "gifted" meant the ability to excel on various academic tests and within school performance. The word "gifted" can certainly be used for other attributes -- and should be (perhaps more often than it is). Gifted as an athlete. Gifted as a musician whose instrument is played as if it was a part of their body. Gifted as a generous, giving, person. Gifted in external beauty. Name any attribute that is celebrated by a society, or some subset of society, and there will be people who will be considered as "gifted" within that group.

     The advantage of being identified as gifted (in whatever area you are being classified) is to be able to get, or qualify for getting, additional incentives and training within your focal strength. In the case of academic giftedness, it means classes that challenge a person more and prepare them for more advanced courses more quickly. In the case of athletic giftedness, it may mean additional coaching, trophies, scholarships, and being higher on recruiters' examination rolls.

      Most high schools (in the US) have yearbooks. There is often a "Most likely to" section with entries that apply to the various "gifted" categories as well as ones considered humorous (or insulting). When the yearbook is passed around to classmates for signatures and notes, any additional mention often focuses on the area of giftedness -- acknowledging what continues to be acknowledged.

     There is nothing wrong about appreciating, and encouraging, strong "gifted" foci. But it can be a burden to be ONLY acknowledged for that "gifted" quality. Everyone has gifts. Everyone has areas in which they most need to improve. Sometimes a person will be gifted in more than one area, but there is a hierarchy of recognition of gifts. A person considered beautiful according to societal norms may ALSO be very intelligent, very caring, and very empathetic. They may struggle throughout their life to have acknowledgement of those non-praised aspects. "Blonde jokes" are not only hurtful but may also be self-fulfilling.

     This focus on the area of "gifted" qualities can become a burden if it goes out of balance. In fact, it is possible to push a person to the "burnout" stage if perfection becomes the goal and assumption. A talented child may lose all interest in sports after having been pushed to never fail. In academics, there can never be a mistake -- one missed question is a catastrophe (and has been known to even lead to suicide). And, if that focus doesn't have any balance, what happens when a talented young athlete -- who has focused on their sport all of their life -- has a severe compound fracture which cannot be set correctly?

     If a particular "gifted" quality is acknowledged and praised, then the person with that characteristic may find it difficult to be treated as a whole person. "Brainy" children may be ostracized by the other children and have difficulty learning to socialize due to lack of opportunity. While a person judged beautiful by societal norms may also have other internal attributes that are more useful to a full, active life -- the societal pressures to accentuate that "gift" may push them away from becoming better integrated into society. Likewise, the high school football star may be fantastic in business classes and practical application and also have a great ability to do well academically -- but the pressure to perform within their gift can make it harder for them to achieve, or be acknowledged for, balance in their lives.

     These "shadow aspects" of the "gifted" have a strong need to be acknowledged and nurtured in order to have a good, balanced life -- especially for those whose "gift" may decrease over the years (for example, physical prowess or external beauty).

Friday, February 2, 2024

Imposter Syndrome: Preemptive sabotage

 

     I did a blog on Imposter Syndrome last year. But that blog was more about what it is and how you can work with it and overcome it. There are other aspects of that feeling. One, in particular, is what I would call "preemptive sabotage". If you aren't comfortable feeling like you are suited for a role, then why not demonstrate that lack of ability? You aren't an imposter if you really aren't able to do it and shouldn't do it, right?

     This aspect is closely related to "Fear of Success". The rationales are slightly different but the methods of achievement are quite parallel.

     Not up to it physically? Oversleep those important meetings. When I was growing up, I didn't enjoy being home very much and school was my escape (I know that, for most people, it works the other way). Somehow, I was never able to convince the schools or the teachers to open up on holidays and weekends, but I did succeed in having my major childhood illnesses during school holidays quite a bit more often than statistically likely. I didn't want to be home, didn't want to be physically available to do things (other than reading and watching cartoons), and so I was sick. Mumps? Measles? Chicken pox? Stomach flus? Usually, they occurred during winter/Christmas break so that I had time to get sick and get better before school resumed, though I certainly managed to get sick over spring breaks as well. Single-day holidays were usually safe, as there was a danger that I wouldn't be well enough to return to school if I got sick.

     Of course, there are also those self-inflicted aspects which help one not to show up. One can get into all kinds of reasons behind them but stage fright is certainly one way to do it. You've prepared for the possibilities, memorized your lines, perhaps memorized all of the lines such that you could play any role and, ready to step onto the stage, you can't make that first step. You've walked up and down that stairway hundreds of times but, on the way to a presentation, you miss a step, pull a tendon, and limp into that meeting 20 minutes late.

     Just completed a complex assignment? Worked extra hard on it? Had to skip meals and your twentieth anniversary? Ah, just tell people around you -- especially any manager or supervisor -- "it was nothing". And they'll believe you. That doesn't mean you need to go overboard the other direction and puff yourself up until you reach true blowhard state. But, if you don't claim credit for your work there will likely be someone else who will step up and claim it for themself.

     Don't allow enough time for potential problems with traffic and you WILL sometimes miss those important meetings. In some cultures, it is worse to be early than to be late -- but you can always sit in the car, take an extra trip to the restroom, get a drink of water, or otherwise use any time that turns out to be unneeded.

     In many activities, meditative practice can be useful. Sit, imagine you are walking up to the plate and hitting the ball. Be in that meeting room showing those slides and not finding the ones you want. Practice recovering from mistakes and problems as much as you prepare for presenting it correctly. There may be a point at which you are tired of rehearsing but it is unlikely you shall ever reach a point where you have over-rehearsed.

     There are many ways to prepare. Failing to do such is also preparation -- preparation to not succeed. Follow through on what you have as your goal and dream.

Friday, January 26, 2024

Mission Impossible: concurrent multitasking for individuals

 

    There is always a temptation to try to work on more than one thing at a time. Back in the long ago, it might take a half hour or more for a compilation of a program to complete. During that time, one did something else -- possibly even something useful (isn't playing games useful?). But that was not truly multitasking. I was either working on the program, submitting it, playing games, or testing the program after it completed compilation. Compiling was a background task with which there was no active effort.

     It is certainly possible to work on more than one thing at a time if you have a group of people. One plus one does not quite equal two -- but it certainly allows more to be done. Reducing the need to interact helps the efficiency (but not necessarily the quality). As is seen in "The Mythical Man-Month", projects can easily reach a size and complexity such that adding additional people is counter-productive. Teams can still add productivity if each task is delineated sufficiently. Consider a building crew for a house. Two people working together on framing, one person doing insulation, one person doing materials preparation. When tasks can be cleanly split, productivity from multitasking reaches its best -- but each person (or "processor") is working as an individual unit.

     As implied, a computer can make use of multiple processors -- or cores -- to allow simultaneous task performance. Four core, six core, eight core (usually in multiples of two -- does anyone know why?). Once again, the scheduling of software being performed must be coordinated between the cores.

     This blog, however, is primarily aimed at multitasking for individuals (or, more properly, the inability of a person to multitask). It may be easier to explain this by going outside of ourselves and using a single-processor system as an example. A processor is moving along, performing the actions required by a particular program. Then, for whatever reason (operating systems, timers, and scheduling algorithms are not current topics), another program needs to be performed. The processor (actually part of yet another program called the operating system -- or done explicitly by each program) needs to "write down" the current "context". This context is an image of the situation at the moment of moving to the other program. What is the next line of code to be executed, and what are all the current (temporary and permanent) results from the program -- all these need to be written down so that, when processing resumes on the original program, it can continue as if nothing had ever interrupted it.

     This process -- context switching -- takes a certain fixed amount of time. So, if you are running two programs in "multitasking" (true but not concurrent), then there will be time in program one, time to store context, time in program two, store context, restore program one's context, time in program one, and so on. The more often that programs need to swap with each other, the greater the percentage overhead -- which implies that the more programs being swapped between, the smaller the active time available per task and the greater the percentage overhead. For example, dividing up 150 secs of activity between tasks:

2 tasks present; 75 secs task 1, 25 secs save/swap, 75 secs task 2, 25 secs save/swap: 25/100 = 25% overhead

10 tasks present; 15 secs task 1, 25 secs save/swap, 15 secs task 2, 25 secs save/swap, ..., 15 secs task 10, 25 secs save/swap: 25/40 = 62.5% overhead

These numbers are greatly simplified (real systems measure context switches in microseconds or nanoseconds, for example) but the principle holds -- the more tasks, the greater the overhead. Note that storage and retrieval of context requires space in addition to time. Too many tasks and too few resources, and you have a system unable to do useful work.
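     Here is a minimal sketch in Python that reproduces those overhead figures for any number of tasks (using the simplified numbers from above, not real measurements):

def overhead(total_work_secs, num_tasks, swap_secs):
    """Fraction of time lost to context switching when total_work_secs of
    useful work is split evenly among num_tasks, with swap_secs spent
    saving/restoring context after each task's turn."""
    slice_secs = total_work_secs / num_tasks   # useful time per turn
    return swap_secs / (slice_secs + swap_secs)

print(overhead(150, 2, 25))   # 0.25  -> 25% overhead
print(overhead(150, 10, 25))  # 0.625 -> 62.5% overhead

As the task slices shrink, the fixed swap cost dominates -- which is the whole point.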

     Although there appear to be some similarities between computers and the way humans process information (after all, we did design them) -- we are not the same. We probably process information much differently than a computer does. But the effects can be the same.

     Note that humans are able to walk and chew gum. We can listen to music while writing a letter. This is because different activities use different parts of the brain. In this manner, we have the equivalent of multiple processors -- however, these are not separate general processors -- they are very task-specific processors.

     We can only do one similar category of thing at a time. We can have laundry washing in the background, or a loaf of bread in the oven but those are not tasks in which we are currently active (once we reach the point of taking the laundry out, we are now active again). When we change tasks, we need to keep track of "just what we were doing" at the point of time we changed tasks in order to resume a task. The more tasks we switch between, the less time we have to do each task because of overhead.

     How do we save the context when changing tasks? The process is a statistical curve (maybe not quite a standard bell curve but still ...). When we are young and we get distracted, we may never get back to the original task (which might be the point of the distraction). As we get older, we learn to store context in medium-term memory (maybe jot a short note in addition) and get back to the original. We get "better" at doing more and more tasks -- but we are still decreasing efficiency with each additional task. At some point, we lose context. We cannot remember well enough to resume a task or a set of tasks. We can start recording the context more fully on paper but then we have to file and retrieve that piece of paper.

     From my 66-year-old point-of-view, that is a problem that gets worse with age. Short and medium-term memory gets more clogged with past events/contexts/swaps and we are less efficient at storing and retrieving those contexts. "Why did I go into the kitchen? Where did I put my keys when my son called asking what was for dinner?" At first it is irritating and then, with active understanding, it becomes humorous. I have learned that not everything MUST be remembered -- and that has reduced stress. But it will probably get worse.

     But I am in good company: the ability to task swap is a matter of degree, and the challenges exist for everyone.

Wednesday, January 17, 2024

Microbiomes: The middle road to understanding


     "Western" medicine is excellent on the mechanics of the body -- replacing a hip, changing out a lens for the eye, insert a replacement cochlea, add an artificial replacement limb, and so forth. When it comes to working with the biochemistry of the body, it is more hit-and-miss and many discoveries have been serendipitous findings rather than the tail end of a focused search. Some aspects of interactions have been investigated and understood but, still, more from the point of view of mechanics -- receptacle points, chemical reactions, and enzymal subsystems.

     "Eastern" medicine has a longer history of treating the body, mind, and functioning as a holistic system -- recognizing that the mind (whatever that is), soul (whatever that is), and body all interact to make us live, react, and process life the way that we do. There is a tradition of building up longtime knowledge of the effects of various herbs, foods, and other substances with the way the body works. The manner in which such substances interact with the body are subsumed into spiritual, and traditional, teachings which often use words without specific "western" definitions.

     All approaches, and knowledge, can be of benefit. Divisions between areas of knowledge are only useful for classification. "Western" medicine is becoming more interested in energy aspects, such as chi. "Eastern" medicine is becoming more accepting of the engineering aspects of treating bodily ailments. Perhaps at some point, no divisions or classifications will be needed at all.

     One area of development which is more in the "middle" is research into the microbiomes of the body. As a recent "Gates Notes" blog indicated, a large percentage of the cells in our bodies are actually not directly part of the body. Most of these are beneficial and many are symbiotic. His blog emphasizes the use of pro- and prebiotics to help the microbiome in its tasks and provide a better symbiosis with the body. There are also books on microbiomes, a major tome of which is "I Contain Multitudes" by Ed Yong.

     Gates' blog emphasizes the role of the microbiome as applied to the treatment of malnutrition. My own older blog on microbiomes (perhaps of interest to read, from May 25, 2015) covers some of the various important microbiomes -- including the one on the skin, which can provide a first line of defense against intruders into the body, as well as the effect the microbiomes may have on systemic diseases such as diabetes and asthma.

     I point out that a large problem with the transformation of our microbiomes is that manipulation can be very difficult to achieve. The upper digestive tract -- which should be the most direct route to the environment of the lower digestive tract, home to a major microbiome -- destroys most of any pre- or probiotics that try to enter via ingestion. Substitution from the other direction (not for the squeamish) works much better but has the problem of being more of a "mallet" approach: not a manipulation but, rather, an attempt to conquer the old microbiome with a newer, hopefully healthier, one.

     One side-effect of being aware of the existence of the microbiomes on, and inside, our bodies is that it has become much more obvious that it is wrong to say "kill all the bacteria" or "stop all the viruses". In fact, although antibiotics have been literal lifesavers in preventing major harm from bad bacteria, antibiotics may also contribute to more general illnesses (diabetes, arthritis, cancer, etc.) as a result of killing off helpful bacteria.

     Note that there will never be ONE general population of microbiomes because each environment supports different situations. Someone who has rice as their principal food will support different creatures in their microbiome than someone who has wheat bread as their dominant food (and even different for those who have cola soft drinks as THEIR dominant calorie intake). A person living in near constant heat will have a different skin microbiome than someone who lives (or, with climate change, lived) in sub-freezing temperatures year-round.

     One of our challenges in this investigation is inventories. Just what viruses and bacteria are present in, and on, our bodies? What are the side-effects of their living? What helps the good ones (which are typically dominant)? What hinders the specific bad ones (the methods we have do not discriminate well between beneficial inhabitants and harmful ones)? We are aware that traditional uses of antibiotics are losing their usefulness as bacteria adapt to resist the medicines. Knowledge of our microbiomes will require us to understand them well enough to tailor medicines against specific creatures rather than "all" creatures.

     After inventories -- knowing what, and what they do -- we need methods of manipulation. Pre-, pro-, and regular biotics might be encapsulated such that they only become vulnerable after the digestive process has completed its task of breaking down food into components. "Good" cultures could be maintained outside of the body and used to supplement existing microbiomes.

     Whatever emphasis is taken, our bodies ARE indeed houses for many. Some religious scriptures may refer to our bodies as temples and occupational spaces for the gods. We need good neighbors within our bodies as well as in the outside world between people. Finding methods of understanding these interior worlds of interacting cells may be as difficult as understanding the outer world, but both can be of great benefit and worth the effort.
