Allan Cytryn’s discussion with Professor Neil Gershenfeld

Jun 10, 2019 | AI World Society Summit

Allan Cytryn, a member of the AI World Society Standards and Practice Committee, discussed the following questions with Professor Neil Gershenfeld:

  1. The historical barrier between computation and production has been a de facto constraint, or control, on what AI might do. But once AI can link up with manufacturing, that control is eliminated. What are the implications and what actions should be taken?
  2. In discussing the parallels between computation and genetics, the question arises, “What is life?” Will the linking of bits and atoms therefore allow AI to create life? And can we then interfere with the creation of life by AI, even if that life is unknown and unrecognizable to us?
  3. The discussion begins with a review of the scaling of computer power relative to brain power, but other researchers have said that computers do not possess cognitive skills because they lack the structures in the brain that produce cognitive behavior. Is this distinction material? If not, why not? If so, what are the implications?
  4. A key issue in AI is “transparency,” and there are legitimate efforts to pursue increased transparency. But many people now view this as potentially limited, since it assumes an anthropomorphic notion of intelligence and communication, which may not be relevant to non-human systems. If we can’t understand what the machines are doing, how do we know what they might be building and whether that is good or bad?
  5. If the convergence of bits and atoms dissolves all existing social structures, doesn’t that return mankind to a Hobbesian state? Consider: if the state provides the universal fab infrastructure, but that same infrastructure destroys the state, won’t there be a survival-of-the-fittest, Mad Max-style competition to lawlessly seize control of the means of production for power and advantage (if the state has dissolved, there is no law)?

Professor Gershenfeld’s full speech at AIWS Summit 2019 can be found here.