Tree of Life (ToL) Web Project

The Tree of Life Web Project is an ongoing Internet project providing information about the diversity and evolutionary relationships of life on Earth. This collaborative, peer-reviewed project began in 1995 and is written by biologists from around the world. The goal of the Tree of Life Web Project is to provide a page with pictures, text, and other information for every species and for each group of organisms, living or extinct. Connections between Tree of Life web pages follow the phylogenetic branching patterns between groups of organisms, so visitors can browse the hierarchy of life and learn about phylogeny and evolution as well as the characteristics of individual groups.


Sample web page of the Tree of Life Web Project

Links to websites with similar projects are provided in the following list :

N-gram databases & N-gram viewers

Last update : May 13, 2013

An N-gram is a contiguous sequence of n items collected from a given text or speech corpus. Depending on the application, an N-gram can be any combination of letters, phonemes, syllables, words or base pairs.

An N-gram of size 1 is referred to as a unigram, size 2 is a bigram, and size 3 is a trigram. Larger sizes are referred to by the value of N (four-gram, five-gram, …). N-gram models are widely used in statistical natural language processing. In speech recognition, phonemes and sequences of phonemes are modeled using an N-gram distribution.
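As a concrete illustration, extracting the N-grams of a token sequence takes only a few lines of Python (a generic sketch, independent of any particular corpus or toolkit):

```python
def ngrams(tokens, n):
    """Return all contiguous n-grams in a token sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

words = "to be or not to be".split()
print(ngrams(words, 1))  # unigrams: [('to',), ('be',), ...]
print(ngrams(words, 2))  # bigrams: [('to', 'be'), ('be', 'or'), ...]
print(ngrams(words, 3))  # trigrams
```

The same function works on any sequence, so passing a string instead of a word list yields character N-grams, and a sequence of base pairs yields genomic N-grams.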

“All Our N-gram are Belong to You” was the title of a post published in August 2006 by Alex Franz and Thorsten Brants on the Google Research Blog. Google believed that the entire research community should benefit from access to the massive amounts of data it had collected by scanning books and by analysing the web. The data was distributed by the Linguistic Data Consortium (LDC) of the University of Pennsylvania. Four years later (December 2010), Google unveiled the N-Gram Viewer, an online tool for analyzing the history of the data digitized as part of the Google Books project. The appeal of the N-gram Viewer was obvious not only to scholars (professional linguists, historians, and bibliophiles) in the digital humanities, linguistics, and lexicography; casual users also got pleasure out of generating graphs showing how key words and phrases have changed over the past few centuries.

Google Books N-gram Viewer, an addictive tool


Version 2 of the N-Gram Viewer was presented in October 2012 by engineering manager Jon Orwant. A detailed description of how to use the N-Gram Viewer is available at the Google Books website. The maximum string that can be analyzed is five words long (a five-gram). Mathematical operators allow you to add, subtract, multiply, and divide the counts of N-grams. Part-of-speech tags are available for advanced use, for example to distinguish between verb and noun uses of the same word. To make trends more apparent, data can be viewed as a moving average (0 = raw data without smoothing, 3 = default, 50 = maximum). The results are normalized by the number of books published in each year. The data can also be downloaded for further exploration.
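The moving average works by replacing each year's value with the mean of the raw values within a window of years on either side. A minimal Python sketch of this idea (the Viewer's exact handling of the edges of a series may differ):

```python
def smooth(series, s):
    """Moving average: each value becomes the mean of the raw values
    within s positions on either side (s = 0 leaves the data unchanged).
    Near the edges the window is simply truncated."""
    out = []
    for i in range(len(series)):
        window = series[max(0, i - s): i + s + 1]
        out.append(sum(window) / len(window))
    return out

raw = [0.1, 0.4, 0.2, 0.5, 0.3]   # yearly relative frequencies
print(smooth(raw, 0))  # raw data, no smoothing
print(smooth(raw, 1))  # smoothed with a 3-year window
```

Larger values of s flatten short-lived spikes and make long-term trends easier to see, at the cost of temporal precision.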

N-Gram data is also provided by other institutions. Some sources are indicated hereafter :

Links to further information about N-grams are provided in the following list :

EMPATHICA

Last update : August 6, 2013

EMPATHICA is a software program designed at the University of Waterloo, Canada, to help people understand and resolve conflicts. It is based on the hope that increasing empathy between people can help to overcome impasses in disputes in many domains.

EMPATHICA uses the idea of Cognitive-Affective Maps (CAMs) developed by Paul Thagard, Professor of Philosophy and director of Cognitive Sciences at the University of Waterloo, in collaboration with Thomas Homer-Dixon, Scott Findlay, and others. These maps derive from ideas about emotional cognition described in Thagard’s book Hot Thought.

A CAM is a diagram that shows concepts and beliefs along with the emotional values attached to them. It also shows the relationships between concepts that support each other or conflict with each other. CAMs are made up of simple nodes and edges. Nodes can have differing valences based on how a person feels about a concept related to the conflict.
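As an illustration, a CAM can be sketched as a simple graph data structure. The concept names, valences and representation below are illustrative assumptions, not the actual EMPATHICA data model:

```python
# Sketch of a Cognitive-Affective Map: nodes carry an emotional valence
# (-1 = strongly negative, +1 = strongly positive) and edges mark
# support (+1) or conflict (-1) between concepts. The concepts shown
# are hypothetical examples, not taken from the EMPATHICA program.
cam = {
    "nodes": {"economy": 0.8, "carbon tax": -0.6, "climate": 0.9},
    "edges": [("carbon tax", "climate", +1),   # carbon tax supports climate
              ("carbon tax", "economy", -1)],  # carbon tax conflicts with economy
}

def conflicting_pairs(cam):
    """Return the concept pairs linked by a conflict edge."""
    return [(a, b) for a, b, sign in cam["edges"] if sign < 0]

print(conflicting_pairs(cam))  # [('carbon tax', 'economy')]
```

Comparing two such maps (one per party in a dispute) then amounts to finding the nodes and edges where the valences disagree.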

EMPATHICA : Cognitive Affective Maps modeling the international Climate Change debates


EMPATHICA provides the following web pages :

  • Conflict Management : open, close, view and edit conflicts
  • Conflict Overview : shows the CAMs associated with the conflict
  • Graph Editor : use manipulation tools to create and edit CAMs
  • Correlate : tie together concepts that are in both maps
  • Compare : shows the points of contention and the points of agreement in a conflict
  • Compromise : shows suggestions for the conflict

The program EMPATHICA was a Fourth-Year Design Project for the Software Engineering class of the University of Waterloo. Today I installed the Windows version of EMPATHICA (released in January 2013) on my PC. Great project !

Animal consciousness

Last update : August 6, 2013

Animal consciousness, or animal awareness, has been actively researched for over 100 years, but there has never been agreement among scientists on whether animal consciousness exists or not, mainly because of the problem of other minds. As the field of consciousness research evolved and new techniques and strategies for human and non-human animal research were developed, the question was answered last year.

Francis Crick, an English molecular biologist, biophysicist and neuroscientist, co-discovered the structure of the DNA molecule in 1953, together with James Watson. Francis Crick, James Watson and Maurice Wilkins were jointly awarded the 1962 Nobel Prize for Physiology or Medicine for their discoveries.

In 1994, Francis Crick published the book The Astonishing Hypothesis (The Scientific Search for the Soul) about consciousness. In February 2003, Francis Crick and Christof Koch published A framework for consciousness in the journal Nature Neuroscience. The same year, one year before his death, Francis Crick was one of 21 Nobel Laureates who signed the Humanist Manifesto, published by the American Humanist Association (AHA).

The first annual Francis Crick Memorial Conference took place in July 2012 at the University of Cambridge, UK. The upshot of the meeting was the Cambridge Declaration on Consciousness, signed by Christof Koch, David Edelman, Philip Low, Diana Reiss, Bruno van Swinderen and Jaak Panksepp.

Cartoon about animal consciousness drawn by Andrzej Krauze


The Cambridge Declaration concludes that “non-human animals have the neuroanatomical, neurochemical, and neurophysiological substrates of conscious states along with the capacity to exhibit intentional behaviors. Consequently, the weight of evidence indicates that humans are not unique in possessing the neurological substrates that generate consciousness. Non-human animals, including all mammals and birds, and many other creatures, including octopuses, also possess these neurological substrates.”

The related cartoon about animal consciousness, drawn by the Polish-born British cartoonist Andrzej Krauze, was published in the New Scientist article Animals are conscious and should be treated as such, written by Marc Bekoff.

The American Human Brain Activity Map Project

Last update : August 10, 2013


Cartoon by Jordan Adwan, The New Yorker, 2013

Several weeks after the public announcement of the Human Brain Project as a European FET Flagship research project by the European Commission, the US administration unveiled plans for a decade-long scientific effort to examine the workings of the human brain and build a comprehensive map of its activity, seeking to do for the brain what the Human Genome Project did for genetics. The project, called Brain Activity Map (BAM), will include federal agencies, private foundations and teams of neuroscientists and nanoscientists in a concerted effort to advance the knowledge of the brain’s billions of neurons and gain greater insights into perception, actions and, ultimately, consciousness. Moreover, the project holds the potential of paving the way for advances in artificial intelligence.

The Human Brain Activity Map initiative will be organized by the Office of Science and Technology Policy (OSTP). Partners will be the National Institutes of Health (NIH), the Defense Advanced Research Projects Agency (DARPA), the National Science Foundation (NSF), the Howard Hughes Medical Institute (HHMI) in Chevy Chase, the Allen Institute for Brain Science in Seattle, and other major actors such as Google and Microsoft.

Gary Marcus, a professor at New York University (N.Y.U.), recommends funding five separate projects rather than putting a huge amount of money into a single one. He proposes addressing the most fundamental unsolved questions in neuroscience :

  • Decipher the basic language of the brain : What is the basic element of neural computation ? What is the basic scheme by which symbolic information (like sentences) is stored ?
  • Understand the rules governing how neurons organize into circuits
  • Determine which circuits to use in a given situation and understand how the brain communicates information from one region to another (neural plasticity and neural development)
  • Find the relation between brain circuits, genes, and behavior
  • Develop new techniques for analyzing and observing brain function

The following list provides some links to additional information about the Human Brain Activity Map Project :

The periodic table of chemical elements

A chemical element is a pure chemical substance consisting of one type of atom, distinguished by its atomic number, which is the number of protons in its nucleus. Atoms are built of subatomic particles.

Isotopes are atoms of the same element (same number of protons), but having different numbers of neutrons. Most naturally occurring elements (66 of 94) have more than one stable isotope. For example, there are three main isotopes of carbon. All carbon atoms have 6 protons in the nucleus, but they can have either 6, 7, or 8 neutrons. Since the mass numbers of these are 12, 13 and 14 respectively, the three isotopes of carbon are known as carbon-12 (12C), carbon-13 (13C), and carbon-14 (14C). Carbon in everyday life and in chemistry is a mixture of 12C, 13C, and a very small fraction of 14C atoms. Its presence in organic materials is the basis of the radiocarbon dating method to date archaeological, geological, and hydrogeological samples.
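The radiocarbon dating mentioned above rests on simple exponential decay: the fraction of 14C remaining in a sample determines its age via N/N0 = (1/2)^(t / half-life). A short Python illustration (using the commonly cited 14C half-life of 5730 years):

```python
import math

HALF_LIFE_C14 = 5730.0  # half-life of carbon-14 in years

def radiocarbon_age(fraction_remaining):
    """Estimate a sample's age in years from the fraction of its
    original 14C still present, by inverting the exponential decay
    law N/N0 = (1/2)**(t / half-life)."""
    return HALF_LIFE_C14 * math.log(1.0 / fraction_remaining, 2)

# A sample retaining 50% of its 14C is one half-life old:
print(round(radiocarbon_age(0.5)))   # 5730
print(round(radiocarbon_age(0.25)))  # 11460
```

In practice laboratories also correct for calibration curves and contamination, so real radiocarbon ages are not obtained from this formula alone.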

The tabular display of the chemical elements, organized on the basis of their atomic numbers, electron configurations, and recurring chemical properties is called the periodic table. Although precursors exist, Dmitri Mendeleev is generally credited with the publication, in 1869, of the first widely recognized periodic table.

Periodic table of chemical elements (Wikipedia)

As of 2012, the periodic table contains 118 confirmed chemical elements. The latest, ununseptium, was identified in 2010. Of these 118 elements, 114 have been officially recognized and named by the International Union of Pure and Applied Chemistry (IUPAC). A total of 98 are known to occur naturally on earth. 80 of them are stable, while the others are radioactive, decaying into lighter elements. A detailed list of the 118 known chemical elements is available at Wikipedia.

The lightest of the chemical elements are hydrogen and helium, both created by Big Bang nucleosynthesis during the first 20 minutes of the universe. They are by far the most abundant chemical elements in the universe. However, iron is the most abundant element making up the earth, and oxygen is the most common element in the earth’s crust.

Although all known chemical matter is composed of chemical elements, chemical matter itself constitutes only about 15% of the matter in the universe. The remainder is dark matter, a mysterious substance which is not composed of chemical elements.

When two distinct elements are chemically combined, with the atoms held together by chemical bonds, the result is termed a chemical compound. Two-thirds of the chemical elements occur on earth only as compounds. Just six elements – carbon, hydrogen, nitrogen, oxygen, calcium, and phosphorus – make up almost 99% of the composition of a human body.

Pantheism and the Anthropic Principle

Pantheism symbols

Pantheism is the belief that everything composes an all-encompassing, immanent God, or that the Universe (or Nature) is identical with divinity. Pantheists thus do not believe in a personal god or an anthropomorphic god.

The Universal Pantheist Society and the World Pantheist Movement (WPM) are two organizations of people associated with pantheism.

The Copernican principle, named after Nicolaus Copernicus, states that the Earth is not the center of the universe. Copernicus was a Renaissance astronomer and the first person to formulate a comprehensive heliocentric cosmology. The Anthropic Principle was first raised by Brandon Carter in 1973 in reaction to the Copernican principle. Carter stated “Although our situation is not necessarily central, it is inevitably privileged to some extent”. The anthropic principle has given rise to some confusion and controversy, partly because the phrase has been applied to several distinct ideas.

The anthropic principle is related to the fundamental parameters, that is the dimensionless physical constants and the initial conditions for the Big Bang. Connections between physical constants that seem to be necessary for the existence of life in the universe are called the anthropic coincidences. Many examples of claimed anthropic coincidences can be found in the literature. The constants of nature seem to be extraordinarily fine-tuned for the production of life. Opponents to this theory argue that the universe is less fine-tuned than often claimed or that there is not one universe, but a whole infinite ensemble of universes with all possible fundamental parameters, the multiverse.

Particles, strings and M-Theory

In the physical sciences, a particle is a small localized object to which several physical properties, such as volume or mass, can be ascribed.

In particle physics, an elementary particle (or fundamental particle) is a particle not known to be made up of smaller particles. If an elementary particle truly has no substructure, then it is one of the basic building blocks of the universe from which all other particles are made.

The Standard Model of particle physics has 61 particles :

  • 2*3*3 (=18) quarks (fermions) with corresponding antiparticles (total 36)
  • 2*3 (=6) leptons (fermions) with corresponding antiparticles (total 12)
  • 1*8 gluons (bosons) without antiparticles (total 8)
  • 1 W boson with one corresponding antiparticle (total 2)
  • 1 Z boson without antiparticle (total 1)
  • 1 photon (boson) without antiparticle (total 1)
  • Higgs boson without antiparticle (total 1)
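The count of 61 can be verified by tallying the list above:

```python
# Tallying the particle content of the Standard Model as listed above.
counts = {
    "quarks + antiquarks": 2 * 3 * 3 * 2,   # 18 quarks + 18 antiquarks = 36
    "leptons + antileptons": 2 * 3 * 2,     # 6 leptons + 6 antileptons = 12
    "gluons": 8,                            # no distinct antiparticles
    "W bosons": 2,                          # W+ and W- (each other's antiparticle)
    "Z boson": 1,
    "photon": 1,
    "Higgs boson": 1,
}
total = sum(counts.values())
print(total)  # 61
```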

Standard model of particles (Wikipedia)

Quarks and leptons are fermions. According to the spin-statistics theorem, fermions respect the Pauli exclusion principle. Each fermion has a corresponding antiparticle.

Elementary fermions are matter particles, segmented as :

  • 6 Quarks : up, down, charm, strange, top, bottom
  • 3 Leptons with electrical charge : electron, muon, tau
  • 3 Leptons without electrical charge : electron neutrino,  muon neutrino,  tau neutrino

Pairs from each classification are grouped together to form a generation, with corresponding particles exhibiting similar physical behavior.

Elementary bosons are force-carrying particles, segmented as :

  • 8 gluons : carriers of the strong interaction
  • W and Z bosons : carriers of the weak interaction
  • 1 photon : carrier of the electromagnetic interaction

Gluons carry 8 combinations of color charge.

The elementary fermions and bosons are represented in the following scheme :

Standard Model of elementary particles (Wikipedia)

The Higgs particle is a massive scalar elementary particle with zero intrinsic spin.

Additional elementary particles may exist, such as the graviton, which would mediate gravitation. Such particles lie beyond the Standard Model.

Composite particles are hadrons, made of quarks, held together by the strong interaction (also called strong force). Hadrons are categorized into two families: baryons and mesons. Baryons are hadrons and fermions, mesons are hadrons and bosons.

Baryons are made of three valence quarks. The best-known baryons are the proton and the neutron, which make up most of the mass of the visible matter in the universe. Together with electrons, they form the atom, a basic unit of matter that consists of a dense central nucleus surrounded by a cloud of negatively charged electrons.

Each type of baryon has a corresponding antiparticle (antibaryon) in which quarks are replaced by their corresponding antiquarks. For example : just as a proton is made of two up-quarks and one down-quark, its corresponding antiparticle, the antiproton, is made of two up-antiquarks and one down-antiquark.

Mesons are hadronic subatomic particles composed of one quark and one antiquark, bound together by the strong interaction. Pions are the lightest mesons. A list of all mesons and a list of all particles are available at Wikipedia.

All particles of the Standard Model have been observed in nature, including the Higgs boson. Particles are described by quantum field theory (quantum mechanics). String theory is an active research framework in particle physics that attempts to reconcile quantum mechanics and general relativity. String theory posits that the elementary particles within an atom are not 0-dimensional objects, but rather 1-dimensional oscillating lines (strings). A key feature of string theory is the existence of D-branes. There are different flavors of string theory. The version that incorporates fermions and supersymmetry is called superstring theory.

An extension of superstring theory is M-theory, in which 11 dimensions are identified. According to Stephen Hawking in particular, M-theory is the only candidate for a complete theory of the universe, the theory of everything (TOE), a self-contained mathematical model that describes all fundamental forces and forms of matter.

Learning and the Hebbian theory

Learning is acquiring new, or modifying existing, knowledge, behaviors, skills, values, or preferences and may involve synthesizing different types of information. The ability to learn is possessed by humans, animals and some machines. Progress over time tends to follow learning curves.

Three domains of learning have been proposed by Benjamin Bloom, an American educational psychologist who made contributions to the classification of educational objectives and to the theory of mastery-learning :

  • Cognitive : mental skills (knowledge)
  • Affective : growth in feelings or emotional areas (attitude)
  • Psychomotor : manual or physical skills (skills)

There are also numerous types of learning. Wikipedia lists 17 different types with several subtypes. More types are proposed by other sources.

The adaptation of neurons in the brain during the learning process is explained by the Hebbian theory, a scientific theory in neurobiology. Introduced by Donald O. Hebb in 1949, it is also called Hebb’s rule, Hebb’s postulate, or cell assembly theory. Donald O. Hebb was a Canadian psychologist who sought to understand how the function of neurons contributed to psychological processes such as learning. He has been described as the father of neuropsychology and neural networks.
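Hebb's rule is often summarized as "cells that fire together wire together": the weight of a connection grows in proportion to the product of the activities of the two neurons it links (delta_w_i = eta * x_i * y). A minimal sketch, with an illustrative learning rate and toy activity values:

```python
def hebbian_update(weights, pre, post, eta=0.1):
    """One step of Hebb's rule: each weight grows by eta * x_i * y,
    the product of presynaptic and postsynaptic activity."""
    return [w + eta * x * post for w, x in zip(weights, pre)]

weights = [0.0, 0.0, 0.0]
pre = [1.0, 0.0, 1.0]   # activities of three input neurons
post = 1.0              # activity of the output neuron
for _ in range(3):      # three co-activations
    weights = hebbian_update(weights, pre, post)
print([round(w, 2) for w in weights])  # [0.3, 0.0, 0.3]
```

Only the connections from the co-active inputs are strengthened; the silent input's weight stays at zero, which is the essence of Hebbian learning.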

Collective intelligence of ants and swarms

Last update : August 6, 2013

Collective intelligence, also called group wisdom, is shared knowledge arrived at by individuals and groups. The wisdom of the crowd is the process of taking into account the collective opinion of a group of individuals rather than a single expert to answer a question. James Surowiecki published his book The Wisdom of Crowds in 2004, about the aggregation of information in groups, resulting in decisions that, he argues, are often better than could have been made by any single member of the group.

Group intelligence refers to a process by which large numbers of people simultaneously converge upon the same point(s) of knowledge.

Collective intelligence, which is sometimes used synonymously with collective wisdom, is more of a shared decision process than collective wisdom. Collective intelligence is a shared intelligence that emerges from the collaboration and competition of many individuals and appears in consensus decision making in animals, humans and computer networks. The term is related to the Global Brain.

If we look at ants, we can see that they exhibit many of the characteristics and behaviours that we associate with intelligence and civilization, for example :

  • ants build cities (ant hills) which contain complex ventilation systems, waste recycling and complex transportation systems including highways
  • ants farm and cultivate mushrooms
  • ants raise and keep other insects for food
  • ants wage wars in organized battalions
  • ants capture slaves
  • ants teach and communicate
  • ants collaborate and do teamwork

The study of the behavior of social insects like ants and bees is part of Swarm Intelligence (SI), a relatively new discipline that deals with the study of self-organizing processes both in nature and in artificial systems. The concept is employed in work on artificial intelligence. The expression was introduced by Gerardo Beni and Jing Wang in 1989, in the context of cellular robotic systems. Besides ant colonies, natural examples of SI include bird flocking, animal herding, bacterial growth and fish schooling. The application of swarm principles to robots is called swarm robotics; a special case is ant robotics. In computer science and operations research, the ant colony optimization algorithm (ACO) is used to find good paths through graphs.
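The core mechanism of ACO is a positive feedback loop: shorter paths accumulate pheromone faster, which attracts more ants, which deposit more pheromone. A deliberately simplified, deterministic mean-field sketch of this loop (real ACO algorithms use stochastic ants touring a graph, heuristic visibility and more refined deposit rules; the path names and constants here are illustrative):

```python
# Two paths from nest to food; pheromone evaporates each generation,
# and each generation of ants deposits pheromone in proportion to a
# path's current popularity and inversely to its length.
lengths = {"short": 1.0, "long": 2.0}
pheromone = {"short": 1.0, "long": 1.0}
EVAPORATION = 0.1   # fraction of pheromone lost per generation
N_ANTS = 10

for generation in range(100):
    total = sum(pheromone.values())
    share = {p: pheromone[p] / total for p in pheromone}  # choice probability
    for p in pheromone:
        deposit = N_ANTS * share[p] / lengths[p]   # shorter path, more pheromone
        pheromone[p] = (1 - EVAPORATION) * pheromone[p] + deposit

print(max(pheromone, key=pheromone.get))  # short
```

Because the deposit on the short path is twice that on the long one for the same traffic, the feedback loop amplifies the initial symmetry break and the colony converges on the shorter path.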


ANTS2012, September 2012, Brussels

A first workshop on Ant Colony Optimization, ANTS98 (“From ant colonies to artificial ants”), took place in October 1998 in Brussels. The eighth international conference, ANTS2012 (in the meantime renamed Swarm Intelligence), took place in September 2012 in Brussels.

In 2006, the Center for Collective Intelligence (CCI) was created at MIT to make collective intelligence a topic of serious academic study. O’Reilly Media published in 2007 the book Programming Collective Intelligence, written by Toby Segaran.