Blood Type Mutations
Around 40KYA, mutations likely occurred that created blood types A and B. What could have caused such mutations?
Laschamp event: Roughly 40,000 years ago, Earth's magnetic shield went down, exposing humanity to unusually high levels of cosmic radiation.
The Leap: The revolution that made the biggest difference occurred on the savanna of East Africa roughly 45,000 years ago, Klein and others maintain. “Communicating with symbols provides an unambiguous sign of our modernity,” says Klein, an eminent archaeologist who has taught at Stanford for nine years. “Once symbols appear, we know we’re dealing with people like us: people with advanced cognitive skills who could not only invent sophisticated tools and weapons and develop complex social networks for mutual security, but could also marvel at the intricacies of nature and their place in it; people who were self-aware.”
Klein suggests a third possibility—a strictly neurological scenario that has gained few followers in a field of study dominated by cultural explanations, he says. Humanity’s big bang, he speculates, was sparked not by an increase in brain size but by a sudden increase in brain quality. Klein thinks a fortuitous genetic mutation may have somehow reorganized the brain around 45,000 years ago, boosting the capacity to innovate. “It’s possible this change produced the modern ability for spoken language,” he says.
http://alumni.stanford.edu/get/page/magazine/article/?article_id=38306
The burst of modern behavior—like other momentous happenings in our evolution—arose not in South Africa, Klein says, but in East Africa, which was wetter during the drought. Around 45,000 years ago, he believes, a group of simple people in East Africa began to behave in new ways and rapidly expanded in population and range. With better weapons, they broadened their diet to include more challenging and nutritious prey. With their new aesthetic sense, they made the first clearly identifiable art. And they freed themselves to wander beyond the local watering hole—setting the stage for long-distance trade—with contrivances like canteens and delicately crafted eggshell beads, which may have functioned as “hostess gifts” to cement goodwill with other clans.
Dramatic evidence of a surge in ingenuity and adaptability comes from a wave of human migration around 40,000 to 35,000 years ago. Fully modern Africans made their way into Europe, Klein says, where they encountered the Neanderthals, cave dwellers who had lived in and around Europe for more than 200,000 years. The lanky Africans, usually called Cro-Magnons once they reached Europe, were more vulnerable to cold than the husky Neanderthals. Yet they came, saw and conquered in short order, and the Neanderthals vanished forever.
Compare that with an earlier migration around 100,000 years ago, in which the Neanderthals eventually prevailed. Physically—but not yet behaviorally—modern Africans took advantage of a long warm spell to expand northward into Neanderthal territory in the Middle East, only to scuttle south again when temperatures later plunged. The critical difference between the two migrations? The earlier settlers apparently lacked the modern ability to respond to change with new survival strategies, such as fitted garments, projectile weapons and well-heated huts.
Exactly what finished off the Neanderthals remains a mystery. They don’t seem to have blended into human groups, says Klein, who sees the two as separate species. Although humans may have mingled with Neanderthals on occasion, genetic or cultural swapping seems to have been rare. DNA studies show that “there are no Neanderthal genes in my body or yours,” he says.
By 30,000 years ago, everyone on earth was fully modern.
The world-shaking transformation of our species probably boils down to a tiny genetic glitch, Klein asserts. He developed his maverick notion as “the simplest, most parsimonious explanation for the available archaeological evidence,” he says. “I propose it only because it seems to be far more plausible and to explain more than the alternatives.”
Genes mutate all the time, Klein notes. Mutations can be useful, harmful or neutral in their effects. In large populations, even helpful mutations tend to get “swamped” by nonmutant genes and vanish over time. But Klein’s proposed mutation would have arisen in a small population, where its bearers could enjoy a survival advantage potent enough to maximize their offspring and spread the new trait like wildfire.
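Klein's population-size point can be made quantitative with textbook population genetics. Below is a minimal sketch of my own (nothing like it appears in the article) using Kimura's diffusion approximation for the chance that a single new mutant allele with selective advantage s eventually takes over a diploid population of size N; the values of s and N are arbitrary illustrations.

```python
# Minimal sketch (illustrative, not from the article): Kimura's diffusion
# approximation for the probability that one new mutant allele, with
# selective advantage s, eventually fixes in a diploid population of size N.
from math import exp

def p_fix(N, s):
    """Fixation probability for a single new copy (initial frequency 1/2N)."""
    if s == 0:
        return 1.0 / (2 * N)  # neutral allele: pure drift
    return (1 - exp(-2 * s)) / (1 - exp(-4 * N * s))

for N in (50, 500, 5000, 50000):
    print(f"N={N:>6}  weak (s=0.001): {p_fix(N, 0.001):.5f}  "
          f"strong (s=0.05): {p_fix(N, 0.05):.5f}")
```

For weakly advantageous mutations, the fixation chance in a very small population approaches the neutral 1/(2N), several times higher than the roughly 2s it gets in a large one; and even when the probabilities are comparable, the sweep through a small group completes far sooner. That is the sense in which a small founding population helps a new trait “spread like wildfire.”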
A mutation that improved the organization of the brain, Klein says, could give humans the ability to “conceive, create and communicate in symbols,” soon leading to speech. The symbolic brain would set the stage for future cultural revolutions, from the inception of agriculture about 10,000 years ago to the development of the World Wide Web in our lifetime.
Once our ancestors’ superior brains made them much more adaptable to change, cultural advancement rather than biological evolution became the predominant force shaping our future, Klein says. Before “the leap,” cultural advances paralleled major changes in anatomy. (For instance, crude stone tools first appeared some 2.5 million years ago along with a surge in brain size.) In the last 45,000 years, however, our bodies have changed little, while our culture has churned at an ever-increasing pace. Our flexible, enterprising brains enable us to adapt to almost any conditions and live almost anywhere, from Phoenix to Vladivostok, without changing physically.
Clearly, speech eases communication. But it also fosters something less obvious and equally important. Spoken language, Klein says, “allows people to conceive and model complex natural and social circumstances entirely within their minds.”
Most—though not all—anthropologists agree that human culture, imagination and ingenuity suddenly flowered around 45,000 years ago. The evidence ranges from fantastic cave paintings and elaborate graves to the first fishing equipment and sturdy huts. And whether scientists call it the great leap forward, the dawn of culture or civilization’s big bang, they agree that the change was momentous, giving humans the cohesion and adaptability to expand their range into Europe, Asia, and eventually Australia and the Americas. “In its wake,” Klein says, “humanity was transformed from a relatively rare and insignificant large mammal to something like a geologic force.”
Evolutionary geneticist Johannes Krause of the University of Tübingen in Germany, however, wasn't so sure that a mutation rate calibrated for living humans could be applied so far back in time. He and his colleagues decided to test the idea by sequencing the DNA from the maternally inherited mitochondria (mtDNA), or powerhouses of the cell, from fossils of modern humans who lived in the past 40,000 years and whose age was reliably known from calibrated radiocarbon dating methods. If the age of the fossil was 40,000 years, for example, it would be missing 40,000 years of evolution that took place in the lineage of a living person—and, therefore, missing mutations that would have arisen during the time since the fossil human died.
The team analyzed 10 well-dated fossils, including a medieval man who lived in France 700 years ago; the 4550-year-old Iceman; two 14,000-year-old skeletons from the tombs of Oberkassel in Germany; three related modern humans from 31,000 years ago at Dolní Věstonice in the Czech Republic; and an early modern human from 40,000 years ago at Tianyuan, China. When the researchers applied this ancient-DNA-derived mutation rate to the out-of-Africa migration, they got a range of dates from 62,000 to 95,000 years ago for the start of the migration, almost half the age calculated for the migration using the "de novo" mutation rate, the group reports online today in Current Biology. "The nice thing about this is it was similar to the archaeological evidence," Krause says.
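The calibration logic is simple enough to sketch in a few lines of Python. The mutation count below is made up purely for illustration (the published analysis fits many dated fossils jointly), but it shows the arithmetic: a securely dated fossil's mtDNA is missing the mutations that would have accumulated since the individual died, and that shortfall divided by the radiocarbon age gives a rate.

```python
# Hypothetical numbers for illustration only; the published analysis fits
# many dated fossils jointly rather than using one fossil at a time.

MTDNA_LENGTH = 16569  # length of the human mitochondrial genome, in bases

def calibrated_rate(missing_mutations, fossil_age_years):
    """Substitutions per site per year implied by one securely dated fossil."""
    return missing_mutations / (fossil_age_years * MTDNA_LENGTH)

def split_time(pairwise_differences, rate):
    """Years since two mtDNA lineages diverged, given a calibrated rate."""
    # Differences accumulate along both branches, hence the factor of 2.
    return pairwise_differences / (2 * rate * MTDNA_LENGTH)

# e.g. a 40,000-year-old fossil whose mtDNA sits 17 mutations "short" of
# its living relatives (a made-up count):
rate = calibrated_rate(17, 40_000)
print(f"calibrated rate: {rate:.2e} substitutions/site/year")

# Two lineages differing at 34 positions would then have split roughly:
print(f"split time: {split_time(34, rate):,.0f} years ago")
```

Converting observed divergences into dates with a rate calibrated this way is essentially how the 62,000-to-95,000-year window above was obtained.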
The team's method for checking the mutation rate is clever, says geneticist Aylwyn Scally of the Wellcome Trust Sanger Institute in Hinxton, U.K., co-author of one of the studies that calculated the slower mutation rate in living humans. "It's excellent that they have been able to get a better baseline for calibrating the mtDNA mutation rate by looking at ancient DNA."
However, Scally notes, mtDNA is a single genetic lineage and is not typical of the genome as a whole, partly because its mutation rate may be higher: it carries a higher proportion of genes under selection than the nuclear genome does. Krause and one of his collaborators, paleogeneticist Svante Pääbo of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, agree that future work will be needed to resolve the differences in mutation rates between the mtDNA and nuclear genomes. "It is possible that there are things we do not understand about mitochondrial inheritance and mutation patterns," Pääbo says.
Or the problem may be undercounting of nuclear mutations in living humans. When it comes to precisely counting about 50 new mutations among 3.2 billion bases in a newborn's genome, current sequencing methods are at the limit of their ability to distinguish true mutations from sequencing errors and may be discarding a statistically significant number of real mutations, Krause says. "The way forward is to truly master accurate sequencing of nuclear genomes," Pääbo says.
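To see why undercounting matters so much, here is a back-of-envelope sketch (mine, using the article's round numbers): if the pipeline silently drops a fraction of true de novo mutations, the measured rate comes out too low, and every date inferred from it stretches by exactly the corresponding factor.

```python
# Back-of-envelope: how an undercount of de novo mutations inflates dates.
# The 50-mutation and 3.2-billion-base figures are the ones quoted above;
# the undercount fractions are hypothetical.
observed_mutations = 50
genome_bases = 3.2e9

measured_rate = observed_mutations / genome_bases
print(f"measured rate: {measured_rate:.2e} per base per generation")

for dropped in (0.0, 0.1, 0.2, 0.3):
    true_rate = measured_rate / (1 - dropped)
    # time = divergence / rate, so a date inferred with the too-slow
    # measured rate shrinks by (1 - dropped) once the true rate is used
    print(f"undercount {dropped:.0%}: a 100,000-year estimate becomes "
          f"{100_000 * (1 - dropped):,.0f} years")
```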
And that matters, Krause says, because timing is critical in human evolution. Knowing when modern humans spread out of Africa and into Europe and Asia, for example, allowed Krause and his collaborators to show that the same modern humans were in Europe before and after the glaciers covered that continent—and had the ability to adapt to changing climates. They found that modern humans before and after the last major ice age in Europe share the same mtDNA lineage, making the later population direct descendants of the earlier one. "Out of Africa is one of the major events within human evolution," Krause says. "We need to know when it happened."
http://news.sciencemag.org/sciencenow/2013/03/clocking-the-human-exodus-out-of.html
Cosmic rays reveal event in Earth's magnetic field history
Nov 29, 2012
41,000 years ago, the Earth's magnetic field faded and practically disappeared, leaving our planet unprotected from the bombardment of cosmic rays. Evidence for this event has been found in ocean sediment cores by a team from the Centre de Recherche et d'Enseignement de Géosciences de l'Environnement (CEREGE, CNRS/Aix-Marseille Université/IRD/Collège de France). In the cores, the researchers measured variations in concentrations of beryllium-10, a radioactive isotope produced by the action of cosmic rays on oxygen and nitrogen atoms in the atmosphere. The work, published in the Journal of Geophysical Research, is an important step towards developing a new method for studying the history of Earth's magnetic field, which should shed light on why its strength has been declining over the past three thousand years.
The Earth's magnetic field forms an efficient shield that deflects charged particles of cosmic origin headed for Earth. Far from being constant, the magnetic field has undergone many reversals, with the north magnetic pole shifting to the south geographic pole. Such reversals are always accompanied by a disappearance of the magnetic field. The last such reversal took place 780,000 years ago. The magnetic field can also undergo excursions, periods when the field suddenly drops as if it were about to reverse, before recovering its normal polarity. The most recent event of this kind, known as the Laschamp excursion, took place 41,000 years ago.
Evidence for the event was uncovered by the researchers in sediment cores collected off the coasts of Portugal and Papua New Guinea. In the samples, they found an excess of beryllium-10, an isotope produced solely by collisions between particles of cosmic origin and atoms of nitrogen and oxygen. The beryllium-10 (10Be) produced in the atmosphere then falls to the Earth's surface where it is incorporated into ice and sediments. In sedimentary beds dating from the age of the Laschamp excursion, the researchers found up to twice as much 10Be as normal, evidence of the intense cosmic ray bombardment that the Earth underwent for several thousand years.
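One check worth making explicit (a side calculation of my own, not from the press release): 10Be is radioactive, so could decay since deposition have distorted the signal? With a half-life of about 1.39 million years, almost nothing is lost over 41,000 years, so the doubling reflects production, not preservation.

```python
# How much of the originally deposited 10Be survives in a 41,000-year-old
# sediment layer? (10Be half-life: ~1.387 million years.)
from math import log, exp

HALF_LIFE_YEARS = 1.387e6
decay_constant = log(2) / HALF_LIFE_YEARS

def surviving_fraction(age_years):
    """Fraction of deposited 10Be still present after age_years."""
    return exp(-decay_constant * age_years)

f = surviving_fraction(41_000)
print(f"fraction surviving after 41 kyr: {f:.4f}")        # about 0.98
print(f"decay-corrected size of a measured 2x peak: {2 / f:.3f}x")
```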
Traditionally, the presence of various iron oxides, especially magnetite, in volcanic lavas, sediments and ancient pottery provides information on the history of the magnetic field by indicating its direction and strength at the time when these materials solidified. This so-called paleomagnetic approach does not always allow global variations in the magnetic field to be quantified accurately. The researchers combined this method with the measurement of beryllium-10 concentrations in the same sedimentary records. This enabled them to demonstrate that peak concentrations of this isotope are synchronous and have the same dynamics and amplitude in Atlantic and Pacific sediments as in the previously analyzed Greenland ice cores. The method based on beryllium-10, which has been developed over the past 10 years at CEREGE, therefore makes it possible to obtain a continuous reconstruction of variations in the strength of the Earth's global magnetic field.
It is also known that over the past 3,000 years the magnetic field has lost 30% of its strength. This trend suggests that in the coming centuries, the Earth might undergo an excursion similar to the one that took place 41,000 years ago. Since high-energy cosmic rays can cause mutations and cell damage, such an event would have a significant impact on biodiversity, and in particular on humans. This is why the researchers are seeking to determine the precise rates of the magnetic field's reversal and excursion sequences, in order to identify potential regularities in its behavior and thus shed light on the cause of these phenomena, which originate in the Earth's core. This is the objective of the MAGORB project, launched in 2009, funded by the French National Research Agency ANR and run by CEREGE, the Institut de Physique du Globe de Paris (IPGP) and the Laboratoire des Sciences du Climat et de l'Environnement (LSCE, CNRS/CEA/UVSQ).
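For scale, the 30%-in-3,000-years figure works out as follows (a crude linear extrapolation of my own; whether the decline continues, let alone accelerates into an excursion, is exactly the question projects like MAGORB are meant to answer):

```python
# Crude linear extrapolation of the reported decline, for scale only.
remaining = 0.70              # fraction of field strength left today
rate_per_century = 0.30 / 30  # 30% lost over 30 centuries = 1% per century

centuries_to_zero = remaining / rate_per_century
print(f"average decline: {rate_per_century:.1%} per century")
print(f"at a constant rate, ~{centuries_to_zero:.0f} more centuries "
      f"(~{centuries_to_zero * 100:,.0f} years) to zero field")
```

On a straight-line trend the field would not vanish for roughly 7,000 years; the concern about the coming centuries presumably rests on the fact that excursions involve much faster collapses than this recent average.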
Physicist Richard Firestone proposes in his book "The Cycle of Cosmic Catastrophes" that:
Quote from: Firestone
Our early ancestors had only type O blood. Around 40KYA mutations likely occurred creating blood types A and B. Types A and B blood are from dominant genes, so they spread through the population and became more common.
DNA evidence suggests that B type blood probably originated in Central Asia or Africa, where the percentage is uniformly highest. Because the percentage is still very low in Australia and the Americas, it seems unlikely that it originated in either of those two places. Some geneticists conclude that type B is the youngest blood type, appearing no earlier than 45KYA and no later than 15KYA, and if so, this distribution seems inconsistent with early Americans originating in Asia and traveling across the Bering land bridge. If they had done so, there would be a lot more type B in the Americas.
For type A blood, the picture is more complicated, with apparent origins in Europe, Canada, and Australia. Again, there is little evidence that type A spread from Asia to the Americas. Instead, paradoxically, it appears to have arrived in the Americas from Europe long before Columbus did.
Is it possible that the Indians came from Europe? That idea seems far-fetched according to traditional views, and yet, according to Dennis Stanford of the Smithsonian and Bruce Bradley, there is intriguing evidence connecting Clovis flint-knapping technology to the Solutrean flint technology in Spain at the end of the Ice Age. In addition, Clovis points are very unlike flint points from Asia, their supposed land of origin. Since blood types show a connection with Europe rather than Asia, maybe the Solutreans really discovered the New World - or perhaps others did, because, remarkably, recent studies of early South American skulls suggest aboriginal or African origins.
Although type O blood is common everywhere, it is nearly universal among natives of South and Central America, and much more common in North America than in Asia or Europe. If people populated the Americas from Asia at the end of the Ice Age after types A and B arose, those people neglected to bring their normal distribution of blood types with them.
Another blood-typing system has been used to demonstrate the Asian origin of Native Americans. Called Diego, it evolved recently as a mutation, and all Africans, Europeans, East Indians, Australian Aborigines, and Polynesians are Diego-negative. East Asians and Native Americans are the only people that are Diego-positive. But Diego-positive is more common among Native Americans than among East Asians, raising the question of who got these genes first. From blood types alone, a case can be made that the oldest indigenous people are the Native Americans!
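A caveat on the dominance argument in the quote: dominance does not by itself make an allele spread; what it does is make A and B phenotypes visible even when the underlying alleles are rare. A small Hardy-Weinberg sketch (mine, not Firestone's, with illustrative allele frequencies) shows this, and also why a population that is nearly all type O must carry almost no A or B alleles at all, not merely fewer A and B phenotypes.

```python
# Hardy-Weinberg phenotype frequencies for the ABO system, assuming random
# mating. Allele frequencies here are illustrative, not measured values.

def abo_phenotypes(p_a, p_b):
    """Map ABO allele frequencies (A, B; O is the remainder) to phenotypes."""
    p_o = 1.0 - p_a - p_b
    return {
        "O":  p_o ** 2,                     # only O/O genotypes type as O
        "A":  p_a ** 2 + 2 * p_a * p_o,     # A/A and A/O both type as A
        "B":  p_b ** 2 + 2 * p_b * p_o,     # B/B and B/O both type as B
        "AB": 2 * p_a * p_b,
    }

# Even modest A and B allele frequencies leave barely half the population
# typing as O:
for blood_type, freq in abo_phenotypes(0.20, 0.10).items():
    print(f"type {blood_type}: {freq:.1%}")
```

Since even a few percent of A or B alleles would show up as a visible share of A, B and AB phenotypes, populations that type almost entirely as O must carry almost none of those alleles. That is what gives the geographic-distribution arguments above their force, whatever one makes of the conclusions drawn from them.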