Tuesday, December 24, 2019

Chisala Helps HBDers Walk Back the Sale

The virtue of Chanda Chisala's response to Lance Welton is that, in knocking down the minor fallacies of the Human BioDiversity (HBD) brain trust, he relies on the major fallacies they have been sold.

Why is this virtuous?

HBDers might, finally, question these major fallacies.

Ever hear the phrase "thinking past the sale"?  It's a salesman's gimmick.  Prior to closing the sale, the salesman asks his customer:  "What color car would you like?"  In so doing, he enlists the customer in subverting his own decision-making process: in his own mind, the customer has already accepted the premise that he is buying the car.

What major fallacies did HBDers buy?  

For most HBDers, these major fallacies are:
  1. Consensus can precede consent.
  2. Popper's "falsification" criterion is the gold standard of science.
  3. Nurture can't be 100% determinative.
We'll take these in reverse order and, hopefully, end up walking back the fallacious premises HBDers bought.

Major Fallacy #3

Nurture Can't Be 100% Determinative  


For example, I recently laid the following rhetorical trap for a Swede who, in the presence of wealthy associates, piously asserted the genetic equality of intelligence between races.

I laid the trap by amplifying his piety with: "Intelligence is 100% a product of nurture."
"THAT'S ABSOLUTELY RIGHT!" thundered the gelded Thor.

At which point I further amplified, "Yes, if I put a bullet through your head, your IQ will decrease by 100%!"  He abruptly switched to another topic.

For relevance, contrast Chanda Chisala's own words, which he quotes in establishing his race-realist credentials:

The average (genetic) potentials [emphasis JAB] of intelligence could indeed be as varied as the heights of different populations.

Do you see it?

Here "it" is:  

100% destruction of potential is something environments are fully capable of achieving -- and cheaply at that.  A .22 bullet costs a nickel.  Since wealthy individuals are known for a keen appreciation of costs, the gelding was wise to change the subject quickly.  Indeed, destruction of potential is the central thesis of "anti-racism" -- but only when the potential being destroyed is that of non-whites.  If only the decades of programs and preferences had cost a nickel per supposed beneficiary of "anti-racism"!  We can see why HBDers are loath to legitimize an argument so abused as to impose such costly, and actually destructive, projects.

But why didn't Chisala propose this destruction of potential in the case of UK white youths?  Does he really believe that decades of government-imposed racial preferences for non-whites throughout the West have had insignificant deleterious impact on white youths?

Perhaps.  But surely Chisala might have anticipated that HBDers would, in his words, "change the goalpost" once again: use the most popular argument among social scientists to explain differential outcomes.  After all, HBDers, in Chisala's words, "exude the obstinate fervor of pseudoscience" by "changing the goalposts" despite having their hypotheses "falsified".  So why not?


Major Fallacy #2

Popper's "Falsification" Criterion is the Gold Standard of Science



For example, we have Chisala quoting someone of Richard Lynn's stature, who sets the ultimate "goalpost" in the statement:

If a multiracial society is found where these race differences in intelligence are absent, the evolutionary and genetic theory of these differences would be falsified [emphasis JAB]. Those who maintain that there are no genetic differences in intelligence between the races are urged to attempt this task.  

Lynn long ago bought the pop philosophy of science called "naïve falsificationism".  Popper muddied the philosophy of science by introducing a qualitative "falsification" criterion so as to obscure the prior, and superior, quantitative criterion of simplicity: Ockham's Razor.  Ockham's Razor has long been the Gold Standard in the philosophy of science.  Unlike "falsification", it is quantitative: choose the simplest of the proposed theories.  Moreover, Ockham's Razor has been rendered measurable by Algorithmic Information Theory (AIT).  AIT posits that, when competing theories are presented with a dataset of observations, the most informative theory is the one that best compresses the dataset.  Fewer bits means better predictions.  The theoretical maximum compression is called the dataset's Kolmogorov Complexity, measured in bits of information.  AIT is now applied as the universal measure in artificial general intelligence.
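To make the compression criterion concrete, here is a minimal sketch in Python.  It uses a general-purpose compressor as a crude stand-in for Kolmogorov Complexity (which is uncomputable), and the "theories" and data are hypothetical illustrations, not anyone's actual models: each candidate is charged the bits needed to state it plus the bits needed to state its residual errors, and the shorter total wins.

import zlib

def bits(blob: bytes) -> int:
    """Compressed size in bits -- an upper bound on algorithmic information."""
    return 8 * len(zlib.compress(blob, 9))

def description_length(theory_text: str, residuals) -> int:
    """Two-part code: bits to state the theory plus bits to state the data given it."""
    residual_blob = bytes((int(r) + 128) % 256 for r in residuals)  # small signed residuals
    return bits(theory_text.encode()) + bits(residual_blob)

xs = list(range(200))
data = [3 * x + 7 + (x % 5 - 2) for x in xs]   # observations: a simple law plus small deviations

theories = {
    "linear": ("y = 3*x + 7", lambda x: 3 * x + 7),  # short theory, non-zero residuals
    "lookup": (repr(data), lambda x: data[x]),       # memorizes every observation, zero residuals
}

for name, (text, predict) in theories.items():
    residuals = [d - predict(x) for x, d in zip(xs, data)]
    print(name, description_length(text, residuals), "bits")

# The short theory wins despite its "falsifying" deviations: they merely add a
# few bits, while the memorizing "theory" pays in full for every observation.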

Remember that phrase -- "general intelligence".  

Popper's popularization of a qualitative Fool's Gold Standard in the pop philosophy of science did untold damage to Western Civilization.  He performed this feat at the very same instant in history that Algorithmic Information Theory rigorously formalized Ockham's Razor.  In AIT a so-called "falsification" merely increases the number of bits in the theory -- its approximation of the dataset's Kolmogorov Complexity.  Oh, to be sure, Popper gestured toward a more quantitative, less naive "falsification" standard, but that was merely a backhanded recasting of Ockham's Razor, rendering his entire project an obscuration.

Popper obscured AIT for half a century.  Worse, he did so at the dawn of the computer age, when AIT should have revolutionized the social sciences as their Gold Standard criterion.  Only now is AIT's superiority as a model selection criterion starting to be recognized by a few with deep theoretical understanding of universal "general intelligence".  General intelligence is at the heart of science.  Aided by Popper's obscurationist rhetoric, the social pseudosciences have obscured social causality even as they imposed, on vast populations, experimental treatments with no experimental controls, such as mass immigration.  This they achieved in large measure by eliding the rigorous, universal model selection criterion of social causality afforded by AIT.  That is the kind of damage Popper did to Western Civilization's intelligence, in theory, practice, and consequence.  The damage is evident, by the way, in continued attempts to statistically refute pseudoscientific models of social causality using the specious "information criteria" typically employed by social scientists, such as BIC and AIC, which, unlike AIT, fail to bring both prediction error and model complexity into the same commensurable unit of "bits" so that adding them together makes sense.
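For concreteness, the standard "information criteria" and the two-part description length that AIT-style (minimum description length) selection minimizes can be set side by side in their textbook forms (k = number of free parameters, n = number of observations, the log-likelihood term measures prediction error, and L(·) denotes code length in bits):

\[\mathrm{AIC} = 2k - 2\ln(\mathit{likelihood}) \qquad\qquad \mathrm{BIC} = k\ln n - 2\ln(\mathit{likelihood})\]

\[\mathrm{MDL}(\mathit{model},\mathit{data}) = L(\mathit{model}) + L(\mathit{data}\mid\mathit{model})\]

Only in the last line are model complexity and prediction error denominated in the same unit -- bits -- which is the commensurability the preceding paragraph turns on.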

We should not, therefore, be surprised when someone of Chisala's rhetorical excellence amplifies Lynn's quote with, "I expected the HBDers to at least admit, by their own standard, that this simple unambiguous falsification standard had apparently been met."  Chisala performs a Popperian obscuration when he touches on, but doesn't own, Ockham's Razor in his quantitative statement, "Remember, the more [emphasis JAB] universal your claim, the more [emphasis JAB] it can potentially be falsified by a simple singular unambiguous event, which is why Lynn was right to give such a simple falsifying standard."  

Chisala's proportional phrase "the more" demonstrates that he, unlike Lynn, is aware of "naïve falsificationism", even as he bases his argument on Lynn's fallacious use of it.  In so doing, Chisala captures The Ultimate Prize -- a prize offered by Lynn in desperation to have anyone correspond on anything remotely resembling a collegial basis.  Lynn should not have been so desperate as to think past the sale of Popper's "contribution" to Western Civilization.  Both of these errors are understandable given the desperate straits of Western Civilization's academy due to 20th century intellectual movements, as evidenced by Brimelow's introduction welcoming Chisala's correspondence.

Major Fallacy #1

Consensus can precede consent. 


Consensus reached under duress is a false consensus.

The context of discourse with rhetoricians like Chisala is a regime that violates the consent of those who dissent from the prevailing anti-HBD orthodoxy.  The HBDers have bought the unstated premise that scientific consensus can be reached while violating the consent of the participants.

Proponents of government-enforced anti-HBD orthodoxy bear far more than a mere burden of proof in any discourse toward a consensus.  They bear an ethical responsibility to abjure such force, and to do so as a prerequisite to entering into scientific discourse with their non-consenting colleagues.  I am speaking here not of consent to entering into scientific discourse, but of government imposition of social theories on entire populations against their will.  The majority of HBDers do not consent to their communities being subjected to experimental treatments based on anti-HBD theories.  Chisala, among the most ethical of all social scientists, must own up, with due prominence, to the illegitimate advantage he, and they, enjoy.  But he, and they, must do more than that.

They must prominently and persistently advocate for the societal investment required to sort proponents of social theories into governments that test them.

Can HBDers do as much for themselves?

Only if they can walk back the sale.

Sunday, May 05, 2019

A Practical Theory of Equality Is Relative

"Equality" is a "problematic" concept due to a lack of nuance when applied to practical matters such as human affairs.

Take, for example, the standard axioms of equality theory:

x=x
if x=y then y=x
if x=y and y=z then x=z

In human affairs, many people extend equality theory with one more axiom:

∀x,y(x=y)

That is, everyone is equal to everyone else.

This doesn't get one very far in practice.

Now does it?

On the other hand, let's talk about relative equality theory with a new notation:

"x(y=z)" which means  "x regards y as the same as z".

Reformulating the standard axioms of equality theory:

x(y=y)
if x(y=z) then x(z=y)
if x(y=z) and x(z=w) then x(y=w)

Now, we're in a far more interesting domain of discourse, aren't we?

For example, let:

x = "US Constitution"
y = "Some White Guy"
z = "Some Black Guy"

We have:

US Constitution(Some White Guy = Some Black Guy)

In contrast, if we let:

x = "Race"
y = "Some White Guy"
z = "Some Black Guy"

We have:

Race(Some White Guy ≠ Some Black Guy)

(Yes, I know, I didn't introduce the axioms for "≠" yet... so ban me from Facebook.)
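This three-place notion is easy to operationalize.  Below is a minimal sketch in Python; the Context class, the two contexts, and the attribute table are illustrative inventions, not part of the formal theory:

from itertools import combinations

class Context:
    """A context regards y and z as equal iff its key assigns them the same value."""
    def __init__(self, name, key):
        self.name, self.key = name, key

    def equal(self, y, z):
        return self.key(y) == self.key(z)

people = {
    "Some White Guy": {"citizen": True, "race": "white"},
    "Some Black Guy": {"citizen": True, "race": "black"},
}

contexts = [
    Context("US Constitution", key=lambda p: people[p]["citizen"]),
    Context("Race",            key=lambda p: people[p]["race"]),
]

for ctx in contexts:
    for y, z in combinations(people, 2):
        sign = "=" if ctx.equal(y, z) else "≠"
        print(f"{ctx.name}({y} {sign} {z})")

# Because equality-within-a-context is "same value of the context's key", the
# three reformulated axioms -- x(y=y), symmetry, and transitivity -- hold in
# every context automatically.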

PS: I can't claim credit for this very powerful notion of equality. See Tom Etter's paper, "Three-place Identity".

Saturday, April 27, 2019

Ockham's Guillotine: Minimizing the Argument Surface of the Social Sciences

Established interests view "Trumpism" as foreboding a modern "storming of the Bastille".  They wonder, "What form of guillotine will populists roll out this time?  Will my head land in a bucket?"

This article suggests rolling out an inherently judicious "guillotine":

Ockham's Guillotine

Ockham's Guillotine lops only the metaphorical heads of the social pseudoscience Bastille.  It precisely guides Ockham's Razor down the grooves spanning only those necks.

This it does by selecting the best unified model of society, based on a single number:

The model's size, measured in bits of information.

Deprived of "wiggle room" for their "lies, damn lies and statistics", social pseudoscientists will be helpless as the Razor slips, the buckets receive and the crowds roar.

The science for Ockham's Guillotine is here; its mechanisms driven by one of the most powerful forces in history:

The explosion of computation and social data detonated by Moore's Law.

Harnessing this raw power in the design of Ockham's Guillotine requires a theory equal to the task:

Algorithmic Information Theory

Algorithmic Information Theory (AIT) is the computational form of Ockham's Razor:

The Algorithmic Information content of any data, including social data, is the size of the smallest program (algorithm) that outputs that data.  That program necessarily embodies the smallest model of the data.
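As a rough illustration -- not a measurement of Kolmogorov Complexity, which is uncomputable -- any general-purpose compressor gives an upper bound on the size of a program that reproduces a dataset.  The two hypothetical datasets below make the point:

import os
import zlib

structured = bytes(i % 7 for i in range(100_000))   # produced by a short generating rule
random_ish = os.urandom(100_000)                     # effectively no short rule exists

for name, blob in (("structured", structured), ("random", random_ish)):
    compressed = len(zlib.compress(blob, 9))
    print(name, len(blob), "bytes raw ->", compressed, "bytes compressed")

# The structured stream shrinks to roughly the size of its generating rule;
# the random stream barely shrinks at all.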

AIT founds the exploding field of Artificial General Intelligence (AGI).  AIT is sometimes called "Algorithmic Probability Theory" or "Solomonoff Induction".

The final advice given by seminal artificial intelligence figure, the late Marvin Minsky:
"The most important discovery since Godel is Algorithmic Probability which is a fundamental new theory of how to make predictions given a collection of experiences... This is a beautiful theory... that will make better predictions than anything we have today and everybody should learn all about that and spend the rest of their lives working on it."
By "experiences" Minsky meant observations/measurements in the form of data, such as social measurements.

Minimizing the Argument Surface

In cybersecurity, the "attack surface" refers to the number of ways an external actor can interact with a system.  Each such interaction presents a potential vulnerability to attack.  These interactions can be viewed as "dialogues" conducted over communications protocols, so, in that sense, they can also be viewed as "issues" over which "arguments" can obtain.

Human conversations between potential adversaries are similar in that the more ways an issue presents itself, the more ways in which sophistic arguments can exploit the social contract upon which civil discourse obtains.  The social sciences are particularly problematic in this respect, involving a myriad of "issues" over which arguments may turn into sophistic exploits.

Ockham's Guillotine reduces the argument surface of the social sciences to just two issues:

1) Is the basis of the artificial intelligence industry valid?
2) What data is relevant to social policy decisions?

#1 is supported by the most powerful modern force as previously described:  The explosion of data, computation and economic incentives behind AGI.

#2 can be dealt with by the simple expedient of not arguing about it:  If the sophists demand that their data be included in the corpus, then let it be included.  This is enabled by the explosion of computation capacity on the one hand, and the ruthless nature of AIT on the other.  AIT is ruthless in the sense that any biased data will best be modeled by algorithmic explication of said bias that corrects the data accordingly.  Then the corrected data will be brought into consilience with the larger body of knowledge/data rather than standing alone, which requires more bits of information.

Perhaps the most ruthless approach to "nuking the social pseudosciences" would be a monetary prize for improvements in the unified model of society.  An exemplar of this kind of prize is The Hutter Prize for Lossless Compression of Human Knowledge, which targets natural language modeling based on Wikipedia's corpus.  Prize awards are paid out for each incremental improvement in the compression of Wikipedia.

A prize of this sort, targeting a unified model of society, could trigger an avalanche of activity: social pseudoscientists attempting to take on the juggernaut of industrial artificial intelligence while hundreds or thousands of young people, eager to prove their chops (and earn money), increasingly embarrass the corrupt authorities of academia.

A preliminary data set of a wide range of longitudinal social measures for Ockham's Guillotine is available as an example on GitHub.

Thursday, April 04, 2019

Chlorine Sequestration During Exponential Remediation of Civilization's Footprint

UPDATE: The research below exposed a possible new way of synthesizing EdenCrete from in situ resources -- one that bypasses the Calera process and hence obviates the chlorine disposal problem.  It may also reduce the energy per mass.  This would reduce the doubling time and simplify the process.  A future article will discuss the implications.  For now, this article remains the best approach to balancing the geochemistry of the reference proposal.

The greatest challenge of the proposed "Exponential Remediation of Civilization's Footprint" is the necessary sequestration of chlorine evolved during the production of concrete, from oceanic salt ions (Ca++, Na+, Cl- and CO3--), for 70,000 Bowery Atolls.  The cited Calera process concrete produces 71% of its weight in chlorine (see "Comparison With Land Based Geologic Sequestration of CO2" below) while sequestering CO2 in the atoll concrete.

Geologic Sequestration


Later, we'll compare the magnitude of chlorine to the magnitude of land-based geologic sequestration of CO2, which can support many times the CO2 projected to be sequestered in the Bowery Atolls. But that requires transportation of evolved chlorine to those sites.  So first, we'll look at the in situ potential for geologic sequestration.

In situ resource utilization is highly desirable in an exponentially growing system.  The civil engineering sense of "in situ" is applicable:  "construction which is carried out at the building site using raw materials... which are present at or near a project site".  In situ resources obviate their transportation cost which, in exponential growth, can represent a severe constraint.  In the present case, resources include not only those that go into the atoll concrete, but also the resources to dispose of waste: geologic formations under the tropical doldrums suitable for chlorine sequestration.

Look at this map of deep sea marine sediments.

Map showing distribution of marine sediments.
  • Gray: land.
  • White: Sediments of the continental margin.
  • Blue: glacial sediments.
  • Orange: land-formed sediments.
  • Brown: pelagic clay.
  • Green: siliceous sediments.
  • Yellow: calcareous sediments.
Notice that the eastern equatorial Pacific (doldrums) sediments are yellow, with green just to the north.  Underlying the construction site are "siliceous" and "calcareous" sediments: that is, very fine-grained sand (suitable for EveCrete binding) and calcium carbonate sediments, respectively, are available in situ.

Another map disagrees somewhat but substantially supports the general point:



Sand is used in the concrete.  Calcium carbonate sediments lying about 1000ft beneath the ocean floor, offer virtually ideal chlorine sequestration via the reaction:

CaCO3 + 2HCl => CaCl2 + H2CO3 + heat

Chlorine sequestration is thereby turned into H2CO3 (carbonic acid) sequestration, 1000ft beneath the already several-kilometer-deep ocean floor.  The H2CO3 will act as a connate fluid that is gradually expressed from the sediment, upward toward the ocean floor 1000ft above, over geologic time during lithification: the process that turns sediments into sedimentary rock.  Unlike water, the usual connate fluid, this particular connate fluid chemically interacts with the CaCO3 sediments via carbonate buffering.  This provides additional environmental protection in the form of pH stabilization and a further slowing of a rate of reentry into the biosphere that is already on a geologic time scale.

Is There Enough Sediment To Contain All That H2CO3?


In a word: Yes.

The volume of in situ calcareous sediments is on the order of 10,000 times greater than the total volume of H2CO3 evolved (which is comparable to the volume of Cl2 evolved from the 70,000 Bowery Atolls).  Even if only 1% of that sediment volume were utilized for geologic sequestration, the H2CO3 would still occupy only about 1% of the volume actually utilized.

How Costly Is the Geologic Sequestration?


Existing deep sea drilling technology suffices as an economic existence proof.  A deep sea drilling platform costs about 10% of the net present value of an atoll.  Even if each atoll requires its own drilling platform, this is not blocking.  If chlorine is delivered to sediments beneath an ocean floor about 12,000 ft deep, a column of liquid chlorine (density 1.5625 g/ml) exerts about 8,000 psi at the ocean floor -- well above the pressure of the seawater column, owing to chlorine's higher density.

12000ft;1.5625g/ml?psi
(12000 * foot) * ([1.5625 * gramf] / [milli*liter]) ? psi
= 8128.6473 psi

This is well within the engineering limits of deep sea drilling.  Going 1000ft deep into CaCO3 sediments will subtract about 1,000psi from that injection pressure.

1000ft;2.7g/ml?psi
(1000 * foot) * ([2.7 * gramf] / [milli*liter]) ? psi
= 1170.5252 psi

The compressive strength of concrete is only about 5,000 psi, so even if the CaCO3 sediment were in the form of concrete, the injection pressure at that depth, combined with the corrosive nature of Cl2, would fracture the sediments, permitting the ingress of liquid chlorine.

The connate fluid already in the sediments will be dominated by H2O, thereby producing HCl via the reaction:

3Cl2 + 3H2O => 5HCl + HClO3 + heat

It is this HCl that will participate in the reaction already described that produces H2CO3.
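A back-of-the-envelope stoichiometry check, using standard molar masses and assuming the two reactions above go to completion (a sketch, not a process design):

# Per tonne of Cl2 injected, under the two reactions above:
#   3 Cl2 + 3 H2O -> 5 HCl + HClO3
#   CaCO3 + 2 HCl -> CaCl2 + H2CO3
M = {"Cl2": 70.90, "HCl": 36.46, "CaCO3": 100.09, "H2CO3": 62.03}  # g/mol

mol_cl2 = 1e6 / M["Cl2"]          # moles of Cl2 in one tonne
mol_hcl = mol_cl2 * 5 / 3         # HCl produced by the disproportionation
mol_caco3 = mol_hcl / 2           # CaCO3 consumed neutralizing that HCl
mol_h2co3 = mol_caco3             # H2CO3 produced, one per CaCO3 consumed

for label, mol, species in [("HCl produced", mol_hcl, "HCl"),
                            ("CaCO3 consumed", mol_caco3, "CaCO3"),
                            ("H2CO3 produced", mol_h2co3, "H2CO3")]:
    print(f"{label:15s} ~ {mol * M[species] / 1e6:.2f} tonne per tonne of Cl2")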

Comparison With Land Based Geologic Sequestration of CO2


Land-based geologic sequestration capacity is known to be vastly greater than that required for sequestering the CO2 sequestered by 70,000 Bowery Atolls.

As it turns out, the liquid volume of Cl2 produced is comparable to -- only modestly greater than -- the liquid volume of CO2 sequestered in the concrete of the artificial atolls.

100000people/atoll;7e9people;20km/atoll;(20/100)*2000tonne/m?tonne
 = 5.6E11 tonne
Total CaCO3 mass of atolls.

(12/100)*5.6E11 tonne?tonne
(12 / 100) * (5.6E11 * ton_metric) ? ton_metric
 = 6.72E10 tonne
Total C mass of atolls.

(40/100)*5.6E11 tonne?tonne
(40 / 100) * (5.6E11 * ton_metric) ? ton_metric
= 2.24E11 tonne
Total Ca mass of atolls.

321003271mi^3;0.04%*1020kg/m^3?tonne
= 5.45904E14 tonne
Total Ca mass in the entire ocean.

(2*35.45/100)*5.6E11 tonne?tonne
= 3.9704E11 tonne
Total Cl2 mass evolved during atoll construction from Calera process.

(44/100)*5.6E11 tonne?tonne
= 2.464E11 tonne
Total CO2 mass sequestered during atoll construction from Calera process.

1.5625g/cm^3;3.9704E11?m^3
= 2.541056E11 m^3
Total (liquid) Cl volume as geologically sequestered (prior to mineralization)

1101 kg/m^3;2.464E11 tonne?m^3
= 2.2E11 m^3
Total CO2 volume as it would have been geologically sequestered (prior to mineralization)
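The same chain of figures, transcribed into ordinary Python so the unit expressions above can be checked at a glance (the quantities and densities are the post's own assumptions):

atolls = 7e9 / 100_000                    # 70,000 atolls at 100,000 people each
length_m = atolls * 20 * 1000             # 20 km of atoll per atoll, in metres
caco3_t = length_m * (20 / 100) * 2000    # (20/100)*2000 tonne of CaCO3 per metre

frac = {"C": 12 / 100, "Ca": 40 / 100, "CO2": 44 / 100, "Cl2": 2 * 35.45 / 100}
mass_t = {k: v * caco3_t for k, v in frac.items()}        # tonnes of each species

cl2_m3 = mass_t["Cl2"] * 1000 / 1562.5    # liquid Cl2 at 1.5625 g/cm^3
co2_m3 = mass_t["CO2"] * 1000 / 1101      # liquid CO2 at 1101 kg/m^3

print(f"CaCO3 {caco3_t:.3g} t; Cl2 {mass_t['Cl2']:.3g} t; CO2 {mass_t['CO2']:.3g} t")
print(f"liquid Cl2 {cl2_m3:.3g} m^3; liquid CO2 {co2_m3:.3g} m^3")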







Friday, March 29, 2019

Now Amassing The Army aka #HighNoonPatriots

TL;DR

As many days as you can, at high noon Central Time, do SOMETHING VISIBLE (and legal*) to petition for redress of grievances.  Include the hashtag #highnoonpatriots to show solidarity with others.  Do it anonymously as appropriate.

END TL;DR


About 1 in 3 Americans believe that civil war will break out in the near future.  Individuals are increasingly vulnerable to mobs, particularly as mobs capture American institutions established to protect the individual from mobs.  Individuals, powerless against and fearful of the growing power and virulence of mobs, seek safety in numbers by joining mobs.

First we must answer the question, "What happened?"

Next we must answer the question, "How can individuals protect themselves against institutionalized mobs?"


What Happened?


Centralization of power forces everyone, as a matter of self-defense, to vie for the center of power.  This encourages mob mentality.  The centralization of power in America has gradually empowered mobs of all stripes.  The fatal blow was the Immigration and Nationality Act of 1965, which steadily increased immigration decade after decade -- a policy overwhelmingly opposed by Americans.  Why was this the fatal blow against the individual?  Because about 80% of naturalized citizens vote for greater centralization of social policy in the Federal government.

That's why.  

And that's what happened.


How Can Individuals Protect Themselves?


Act together or act individually and hang individually.  

First and foremost, this means synchronized action.


Why Synchronized Action?


Think about the public "shutting down the Congressional switchboard" in response to mass media inspired outrage.

Any synchronized action is vastly more effective than asynchronous action because of the effects of transients on the networks upon which civilization depends.

Start small and symbolic but synchronized. This will build a pattern of victory.


How Can Individuals Synchronize?


Do something visible to others at high noon central time.


Every Day?


As many days as you can.




*Prior versions of this article recommended less frequent but more targeted and even costly actions, as a way of providing quorum sensing in a decentralized manner impervious to managerial class censorship.  Daily and less targeted, but still synchronized, action will provide a bigger tent.

Monday, February 11, 2019

The Expropriation Condition For a Single Tax On Wealth

Perhaps the greatest fear of a wealth tax, more accurately called a tax on the liquid value of net assets, is that it would expropriate liquidation value.  To calculate the level of expropriation it is helpful to assume a single tax on wealth and then measure the difference in liquidation value.  This can be done by subtracting the owner's original value under activity (income) taxation from the prospective buyer's value under asset (wealth) taxation:

\[\frac{\mathit{buyer\_income}-\mathit{buyer\_expense}}{\log{\left(\mathit{buyer\_interest\_rate}+\mathit{asset\_tax\_rate}+1\right)}}-\frac{\left(1-\mathit{income\_tax\_rate}\right)\left(\mathit{owner\_income}-\mathit{owner\_expense}\right)}{\log{\left(\mathit{owner\_interest\_rate}+1\right)}}\]

WHERE
income_tax_rate = the aggregate tax rate on economic activities, such as income, capital gains, value added, etc.
asset_tax_rate = the net asset tax rate (on liquid value)
owner_income = the owner's expected gross periodic income from the asset
buyer_income = the buyer's expected gross periodic income from the asset
owner_expense = the owner's expected periodic expenditure on the asset
buyer_expense = the buyer's expected periodic expenditure on the asset
owner_interest_rate = the periodic interest rate paid by the owner in borrowing to purchase the asset
buyer_interest_rate = the periodic interest rate paid by the buyer in borrowing to purchase the asset

The more negative this difference goes, the greater the expropriation of liquid value.

It is important to note that the above formula assumes the buyer does not enjoy a standard deduction -- for example, a homestead deduction of the kind normally protected under Chapter 7 bankruptcy.  Such a deduction is an ordinary feature of wealth tax proposals and would frequently come into play in a change to a single tax on wealth, as tenants purchase their residences from landlords.
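For concreteness, here is the difference formula transcribed directly into Python -- a sketch only, with purely illustrative parameter values:

from math import log

def expropriation_difference(buyer_income, buyer_expense, buyer_interest_rate,
                             owner_income, owner_expense, owner_interest_rate,
                             asset_tax_rate, income_tax_rate):
    """Buyer's NPV under the asset (wealth) tax minus owner's NPV under the income tax.
    The more negative the result, the greater the expropriation of liquid value."""
    buyer_npv = (buyer_income - buyer_expense) / log(1 + buyer_interest_rate + asset_tax_rate)
    owner_npv = (1 - income_tax_rate) * (owner_income - owner_expense) / log(1 + owner_interest_rate)
    return buyer_npv - owner_npv

# Illustrative numbers only: identical cash flows and borrowing costs for both
# parties; a 2% asset tax replaces a 30% aggregate income tax.
print(expropriation_difference(
    buyer_income=10_000, buyer_expense=2_000, buyer_interest_rate=0.05,
    owner_income=10_000, owner_expense=2_000, owner_interest_rate=0.05,
    asset_tax_rate=0.02, income_tax_rate=0.30))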

The derivation follows.  Integrating the periodic profit stream below over all future time yields its net present value, which is why the logarithm of the aggregate discount factor appears in each denominator:

\[\tag{profit\_stream}\frac{\left(\mathit{income}-\mathit{expense}\right)\left(1-\mathit{income\_tax\_rate}\right)}{{\left(\mathit{interest\_rate}+\mathit{asset\_tax\_rate}+1\right)}^{t}}\]

\[\tag{net\_present\_value}\frac{\left(\mathit{income}-\mathit{expense}\right)\left(1-\mathit{income\_tax\_rate}\right)}{\log{\left(\mathit{interest\_rate}+\mathit{asset\_tax\_rate}+1\right)}}\]

\[\tag{AT\_NPV}\frac{\mathit{income}-\mathit{expense}}{\log{\left(\mathit{interest\_rate}+\mathit{asset\_tax\_rate}+1\right)}}\]

\[\tag{IT\_NPV}\frac{\left(\mathit{income}-\mathit{expense}\right)\left(1-\mathit{income\_tax\_rate}\right)}{\log{\left(\mathit{interest\_rate}+1\right)}}\]

\[\tag{Buyer\_NPV}\frac{\mathit{buyer\_income}-\mathit{buyer\_expense}}{\log{\left(\mathit{buyer\_interest\_rate}+\mathit{asset\_tax\_rate}+1\right)}}\]

\[\tag{Owner\_NPV}\frac{\left(1-\mathit{income\_tax\_rate}\right)\left(\mathit{owner\_income}-\mathit{owner\_expense}\right)}{\log{\left(\mathit{owner\_interest\_rate}+1\right)}}\]

Thursday, July 19, 2018

"Genocide"

You Have Been Misled As to the Meaning of the Word

"Genocide"

You have been taught that nationalism is the primary source of "genocide" -- that nationalists perpetrate "genocide" and that ridding the world of nationalism is an important, perhaps the most important, step in eradicating the threat of "genocide".


You have been taught, and are now a believer in, the exact opposite of the truth.


The term "genocide" was coined by Raphael Lemkin, whose work led to its incorporation into the 1948 Genocide Convention.

Here is Lemkin's definition:

"Generally speaking, genocide does not necessarily mean the immediate destruction of a nation, except when accomplished by mass killings of all members of a nation. It is intended rather to signify a coordinated plan of different actions aiming at the destruction of essential foundations of the life of national groups, with the aim of annihilating the groups themselves. The objectives of such a plan would be the disintegration of the political and social institutions, of culture, language, national feelings, religion, and the economic existence of national groups, and the destruction of the personal security, liberty, health, dignity, and even the lives of the individuals belonging to such groups. Genocide is directed against the national group as an entity, and the actions involved are directed against individuals, not in their individual capacity but as members of a national group."
Cited in "Beyond the 1948 Convention -- Emerging principles of Genocide in Customary International Law," Maryland Journal of International Law and Trade, vol. 17, no. 2, Fall 1993, ppp. 193-226.

The conclusion is inescapable:


Those who have taught you that:

"Genocide can be eradicated by eradicating nationalism."

are actually perpetrators of genocide under its proper definition within the 1948 Genocide Convention.

Furthermore, since the pervasive teaching of this ideology has been the primary moral force for the disintegration of not one but most national identities during the last half of the 20th century, its teachers have been, and are, by definition the primary perpetrators of genocide over the last half century.